New Report Suggests Facebook Ignored Research Which Indicated That it Contributed to Societal Division


  • Source: https://www.socialmediatoday.com/news/new-report-suggests-facebook-ignored-research-which-indicated-that-it-contr/578621

    Has Facebook made us more divided, and more likely to ’take sides’ in political debate?

    It certainly seems that way, with ’us against them’ tribalism now a part of almost every major discussion - even medical advice has seemingly become a point of political contention in the modern age. Of course, such division has always existed, at least to some degree - but has Facebook, and social media more broadly, made it worse than ever before?

    This became a key point of discussion in the aftermath of the 2016 US Presidential Election, with suggestions that Russian troll farms and political activist groups had been using Facebook to influence voter opinions through targeted, manipulative posts and ads.

    Is that possible? Could our minds really be changed by the content displayed in our News Feeds?

    Past research has shown that, indeed, voter action can be influenced by what people see on Facebook. And according to a new report by The Wall Street Journal, Facebook is well aware of this, with The Social Network conducting a study back in 2018 which found that the platform’s notorious algorithms “exploit the human brain’s attraction to divisiveness”.

    So what did Facebook do in response?

    As per WSJ:

    “Mr. Zuckerberg and other senior executives largely shelved the basic research [...] and weakened or blocked efforts to apply its conclusions to Facebook products. Facebook policy chief Joel Kaplan, who played a central role in vetting proposed changes, argued at the time that efforts to make conversations on the platform more civil were “paternalistic,” said people familiar with his comments.”

    Really, the revelation is not surprising - not so much that Facebook might choose to ignore the research, but the first point, that Facebook can exacerbate division.

    Indeed, Facebook’s own executives have indirectly pointed to such in their various comments on the topic - earlier this year, Facebook’s head of VR and AR Andrew Bosworth published a long explanation of his personal thoughts on various Facebook controversies, including the idea that Facebook increases societal divides.

    Bosworth refuted the suggestion that Facebook reinforces political opinions through a ’filter bubble’ effect - an oft-made claim against the platform - because, if anything, Facebook users actually see content from more sources on any given subject, not fewer.

    But then again, that’s not necessarily beneficial either - as explained by Bosworth:

    “What happens when you see more content from people you don’t agree with? Does it help you empathize with them as everyone has been suggesting? Nope. It makes you dislike them even more.” 

    Bosworth noted that, contrary to popular opinion, Facebook actually exposes users to significantly more content sources than they would have seen in times before the internet.

    “Ask yourself how many newspapers and news programs people read/watched before the internet. If you guessed “one and one” on average you are right, and if you guessed those were ideologically aligned with them you are right again. The internet exposes them to far more content from other sources (26% more on Facebook, according to our research).”

    Facebook’s COO Sheryl Sandberg quoted this same research in October last year, noting more specifically that 26% of the news which Facebook users see in their feeds represents “another point of view.”

    But again, that’s not necessarily a good thing - in working to refute the idea that Facebook causes division through one process (only showing you content that you’re more likely to agree with), Facebook’s execs inadvertently highlighted how it fuels the same type of division through the opposite: showing you more content, from more sources, which, according to Facebook’s own leaders, likely increases political polarization.

    So again, it comes as little surprise that researchers came to the same conclusion as to how Facebook can, as noted by WSJ, “exploit the human brain’s attraction to divisiveness”. Facebook knows this - its own executives have said as much. The problem is that there’s no simple solution for addressing it.

    That’s likely why Facebook seemingly shelved the findings - there’s no practical way for the platform to police such sharing and change inherent human behaviors.

    Interestingly, a few months before these findings were reportedly presented to Facebook’s leadership team, Facebook published a blog post titled ’Is Spending Time on Social Media Bad for Us?’, in which it explored the various negative consequences of social media engagement.

    The conclusion of that report?

    “According to the research, it really comes down to how you use the technology.”

    The findings in this instance suggested that passive consumption of social media content - reading but not interacting with people - led to negative mental health impacts, while active interaction, including “sharing messages, posts and comments with close friends and reminiscing about past interactions”, had positive benefits.

    In this instance, Facebook wasn’t looking at the polarizing effect of what people share on social platforms, specifically, but the finding is more closely aligned with something that Facebook can actually control. The platform subsequently made algorithm tweaks “to provide more opportunities for meaningful interactions and reduce passive consumption of low-quality content”, while it also added more tools to help people address negative usage behaviors.

    But Facebook can’t stop people sharing what they choose to on the platform (within reason). Otherwise it’s not a social network - it’s not facilitating discussion around what people actually want to talk about. That exposure to others’ political leanings, personal opinions, etc. is an element that we’ve never had in times past, and while there is significant value in enabling connection around such, it’s also likely the cause of increased division and angst.

    In the past, for example, you wouldn’t have known that your Uncle Barry was a left-leaning Democrat or a right-leaning Republican. But now you do - and in sharing his opinions, Barry is able to align with more Facebook users who agree with his perspective, facilitating like-minded community. That’s how Facebook is fueling increased division. But how do you fix that? How do you stop Barry sharing his opinions, without changing the entire approach of the platform?

    Essentially, Facebook can’t stop users sharing what interests them, which means it can only really set clear parameters around what can be shared, and what’s not allowed. And for this, Facebook has set up its new Content Oversight Board, which will provide advice on how the platform can improve its systems and processes for the greater good.

    So while the suggestion is that Facebook has ignored reports which highlight its role in fueling division, the reality, it seems, is that Facebook is working to improve on this front, providing more tools to increase transparency and give users more insight into how such division occurs, including more data on ad targeting, fake news, conspiracy theories, etc. Facebook does seem to be acting on the findings - but if users choose not to investigate, if they choose to believe what they want and share what aligns with their established beliefs, what then?

    So, does that mean that Facebook overall is a good or bad thing for society? Given its capacity to fuel division, that’s clearly a concern, but you also have to weigh that against the benefits that it provides in terms of facilitating community connection, giving people more ways than ever to keep in touch and come together for greater benefit.

    How you view each will come down to personal perspective - which is largely the same impetus that dictates what people share on the platform overall.

    Who can say what people should and should not be able to share on the network, beyond what’s acceptable within the platform’s rules? And with that being a factor, how can you stop Facebook, and other social platforms, from amplifying division?  

    NOTE: Facebook has provided SMT with this statement on the WSJ report:

    "We’ve learned a lot since 2016 and are not the same company today. We’ve built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform’s impact on society so we continue to improve. Just this past February we announced $2M in funding to support independent research proposals on polarization.”

    #Facebook #Division #Polarization #Ethics