• New Report Suggests Facebook Ignored Research Which Indicated That it Contributed to Societal Division | Social Media Today
    https://www.socialmediatoday.com/news/new-report-suggests-facebook-ignored-research-which-indicated-that-it-contr/578621

    Has Facebook made us more divided, and more likely to 'take sides' in political debate?

    It certainly seems that way, with 'us against them' tribalism now a part of almost every major discussion - even medical advice has seemingly become a point of political contention in the modern age. Of course, such division has always existed, at least to some degree - but has Facebook, and social media more broadly, made it worse than ever before?

    This became a key point of discussion in the aftermath of the 2016 US Presidential Election, with suggestions that Russian troll farms and political activist groups had been using Facebook to influence voter opinions through targeted, manipulative posts and ads.

    Is that possible? Could our minds really be changed by the content displayed in our News Feeds?



    Past research has shown that, indeed, voter action can be influenced by what people see on Facebook. And according to a new report by The Wall Street Journal, Facebook is well aware of this, with The Social Network conducting a study back in 2018 which found that the platform’s notorious algorithms “exploit the human brain’s attraction to divisiveness”.

    So what did Facebook do in response?

    As per WSJ:

    “Mr. Zuckerberg and other senior executives largely shelved the basic research [...] and weakened or blocked efforts to apply its conclusions to Facebook products. Facebook policy chief Joel Kaplan, who played a central role in vetting proposed changes, argued at the time that efforts to make conversations on the platform more civil were “paternalistic,” said people familiar with his comments.”

    Really, the revelation is not surprising - not so much the suggestion that Facebook chose to ignore the research, but the underlying finding itself, that Facebook can exacerbate division.

    Indeed, Facebook's own executives have indirectly acknowledged as much in their various comments on the topic - earlier this year, Facebook's head of VR and AR Andrew Bosworth published a long explanation of his personal thoughts on various Facebook controversies, including the idea that Facebook increases societal divides.

    Bosworth refuted the suggestion that Facebook reinforces political opinions through a 'filter bubble' effect - an oft-made claim against the platform - arguing that, if anything, Facebook users actually see content from more sources on any given subject, not fewer.

    But then again, that’s not necessarily beneficial either - as explained by Bosworth:

    “What happens when you see more content from people you don’t agree with? Does it help you empathize with them as everyone has been suggesting? Nope. It makes you dislike them even more.” 

    Bosworth noted that, contrary to popular opinion, Facebook actually exposes users to significantly more content sources than they would have seen in times before the internet.

    “Ask yourself how many newspapers and news programs people read/watched before the internet. If you guessed “one and one” on average you are right, and if you guessed those were ideologically aligned with them you are right again. The internet exposes them to far more content from other sources (26% more on Facebook, according to our research).”

    Facebook's COO Sheryl Sandberg cited this same research in October last year, noting more specifically that 26% of the news which Facebook users see in their feeds represents "another point of view."

    But again, that's not necessarily a good thing. In working to defend Facebook against the claim that it fuels division through one process (only showing you content that you're likely to agree with), Facebook's execs have inadvertently highlighted how it can fuel the same type of division through the opposite: by their own account, showing you more content, from more sources, likely increases political polarization.

    So again, it comes as little surprise that researchers reached the same conclusion about how Facebook could, as noted by WSJ, "exploit the human brain's attraction to divisiveness". Facebook knows this - its own leaders have said as much. The problem is that there's no simple way to address it.

    That's likely why Facebook seemingly shelved the findings: there's no practical way for the platform to police such dynamics, or to change inherent human behaviors.

    Interestingly, a few months before these findings were reportedly presented to Facebook's leadership team, Facebook published a blog post titled 'Is Spending Time on Social Media Bad for Us?', in which it explored the various negative consequences of social media engagement.

    The conclusion of that report?

    “According to the research, it really comes down to how you use the technology.”

    The findings in this instance suggested that passive consumption of social media content - reading but not interacting with people - led to negative mental health impacts, while active interaction, including "sharing messages, posts and comments with close friends and reminiscing about past interactions", had positive effects.

    In this instance, Facebook wasn't looking specifically at the polarizing effect of what people share on social platforms, but the finding is more closely aligned with something Facebook can actually control. The platform subsequently made algorithm tweaks "to provide more opportunities for meaningful interactions and reduce passive consumption of low-quality content", while it also added more tools to help people address negative usage behaviors.

    But Facebook can't stop people from sharing what they choose to on the platform (within reason) - otherwise it's not a social network, and it's not facilitating discussion around what people actually want to talk about. That exposure to what people personally align with - their political leanings, their personal opinions, etc. - is something we've never had in times past, and while there's significant value in enabling connection around such views, that's also likely the cause of increased division and angst.

    In the past, for example, you wouldn't have known whether your Uncle Barry was a left-leaning Democrat or a right-leaning Republican. But now you do - and in sharing his opinions, Barry is able to align with other Facebook users who agree with his perspective, building a like-minded community. That's how Facebook fuels increased division. But how do you fix that? How do you stop Barry from sharing his opinions without changing the entire approach of the platform?

    Facebook essentially can't stop users from sharing what interests them, which means it can only really set clear parameters around what can be shared, and what's not allowed. For this, Facebook has set up its new Content Oversight Board, which will provide advice on how the platform can improve its systems and processes for the greater good.

    So while the suggestion is that Facebook has ignored reports highlighting its role in fueling division, the reality, it seems, is that Facebook is working to improve on this front - providing more tools to increase transparency and give users more insight into how such division occurs, including more data on ad targeting, fake news, conspiracy theories, etc. Facebook does appear to be acting - but if users choose not to investigate, if they choose to believe what they want and share what aligns with their established beliefs, what then?

    So, does that mean that Facebook, overall, is a good or a bad thing for society? Its capacity to fuel division is clearly a concern, but you also have to weigh that against the benefits it provides in facilitating community connection, giving people more ways than ever to keep in touch and come together for greater benefit.

    How you view each will come down to personal perspective - which is largely the same impetus that dictates what people share on the platform overall.

    Who can say what people should and should not be able to share on the network, beyond what’s acceptable within the platform’s rules? And with that being a factor, how can you stop Facebook, and other social platforms, from amplifying division?  

    NOTE: Facebook has provided SMT with this statement on the WSJ report:

    "We’ve learned a lot since 2016 and are not the same company today. We’ve built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform’s impact on society so we continue to improve. Just this past February we announced $2M in funding to support independent research proposals on polarization.”

    #Facebook #Division #Polarization #Ethics

  • #Twitter is Considering Tipping via Tweet, New Identifiers for #Trolls
    https://www.socialmediatoday.com/news/twitter-is-considering-tipping-via-tweet-new-identifiers-for-trolls-and-mo/570337

    After years of criticism over its perceived failure to evolve its platform, and to take action on key elements fast enough, Twitter says that it's now "picking up the pace" of innovation, and is looking to implement more changes and options to help improve the overall Twitter experience.

    We've already seen some of these in its updates to lists and the addition of topics to follow, and there are more on the way, with controls over who can reply to your tweets and its long-awaited 'conversational' features. But these are just some of the elements that Twitter's working on - according to new reports, Twitter is also looking into a new tipping option for tweets, improved identification of trolls, tweaks for lists, and more.

    Here’s an overview of some of the additional Twitter updates we may see in the near future.

    1. Tipping in Tweets

    With Facebook moving further into on-platform payments, and Twitter CEO Jack Dorsey’s enthusiasm for cryptocurrency, it may come as little surprise that Twitter too is looking at its own payment option, with tipping via tweet.

    As reported by The Information:

    “Twitter is considering a feature that will allow users to tip - sending each other money from their tweets - according to two people familiar with the company’s decisions. [...] Twitter and Square already partner to let users make donations to politicians through tweets, according to company filings.”

    There’s a lot to consider here - the capacity to exchange money via tweet could have significant implications for the service, and may provide a whole new revenue stream to popular tweet creators.

    Or not. The key strength of Twitter is the capacity to contribute to public discourse - to have your say on any given topic and add it to the wider Twitter stream. If you were able to charge people to see your tweets - which is not the current proposal, but may be an extension of the same - that would also, theoretically, reduce your exposure, which may negate the value anyway.

    But still, no doubt those who share scoops and exclusive insights on Twitter will be doing the math in their heads, calculating what, exactly, their tweets are worth. The truth is, probably not much - but if they could call for contributions from their loyal fans, it could provide another incremental income stream, should the option go that route.

    More likely, the option would be a boon for those sharing adult content on the platform - but still, the capacity to raise funds via tweet would open up a range of possibilities. It's also worth noting that YouTube recently reported that more than 100,000 of its channels are now earning money via its live-stream tipping option, 'Super Chat', with some streams generating more than $400 per minute "as fans reach out to creators to say hello, send congratulations, or just to connect".

    The use case is obviously different, but it may provide some pointers as to where Twitter is looking on this front. 

    It’s also worth noting that various Twitter employees have said that, while this has been discussed, such a project is not actively in development at this stage.