industryterm:online influence

  • How Russia Hacked U.S. Politics With Instagram Marketing – Foreign Policy
    https://foreignpolicy.com/2018/12/17/how-russia-hacked-us-politics-with-instagram-marketing

    The Internet Research Agency took to the photo-sharing network to boost Trump and depress voter turnout.

    With Donald Trump in the White House, Kremlin operatives running a digital interference campaign in American politics scored a viral success with a post on Instagram.

    The post appeared on the account @blackstagram__, which was in fact being run by the Internet Research Agency, a Kremlin-linked troll farm that U.S. authorities say orchestrated an online campaign to boost Trump’s candidacy in 2016. It racked up 254,000 likes and nearly 7,000 comments—huge numbers for the Kremlin campaign.

    But oddly, the post contained no political content.

    Instead, it repurposed an ad for a women’s shoe, with a photo of women of different skin tones wearing the same strappy high heel in different colors. The caption pitched the shoes as a symbol of racial equality: “All the tones are nude! Get over it!”

    While the message itself was not aimed at swaying voters in any direction, researchers now believe it served another purpose for the Russian group: It boosted the reach of its account, likely won it new followers, and tried to establish the account’s bona fides as an authentic voice for the black community.

    That advertising pitch was revealed in a report released Monday by the Senate Intelligence Committee and produced by the cybersecurity firm New Knowledge. The report provides the most comprehensive look to date at the Kremlin’s attempt to boost Trump’s candidacy and offers a surprising insight regarding that campaign: Moscow’s operatives operated much like digital marketers, making use of Instagram to reach a huge audience.

    By blending marketing tactics with political messaging, the Internet Research Agency (IRA) established a formidable online presence in the run-up to the 2016 election (and later), generating 264 million total engagements—a measure of activity such as liking and sharing content—and building a media ecosystem across Facebook and Instagram.

    That campaign sought to bring Russian political goals into the mainstream, exacerbate and inflame divisions in American society, and blur the line between truth and fiction, New Knowledge’s report concludes.

    Amid the intense discussion of Russian interference in the 2016 election, investigators probing that campaign had devoted relatively little attention to Instagram until now. But following their exposure in 2016 and early 2017, the IRA’s operatives shifted resources to Instagram, where their content often outperformed their posts on Facebook. (Instagram is owned by Facebook.)

    Of the 133 Instagram accounts created by the IRA, @blackstagram__ was arguably its most successful, with more than 300,000 followers. Its June 2017 ad for the shoe, made by Kahmune, was the most widely circulated post dreamed up by the Kremlin’s operatives—from a total of some 116,000. (The shoe continues to be marketed by Kahmune. Company officials did not respond to questions from Foreign Policy.)

    The authors of the report believe @blackstagram__ served as a vehicle for Kremlin propaganda targeting the American black community, skillfully adopting the language of Instagram, where viral marketing schemes exist side by side with artfully arranged photographs of toast.

    As Americans streamed to the polls on Nov. 8, 2016, @blackstagram__ offered its contribution to the Kremlin’s campaign to depress turnout, borrowing a line from a Michael Jackson song to tell African-Americans that their votes didn’t matter: “Think twice before you vote. All I wanna say is that they don’t really care about us. #Blacktivist #hotnews.”

    Special counsel Robert Mueller and his team of investigators have secured indictments against the Internet Research Agency’s owner, Yevgeny Prigozhin, and a dozen of its employees.

    While the effect of the IRA’s coordinated campaign to depress voter turnout is difficult to assess, the evidence of the group’s online influence is stark. Of its 133 Instagram accounts, 12 racked up more than 100,000 followers—the typical threshold for being considered an online “influencer” in the world of digital marketing. Around 50 amassed more than 10,000 followers, making them what marketers call “micro-influencers.”

    These accounts made savvy use of hashtags, built relationships with real people, promoted merchandise, and targeted niche communities. The IRA’s most popular Instagram accounts included pages devoted to veterans’ issues (@american.veterans), American Christianity (@army_of_jesus), and feminism (@feminism_tag).

    In a measure of the agency’s creativity, @army_of_jesus appears to have been launched in 2015 as a meme account featuring Kermit the Frog. It then switched subjects and began exclusively posting memes related to the television show The Simpsons. By January 2016, the account had amassed a significant following and reached its final iteration with a post making extensive use of religious hashtags: “#freedom #love #god #bible #trust #blessed #grateful.” It later posted memes comparing Democratic presidential nominee Hillary Clinton to Satan.

    “The Internet Research Agency operated like a digital marketing agency: develop a brand (both visual and voice), build presences on all channels across the entire social ecosystem, and grow an audience with paid ads as well as partnerships, influencers, and link-sharing,” the New Knowledge report concludes. “Instagram was perhaps the most effective platform.”

    Monday’s report, which was published alongside another by researchers at the University of Oxford and the network analysis firm Graphika, is likely to increase scrutiny of social media platforms. The New Knowledge report accuses technology firms of possibly misleading Congress and says companies have not been sufficiently transparent in providing data related to the Russian campaign.

    #Instagram #Politique #USA #Russie #Médias_sociaux

  • Russian bots were used to sow divisions on vaccines, researchers say - STAT
    https://www.statnews.com/2018/08/23/vaccines-russian-bots

    An analysis of Twitter accounts previously identified as having been operated by Russian bots and trolls found they dove into the vaccine debate as early as January 2015, the researchers reported. They did not take one side or the other, but seemed to tweet pro-vaccine and anti-vaccine messages in roughly equal measure.

    (…) “The more the vaccine ‘debate’ … is amplified, the more it gains an undeserved sense of legitimacy and gives vaccine-hesitant individuals a pretense to forgo vaccination for themselves and their children,” said Adalja, who was harshly critical of the use of vaccinations in efforts to turn people against each other, calling it “overtly nihilistic.”

    #nihilisme #santé #vaccins #twitter #bots (Russian ones, obviously)

    • The study’s abstract (full text behind #paywall)

      Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate | AJPH | Ahead of Print
      https://ajph.aphapublications.org/doi/10.2105/AJPH.2018.304567

      Abstract

      Objectives. To understand how Twitter bots and trolls (“bots”) promote online health content.

      Methods. We compared bots’ to average users’ rates of vaccine-relevant messages, which we collected online from July 2014 through September 2017. We estimated the likelihood that users were bots, comparing proportions of polarized and antivaccine tweets across user types. We conducted a content analysis of a Twitter hashtag associated with Russian troll activity.

      Results. Compared with average users, Russian trolls (χ²(1) = 102.0; P < .001), sophisticated bots (χ²(1) = 28.6; P < .001), and “content polluters” (χ²(1) = 7.0; P < .001) tweeted about vaccination at higher rates. Whereas content polluters posted more antivaccine content (χ²(1) = 11.18; P < .001), Russian trolls amplified both sides. Unidentifiable accounts were more polarized (χ²(1) = 12.1; P < .001) and antivaccine (χ²(1) = 35.9; P < .001). Analysis of the Russian troll hashtag showed that its messages were more political and divisive.

      Conclusions. Whereas bots that spread malware and unsolicited content disseminated antivaccine messages, Russian trolls promoted discord. Accounts masquerading as legitimate users create false equivalency, eroding public consensus on vaccination.

      Public Health Implications. Directly confronting vaccine skeptics enables bots to legitimize the vaccine debate. More research is needed to determine how best to combat bot-driven content.

      “a Twitter hashtag associated with Russian troll activity”
      and which hashtag would that be?

    • no #DOI can resist sci-hub :)

      This analysis is supplemented by a qualitative study of #VaccinateUS — a Twitter hashtag designed to promote discord using vaccination as a political wedge issue. #VaccinateUS tweets were uniquely identified with Russian troll accounts linked to the Internet Research Agency—a company backed by the Russian government specializing in online influence operations.[20]

      [20] Popken B. Twitter deleted Russian troll tweets. So we published more than 200,000 of them. Available at:
      https://www.nbcnews.com/tech/social-media/now-available-more-200-000-deleted-russian-troll-tweets-n844731
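
      A minimal sketch, assuming the SciPy library, of the kind of chi-squared comparison of tweeting rates the abstract reports; the counts below are hypothetical placeholders, not the study’s data:

      ```python
      # Hedged sketch: a chi-squared test of proportions, the kind of comparison
      # the AJPH abstract reports (e.g. trolls vs. average users tweeting about
      # vaccines). The counts are hypothetical placeholders, NOT the study's data.
      from scipy.stats import chi2_contingency

      contingency = [
          # [vaccine-related tweets, other tweets]
          [120, 880],   # hypothetical troll accounts
          [40, 960],    # hypothetical average users
      ]

      chi2, p, dof, _expected = chi2_contingency(contingency, correction=False)
      print(f"chi2({dof}) = {chi2:.1f}, p = {p:.3g}")
      ```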

  • The Follower Factory - The New York Times
    https://www.nytimes.com/interactive/2018/01/27/technology/100000005704904.app.html

    All these accounts belong to customers of an obscure American company named Devumi that has collected millions of dollars in a shadowy global marketplace for social media fraud. Devumi sells Twitter followers and retweets to celebrities, businesses and anyone who wants to appear more popular or exert influence online. Drawing on an estimated stock of at least 3.5 million automated accounts, each sold many times over, the company has provided customers with more than 200 million Twitter followers, a New York Times investigation found.

    The accounts that most resemble real people, like Ms. Rychly, reveal a kind of large-scale social identity theft. At least 55,000 of the accounts use the names, profile pictures, hometowns and other personal details of real Twitter users, including minors, according to a Times data analysis.

    (Photo caption) Jessica Rychly, whose social identity was stolen by a Twitter bot when she was in high school.

    “I don’t want my picture connected to the account, nor my name,” Ms. Rychly, now 19, said. “I can’t believe that someone would even pay for it. It is just horrible.”

    Interesting to see the concept of an influence industry being adopted and accepted here

    These accounts are counterfeit coins in the booming economy of online influence, reaching into virtually any industry where a mass audience — or the illusion of it — can be monetized. Fake accounts, deployed by governments, criminals and entrepreneurs, now infest social media networks. By some calculations, as many as 48 million of Twitter’s reported active users — nearly 15 percent — are automated accounts designed to simulate real people, though the company claims that number is far lower.

    In November, Facebook disclosed to investors that it had at least twice as many fake users as it previously estimated, indicating that up to 60 million automated accounts may roam the world’s largest social media platform. These fake accounts, known as bots, can help sway advertising audiences and reshape political debates. They can defraud businesses and ruin reputations. Yet their creation and sale fall into a legal gray zone.

    I really like “The Influence Economy”

    The Influence Economy

    Last year, three billion people logged on to social media networks like Facebook, WhatsApp and China’s Sina Weibo. The world’s collective yearning for connection has not only reshaped the Fortune 500 and upended the advertising industry but also created a new status marker: the number of people who follow, like or “friend” you. For some entertainers and entrepreneurs, this virtual status is a real-world currency. Follower counts on social networks help determine who will hire them, how much they are paid for bookings or endorsements, even how potential customers evaluate their businesses or products.

    High follower counts are also critical for so-called influencers, a budding market of amateur tastemakers and YouTube stars where advertisers now lavish billions of dollars a year on sponsorship deals. The more people influencers reach, the more money they make. According to data collected by Captiv8, a company that connects influencers to brands, an influencer with 100,000 followers might earn an average of $2,000 for a promotional tweet, while an influencer with a million followers might earn $20,000.

    Influencers need not be well known to rake in endorsement money. According to a recent profile in the British tabloid The Sun, two young siblings, Arabella and Jaadin Daho, earn a combined $100,000 a year as influencers, working with brands such as Amazon, Disney, Louis Vuitton and Nintendo. Arabella, who is 14, tweets under the name Amazing Arabella.

    But her Twitter account — and her brother’s — are boosted by thousands of retweets purchased by their mother and manager, Shadia Daho, according to Devumi records. Ms. Daho did not respond to repeated attempts to reach her by email and through a public relations firm.

    “I don’t know why they’d take my identity — I’m a 20-year-old college student,” Mr. Dodd said. “I’m not well known.” But even unknown, Mr. Dodd’s social identity has value in the influence economy. At prices posted in December, Devumi was selling high-quality followers for under two cents each. Sold to about 2,000 customers — the rough number that many Devumi bot accounts follow — his social identity could bring Devumi around $30.
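
    As a back-of-the-envelope illustration of the arithmetic behind these figures, here is a short sketch using only the numbers quoted above; the 1.5-cent price is an assumed point under the article’s “under two cents each” ceiling:

    ```python
    # Back-of-the-envelope arithmetic for the influence-economy figures quoted above.
    # Only the quoted numbers are used; the per-follower price is an assumed value
    # under the article's "under two cents each" ceiling.

    # Captiv8's reported sponsorship rates imply a rough value per follower reached.
    rate_at_100k = 2_000 / 100_000        # $0.02 per follower for a promoted tweet
    rate_at_1m = 20_000 / 1_000_000       # also $0.02: the rate scales linearly

    # Devumi's side: one bot identity follows roughly 2,000 paying customers,
    # each follower sold for a bit under two cents.
    assumed_price_per_follower = 0.015    # assumption: midpoint under the $0.02 ceiling
    customers_followed = 2_000
    revenue_per_identity = assumed_price_per_follower * customers_followed  # about $30

    print(f"Per-follower sponsorship value: ${rate_at_100k:.2f} (100k tier), ${rate_at_1m:.2f} (1M tier)")
    print(f"Implied revenue from one resold bot identity: ~${revenue_per_identity:.0f}")
    ```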

    #Industrie_influence #Twitter #Followers