• Facebook suspends data firm hired by Vote Leave over alleged Cambridge Analytica ties
    https://www.theguardian.com/us-news/2018/apr/06/facebook-suspends-aggregate-iq-cambridge-analytica-vote-leave-brexit

    AggregateIQ, which played a pivotal role in the Brexit campaign, was suspended after reports it may have improperly obtained user data. Facebook has suspended the Canadian data firm with which the official Vote Leave campaign spent 40% of its budget, as the Cambridge Analytica scandal continues to unfold. On Friday, Facebook announced it had suspended AggregateIQ (AIQ) from its platform following reports the company may be connected to Cambridge Analytica’s parent company, SCL. In its (...)

    #AggregateIQ #CambridgeAnalytica #Facebook #algorithme #élections #manipulation #données #BigData (...)

    ##profiling

  • “The Cambridge Analytica scandal is not a technical flaw but a political problem” - Idées - Télérama.fr
    http://www.telerama.fr/idees/le-scandale-cambridge-analytica-nest-pas-une-faille-technique-mais-un-probl

    Interview with Zeynep Tufekci

    This incident, you wrote in the New York Times, is “a natural consequence of Facebook’s business model”…

    Facebook and Google attract most of the attention because they are behemoths, but let’s not forget that the entire model of online advertising – and that of a majority of media outlets – rests on the same foundations. Everywhere the principle is the same: wherever you click, you are tracked, targeted, monetized and sold to the highest bidder. The pages you visit, the content you post, all your digital traces are used for commercial purposes. Whether it is Cambridge Analytica, a would-be dictator or a brand of vacuum cleaners matters little, because it is a completely asymmetrical system in which you do not know who is placing the orders. That is the major problem with the Internet today. In this “attention economy”, Facebook can count on an infrastructure without equal. Thanks to it, the platform can reach two billion users, screen by screen, without their even realizing it.

    Should we fear more episodes of this kind?

    Obviously. It is mechanically impossible to predict how our data will be used in the years to come. It is a bottomless pit! Even if you are not on Facebook, a gigantic quantity of information about you circulates and makes it possible to profile you. Thanks to advances in artificial intelligence, algorithms are able to analyze your friendships, your activity, your consumption habits. We probably all appear in commercial databases whose existence we are unaware of, linked and cross-referenced with other databases we know nothing about either. In the Cambridge Analytica case, the vast majority of the people whose data was siphoned off had no idea what was happening.

    Are we reacting so late because of this opacity?

    For an ordinary person it is extremely difficult to react, because this collection is invisible, odorless and colorless. As an Internet user, you see nothing but the content displayed on your screen.

    In that regard, what do you make of Mark Zuckerberg’s reaction?

    He apologized half-heartedly because he had no choice. But he still cast himself as a victim, as if he had been duped by a rogue third party that broke the rules of a game he himself created. I think we should not take any company at its word. We need oversight and protection mechanisms. Take cars: they can have accidents or pose risks to the environment. To counter these negative factors, governments imposed speed limits, seat belts and environmental standards. These changes did not happen by themselves; they had to be imposed. And when a company does not respect these rules, it is sanctioned. The economy built on exploiting data is still a Wild West waiting to be civilized.

    In recent weeks, calls to quit Facebook have multiplied. Is that a viable option?

    It can only be an individual decision. It is everyone’s absolute right, but it is a luxury that will not solve the problem: in many countries, Facebook is the only way to communicate with family or friends, and it is an important vehicle for social organization. It would be better to think about breaking up Facebook while also weighing the possible consequences: if we do not deeply reform the Web’s business model, legions of little Facebooks could prove even more harmful than one centralized platform…

    #Zeynep_Tufekci #Facebook #Cambridge_analytica #Vie_privée #Données_personnelles

  • #Facebook, emblem of “surveillance capitalism”
    https://www.mediapart.fr/journal/international/070418/facebook-embleme-du-capitalisme-de-surveillance

    The hijacking of 87 million Facebook profiles by #Cambridge_Analytica in order to influence the American presidential campaign has plunged Mark Zuckerberg’s company into an unprecedented crisis, confronting it with a seemingly insoluble dilemma: how can it reassure investors and users when its business model rests, by definition, on the massive collection of personal data?

    #International #capitalisme_de_surveillance #GAFAM

    • Even though Cambridge Analytica did indeed make off with some 87 million profiles, it is in fact almost the entire digital economy that rests on the massive collection of user data. Before the scandal broke, the company actually boasted of holding profiles on 220 million Americans, data obtained legally from the country’s many “data brokers”. This business model, to which Facebook but also Google and Twitter are irremediably tied, has been theorized under the expression “surveillance capitalism”.

      The concept was first defined in an article by the academics John Bellamy Foster and Robert W. McChesney, published in July 2014 on the Monthly Review site. In it, the authors present surveillance capitalism as the natural evolution of the military-industrial complex denounced by President Dwight Eisenhower in his famous January 1961 speech. They recall the military origins of the Net and the constant efforts of the military to control research and high-tech industries.

      The Monthly Review article in question, which I had not seen before: ▻https://seenthis.net/messages/328312

    • Along with the concept of #surplus_comportemental (behavioral surplus) developed by https://twitter.com/shoshanazuboff
      “[...] users are neither buyers nor sellers nor products. Users are a source of free raw material [...]”

      In an article published in 2016 in the Frankfurter Allgemeine Zeitung, she explains how “capitalism has been hijacked by a lucrative surveillance project”. This hijacking took place, Shoshana Zuboff recounts, when Google realized the profit it could make from its users’ behavioral data, data she refers to as “behavioral surplus”.

      “Google’s dramatic success in ‘matching’ ads to pages revealed the transformational value of this behavioral surplus as a means of generating revenue and, ultimately, of turning investment into capital,” writes Shoshana Zuboff. “In this context, users are no longer an end in themselves. Instead, they have become a means to profit in a new kind of market in which users are neither buyers nor sellers nor products. Users are a source of free raw material that feeds a new kind of manufacturing process,” the professor continues. “The game is no longer about sending you a mail-order catalogue or even about targeting online advertising. The game is selling access to the real-time flow of your daily life – your reality – in order to influence it directly and modify your behavior for profit.”

    • The article’s references are more than worth a look:
      – John Bellamy Foster and Robert W. McChesney on the Monthly Review site, on the military-industrial complex and #capitalisme_de_surveillance
      https://monthlyreview.org/2014/07/01/surveillance-capitalism

      – Nicholas Confessore and Matthew Rosenberg in the NY Times on the link between #Cambridge_Analytica and #Palantir
      https://www.nytimes.com/2018/03/27/us/cambridge-analytica-palantir.html

      – Shoshana Zuboff in the Frankfurter Allgemeine Zeitung on #capitalisme_de_surveillance
      http://www.faz.net/aktuell/feuilleton/debatten/the-digital-debate/shoshana-zuboff-secrets-of-surveillance-capitalism-14103616.html
      => Article recently translated by the #Framalang team
      https://framablog.org/2017/03/28/google-nouvel-avatar-du-capitalisme-celui-de-la-surveillance

  • Cambridge Analytica holds the data of 61,000 Belgians
    http://www.lalibre.be/actu/international/cambridge-analytica-possede-les-donnees-de-61-000-belges-5ac51fdecd709bfa6b2

    Nearly 61,000 Belgians may be among the 87 million users of the social network Facebook whose data was exploited without their knowledge by the firm Cambridge Analytica, Facebook’s spokesperson for the Benelux, Tineke Meijerman, told Belga on Wednesday evening. The personal data of 60,957 Belgians was, “probably”, the spokesperson stresses, harvested through the installation of the app in question by just eight people. On Wednesday the social network Facebook revised upward the (...)

    #CambridgeAnalytica #Facebook #algorithme #thisisyourdigitallife #données #BigData

  • After the Facebook scandal it’s time to base the digital economy on public v private ownership of data
    https://www.theguardian.com/technology/2018/mar/31/big-data-lie-exposed-simply-blaming-facebook-wont-fix-reclaim-private-i

    The data-mining scandal offers a unique chance to reclaim our private information and use it in a way that will benefit us all. The continuing collapse of public trust in Facebook is welcome news to those of us who have been warning about the perils of “data extractivism” for years. It’s reassuring to have final, definitive proof that beneath Facebook’s highfalutin rhetoric of “building a global community that works for all of us” lies a cynical, aggressive project – of building a global data (...)

    #Alphabet #CambridgeAnalytica #Microsoft #Amazon #Facebook #algorithme #domination #BigData (...)

    ##marketing

  • “Those who hold the data hold the power”
    http://www.lepoint.fr/chroniqueurs-du-point/laurence-neuer/ceux-qui-detiennent-les-donnees-possedent-le-pouvoir-23-03-2018-2204904_56.p

    “L’Empire des données”, by Adrien Basdevant and Jean-Pierre Mignard, calls on us to question the algorithms that format our profiles and map out our lives. In a single year, more data has been collected than since the beginning of human history. And this trend will grow exponentially with the roughly 100 billion connected objects expected by 2025. Shared, interpreted, reprocessed, our data no longer belongs to us. Has the expression “personal data” (...)

    #CambridgeAnalytica #Facebook #algorithme #bénéfices #justice #criminalité #données #BigData #data-mining #marketing #profiling (...)

    ##criminalité ##domination

  • Why (almost) everything reported about the Cambridge Analytica Facebook ‘hacking’ controversy is wrong
    https://medium.com/@CKava/why-almost-everything-reported-about-the-cambridge-analytica-facebook-hackin

    First, there was no hack.

    The data collected was scraped from #Facebook user profiles, after users granted permission for a third-party app to access their data. You know those little confirmation windows that pop up when someone wants to play Candy Crush, or use Facebook to log in to a random site rather than make a new password? Yeah, those.

    A Cambridge academic called Aleksandr Kogan — NOT Cambridge Analytica and NOT the whistleblower Christopher Wylie — made a ‘Test Your Personality’ app, helped to promote it by paying people $2–4 via Amazon’s Mechanical Turk crowdsourcing site to install it, and used the permissions granted to harvest profile data. 270,000 users installed the app, so you might expect that 270,000 profiles were collected, but the app actually collected data from 50 million profiles.

    50 million?!?

    Yes. You see, back in the heady days of 2014, Facebook had a feature called ‘friends permission’ that allowed developers to access the profiles of not only the person who installed their app but all their friends too. The only way to prevent this from happening was to have toggled a privacy setting, which few Facebook users even knew existed (here is a blog from 2012 explaining how to do so). The friends permission feature is how Kogan multiplied 270,000 permissions into 50 million profiles’ worth of data.

    (…)
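
    To make the ‘friends permission’ fan-out described above concrete, here is a minimal sketch of how that pre-2015 flow worked, assuming the old Graph API v1.0 endpoints (long since shut down) and a token granted with the since-removed friends_* permissions such as friends_likes; the token value and requested fields are placeholders, not anything taken from the article.

    ```python
    import requests

    GRAPH = "https://graph.facebook.com"  # pre-2015 Graph API v1.0, no longer available
    TOKEN = "USER_ACCESS_TOKEN"           # token the quiz app obtained from one consenting installer

    def get(path, **params):
        """Small helper for GET calls against the old Graph API."""
        params["access_token"] = TOKEN
        return requests.get(f"{GRAPH}/{path}", params=params).json()

    # 1. One installer grants the app the (since removed) friends_* permissions.
    installer = get("me", fields="id,name")

    # 2. Under API v1.0, /me/friends returned the installer's full friend list ...
    friends = get("me/friends").get("data", [])

    # 3. ... and the app could then pull each friend's profile fields and likes,
    #    even though those friends never installed or approved the app themselves.
    profiles = {}
    for friend in friends:
        profiles[friend["id"]] = {
            "profile": get(friend["id"], fields="name,gender,hometown"),
            "likes": get(f"{friend['id']}/likes").get("data", []),
        }

    print(f"1 consenting installer -> {len(profiles)} friend profiles collected")
    ```

    That one-to-many fan-out is the whole trick: a few hundred thousand installs multiplied into tens of millions of profiles.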

    The real story, then, is not that Kogan, Wylie, and Cambridge Analytica developed some incredibly high-tech ‘hack’ of Facebook. It is that, aside from Kogan’s data selling, they used methods that were commonplace and permitted by Facebook prior to 2015. Cambridge Analytica has, since the story broke, been outed as a rather obnoxious, unethical company, at least in how it promotes itself to potential clients. But the majority of what is being reported in the media about its manipulative power is just an uncritical regurgitation of Cambridge Analytica’s (and Chris Wylie’s) self-promotional claims. The problem is that there is little evidence that the company can do what it claims and plenty of evidence that it is not as effective as it likes to pretend; see the fact that Ted Cruz is not currently president.

    No one is totally immune to marketing or political messaging, but there is little evidence that Cambridge Analytica is better than other similar PR or political canvassing companies at targeting voters. Political targeting and disinformation campaigns, including those promoted by Russia, certainly had an impact on recent elections, but were they the critical factor? Did they have a bigger impact than Comey announcing he was ‘reopening’ the Hillary email investigation the week before the US election? Or Brexiteers claiming that the UK was sending £350 million a week to the EU that could go to the NHS instead? Colour me skeptical.

    To be crystal clear, I’m not arguing that Cambridge Analytica and Kogan were innocent. At the very least, it is clear they were doing things that were contrary to Facebook’s data sharing policies. And similarly Facebook seems to have been altogether too cavalier with permitting developers to access its users’ private data.

    What I am arguing is that #Cambridge_Analytica are not the puppet masters they are being widely portrayed as. If anything, they are much more akin to Donald Trump: making wildly exaggerated claims about their abilities and getting lots of #attention as a result.

  • #eelo is more than tech, it’s a societal project for Freedom and Democracy
    https://hackernoon.com/eelo-is-more-than-tech-its-a-societal-project-for-freedom-and-democracy-

    As Lawrence Lessig explained on January 1st, 2000, Code is Law! Computer systems, and more specifically software and data networks, have been driving the way the world has evolved recently. Software is now everywhere: in cars, trains and planes, in your house, in businesses and in industry. Smartphones are taking over our lives. Since 2007, software has also taken control in our personal lives: smartphones have become our companions of life. They empower us with new abilities. They help us find information quickly, they help us with directions, they help us to communicate quickly and at a low cost with other people anywhere in the world. The digital age we’re presently living in is a “far west quest”. The few who understand how things work are releasing products which often gain quick and (...)

    #cambridge-analytica #facebook #google #privacy

  • Privacy: after Cambridge Analytica, Facebook’s trompe-l’œil announcements
    http://www.lemonde.fr/pixels/article/2018/03/28/vie-privee-apres-cambridge-analytica-les-annonces-en-trompe-l-il-de-facebook

    After the Cambridge Analytica scandal, the social network announced privacy measures, most of which are old or imposed by European law. When a government has trouble pushing through a reform, it is often a problem of “pedagogy”. For Facebook it is much the same story: as the social network struggles to put out the Cambridge Analytica scandal, it presented, on Wednesday 28 March, measures “to help [its] users better understand [its] tools”. “The (...)

    #CambridgeAnalytica #Facebook #algorithme #élections #manipulation #électeurs #données #BigData (...)

    ##terms

  • Cambridge Analytica: Facebook postpones the unveiling of its smart speakers
    http://www.lefigaro.fr/secteur/high-tech/2018/03/28/32001-20180328ARTFIG00153-cambridge-analytica-facebook-reporte-la-presentat

    Mired in scandal after the revelation that the data of 50 million users had leaked, Facebook is postponing the unveiling of its smart speakers, powerful data-collection tools. A matter of timing. Following the Cambridge Analytica scandal and the revelation that call logs and SMS messages were being recorded, Facebook will not unveil its new products at its big “F8” conference on May 1 after all. The American giant’s two smart speakers, whose launch was (...)

    #Apple #CambridgeAnalytica #Google #Amazon #Facebook #Alexa #Echo #domotique #Look #Home #biométrie #facial #BigData #profiling (...)

    ##algorithme

  • Why the Cambridge Analytica Scandal Is a Watershed Moment for Social Media - Knowledge@Wharton
    http://knowledge.wharton.upenn.edu/article/fallout-cambridge-analytica

    “We’re experiencing a watershed moment with regard to social media,” said Aral. “People are now beginning to realize that social media is not just either a fun plaything or a nuisance. It can have potentially real consequences in society.”

    The Cambridge Analytica scandal underscores how little consumers know about the potential uses of their data, according to Berman. He recalled a scene in the film Minority Report where Tom Cruise enters a mall and sees holograms of personally targeted ads. “Online advertising today has reached about the same level of sophistication, in terms of targeting, and also some level of prediction,” he said. “It’s not only that the advertiser can tell what you bought in the past, but also what you may be looking to buy.”

    Consumers are partially aware of that because they often see ads that show them products they have browsed, or websites they have visited, and these ads “chase them,” Berman said. “What consumers may be unaware of is how the advertiser determines what they’re looking to buy, and the Cambridge Analytica exposé shows a tiny part of this world.”

    A research paper that Nave recently co-authored captures the potential impact of the kind of work Cambridge Analytica did for the Trump campaign. “On the one hand, this form of psychological mass persuasion could be used to help people make better decisions and lead healthier and happier lives,” it stated. “On the other hand, it could be used to covertly exploit weaknesses in their character and persuade them to take action against their own best interest, highlighting the potential need for policy interventions.”

    Nave said the Cambridge Analytica scandal exposes exactly those types of risks, even as they existed before the internet era. “Propaganda is not a new invention, and neither is targeted messaging in marketing,” he said. “What this scandal demonstrates, however, is that our online behavior exposes a lot about our personality, fears and weaknesses – and that this information can be used for influencing our behavior.”

    In Golbeck’s research projects involving the use of algorithms, she found that people “are really shocked that we’re able to get these insights like what your personality traits are, what your political preferences are, how influenced you can be, and how much of that data we’re able to harvest.”

    Even more shocking, perhaps, is how easy it is to find the data. “Any app on Facebook can pull the kind of data that Cambridge Analytica did – they can [do so] for all of your data and the data of all your friends,” said Golbeck. “Even if you don’t install any apps, if your friends use apps, those apps can pull your data, and then once they have that [information] they can get these extremely deep, intimate insights using artificial intelligence, about how to influence you, how to change your behavior.” But she draws a line there: “It’s one thing if that’s to get you to buy a pair of shoes; it’s another thing if it’s to change the outcome of an election.”

    “Facebook has tried to play both sides of [the issue],” said Golbeck. She recalled a study by scientists from Facebook and the University of California, San Diego, that claimed social media networks could have “a measurable if limited influence on voter turnout,” as The New York Times reported. “On one hand, they claim that they can have a big influence; on the other hand they want to say ‘No, no, we haven’t had any impact on this.’ So they are going to have a really tough act to play here, to actually justify what they’re claiming on both sides.”

    Golbeck called for ways to codify how researchers could ethically go about their work using social media data, “and give people some of those rights in a broader space that they don’t have now.” Aral expected the solution to emerge in the form of “a middle ground where we learn to use these technologies ethically in order to enhance our society, our access to information, our ability to cooperate and coordinate with one another, and our ability to spread positive social change in the world.” At the same time, he advocated tightening use requirements for the data, and bringing back “the notion of informed consent and consent in a meaningful way, so that we can realize the promise of social media while avoiding the peril.”

    Historically, marketers could collect individual data, but with social platforms, they can now also collect data about a user’s social contacts, said Berman. “These social contacts never gave permission explicitly for this information to be collected,” he added. “Consumers need to realize that by following someone or connecting to someone on social media, they also expose themselves to marketers who target the followed individual.”

    In terms of safeguards, Berman said it is hard to know in advance what a company will do with the data it collects. “If they use it for normal advertising, say toothpaste, that may be legitimate, and if they use it for political advertising, as in elections, that may be illegitimate. But the data itself is the same data.”

    According to Berman, most consumers, for example, don’t know that loyalty cards are used to track their behavior and that the data is sold to marketers. Would they stop using these cards if they knew? “I am not sure,” he said. “Research shows that people in surveys say they want to maintain their privacy rights, but when asked how much they’re willing to give up in customer experience – or to pay for it – the result is not too much. In other words, there’s a difference between how we care about privacy as an idea, and how much we’re willing to give up to maintain it.”

    Golbeck said tools exist for users to limit the amount of data they let reside on social media platforms, including one called Facebook Timeline Cleaner, and a “tweet delete” feature on Twitter. “One way that you can make yourself less susceptible to some of this kind of targeting is to keep less data there, delete stuff more regularly, and treat it as an ephemeral platform,” she said.
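
    As a concrete illustration of the “treat it as an ephemeral platform” advice, here is a small, hypothetical sketch that bulk-deletes old tweets with the third-party Tweepy client. It is not the tool Golbeck mentions, it assumes you still have Twitter/X API v1.1 access (now behind a paid tier), and the credentials are placeholders.

    ```python
    import datetime
    import tweepy  # third-party client; assumes API v1.1 access and app credentials

    # Placeholder credentials -- not anything from the article.
    auth = tweepy.OAuth1UserHandler(
        "CONSUMER_KEY", "CONSUMER_SECRET", "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET"
    )
    api = tweepy.API(auth, wait_on_rate_limit=True)

    # Treat the platform as ephemeral: remove anything older than 30 days.
    cutoff = datetime.datetime.now(datetime.timezone.utc) - datetime.timedelta(days=30)

    for status in tweepy.Cursor(api.user_timeline, count=200).items():
        created = status.created_at
        if created.tzinfo is None:  # older Tweepy versions return naive UTC datetimes
            created = created.replace(tzinfo=datetime.timezone.utc)
        if created < cutoff:
            api.destroy_status(status.id)
            print("deleted", status.id, created.date())
    ```

    The same keep-less-data idea applies to any platform: the less you leave stored, the less there is for an app to harvest.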

    But is that credible? Social media also serve as a form of personal archive.

    #Facebook #Cambridge_analytica

  • So you want to quit Facebook, do you?
    http://www.makery.info/2018/03/27/alors-comme-ca-on-veut-quitter-facebook

    The Cambridge Analytica affair has the social network reeling. What strategy should you adopt after these revelations: delete your account, demand your data, switch to free software? We knew there was trouble at Facebook. But this takes the cake. Ever since The Guardian, The Observer and the New York Times revealed the more-than-dubious practices of the firm Cambridge Analytica, the temptation to delete Facebook seems greater than ever in the community of 2.13 billion friends. (...)

    #CambridgeAnalytica #Facebook #algorithme #thisisyourdigitallife #élections #manipulation #électeurs #comportement #données #publicité #BigData #marketing #prédictif (...)

    ##publicité ##profiling

  • Life Inside S.C.L., Cambridge Analytica’s Parent Company | The New Yorker
    https://www.newyorker.com/news/letter-from-the-uk/life-inside-scl-cambridge-analyticas-parent-company

    There were times during our conversation when the employee seemed as bemused as anybody that a company that was started in the early nineteen-nineties with some intuitive but eccentric ideas about group psychology—one of Oakes’s first ventures was selling aromas to stores, to persuade customers to buy more—was now at the center of a transatlantic conversation about voter rights, data privacy, and the integrity of the world’s most important social network. But the employee was also clear that access to big data, particularly in the form of Facebook, combined with S.C.L.’s long interest in psychological profiling and audience segmentation, had been able to equip political campaigns with digital weapons that most voters were unaware of. “You can get philosophical about this and say that Facebook being an advertising platform masquerading as a social platform is the start of the rot and the tool was always there,” the employee said. S.C.L.’s executives were the wrong people who came along at the wrong time. “There were always going to be dodgy fuckers willing to work for rich people, and the S.C.L. was just an example of the dodgy fucker.” (S.C.L. did not immediately respond to a request for comment, but the company claims that it destroyed the Facebook data, in October, 2015, and that it played no role in the Presidential election.)

    The employee welcomed the current attention on S.C.L.’s methodology and behavior, whether it was illegal, or whether it should have been. The leaders of the company were not interested in these questions. “Alexander is not constrained by the sort of worries we are seeing expressed right now,” the employee said. “It really is about getting money together.” The employee continued, “What is wonderful about now is this bit of it is being opened, and I think it is bloody important, because something as catastrophic as Brexit and Trump—the technical possibility of that—is achieved through this dark shit. And this dark shit can be done by fucking cowboys. And, for lots of people who worked for the organization, it wasn’t supposed to be this way.”

    Some of S.C.L.’s methods had merit. Oakes’s insight in forming the B.D.I. was to aim messages at social groups—rather than at individuals—and to place a low expectation on persuading people to change their minds. In a “classic S.C.L. project,” the employee explained, the company would use subcontractors, survey companies, and academics in the run-up to an election to create what it called a “super sample.” “We would speak to sixty thousand people, and we wouldn’t say, ‘Who are you going to vote for?’ ” the employee said. “We would say, ‘How do you feel about life?’ ” S.C.L.’s data concentrated on local concerns, such as housing, water shortages, or tribal conflict. “With all of that, we would delineate a strategy for them to win by focussing on targeted groups that we had identified within the population,” the employee said. “It is not so much, let’s make these people do this thing; it is, can we take this thing in such a way that the people who should get it do get it?”
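
    The group-level targeting the employee describes is, mechanically, ordinary audience segmentation: cluster survey respondents on their attitudes, then aim messages at clusters rather than at individuals. Below is a generic sketch on synthetic answers; it is not S.C.L.’s actual pipeline, and the sample size, question count and number of segments are assumptions.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)

    # Stand-in for a "super sample": 60,000 respondents answering 1-5 attitude
    # questions about local concerns such as housing, water or tribal conflict.
    n_respondents, n_questions = 60_000, 12
    responses = rng.integers(1, 6, size=(n_respondents, n_questions)).astype(float)

    # Segment the population into a handful of groups; messages are then aimed
    # at groups, not individuals, which is the approach the employee describes.
    X = StandardScaler().fit_transform(responses)
    segments = KMeans(n_clusters=6, n_init=10, random_state=1).fit_predict(X)

    for seg in range(6):
        members = responses[segments == seg]
        print(f"segment {seg}: {len(members):>6} people, mean answers {members.mean(axis=0).round(2)}")
    ```

    On real survey data the cluster profiles (rather than the arbitrary ones random numbers produce here) are what a campaign would write messages against.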

    #Facebook #Cambridge_analytica

  • ICE Uses Facebook Data to Find and Track Suspects, Internal Emails Show
    https://theintercept.com/2018/03/26/facebook-data-ice-immigration

    Cambridge Analytica may have had access to the personal information of tens of millions of unwitting Americans, but a genuine debate has emerged about whether the company had the sophistication to put that data effectively to use on behalf of Donald Trump’s presidential campaign. But one other organization that has ready access to Facebook’s trove of personal data has a much better track record of using such information effectively: U.S. Immigration and Customs Enforcement. ICE, the (...)

    #ICE #Facebook #données #migration #surveillance #géolocalisation #CambridgeAnalytica #ACLU (...)

    ##Palantir

  • The Aggregate IQ Files, Part One: How a Political Engineering Firm Exposed Their Code Base
    https://www.upguard.com/breaches/aggregate-iq-part-one

    The UpGuard Cyber Team’s latest discovery of a data leak, involving the exposed IT assets of a data analytics firm based in British Columbia, Canada, presents significant questions for society about how technology can be used. In this first installment of a multipart series titled “The AIQ Files,” we begin to explain the importance of the data revealed from a publicly exposed AggregateIQ repository, and how it relates to recent US political history. Coming amidst a firestorm of scrutiny (...)

    #AggregateIQ #CambridgeAnalytica #Facebook #algorithme #élections #manipulation #électeurs #prédictif #profiling (...)

    ##hacking

  • Electoral targeting: AggregateIQ’s software exposed online, researchers say
    https://www.nextinpact.com/brief/ciblage-electoral---les-logiciels-d-aggregateiq-exposes-en-ligne--selon-

    The security firm UpGuard says it has discovered a data leak at AggregateIQ, a Canadian company specializing in data analysis. According to the “exposed” code repository, the company supplied the voter-targeting tools used during Republican Ted Cruz’s presidential campaign in 2016. The GitLab repository, online for a time, includes data-management applications, ad trackers and databases... as well as credentials, keys, usernames and passwords (...)

    #AggregateIQ #CambridgeAnalytica #Facebook #algorithme #élections #manipulation #électeurs #prédictif (...)

    ##profiling

  • Facebook logs SMS texts and calls, users find as they delete accounts
    https://www.theguardian.com/technology/2018/mar/25/facebook-logs-texts-and-calls-users-find-as-they-delete-accounts-cambri

    Leaving the social network after the Cambridge Analytica scandal, users discover the extent of the data held on them. As users continue to delete their Facebook accounts in the wake of the Cambridge Analytica scandal, a number are discovering that the social network holds far more data about them than they expected, including complete logs of incoming and outgoing calls and SMS messages. The #deletefacebook movement took off after the revelations that Facebook had shared with a Cambridge psychologist the (...)

    #CambridgeAnalytica #Facebook #terms #données #marketing #profiling

  • The US trade regulator, the FTC, opens an investigation into Facebook
    http://www.lemonde.fr/pixels/article/2018/03/26/la-ftc-lance-une-enquete-sur-facebook_5276697_4408996.html

    The US federal trade agency will investigate the social network’s use of its members’ personal data. Storm clouds are gathering over Facebook. The belated apologies and promises of its boss Mark Zuckerberg last week have not extinguished the controversy over the improper use of millions of users’ personal data by the British firm Cambridge Analytica. The consumer protection bureau of the Federal Trade Commission (FTC, (...)

    #CambridgeAnalytica #Facebook #algorithme #thisisyourdigitallife #élections #manipulation #électeurs #prédictif #marketing #profiling (...)

    ##FTC

  • Antoinette Rouvroy: “In my view, Zuckerberg is out of his depth”
    https://www.lecho.be/opinions/general/antoinette-rouvroy-a-mon-sens-zuckerberg-est-depasse/9995228.html

    A jurist and philosopher of law at the University of Namur and a researcher at the FNRS, Antoinette Rouvroy studies “algorithmic governmentality”. Her research centres on the question of norms and power in the age of the digital revolution. She is sounding the alarm: the abusive use of big data undermines our freedom… And she comments, for L’Echo, on the Cambridge Analytica affair, named after the British company that improperly exploited the personal data of 50 million (...)

    #CambridgeAnalytica #Facebook #algorithme #élections #électeurs #domination #BigData #prédictif #marketing (...)

    ##profiling

    • The world of the web is opaque. In itself, collecting information is already a form of deception. What poses a problem is the presence of third parties who use the platform to offer other services. For example, you answer a quiz like “Which Star Wars character are you most like?”, but you do not know that this data will then be used. The other problem is that we feel we are taking part in a community when in reality there is no exchange. Everyone is alone in front of their screen. Facebook is not a public space. We are witnessing a hypertrophy of the private sphere that, paradoxically, is characterized by depersonalization. No collective awareness can emerge.

  • Cambridge Analytica at the heart of a political, media and judicial storm
    http://www.lemonde.fr/pixels/article/2018/03/23/perquisition-au-siege-londonien-de-l-entreprise-cambridge-analytica_5275716_

    The British data protection regulator wants to examine the data held by the company, which is accused of having siphoned off, without their knowledge, the personal data of millions of Facebook users. British authorities searched Cambridge Analytica’s London offices on the evening of Friday 23 March. After lengthy legal argument, a judge finally granted a search warrant shortly after 7 pm, at the request of the Information Commissioner’s Office (...)

    #CambridgeAnalytica #Facebook #algorithme #thisisyourdigitallife #élections #manipulation #domination #électeurs #BigData #marketing (...)

    ##prédictif

  • Cambridge Analytica demonstrates that Facebook needs to give researchers more access.
    https://slate.com/technology/2018/03/cambridge-analytica-demonstrates-that-facebook-needs-to-give-researchers-more

    In a 2013 paper, psychologist Michal Kosinski and collaborators from the University of Cambridge in the United Kingdom warned that “the predictability of individual attributes from digital records of behavior may have considerable negative implications,” posing a threat to “well-being, freedom, or even life.” This warning followed their striking findings about how accurately the personal attributes of a person (from political leanings to intelligence to sexual orientation) could be inferred from nothing but their Facebook likes. Kosinski and his colleagues had access to this information through the voluntary participation of Facebook users, who were offered the results of a personality quiz in return, a method that can drive viral engagement. Of course, one person’s warning may be another’s inspiration.
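
    The mechanics behind that 2013 result are easy to sketch: represent each user as a sparse vector of page likes, reduce its dimensionality, and fit an ordinary classifier against a self-reported attribute from the quiz. The illustration below runs on synthetic data, and the TruncatedSVD-plus-logistic-regression choice is an assumption standing in for the paper’s exact pipeline, not a reproduction of it.

    ```python
    import numpy as np
    from scipy.sparse import random as sparse_random
    from sklearn.decomposition import TruncatedSVD
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic stand-in: 5,000 users x 20,000 pages, ~0.3% of the cells are likes.
    likes = sparse_random(5000, 20000, density=0.003, format="csr", random_state=0)
    likes.data[:] = 1.0

    # Synthetic binary attribute (standing in for a self-reported quiz answer),
    # driven by whether the user liked any of 50 "signal" pages, plus 10% label noise.
    signal_pages = rng.choice(20000, size=50, replace=False)
    liked_signal = likes[:, signal_pages].sum(axis=1).A.ravel() > 0
    noise = rng.random(5000) < 0.10
    y = (liked_signal ^ noise).astype(int)

    # Reduce the sparse like matrix, then fit one ordinary classifier per attribute.
    components = TruncatedSVD(n_components=100, random_state=0).fit_transform(likes)
    X_train, X_test, y_train, y_test = train_test_split(components, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    probs = model.predict_proba(X_test)[:, 1]
    print("AUC on held-out users:", round(roc_auc_score(y_test, probs), 3))
    ```

    On real like data, this same reduce-then-classify recipe is essentially what produced the predictive accuracy the authors warned about.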

    Kosinski’s original research really was an important scientific finding. The paper has been cited more than 1,000 times and the dataset has spawned many other studies. But the potential uses for it go far beyond academic research. In the past few days, the Guardian and the New York Times have published a number of new stories about Cambridge Analytica, the data mining and analytics firm best known for aiding President Trump’s campaign and the pro-Brexit campaign. This trove of reporting shows how Cambridge Analytica allegedly relied on the psychologist Aleksandr Kogan (who also goes by Aleksandr Spectre), a colleague of the original researchers at Cambridge, to gain access to profiles of around 50 million Facebook users.

    According to the Guardian’s and New York Times’ reporting, the data that was used to build these models came from a rough duplicate of that personality quiz method used legitimately for scientific research. Kogan, a lecturer in another department, reportedly approached Kosinski and their Cambridge colleagues in the Psychometric Centre to discuss commercializing the research. To his credit, Kosinski declined. However, Kogan built an app named thisisyourdigitallife for his own startup, Global Science Research, which collected the same sorts of data. GSR paid Mechanical Turk workers (contrary to the terms of Mechanical Turk) to take a psychological quiz and provide access to their Facebook profiles. In 2014, under a contract with SCL, the parent company of Cambridge Analytica, that data was harvested and used to build a model of 50 million U.S. Facebook users that allegedly included 5,000 data points on each user.

    So if the Facebook API allowed Kogan access to this data, what did he do wrong? This is where things get murky, but bear with us. It appears that Kogan deceitfully used his dual roles as a researcher and an entrepreneur to move data between an academic context and a commercial context, although the exact method of it is unclear. The Guardian claims that Kogan “had a licence from Facebook to collect profile data, but it was for research purposes only” and “[Kogan’s] permission from Facebook to harvest profiles in large quantities was specifically restricted to academic use.” Transferring the data this way would already be a violation of the terms of Facebook’s API policies that barred use of the data outside of Facebook for commercial uses, but we are unfamiliar with Facebook offering a “license” or special “permission” for researchers to collect greater amounts of data via the API.

    Regardless, it does appear that the amount of data thisisyourdigitallife was vacuuming up triggered a security review at Facebook and an automatic shutdown of its API access. Relying on Wylie’s narrative, the Guardian claims that Kogan “spoke to an engineer” and resumed access:

    “Facebook could see it was happening,” says Wylie. “Their security protocols were triggered because Kogan’s apps were pulling this enormous amount of data, but apparently Kogan told them it was for academic use. So they were like, ‘Fine’.”

    Kogan claims that he had a close working relationship with Facebook and that it was familiar with his research agendas and tools.

    A great deal of research confirms that most people don’t pay attention to permissions and privacy policies for the apps they download and the services they use—and the notices are often too vague or convoluted to clearly understand anyway. How many Facebook users give third parties access to their profile so that they can get a visualization of the words they use most, or to find out which Star Wars character they are? It isn’t surprising that Kosinski’s original recruitment method—a personality quiz that provided you with a psychological profile of yourself based on a common five-factor model—resulted in more than 50,000 volunteers providing access to their Facebook data. Indeed, Kosinski later co-authored a paper detailing how to use viral marketing techniques to recruit study participants, and he has written about the ethical dynamics of utilizing friend data.
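
    For context on what such a quiz computes before any modelling happens, a five-factor (Big Five) profile is typically just a per-trait average of 1–5 Likert answers, with some items reverse-keyed. The item-to-trait key below is invented for illustration; it is not the instrument Kosinski or Kogan actually used.

    ```python
    # Toy Big Five scorer: items are answered on a 1-5 Likert scale.
    # The item-to-trait key below is made up for illustration only.
    ITEM_KEY = {
        "openness":          [("q1", False), ("q6", True)],   # (item, reverse-keyed?)
        "conscientiousness": [("q2", False), ("q7", True)],
        "extraversion":      [("q3", False), ("q8", True)],
        "agreeableness":     [("q4", True),  ("q9", False)],
        "neuroticism":       [("q5", False), ("q10", True)],
    }

    def score_big_five(answers: dict) -> dict:
        """Average the 1-5 answers per trait, flipping reverse-keyed items (6 - x)."""
        scores = {}
        for trait, items in ITEM_KEY.items():
            values = [(6 - answers[q]) if reverse else answers[q] for q, reverse in items]
            scores[trait] = sum(values) / len(values)
        return scores

    print(score_big_five({f"q{i}": (i % 5) + 1 for i in range(1, 11)}))
    ```

    The returned trait scores are what the app hands back to the participant, and what later serve as training labels for models built on the harvested profile data.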

    #Facebook #Cambridge_analytica #Recherche

  • Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach | News | The Guardian
    https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election

    Paul-Olivier Dehaye, a data protection specialist, who spearheaded the investigative efforts into the tech giant, said: “Facebook has denied and denied and denied this. It has misled MPs and congressional investigators and it’s failed in its duties to respect the law.

    “It has a legal obligation to inform regulators and individuals about this data breach, and it hasn’t. It’s failed time and time again to be open and transparent.”

    “We exploited Facebook to harvest millions of profiles. And built models to exploit that and target their inner demons.”
    – Christopher Wylie

    A majority of American states have laws requiring notification in some cases of data breach, including California, where Facebook is based.

    Facebook denies that the harvesting of tens of millions of profiles by GSR and Cambridge Analytica was a data breach. It said in a statement that Kogan “gained access to this information in a legitimate way and through the proper channels” but “did not subsequently abide by our rules” because he passed the information on to third parties.

    Facebook said it removed the app in 2015 and required certification from everyone with copies that the data had been destroyed, although the letter to Wylie did not arrive until the second half of 2016. “We are committed to vigorously enforcing our policies to protect people’s information. We will take whatever steps are required to see that this happens,” Paul Grewal, Facebook’s vice-president, said in a statement. The company is now investigating reports that not all data had been deleted.

    Kogan, who has previously unreported links to a Russian university and took Russian grants for research, had a licence from Facebook to collect profile data, but it was for research purposes only. So when he hoovered up information for the commercial venture, he was violating the company’s terms. Kogan maintains everything he did was legal, and says he had a “close working relationship” with Facebook, which had granted him permission for his apps.

    Cost of the operation: $1 million.

    The Observer has seen a contract dated 4 June 2014, which confirms SCL, an affiliate of Cambridge Analytica, entered into a commercial arrangement with GSR, entirely premised on harvesting and processing Facebook data. Cambridge Analytica spent nearly $1m on data collection, which yielded more than 50 million individual profiles that could be matched to electoral rolls. It then used the test results and Facebook data to build an algorithm that could analyse individual Facebook profiles and determine personality traits linked to voting behaviour.

    The algorithm and database together made a powerful political tool. It allowed a campaign to identify possible swing voters and craft messages more likely to resonate.

    “The ultimate product of the training set is creating a ‘gold standard’ of understanding personality from Facebook profile information,” the contract specifies. It promises to create a database of 2 million “matched” profiles, identifiable and tied to electoral registers, across 11 states, but with room to expand much further.
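
    The “matched to electoral registers” step described in the contract is, at bottom, record linkage: joining a modelled profile to a voter-file row on identifying fields. Below is a minimal illustration with invented records; real voter files and the contract’s actual matching criteria are not spelled out in the article, and in practice fuzzier matching and blocking would be needed.

    ```python
    import pandas as pd

    # Invented example rows; real voter files carry many more columns.
    profiles = pd.DataFrame([
        {"name": "jane doe", "birth_year": 1984, "zip": "30301", "predicted_neuroticism": 3.8},
        {"name": "john roe", "birth_year": 1972, "zip": "73301", "predicted_neuroticism": 2.1},
    ])
    voter_roll = pd.DataFrame([
        {"name": "jane doe", "birth_year": 1984, "zip": "30301", "voter_id": "GA-0001", "turnout_2014": True},
        {"name": "jane doe", "birth_year": 1990, "zip": "30301", "voter_id": "GA-0002", "turnout_2014": False},
    ])

    # Exact match on normalized name + birth year + zip; the join itself is this simple.
    matched = profiles.merge(voter_roll, on=["name", "birth_year", "zip"], how="inner")
    print(matched[["voter_id", "predicted_neuroticism", "turnout_2014"]])
    ```

    Once a predicted trait sits in the same row as a voter ID and turnout history, a campaign can pick whom to contact and with what message, which is exactly the “powerful political tool” the article describes.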

    #Facebook #Cambridge_analytica