person:scott berinato

  • Why Data Privacy Based on Consent Is Impossible
    https://hbr.org/2018/09/stop-thinking-about-consent-it-isnt-possible-and-it-isnt-right

    For a philosopher, Helen Nissenbaum is a surprisingly active participant in shaping how we collect, use, and protect personal data. Nissenbaum, who earned her PhD from Stanford, is a professor of information science at Cornell Tech, New York City, where she focuses on the intersection of politics, ethics, and values in technology and digital media — the hard stuff. Her framework for understanding digital privacy has deeply influenced real-world policy.

    HBR senior editor Scott Berinato spoke with Nissenbaum about the concept of consent, a good definition of privacy, and why privacy is a moral issue. The following excerpts from their conversation have been edited for clarity and length.

    HBR: You often sound frustrated when you talk about the idea of consent as a privacy mechanism. Why?

    Nissenbaum: Oh, it’s just such a [long pause] — look, the operationalization of consent is just so, so crummy. For example, as part of GDPR, we’re now constantly seeing pop-ups that say, “Hey, we use cookies — click here.” This doesn’t help. You have no idea what you’re doing, what you’re consenting to. A meaningful choice would be, say, “I’m OK that you’re using cookies to track me” or “I don’t want to be tracked but still want to enjoy the service” or “It’s fine to use cookies for this particular transaction, but throw unnecessary data out and never share it with others.” But none of these choices are provided. In what sense is this a matter of choosing (versus mere picking)?

    The farce of consent as currently deployed is probably doing more harm than good: it gives the misimpression of meaningful control, control we guiltily cede because we are too ignorant to do otherwise and are impatient for, or need, the proffered service. There is a strong sense that consent is still fundamental to respecting people’s privacy. In some cases, yes, consent is essential. But what we have today is not really consent.

    Even if you tried to create totally transparent consent, you couldn’t. Well-meaning companies don’t know everything that happens with the data they collect, particularly those that have succumbed, against their better judgment, to the pressures of online tracking and behavioral targeting. They don’t know where the data is going or how it will be utilized. It’s an ever-changing landscape. On the one hand, requiring consent for every use isn’t reasonable and may prevent as many good outcomes as bad ones. Imagine if new science suggests a connection between a property, or cluster of properties, and a particular cancer treatment. Returning for consent may impose obstacles that are impossible to overcome.

    But on the other hand, what exactly does it mean to grant consent no matter what uses may come up in the future? Think about a surgeon explaining a procedure to a patient in great medical detail and then asking, “Are you OK with this?” We kid ourselves if we believe that consent is all that stands in the way of surgery and outcome. Most of us say OK not because we deeply grasp the details and ramifications but because we trust the institutions that educate and train surgeons, the integrity of the medical domain, and — at worst — the self-interest of the hospitals and surgeons wishing for positive acclaim and to avoid being sued.

    It’s not that we don’t know what consent means; it’s that getting to a point where we understand the true sense of what consent means is impossible.

    Appendix: can you guess which publisher is bringing out Helen Nissenbaum’s book Obfuscation this fall?

    #Helen_Nissenbaum #Vie_privée #Consentement

  • Why #personnalisation goes too far - BRW
    http://alireailleurs.tumblr.com/post/99628815638

    For Scott Berinato, data personalization has gone too far. We have entered the era of continuous #surveillance of consumption… From Nest’s thermostat to Disney’s RFID wristbands, these offerings are ultimately not very fair deals for the user. The exploitation of data is too invasive, too fine-grained… far more so than consumers realize. In his book Dataclysm, Christian Rudder, founder of the dating site OK Cupid, who made his name with the quality of his analyses of users’ data, notes that even as people grow more concerned, they remain apathetic. He knows better than anyone why companies are obsessed with our data… These innocuous bits of information can paint portraits (...)

    #vie_privée #régulation #Big_data