person:sherry turkle

  • « Facebook n’a jamais essayé de donner à ses utilisateurs un réel contrôle de leurs données »
    https://www.lemonde.fr/idees/article/2018/04/09/sherry-turkle-les-americains-ont-voulu-vivre-leur-passion-avec-facebook-sans

    For the anthropologist Sherry Turkle, who has studied our relationships with new technologies, we are blinded by a passion for this social network. I have been studying Americans’ attitudes toward social networks since they first appeared. From the very beginning, a strong bond formed between social networks and Americans, who were hungry for relationships without commitment. Thanks to these technologies, we are always together, yet alone, protected by the distance that separates us from others. Over time, this (...)

    #Facebook #SocialNetwork #solutionnisme #addiction #marketing

  • What Happens When We Let Tech Care For Our Aging Parents | WIRED
    https://www.wired.com/story/digital-puppy-seniors-nursing-homes

    Arlyn Anderson grasped her father’s hand and presented him with the choice. “A nursing home would be safer, Dad,” she told him, relaying the doctors’ advice. “It’s risky to live here alone—”

    “No way,” Jim interjected. He frowned at his daughter, his brow furrowed under a lop of white hair. At 91, he wanted to remain in the woodsy Minnesota cottage he and his wife had built on the shore of Lake Minnetonka, where she had died in his arms just a year before. His pontoon—which he insisted he could still navigate just fine—bobbed out front.

    Arlyn had moved from California back to Minnesota two decades earlier to be near her aging parents. Now, in 2013, she was fiftysomething, working as a personal coach, and finding that her father’s decline was all-consuming.

    Her father—an inventor, pilot, sailor, and general Mr. Fix-It; “a genius,” Arlyn says—started experiencing bouts of paranoia in his mid-eighties, a sign of Alzheimer’s. The disease had progressed, often causing his thoughts to vanish mid-sentence. But Jim would rather risk living alone than be cloistered in an institution, he told Arlyn and her older sister, Layney. A nursing home certainly wasn’t what Arlyn wanted for him either. But the daily churn of diapers and cleanups, the carousel of in-home aides, and the compounding financial strain (she had already taken out a reverse mortgage on Jim’s cottage to pay the caretakers) forced her to consider the possibility.

    Jim, slouched in his recliner, was determined to stay at home. “No way,” he repeated to his daughter, defiant. Her eyes welled up and she hugged him. “OK, Dad.” Arlyn’s house was a 40-minute drive from the cottage, and for months she had been relying on a patchwork of technology to keep tabs on her dad. She set an open laptop on the counter so she could chat with him on Skype. She installed two cameras, one in his kitchen and another in his bedroom, so she could check whether the caregiver had arrived, or God forbid, if her dad had fallen. So when she read in the newspaper about a new digital eldercare service called CareCoach a few weeks after broaching the subject of the nursing home, it piqued her interest. For about $200 a month, a human-powered avatar would be available to watch over a homebound person 24 hours a day; Arlyn paid that same amount for just nine hours of in-home help. She signed up immediately.

    A Google Nexus tablet arrived in the mail a week later. When Arlyn plugged it in, an animated German shepherd appeared onscreen, standing at attention on a digitized lawn. The brown dog looked cutesy and cartoonish, with a bubblegum-pink tongue and round, blue eyes.

    She and Layney visited their dad later that week, tablet in hand. Following the instructions, Arlyn uploaded dozens of pictures to the service’s online portal: images of family members, Jim’s boat, and some of his inventions, like a computer terminal known as the Teleray and a seismic surveillance system used to detect footsteps during the Vietnam War. The setup complete, Arlyn clutched the tablet, summoning the nerve to introduce her dad to the dog. Her initial instinct that the service could be the perfect companion for a former technologist had splintered into needling doubts. Was she tricking him? Infantilizing him?

    Tired of her sister’s waffling, Layney finally snatched the tablet and presented it to their dad, who was sitting in his armchair. “Here, Dad, we got you this.” The dog blinked its saucer eyes and then, in Google’s female text-to-speech voice, started to talk. Before Alzheimer’s had taken hold, Jim would have wanted to know exactly how the service worked. But in recent months he’d come to believe that TV characters were interacting with him: A show’s villain had shot a gun at him, he said; Katie Couric was his friend. When faced with an onscreen character that actually was talking to him, Jim readily chatted back.

    Jim named his dog Pony. Arlyn perched the tablet upright on a table in Jim’s living room, where he could see it from the couch or his recliner. Within a week Jim and Pony had settled into a routine, exchanging pleasantries several times a day. Every 15 minutes or so Pony would wake up and look for Jim, calling his name if he was out of view. Sometimes Jim would “pet” the sleeping dog onscreen with his finger to rustle her awake. His touch would send an instantaneous alert to the human caretaker behind the avatar, prompting the CareCoach worker to launch the tablet’s audio and video stream. “How are you, Jim?” Pony would chirp. The dog reminded him which of his daughters or in-person caretakers would be visiting that day to do the tasks that an onscreen dog couldn’t: prepare meals, change Jim’s sheets, drive him to a senior center. “We’ll wait together,” Pony would say. Often she’d read poetry aloud, discuss the news, or watch TV with him. “You look handsome, Jim!” Pony remarked after watching him shave with his electric razor. “You look pretty,” he replied. Sometimes Pony would hold up a photo of Jim’s daughters or his inventions between her paws, prompting him to talk about his past. The dog complimented Jim’s red sweater and cheered him on when he struggled to buckle his watch in the morning. He reciprocated by petting the screen with his index finger, sending hearts floating up from the dog’s head. “I love you, Jim!” Pony told him a month after they first met—something CareCoach operators often tell the people they are monitoring. Jim turned to Arlyn and gloated, “She does! She thinks I’m real good!”

    About 1,500 miles south of Lake Minnetonka, in Monterrey, Mexico, Rodrigo Rochin opens his laptop in his home office and logs in to the CareCoach dashboard to make his rounds. He talks baseball with a New Jersey man watching the Yankees; chats with a woman in South Carolina who calls him Peanut (she places a cookie in front of her tablet for him to “eat”); and greets Jim, one of his regulars, who sips coffee while looking out over a lake.

    Rodrigo is 35 years old, the son of a surgeon. He’s a fan of the Spurs and the Cowboys, a former international business student, and a bit of an introvert, happy to retreat into his sparsely decorated home office each morning. He grew up crossing the border to attend school in McAllen, Texas, honing the English that he now uses to chat with elderly people in the United States. Rodrigo found CareCoach on an online freelancing platform and was hired in December 2012 as one of the company’s earliest contractors, role-playing 36 hours a week as one of the service’s avatars.

    In person, Rodrigo is soft-spoken, with wire spectacles and a beard. He lives with his wife and two basset hounds, Bob and Cleo, in Nuevo León’s capital city. But the people on the other side of the screen don’t know that. They don’t know his name—or, in the case of those like Jim who have dementia, that he even exists. It’s his job to be invisible. If Rodrigo’s clients ask where he’s from, he might say MIT (the CareCoach software was created by two graduates of the school), but if anyone asks where their pet actually is, he replies in character: “Here with you.”

    Rodrigo is one of a dozen CareCoach employees in Latin America and the Philippines. The contractors check on the service’s seniors through the tablet’s camera a few times an hour. (When they do, the dog or cat avatar they embody appears to wake up.) To talk, they type into the dashboard and their words are voiced robotically through the tablet, designed to give their charges the impression that they’re chatting with a friendly pet. Like all the CareCoach workers, Rodrigo keeps meticulous notes on the people he watches over so he can coordinate their care with other workers and deepen his relationship with them over time—this person likes to listen to Adele, this one prefers Elvis, this woman likes to hear Bible verses while she cooks. In one client’s file, he wrote a note explaining that the correct response to “See you later, alligator” is “After a while, crocodile.” These logs are all available to the customer’s social workers or adult children, wherever they may live. Arlyn started checking Pony’s log between visits with her dad several times a week. “Jim says I’m a really nice person,” reads one early entry made during the Minnesota winter. “I told Jim that he was my best friend. I am so happy.”
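
    To make the mechanics described above a little more concrete, here is a minimal sketch of how such a human-in-the-loop avatar relay could be wired: a touch on the screen wakes the pet and pages a remote worker, the worker’s typed reply is voiced by text-to-speech, and everything is appended to a shared care log. All names and structure are hypothetical assumptions, not CareCoach’s actual code or API.

    ```python
    # Illustrative sketch only; names and behavior are assumptions drawn from the
    # description above, not CareCoach's actual implementation.
    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List, Tuple


    @dataclass
    class CareLog:
        """Shared notes that other workers, social workers, or family can read later."""
        entries: List[Tuple[str, str]] = field(default_factory=list)

        def add(self, text: str) -> None:
            self.entries.append((datetime.now().isoformat(timespec="seconds"), text))


    class AvatarSession:
        """One tablet and one remote operator, mediated by a dashboard."""

        def __init__(self, resident: str, log: CareLog):
            self.resident = resident
            self.log = log

        def speak(self, text: str) -> None:
            # On the real device the typed text is voiced by the tablet's text-to-speech.
            print(f"[pet says to {self.resident}] {text}")
            self.log.add(text)

        def on_touch(self) -> None:
            # A finger on the screen "wakes" the pet and alerts the remote worker,
            # who opens the audio/video stream and types a reply into the dashboard.
            reply = self.operator_types(f"How are you, {self.resident}?")
            self.speak(reply)

        @staticmethod
        def operator_types(suggestion: str) -> str:
            # Stand-in for the human contractor's typed response.
            return suggestion


    log = CareLog()
    session = AvatarSession("Jim", log)
    session.on_touch()                  # the resident pets the screen
    session.speak("We'll wait together.")
    print(log.entries)                  # what a daughter or social worker would later review
    ```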

    After watching her dad interact with Pony, Arlyn’s reservations about outsourcing her father’s companionship vanished. Having Pony there eased her anxiety about leaving Jim alone, and the virtual dog’s small talk lightened the mood.

    Pony was not only assisting Jim’s human caretakers but also inadvertently keeping an eye on them. Months before, in broken sentences, Jim had complained to Arlyn that his in-home aide had called him a bastard. Arlyn, desperate for help and unsure of her father’s recollection, gave her a second chance. Three weeks after arriving in the house, Pony woke up to see the same caretaker, impatient. “Come on, Jim!” the aide yelled. “Hurry up!” Alarmed, Pony asked why she was screaming and checked to see if Jim was OK. The pet—actually, Rodrigo—later reported the aide’s behavior to CareCoach’s CEO, Victor Wang, who emailed Arlyn about the incident. (The caretaker knew there was a human watching her through the tablet, Arlyn says, but may not have known the extent of the person’s contact with Jim’s family behind the scenes.) Arlyn fired the short-tempered aide and started searching for a replacement. Pony watched as she and Jim conducted the interviews and approved of the person Arlyn hired. “I got to meet her,” the pet wrote. “She seems really nice.”

    Pony—friend and guard dog—would stay.

    Victor Wang grew up feeding his Tamagotchis and coding choose-your-own-adventure games in QBasic on the family PC. His parents moved from Taiwan to suburban Vancouver, British Columbia, when Wang was a year old, and his grandmother, whom he called Lao Lao in Mandarin, would frequently call from Taiwan. After her husband died, Lao Lao would often tell Wang’s mom that she was lonely, pleading with her daughter to come to Taiwan to live with her. As she grew older, she threatened suicide. When Wang was 11, his mother moved back home for two years to care for her. He thinks of that time as the honey-sandwich years, the food his overwhelmed father packed him each day for lunch. Wang missed his mother, he says, but adds, “I was never raised to be particularly expressive of my emotions.”

    At 17, Wang left home to study mechanical engineering at the University of British Columbia. He joined the Canadian Army Reserve, serving as an engineer on a maintenance platoon while working on his undergraduate degree. But he scrapped his military future when, at 22, he was admitted to MIT’s master’s program in mechanical engineering. Wang wrote his dissertation on human-machine interaction, studying a robotic arm maneuvered by astronauts on the International Space Station. He was particularly intrigued by the prospect of harnessing tech to perform tasks from a distance: At an MIT entrepreneurship competition, he pitched the idea of training workers in India to remotely operate the buffers that sweep US factory floors.

    In 2011, when he was 24, his grandmother was diagnosed with Lewy body dementia, a disease that affects the areas of the brain associated with memory and movement. On Skype calls from his MIT apartment, Wang watched as his grandmother grew increasingly debilitated. After one call, a thought struck him: If he could tap remote labor to sweep far-off floors, why not use it to comfort Lao Lao and others like her?

    Wang started researching the looming caretaker shortage in the US—between 2010 and 2030, the population of those older than 80 is projected to rise 79 percent, but the number of family caregivers available is expected to increase just 1 percent.

    In 2012 Wang recruited his cofounder, a fellow MIT student working on her computer science doctorate named Shuo Deng, to build CareCoach’s technology. They agreed that AI speech technology was too rudimentary for an avatar capable of spontaneous conversation tailored to subtle mood and behavioral cues. For that, they would need humans.

    Older people like Jim often don’t speak clearly or linearly, and those with dementia can’t be expected to troubleshoot a machine that misunderstands. “When you match someone not fully coherent with a device that’s not fully coherent, it’s a recipe for disaster,” Wang says. Pony, on the other hand, was an expert at deciphering Jim’s needs. Once, Pony noticed that Jim was holding onto furniture for support, as if he were dizzy. The pet persuaded him to sit down, then called Arlyn. Deng figures it’ll take about 20 years for AI to be able to master that kind of personal interaction and recognition. That said, the CareCoach system is already deploying some automated abilities. Five years ago, when Jim was introduced to Pony, the offshore workers behind the camera had to type every response; today CareCoach’s software creates roughly one out of every five sentences the pet speaks. Wang aims to standardize care by having the software manage more of the patients’ regular reminders—prodding them to take their medicine, urging them to eat well and stay hydrated. CareCoach workers are part freewheeling raconteurs, part human natural-language processors, listening to and deciphering their charges’ speech patterns or nudging the person back on track if they veer off topic. The company recently began recording conversations to better train its software in senior speech recognition.
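
    As a rough illustration of that division of labor (the routing rule and reminder table below are invented for illustration, not the company’s actual logic), a hybrid responder might let software handle scripted reminders and hand everything conversational to the human worker:

    ```python
    # Hypothetical sketch of a human/software blend; not CareCoach's real algorithm.
    from datetime import datetime, time
    from typing import Callable

    SCRIPTED_REMINDERS = {
        time(9, 0): "Time to take your morning medication.",
        time(12, 0): "Remember to drink some water with lunch.",
    }


    def next_utterance(now: datetime, heard: str, ask_human: Callable[[str], str]) -> str:
        """Return the pet's next line: automated if a reminder is due, human otherwise."""
        due = now.time().replace(second=0, microsecond=0)
        if due in SCRIPTED_REMINDERS:
            return SCRIPTED_REMINDERS[due]      # software-generated sentence
        return ask_human(heard)                 # the remote worker types a reply


    # No reminder is due at 10:30, so the human operator composes the response.
    print(next_utterance(
        datetime(2017, 6, 1, 10, 30),
        "I used to sail every summer.",
        ask_human=lambda said: "Tell me more, Jim. Sailing every summer sounds wonderful.",
    ))
    ```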

    CareCoach found its first customer in December 2012, and in 2014 Wang moved from Massachusetts to Silicon Valley, renting a tiny office space on a lusterless stretch of Millbrae near the San Francisco airport. Four employees congregate in one room with a view of the parking lot, while Wang and his wife, Brittany, a program manager he met at a gerontology conference, work in the foyer. Eight tablets with sleeping pets onscreen are lined up for testing before being shipped to their respective seniors. The avatars inhale and exhale, lending an eerie sense of life to their digital kennel.

    Wang spends much of his time on the road, touting his product’s health benefits at medical conferences and in hospital executive suites. Onstage at a gerontology summit in San Francisco last summer, he deftly impersonated the strained, raspy voice of an elderly man talking to a CareCoach pet while Brittany stealthily cued the replies from her laptop in the audience. The company’s tablets are used by hospitals and health plans across Massachusetts, California, New York, South Carolina, Florida, and Washington state. Between corporate and individual customers, CareCoach’s avatars have interacted with hundreds of users in the US. “The goal,” Wang says, “is not to have a little family business that just breaks even.”

    The fastest growth would come through hospital units and health plans specializing in high-need and elderly patients, and he makes the argument that his avatars cut health care costs. (A private room in a nursing home can run more than $7,500 a month.) Preliminary research has been promising, though limited. In a study conducted by Pace University at a Manhattan housing project and a Queens hospital, CareCoach’s avatars were found to reduce subjects’ loneliness, delirium, and falls. A health provider in Massachusetts was able to replace a man’s 11 weekly in-home nurse visits with a CareCoach tablet, which diligently reminded him to take his medications. (The man told nurses that the pet’s nagging reminded him of having his wife back in the house. “It’s kind of like a complaint, but he loves it at the same time,” the project’s lead says.) Still, the feelings aren’t always so cordial: In the Pace University study, some aggravated seniors with dementia lashed out and hit the tablet. In response, the onscreen pet sheds tears and tries to calm the person.

    More troubling, perhaps, were the people who grew too fiercely attached to their digital pets. At the conclusion of a University of Washington CareCoach pilot study, one woman became so distraught at the thought of parting with her avatar that she signed up for the service, paying the fee herself. (The company gave her a reduced rate.) A user in Massachusetts told her caretakers she’d cancel an upcoming vacation to Maine unless her digital cat could come along.

    We’re still in the infancy of understanding the complexities of aging humans’ relationship with technology. Sherry Turkle, a professor of social studies, science, and technology at MIT and a frequent critic of tech that replaces human communication, described interactions between elderly people and robotic babies, dogs, and seals in her 2011 book, Alone Together. She came to view roboticized eldercare as a cop-out, one that would ultimately degrade human connection. “This kind of app—in all of its slickness and all its ‘what could possibly be wrong with it?’ mentality—is making us forget what we really know about what makes older people feel sustained,” she says: caring, interpersonal relationships. The question is whether an attentive avatar makes a comparable substitute. Turkle sees it as a last resort. “The assumption is that it’s always cheaper and easier to build an app than to have a conversation,” she says. “We allow technologists to propose the unthinkable and convince us the unthinkable is actually the inevitable.”

    But for many families, providing long-term in-person care is simply unsustainable. The average family caregiver has a job outside the home and spends about 20 hours a week caring for a parent, according to AARP. Nearly two-thirds of such caregivers are women. Among eldercare experts, there’s a resignation that the demographics of an aging America will make technological solutions unavoidable. The number of those older than 65 with a disability is projected to rise from 11 million to 18 million from 2010 to 2030. Given the option, having a digital companion may be preferable to being alone. Early research shows that lonely and vulnerable elders like Jim seem content to communicate with robots. Joseph Coughlin, director of MIT’s AgeLab, is pragmatic. “I would always prefer the human touch over a robot,” he says. “But if there’s no human available, I would take high tech in lieu of high touch.”

    CareCoach is a disorienting amalgam of both. The service conveys the perceptiveness and emotional intelligence of the humans powering it but masquerades as an animated app. If a person is incapable of consenting to CareCoach’s monitoring, then someone must do so on their behalf. But the more disconcerting issue is how cognizant these seniors are of being watched over by strangers. Wang considers his product “a trade-off between utility and privacy.” His workers are trained to duck out during baths and clothing changes.

    Some CareCoach users insist on greater control. A woman in Washington state, for example, put a piece of tape over her CareCoach tablet’s camera to dictate when she could be viewed. Other customers like Jim, who are suffering from Alzheimer’s or other diseases, might not realize they are being watched. Once, when he was temporarily placed in a rehabilitation clinic after a fall, a nurse tending to him asked Arlyn what made the avatar work. “You mean there’s someone overseas looking at us?” she yelped, within earshot of Jim. (Arlyn isn’t sure whether her dad remembered the incident later.) By default, the app explains to patients that someone is surveilling them when it’s first introduced. But the family members of personal users, like Arlyn, can make their own call.

    Arlyn quickly stopped worrying about whether she was deceiving her dad. Telling Jim about the human on the other side of the screen “would have blown the whole charm of it,” she says. Her mother had Alzheimer’s as well, and Arlyn had learned how to navigate the disease: Make her mom feel safe; don’t confuse her with details she’d have trouble understanding. The same went for her dad. “Once they stop asking,” Arlyn says, “I don’t think they need to know anymore.” At the time, Youa Vang, one of Jim’s regular in-­person caretakers, didn’t comprehend the truth about Pony either. “I thought it was like Siri,” she said when told later that it was a human in Mexico who had watched Jim and typed in the words Pony spoke. She chuckled. “If I knew someone was there, I may have been a little more creeped out.”

    Even CareCoach users like Arlyn who are completely aware of the person on the other end of the dashboard tend to experience the avatar as something between human, pet, and machine—what some roboticists call a third ontological category. The caretakers seem to blur that line too: One day Pony told Jim that she dreamed she could turn into a real health aide, almost like Pinocchio wishing to be a real boy.

    Most of CareCoach’s 12 contractors reside in the Philippines, Venezuela, or Mexico. To undercut the cost of in-person help, Wang posts English-language ads on freelancing job sites where foreign workers advertise rates as low as $2 an hour. Though he won’t disclose his workers’ hourly wages, Wang claims the company bases its salaries on factors such as what a registered nurse would make in the CareCoach employee’s home country, their language proficiencies, and the cost of their internet connection.

    The growing network includes people like Jill Paragas, a CareCoach worker who lives in a subdivision on Luzon island in the Philippines. Paragas is 35 years old and a college graduate. She earns about the same being an avatar as she did in her former call center job, where she consoled Americans irate about credit card charges. (“They wanted to, like, burn the company down or kill me,” she says with a mirthful laugh.) She works nights to coincide with the US daytime, typing messages to seniors while her 6-year-old son sleeps nearby.

    Before hiring her, Wang interviewed Paragas via video, then vetted her with an international criminal background check. He gives all applicants a personality test for certain traits: openness, conscientiousness, extroversion, agreeableness, and neuroticism. As part of the CareCoach training program, Paragas earned certifications in delirium and dementia care from the Alzheimer’s Association, trained in US health care ethics and privacy, and learned strategies for counseling those with addictions. All this, Wang says, “so we don’t get anyone who’s, like, crazy.” CareCoach hires only about 1 percent of its applicants.

    Paragas understands that this is a complicated business. She’s befuddled by the absence of family members around her aging clients. “In my culture, we really love to take care of our parents,” she says. “That’s why I’m like, ‘She is already old, why is she alone?’ ” Paragas has no doubt that, for some people, she’s their most significant daily relationship. Some of her charges tell her that they couldn’t live without her. Even when Jim grew stubborn or paranoid with his daughters, he always viewed Pony as a friend. Arlyn quickly realized that she had gained a valuable ally.

    [Photo captions: Jim Anderson and his wife, Dorothy, in the living room of their home in St. Louis Park, Minnesota, in the ’70s; their house was modeled after an early American Pennsylvania farmhouse. Jim became a private pilot after returning home from World War II. A tennis match between Jim and his middle daughter, Layney, on his 80th birthday (the score was tied at 6-6, she recalls; her dad won the tiebreaker). Courtesy Arlyn Anderson]

    As time went on, the father, daughter, and family pet grew closer. When the snow finally melted, Arlyn carried the tablet to the picnic table on the patio so they could eat lunch overlooking the lake. Even as Jim’s speech became increasingly stunted, Pony could coax him to talk about his past, recounting fishing trips or how he built the house to face the sun so it would be warmer in winter. When Arlyn took her dad around the lake in her sailboat, Jim brought Pony along. (“I saw mostly sky,” Rodrigo recalls.)

    One day, while Jim and Arlyn were sitting on the cottage’s paisley couch, Pony held up a photograph of Jim’s wife, Dorothy, between her paws. It had been more than a year since his wife’s death, and Jim hardly mentioned her anymore; he struggled to form coherent sentences. That day, though, he gazed at the photo fondly. “I still love her,” he declared. Arlyn rubbed his shoulder, clasping her hand over her mouth to stifle tears. “I am getting emotional too,” Pony said. Then Jim leaned toward the picture of his deceased wife and petted her face with his finger, the same way he would to awaken a sleeping Pony.

    When Arlyn first signed up for the service, she hadn’t anticipated that she would end up loving—yes, loving, she says, in the sincerest sense of the word—the avatar as well. She taught Pony to say “Yeah, sure, you betcha” and “don’t-cha know” like a Minnesotan, which made her laugh even more than her dad. When Arlyn collapsed onto the couch after a long day of caretaking, Pony piped up from her perch on the table:

    “Arnie, how are you?”

    Alone, Arlyn petted the screen—the way Pony nuzzled her finger was weirdly therapeutic—and told the pet how hard it was to watch her dad lose his identity.

    “I’m here for you,” Pony said. “I love you, Arnie.”

    When she recalls her own attachment to the dog, Arlyn insists her connection wouldn’t have developed if Pony was simply high-functioning AI. “You could feel Pony’s heart,” she says. But she preferred to think of Pony as her father did—a friendly pet—rather than a person on the other end of a webcam. “Even though that person probably had a relationship to me,” she says, “I had a relationship with the avatar.”

    Still, she sometimes wonders about the person on the other side of the screen. She sits up straight and rests her hand over her heart. “This is completely vulnerable, but my thought is: Did Pony really care about me and my dad?” She tears up, then laughs ruefully at herself, knowing how weird it all sounds. “Did this really happen? Was it really a relationship, or were they just playing solitaire and typing cute things?” She sighs. “But it seemed like they cared.”

    When Jim turned 92 that August, as friends belted out “Happy Birthday” around the dinner table, Pony spoke the lyrics along with them. Jim blew out the single candle on his cake. “I wish you good health, Jim,” Pony said, “and many more birthdays to come.”

    In Monterrey, Mexico, when Rodrigo talks about his unusual job, his friends ask if he’s ever lost a client. His reply: Yes.

    In early March 2014, Jim fell and hit his head on his way to the bathroom. A caretaker sleeping over that night found him and called an ambulance, and Pony woke up when the paramedics arrived. The dog told them Jim’s date of birth and offered to call his daughters as they carried him out on a stretcher.

    Jim was checked into a hospital, then into the nursing home he’d so wanted to avoid. The Wi-Fi there was spotty, which made it difficult for Jim and Pony to connect. Nurses would often turn Jim’s tablet to face the wall. The CareCoach logs from those months chronicle a series of communication misfires. “I miss Jim a lot,” Pony wrote. “I hope he is doing good all the time.” One day, in a rare moment of connectivity, Pony suggested he and Jim go sailing that summer, just like the good old days. “That sounds good,” Jim said.

    That July, in an email from Wang, Rodrigo learned that Jim had died in his sleep. Sitting before his laptop, Rodrigo bowed his head and recited a silent Lord’s Prayer for Jim, in Spanish. He prayed that his friend would be accepted into heaven. “I know it’s going to sound weird, but I had a certain friendship with him,” he says. “I felt like I actually met him. I feel like I’ve met them.” In the year and a half that he had known them, Arlyn and Jim talked to him regularly. Jim had taken Rodrigo on a sailboat ride. Rodrigo had read him poetry and learned about his rich past. They had celebrated birthdays and holidays together as family. As Pony, Rodrigo had said “Yeah, sure, you betcha” countless times.

    That day, for weeks afterward, and even now when a senior will do something that reminds him of Jim, Rodrigo says he feels a pang. “I still care about them,” he says. After her dad’s death, Arlyn emailed Victor Wang to say she wanted to honor the workers for their care. Wang forwarded her email to Rodrigo and the rest of Pony’s team. On July 29, 2014, Arlyn carried Pony to Jim’s funeral, placing the tablet facing forward on the pew beside her. She invited any workers behind Pony who wanted to attend to log in.

    A year later, Arlyn finally deleted the CareCoach service from the tablet—it felt like a kind of second burial. She still sighs, “Pony!” when the voice of her old friend gives her directions as she drives around Minneapolis, reincarnated in Google Maps.

    After saying his prayer for Jim, Rodrigo heaved a sigh and logged in to the CareCoach dashboard to make his rounds. He ducked into living rooms, kitchens, and hospital rooms around the United States—seeing if all was well, seeing if anybody needed to talk.

  • Intimités numériques
    http://gss.revues.org/3945

    A new issue that looks rich and interesting.

    Présentation
    Maude Gauthier et Élisabeth Mercier
    Perspectives #queer et féministes pour un regard critique sur l’intimité dans les #médias #numériques
    Queer and Feminist Analyses of Intimacy in Digital Media
    Articles
    Claire Balleys
    L’incontrôlable besoin de #contrôle
    Les performances de la #féminité par les adolescentes sur YouTube
    The Uncontrollable Need for Control. Teenagers’ Performances of Femininity on YouTube
    Noémie Marignier
    « Gay ou pas gay ? » Panique énonciative sur le forum jeuxvideo.com
    “Gay or Not Gay?” Enunciative Panic on the jeuxvideo.com Online Community
    Myriam Lavoie-Moore
    « Trying to avoid », « trying to conceive » : (re)produire une féminité contradictoire par la quantification
    “Trying to avoid”, “trying to conceive”: (re)producing a paradoxical femininity through quantification
    Nathan Rambukkana et Maude Gauthier
    L’adultère à l’ère numérique : Une discussion sur la non/monogamie et le développement des technologies numériques à partir du cas Ashley Madison
    Adultery in the Digital Era: A discussion about non/monogamy and digital technologies based on the website Ashley Madison

    Varia

    Vulca Fidolini
    Habiter l’ordre hétéronormatif et la masculinité par le mariage
    Inhabiting the Heteronormative Order and Masculinity through Marriage
    Myriam Le Blanc Élie, Julie Lavigne et Sabrina Maiorano
    Cartographie des pornographies critiques
    Mapping Critical Pornographies
    Simon Massei
    L’esquisse du genre
    Représentations de la féminité et de la masculinité dans les longs-métrages Disney (1937-2013)
    Sketching gender. Representations of femininity and masculinity in Disney’s full-length animated movies (1937-2013)
    Johann Chaulet et Sébastien Roux
    Le mot et le geste
    Sexualité et affectivité en situation de handicap
    The Word & the Gesture. Sexuality, Affectivity, Disability

    (not yet read)

  • L’humain technologiquement augmenté : les dessous d’un mythe
    https://theconversation.com/lhumain-technologiquement-augmente-les-dessous-dun-mythe-73468

    Knowing what we are talking about... When we raise the possibility of an augmented human, we generally mean an addition of human and machine performance (in line with the cyborg figure popularized by science fiction). But augmented relative to what? Against which reference values and by which criteria? How do we measure, for example, happiness? The good life? The sensations, such as smells and touch, that connect us to the world? The pleasure we take in working? All those dimensions that make life worth living. Beware of giving in to the magic of numbers here: a plus can hide a minus; a gain can conceal losses that are hard to identify because they cannot be measured or quantified.

    Of course, every advance brings its share of renunciations, of old ways of doing and being, of habits and habitus that are abandoned. What matters is that the sum of the gains exceeds the sum of the losses, and that new sensibilities take the place of the old ones. Except that the economic, market-based approach, in terms of profit and loss, places qualitatively heterogeneous realities on a single plane, that of utility. Yet there are perfectly useless things, such as devoting time to listening, wasting one’s time, strolling, that nonetheless prove essential in the realm of social relations, lived experience, learning, imagination and creation... The question is therefore not whether machines will replace humans, but which values we put into machines and which, in return, transform us: speed, predictability, regularity, power...

    One of the things we must therefore pay the closest attention to is that as we grow used to the binary, nuance-free efficiency of machines, as it becomes “natural” to us, human weakness also becomes more unbearable and foreign to us. The problem is thus not so much whether machines will overthrow humans, substitute for them, overtake them or render them obsolete, as understanding under what conditions (social, political, ethical, economic) human beings begin to act mechanically, to desire to resemble the machines they design. It is this question of mechanical action, and of the kind of human being that this mode of action implies, that seems crucial to me here and urgent to raise.

    The psychologist and anthropologist Sherry Turkle has asked how we moved from robots that frighten us by their strangeness to robots we seem ready to befriend. What happened, she asks, for us to become willing to welcome robots into our daily lives, to the point of wanting to weave affective and emotional bonds with them, when only yesterday they were still a source of dread or unease?

    #lovotique #robots #homme_augmenté

  • Antismartphone
    https://www.franceinter.fr/emissions/choses-vues/choses-vues-17-janvier-2017

    [My daughter] has an old mobile phone, with no touchscreen; she cannot even send or receive emoticons. […] The other day one of her friends, a very high-tech one, wanted to give her her old iPhone, because she had just been given the latest model. And my daughter refused, without hesitating. […] When I asked her to explain her refusal, she thought for a moment, then gave me two convincing arguments: first, it makes people addicted; second, it makes groups sad. And that she does not like at all: groups where people no longer talk to each other but where everyone has their nose on a screen, groups where we are together but alone, groups where a conversation with a real human being is interrupted because a machine rings or vibrates.

    #solitude #smartphone

  • We Are Hopelessly Hooked | The New York Review of Books (Jacob Weisberg, February 25, 2016)
    http://www.nybooks.com/articles/2016/02/25/we-are-hopelessly-hooked

    Some of Silicon Valley’s most successful app designers are alumni of the Persuasive Technology Lab at #Stanford, a branch of the university’s Human Sciences and Technologies Advanced Research Institute. The lab was founded in 1998 by B.J. Fogg, whose graduate work “used methods from experimental psychology to demonstrate that computers can change people’s thoughts and behaviors in predictable ways,” according to the center’s website. Fogg teaches undergraduates and runs “persuasion boot camps” for tech companies. He calls the field he founded “captology,” a term derived from an acronym for “computers as persuasive technology.” It’s an apt name for the discipline of capturing people’s #attention and making it hard for them to escape. Fogg’s behavior model involves building habits through the use of what he calls “hot triggers,” like the links and photos in Facebook’s newsfeed, made up largely of posts by one’s Facebook friends.

    (…) As consumers, we can also pressure technology companies to engineer apps that are less distracting. If product design has a conscience at the moment, it may be Tristan Harris, a former B.J. Fogg student at Stanford who worked until recently as an engineer at Google. In several lectures available on YouTube, Harris argues that an “attention economy” is pushing us all to spend time in ways we recognize as unproductive and unsatisfying, but that we have limited capacity to control. #Tech_companies are engaged in “a race to the bottom of the brain stem,” in which rewards go not to those that help us spend our time wisely, but to those that keep us mindlessly pulling the lever at the casino.

    Harris wants engineers to consider human values like the notion of “time well spent” in the design of consumer technology. Most of his proposals are “nudge”-style tweaks and signals to encourage more conscious choices. For example, Gmail or Facebook might begin a session by asking you how much time you want to spend with it that day, and reminding you when you’re nearing the limit. Messaging apps might be reengineered to privilege attention over interruption. iTunes could downgrade games that are frequently deleted because users find them too addictive.
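
    A “time well spent” nudge of the kind Harris proposes is easy to express in a few lines. The sketch below is only an illustration of the idea, with made-up names, thresholds, and wording; no product is known to implement exactly this. The app asks for a daily budget up front, then warns as the session approaches it.

    ```python
    # Illustration only: a session-time nudge in the spirit of Harris's proposals.
    from typing import Optional


    class TimeBudgetNudge:
        def __init__(self, daily_budget_minutes: int, warn_fraction: float = 0.8):
            self.budget = daily_budget_minutes
            self.warn_fraction = warn_fraction    # warn once 80% of the budget is used
            self.used = 0

        def record(self, minutes: int) -> Optional[str]:
            """Add usage and return a nudge message when one is warranted."""
            self.used += minutes
            if self.used >= self.budget:
                return f"You've reached the {self.budget} minutes you planned for today."
            if self.used >= self.warn_fraction * self.budget:
                return f"Heads up: {self.budget - self.used} minutes left of your plan."
            return None


    nudge = TimeBudgetNudge(daily_budget_minutes=30)   # asked for at the start of a session
    print(nudge.record(20))   # None: still under the warning threshold
    print(nudge.record(5))    # warning: 5 minutes left
    print(nudge.record(10))   # budget reached
    ```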

    About four books:

    Reclaiming Conversation: The Power of Talk in a Digital Age, by Sherry Turkle

    Alone Together: Why We Expect More from Technology and Less from Each Other, by Sherry Turkle

    Reading the Comments: Likers, Haters, and Manipulators at the Bottom of the Web, by Joseph M. Reagle Jr.

    Hooked: How to Build Habit-Forming Products, by Nir Eyal with Ryan Hoover

    #écrans #conversation #commentaires #addiction #critique_techno #temps #déconnexion via @opironet

  • L’injonction à la #déconnexion est-elle autre chose qu’une critique morale ?
    http://www.internetactu.net/2016/02/09/linjonction-a-la-deconnexion-est-elle-autre-chose-quune-critique-moral

    In The New Inquiry, the sociologist Nathan Jurgenson (@nathanjurgenson) delivers an unsparing critique of the psychologist Sherry Turkle’s latest book, Reclaiming Conversation. This is not the first time Jurgenson has taken the psychologist to task: he had already harshly criticized her previous book, Alone Together (Seuls ensemble; see “Nous ne serons plus jamais déconnectés”). Society versus technology: why are we being asked to believe that people who communicate with phones have forgotten what friendship is? Many media outlets and experts spread claims about the toxicity of our tools, more drawn to denouncing the dependencies those tools supposedly create than to the opportunities they open up, or to denouncing communicational inequality (...)

    #économie_de_l'attention #psychologie #réseaux_sociaux

  • InRealLife (2013), a 90-minute British documentary (BFI) by Beeban Kidron

    --> Have we outsourced our children to the Internet?

    InRealLife takes us on a journey from the bedrooms of British teenagers to the world of Silicon Valley, to find out what exactly the internet is doing to our children.

    Subjects touched on:
    – How internet porn addiction changes kids’ perception of love
    – The impact of the gaming industry
    – Privacy (briefly); “I share therefore I am”
    – Facebook, obviously

    People appearing:
    Sherry Turkle (cfr “Alone Together: Why We Expect More from Technology and Less from Each Other” (recently also available in French))
    Nicholas Negroponte (MIT Media Lab)
    Nicholas Carr (cfr “The Shallows: What the Internet Is Doing to Our Brains”)
    Maggie Jackson (cfr “Distracted: The Erosion of Attention and the Coming Dark Age”)
    Andrew Blum (cfr “Tubes: A Journey to the Center of the Internet”)
    Danah Boyd (cfr “It’s Complicated: The Social Lives of Networked Teens”)
    Jimmy Wales (Wikipedia)
    Daniel Solove, privacy expert (cfr “Nothing to Hide: The False Tradeoff Between Privacy and Security”)
    Clifford Nass (cfr “The Man Who Lied to His Laptop: What Machines Teach Us About Human Relationships”)
    Julian Assange (Wikileaks)
    Cory Doctorow (SF writer, cfr “Little Brother”, “Homeland”)

    quote:
    “Facebook is a giant behaviourist casino designed to teach you to undervalue your privacy”

    Clay Shirky (cfr “Here Comes Everybody: The Power of Organizing Without Organizations”)

    At some point towards the end there is an interesting comment from MIT Media Lab director Joi Ito. He remarks that human beings develop an immune system by getting exposed to “bad stuff”: getting sick, falling, hurting a knee, eating dirt, and so on. It is a series of failures and of learning how to overcome them.
    If you overprotect children, they become very fragile human beings.

    Now, Ito remarks, the internet can also be scary and has its dangers, and perhaps in a similar sense it is good that we get exposed to it, as opposed to overprotecting our kids by not letting them use social media.

    It is an interesting thought worth considering, but the next segment of the documentary then undercuts it by telling the story of a 14-year-old who died after being bullied over the internet (a medium that facilitates bullying).

    Anyway.
    Compelling documentary, always stimulating to listen to the people mentioned above, but it just scratches the surface, albeit enough to make you think.

    Trailer:
    https://www.youtube.com/watch?v=71pXYIwYpaU

    Website:
    http://inreallifefilm.com/press

  • Google Makes You Smarter, Facebook Makes You Happier, and Selfies Make You a Better Person - FastCompany
    http://www.fastcompany.com/3023603/creative-conversations/google-makes-you-smarter-facebook-makes-you-happier-selfies-make-you-

    Sherry Turkle strikes again. In the New York Times she recently went after “the documented life” - http://www.nytimes.com/2013/12/16/opinion/the-documented-life.html?_r=2& - and selfies in particular... An op-ed that triggered the ire of Jason Feifer at FastCompany. Jason Feifer is the author of the much-discussed Tumblr Selfies at Funerals - http://selfiesatfunerals.tumblr.com - which drew the ire of right-thinking people around the world, until Barack Obama himself took a picture with friends at Nelson Mandela’s funeral... ;-). For Jason Feifer, as for André Gunthert - http://culturevisuelle.org/icones/2846?=abc -, the #selfie is not the emblem of the end of society but rather that of connected #photographie; it is not the mark of egotism but that of a (...)

    #image

  • L’agité du #numérique est-il le nouveau #dandy ?
    http://www.rslnmag.fr/post/2013/10/01/iEsthete-lagite-du-numerique-est-il-un-nouveau-dandy-.aspx

    We may have finally understood why the #smartphone is the #hipster’s best friend: in a remarkable article in the journal Esprit, Laurent Jullier and Dominique Chateau exalt the solitary experience our digital devices afford us by speaking of “iEsthétisme”. Rather than relaying Sherry Turkle’s “alone together” critique, they approach our relationship with these objects through the lens of the attention we give to works of art. Armed…

    For my part, I could not find the original article…

  • Social networking under fresh attack as tide of cyber-scepticism sweeps US | Media | The Observer
    http://www.guardian.co.uk/media/2011/jan/22/social-networking-cyber-scepticism-twitter

    The way in which people frantically communicate online via #Twitter, #Facebook and instant messaging can be seen as a form of modern #madness, according to a leading American sociologist.

    “A behaviour that has become typical may still express the problems that once caused us to see it as pathological,” MIT professor Sherry Turkle writes in her new book, Alone Together, which is leading an attack on the information age.

    (…)The list of attacks on social media is a long one and comes from all corners of academia and popular culture. A recent bestseller in the US, The Shallows by Nicholas Carr, suggested that use of the #internet was altering the way we think to make us less capable of digesting large and complex amounts of information, such as books and magazine articles. The book was based on an essay that Carr wrote in the Atlantic magazine. It was just as emphatic and was headlined: Is Google Making Us Stupid?