position:technologist

  • Against Security Token Standards
    https://hackernoon.com/against-security-token-standards-ae896cc5bb4?source=rss----3a8144eabfe3-

    Recently I was speaking about the future of security tokens at a #blockchain conference in Europe. During one of the satellite receptions to the event, I was approached by a prominent figure in the crypto world who apparently had been reading some of my articles about security tokens and had developed some very interesting theses about the evolution of the space. A technologist by background, this person was struggling to reconcile the computer science-centric methods of the blockchain space with the semi-centralized, red-tape-first approaches he was seeing in the security token market (I have the same problem, BTW). At some point during our conversation he bluntly asked me: “there is one thing that I still can’t understand about the security token community: What’s the obsession with (...)

    #cryptocurrency #security-token #ethereum #invector-labs

  • AI in #medicine : A Beginner’s Guide
    https://hackernoon.com/ai-in-medicine-a-beginners-guide-a3b34b1dd5d7?source=rss----3a8144eabfe3

    Introduction: “My AI will seek to collaborate with people for the greater good, rather than usurp the human role and supplant them” — from the Hippocratic oath on artificial intelligence by Oren Etzioni [1]. Artificial Intelligence (AI) is currently one of the most hotly debated topics in technology, with seemingly every business leader and computer scientist voicing an extreme opinion on the topic. Elon Musk, Bill Gates, and Stephen Hawking are all pessimists who have posited that AI poses an existential threat to humanity. Musk has even publicly stated that “AI is far more dangerous than nukes” [2]. Famed futurist and technologist Ray Kurzweil, who studied under the inventor of the AI field, has a more optimistic outlook: “My view is not that AI is going to displace us. It’s going to enhance us. It (...)

    #healthcare #artificial-intelligence #machine-learning

  • Europe is using smartphone data as a weapon to deport refugees

    European leaders need to bring immigration numbers down, and #metadata on smartphones could be just what they need to start sending migrants back.

    Smartphones have helped tens of thousands of migrants travel to Europe. A phone means you can stay in touch with your family – or with people smugglers. On the road, you can check Facebook groups that warn of border closures, policy changes or scams to watch out for. Advice on how to avoid border police spreads via WhatsApp.

    Now, governments are using migrants’ smartphones to deport them.

    Across the continent, migrants are being confronted by a booming mobile forensics industry that specialises in extracting a smartphone’s messages, location history, and even #WhatsApp data. That information can potentially be turned against the phone owners themselves.

    In 2017 both Germany and Denmark expanded laws that enabled immigration officials to extract data from asylum seekers’ phones. Similar legislation has been proposed in Belgium and Austria, while the UK and Norway have been searching asylum seekers’ devices for years.

    Following right-wing gains across the EU, beleaguered governments are scrambling to bring immigration numbers down. Tackling fraudulent asylum applications seems like an easy way to do that. As European leaders met in Brussels last week to thrash out a new, tougher framework to manage migration —which nevertheless seems insufficient to placate Angela Merkel’s critics in Germany— immigration agencies across Europe are showing new enthusiasm for laws and software that enable phone data to be used in deportation cases.

    Admittedly, some refugees do lie on their asylum applications. Omar – not his real name – certainly did. He travelled to Germany via Greece. Even for Syrians like him there were few legal alternatives into the EU. But his route meant he could face deportation under the EU’s Dublin regulation, which dictates that asylum seekers must claim refugee status in the first EU country they arrive in. For Omar, that would mean settling in Greece – hardly an attractive destination considering its high unemployment and stretched social services.

    Last year, more than 7,000 people were deported from Germany according to the Dublin regulation. If Omar’s phone were searched, he could have become one of them, as his location history would have revealed his route through Europe, including his arrival in Greece.

    But before his asylum interview, he met Lena – also not her real name. A refugee advocate and businesswoman, Lena had read about Germany’s new surveillance laws. She encouraged Omar to throw his phone away and tell immigration officials it had been stolen in the refugee camp where he was staying. “This camp was well-known for crime,” says Lena, “so the story seemed believable.” His application is still pending.

    Omar is not the only asylum seeker to hide phone data from state officials. When sociology professor Marie Gillespie researched phone use among migrants travelling to Europe in 2016, she encountered widespread fear of mobile phone surveillance. “Mobile phones were facilitators and enablers of their journeys, but they also posed a threat,” she says. In response, she saw migrants who kept up to 13 different #SIM cards, hiding them in different parts of their bodies as they travelled.

    This could become a problem for immigration officials, who are increasingly using mobile phones to verify migrants’ identities, and ascertain whether they qualify for asylum. (That is: whether they are fleeing countries where they risk facing violence or persecution.) In Germany, only 40 per cent of asylum applicants in 2016 could provide official identification documents. In their absence, the nationalities of the other 60 per cent were verified through a mixture of language analysis — using human translators and computers to confirm whether their accent is authentic — and mobile phone data.

    Over the six months after Germany’s phone search law came into force, immigration officials searched 8,000 phones. If they doubted an asylum seeker’s story, they would extract their phone’s metadata – digital information that can reveal the user’s language settings and the locations where they made calls or took pictures.
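    The kind of location inference described above can be illustrated with a small sketch. Photo metadata (EXIF) stores GPS coordinates as degrees/minutes/seconds plus a hemisphere reference, which forensic tools convert to decimal coordinates; the function name below and the sample values (a hypothetical photo taken in Athens) are illustrative, not drawn from any specific forensic product.

    ```python
    # A minimal sketch of how location can be recovered from photo metadata.
    # EXIF "GPSInfo" (tag 34853) stores latitude/longitude as
    # (degrees, minutes, seconds) values plus an 'N'/'S' or 'E'/'W' reference.

    def dms_to_decimal(dms, ref):
        """Convert EXIF-style (degrees, minutes, seconds) plus a
        hemisphere reference ('N'/'S'/'E'/'W') to signed decimal degrees."""
        degrees, minutes, seconds = dms
        value = degrees + minutes / 60 + seconds / 3600
        return -value if ref in ("S", "W") else value

    # Hypothetical GPSInfo values for a photo taken in Athens:
    lat = dms_to_decimal((37, 58, 46.0), "N")
    lon = dms_to_decimal((23, 43, 41.0), "E")
    print(round(lat, 5), round(lon, 5))  # roughly 37.97944 23.72806
    ```

    A series of such timestamped coordinates, extracted from photos or call logs, is enough to reconstruct a route through Europe.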

    To do this, German authorities are using a computer programme, developed by the IT company Atos, that combines technology made by two mobile forensic companies – T3K and MSAB. It takes just a few minutes to download metadata. “The analysis of mobile phone data is never the sole basis on which a decision about the application for asylum is made,” says a spokesperson for BAMF, Germany’s immigration agency. But they do use the data to look for inconsistencies in an applicant’s story. If a person says they were in Turkey in September, for example, but phone data shows they were actually in Syria, officials can see that more investigation is needed.

    Denmark is taking this a step further, by asking migrants for their Facebook passwords. Refugee groups note how the platform is being used more and more to verify an asylum seeker’s identity.

    It recently happened to Assem, a 36-year-old refugee from Syria. Five minutes on his public Facebook profile will tell you two things about him: first, he supports a revolution against Syria’s Assad regime and, second, he is a devoted fan of Barcelona football club. When Danish immigration officials asked him for his password, he gave it to them willingly. “At that time, I didn’t care what they were doing. I just wanted to leave the asylum center,” he says. While Assem was not happy about the request, he now has refugee status.

    The Danish immigration agency confirmed that it does ask asylum applicants for access to their Facebook profiles. While it is not standard procedure, it can be used if a caseworker feels they need more information. If an applicant refuses consent, they are told they are obliged under Danish law to comply. Right now, the agency only uses Facebook – not Instagram or other social platforms.

    Across the EU, rights groups and opposition parties have questioned whether these searches are constitutional, raising concerns over their infringement of privacy and the effect of searching migrants as though they were criminals.

    “In my view, it’s a violation of ethics on privacy to ask for a password to Facebook or open somebody’s mobile phone,” says Michala Clante Bendixen of Denmark’s Refugees Welcome movement. “For an asylum seeker, this is often the only piece of personal and private space he or she has left.”

    Information sourced from phones and social media offers an alternative reality that can compete with an asylum seeker’s own testimony. “They’re holding the phone to be a stronger testament to their history than what the person is ready to disclose,” says Gus Hosein, executive director of Privacy International. “That’s unprecedented.”

    Privacy campaigners note how digital information might not reflect a person’s character accurately. “Because there is so much data on a person’s phone, you can make quite sweeping judgements that might not necessarily be true,” says Christopher Weatherhead, technologist at Privacy International.

    Bendixen cites the case of one man whose asylum application was rejected after Danish authorities examined his phone and saw his Facebook account had left comments during a time he said he was in prison. He explained that his brother also had access to his account, but the authorities did not believe him; he is currently waiting for appeal.

    A spokesperson for the UK’s Home Office told me they don’t check the social media of asylum seekers unless they are suspected of a crime. Nonetheless, British lawyers and social workers have reported that social media searches do take place, although it is unclear whether they reflect official policy. The Home Office did not respond to requests for clarification on that matter.

    Privacy International has investigated the UK police’s ability to search phones, indicating that immigration officials could possess similar powers. “What surprised us was the level of detail of these phone searches. Police could access information that even you don’t have access to, such as deleted messages,” Weatherhead says.

    His team found that British police are aided by Israeli mobile forensic company Cellebrite. Using their software, officials can access search history, including deleted browsing history. It can also extract WhatsApp messages from some Android phones.

    There is a crippling irony that the smartphone, for so long a tool of liberation, has become a digital Judas. If you had stood in Athens’ Victoria Square in 2015, at the height of the refugee crisis, you would have noticed the “smartphone stoop”: hundreds of Syrians, Iraqis, and Afghans standing or sitting around this sun-baked patch of grass and concrete, bending their heads to look into their phones.

    The smartphone has become the essential accessory for modern migration. Travelling to Europe as an asylum seeker is expensive. People who can’t afford phones typically can’t afford the journey either. Phones became a constant feature along the route to Northern Europe: young men would line the pavements outside reception centres in Berlin, hunched over their screens. In Calais, groups would crowd around charging points. In 2016, the UN refugee agency reported that phones were so important to migrants moving across Europe, that they were spending up to one third of their income on phone credit.

    Now, migrants are being forced to confront a more dangerous reality, as governments worldwide expand their abilities to search asylum seekers’ phones. While European countries were relaxing their laws on metadata search, last year US immigration spent $2.2 million on phone-hacking software. But asylum seekers too are changing their behaviour as they become more aware that the smartphone, the very device that has brought them so much freedom, could be the very thing used to unravel their hope of a new life.

    https://www.wired.co.uk/article/europe-immigration-refugees-smartphone-metadata-deportations
    #smartphone #smartphones #données #big_data #expulsions #Allemagne #Danemark #renvois #carte_SIM #Belgique #Autriche

  • Four Semesters of Computer Science in Five Hours…Phew!
    https://hackernoon.com/four-semesters-of-computer-science-in-five-hours-phew-53dd8779b79f?sourc

    My blogs chronicle my experience as a technologist making my way into Silicon Valley. I am a queer person of color, gender non-conforming, immigrant, and ex-foster youth. I come from a non-traditional coding background. I studied a few CS courses in college and ended up majoring in the humanities before becoming a teacher. Teaching web development to underserved teens turned me on to coding. After finishing at a coding school, I began working at startups in San Francisco. Half of my blogs are about technical subjects and the other half are about equality and tech access (my journey). I chose another tutorial from Brian Holt this week. What I love about learning from Brian is that he understands self-learning; after all, he admits to dropping out of college and being self-taught. He (...)

    #javascript #algorithms #computer-science #interview-questions #cs-classes

  • Will #blockchain Prevent Future Gun Violence? It’s Up To Us
    https://hackernoon.com/will-blockchain-prevent-future-gun-violence-its-up-to-us-c3214197b4c1?so

    In the wake of the recent events in Parkland, Florida, I feel compelled — not merely as a technologist, not even as an American, but as a human being — to speak up. Because, frankly, we can all agree that this isn’t just about “recent events.” Gun violence in America has consistently been a “recent event” for the past 10 years. Back in November — in the wake of the mass shooting at a church in Sutherland Springs, Texas, which killed 26 and injured 20 — I wrote an article discussing how we might use blockchain to improve public safety. I gave some pie-in-the-sky ideas about how this new technology could help us track weapons and purchases, and enforce regulations. And here I am again, after yet another mass shooting, writing about “how blockchain can be used to prevent gun violence.” As if a technology alone could ever curb (...)

    #smart-contracts #guns #gun-control #social-change

  • My Journey In #tech — Believing in Myself
    https://hackernoon.com/my-journey-in-tech-believing-in-myself-b49fde32be84?source=rss----3a8144

    My blogs chronicle my experience as a technologist making my way into Silicon Valley. I am a queer person of color, gender non-conforming, immigrant, and ex-foster youth. I come from a non-traditional computer science background. I studied a few CS courses in college and ended up majoring in the humanities before becoming a teacher. Teaching web development to underserved teens turned me on to coding. After getting a scholarship to a prominent coding school, I began working at startups in San Francisco. Half of my blogs are about technical subjects and the other half about equality and tech access (my journey). In 2015, I was teaching middle school ESL (English as a Second Language) in Oakland, California. I was following in my mom’s footsteps. She taught at an urban public school (...)

    #bootcamp #self-confidence #education #self-love

  • Writing UI Tests Using Page Object Models With #nightwatchjs Part II
    https://hackernoon.com/writing-ui-tests-using-page-object-models-with-nightwatchjs-part-ii-da2d

    My blogs chronicle my experience as a technologist making my way into Silicon Valley. I am a queer person of color, gender non-conforming, immigrant, and ex-foster youth. I come from a non-traditional coding background. I studied a few CS courses in college and ended up majoring in the humanities before becoming a teacher in the urban public school system. Teaching web development to underserved teens turned me on to coding. After finishing coding school, I began working at startups in San Francisco. Half of my blogs are about technical subjects and the other half are about equality and tech access (my journey). Before we rewrite our code using the Page Object Model, let’s make these tests a LITTLE MORE interesting. Let’s validate navigation for a (...)

    #testing #ui-testing

  • What Happens When We Let Tech Care For Our Aging Parents | WIRED
    https://www.wired.com/story/digital-puppy-seniors-nursing-homes

    Arlyn Anderson grasped her father’s hand and presented him with the choice. “A nursing home would be safer, Dad,” she told him, relaying the doctors’ advice. “It’s risky to live here alone—”

    “No way,” Jim interjected. He frowned at his daughter, his brow furrowed under a lop of white hair. At 91, he wanted to remain in the woodsy Minnesota cottage he and his wife had built on the shore of Lake Minnetonka, where she had died in his arms just a year before. His pontoon—which he insisted he could still navigate just fine—bobbed out front.

    Arlyn had moved from California back to Minnesota two decades earlier to be near her aging parents. Now, in 2013, she was fiftysomething, working as a personal coach, and finding that her father’s decline was all-consuming.

    Her father—an inventor, pilot, sailor, and general Mr. Fix-It; “a genius,” Arlyn says—started experiencing bouts of paranoia in his mid-eighties, a sign of Alzheimer’s. The disease had progressed, often causing his thoughts to vanish mid-sentence. But Jim would rather risk living alone than be cloistered in an institution, he told Arlyn and her older sister, Layney. A nursing home certainly wasn’t what Arlyn wanted for him either. But the daily churn of diapers and cleanups, the carousel of in-home aides, and the compounding financial strain (she had already taken out a reverse mortgage on Jim’s cottage to pay the caretakers) forced her to consider the possibility.

    Jim, slouched in his recliner, was determined to stay at home. “No way,” he repeated to his daughter, defiant. Her eyes welled up and she hugged him. “OK, Dad.” Arlyn’s house was a 40-minute drive from the cottage, and for months she had been relying on a patchwork of technology to keep tabs on her dad. She set an open laptop on the counter so she could chat with him on Skype. She installed two cameras, one in his kitchen and another in his bedroom, so she could check whether the caregiver had arrived, or God forbid, if her dad had fallen. So when she read in the newspaper about a new digital eldercare service called CareCoach a few weeks after broaching the subject of the nursing home, it piqued her interest. For about $200 a month, a human-powered avatar would be available to watch over a homebound person 24 hours a day; Arlyn paid that same amount for just nine hours of in-home help. She signed up immediately.


    A Google Nexus tablet arrived in the mail a week later. When Arlyn plugged it in, an animated German shepherd appeared onscreen, standing at attention on a digitized lawn. The brown dog looked cutesy and cartoonish, with a bubblegum-pink tongue and round, blue eyes.

    She and Layney visited their dad later that week, tablet in hand. Following the instructions, Arlyn uploaded dozens of pictures to the service’s online portal: images of family members, Jim’s boat, and some of his inventions, like a computer terminal known as the Teleray and a seismic surveillance system used to detect footsteps during the Vietnam War. The setup complete, Arlyn clutched the tablet, summoning the nerve to introduce her dad to the dog. Her initial instinct that the service could be the perfect companion for a former technologist had splintered into needling doubts. Was she tricking him? Infantilizing him?

    Tired of her sister’s waffling, Layney finally snatched the tablet and presented it to their dad, who was sitting in his armchair. “Here, Dad, we got you this.” The dog blinked its saucer eyes and then, in Google’s female text-to-speech voice, started to talk. Before Alzheimer’s had taken hold, Jim would have wanted to know exactly how the service worked. But in recent months he’d come to believe that TV characters were interacting with him: A show’s villain had shot a gun at him, he said; Katie Couric was his friend. When faced with an onscreen character that actually was talking to him, Jim readily chatted back.

    Jim named his dog Pony. Arlyn perched the tablet upright on a table in Jim’s living room, where he could see it from the couch or his recliner. Within a week Jim and Pony had settled into a routine, exchanging pleasantries several times a day. Every 15 minutes or so Pony would wake up and look for Jim, calling his name if he was out of view. Sometimes Jim would “pet” the sleeping dog onscreen with his finger to rustle her awake. His touch would send an instantaneous alert to the human caretaker behind the avatar, prompting the CareCoach worker to launch the tablet’s audio and video stream. “How are you, Jim?” Pony would chirp. The dog reminded him which of his daughters or in-person caretakers would be visiting that day to do the tasks that an onscreen dog couldn’t: prepare meals, change Jim’s sheets, drive him to a senior center. “We’ll wait together,” Pony would say. Often she’d read poetry aloud, discuss the news, or watch TV with him. “You look handsome, Jim!” Pony remarked after watching him shave with his electric razor. “You look pretty,” he replied. Sometimes Pony would hold up a photo of Jim’s daughters or his inventions between her paws, prompting him to talk about his past. The dog complimented Jim’s red sweater and cheered him on when he struggled to buckle his watch in the morning. He reciprocated by petting the screen with his index finger, sending hearts floating up from the dog’s head. “I love you, Jim!” Pony told him a month after they first met—something CareCoach operators often tell the people they are monitoring. Jim turned to Arlyn and gloated, “She does! She thinks I’m real good!”

    About 1,500 miles south of Lake Minnetonka, in Monterrey, Mexico, Rodrigo Rochin opens his laptop in his home office and logs in to the CareCoach dashboard to make his rounds. He talks baseball with a New Jersey man watching the Yankees; chats with a woman in South Carolina who calls him Peanut (she places a cookie in front of her tablet for him to “eat”); and greets Jim, one of his regulars, who sips coffee while looking out over a lake.

    Rodrigo is 35 years old, the son of a surgeon. He’s a fan of the Spurs and the Cowboys, a former international business student, and a bit of an introvert, happy to retreat into his sparsely decorated home office each morning. He grew up crossing the border to attend school in McAllen, Texas, honing the English that he now uses to chat with elderly people in the United States. Rodrigo found CareCoach on an online freelancing platform and was hired in December 2012 as one of the company’s earliest contractors, role-playing 36 hours a week as one of the service’s avatars.


    In person, Rodrigo is soft-spoken, with wire spectacles and a beard. He lives with his wife and two basset hounds, Bob and Cleo, in Nuevo León’s capital city. But the people on the other side of the screen don’t know that. They don’t know his name—or, in the case of those like Jim who have dementia, that he even exists. It’s his job to be invisible. If Rodrigo’s clients ask where he’s from, he might say MIT (the CareCoach software was created by two graduates of the school), but if anyone asks where their pet actually is, he replies in character: “Here with you.”

    Rodrigo is one of a dozen CareCoach employees in Latin America and the Philippines. The contractors check on the service’s seniors through the tablet’s camera a few times an hour. (When they do, the dog or cat avatar they embody appears to wake up.) To talk, they type into the dashboard and their words are voiced robotically through the tablet, designed to give their charges the impression that they’re chatting with a friendly pet. Like all the CareCoach workers, Rodrigo keeps meticulous notes on the people he watches over so he can coordinate their care with other workers and deepen his relationship with them over time—this person likes to listen to Adele, this one prefers Elvis, this woman likes to hear Bible verses while she cooks. In one client’s file, he wrote a note explaining that the correct response to “See you later, alligator” is “After a while, crocodile.” These logs are all available to the customer’s social workers or adult children, wherever they may live. Arlyn started checking Pony’s log between visits with her dad several times a week. “Jim says I’m a really nice person,” reads one early entry made during the Minnesota winter. “I told Jim that he was my best friend. I am so happy.”

    After watching her dad interact with Pony, Arlyn’s reservations about outsourcing her father’s companionship vanished. Having Pony there eased her anxiety about leaving Jim alone, and the virtual dog’s small talk lightened the mood.

    Pony was not only assisting Jim’s human caretakers but also inadvertently keeping an eye on them. Months before, in broken sentences, Jim had complained to Arlyn that his in-home aide had called him a bastard. Arlyn, desperate for help and unsure of her father’s recollection, gave her a second chance. Three weeks after arriving in the house, Pony woke up to see the same caretaker, impatient. “Come on, Jim!” the aide yelled. “Hurry up!” Alarmed, Pony asked why she was screaming and checked to see if Jim was OK. The pet—actually, Rodrigo—later reported the aide’s behavior to CareCoach’s CEO, Victor Wang, who emailed Arlyn about the incident. (The caretaker knew there was a human watching her through the tablet, Arlyn says, but may not have known the extent of the person’s contact with Jim’s family behind the scenes.) Arlyn fired the short-tempered aide and started searching for a replacement. Pony watched as she and Jim conducted the interviews and approved of the person Arlyn hired. “I got to meet her,” the pet wrote. “She seems really nice.”

    Pony—friend and guard dog—would stay.

    Victor Wang grew up feeding his Tamagotchis and coding choose-your-own-adventure games in QBasic on the family PC. His parents moved from Taiwan to suburban Vancouver, British Columbia, when Wang was a year old, and his grandmother, whom he called Lao Lao in Mandarin, would frequently call from Taiwan. After her husband died, Lao Lao would often tell Wang’s mom that she was lonely, pleading with her daughter to come to Taiwan to live with her. As she grew older, she threatened suicide. When Wang was 11, his mother moved back home for two years to care for her. He thinks of that time as the honey-sandwich years, after the food his overwhelmed father packed him each day for lunch. Wang missed his mother, he says, but adds, “I was never raised to be particularly expressive of my emotions.”

    At 17, Wang left home to study mechanical engineering at the University of British Columbia. He joined the Canadian Army Reserve, serving as an engineer on a maintenance platoon while working on his undergraduate degree. But he scrapped his military future when, at 22, he was admitted to MIT’s master’s program in mechanical engineering. Wang wrote his dissertation on human-machine interaction, studying a robotic arm maneuvered by astronauts on the International Space Station. He was particularly intrigued by the prospect of harnessing tech to perform tasks from a distance: At an MIT entrepreneurship competition, he pitched the idea of training workers in India to remotely operate the buffers that sweep US factory floors.

    In 2011, when he was 24, his grandmother was diagnosed with Lewy body dementia, a disease that affects the areas of the brain associated with memory and movement. On Skype calls from his MIT apartment, Wang watched as his grandmother grew increasingly debilitated. After one call, a thought struck him: If he could tap remote labor to sweep far-off floors, why not use it to comfort Lao Lao and others like her?

    Wang started researching the looming caretaker shortage in the US—between 2010 and 2030, the population of those older than 80 is projected to rise 79 percent, but the number of family caregivers available is expected to increase just 1 percent.

    In 2012 Wang recruited his cofounder, Shuo Deng, a fellow MIT student working on her computer science doctorate, to build CareCoach’s technology. They agreed that AI speech technology was too rudimentary for an avatar capable of spontaneous conversation tailored to subtle mood and behavioral cues. For that, they would need humans.

    Older people like Jim often don’t speak clearly or linearly, and those with dementia can’t be expected to troubleshoot a machine that misunderstands. “When you match someone not fully coherent with a device that’s not fully coherent, it’s a recipe for disaster,” Wang says. Pony, on the other hand, was an expert at deciphering Jim’s needs. Once, Pony noticed that Jim was holding onto furniture for support, as if he were dizzy. The pet persuaded him to sit down, then called Arlyn. Deng figures it’ll take about 20 years for AI to be able to master that kind of personal interaction and recognition. That said, the CareCoach system is already deploying some automated abilities. Five years ago, when Jim was introduced to Pony, the offshore workers behind the camera had to type every response; today CareCoach’s software creates roughly one out of every five sentences the pet speaks. Wang aims to standardize care by having the software manage more of the patients’ regular reminders—prodding them to take their medicine, urging them to eat well and stay hydrated. CareCoach workers are part freewheeling raconteurs, part human natural-language processors, listening to and deciphering their charges’ speech patterns or nudging the person back on track if they veer off topic. The company recently began recording conversations to better train its software in senior speech recognition.

    CareCoach found its first customer in December 2012, and in 2014 Wang moved from Massachusetts to Silicon Valley, renting a tiny office space on a lusterless stretch of Millbrae near the San Francisco airport. Four employees congregate in one room with a view of the parking lot, while Wang and his wife, Brittany, a program manager he met at a gerontology conference, work in the foyer. Eight tablets with sleeping pets onscreen are lined up for testing before being shipped to their respective seniors. The avatars inhale and exhale, lending an eerie sense of life to their digital kennel.

    CareCoach conveys the perceptiveness and emotional intelligence of the humans powering it but masquerades as an animated app.

    Wang spends much of his time on the road, touting his product’s health benefits at medical conferences and in hospital executive suites. Onstage at a gerontology summit in San Francisco last summer, he deftly impersonated the strained, raspy voice of an elderly man talking to a CareCoach pet while Brittany stealthily cued the replies from her laptop in the audience. The company’s tablets are used by hospitals and health plans across Massachusetts, California, New York, South Carolina, Florida, and Washington state. Between corporate and individual customers, CareCoach’s avatars have interacted with hundreds of users in the US. “The goal,” Wang says, “is not to have a little family business that just breaks even.”

    Wang believes the fastest growth will come through hospital units and health plans specializing in high-need and elderly patients, and he argues that his avatars cut health care costs. (A private room in a nursing home can run more than $7,500 a month.) Preliminary research has been promising, though limited. In a study conducted by Pace University at a Manhattan housing project and a Queens hospital, CareCoach’s avatars were found to reduce subjects’ loneliness, delirium, and falls. A health provider in Massachusetts was able to replace a man’s 11 weekly in-home nurse visits with a CareCoach tablet, which diligently reminded him to take his medications. (The man told nurses that the pet’s nagging reminded him of having his wife back in the house. “It’s kind of like a complaint, but he loves it at the same time,” the project’s lead says.) Still, the feelings aren’t always so cordial: In the Pace University study, some aggravated seniors with dementia lashed out and hit the tablet. In response, the onscreen pet sheds tears and tries to calm the person.

    More troubling, perhaps, were the people who grew too fiercely attached to their digital pets. At the conclusion of a University of Washington CareCoach pilot study, one woman became so distraught at the thought of parting with her avatar that she signed up for the service, paying the fee herself. (The company gave her a reduced rate.) A user in Massachusetts told her caretakers she’d cancel an upcoming vacation to Maine unless her digital cat could come along.

    We’re still in the infancy of understanding the complexities of aging humans’ relationship with technology. Sherry Turkle, a professor of the social studies of science and technology at MIT and a frequent critic of tech that replaces human communication, described interactions between elderly people and robotic babies, dogs, and seals in her 2011 book, Alone Together. She came to view roboticized eldercare as a cop-out, one that would ultimately degrade human connection. “This kind of app—in all of its slickness and all its ‘what could possibly be wrong with it?’ mentality—is making us forget what we really know about what makes older people feel sustained,” she says: caring, interpersonal relationships. The question is whether an attentive avatar makes a comparable substitute. Turkle sees it as a last resort. “The assumption is that it’s always cheaper and easier to build an app than to have a conversation,” she says. “We allow technologists to propose the unthinkable and convince us the unthinkable is actually the inevitable.”

    But for many families, providing long-term in-person care is simply unsustainable. The average family caregiver has a job outside the home and spends about 20 hours a week caring for a parent, according to AARP. Nearly two-thirds of such caregivers are women. Among eldercare experts, there’s a resignation that the demographics of an aging America will make technological solutions unavoidable. The number of those older than 65 with a disability is projected to rise from 11 million to 18 million from 2010 to 2030. Given the option, having a digital companion may be preferable to being alone. Early research shows that lonely and vulnerable elders like Jim seem content to communicate with robots. Joseph Coughlin, director of MIT’s AgeLab, is pragmatic. “I would always prefer the human touch over a robot,” he says. “But if there’s no human available, I would take high tech in lieu of high touch.”

    CareCoach is a disorienting amalgam of both. The service conveys the perceptiveness and emotional intelligence of the humans powering it but masquerades as an animated app. If a person is incapable of consenting to CareCoach’s monitoring, then someone must do so on their behalf. But the more disconcerting issue is how cognizant these seniors are of being watched over by strangers. Wang considers his product “a trade-off between utility and privacy.” His workers are trained to duck out during baths and clothing changes.

    Some CareCoach users insist on greater control. A woman in Washington state, for example, put a piece of tape over her CareCoach tablet’s camera to dictate when she could be viewed. Other customers like Jim, who are suffering from Alzheimer’s or other diseases, might not realize they are being watched. Once, when he was temporarily placed in a rehabilitation clinic after a fall, a nurse tending to him asked Arlyn what made the avatar work. “You mean there’s someone overseas looking at us?” she yelped, within earshot of Jim. (Arlyn isn’t sure whether her dad remembered the incident later.) By default, the app explains to patients that someone is surveilling them when it’s first introduced. But the family members of personal users, like Arlyn, can make their own call.

    Arlyn quickly stopped worrying about whether she was deceiving her dad. Telling Jim about the human on the other side of the screen “would have blown the whole charm of it,” she says. Her mother had Alzheimer’s as well, and Arlyn had learned how to navigate the disease: Make her mom feel safe; don’t confuse her with details she’d have trouble understanding. The same went for her dad. “Once they stop asking,” Arlyn says, “I don’t think they need to know anymore.” At the time, Youa Vang, one of Jim’s regular in-person caretakers, didn’t comprehend the truth about Pony either. “I thought it was like Siri,” she said when told later that it was a human in Mexico who had watched Jim and typed in the words Pony spoke. She chuckled. “If I knew someone was there, I may have been a little more creeped out.”

    Even CareCoach users like Arlyn who are completely aware of the person on the other end of the dashboard tend to experience the avatar as something between human, pet, and machine—what some roboticists call a third ontological category. The caretakers seem to blur that line too: One day Pony told Jim that she dreamed she could turn into a real health aide, almost like Pinocchio wishing to be a real boy.

    Most of CareCoach’s 12 contractors reside in the Philippines, Venezuela, or Mexico. To undercut the cost of in-person help, Wang posts English-language ads on freelancing job sites where foreign workers advertise rates as low as $2 an hour. Though he won’t disclose his workers’ hourly wages, Wang claims the company bases its salaries on factors such as what a registered nurse would make in the CareCoach employee’s home country, their language proficiencies, and the cost of their internet connection.

    The growing network includes people like Jill Paragas, a CareCoach worker who lives in a subdivision on Luzon island in the Philippines. Paragas is 35 years old and a college graduate. She earns about the same being an avatar as she did in her former call center job, where she consoled Americans irate about credit card charges. (“They wanted to, like, burn the company down or kill me,” she says with a mirthful laugh.) She works nights to coincide with the US daytime, typing messages to seniors while her 6-year-old son sleeps nearby.

    Even when Jim grew stubborn or paranoid with his daughters, he always viewed Pony as a friend.

    Before hiring her, Wang interviewed Paragas via video, then vetted her with an international criminal background check. He gives all applicants a personality test for certain traits: openness, conscientiousness, extroversion, agreeableness, and neuroticism. As part of the CareCoach training program, Paragas earned certifications in delirium and dementia care from the Alzheimer’s Association, trained in US health care ethics and privacy, and learned strategies for counseling those with addictions. All this, Wang says, “so we don’t get anyone who’s, like, crazy.” CareCoach hires only about 1 percent of its applicants.

    Paragas understands that this is a complicated business. She’s befuddled by the absence of family members around her aging clients. “In my culture, we really love to take care of our parents,” she says. “That’s why I’m like, ‘She is already old, why is she alone?’ ” Paragas has no doubt that, for some people, she’s their most significant daily relationship. Some of her charges tell her that they couldn’t live without her. Even when Jim grew stubborn or paranoid with his daughters, he always viewed Pony as a friend. Arlyn quickly realized that she had gained a valuable ally.
    [Photo gallery, courtesy Arlyn Anderson: 1/7 Jim Anderson and his wife, Dorothy, in the living room of their home in St. Louis Park, Minnesota, in the ’70s; their house was modeled after an early American Pennsylvania farmhouse. 2/7 Jim became a private pilot after returning home from World War II. 6/7 A tennis match between Jim and his middle daughter, Layney, on his 80th birthday. (The score was tied at 6-6, she recalls; her dad won the tiebreaker.)]

    As time went on, the father, daughter, and family pet grew closer. When the snow finally melted, Arlyn carried the tablet to the picnic table on the patio so they could eat lunch overlooking the lake. Even as Jim’s speech became increasingly stunted, Pony could coax him to talk about his past, recounting fishing trips or how he built the house to face the sun so it would be warmer in winter. When Arlyn took her dad around the lake in her sailboat, Jim brought Pony along. (“I saw mostly sky,” Rodrigo recalls.)

    One day, while Jim and Arlyn were sitting on the cottage’s paisley couch, Pony held up a photograph of Jim’s wife, Dorothy, between her paws. It had been more than a year since his wife’s death, and Jim hardly mentioned her anymore; he struggled to form coherent sentences. That day, though, he gazed at the photo fondly. “I still love her,” he declared. Arlyn rubbed his shoulder, clasping her hand over her mouth to stifle tears. “I am getting emotional too,” Pony said. Then Jim leaned toward the picture of his deceased wife and petted her face with his finger, the same way he would to awaken a sleeping Pony.

    When Arlyn first signed up for the service, she hadn’t anticipated that she would end up loving—yes, loving, she says, in the sincerest sense of the word—the avatar as well. She taught Pony to say “Yeah, sure, you betcha” and “don’t-cha know” like a Minnesotan, which made her laugh even more than her dad. When Arlyn collapsed onto the couch after a long day of caretaking, Pony piped up from her perch on the table:

    “Arnie, how are you?”

    Alone, Arlyn petted the screen—the way Pony nuzzled her finger was weirdly therapeutic—and told the pet how hard it was to watch her dad lose his identity.

    “I’m here for you,” Pony said. “I love you, Arnie.”

    When she recalls her own attachment to the dog, Arlyn insists her connection wouldn’t have developed if Pony was simply high-functioning AI. “You could feel Pony’s heart,” she says. But she preferred to think of Pony as her father did—a friendly pet—rather than a person on the other end of a webcam. “Even though that person probably had a relationship to me,” she says, “I had a relationship with the avatar.”

    Still, she sometimes wonders about the person on the other side of the screen. She sits up straight and rests her hand over her heart. “This is completely vulnerable, but my thought is: Did Pony really care about me and my dad?” She tears up, then laughs ruefully at herself, knowing how weird it all sounds. “Did this really happen? Was it really a relationship, or were they just playing solitaire and typing cute things?” She sighs. “But it seemed like they cared.”

    When Jim turned 92 that August, as friends belted out “Happy Birthday” around the dinner table, Pony spoke the lyrics along with them. Jim blew out the single candle on his cake. “I wish you good health, Jim,” Pony said, “and many more birthdays to come.”

    In Monterrey, Mexico, when Rodrigo talks about his unusual job, his friends ask if he’s ever lost a client. His reply: Yes.

    In early March 2014, Jim fell and hit his head on his way to the bathroom. A caretaker sleeping over that night found him and called an ambulance, and Pony woke up when the paramedics arrived. The dog told them Jim’s date of birth and offered to call his daughters as they carried him out on a stretcher.

    Jim was checked into a hospital, then into the nursing home he’d so wanted to avoid. The Wi-Fi there was spotty, which made it difficult for Jim and Pony to connect. Nurses would often turn Jim’s tablet to face the wall. The CareCoach logs from those months chronicle a series of communication misfires. “I miss Jim a lot,” Pony wrote. “I hope he is doing good all the time.” One day, in a rare moment of connectivity, Pony suggested he and Jim go sailing that summer, just like the good old days. “That sounds good,” Jim said.

    That July, in an email from Wang, Rodrigo learned that Jim had died in his sleep. Sitting before his laptop, Rodrigo bowed his head and recited a silent Lord’s Prayer for Jim, in Spanish. He prayed that his friend would be accepted into heaven. “I know it’s going to sound weird, but I had a certain friendship with him,” he says. “I felt like I actually met him. I feel like I’ve met them.” In the year and a half that he had known them, Arlyn and Jim talked to him regularly. Jim had taken Rodrigo on a sailboat ride. Rodrigo had read him poetry and learned about his rich past. They had celebrated birthdays and holidays together as family. As Pony, Rodrigo had said “Yeah, sure, you betcha” countless times.

    That day, for weeks afterward, and even now when a senior will do something that reminds him of Jim, Rodrigo says he feels a pang. “I still care about them,” he says. After her dad’s death, Arlyn emailed Victor Wang to say she wanted to honor the workers for their care. Wang forwarded her email to Rodrigo and the rest of Pony’s team. On July 29, 2014, Arlyn carried Pony to Jim’s funeral, placing the tablet facing forward on the pew beside her. She invited any workers behind Pony who wanted to attend to log in.

    A year later, Arlyn finally deleted the CareCoach service from the tablet—it felt like a kind of second burial. She still sighs, “Pony!” when the voice of her old friend gives her directions as she drives around Minneapolis, reincarnated in Google Maps.

    After saying his prayer for Jim, Rodrigo heaved a sigh and logged in to the CareCoach dashboard to make his rounds. He ducked into living rooms, kitchens, and hospital rooms around the United States—seeing if all was well, seeing if anybody needed to talk.

  • The Washington Post Is A Software Company Now
    https://www.fastcompany.com/40495770/the-washington-post-is-a-software-company-now

    The newspaper created a platform to tackle its own challenges. Then, with Amazon-like spirit, it realized there was a business in helping other publishers do the same.

    Since 2014, a new Post operation now called Arc Publishing has offered the publishing system the company originally used for WashingtonPost.com as a service. That allows other news organizations to use the Post’s tools for writers and editors. Arc also shoulders the responsibility of ensuring that readers get a snappy, reliable experience when they visit a site on a PC or mobile device. It’s like a high-end version of Squarespace or WordPress.com, tailored to solve the content problems of a particular industry.

    Among the publications that have moved to Arc are the Los Angeles Times, Canada’s Globe and Mail, the New Zealand Herald, and smaller outfits such as Alaska Dispatch News and Oregon’s Willamette Week. In aggregate, sites running on Arc reach 300 million readers; publishers pay based on bandwidth, which means that the more successful they are at attracting readers, the better it is for Arc Publishing. The typical bottom line ranges from $10,000 a month at the low end up to $150,000 a month for Arc’s biggest customers.

    The Washington Post doesn’t disclose Arc Publishing’s revenue or whether it’s currently profitable. (The Post itself turned a profit in 2016.) It does say, however, that Arc’s revenue doubled year-over-year and the goal is to double it again in 2018. According to Post CIO Shailesh Prakash, the company sees the platform as something that could eventually become a $100 million business.

    The value of mixing developers and users

    Back at Post headquarters in Washington, D.C., “because the technologists and the reporters and editors are often sitting alongside each other, sometimes we can get away with a less formal process to identify needs,” explains Gilbert. “A technologist can see when a reporter or editor is having trouble with something, and so sometimes it doesn’t have to be ‘file a ticket,’ ‘file a complaint,’ ‘send an email to an anonymous location.’” For instance, when editorial staffers wondered if it was possible for the Post site to preview videos with a moving clip rather than a still photo, a video developer quickly built a tool to allow editors to create snippets. “We see a much higher click-through rate when people use these animated GIFs than when they used the static images from before,” Gilbert says.

    #Médias #CMS #Washington_Post

  • Taser (TASR), renamed Axon, to give police body cameras powered with AI software that automates reports — Quartz
    https://qz.com/950106/taser-tasr-renamed-axon-to-give-police-body-cameras-powered-with-ai-software-tha
    https://qzprod.files.wordpress.com/2017/04/rtspimo-e1491404702542.jpg?quality=80&strip=all&w=1600

    The future of police work, Smith says, is a technologist’s dream, with cameras automating menial tasks like note-taking and report-typing, so police can interact with the public more effectively.
    “Eighty percent of American cops go out on the job with a gun, but no camera,”

    #police #taser #caméras #surveillance #IA

  • Moving tech forward with Gomix, Express, and Google Spreadsheets - Matt Stauffer on Laravel, PHP, Frontend development
    https://mattstauffer.co/blog/moving-tech-forward-with-gomix-express-and-google-spreadsheets

    On November 9th, my wife turned to me and said: “Matt, it’s time for you to stop trying to change individual people on Facebook and go do something real.” Ouch. But she was right.

    Right around that time DeRay Mckesson put out a call to programmers who wanted to help work for social change. I responded, as did quite a few others, and I met DeRay and Sam and Aditi and a few other incredible individuals really making a difference. Over the span of a few weeks I had the chance to work on The Resistance Manual and a few other great projects.

    During this time I’ve had no less than a dozen friends in tech ask me, “How can I as a technologist contribute to social progress?” I wanted to make that question as easy to answer as possible, and I knew there are far more projects out there than just those we were working on at StayWoke. So I decided to catalog them all in one space.

    Pretty interesting take on how to use tools that end-users understand and a simple immediate-deploy system to provide a useful service.
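    The pattern the post points at — a Google Spreadsheet that non-technical collaborators can edit, republished as JSON by a tiny always-deployed web service — can be sketched in a few lines. Everything specific below is a hypothetical stand-in (the sheet ID, the column names, the naive CSV parser), and plain Node `http`/`https` stands in for the Express-on-Gomix setup the post uses, so the sketch is dependency-free:

    ```javascript
    const http = require('http');
    const https = require('https');

    // Sheets' CSV export is the simplest read-only API: no auth is needed if
    // the sheet is shared as "anyone with the link can view". SHEET_ID is a
    // placeholder.
    const SHEET_CSV_URL =
      'https://docs.google.com/spreadsheets/d/SHEET_ID/export?format=csv';

    // Turn the exported CSV into an array of row objects keyed by the header
    // row. (Naive split: assumes no quoted commas inside cells.)
    function parseCsv(text) {
      const [header, ...rows] = text.trim().split('\n').map(l => l.split(','));
      return rows.map(cells =>
        Object.fromEntries(header.map((name, i) => [name, cells[i]])));
    }

    // Serve the sheet's current contents as JSON on every request, so edits
    // made in the spreadsheet show up without a redeploy.
    const server = http.createServer((req, res) => {
      https.get(SHEET_CSV_URL, sheetRes => {
        let body = '';
        sheetRes.on('data', chunk => (body += chunk));
        sheetRes.on('end', () => {
          res.setHeader('Content-Type', 'application/json');
          res.end(JSON.stringify(parseCsv(body)));
        });
      });
    });

    module.exports = { server, parseCsv };
    ```

    The design choice is the point of the post: the spreadsheet is the admin interface end-users already understand, and the code only has to read it.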

  • Bill Gates forms $1B climate-change tech fund — USA Today
    http://www.usatoday.com/story/money/2016/12/12/bill-gates-breakthrough-energy-partners-climate-change/95326010

    Billionaire philanthropist and technologist Bill Gates is set to announce Monday the formation of a new fund with more than $1 billion to invest in technologies aimed at counteracting climate change.

    The Breakthrough Energy Ventures fund “will finance emerging energy breakthroughs that can deliver affordable and reliable zero carbon emissions,” the investors said in a statement.

    Gates announced intentions to form such a fund in late 2015, having already secured pledges from a variety of global investors.
    […]
    Investors include Amazon founder and CEO Jeff Bezos, Alibaba executive chairman Jack Ma, Virgin Group founder Richard Branson, Kleiner Perkins venture capitalist John Doerr, LinkedIn co-founder Reid Hoffman and SoftBank founder and CEO Masayoshi Son.

    The fund is connected to the Breakthrough Energy Coalition, whose five-pronged approach to funding climate-change mitigation efforts focuses on electricity, transportation, agriculture, manufacturing and buildings.

  • "Lee, 28, is the technologist hired in November to make sure Greenwald and fellow First Look Media employees use state-of-the-art security measures when handling the NSA documents, or when exchanging emails and online chats with sensitive information.

    [...]

    Timm believes the Snowden leaks have underscored digital security as a press freedom issue: If you’re a journalist, especially reporting on government and national security, you can’t do journalism and not worry about cybersecurity.

    “News organizations can no longer afford to ignore that they have to protect their journalists, their sources and even their readers,” Timm says."

    http://mashable.com/2014/05/27/micah-lee-greenwald-snowden

    #security #NSA #cryptography #journalism

  • Google’s Attempts at Trademarking ’Glass’ Are Failing - Technologist
    http://blogs.findlaw.com/technologist/2014/04/googles-attempts-at-trademarking-glass-are-failing.html

    Google Glass has been making headlines lately for where it is getting banned, and the company might soon be able to add another place — the United States Patent and Trademark Office ("USPTO"). OK, a “ban” is too strong a word, but all of Google’s efforts thus far to trademark the word “Glass” have fallen short.

    Google Glass Trademark

    Google has already successfully registered the term “Google Glass,” but it is now trying to register the term “Glass” in that slick typeface you may have seen online. But so far, the USPTO has not registered the trademark. Instead, it sent Google a letter citing several issues with the trademark application.

    #droit_d'auteur

  • #Tribeca and #CERN Want To Change #Storytelling With A #Hackathon
    http://www.fastcolabs.com/3026107/tribeca-and-cern-want-to-change-storytelling-with-a-hackathon

    Backed by the nonprofit Tribeca Film Institute and CERN, the European Organization for Nuclear Research, the five-day “Story Matter” hackathon is the first international, science-focused edition of the Tribeca Hacks program. Intended to birth interactive stories, the program will cross-pollinate filmmakers and scientists to create long-term, forward-thinking media using visuals and data, not just traditional formats like film.

    as @baroug put it, after analyzing my DNA they classified me as #TECHNOLOGIST; time will tell (maybe #rien)