https://www.wired.com

  • Everything We Know About How the FBI Hacks People
    https://www.wired.com/2016/05/history-fbis-hacking

    Recent headlines warn that the government now has greater authority to hack your computers, in and outside the US. Changes to federal criminal court procedures known as Rule 41 are to blame; they vastly expand how and whom the FBI can legally hack. But just like the NSA’s hacking operations, FBI hacking isn’t new. In fact, the bureau has a long history of surreptitiously hacking us, going back two (...)

    #FBI #hacking #Carnivore #surveillance

  • JavaScript Conquered the Web. Now It’s Taking Over the Desktop | WIRED
    http://www.wired.com/2016/05/javascript-conquered-web-now-taking-desktop

    JavaScript was originally created in 1995 to give web pages a little more pep than the <blink> tag could provide. Today it has far more powerful uses. Companies like Google and Facebook build complex, desktop-like web applications with JavaScript; since the launch of Node.js in 2009, it’s also become one of the most popular languages for building server-side software. Today, even the web isn’t big enough to contain JavaScript’s versatility: it’s now making its way into applications for the desktop.

    Electron is a software development platform created by GitHub that lets developers use JavaScript along with other web technologies like HTML and CSS to create desktop applications that can run on Windows, Mac OS X, and Linux. The company released the first full version of Electron yesterday. But some of tech’s biggest names have already put the tool to work to push JavaScript beyond the browser.

    Last year, Microsoft released a code editor called Visual Studio Code that was built using Electron. Workplace chat unicorn Slack uses Electron to build its desktop client. The startup Nylas (formerly known as Inbox) used Electron to build an entire email client.

    Electron Quick Start
    http://electron.atom.io/docs/tutorial/quick-start
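
    To give a concrete idea of what the quick start above boils down to, here is a minimal sketch of an Electron main process. It is illustrative only: the file names (main.js, index.html), the window size, and the run commands are placeholder assumptions, not details from the article.

      // main.js: a minimal Electron main process (sketch; file names are placeholders)
      const { app, BrowserWindow } = require('electron');

      let win; // keep a reference so the window is not garbage-collected

      function createWindow () {
        // A native OS window whose contents are rendered by Chromium
        win = new BrowserWindow({ width: 800, height: 600 });
        // The app's UI is an ordinary web page written in HTML, CSS, and JavaScript
        win.loadURL(`file://${__dirname}/index.html`);
      }

      // Fired once Electron has finished initializing
      app.on('ready', createWindow);

      // Quit when all windows are closed, except on macOS where apps conventionally stay active
      app.on('window-all-closed', () => {
        if (process.platform !== 'darwin') app.quit();
      });

    With Electron installed as a project dependency and a package.json whose "main" field points at this file, running the electron binary against the project directory opens a native window on Windows, Mac OS X, or Linux, with all of the application logic written in web technologies.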

    #javascript #programmer #desktop

  • The Ingenious Way Iranians Are Using Satellite TV to Beam in Banned Internet | WIRED
    http://www.wired.com/2016/04/ingenious-way-iranians-using-satellite-tv-beam-banned-data

    By broadcasting on its own satellite TV channel and distributing a piece of Windows desktop software that can decode that satellite video stream, the Toosheh project sends thousands of Iranians a daily digital bundle of news articles, videos, and audio—everything from Persian music videos to critical news coverage of the Iranian Revolutionary Guard.

  • Five burning questions about Magic Leap after Wired’s huge profile | The Verge
    http://www.theverge.com/2016/4/19/11459498/five-burning-questions-about-magic-leap-after-wireds-huge-profile

    Wired ran an enormous profile on mysterious AR startup Magic Leap today, written by legendary tech journalist Kevin Kelly. It’s incredible, and you should read it, if only because Kelly’s obvious love and enthusiasm for virtual and augmented reality is infectious and energizing.

    But the piece also raises many, many more questions about Magic Leap than it answers — and given the extreme opacity that’s surrounded Magic Leap, that’s pretty notable. (To catch you up: Magic Leap is a secretive company that’s raised over a billion dollars in funding from #Silicon_Valley giants like Google and Andreessen Horowitz, but it’s never given a public demo

    http://www.wired.com/2016/04/magic-leap-vr

    #son #réalité_virtuelle #hype with Peter Jackson and Neal Stephenson

  • #Internet by #Satellite Is a Space Race With No Winners | WIRED
    https://www.wired.com/2015/06/elon-musk-space-x-satellite-internet

    “These large constellations are very inefficient,” says Roger Rusch, a satellite communications industry analyst. He acknowledges that the small satellites SpaceX and OneWeb hope to use are less expensive today than they were in the 1990s, but says they’re still too costly. “They’re cheaper, but you need 4,000 of them, so they need to be 1,000 times cheaper,” he says.

    #Facebook CEO Mark Zuckerberg mentioned the possibility of providing satellite internet service via the Internet.org initiative (a non-profit he co-founded to expand internet access throughout the world) in a blog post last year. But he’s already shelved the idea due to its cost, according to The Information.

  • Planes hackable via WiFi
    http://d4n3ws.polux-hosting.com/2015/04/19/des-avions-piratables-par-wifi

    The GAO (Government Accountability Office) is the US government audit agency, the equivalent of France’s Cour des comptes.

    It has issued a report indicating that the 787 Dreamliner (Boeing), A350, and A380 aircraft are potentially vulnerable to hacking via the onboard WiFi network.

    These aircraft let their passengers connect to the Internet via a WiFi network deployed inside the plane.
    The pilots’ avionics/navigation system is separated from the network offered to passengers only by a firewall, which the GAO considers insufficient; it wants true air gaps between the two networks.

    Beyond the possible scenario of a kamikaze hacker breaking into the pilots’ network, the GAO also cites the scenario of an ordinary passenger whose malware-infected machine becomes the relay for an attack.

    Armored door versus firewall.

    The Wired article: https://www.wired.com/2015/04/hackers-commandeer-new-planes-passenger-wi-fi

    #Airbus #Boeing #Firewall #Government_Accountability_Office #Sécurité_informatique #Vulnérabilité_(informatique) #Wi-Fi

  • The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed | WIRED
    http://www.wired.com/2014/10/content-moderation
    By Adrian Chen

    The campuses of the tech industry are famous for their lavish cafeterias, cushy shuttles, and on-site laundry services. But on a muggy February afternoon, some of these companies’ most important work is being done 7,000 miles away, on the second floor of a former elementary school at the end of a row of auto mechanics’ stalls in Bacoor, a gritty Filipino town 13 miles southwest of Manila.

    Baybayan is part of a massive labor force that handles “content moderation”—the removal of offensive material—for US social-networking sites. As social media connects more people more intimately than ever before, companies have been confronted with the Grandma Problem: Now that grandparents routinely use services like Facebook to connect with their kids and grandkids, they are potentially exposed to the Internet’s panoply of jerks, racists, creeps, criminals, and bullies.

    So companies like Facebook and Twitter rely on an army of workers employed to soak up the worst of humanity in order to protect the rest of us. And there are legions of them—a vast, invisible pool of human labor. Hemanshu Nigam, the former chief security officer of MySpace who now runs online safety consultancy SSP Blue, estimates that the number of content moderators scrubbing the world’s social media sites, mobile apps, and cloud storage services runs to “well over 100,000”—that is, about twice the total head count of Google and nearly 14 times that of Facebook.

    This work is increasingly done in the Philippines. A former US colony, the Philippines has maintained close cultural ties to the United States, which content moderation companies say helps Filipinos determine what Americans find offensive. And moderators in the Philippines can be hired for a fraction of American wages.

    Here in the former elementary school, Baybayan and his coworkers are screening content for Whisper, an LA-based mobile startup—recently valued at $200 million by its VCs—that lets users post photos and share secrets anonymously. They work for a US-based outsourcing firm called TaskUs. It’s something of a surprise that Whisper would let a reporter in to see this process. When I asked Microsoft, Google, and Facebook for information about how they moderate their services, they offered vague statements about protecting users but declined to discuss specifics.

    I was given a look at the Whisper moderation process because Michael Heyward, Whisper’s CEO, sees moderation as an integral feature and a key selling point of his app. Whisper practices “active moderation,” an especially labor-intensive process in which every single post is screened in real time; many other companies moderate content only if it’s been flagged as objectionable by users, which is known as reactive moderating.

    A list of categories, scrawled on a whiteboard, reminds the workers of what they’re hunting for: pornography, gore, minors, sexual solicitation, sexual body parts/images, racism.

    While a large amount of content moderation takes place overseas, much is still done in the US, often by young college graduates like Swearingen was. Many companies employ a two-tiered moderation system, where the most basic moderation is outsourced abroad while more complex screening, which requires greater cultural familiarity, is done domestically. US-based moderators are much better compensated than their overseas counterparts: A brand-new American moderator for a large tech company in the US can make more in an hour than a veteran Filipino moderator makes in a day. But then a career in the outsourcing industry is something many young Filipinos aspire to, whereas American moderators often fall into the job as a last resort, and burnout is common.

    “Everybody hits the wall, generally between three and five months,” says a former YouTube content moderator I’ll call Rob. “You just think, ‘Holy shit, what am I spending my day doing? This is awful.’”

    But as months dragged on, the rough stuff began to take a toll. The worst was the gore: brutal street fights, animal torture, suicide bombings, decapitations, and horrific traffic accidents. The Arab Spring was in full swing, and activists were using YouTube to show the world the government crackdowns that resulted. Moderators were instructed to leave such “newsworthy” videos up with a warning, even if they violated the content guidelines. But the close-ups of protesters’ corpses and street battles were tough for Rob and his coworkers to handle. So were the videos that documented misery just for the sick thrill of it.

    “If someone was uploading animal abuse, a lot of the time it was the person who did it. He was proud of that,” Rob says. “And seeing it from the eyes of someone who was proud to do the fucked-up thing, rather than news reporting on the fucked-up thing—it just hurts you so much harder, for some reason. It just gives you a much darker view of humanity.”

    In Manila, I meet Denise (not her real name), a psychologist who consults for two content-moderation firms in the Philippines. “It’s like PTSD,” she tells me as we sit in her office above one of the city’s perpetually snarled freeways. “There is a memory trace in their mind.”

      #travail #inégalités #réseaux_sociaux #surveillance #censure #philippines #silicon_valley #Google #Facebook #moderation

      He also got a fascinating glimpse into the inner workings of #YouTube. For instance, in late 2010, Google’s legal team gave moderators the urgent task of deleting the violent sermons of American radical Islamist preacher Anwar al-Awlaki, after a British woman said she was inspired by them to stab a politician.

      #santé_mentale