company:google

  • Top 5 #kotlin #programming Courses for #java and #android Programmers
    https://hackernoon.com/top-5-kotlin-programming-courses-for-java-and-android-programmers-49e842

    Kotlin vs Java (https://kotlinlang.org) If you don’t know Kotlin, it’s a relatively new programming language that makes programming on Android and Java easier. It is Android’s official application development language, is 100% compatible with Java, and removes some of Java’s pain points. Ever since Google announced Kotlin as the official language for Android development, I have received a lot of queries from my readers about whether Java developers should learn Kotlin now, and which is the better starting point for Android development: Kotlin or Java. I answered that question in my last article, but I am still receiving a lot of queries about learning Kotlin and whether a Java developer should learn it or not. Well, to be honest with you, being a polyglot programmer, i.e. a programmer who knows (...)

    #software-development


  • Control over personal digital data is a matter of collective freedom
    https://www.lemonde.fr/pixels/article/2018/10/19/le-controle-des-donnees-numeriques-personnelles-est-un-enjeu-de-liberte-coll

    Revelations of security flaws affecting online services keep piling up, and the collection of our data poses a collective risk of real magnitude. It is a litany. Facebook admitted on Friday, October 12, that the personal data of 29 million users had been stolen by hackers. Four days earlier, its competitor Google disclosed that a flaw had exposed half a million Google+ users. These are only the most recent examples. (...)

    #Adidas #Altaba/Yahoo ! #BritishAirways #CambridgeAnalytica #Equifax #Target #AshleyMadison.com #Uber #algorithme #manipulation #bénéfices #BigData #hacking (...)



  • Tip of the Week #153: Don’t use using-directives
    https://abseil.io/tips/153

    Originally posted as TotW #153 on July 17, 2018

    by Roman Perepelitsa (roman.perepelitsa@gmail.com) and Ashley Hedberg (ahedberg@google.com)

    I view using-directives as time-bombs, both for the parties that deal in them and the language system. – Ashley Hedberg with apologies to Warren Buffett

    tl;dr

    Using-directives (using namespace foo) are dangerous enough to be banned by the Google style guide. Don’t use them in code that will ever need to be upgraded.

    These are not to be confused with using-declarations (using ::foo::SomeName), which are permitted in *.cc files.

    Using-directives at Function Scope

    What do you think this code does?

    namespace totw {
    namespace example {
    namespace {

    TEST(MyTest, UsesUsingDirectives) {
      using namespace ::testing;
      Sequence seq;  // ::testing::Sequence
      (...)
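    The difference the tip draws can be sketched in a small, self-contained example (the namespaces lib and app here are hypothetical, not from the tip itself). At function scope, a using-directive makes lib's names visible only as if they were declared in the common enclosing (global) namespace, so a nearer declaration still wins unqualified lookup; a using-declaration instead introduces the one named entity into the block itself, hiding outer declarations:

    ```cpp
    #include <cassert>

    // Hypothetical library namespace, for illustration only.
    namespace lib {
    int Size(double) { return 1; }
    }  // namespace lib

    namespace app {
    int Size(int) { return 2; }

    int WithDirective() {
      using namespace lib;  // lib's names act as if declared at global scope,
      return Size(3);       // so the nearer app::Size(int) still wins lookup
    }

    int WithDeclaration() {
      using lib::Size;      // block-scope declaration hides app::Size here
      return Size(3);       // calls lib::Size(double) via int -> double
    }
    }  // namespace app

    int main() {
      assert(app::WithDirective() == 2);
      assert(app::WithDeclaration() == 1);
      return 0;
    }
    ```

    The time-bomb aspect is that the directive's behavior depends on every name lib will ever contain: adding a declaration to lib later can silently change or break unqualified lookups in code far away, which is why the style guide allows only the narrowly targeted using-declaration.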


  • Chinese search firm Baidu joins global AI ethics body
    https://www.theguardian.com/technology/2018/oct/17/baidu-chinese-search-firm-joins-global-ai-ethics-body-google-apple-face

    Company is first Chinese member of Partnership on AI, following Google, Apple, Facebook and others. The AI ethics body formed by five of the largest US corporations has expanded to include its first Chinese member, the search firm Baidu. The Partnership on Artificial Intelligence to Benefit People and Society – known as the Partnership on AI (PAI) – was formed in 2016 by Google, Facebook, Amazon, IBM and Microsoft to act as an umbrella organisation for the five companies to conduct (...)

    #Google #Microsoft #IBM #Amazon #Baidu #algorithme #éthique


  • It turns out that Facebook could in fact use data collected from its Portal in-home video device to target you with ads
    https://www.recode.net/2018/10/16/17966102/facebook-portal-ad-targeting-data-collection

    Who you call and what apps you use could determine what ads you see. Facebook announced Portal last week, its take on the in-home, voice-activated speaker to rival competitors from Amazon, Google and Apple. The biggest question surrounding the device: Why should anyone trust Facebook enough to put Facebook-powered microphones and video cameras in their living room or kitchen? Given Facebook’s year of privacy and security issues, privacy around the device — including what data Facebook (...)

    #Facebook #Messenger #domotique #Portal #écoutes #publicité #profiling



  • #PeerTube 1.0 : the free/libre and federated video platform
    https://framablog.org/2018/10/15/peertube-1-0-the-free-libre-and-federated-video-platform

    At the end of 2014, the French non-profit association #Framasoft started a crazy challenge: what if we de-google-ified the Internet? [For French version of this article, see here] Three years later, more than thirty alternative #Services to Google, Facebook, and … Read more

    #Contributopia #Dégooglisons_Internet #Fédération #Libres_Logiciels #Migration #contributopia #Degooglisons #Planet #RezoTIC #YouTube



  • The Growth of Sinclair’s Conservative Media Empire | The New Yorker
    https://www.newyorker.com/magazine/2018/10/22/the-growth-of-sinclairs-conservative-media-empire

    Sinclair is the largest owner of television stations in the United States, with a hundred and ninety-two stations in eighty-nine markets. It reaches thirty-nine per cent of American viewers. The company’s executive chairman, David D. Smith, is a conservative whose views combine a suspicion of government, an aversion to political correctness, and strong libertarian leanings. Smith, who is sixty-eight, has a thick neck, deep under-eye bags, and a head of silvery hair. He is an enthusiast of fine food and has owned farm-to-table restaurants in Harbor East, an upscale neighborhood in Baltimore. An ardent supporter of Donald Trump, he has not been shy about using his stations to advance his political ideology. Sinclair employees say that the company orders them to air biased political segments produced by the corporate news division, including editorials by the conservative commentator Mark Hyman, and that it feeds interviewers questions intended to favor Republicans.

    In some cases, anchors have been compelled to read from scripts prepared by Sinclair. In April, 2018, dozens of newscasters across the country parroted Trump’s invectives about “fake news,” saying, “Some members of the media use their platforms to push their own personal bias and agenda to control exactly what people think. This is extremely dangerous to our democracy.” In response, Dan Rather, the former anchor of “CBS Evening News,” wrote, on Twitter, “News anchors looking into camera and reading a script handed down by a corporate overlord, words meant to obscure the truth not elucidate it, isn’t journalism. It’s propaganda. It’s Orwellian. A slippery slope to how despots wrest power, silence dissent, and oppress the masses.”

    It’s unclear whether Sinclair is attempting to influence the politics of its viewers or simply appealing to positions that viewers may already have—or both. Andrew Schwartzman, a telecommunications lecturer at Georgetown Law School, told me, “I don’t know where their personal philosophy ends and their business goals begin. They’re not the Koch brothers, but they reflect a deep-seated conservatism and generations of libertarian philosophy that also happen to help their business.”

    Sinclair has even greater ambitions for expansion. In May, 2017, the company announced a proposed $3.9-billion merger between Sinclair and Tribune Media Company, which owns forty-two television stations. The merger would make Sinclair far larger than any other broadcaster in the country, with stations beaming into seventy per cent of American households. The proposal alarmed regulatory and free-speech experts. Michael Copps, a former official at the Federal Communications Commission, told me, “One of the goals of the First Amendment is to make sure the American people have the news and information they need to make intelligent decisions about our democracy, and I think we’re pretty close to a situation where the population lacks the ability to do that. That’s the whole premise of self-government.” He went on, “There are a lot of problems facing our country, but I don’t know one as important as this. When you start dismantling our news-and-information infrastructure, that’s poison to self-government and poison to democracy.”

    In subsequent years, Smith took measures to deepen Sinclair’s influence among policymakers, apparently recognizing that the company’s profits were dependent upon regulatory decisions made in Washington. One of Smith’s first notable forays into politics was his support for Robert Ehrlich, Jr., a Republican congressman who represented Maryland from 1995 until 2003. Sinclair became a top donor to Ehrlich and, in 2001, Ehrlich sent the first of several letters on Sinclair’s behalf to Michael Powell, who had recently become the chair of the F.C.C. The commission was investigating a request from Sinclair to buy a new group of stations, and Ehrlich protested the “unnecessary delays on pending applications.” The F.C.C.’s assistant general counsel responded that Ehrlich’s communication had violated procedural rules. Ehrlich sent another message, alleging that the delays were politically motivated and threatening to “call for a congressional investigation into this matter.” He added, “Knowing that you have served as Chairman for a few short months, we would prefer to give you an opportunity to address these concerns.” The proposed acquisitions were approved.

    A former general-assignment reporter at the station, Jonathan Beaton, told me, “Almost immediately, I could tell it was a very corrupt culture, where you knew from top down there were certain stories you weren’t going to cover. They wanted you to keep your head down and not upset the fruit basket. I’m a Republican, and I was still appalled by what I saw at Sinclair.” Beaton characterized the man-on-the-street segments as “Don’t forget to grab some random poor soul on the street and shove a microphone in their face and talk about what the Democrats have done wrong.” He said that reporters generally complied because of an atmosphere of “intimidation and fear.”

    After Trump’s victory, it looked as though Sinclair’s investment in the candidate would pay off. In January, 2017, Trump appointed Ajit Pai, a vocal proponent of media deregulation, to be the chair of the F.C.C. Pai, formerly an associate general counsel at Verizon and an aide to Senators Jeff Sessions and Sam Brownback, was exactly the sort of commission head that Sinclair had been hoping for. He believed that competition from technology companies such as Google had made many government restrictions on traditional media irrelevant—an argument that echoed Smith’s views on ownership caps and other regulations. Sinclair executives quickly tried to cultivate a relationship with Pai; shortly after the election, he addressed a gathering of Sinclair managers at the Four Seasons in Baltimore. He also met with David Smith and Sinclair’s C.E.O., Christopher Ripley, the day before Trump’s Inauguration.

    It’s not unusual for business executives to meet with the chair of the F.C.C., but Pai soon announced a series of policy changes that seemed designed to help Sinclair. The first was the reinstatement of the ultrahigh-frequency discount, an arcane rule that digital technology had rendered obsolete. The move served no practical purpose, but it freed Sinclair to acquire many more stations without bumping up against the national cap.

    The F.C.C. soon made other regulatory modifications that were helpful to Sinclair. It eliminated a rule requiring television stations to maintain at least one local studio in licensed markets, essentially legitimatizing Sinclair’s centralized news model. Perhaps most perniciously, Pai took steps toward approving a new broadcast-transmission standard called Next Gen TV, which would require all consumers in the U.S. to purchase new televisions or converter devices. A subsidiary of Sinclair owns six patents necessary for the new standard, which could mean billions of dollars in earnings for the company. Jessica Rosenworcel, the sole Democratic commissioner at the F.C.C., told me, “It’s striking that all of our media policy decisions seem almost custom-built for this one company. Something is wrong.” Rosenworcel acknowledged that many F.C.C. policies need to be modernized, but, she said, “broadcasting is unique. It uses the public airwaves, it’s a public trust.” She added, “I don’t think those ideas are retrograde. They are values we should sustain.”

    The F.C.C. and the D.O.J. both warned Sinclair about the dummy divestitures, insisting that the company find independent owners in ten problematic markets. According to a lawsuit later filed by Tribune, instead of taking steps to appease regulators, Sinclair executives “antagonized DOJ and FCC staff” by acting “confrontational” and “belittling.” The company offered to make sales in only four of the markets, and told the Justice Department that it would have to litigate for any further concessions. One Sinclair lawyer told government representatives, “Sue me.” There was no tactical reason for Sinclair to take such a combative and self-sabotaging stance. Instead, the episode seemed to reflect how Trump’s own corruption and conflicts of interest have filtered into the business community. One industry expert who followed the proceedings closely told me that the company clearly “felt that, with the President behind them, why would the commission deny them anything?”

    Then, in April, the Web site Deadspin edited the broadcasts of Sinclair anchors reciting the script about fake news into one terrifying montage, with a tapestry of anchors in different cities speaking in unison. The video ignited public outrage, and Trump tweeted a defense of Sinclair, calling it “far superior to CNN and even more Fake NBC, which is a total joke.” (In a statement, a spokesperson for Sinclair said, “This message was not presented as news and was not intended to be political—there was no mention of President Trump, political parties, policy issues, etc. It was a business objective centered on attracting more viewers.”)

    #Médias #Concentration #Dérégulation #Etats-Unis #Sinclair


  • #Paywall : The Business of Scholarship

    Paywall: The Business of Scholarship, produced by #Jason_Schmitt, focuses on the need for open access to research and science, questions the rationale behind the $25.2 billion a year that flows into for-profit academic publishers, examines the 35-40% profit margin of the top academic publisher, Elsevier, and looks at how that margin is often greater than those of some of the most profitable tech companies, like Apple, Facebook and Google. For more information please visit: Paywallthemovie.com


    https://vimeo.com/273358286

    #édition_scientifique #université #documentaire #film #elsevier #profit #capitalisme #savoir
    pointed out by @fil, whom I thank

    • I'm very surprised by a point mentioned several times: research on health, global warming, etc. (in short, everything with "a real impact") must be open, because there are real problems and therefore people are needed to work on them.... But I heard little about fundamental research... I don't know whether that's because fundamental research, being less "lucrative", makes the problem less flagrant... But this separation between "the real problems of life" and fundamental questions that interest hardly anyone makes me uneasy....


  • Amazon’s Echo May Be Able To Read Your Emotions - The Atlantic
    https://www.theatlantic.com/technology/archive/2018/10/alexa-emotion-detection-ai-surveillance/572884

    Amazon has patented technology that would allow its devices to read your emotional and physical state, and sell advertisements based on them. Are we entering the era of the mood-targeted ad?

    Patents are not products, of course—but they can offer insight into how companies will approach emergent tech. In this case, the patent hints at new possibilities for dynamic targeted advertising in its always-on line of products. The patent lays out an example: Say you tell Alexa you’re hungry, and she can tell by the sniffle in your voice that you’re coming down with something. She can then ask if you want a recipe for chicken soup, or she can ask a question “associated with particular advertisers.” Perhaps Panera wants to tell you about its soups.

    Amazon isn’t the only technology company to pursue technology that takes full advantage of these emotional windows. Google has a similar patent, for a method to augment devices to detect negative emotions then automatically suggest advice. IBM has one that would help search engines return web results based on the user’s “current emotional state.” Searching for “good podcasts,” “football,” or “events near me,” for example, would return different results based on user mood, as determined via face recognition in the webcam, a scan of the person’s heart rate or—and this is where the “patents are not products” disclaimer must be emphasized most heavily—the “user’s brain waves.”

    Spotify, meanwhile, is already practicing a type of dynamic emotional targeting all its own. Starting in 2014, it began associating playlists with different moods and events, selling ad space to companies based on the associations. An Adele-centric playlist may be a dead giveaway for emotional turmoil, so products associated with sadness (ice cream, tissues) would be recommended. A hip-hop heavy playlist might come with a “block party” association, and Spotify would suggest the playlist for a company advertising barbecue sauce, and so on.

    The purpose of profiling is to sell products. Each of us is made up of dozens of marketable categories. Dynamic emotional targeting ups the ante: Now we are a collection of categories both stable (gender, age, residence) and in flux (mental and emotional states), and our devices are eager to hear all about it.

    #Emotions #Publicité #Brevets #Amazon #IBM #Google #Spotify


  • From the birth of computing to Amazon : why tech’s gender problem is nothing new
    https://www.theguardian.com/technology/2018/oct/11/tech-gender-problem-amazon-facebook-bias-women

    Decades after women were pushed out of programming, Amazon’s AI recruiting technology carried on the industry’s legacy of bias. A recent report revealed Amazon’s AI recruiting technology developed a bias against women because it was trained predominantly on men’s resumes. Although Amazon shut the project down, this kind of mechanized sexism is common and growing – and the problem isn’t limited to AI mishaps. Facebook allows the targeting of job ads by gender, resulting in discrimination in (...)

    #Alphabet #Google #Amazon #Facebook #algorithme #BigData #discrimination #GAFAM


  • Google Drops Out of Pentagon’s $10 Billion Cloud Competition
    https://www.bloomberg.com/news/articles/2018-10-08/google-drops-out-of-pentagon-s-10-billion-cloud-competition

    Alphabet Inc.’s Google has decided not to compete for the Pentagon’s cloud-computing contract valued at as much as $10 billion, saying the project may conflict with its corporate values. The project, known as the Joint Enterprise Defense Infrastructure cloud, or JEDI, involves transitioning massive amounts of Defense Department data to a commercially operated cloud system. Companies are due to submit bids for the contract, which could last as long as 10 years, on Oct. 12. Read more: Why (...)

    #Alphabet #Google #Amazon #algorithme #bénéfices #cloud #profiling #USDepartmentOfDefense


  • UK high court blocks mass privacy action against Google
    https://www.theguardian.com/technology/2018/oct/08/uk-high-court-blocks-mass-privacy-action-against-google

    Tech company faced claims it gathered personal data from more than 4m iPhone users. The high court has blocked a mass lawsuit against Google that aimed to collect as much as £3bn in compensation for the company’s historical practice of collecting data on iPhone users whose privacy settings should have prevented surveillance. Mr Justice Warby, sitting in London, announced his decision on Monday. The litigation was brought by the campaign group Google You Owe Us, led by the former Which? (...)

    #Apple #algorithme #iPhone #terms #procès #profiling #données


  • Former Google boss urges tech giants to end the delusion that it’s making the world a better place | Daily Mail Online
    https://www.dailymail.co.uk/news/article-6245847/Former-Google-boss-urges-tech-giants-end-delusion-making-world-better-p

    She said: ‘You can’t go about telling your advertisers that you can target users down to the tiniest pixel, but then throw your hands up in front of the politicians and say your machines can’t figure out if bad actors are using your platform.’

    Google has been widely criticised for allowing jihadists, far-Right extremists and other hate preachers to post content on its YouTube video platform. In some cases, it funnelled cash from advertisers to the extremists posting videos.


  • The Global Internet Phenomena Report
    https://www.sandvine.com/hubfs/downloads/phenomena/2018-phenomena-report.pdf

    The data in this edition of the Global Internet Phenomena Report is drawn from Sandvine’s installed base of over 150 Tier 1 and Tier 2 fixed and mobile operators worldwide. The report does not include significant data from either China or India, but the data represents a portion of Sandvine’s installed base of 2.1B subscribers, a statistically significant segment of the internet population.

    This edition combines fixed and mobile data into a single comprehensive view of internet traffic (...)

    #Google #Nest #Amazon #Amazon's_Prime #AWS #BitTorrent #Facebook #cryptage #Alexa #Siri #Nest_Learning_Thermostat #domination #thermostat #cloud #jeu (...)



  • How mapmakers help indigenous people defend their lands
    https://www.nationalgeographic.com/culture-exploration/2018/10/indigenous-cultures-mapping-projects-reclaim-lands-columbus

    One early project in the 1990s focused on the remote Darién region of Panama. Official maps of the area contained little detail—the persistent cloud cover and dense rainforest canopy were impenetrable to the satellite imagery and aerial photos that government cartographers used to make their maps. But to the three main indigenous groups in the region, Emberá, the Wounaan, and the Guna, the land was filled with landmarks.

    The organization’s approach was simple: ask indigenous people to draw detailed maps of their lands, and then get professional cartographers to incorporate this information into modern, geographically accurate maps.

    To map the Darién, indigenous leaders selected men from communities in the region to act as surveyors. The surveyors then set out by bus, by canoe, or on foot, armed with pencils, pens, and blank sheets of manila paper to sketch the local waterways and other landmarks. In collaboration with villagers and their leaders they carefully drew maps that included things of importance to their communities that wouldn’t typically appear on government maps, like hunting and fishing grounds, or places where firewood, fruit, or medicine were gathered. They often chose to leave out cemeteries and sacred sites, preferring to keep that knowledge within their communities. The quality of these maps varies considerably, but the best of them are works of art, Chapin says (see below).

    #cartographie #cartographie_participative #territoire #peuples_autochtones


  • Official documents prove: Israel bans young Americans based on Canary Mission website - Israel News - Haaretz.com

    Some Americans detained upon arrival in Israel reported being questioned about their political activity based on ‘profiles’ on the controversial website Canary Mission. Documents obtained by Haaretz now clearly show that it is indeed a source of information for decisions to bar entry

    Noa Landau
    Oct 04, 2018

    https://www.haaretz.com/israel-news/.premium-official-documents-prove-israel-bans-young-americans-based-on-cana

    The Strategic Affairs and Public Diplomacy Ministry is using simple Google searches, mainly the controversial American right-wing website Canary Mission, to bar political activists from entering Israel, according to documents obtained by Haaretz.
    The internal documents, some of which were submitted to the appeals tribunal in the appeal against the deportation of American student Lara Alqasem, show that officials briefly interviewed Alqasem, 22, at Ben-Gurion International Airport on her arrival Tuesday night, then passed her name on for “continued handling” by the ministry because of “suspicion of boycott activity.” Israel recently passed a law banning the entry of foreign nationals who engage in such activity.


    Links to Canary Mission and Facebook posts are seen on an official Ministry of Strategic Affairs document.
    The ministry then sent the officials at the airport an official report classified “sensitive” about Alqasem’s supposed political activities, which included information from five links – four from Facebook and one, the main source, from the Canary Mission site, which follows pro-Palestinian activists on U.S. campuses.

    A decision on Alqasem’s appeal against her deportation was expected Thursday afternoon.
    Canary Mission, now the subject of major controversy in the American Jewish community, has been collecting information since 2015 about BDS activists at universities, and sends the information to potential employers. Pro-Israel students have also criticized its activities.

    This week, the American Jewish news site The Forward reported that at least $100,000 of Canary Mission’s budget had been contributed through the San Francisco Jewish Federation and the Helen Diller Family Foundation, which donates to Jewish education. The donation was handed to a group registered in Beit Shemesh called Megamot Shalom, specifically stating that it was for Canary Mission. A few hours after the report was published, the federation announced that it would no longer fund the group.
    Over the past few months some of the Americans who have been detained for questioning upon arrival in Israel have reported that they were questioned about their political activity based on “profiles” about them published on Canary Mission. The documents obtained by Haaretz now show clearly that the site is indeed the No. 1 source of information for the decision to bar entry to Alqasem.
    According to the links that were the basis for the decision to suspend the student visa that Alqasem had been granted by the Israeli Consulate in Miami, she was president of the Florida chapter of a group called Students for Justice in Palestine, information quoted directly from the Canary Mission. The national arm of that organization, National Students for Justice in Palestine, is indeed on the list of 20 groups that the Strategic Affairs Ministry compiled as criteria to invoke the anti-boycott law. However, Alqasem was not a member at the national level, but rather a local activist. She told the appeals tribunal that the local chapter had only a few members.

    Canary Mission’s profile of Lara Alqasem.
    The ministry also cited as a reason for barring Alqasem’s entry to Israel a Facebook post showing that “In April 2016 [her] chapter conducted an ongoing campaign calling for the boycott of Sabra hummus, the American version of Hummus Tzabar, because Strauss, which owns Tzabar, funds the Golani Brigade.” Alqasem told the tribunal that she had not taken an active part in this campaign. Another link was about a writers’ petition calling on a cultural center to refuse sponsorship by Israel for its activities. Yet another post, by the local Students for Justice in Palestine, praised the fact that an international security company had stopped operations in Israel. None of these links quoted Alqasem.
    She told the tribunal that she is not currently a member of any pro-boycott group and would not come to study for her M.A. in Israel if she were.
    The Strategic Affairs Ministry report on Alqasem is so meager that even its writers acknowledged as much: “It should be noted that in this case we rely on a relatively small number of sources found on the Internet.” Over the past few months Haaretz has been following up reports of this nature that have been the basis for denying entry to activists, and found that in many other cases the material consisted of superficial Google searches and that the ministry, by the admission of its own senior officials, does not collect information from non-public sources.
    [Facebook post calling for the boycott of Sabra hummus]

    The ministry’s criteria for invoking the anti-boycott law state clearly that in order to bar entry to political activists, they must “hold senior or significant positions in the organizations,” including “official senior roles in prominent groups (such as board members).”
    But the report on Alqasem does not indicate that she met the criterion of “senior” official in the national movement, nor was this the case for other young people questioned recently at the airport. In some cases it was the Shin Bet security service that questioned people due to past participation in activities such as demonstrations in the territories, and not BDS activities.
    “Key activists,” according to the ministry’s criteria, also means people who “consistently take part in promoting BDS in the framework of prominent delegitimization groups or independently, and not, for example, an activist who comes as part of a delegation.” In Alqasem’s case, however, her visa was issued after she was accepted for study at Hebrew University.



  • Google is Working on Apple Magic Trackpad 2 Linux Support
    https://www.omgubuntu.co.uk/2018/09/google-working-on-apple-magic-trackpad-2-linux-support

    Google engineers are working to add Apple Magic Trackpad 2 Linux support to the mainline Linux kernel. Although it’s been 3 years since Apple announced the Magic Trackpad 2, Linux users have needed to rely on out-of-tree patches and drivers to use the multi-touch device with desktop Linux distributions like Ubuntu, in both wired and […] This post, Google is Working on Apple Magic Trackpad 2 Linux Support, was written by Joey Sneddon and first appeared on OMG! Ubuntu!.


  • Can Mark Zuckerberg Fix Facebook Before It Breaks Democracy? | The New Yorker
    https://www.newyorker.com/magazine/2018/09/17/can-mark-zuckerberg-fix-facebook-before-it-breaks-democracy

    Since 2011, Zuckerberg has lived in a century-old white clapboard Craftsman in the Crescent Park neighborhood, an enclave of giant oaks and historic homes not far from Stanford University. The house, which cost seven million dollars, affords him a sense of sanctuary. It’s set back from the road, shielded by hedges, a wall, and mature trees. Guests enter through an arched wooden gate and follow a long gravel path to a front lawn with a saltwater pool in the center. The year after Zuckerberg bought the house, he and his longtime girlfriend, Priscilla Chan, held their wedding in the back yard, which encompasses gardens, a pond, and a shaded pavilion. Since then, they have had two children, and acquired a seven-hundred-acre estate in Hawaii, a ski retreat in Montana, and a four-story town house on Liberty Hill, in San Francisco. But the family’s full-time residence is here, a ten-minute drive from Facebook’s headquarters.

    Occasionally, Zuckerberg records a Facebook video from the back yard or the dinner table, as is expected of a man who built his fortune exhorting employees to keep “pushing the world in the direction of making it a more open and transparent place.” But his appetite for personal openness is limited. Although Zuckerberg is the most famous entrepreneur of his generation, he remains elusive to everyone but a small circle of family and friends, and his efforts to protect his privacy inevitably attract attention. The local press has chronicled his feud with a developer who announced plans to build a mansion that would look into Zuckerberg’s master bedroom. After a legal fight, the developer gave up, and Zuckerberg spent forty-four million dollars to buy the houses surrounding his. Over the years, he has come to believe that he will always be the subject of criticism. “We’re not—pick your noncontroversial business—selling dog food, although I think that people who do that probably say there is controversy in that, too, but this is an inherently cultural thing,” he told me, of his business. “It’s at the intersection of technology and psychology, and it’s very personal.”

    At the same time, former Facebook executives, echoing a growing body of research, began to voice misgivings about the company’s role in exacerbating isolation, outrage, and addictive behaviors. One of the largest studies, published last year in the American Journal of Epidemiology, followed the Facebook use of more than five thousand people over three years and found that higher use correlated with self-reported declines in physical health, mental health, and life satisfaction. At an event in November, 2017, Sean Parker, Facebook’s first president, called himself a “conscientious objector” to social media, saying, “God only knows what it’s doing to our children’s brains.” A few days later, Chamath Palihapitiya, the former vice-president of user growth, told an audience at Stanford, “The short-term, dopamine-driven feedback loops that we have created are destroying how society works—no civil discourse, no coöperation, misinformation, mistruth.” Palihapitiya, a prominent Silicon Valley figure who worked at Facebook from 2007 to 2011, said, “I feel tremendous guilt. I think we all knew in the back of our minds.” Of his children, he added, “They’re not allowed to use this shit.” (Facebook replied to the remarks in a statement, noting that Palihapitiya had left six years earlier, and adding, “Facebook was a very different company back then.”)

    In March, Facebook was confronted with an even larger scandal: the Times and the British newspaper the Observer reported that a researcher had gained access to the personal information of Facebook users and sold it to Cambridge Analytica, a consultancy hired by Trump and other Republicans which advertised using “psychographic” techniques to manipulate voter behavior. In all, the personal data of eighty-seven million people had been harvested. Moreover, Facebook had known of the problem since December of 2015 but had said nothing to users or regulators. The company acknowledged the breach only after the press discovered it.

    We spoke at his home, at his office, and by phone. I also interviewed four dozen people inside and outside the company about its culture, his performance, and his decision-making. I found Zuckerberg straining, not always coherently, to grasp problems for which he was plainly unprepared. These are not technical puzzles to be cracked in the middle of the night but some of the subtlest aspects of human affairs, including the meaning of truth, the limits of free speech, and the origins of violence.

    Zuckerberg is now at the center of a full-fledged debate about the moral character of Silicon Valley and the conscience of its leaders. Leslie Berlin, a historian of technology at Stanford, told me, “For a long time, Silicon Valley enjoyed an unencumbered embrace in America. And now everyone says, Is this a trick? And the question Mark Zuckerberg is dealing with is: Should my company be the arbiter of truth and decency for two billion people? Nobody in the history of technology has dealt with that.”

    In 2002, Zuckerberg went to Harvard, where he embraced the hacker mystique, which celebrates brilliance in pursuit of disruption. “The ‘fuck you’ to those in power was very strong,” the longtime friend said. In 2004, as a sophomore, he embarked on the project whose origin story is now well known: the founding of Thefacebook.com with four fellow-students (“the” was dropped the following year); the legal battles over ownership, including a suit filed by twin brothers, Cameron and Tyler Winklevoss, accusing Zuckerberg of stealing their idea; the disclosure of embarrassing messages in which Zuckerberg mocked users for giving him so much data (“they ‘trust me.’ dumb fucks,” he wrote); his regrets about those remarks, and his efforts, in the years afterward, to convince the world that he has left that mind-set behind.

    New hires learned that a crucial measure of the company’s performance was how many people had logged in to Facebook on six of the previous seven days, a measurement known as L6/7. “You could say it’s how many people love this service so much they use it six out of seven days,” Parakilas, who left the company in 2012, said. “But, if your job is to get that number up, at some point you run out of good, purely positive ways. You start thinking about ‘Well, what are the dark patterns that I can use to get people to log back in?’ ”
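    The L6/7 metric described above has a simple operational definition: the number of users who logged in on at least six of the previous seven days. As a rough illustration, here is a minimal sketch of how such a metric could be computed from login records — the function name and data shapes are assumptions for illustration, not Facebook’s actual code.

    ```python
    from datetime import date, timedelta

    def l6_of_7(login_days_by_user, today):
        """Count users active on at least 6 of the previous 7 days.

        login_days_by_user: dict mapping a user id to the set of dates
        on which that user logged in.
        """
        # The window covers the seven days before `today`.
        window = {today - timedelta(days=i) for i in range(1, 8)}
        return sum(
            1
            for days in login_days_by_user.values()
            if len(days & window) >= 6
        )

    # Example: one user logged in on 6 of the last 7 days, another on only 3.
    today = date(2018, 9, 1)
    logins = {
        "alice": {today - timedelta(days=i) for i in range(1, 7)},  # 6 days
        "bob": {today - timedelta(days=i) for i in (1, 3, 5)},      # 3 days
    }
    print(l6_of_7(logins, today))  # → 1
    ```

    The point of the excerpt is precisely that once a single scalar like this becomes the performance target, teams optimize it by any means available — including the “dark patterns” Parakilas describes.
    
    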

    Facebook engineers became a new breed of behaviorists, tweaking levers of vanity and passion and susceptibility. The real-world effects were striking. In 2012, when Chan was in medical school, she and Zuckerberg discussed a critical shortage of organs for transplant, inspiring Zuckerberg to add a small, powerful nudge on Facebook: if people indicated that they were organ donors, it triggered a notification to friends, and, in turn, a cascade of social pressure. Researchers later found that, on the first day the feature appeared, it increased official organ-donor enrollment more than twentyfold nationwide.

    Sean Parker later described the company’s expertise as “exploiting a vulnerability in human psychology.” The goal: “How do we consume as much of your time and conscious attention as possible?” Facebook engineers discovered that people find it nearly impossible not to log in after receiving an e-mail saying that someone has uploaded a picture of them. Facebook also discovered its power to affect people’s political behavior. Researchers found that, during the 2010 midterm elections, Facebook was able to prod users to vote simply by feeding them pictures of friends who had already voted, and by giving them the option to click on an “I Voted” button. The technique boosted turnout by three hundred and forty thousand people—more than four times the number of votes separating Trump and Clinton in key states in the 2016 race. It became a running joke among employees that Facebook could tilt an election just by choosing where to deploy its “I Voted” button.

    These powers of social engineering could be put to dubious purposes. In 2012, Facebook data scientists used nearly seven hundred thousand people as guinea pigs, feeding them happy or sad posts to test whether emotion is contagious on social media. (They concluded that it is.) When the findings were published, in the Proceedings of the National Academy of Sciences, they caused an uproar among users, many of whom were horrified that their emotions may have been surreptitiously manipulated. In an apology, one of the scientists wrote, “In hindsight, the research benefits of the paper may not have justified all of this anxiety.”

    Facebook was, in the words of Tristan Harris, a former design ethicist at Google, becoming a pioneer in “persuasive technology.”

    Facebook had adopted a buccaneering motto, “Move fast and break things,” which celebrated the idea that it was better to be flawed and first than careful and perfect. Andrew Bosworth, a former Harvard teaching assistant who is now one of Zuckerberg’s longest-serving lieutenants and a member of his inner circle, explained, “A failure can be a form of success. It’s not the form you want, but it can be a useful thing to how you learn.” In Zuckerberg’s view, skeptics were often just fogies and scolds. “There’s always someone who wants to slow you down,” he said in a commencement address at Harvard last year. “In our society, we often don’t do big things because we’re so afraid of making mistakes that we ignore all the things wrong today if we do nothing. The reality is, anything we do will have issues in the future. But that can’t keep us from starting.”

    In contrast to a traditional foundation, an L.L.C. can lobby and give money to politicians, without as strict a legal requirement to disclose activities. In other words, rather than trying to win over politicians and citizens in places like Newark, Zuckerberg and Chan could help elect politicians who agree with them, and rally the public directly by running ads and supporting advocacy groups. (A spokesperson for C.Z.I. said that it has given no money to candidates; it has supported ballot initiatives through a 501(c)(4) social-welfare organization.) “The whole point of the L.L.C. structure is to allow a coördinated attack,” Rob Reich, a co-director of Stanford’s Center on Philanthropy and Civil Society, told me. The structure has gained popularity in Silicon Valley but has been criticized for allowing wealthy individuals to orchestrate large-scale social agendas behind closed doors. Reich said, “There should be much greater transparency, so that it’s not dark. That’s not a criticism of Mark Zuckerberg. It’s a criticism of the law.”

    The question of languages is fundamental when it comes to social networks

    Beginning in 2013, a series of experts on Myanmar met with Facebook officials to warn them that it was fuelling attacks on the Rohingya. David Madden, an entrepreneur based in Myanmar, delivered a presentation to officials at the Menlo Park headquarters, pointing out that the company was playing a role akin to that of the radio broadcasts that spread hatred during the Rwandan genocide. In 2016, C4ADS, a Washington-based nonprofit, published a detailed analysis of Facebook usage in Myanmar, and described a “campaign of hate speech that actively dehumanizes Muslims.” Facebook officials said that they were hiring more Burmese-language reviewers to take down dangerous content, but the company repeatedly declined to say how many had actually been hired. By last March, the situation had become dire: almost a million Rohingya had fled the country, and more than a hundred thousand were confined to internal camps. The United Nations investigator in charge of examining the crisis, which the U.N. has deemed a genocide, said, “I’m afraid that Facebook has now turned into a beast, and not what it was originally intended.” Afterward, when pressed, Zuckerberg repeated the claim that Facebook was “hiring dozens” of additional Burmese-language content reviewers.

    More than three months later, I asked Jes Kaliebe Petersen, the C.E.O. of Phandeeyar, a tech hub in Myanmar, if there had been any progress. “We haven’t seen any tangible change from Facebook,” he told me. “We don’t know how much content is being reported. We don’t know how many people at Facebook speak Burmese. The situation is getting worse and worse here.”

    I saw Zuckerberg the following morning, and asked him what was taking so long. He replied, “I think, fundamentally, we’ve been slow at the same thing in a number of areas, because it’s actually the same problem. But, yeah, I think the situation in Myanmar is terrible.” It was a frustrating and evasive reply. I asked him to specify the problem. He said, “Across the board, the solution to this is we need to move from what is fundamentally a reactive model to a model where we are using technical systems to flag things to a much larger number of people who speak all the native languages around the world and who can just capture much more of the content.”

    Reading newspapers or aggregators?

    I once asked Zuckerberg what he reads to get the news. “I probably mostly read aggregators,” he said. “I definitely follow Techmeme”—a roundup of headlines about his industry—“and the media and political equivalents of that, just for awareness.” He went on, “There’s really no newspaper that I pick up and read front to back. Well, that might be true of most people these days—most people don’t read the physical paper—but there aren’t many news Web sites where I go to browse.”

    A couple of days later, he called me and asked to revisit the subject. “I felt like my answers were kind of vague, because I didn’t necessarily feel like it was appropriate for me to get into which specific organizations or reporters I read and follow,” he said. “I guess what I tried to convey, although I’m not sure if this came across clearly, is that the job of uncovering new facts and doing it in a trusted way is just an absolutely critical function for society.”

    Zuckerberg and Sandberg have attributed their mistakes to excessive optimism, a blindness to the darker applications of their service. But that explanation ignores their fixation on growth, and their unwillingness to heed warnings. Zuckerberg resisted calls to reorganize the company around a new understanding of privacy, or to reconsider the depth of data it collects for advertisers.

    Antitrust

    In barely two years, the mood in Washington had shifted. Internet companies and entrepreneurs, formerly valorized as the vanguard of American ingenuity and the astronauts of our time, were being compared to Standard Oil and other monopolists of the Gilded Age. This spring, the Wall Street Journal published an article that began, “Imagine a not-too-distant future in which trustbusters force Facebook to sell off Instagram and WhatsApp.” It was accompanied by a sepia-toned illustration in which portraits of Zuckerberg, Tim Cook, and other tech C.E.O.s had been grafted onto overstuffed torsos meant to evoke the robber barons. In 1915, Louis Brandeis, the reformer and future Supreme Court Justice, testified before a congressional committee about the dangers of corporations large enough that they could achieve a level of near-sovereignty “so powerful that the ordinary social and industrial forces existing are insufficient to cope with it.” He called this the “curse of bigness.” Tim Wu, a Columbia law-school professor and the author of a forthcoming book inspired by Brandeis’s phrase, told me, “Today, no sector exemplifies more clearly the threat of bigness to democracy than Big Tech.” He added, “When a concentrated private power has such control over what we see and hear, it has a power that rivals or exceeds that of elected government.”

    When I asked Zuckerberg whether policymakers might try to break up Facebook, he replied, adamantly, that such a move would be a mistake. The field is “extremely competitive,” he told me. “I think sometimes people get into this mode of ‘Well, there’s not, like, an exact replacement for Facebook.’ Well, actually, that makes it more competitive, because what we really are is a system of different things: we compete with Twitter as a broadcast medium; we compete with Snapchat as a broadcast medium; we do messaging, and iMessage is default-installed on every iPhone.” He acknowledged the deeper concern. “There’s this other question, which is just, laws aside, how do we feel about these tech companies being big?” he said. But he argued that efforts to “curtail” the growth of Facebook or other Silicon Valley heavyweights would cede the field to China. “I think that anything that we’re doing to constrain them will, first, have an impact on how successful we can be in other places,” he said. “I wouldn’t worry in the near term about Chinese companies or anyone else winning in the U.S., for the most part. But there are all these places where there are day-to-day more competitive situations—in Southeast Asia, across Europe, Latin America, lots of different places.”

    The rough consensus in Washington is that regulators are unlikely to try to break up Facebook. The F.T.C. will almost certainly fine the company for violations, and may consider blocking it from buying big potential competitors, but, as a former F.T.C. commissioner told me, “in the United States you’re allowed to have a monopoly position, as long as you achieve it and maintain it without doing illegal things.”

    Facebook is encountering tougher treatment in Europe, where antitrust laws are stronger and the history of fascism makes people especially wary of intrusions on privacy. One of the most formidable critics of Silicon Valley is the European Union’s top antitrust regulator, Margrethe Vestager.

    In Vestager’s view, a healthy market should produce competitors to Facebook that position themselves as ethical alternatives, collecting less data and seeking a smaller share of user attention. “We need social media that will allow us to have a nonaddictive, advertising-free space,” she said. “You’re more than welcome to be successful and to dramatically outgrow your competitors if customers like your product. But, if you grow to be dominant, you have a special responsibility not to misuse your dominant position to make it very difficult for others to compete against you and to attract potential customers. Of course, we keep an eye on it. If we get worried, we will start looking.”

    Moderation

    As hard as it is to curb election propaganda, Zuckerberg’s most intractable problem may lie elsewhere—in the struggle over which opinions can appear on Facebook, which cannot, and who gets to decide. As an engineer, Zuckerberg never wanted to wade into the realm of content. Initially, Facebook tried blocking certain kinds of material, such as posts featuring nudity, but it was forced to create long lists of exceptions, including images of breast-feeding, “acts of protest,” and works of art. Once Facebook became a venue for political debate, the problem exploded. In April, in a call with investment analysts, Zuckerberg said glumly that it was proving “easier to build an A.I. system to detect a nipple than what is hate speech.”

    The cult of growth leads to the curse of bigness: every day, a billion things were being posted to Facebook. At any given moment, a Facebook “content moderator” was deciding whether a post in, say, Sri Lanka met the standard of hate speech or whether a dispute over Korean politics had crossed the line into bullying. Zuckerberg sought to avoid banning users, preferring to be a “platform for all ideas.” But he needed to prevent Facebook from becoming a swamp of hoaxes and abuse. His solution was to ban “hate speech” and impose lesser punishments for “misinformation,” a broad category that ranged from crude deceptions to simple mistakes. Facebook tried to develop rules about how the punishments would be applied, but each idiosyncratic scenario prompted more rules, and over time they became byzantine. According to Facebook training slides published by the Guardian last year, moderators were told that it was permissible to say “You are such a Jew” but not permissible to say “Irish are the best, but really French sucks,” because the latter was defining another people as “inferiors.” Users could not write “Migrants are scum,” because it is dehumanizing, but they could write “Keep the horny migrant teen-agers away from our daughters.” The distinctions were explained to trainees in arcane formulas such as “Not Protected + Quasi protected = not protected.”

    It will hardly be the last quandary of this sort. Facebook’s free-speech dilemmas have no simple answers—you don’t have to be a fan of Alex Jones to be unnerved by the company’s extraordinary power to silence a voice when it chooses, or, for that matter, to amplify others, to pull the levers of what we see, hear, and experience. Zuckerberg is hoping to erect a scalable system, an orderly decision tree that accounts for every eventuality and exception, but the boundaries of speech are a bedevilling problem that defies mechanistic fixes. The Supreme Court, defining obscenity, landed on “I know it when I see it.” For now, Facebook is making do with a Rube Goldberg machine of policies and improvisations, and opportunists are relishing it. Senator Ted Cruz, Republican of Texas, seized on the ban of Jones as a fascist assault on conservatives. In a moment that was rich even by Cruz’s standards, he quoted Martin Niemöller’s famous lines about the Holocaust, saying, “As the poem goes, you know, ‘First they came for Alex Jones.’ ”

    #Facebook #Histoire_numérique


  • The Publisher’s Patron: How Google’s News Initiative Is Re-Defining Journalism | European Journalism Observatory - EJO
    https://en.ejo.ch/digital-news/the-publishers-patron

    We found that a large part of Google’s money goes to the media establishment.

    Our data helps to shine a light on what ‘innovation’ means in the world of Google. Four in ten projects funded by the DNI deal with automation and data journalism. For example, Google gave 706,000 euros to a joint project between the Press Association, a UK news agency, and the media start-up Urbs Media to work on automation in local news.

    The funding Google gives to media institutions and publishers brings it considerable soft power. It also helps Google to safeguard its long-term interests. Increasingly, the company is shifting from being a mere search engine to becoming a central node for the production and distribution of news. Its role will soon be indispensable for the news industry.

    #DNI #google #presse #soft_power #critique


  • Former Google Scientist Tells Senate to Act Over Company’s “Unethical and Unaccountable” China Censorship Plan
    https://theintercept.com/2018/09/26/former-google-scientist-tells-senate-to-act-over-companys-unethical-an

    A scientist who quit Google over its plan to build a censored search engine in China has told U.S. senators that some company employees may have “actively subverted” an internal privacy review of the system. Jack Poulson resigned from Google in August after The Intercept reported that a group of the internet giant’s staffers was secretly working on a search engine for China that would remove content about subjects such as human rights, democracy, peaceful protest, and religion. “I view our (...)

    #Google #algorithme #Dragonfly #censure #filtrage #web #surveillance


  • Amazon’s New Microwave: ‘Alexa, Please Defrost My Chicken’
    https://www.wsj.com/articles/amazons-new-microwave-alexa-please-defrost-my-chicken-1537469765

    New offerings include an Alexa-enabled chip that manufacturers can install to control basic appliances. In a bid to control the smart home of the future, Amazon.com Inc. is offering makers of electronics a small chip that would let people use their voice to command everything from microwaves and coffee machines to room fans and guitar amplifiers. The online retail giant is hoping big manufacturers will sign up to incorporate the Alexa-enabled chips—which cost a few dollars each—in (...)

    #Alphabet #Apple #Google #Microsoft #Nest #AT&T #Amazon #algorithme #Alexa #domotique #Home #HomePod #biométrie (...)

    ##AT&T ##voix


  • Google admits it lets hundreds of other companies access your Gmail inbox

    https://www.telegraph.co.uk/technology/2018/09/20/google-admits-hundreds-companies-read-gmail-inbox

    Google is allowing hundreds of companies to scan people’s Gmail accounts, read their emails and even share their data with other firms, the company has confirmed.

    In a letter to US senators, Susan Molinari, Google’s vice president for public policy in the Americas, admitted that the company lets app developers access the inboxes of millions of users – even though Google itself stopped looking in 2017.

    In some cases human employees have manually read thousands of emails in order to help train AI systems which perform the same task.


  • Google erases ’Don’t be evil’ from code of conduct after 18 years | ZDNet
    https://www.zdnet.com/article/google-erases-dont-be-evil-from-code-of-conduct-after-18-years

    At some point in the past month, Google removed its famous ’Don’t be evil’ motto from the introduction to its code of conduct.

    As spotted by Gizmodo, the phrase was dropped from the preface of Google’s code of conduct in late April or early May.

    Until then, ’Don’t be evil’ were the first words of the opening and closing sentences of Google’s code of conduct and have been part of it since 2000.

    The phrase occasionally guides debate within the company. The 4,000 staff protesting Google’s work on the Pentagon’s AI Project Maven referred to the motto to highlight how the contract conflicted with the company’s values.

    Google’s parent company, Alphabet, also adopted and still retains a variant of the motto in the form of ’Do the right thing’.

    A copy of Google’s Code of Conduct page from April 21, preserved on the Wayback Machine, shows the old version.

    "’Don’t be evil.’ Googlers generally apply those words to how we serve our users. But ’Don’t be evil’ is much more than that. Yes, it’s about providing our users unbiased access to information, focusing on their needs and giving them the best products and services that we can. But it’s also about doing the right thing more generally — following the law, acting honorably, and treating co-workers with courtesy and respect.

    "The Google Code of Conduct is one of the ways we put ’Don’t be evil’ into practice. It’s built around the recognition that everything we do in connection with our work at Google will be, and should be, measured against the highest possible standards of ethical business conduct.

    “We set the bar that high for practical as well as aspirational reasons: Our commitment to the highest standards helps us hire great people, build great products, and attract loyal users. Trust and mutual respect among employees and users are the foundation of our success, and they are something we need to earn every day.”

    The whole first paragraph has been removed from the current Code of Conduct page, which now begins with:

    "The Google Code of Conduct is one of the ways we put Google’s values into practice. It’s built around the recognition that everything we do in connection with our work at Google will be, and should be, measured against the highest possible standards of ethical business conduct.

    “We set the bar that high for practical as well as aspirational reasons: Our commitment to the highest standards helps us hire great people, build great products, and attract loyal users. Respect for our users, for the opportunity, and for each other are foundational to our success, and are something we need to support every day.”

    While the phrase no longer leads Google’s code of conduct, one remnant remains at the end.

    “And remember... don’t be evil, and if you see something that you think isn’t right — speak up.”

    #Google #Histoire_numérique #Motto #Evil