Seenthis
#algorithmwatch

  • @etraces
    e-traces @etraces via RSS ART LIBRE 12/01/2021

    China’s social credit system was due by 2020 but is far from ready
    ▻https://algorithmwatch.org/en/story/chinas-social-credit-system-overdue

    Six years after the government announced plans for a national social credit score, Chinese citizens face dozens of systems that are largely incompatible with each other. The central government is planning an overhaul. Research and planning for a national credit score in China started in 1999, according to Lin Junyue, one of the most important minds behind the system. It began as a research project led by the World Economics and Politics Institute of the Chinese Academy of Social Sciences. (...)

    #Alibaba #AntFinancial #algorithme #Sésame #données #SocialCreditSystem #finance #notation #surveillance #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2021/01/China_Social_Credit_System-01.png


  • @etraces
    e-traces @etraces via RSS ART LIBRE 20/12/2020

    New report highlights the risks of AI on fundamental rights
    ▻https://algorithmwatch.org/en/story/ai-fundamental-rights

    The European watchdog for fundamental rights published a report on Artificial Intelligence. AlgorithmWatch welcomes some of the recommendations, and encourages a bolder approach. The European Union Agency for Fundamental Rights (FRA), which supports European institutions and members states on related issues, released a report on Artificial Intelligence on 14 December. With this report, FRA offers good advice to European institutions and member states. In slightly over 100 pages, the agency (...)

    #Amazon #algorithme #racisme #religion #sexisme #General_Data_Protection_Regulation_(GDPR) #biais #discrimination #santé #surveillance #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2020/12/koshu-kunii-_L630O5hEp4-unsplash2-scaled.jpg


  • @etraces
    e-traces @etraces via RSS ART LIBRE 15/12/2020

    Despite transparency, the Nutri-Score algorithm faces strong resistance
    ▻https://algorithmwatch.org/en/story/nutriscore

    The Nutri-Score summarizes basic nutritional information on a 5-letter scale. Despite its many qualities, it faces a strong backlash that could hold a lesson for operators of automated systems. The German government, which currently holds the rotating EU presidency, will push for EU-wide guidelines on nutrition labeling at Tuesday’s Council of agriculture ministers, which it chairs. In particular, the German minister should advocate the use of Nutri-Score, an algorithm that informs (...)
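
    The Nutri-Score condenses nutrient data into a single letter. A minimal sketch of the idea in Python, assuming simplified inputs and the letter thresholds for general foods; the official algorithm uses detailed per-nutrient point tables and special rules for cheeses, fats and drinks:

```python
def nutri_score_letter(points_negative: int, points_positive: int) -> str:
    """Simplified Nutri-Score: points for unfavorable nutrients (energy,
    sugars, saturated fat, sodium) minus points for favorable ones
    (fruit/vegetables, fibre, protein), mapped onto the A-E scale."""
    score = points_negative - points_positive
    if score <= -1:
        return "A"
    if score <= 2:
        return "B"
    if score <= 10:
        return "C"
    if score <= 18:
        return "D"
    return "E"
```

    The transparency discussed in the article comes from exactly this property: anyone can recompute the letter from the label's nutrition table.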

    #santé #manipulation #algorithme #AlgorithmWatch #Nutri-Score

    https://algorithmwatch.org/wp-content/uploads/2020/12/nutriscore_socialcard.jpg


  • @etraces
    e-traces @etraces via RSS ART LIBRE 6/12/2020

    Health algorithms discriminate against Black patients, also in Switzerland
    ▻https://algorithmwatch.ch/en/racial-health-bias

    Algorithms used to assess kidney function or predict heart failure use race as a central criterion. There is no scientific basis to do so, and the results discriminate against Blacks. Many medical algorithms require the race of the patient to be included in the calculation. The American Heart Association, for instance, recommends using an algorithm to calculate the risk of heart failure. People who are categorized as “non-black” automatically receive three additional points, out of 100 (...)
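
    The three-point adjustment mentioned above can be made concrete with a toy scoring function. This is an illustration of the pattern only, not the American Heart Association's actual model; `clinical_points` is a stand-in for the score's other criteria:

```python
def heart_failure_risk_points(clinical_points: int, race: str) -> int:
    """Toy sketch of a race-adjusted clinical score: otherwise-identical
    patients get different risk estimates purely because of the race
    field, since "non-black" patients receive a flat three extra points."""
    points = clinical_points
    if race != "black":
        points += 3  # the race-based adjustment criticized in the article
    return points
```
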

    #algorithme #racisme #discrimination #santé #AlgorithmWatch

    https://algorithmwatch.ch/en/wp-content/uploads/2020/12/medicine-racial-bias-socialcard.jpg

    • @nestor
      nestor @nestor 6/12/2020

      Maybe, but unfortunately the “no scientific basis to do so” could be anti-white discrimination, invisible and undercover as usual.


  • @etraces
    e-traces @etraces via RSS ART LIBRE 29/11/2020

    The automated society at the risk of opacity
    ▻http://www.internetactu.net/a-lire-ailleurs/la-societe-automatisee-au-risque-de-lopacite

    The European advocacy association AlgorithmWatch (@algorithmwatch) has just published its annual report on the automated society. Its verdict is harsh: automated decision-making systems are spreading amid near-total opacity. In his introduction, Fabio Chiusi (@fabiochiusi) returns to the fiasco of the automated grading of the 2020 British A-level results… He recalls that the government's choice of an automated system for producing results was motivated by (...)

    #algorithme #technologisme #consentement #prédiction #discrimination #enseignement #notation #santé #AlgorithmWatch


  • @etraces
    e-traces @etraces via RSS ART LIBRE 26/11/2020

    Dutch city uses algorithm to assess home value, but has no idea how it works
    ▻https://algorithmwatch.org/en/story/woz-castricum-gdpr-art-22

    In a seemingly routine case at the Amsterdam court of appeal, a judge ruled that it was acceptable for a municipality to use a black-box algorithm, as long as the results were unsurprising. In 2016, the municipality of Castricum, a seaside town of 35,000 in Holland, set the home value of an unnamed claimant at 320,000€ (in the Netherlands, property tax is paid based on a house’s estimated resale value). Way too high, said the claimant, who promptly went to court. The claimant argued that (...)

    #algorithme #consentement #législation #urbanisme #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2020/11/wynand-van-poortvliet-_jmagbo2dnk-unsplash-scaled.jpg


  • @etraces
    e-traces @etraces via RSS ART LIBRE 23/11/2020

    French tax authority pushes for automated controls despite mixed results
    ▻https://algorithmwatch.org/en/story/france-tax-automated-dgfip

    Since 2014, a team of data scientists has supported local tax offices in identifying complex fraud. But the motive could be baser: to make tax collectors redundant. 785 million euros. This is the amount tax collectors recovered in 2019 thanks to “data mining”, according to a statement by the French government in February 2020. This was made possible by a team of thirty data scientists. The group was set up in 2014 by the French tax authority (DGFiP) to develop machine learning (...)

    #algorithme #technologisme #données #fiscalité #fraude #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2020/11/2571619476_ec35d7bbeb_k.jpg


  • @etraces
    e-traces @etraces via RSS ART LIBRE 22/11/2020

    New Swiss algorithm to desegregate schools, one block at a time
    ▻https://algorithmwatch.org/en/story/zurich-schools-algorithm

    Two researchers from Zurich University created an algorithm that helps desegregate schools by slightly changing the boundaries of each school’s catchment area. Ever since authorities decided to open public schools to children of all backgrounds in the course of the 20th century, elite families regrouped in select schools. The resulting segregation has been shown in multiple studies to decrease the performance of pupils across the board. In schools where underprivileged pupils are (...)

    #algorithme #cartographie #discrimination #enfants #enseignement #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2019/11/andrew-ebrahim-zRwXf6PizEo-unsplash_small.jpg


  • @etraces
    e-traces @etraces ART LIBRE 13/11/2020

    Men drive trucks, women raise children – discriminatory gender targeting by Facebook
    ▻https://algorithmwatch.org/diskriminierendes-gendertargeting-durch-facebook

    Facebook delivers job ads to the users of its platforms in a discriminatory way. Research by AlgorithmWatch shows that Facebook appears to follow crude gender stereotypes when determining the target audience of an ad, even when the advertisers word the listings in a gender-neutral way. AlgorithmWatch's findings make clear that classic anti-discrimination law, in particular the General Equal Treatment Act (Allgemeines Gleichbehandlungsgesetz) (...)

    #Facebook #algorithme #sexisme #biais #discrimination #travail #recrutement #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2020/11/social-card_re_gendertargeting-facebook.png


  • @etraces
    e-traces @etraces ART LIBRE 13/11/2020

    Spanish police plan to extend use of its lie-detector while efficacy is unclear
    ▻https://algorithmwatch.org/en/story/spain-police-veripol

    Veripol is software that assesses the veracity of complaints filed with the Spanish national police. It was introduced in 2018, but it is unclear whether it works as intended. Two years ago, the Spanish national police introduced a tool named Veripol in police stations to help detect false complaints, such as a person declaring a robbery that never happened. It is the first time such a tool has been used in Spain, and probably worldwide. Veripol is a computer program that scans complaints for (...)

    #algorithme #biométrie #manipulation #police #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2020/10/lie-detector-story-social-card.jpg


  • @etraces
    e-traces @etraces ART LIBRE 30/10/2020

    Belgium - Automating Society Report 2020
    ▻https://automatingsociety.algorithmwatch.org/report2020/belgium

    Contextualization: As a result of the different governments, and the different levels of government, in Belgium (Federal and Regional), several different strategies dealing with digitization emerged in 2018. In Flanders, this strategy is called Vlaanderen Radicaal Digitaal, while in the Walloon region, it is known as Plan Numerique. In 2015, the federal government launched a strategy called Digital Belgium, but this covered more than just ADM. In 2018, the Flemish government launched an (...)

    #Accenture #Briefcam #algorithme #biométrie #éthique #facial #prédiction #reconnaissance #vidéo-surveillance #General_Data_Protection_Regulation_(GDPR) #comportement #discrimination #enseignement #pauvreté #santé #sport #surveillance #APD-Belgique #AlgorithmWatch

    https://automatingsociety.algorithmwatch.org/wp-content/uploads/2020/10/AS2-social-media-card-v1.png


  • @etraces
    e-traces @etraces ART LIBRE 30/10/2020

    In Flanders, an algorithm attempts to make school choice fairer
    ▻https://automatingsociety.algorithmwatch.org/report2020/belgium/belgium-story

    In Belgium, some schools don’t have enough capacity for all students who want to go there. In the Flemish part of the country, the government introduced an algorithm to assign places in schools, in the hope of giving every student the same chances. Belgium has a long tradition of free choice of school, but in recent years this has met its limits. Some schools are more popular than others. Because class sizes are limited, this is a problem if too many parents want to enroll their children in (...)

    #algorithme #discrimination #enseignement #pauvreté #AlgorithmWatch

    https://automatingsociety.algorithmwatch.org/wp-content/uploads/2020/10/AS2-social-media-card-v1.png


  • @etraces
    e-traces @etraces ART LIBRE 30/10/2020

    Automating Society Report 2020
    ▻https://automatingsociety.algorithmwatch.org

    Life in the automated society: How automated decision-making systems became mainstream, and what to do about it. On a cloudy August day in London, students were angry. They flocked to Parliament Square by the hundreds, in protest – their placards emblazoned with support for unusual allies: their teachers, and an even more unusual target: an algorithm. Due to the COVID-19 pandemic, schools closed in March in the United Kingdom. With the virus still raging throughout Europe over the summer (...)

    #algorithme #biométrie #technologisme #facial #prédiction #reconnaissance #discrimination #enseignement #pauvreté #santé #AlgorithmWatch

    https://automatingsociety.algorithmwatch.org/wp-content/uploads/2020/10/AS2-social-media-card-v1.png


  • @etraces
    e-traces @etraces ART LIBRE 25/10/2020

    Spam filters are efficient and uncontroversial. Until you look at them.
    ▻https://algorithmwatch.org/en/story/spam-filters-outlook-spamassassin

    An experiment reveals that Microsoft Outlook marks messages as spam on the basis of a single word, such as “Nigeria”. Spam filters are largely unaudited and could discriminate unfairly. In an experiment, AlgorithmWatch sent a few hundred emails to 10 email inboxes at Gmail, Yahoo, Outlook, GMX and LaPoste (the last two are used by millions of Germans and French, respectively). All accounts were created specifically for the experiment. The results, which are available online, show that (...)
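
    Deciding on the strength of a single token is what the crudest keyword filters do. A toy sketch of that failure mode (the word list is hypothetical; the rules of Outlook's real classifier are unpublished):

```python
# Hypothetical blocklist: real filters' rules are not public.
SUSPECT_WORDS = {"nigeria", "lottery", "inheritance"}

def naive_spam_filter(message: str) -> bool:
    """Flag a message as spam if any single suspect word occurs in it,
    regardless of context: the failure mode the experiment exposed."""
    tokens = {word.strip(".,!?").lower() for word in message.split()}
    return bool(tokens & SUSPECT_WORDS)
```

    Under such a rule, an innocuous sentence mentioning Nigeria is classified exactly like a scam mail, which is the discrimination the article describes.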

    #Altaba/Yahoo! #Microsoft #Gmail #outlook.com #censure #discrimination #biais #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2020/10/the-blowup-xBqM6cfgP4U-unsplash.png


  • @etraces
    e-traces @etraces ART LIBRE 22/09/2020

    In Italy, an appetite for face recognition in football stadiums
    ▻https://algorithmwatch.org/en/story/italy-stadium-face-recognition

    Right before the pandemic, the government and top sports authorities were planning a massive deployment of face recognition and sound surveillance technologies in all Italian football stadiums. The reason? To help fight racism. At the beginning of 2020, just as the whole world was grappling with increasing evidence of the discriminatory and racist outcomes of face recognition technologies, Italy mulled its widespread adoption in football stadiums as an essential tool in the fight against (...)

    #algorithme #capteur #CCTV #biométrie #racisme #facial #reconnaissance #son #comportement #COVID-19 #écoutes #santé #sport #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2020/09/elimirana-Pf0YT8q0Da0-unsplash.jpg


  • @etraces
    e-traces @etraces ART LIBRE 21/09/2020

    Suzhou introduced a new social scoring system, but it was too Orwellian, even for China
    ▻https://algorithmwatch.org/en/story/suzhou-china-social-score

    A city of 10 million in eastern China upgraded its Covid-tracking app to introduce a new “civility” score. It had to backtrack after a public outcry. Suzhou is a city with a population of 10 million, located 100 km west of Shanghai. It is well known for its classic Chinese gardens and, since last week, for one of the most Orwellian social scoring experiments to date. The municipal government launched a pilot for a new social behavior scoring system on 3 September 2020, also referred to as the (...)

    #WeChat #Weibo #algorithme #Alipay #AlipayHealthCode #consentement #COVID-19 #notation #santé #SocialCreditSystem #surveillance #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2020/09/49845463732_06798bdf34_k.jpg


  • @etraces
    e-traces @etraces ART LIBRE 20/09/2020

    Female historians and male nurses do not exist, Google Translate tells its European users
    ▻https://algorithmwatch.org/en/story/google-translate-gender-bias

    An experiment shows that Google Translate systematically changes the gender of translations when they do not fit with stereotypes. It is all because of English, Google says. If you were to read a story about male and female historians translated by Google, you might be forgiven for overlooking the females in the group. The phrase “vier Historikerinnen und Historiker” (four male and female historians) is rendered as “cuatro historiadores” (four male historians) in Spanish, with similar results (...)
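
    The experiment's logic amounts to a property check: translate a phrase that names both genders, then test whether the feminine form survives. In this sketch `translate` is a stub that merely replays the output reported above, not a call to the real service:

```python
def translate(text: str, src: str, dst: str) -> str:
    """Stub standing in for a real translation service: it replays the
    output the article reports for the tested phrase."""
    reported = {
        ("vier Historikerinnen und Historiker", "de", "es"): "cuatro historiadores",
    }
    return reported[(text, src, dst)]

def keeps_feminine_form(translated: str) -> bool:
    """Did the explicitly feminine noun survive translation?
    The Spanish feminine plural would be 'historiadoras'."""
    return "historiadoras" in translated
```

    Running the check on the reported output shows the feminine historians silently disappearing, which is the systematic bias the experiment measured at scale.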

    #Google #GoogleTranslate #algorithme #sexisme #discrimination #femmes #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2020/09/GoogleTranslate.png


  • @etraces
    e-traces @etraces ART LIBRE 27/08/2020

    Pre-crime at the tax office: How Poland automated the fight against VAT fraud.
    ▻https://algorithmwatch.org/en/story/poland-stir-vat-fraud

    In their fight against fraud, Polish tax authorities use STIR, an algorithm sifting through the data of millions of entrepreneurs. The government claims success, but dozens of companies have been hit, some say wrongly. “We have broken the group of VAT fraudsters”, “We have detected the artificial vegetable oil trading being a carousel fraud”, “National Revenue Administration liquidated the Asian mafia thanks to STIR”. These are just a few headlines from the last few months. They showcase the (...)

    #algorithme #fiscalité #fraude #surveillance #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2020/08/kamil-gliwinski-xcPw1-5OHTk-unsplash.jpg


  • @etraces
    e-traces @etraces ART LIBRE 16/08/2020

    Under the Twitter streetlight : How data scarcity distorts research
    https://algorithmwatch.org/en/story/data-access-researchers-left-on-read

    As part of our #LeftOnRead campaign, several researchers testified to the reluctance of online platforms to provide useful data. Many resort to studying Twitter, which is more accommodating than most. Tiziano Bonini, an associate professor at the University of Siena, began an ethnographic investigation of online music platforms in late 2017, together with his colleague Alessandro Gandini of the University of Milan. As the discovery and consumption of cultural artifacts, including songs, (...)

    #TikTok #Facebook #Spotify #Snapchat #Instagram #Twitter #YouTube #algorithme #manipulation #données #biais #enseignement #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2020/08/thomas-stephan-wBCqpK5extc-unsplash.jpg


  • @etraces
    e-traces @etraces ART LIBRE 14/08/2020

    Spain’s largest bus terminal deployed live face recognition four years ago, but few noticed
    ▻https://algorithmwatch.org/en/story/spain-mendez-alvaro-face-recognition

    Madrid South Station’s face recognition system automatically matches every visitor’s face against a database of suspects, and shares information with the Spanish police. Around 20 million travellers transited last year through Madrid’s South bus terminal, known as Méndez Álvaro Station to locals. Those 20 million persons had their face scanned as they entered the station. They were tracked as they walked to the bays where their bus was parked, before leaving the Spanish capital. Unless the (...)

    #algorithme #CCTV #biométrie #police #consentement #criminalité #facial #reconnaissance #vidéo-surveillance #General_Data_Protection_Regulation_(GDPR) #surveillance #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2020/08/Estación_de_Méndez_Álvaro_Madrid_03-1.jpg


  • @etraces
    e-traces @etraces ART LIBRE 14/08/2020

    Downgraded A-level students urged to join possible legal action
    ▻https://www.theguardian.com/education/2020/aug/13/downgraded-a-level-students-urged-to-join-possible-legal-action

    Legal letter sent to Ofqual and DfE calls for changes to ‘unfair’ grading algorithm. Students affected by the mass downgrading of A-level results in England have been urged to join a possible legal action against the Department for Education and the exams regulator. Nearly 40% of A-level assessments by teachers were downgraded by the Office of Qualifications and Examinations Regulation’s algorithm, according to official figures published on Thursday morning. The method for allocating results (...)

    #algorithme #biais #discrimination #enseignement #AlgorithmWatch

    ▻https://i.guim.co.uk/img/media/0f6553ef78dc76f0dc5f17731c1001de2785f5c6/0_283_6962_4179/master/6962.jpg


  • @etraces
    e-traces @etraces ART LIBRE 14/08/2020

    This man had his credit score changed from C to A+ after a few emails
    ▻https://algorithmwatch.org/en/story/credit-score-crif-buergel

    A 52-year-old man in Hanover, Germany, discovered that he’d been erroneously scored by a credit bureau. His story reveals the gaps in credit score regulation. In October 2019, Mark Wetzler searched for a new electricity provider on a comparison website. He found one, asked for a contract and didn’t think more about it. Two days later, he received a letter from the utility company telling him that his contract was denied. His credit score was too low. Mr Wetzler was taken aback by this answer. (...)

    #algorithme #banque #AlgorithmWatch #bug #notation

    https://algorithmwatch.org/wp-content/uploads/2020/05/CRIF-banner.png


  • @etraces
    e-traces @etraces ART LIBRE 5/08/2020

    Slovenian police acquires automated tools first, legalizes them later
    ▻https://algorithmwatch.org/en/story/slovenia-police-face-recognition

    The Slovenian police legalized its use of face recognition five years after it started to use it. Despite formal safeguards, no institution can restrain the Interior ministry. When Slovenian journalists or activists ask officials whether the police, the secret service, and the army are using any technological tools for mass surveillance, they are often reluctant to admit even the existence of such devices. However, when they meet their colleagues at international security conferences or (...)

    #algorithme #CCTV #IMSI-catchers #biométrie #facial #législation #reconnaissance #vidéo-surveillance #écoutes #surveillance #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2020/06/5756105879_80aeddfddf_k.jpg


  • @etraces
    e-traces @etraces ART LIBRE 5/08/2020

    Tax authority's working methods unlawful and discriminatory
    ▻https://autoriteitpersoonsgegevens.nl/nl/nieuws/werkwijze-belastingdienst-strijd-met-de-wet-en-discrimine

    The Benefits office (Afdeling Toeslagen) of the Dutch tax authority should never have processed the (dual) nationality of childcare-benefit applicants the way it did for years. This processing was unlawful, discriminatory and therefore improper: serious violations of the privacy law, the General Data Protection Regulation (AVG). This follows from an investigation by the Dutch Data Protection Authority (Autoriteit Persoonsgegevens, AP). Aleid Wolfsen, chair of the AP, today handed the investigation to State Secretary (...)

    #algorithme #migration #racisme #données #discrimination #AlgorithmWatch



  • @etraces
    e-traces @etraces ART LIBRE 5/08/2020

    Swiss police automated crime predictions but has little to show for it
    ▻https://algorithmwatch.org/en/story/swiss-predictive-policing

    A review of three automated systems in use by the Swiss police and judiciary reveals serious issues. Real-world effects are impossible to assess due to a lack of transparency. The Swiss police and justice authorities use, by one count, over 20 different automated systems to estimate or predict inappropriate behavior. Police and justice are largely regional competencies in Switzerland; each Canton might have its own systems in place. Based on a series of reports by the Swiss public service (...)

    #algorithme #criminalité #prédiction #discrimination #prison #AlgorithmWatch

    https://algorithmwatch.org/wp-content/uploads/2020/07/bckgd-AS2CH.jpg



Related themes

  • #algorithmwatch
  • #algorithme
  • #discrimination
  • #surveillance
  • #sexisme
  • #santé
  • #racisme
  • #facial
  • #biométrie
  • #google
  • #reconnaissance
  • #facebook
  • #technologisme
  • #enseignement
  • #biais
  • #prédiction
  • #manipulation
  • #cctv
  • #éthique
  • #instagram
  • #pauvreté
  • #consentement
  • #notation
  • #vidéo-surveillance
  • #données
  • #bug
  • #fraude
  • #bigdata
  • #sport
  • #covid-19
  • #police
  • #comportement
  • #criminalité
  • #youtube
  • #beauté
  • #googletranslate
  • #modération
  • #europeandatajournalismnetwork
  • #lgbt
  • #écoutes