AI ethics - why teaching ethics and “ethics training” is problemati...
▻https://diasp.eu/p/11663295
AI ethics - why teaching ethics and “ethics training” is problematic | #AI #artificialintelligence #bias #ethics #teaching
To Surveil and Predict: A Human Rights Analysis of Algorithmic Poli...
▻https://diasp.eu/p/11585135
To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada | #algorithmic #bias #canada #crime #discrimination #humanrights #justice #policy #predictive #recommendations #surveillance
Unraveling the Mindset of Victimhood - A must read. (▻https://www.sc...
▻https://diasp.eu/p/11495110
Unraveling the Mindset of Victimhood - A must read. | #anthropology #anxiety #attachment #attribution #behavior #beliefs #bestof #bias #conflicts #control #culture #egoism #elitism #empathy #forgiveness #fulfillment #growth #happiness #identity #insecurity #interpretation #learning #locus #memory #mindset #paranoia #powerdynamics #psychology #recognition #relationships #rumination #sociology #victimhood
Why Are We So Quick To Scrutinise How Low-Income Families Spend The...
▻https://diasp.eu/p/11236908
Why Are We So Quick To Scrutinise How Low-Income Families Spend Their Money? | #bias #comparison #consumption #inequality #money
La politique des putes
With "La #Politique_des_putes", Océan conducts an immersive investigation in which he hands the microphone to sex workers. They speak of stigma, marginalisation, precarity and systemic violence, but also of resources and empowerment. For them, the intimate is resistance. Ten 30-minute episodes to break down prejudice.
►http://www.nouvellesecoutes.fr/podcasts/intime-politique
#sex_work #prostitution #patriarchy #capitalism #feminism #wage_labor #whorephobe #whorephobia #pimping #stigma #bias #prejudice #stigmatization #discrimination #systemic_violence #institutional_violence #heterosexual_concept #sexual_education #normalisation #abolitionism #black_and_white #subventions #decriminalisation #penalty #laws #rights #transphobia #domination #marginalisation #vulnerability #invisibility #undocumented #isolation #fear #police_harassment #physical_violence #rape #precarity #affirmation #empowerment #dignity #trust #solidarity #network #community #choice #perception #society #associations #seropositive #access_to_healthcare #suicidal_thought #debt #menace #voodoo #exploitation #trafficking #migration #borders #family_pressure #clients_image #murdered #testimony #interview #podcast #audio #France #Paris
“sex work is not dirty - dirty are all the representations about sex work” (La politique des putes 7/10, min 7).
on the (pathetic?) handling of the pandemic, from a colleague in Nancy
Is “#stay_at_home” the correct #social_distancing measure to fight the covid-19 pandemic?
▻https://medium.com/@antonello.lobianco/is-stay-at-home-the-correct-social-distancing-measure-to-fight-the-covid-19-
“The message “stay at home’’ is undoubtedly simpler than “don’t stay without means of respiratory protection in closed places in the presence of other people and avoid physical contacts’’ but this simplicity comes at the cost of a #bias.
In other words, looking at everyone on the road as a possible infector, is only a desperate response from a #frightened and #confused society that finds itself sick but does not want to discover itself technologically overtaken.”
A tug-of-war over biased AI - Axios
▻https://www.axios.com/ai-bias-c7bf3397-a870-4152-9395-83b6bf1e6a67.html
The idea that AI can replicate or amplify human prejudice, once argued mostly at the field’s fringes, has been thoroughly absorbed into its mainstream: Every major tech company now makes the necessary noise about “AI ethics.”
Yes, but: A critical split divides AI reformers. On one side are the bias-fixers, who believe the systems can be purged of prejudice with a bit more math. (Big Tech is largely in this camp.) On the other side are the bias-blockers, who argue that AI has no place at all in some high-stakes decisions.
Why it matters: This debate will define the future of the controversial AI systems that help determine people’s fates through hiring, underwriting, policing and bail-setting.
What’s happening: Despite the rise of the #bias-blockers in 2019, the #bias-fixers remain the orthodoxy.
#IA #biais #technologie #décisions
Why Feedback Rarely Does What It’s Meant To (▻https://hbr.org/2019/0...
▻https://diasp.eu/p/9998277
Why Feedback Rarely Does What It’s Meant To | #bestof #bias #cognition #errors #excellence #failures #fallacy #feedback #judgment #leadership #learning #management #psychometrics #radicalcandor #selfevaluation #truth
#Algorithms, a danger to the #health of Black Americans
Even the decision-making #software used by hospitals in the United States is racist. Can these #biases be corrected, asks the scientific journal Nature.
▻https://www.courrierinternational.com/article/technologies-les-algorithmes-un-danger-pour-la-sante-des-amer
#santé #racisme #Noirs #Afro-américains #USA #Etats-Unis
Millions of black people affected by racial #bias in health-care algorithms
Study reveals rampant racism in decision-making software used by US hospitals — and highlights ways to correct it.
Ghani says that his team has carried out unpublished analyses comparing algorithms used in public health, criminal justice and education to human decision making. They found that the machine-learning systems were biased — but less so than the people.
“We are still using these algorithms called humans that are really biased,” says Ghani. “We’ve tested them and known that they’re horrible, but we still use them to make really important decisions every day.”
And what about the Chicanos, the Native Americans, the prisoners?
Don't they have problems?
Oro e acqua minerale (Gold and Mineral Water)
There is a small region in Australia, in the central part of the state of Victoria, that is very Ticinese. There is a town called Hepburn where the inhabitants' surnames are Rodoni, Vanzetta, Scheggia, Vanina, Tinetti, Righetti, Crippa, Perini, Respini... They don't speak Italian, they don't speak Ticinese dialect, and they are thoroughly Australian.
Their ancestors emigrated to Australia from Ticino around 1850, when the gold rush broke out. They worked as gold prospectors, but little by little they settled in the region and built a very close-knit community, which soon became their new homeland.
▻https://www.rsi.ch/la1/programmi/cultura/storie/Oro-e-acqua-minerale-10879319.html
#film #documentaire #émigration #Tessin #Australie #histoire #Suisse #Biasca #sureau #eau_minérale #Hepburn #or #ruée_vers_l'or #extractivisme #colonisation #châtaigniers #mines
#Welcome_Stranger, a gold #nugget:
---
A few comments:
"Australia does not have a long history, only a couple of centuries"
--> says a Melbourne resident who bought a house in Hepburn built by a Scheggia, a Ticinese emigrant, around 1815...
And the presenter, commenting after the documentary:
"The history of Australia is very, very young, as the documentary said: two centuries of history, or little more"
... as if #indigenous_peoples did not exist before the Europeans arrived, as if history had only been written since their arrival... There is a serious need to decolonise history...
Introspection illusion (▻https://en.wikipedia.org/wiki/Introspection...
▻https://diasp.eu/p/9148347
Introspection illusion | #bias #cognitive #illusion #introspection #mental #state
Matthew P. to Paul G.: “It’s hard to take risks if you don’t have a...
▻https://diasp.eu/p/9079253
Matthew P. to Paul G.: “It’s hard to take risks if you don’t have a safety net.” | #bias #culture #entrepreneurship #risks #safety #SiliconValley #startups #survivorship
‘If you’re struggling to survive day-to-day’: Class optimism and co...
▻https://diasp.eu/p/9079254
‘If you’re struggling to survive day-to-day’: Class optimism and contradiction in entrepreneurial discourse | #bias #contradiction #culture #entrepreneurship #research #SiliconValley #startups
You can’t characterize human nature if studies overlook 85 percent ...
▻https://diasp.eu/p/8487211
You can’t characterize human nature if studies overlook 85 percent of people on Earth | #bias #psychology #research #WEIRD
Bias in #ai: a letter to my daughter
▻https://hackernoon.com/my-dearest-eliisabet-de724a86d105?source=rss----3a8144eabfe3---4
My dearest Eliisabet, only 5 years old, but you will grow up to become a strong, beautiful & independent woman, of that I'm sure. You are the daughter of a man who wasn't always an example to others. It took some time for me to grow up & take responsibility for my actions; being your father has certainly accelerated this process. My generation seems to be struggling with growing up as well. Especially in the relatively young tech community, and more specifically the artificial intelligence community, people are not always taking their responsibilities seriously. Equality is hard work for my non-white & non-male colleagues in both academia and industry. The current system is biased in many subtle but also in some very obvious ways. Although the tech community has a huge impact on (...)
Tackle #bias and Other Problems/Solutions in Machine Learning Models
▻https://hackernoon.com/tackle-bias-and-other-problems-solutions-in-machine-learning-models-f427
Predictive analytics models rely heavily on regression, classification and clustering methods. When analysing the effectiveness of a predictive model, the closer the predictions are to the actual data, the better. This article aims to be a one-stop reference to the major problems and their most popular/effective solutions, without diving into execution details. [Figures: a linear regression plot; a clustering-algorithm plot] Primarily, data selection and pruning happen during the data-preparation phase, where you take care to get rid of bad data in the first place. Beyond that, there are issues with the data and its relevance to the ML model's objectives during training, trouble with the use of #algorithms, and errors in the data that occur throughout. Effectively, the model is tested for (...)
#machine-learning #machine-learning-models #predictive-analytics
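The article stays at a high level; as one concrete illustration of checking a trained model for bias (my example, not from the article — the function name and data below are hypothetical), you can compare misclassification rates across subgroups:

```python
def error_rates_by_group(y_true, y_pred, groups):
    """Per-group misclassification rate: a first-pass bias check.

    A large gap between groups suggests the model performs unevenly
    even when overall accuracy looks acceptable.
    """
    rates = {}
    for g in set(groups):
        idx = [i for i, grp in enumerate(groups) if grp == g]
        errors = sum(y_true[i] != y_pred[i] for i in idx)
        rates[g] = errors / len(idx)
    return rates

# Hypothetical labels, predictions, and group memberships
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 0, 1, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]

print(error_rates_by_group(y_true, y_pred, groups))
# group "b" is misclassified twice as often as group "a"
```

A gap like this is only a symptom; whether it comes from the data, the features or the objective still has to be diagnosed, which is what the article's catalogue of problems is about.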
Belief in FakeNews is Associated with Delusionality, Dogmatism, Rel...
▻https://diasp.eu/p/8000419
Belief in FakeNews is Associated with Delusionality, Dogmatism, Religious Fundamentalism, and Reduced Analytic Thinking | #delusionality #bias #analytic #analyticthinking #belief #fundamentalism #fakenews #thinking #cognition #dogmatism
Don’t Want to Fall for Fake News? Don’t Be Lazy (▻https://www.wired....
▻https://diasp.eu/p/8000416
Don’t Want to Fall for Fake News? Don’t Be Lazy | #bias #analyticthinking #reflexivity #lazy #CRT #cognition #fakenews
Extending the seductive allure of neuroscience explanations effect ...
▻https://diasp.eu/p/7759430
Extending the seductive allure of neuroscience explanations effect to popular articles about educational topics - Im - 2017 - British Journal of Educational Psychology - Wiley Online Library | #psychology #persuasion #influence #bias #neuroscience #education #explanations #seduction
The Akrasia Effect: Why We Don’t Follow Through on Things (https://...
▻https://diasp.eu/p/7545431
The Akrasia Effect: Why We Don’t Follow Through on Things | #akrasia #desire #selfcontrol #time #bias #gratification #intentions #procrastination #friction #strategy #cognition #addiction #inconsistancy #engagement
links related to the post:
The Akrasia Effect: Why we...
▻https://jamesclear.com/akrasia
........
▻https://de.wikipedia.org/wiki/Akrasia
........
▻https://plato.stanford.edu/search/searcher.py?query=Akrasia
▻https://plato.stanford.edu/entries/weakness-will
▻https://plato.stanford.edu/entries/aristotle-ethics/#Akra
▻https://plato.stanford.edu/entries/epistemic-self-doubt
Quantifying Biases in Online Information Exposure | Center for Complex Networks and Systems Research, Indiana University
▻https://arxiv.org/abs/1807.06958
▻https://arxiv.org/pdf/1807.06958.pdf
Our consumption of online #information is mediated by filtering, ranking, and recommendation algorithms that introduce unintentional biases as they attempt to deliver relevant and engaging content. It has been suggested that our reliance on online technologies such as search engines and social media may limit exposure to diverse points of view and make us vulnerable to manipulation by disinformation. In this paper, we mine a massive dataset of Web traffic to quantify two kinds of bias: (i) homogeneity bias, which is the tendency to consume content from a narrow set of information sources, and (ii) popularity bias, which is the selective exposure to content from top sites. Our analysis reveals different bias levels across several widely used Web platforms. Search exposes users to a diverse set of sources, while social media traffic tends to exhibit high popularity and homogeneity #bias. When we focus our analysis on traffic to news sites, we find higher levels of popularity bias, with smaller differences across applications. Overall, our results quantify the extent to which our choices of online systems confine us inside “social bubbles.”
#personnalisation #médias_sociaux #algorithme via @pomeranian99
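The two biases defined in the abstract can be illustrated with a minimal sketch (made-up click streams; the metrics here are simple stand-ins and the paper's own estimators differ in detail): homogeneity bias as concentration of a user's traffic over sources, popularity bias as the share of traffic going to a fixed set of top sites.

```python
from collections import Counter
from math import log2

def homogeneity_bias(visits):
    """1 minus the normalized Shannon entropy of a user's source mix:
    0 = traffic spread evenly over sources, 1 = a single source."""
    counts = Counter(visits)
    if len(counts) < 2:
        return 1.0
    n = len(visits)
    entropy = -sum((c / n) * log2(c / n) for c in counts.values())
    return 1.0 - entropy / log2(len(counts))

def popularity_bias(visits, top_sites):
    """Share of a user's traffic going to a fixed set of top sites."""
    return sum(1 for v in visits if v in top_sites) / len(visits)

# Made-up click streams for one "social" and one "search" user
social = ["fb.example", "fb.example", "fb.example", "viral.example", "fb.example"]
search = ["a.org", "b.net", "c.edu", "d.io", "e.gov"]
top = {"fb.example", "viral.example"}

print(homogeneity_bias(social), popularity_bias(social, top))  # higher on both
print(homogeneity_bias(search), popularity_bias(search, top))  # lower on both
```

On these toy streams the social user scores higher on both measures, matching the paper's qualitative finding that social media traffic shows more popularity and homogeneity bias than search.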
Flow chart of cognitive biases (►https://news.ycombinator.com/item?i...
▻https://diasp.eu/p/7043595
Flow chart of cognitive biases | #bestof #cognitive #biases #cognition
The Trouble with Bias - Kate Crawford - NIPS 2017 (50mn)
▻https://www.youtube.com/watch?v=6Uao14eIyGc&t=3m46s
Great talk about #bias, #fairness and #responsibility in #machineLearning
(and thanks to whoever put those NIPS talks somewhere else than Facebook)
U.S. Black Mothers Die In Childbirth At Three Times The Rate Of White Mothers : NPR
▻https://www.npr.org/2017/12/07/568948782/black-mothers-keep-dying-after-giving-birth-shalon-irvings-story-explains-why
Black women are more likely to be uninsured outside of pregnancy, when Medicaid kicks in, and thus more likely to start prenatal care later and to lose coverage in the postpartum period. They are more likely to have chronic conditions such as obesity, diabetes and hypertension that make having a baby more dangerous. The hospitals where they give birth are often the products of historical #segregation, lower in quality than those where white mothers deliver, with significantly higher rates of life-threatening complications.
Those problems are amplified by unconscious #biases that are embedded in the medical system, affecting quality of care in stark and subtle ways. In the more than 200 stories of #African-American mothers that ProPublica and NPR have collected over the past year, the feeling of being devalued and disrespected by medical providers was a constant theme.
There was the new mother in Nebraska with a history of hypertension who couldn’t get her doctors to believe she was having a heart attack until she had another one. The young Florida mother-to-be whose breathing problems were blamed on obesity when in fact her lungs were filling with fluid and her heart was failing. The Arizona mother whose anesthesiologist assumed she smoked marijuana because of the way she did her hair. The Chicago-area businesswoman with a high-risk pregnancy who was so upset at her doctor’s attitude that she changed OB/GYNs in her seventh month, only to suffer a fatal postpartum stroke.
Over and over, black women told of medical providers who equated being African-American with being poor, uneducated, noncompliant and unworthy. “Sometimes you just know in your bones when someone feels contempt for you based on your #race,” said one Brooklyn, N.Y., woman who took to bringing her white husband or in-laws to every prenatal visit. Hakima Payne, a mother of nine in Kansas City, Mo., who used to be a labor and delivery nurse and still attends births as a midwife-doula, has seen this cultural divide as both patient and caregiver. “The nursing culture is white, middle-class and female, so is largely built around that identity. Anything that doesn’t fit that #identity is suspect,” she said. Payne, who lectures on unconscious bias for professional organizations, recalled “the conversations that took place behind the nurse’s station that just made assumptions; a lot of victim-blaming — ’If those people would only do blah, blah, blah, things would be different.’”
...
But it’s the discrimination that black women experience in the rest of their lives — the double whammy of race and gender — that may ultimately be the most significant factor in poor maternal outcomes.
“It’s chronic stress that just happens all the time — there is never a period where there’s rest from it. It’s everywhere; it’s in the air; it’s just affecting everything,” said Fleda Mask Jackson, an Atlanta researcher who focuses on birth outcomes for middle-class black women.
It’s a type of stress for which education and class provide no protection. “When you interview these doctors and lawyers and business executives, when you interview African-American college graduates, it’s not like their lives have been a walk in the park,” said Michael Lu, a longtime disparities researcher and former head of the Maternal and Child Health Bureau of the Health Resources and Services Administration, the main federal agency funding programs for mothers and infants. “It’s the experience of having to work harder than anybody else just to get equal pay and equal respect. It’s being followed around when you’re shopping at a nice store, or being stopped by the police when you’re driving in a nice neighborhood.”