UN warns of impact of smart borders on refugees: ‘Data collection isn’t apolitical’ | Migration and development


  • UN warns of impact of smart borders on refugees: ‘Data collection isn’t apolitical’

    Special rapporteur on racism and xenophobia believes there is a misconception that biosurveillance technology is without bias

    Robotic lie detector tests at European airports, eye scans for refugees and voice-imprinting software for use in asylum applications are among new technologies flagged as “troubling” in a UN report.

    The UN’s special rapporteur on racism, racial discrimination, xenophobia and related intolerance, Prof Tendayi Achiume, said digital technologies can be unfair and regularly breach human rights. In her new report, she has called for a moratorium on the use of certain surveillance technologies.

    Achiume, who is from Zambia, told the Guardian she was concerned about the rights of displaced people being compromised. She said there was a misconception that such technologies, often considered a “humane” option in border enforcement, were without bias.

    “One of the key messages of the report is that we have to be paying very close attention to the disparate impact of this technology and not just assuming that because it’s technology, it’s going to be fair or be neutral or objective in some way.”

    She cited the example of pushback against Donald Trump’s effort to build a wall between the US and Mexico. “You see that there isn’t a similar sense of outrage when digital technologies are deployed to serve the same function … if you actually look at some of the statistics, and if you look at some of the research, which I cite in my report, it turns out that border deaths have increased in places where smart borders have been implemented.”

    She also raised concerns about the ways in which humanitarian agencies are engaging with surveillance. The report notes that in Afghanistan, the UN refugee agency (UNHCR) requires returning refugees to undergo iris registration as a prerequisite for receiving assistance.

    While the UNHCR has justified the use of this technology as a way to prevent fraud, “the impact of processing such sensitive data can be grave when systems are flawed or abused”, the report said.

    Last year the UN’s World Food Programme partnered with Palantir Technologies, a data mining company, on a $45m (£34m) contract, sharing the data of 92 million aid recipients.

    “Data collection is not an apolitical exercise,” notes Achiume’s report, “especially when powerful global north actors collect information on vulnerable populations with no regulated methods of oversight and accountability.”

    Covid-19 has also accelerated “biosurveillance” – focused on tracking people’s movements and health. Biosurveillance has everyday uses, such as the “track and trace” app in the UK, but there are concerns about the regulation of large-scale data harvested from populations.

    One example is the “Covi-Pass”, a health passport developed by Mastercard and Gavi, a public-private health alliance, that is reportedly due to be rolled out across west Africa. The UN report highlighted the implications of such passports for freedom of movement, “especially for refugees”.

    Petra Molnar from the Refugee Law Lab in Toronto said it was clear that the pandemic was increasing digital rights violations. “State responses to the pandemic are exacerbating the turn towards biosurveillance, with refugees and people on the move acting as communities on which to test various interventions and fast-track tech development,” she said.

    Molnar, who contributed to the UN rapporteur’s report, has noted the dehumanising impact of some technologies on displaced people in her own research. One asylum seeker she spoke to in Belgium said the amount of personal data he’d given up made him feel “like a piece of meat without a life, just fingerprints and eye scans”.

    “Our conversations with refugees and people crossing borders show how little attention is being paid to the lived experiences of people who are at the sharp edges of these high-risk technological experiments,” said Molnar.

    The intersection of technology and human rights violations was highlighted in a recent investigation into the European border agency Frontex, which allegedly witnessed pushbacks of migrants in the Aegean Sea via some of its assets, including drones.

    Konstantinos Kakavoulis from Homo Digitalis, a Greek organisation focused on digital rights, said technologies often outpaced the legal framework.

    “There is no clear regulation for the use of drones or body-worn cameras by the Greek police,” he said. “The police have signed a contract for the provision of facial recognition software with Intracom Telecom, a Greek company, without receiving the opinion of the Greek data protection authority.”

    He added: “Apart from the insufficiency of legal safeguards, we also lack transparency; and this is not only remarkable, but highly problematic.”

    Achiume said that until the impact of surveillance technologies on human rights could be understood, use of such technologies should be halted. “Until we can understand and mitigate those harms, there should just be a moratorium on them.”

    https://www.theguardian.com/global-development/2020/nov/11/un-warns-of-impact-of-smart-borders-on-refugees-data-collection-isnt-ap

    #frontières #smart_borders #frontières_intelligentes #réfugiés #asile #migrations #technologie #politique #biopolitique #technologies_digitales #droits_fondamentaux #droits_humains #surveillance #contrôles_frontaliers #neutralité #Palantir_Technologies #données #biosurveillance #Covi-Pass #Mastercard #Gavi #complexe_militaro-industriel #covid-19 #coronavirus #reconnaissance_faciale #Intracom_Telecom

    ping @karine4 @isskein @etraces @thomas_lacroix