Is it possible to tell whether someone is a criminal just from looking at their face or listening to the sound of their voice? The idea may seem ludicrous, like something out of science fiction — Big Brother in “1984” detects any unconscious look “that carried with it the suggestion of abnormality” — and yet some companies have recently begun to answer this question in the affirmative. AC Global Risk, a startup founded in 2016, claims to be able to determine your level of “risk” as an employee or an asylum-seeker based not on what you say, but on how you say it.
AC Global Risk, which boasts the consulting firm of Robert Gates, Condoleezza Rice, and Stephen Hadley on its advisory board, has advertised contracts with the U.S. Special Operations Command in Afghanistan, the Ugandan Wildlife Authority, and the security teams at Palantir, Apple, Facebook, and Google, among others. The extensive use of risk screening in these and other markets, the company’s CEO, Alex Martin, has said, has proven that it is “highly accurate, scalable, cost-effective, and capable of high throughput.” AC Global Risk claims that its Remote Risk Assessment (RRA) system can simultaneously process hundreds of individuals anywhere in the world. Now, in response to President Donald Trump’s calls for the “extreme vetting” of immigrants, the company has pitched itself as the ultimate solution to “the monumental refugee crisis the U.S. and other countries are currently experiencing.”
Some skeptical experts who study AI and human behavior have framed these tools as part of a growing resurgence of interest in physiognomy, the practice of looking to the body for signs of moral character and criminal intent.