In China, a Three-Digit Score Could Dictate Your Place in Society | WIRED
https://www.wired.com/story/age-of-social-credit
In 2013, Ant Financial executives retreated to the mountains outside Hangzhou to discuss creating a slew of new products; one of them was Zhima Credit. The executives realized that they could use the data-collecting powers of Alipay to calculate a credit score based on an individual’s activities. “It was a very natural process,” says You Xi, a Chinese business reporter who detailed this pivotal meeting in a recent book, Ant Financial. “If you have payment data, you can assess the credit of a person.” And so the tech company began the process of creating a score that would be “credit for everything in your life,” as You explains it.
Ant Financial wasn’t the only entity keen on using data to measure people’s worth. Coincidentally or not, in 2014 the Chinese government announced it was developing what it called a system of “social credit.” That year, the State Council, China’s governing cabinet, publicly called for the establishment of a nationwide tracking system to rate the reputations of individuals, businesses, and even government officials. The aim is for every Chinese citizen to be trailed by a file compiling data from public and private sources by 2020, and for those files to be searchable by fingerprints and other biometric characteristics. The State Council calls it a “credit system that covers the whole society.”
For the Chinese Communist Party, social credit is an attempt at a softer, more invisible authoritarianism. The goal is to nudge people toward behaviors ranging from energy conservation to obedience to the Party. Samantha Hoffman, a consultant with the International Institute for Strategic Studies in London who is researching social credit, says that the government wants to preempt instability that might threaten the Party. “That’s why social credit ideally requires both coercive aspects and nicer aspects, like providing social services and solving real problems. It’s all under the same Orwellian umbrella.”
The State Council has signaled that under the national social credit system people will be penalized for the crime of spreading online rumors, among other offenses, and that those deemed “seriously untrustworthy” can expect to receive substandard services. Ant Financial appears to be aiming for a society divided along moral lines as well. As Lucy Peng, the company’s chief executive, was quoted as saying in Ant Financial, Zhima Credit “will ensure that the bad people in society don’t have a place to go, while good people can move freely and without obstruction.”
As Liu amassed a favorable transaction and payment history on Alipay, his score naturally improved. But it could go down if he neglected to pay a traffic fine, for example. And the privileges that come with a high score might someday be revoked for behaviors that have nothing to do with consumer etiquette. In June 2015, as 9.4 million Chinese teenagers took the grueling national college entrance examination, Hu Tao, the Zhima Credit general manager, told reporters that Ant Financial hoped to obtain a list of students who cheated, so that the fraud could become a blight on their Zhima Credit records. “There should be consequences for dishonest behavior,” she avowed. The good were moving without obstruction. A threat hung over the rest.
The algorithm behind my Zhima Credit score is a corporate secret. Ant Financial officially lists five broad categories of information that feed into the score, but the company provides only the barest of details about how these ingredients are cooked together. Like any conventional credit scoring system, Zhima Credit monitors my spending history and whether I have repaid my loans. But elsewhere the algorithm veers into voodoo, or worse. A category called Connections considers the credit of my contacts in Alipay’s social network. Characteristics takes into consideration what kind of car I drive, where I work, and where I went to school. A category called Behavior, meanwhile, scrutinizes the nuances of my consumer life, zeroing in on actions that purportedly correlate with good credit. Shortly after Zhima Credit’s launch, the company’s technology director, Li Yingyun, told the Chinese magazine Caixin that spending behavior like buying diapers, say, could boost one’s score, while playing videogames for hours on end could lower it. Online speculation held that donating to charity, presumably through Alipay’s built-in donation service, was good. But I’m not sure whether the $3 I gave for feeding brown bear cubs qualifies me as a philanthropist or a cheapskate.
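The real algorithm is, as noted, a corporate secret, but the five listed ingredients suggest the general shape of such a score. A purely illustrative sketch in Python, in which the category names follow the article, the weights and the linear form are invented, and the 350–950 output range is the one Zhima Credit publicly uses:

```python
# Illustrative only: Zhima Credit's actual algorithm is secret.
# The five categories are those Ant Financial lists; the weights
# and the weighted-sum form are assumptions for illustration.
WEIGHTS = {
    "credit_history":  0.35,  # spending and loan-repayment record
    "fulfillment":     0.25,  # whether bills and obligations are honored
    "characteristics": 0.15,  # car, employer, education
    "behavior":        0.15,  # purchase patterns (diapers up, videogames down)
    "connections":     0.10,  # credit of one's Alipay contacts
}

def zhima_sketch(subscores):
    """Map per-category sub-scores in [0, 1] onto the 350-950
    range via an assumed weighted sum."""
    s = sum(WEIGHTS[k] * subscores[k] for k in WEIGHTS)
    return round(350 + s * (950 - 350))
```

The point of the sketch is only that opaque inputs like “Connections” enter the same arithmetic as loan repayment, so a change in who you know can move the number as surely as a missed bill.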
In 2010, Suining became one of the first areas in China to pilot a social credit system. Officials there began assessing residents on a range of criteria, including education level, online behavior, and how well they followed traffic laws. Each of Suining’s 1.1 million citizens older than 14 started out with 1,000 points, and points were added or deducted based on behavior. Taking care of elderly family members earned you 50 points. Helping the poor merited 10 points. Helping the poor in a way that was reported by the media: 15. A drunk driving conviction meant the loss of 50 points, as did bribing an official. After the points were tallied up, citizens were assigned grades of A, B, C, or D. Grade A citizens would be given priority for school admissions and employment, while D citizens would be denied licenses, permits, and access to some social services.
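The Suining scheme is simple bookkeeping: a fixed starting balance, fixed point values per behavior, and a letter grade at the end. A minimal sketch, using the point values from the article; the A–D grade cutoffs are invented for illustration, since the actual thresholds are not given:

```python
# Point values are those reported for the Suining pilot.
POINT_CHANGES = {
    "cared_for_elderly":               +50,
    "helped_poor":                     +10,
    "helped_poor_with_media_coverage": +15,
    "drunk_driving_conviction":        -50,
    "bribed_official":                 -50,
}

def tally(behaviors, start=1000):
    """Sum point changes onto the 1,000-point starting balance."""
    return start + sum(POINT_CHANGES[b] for b in behaviors)

def grade(points):
    """Map a point total to a letter grade (cutoffs are hypothetical)."""
    if points >= 1000:
        return "A"
    if points >= 950:
        return "B"
    if points >= 900:
        return "C"
    return "D"

tally(["cared_for_elderly", "drunk_driving_conviction"])  # 1000 + 50 - 50 = 1000
```

Note the asymmetry the arithmetic makes explicit: a single drunk-driving conviction erases the credit for a year of elder care.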
Although Liu hadn’t signed up for Zhima Credit, the blacklist caught up with him in other ways. He became, effectively, a second-class citizen. He was banned from most forms of travel; he could only book the lowest classes of seat on the slowest trains. He could not buy certain consumer goods or stay at luxury hotels, and he was ineligible for large bank loans. Worse still, the blacklist was public. Liu had already spent a year in jail once before on charges of “fabricating and spreading rumors” after reporting on the shady dealings of a vice-mayor of Chongqing. The memory of imprisonment left him stoic about this new, more invisible punishment. At least he was still with his wife and daughter.
Still, Liu took to his blog to stir up sympathy and convince the judge to take him off the list. As of October he was still on it. “There is almost no oversight of the court executors” who maintain the blacklist, he told me. “There are many mistakes in implementation that go uncorrected.” If Liu had a Zhima Credit score, his troubles would have been compounded by other worries. The way Zhima Credit is designed, being blacklisted sends you on a rapid downward spiral. First your score drops. Then your friends hear you are on the blacklist and, fearful that their scores might be affected, quietly drop you as a contact. The algorithm notices, and your score plummets further.
Now I had two tracking systems scoring me, on opposite sides of the globe. But these were only the scores that I knew about. Most Americans have dozens of scores, many of them drawn from behavioral and demographic metrics similar to those used by Zhima Credit, and most of them held by companies that give us no chance to opt out. Others we enter into voluntarily. The US government can’t legally compel me to participate in some massive data-driven social experiment, but I give up my data to private companies every day. I trust these corporations enough to participate in their vast scoring experiments. I post my thoughts and feelings on Facebook and leave long trails of purchases on Amazon and eBay. I rate others in Airbnb and Uber and care a little too much about how others rate me. There is not yet a great American super app, and the scores compiled by data brokers are mainly used to better target ads, not to exert social control. But through a process called identity resolution, data aggregators can use the clues I leave behind to merge my data from various sources.
Do you take antidepressants? Frequently return clothes to retailers? Write your name in all caps when filling out online forms? Data brokers collect all of this information and more. As in China, you may even be penalized for who your friends are. In 2012, Facebook patented a method of credit assessment that could consider the credit scores of people in your network. The patent describes a tool that arrives at an average credit score for your friends and rejects a loan application if that average is below a certain minimum. The company has since revised its platform policies to prohibit outside lenders from using Facebook data to determine credit eligibility. The company could still decide to get into the credit business itself, though. (“We often seek patents for technology we never implement, and patents should not be taken as an indication of future plans,” a Facebook spokesperson said in response to questions about the credit patent.) “You could imagine a future where people are watching to see if their friends’ credit is dropping and then dropping their friends if that affects them,” says Frank Pasquale, a big-data expert at the University of Maryland Carey School of Law. “That’s terrifying.”
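The mechanism the patent describes reduces to a few lines: average the credit scores of the applicant’s contacts and reject the loan if that average falls below a cutoff. A hedged sketch, where the function name and the 600-point minimum are invented (the patent specifies the method, not concrete values):

```python
from statistics import mean

def loan_decision(friend_scores, minimum=600):
    """Approve a loan only if the average credit score of the
    applicant's social-network contacts meets the minimum, as the
    patent describes. The 600-point cutoff is hypothetical."""
    if not friend_scores:
        return False  # no network data: nothing to average, so no approval
    return mean(friend_scores) >= minimum

loan_decision([700, 650, 580])  # mean ~643 -> True
loan_decision([500, 550, 620])  # mean ~557 -> False
```

The sketch makes Pasquale’s warning concrete: because only the mean matters, dropping one low-scoring friend is enough to push a borderline applicant over the threshold.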