Black Mirror in Beijing … China’s new social credit scoring system is being implemented right now

March 4, 2019

China plans to rank all its citizens based on their “social credit” by 2020. Citizens will be rewarded or punished according to their scores. Like private financial credit scores, a person’s social score can move up and down according to their behaviour.

The programme is already being piloted for millions of people across the country; by the end of next year it is due to be fully operational nationwide, and participation will be mandatory.

At the moment the system is only partly implemented: some pilots are run by city councils, while others are scored by private tech platforms that hold personal data, most significantly Alibaba’s City Brain project. People can be penalised for perceived poor behaviour such as bad driving, smoking in non-smoking zones, buying too many video games and posting fake news online. Criticism of the government, and bolder moves of that kind, are not even mentioned on the scale.

China has already started punishing people by restricting their travel. Nine million people with low scores have been blocked from buying tickets for domestic flights, Channel News Asia reported in March, citing official statistics. Other punishments include throttling your internet speed or cutting off access entirely, and blocking you from the best schools, healthcare or jobs. The system could also bar you from hotels and restaurants, take your pets away, or publicly shame you.

China’s new social credit system has been compared to Black Mirror, the Netflix series that dramatises a similar concept, and to the many other dystopian futures that used to belong only to sci-fi. It is fascinating, but it also raises huge ethical and human rights questions.

The idea for social credit was announced by President Xi’s government as an opt-in system in 2014. But there’s a difference between the official government system and private, corporate versions, though the scoring systems have already started to merge.

In Western countries we are accustomed to credit checks, where data brokers like Experian trace the timely manner in which we pay our debts, giving us a score that’s used by lenders and mortgage providers. Anyone who has shopped online with eBay has a rating on shipping times and communication, while Uber drivers and passengers, and Airbnb hosts and guests, rate each other.

China’s social credit system extends the same idea to all aspects of life, evaluating every citizen’s behaviour and trustworthiness. Caught jumping a traffic light, not paying a court fine, playing your music too loud on the train … you could lose certain rights, such as booking a flight or train ticket.

“The idea itself is not a Chinese phenomenon,” says Mareike Ohlberg, research associate at the Mercator Institute for China Studies. Nor is the use, and abuse, of aggregated data for analysis of behaviour. “But if the Chinese system does come together as envisioned, it would still be something very unique,” she says. “It’s both unique and part of a global trend.”

Are you a good citizen?

The system was unveiled in a 2014 plan; most pieces of it are already in place, and the Chinese government is targeting 2020 to have it running in full across the huge nation.

As yet, there is no single social credit system. Instead, local governments and cities have their own social record systems that work differently, while unofficial private versions are operated by companies such as Ant Financial’s Zhima Credit, better known as Sesame Credit (Ant is the payment firm spun out of Alibaba). These systems use shopping habits, among other data, to inform credit-style scores, on an opt-in basis.

The private systems, including Ant Financial’s Sesame Credit, are often conflated with the government plans, though they aren’t part of the official system. To make things more confusing, the data collected by private companies is expected to be used by the government in the future, and some of it is already used in government trials. Sesame Credit says this happens only with user consent.

What’s troubling is when those private systems link up to the government rankings, which is already happening in some pilots. “You’ll have sort of memorandum-of-understanding-like arrangements between the city and, say, Alibaba and Tencent about data exchanges and including that in assessments of citizens,” Ohlberg adds. That’s a lot of data being collected with little protection, and no algorithmic transparency about how it’s analysed to spit out a score or ranking, though Sesame does share some details about what types of data are used.

How the social credit system works

The target by the end of next year is for the government system to be nationwide, with businesses given a “unified social credit code” and citizens an identity number, all linked to a permanent record. “If you go to a Credit China website, and you have an entity’s credit code, you can type that in and pull up credit records,” explains Samantha Hoffman. “Individuals will have ID-linked codes.” It’s less a score, she says, and more of a record.

Some reports talk about a blacklist; that is part of the official government social credit system, and it means that if you owe the government money, for example, you could lose certain rights. There’s a difference between getting a low social credit score and being blacklisted by the government, for instance for refusing to pay a fine.
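To make Hoffman’s “record, not a score” point and the blacklist distinction a little more concrete, here is a minimal sketch in Python. It assumes a registry keyed by a unified social credit code, where a lookup returns a history of recorded events plus any blacklist entry rather than a single number; the field names, the placeholder code and the example data are all invented for illustration and are not taken from any real system.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CreditRecord:
    """Illustrative entry: a history of events and an optional blacklist reason, not a score."""
    credit_code: str                                   # unified social credit code (placeholder format)
    events: List[str] = field(default_factory=list)    # recorded behaviour, e.g. an unpaid court fine
    blacklist_reason: Optional[str] = None             # set only if the entity is on an official blacklist

# Toy registry standing in for the kind of lookup described for a Credit China portal.
REGISTRY = {
    "EXAMPLE-CODE-001": CreditRecord(
        credit_code="EXAMPLE-CODE-001",
        events=["unpaid court fine (2018)"],
        blacklist_reason="refused to pay a court-ordered fine",
    ),
}

def pull_up_record(credit_code: str) -> Optional[CreditRecord]:
    """Look up an entity's record by its credit code, as one would on a credit portal."""
    return REGISTRY.get(credit_code)

record = pull_up_record("EXAMPLE-CODE-001")
if record is not None:
    print(record.events, record.blacklist_reason)
```

The point of the sketch is simply that being blacklisted is a flag attached to the record, separate from any numeric score a local pilot might also compute.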

The criteria that go into a social credit ranking depend on where you are, notes Ohlberg. “It’s according to which place you’re in, because they have their own catalogs,” she says. It can range from not paying fines when you’re deemed fully able to, misbehaving on a train, standing up a taxi, or driving through a red light.

One city, Rongcheng, gives all residents 1,000 points to start. Authorities make deductions for bad behaviour like traffic violations, and add points for good behaviour such as donating to charity. One regulation Ohlberg recently read specifically addresses stealing electricity. Of course, you’ll have to get caught first or be reported by someone else. While facial recognition is infamously used to spot jaywalkers, in some cities it’s not so automated, Ohlberg notes.
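As a rough illustration of the Rongcheng-style points ledger described above, here is a minimal sketch in Python, assuming the 1,000-point baseline with deductions for infractions and additions for good deeds. The individual point values, event names and rating bands are invented for illustration; the real catalogues vary from city to city and are not published in full.

```python
BASE_SCORE = 1000  # every resident starts with 1,000 points, as in Rongcheng

# Hypothetical point adjustments; the real values differ and vary by local catalogue.
ADJUSTMENTS = {
    "traffic_violation": -5,
    "stealing_electricity": -10,
    "charity_donation": +5,
}

def score_resident(events):
    """Apply each recorded event's adjustment to the base score."""
    return BASE_SCORE + sum(ADJUSTMENTS.get(event, 0) for event in events)

def rating_band(score):
    """Map a score to a coarse band; thresholds here are purely illustrative."""
    if score >= 1000:
        return "A"
    if score >= 950:
        return "B"
    return "C"

events = ["traffic_violation", "traffic_violation", "charity_donation"]
total = score_resident(events)
print(total, rating_band(total))  # 995 B
```

In practice, as Ohlberg notes, whether an event ends up in the ledger at all depends on being caught or reported, which is why this sketch only scores events that have already been recorded.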

Private projects such as Sesame Credit hoover up all sorts of data on their customers (Sesame alone has 400 million), from how much time they spend playing video games (that’s bad) to whether they’re a parent (that’s good). That data can be shared with other companies. One example is Sesame Credit linking up with the Baihe dating site, so would-be partners can judge each other on their looks as well as their social credit score; that system is opt-in.

So far, taking part in both the private and government versions is technically voluntary; in the future, the official social credit system will be mandatory. That said, there’s plenty of pressure to take part now. “There are incentives for participating, and disincentives for not participating,” Hoffman notes.

What happens if you’re blacklisted?

As noted above, nine million people with low scores have been blocked from buying tickets for domestic flights. Authorities can also clamp down on luxury options: three million people are barred from buying business-class train tickets.

The eventual system will punish bad passengers specifically. Potential misdeeds include trying to ride with no ticket, loitering in front of boarding gates, or smoking in no-smoking areas.

According to Foreign Policy, credit systems monitor whether people pay bills on time, much like financial credit trackers — but also ascribe a moral dimension.

Other mooted punishable offences include spending too long playing video games, wasting money on frivolous purchases and posting on social media.

Spreading fake news, specifically about terrorist attacks or airport security, will also be a punishable offence.

Liu Hu is a journalist in China who writes about censorship and government corruption. Because of his work, Liu has been arrested, fined and blacklisted. He found he had been named on a List of Dishonest Persons Subject to Enforcement by the Supreme People’s Court as “not qualified” to buy a plane ticket, and banned from travelling on some train lines, buying property, or taking out a loan.

“There was no file, no police warrant, no official advance notification. They just cut me off from the things I was once entitled to,” he told The Globe and Mail. “What’s really scary is there’s nothing you can do about it. You can report to no one. You are stuck in the middle of nowhere.”

What recourse is there? With the government system, if you want to be removed from a blacklist, you can either pay your bill or appeal to the court, says Jing Zeng, a researcher at the University of Zurich. “Bring your money to the court and then you get removed from the system,” she says. “It’s not a judicial system by itself, it’s still the court you need to [appeal to].”

However, the Chinese justice system leaves much to be desired, says Hoffman. “There are no genuine protections for the people and entities subject to the system,” she says.

“In China there is no such thing as the rule of law. Regulations that can be largely apolitical on the surface can be political when the Chinese Communist Party (CCP) decides to use them for political purposes.” In April 2018, the Civil Aviation Administration of China (CAAC) sent letters to international airlines demanding they show Taiwan as part of China, saying the government would “make a record of your company’s serious dishonesty and take disciplinary actions” against any that didn’t comply; they all eventually did. The system used to pressure the airlines was a pilot of the Civil Aviation Industry Credit Measures, which is part of the official social credit system.

Alongside the potential for abuse of power, the knock-on effects of statewide surveillance, and the likelihood of incorrect data, Ohlberg notes that a few bad marks on a social credit record could spark a negative spiral.

While it varies by programme, in some local pilots a positive rating means discounts and benefits, such as a simplified process with bureaucracies. If you have a low rating, you may have extra paperwork or fees. “Once you’re in a low category, it makes it difficult,” she says. “I see a huge potential for negative spiral.” Such a system could further divide society, creating classes of people depending on their social credit — and this is where comparisons to Black Mirror pop up.

What is China’s objective?

“It’s all about building trust,” says the Chinese government. The 2014 document describing the government’s plans notes that as “trust-keeping is insufficiently rewarded, the costs of breaking trust tend to be low.”

And Chinese society does have trust issues, says Ohlberg, be it food quality scandals, pollution, or employers not paying their workers. “But the system can also be used to enforce vague laws like endangering national security or unity,” she adds. Zeng says the system can also target food safety and product quality, major problems in the country. “It’s a big problem in Chinese society,” she says. “They are punishing companies for this kind of bad behaviour.”

Plus, it could help build alternative means of financial credit, says Ohlberg, as many people in China live outside the formal financial system and so have no trustworthy credit rating. “Some of the earlier pilots of the social credit system that preceded the major policy plan that was published in 2014 were actually building a social credit system for the countryside,” she says. “The majority of people there wouldn’t have financial banking data on them.” For businesses, a social credit system could also be used to assess micro enterprises, which can’t be evaluated with traditional criteria.

Hoffman isn’t buying that argument, saying such a system is about government power. “If solving problems was the real goal, the CCP would not need social credit to do it,” she says. “China’s social credit system is a state-driven program designed to do one thing, to uphold and expand the Chinese Communist Party’s power.”

She adds that social credit is a tech-enabled way to tie political power to social and economic development that’s been discussed in the country since the 1980s, an automation of Chairman Mao’s Mass Line — a term to describe how the party’s leadership shaped and managed society. “In Mao’s China, the Mass Line relied on ideological mass mobilisation, using Mao Zedong’s personal charisma, to force participation,” Hoffman says. “The CCP could no longer, after the Mao era, rely on ideological mobilisation as the primary tool for operationalising social management.”

The full extent of the impact of social credit on Chinese citizens is impossible to say, simply because the system doesn’t fully exist yet. Zeng suggests the reality is somewhere between the government’s claims and the Western media’s descriptions of horror-filled dystopias. “It’s very much like a baby step,” she said of the work that’s happened so far.

Ohlberg agrees that early reporting had multiple errors that led to misunderstandings of the system, but that doesn’t mean social credit isn’t dangerous. “It’s somewhere between the people who say the media coverage is inaccurate and that means it’s not so bad, and the people who see this huge dystopia,” she says. “You have to find this space between, where you can explain it is actually quite scary, even if it’s not quite the way it’s portrayed.”

Because of that, no other country should be considering this idea, says Hoffman. “The West should not copy any aspect of social credit,” she says. “Often comparisons are drawn between private applications like Uber and its rating system for customers and drivers. While these private company systems are extremely problematic in my view, they are fundamentally different. The People’s Republic of China is an authoritarian country, and the Chinese Communist Party has been responsible for gross human rights violations for decades; just look at the example of Xinjiang now. There is nothing any liberal democratic society should even think about copying in the social credit system.”

Related concepts

China has rolled out new AI-enabled facial recognition eyewear to police forces over recent months, enabling officers to recognise people instantly and deal with them individually. Manufacturer LLVision says the glasses can identify individuals from a pre-loaded database of 10,000 suspects in just 100 milliseconds.

Alibaba is creating a “City Brain” for each major city in China, combining vast amounts of public and private data (there will soon be 600 million CCTV cameras on Chinese streets), enhanced by AI and machine learning, to do everything from monitoring traffic flows and forecasting the weather to responding to emergencies more quickly and tying citizen monitoring to credit scores.
