Social Credit Systems: China and the West

Liav Orgad

Reijers and Van ‘t Klooster have provoked a fascinating debate on social credit systems. The disagreement among the authors is fundamental – both factual and normative. There is no agreement on the definition of a social credit system, the normative benchmark by which to evaluate it, its implications for the institution of citizenship, the function of the Chinese version of a Social Credit System, or the lessons that western democracies can draw from China’s system. The reader is left with no clear answer on whether a Social Credit System, in one version or another, is normatively ‘good’ or ‘bad’ (in terms of legitimacy, fairness, or efficiency). Is the ‘problem’ of China’s Social Credit System related to its design, technological function, and the regime in which it operates, and can there ever be a liberal-democratic version of a Social Credit System? Below, I address one aspect of the debate: the comparison between China and the West.

Credit Systems in the West

Scoring systems are well known in western societies. In the private sector, ranking is everything – Airbnb, Uber, eBay, and even university professors (e.g., the Rate My Professors platform). We rate one another (as passengers, clients, students), products, locations, and situations, with or without technology. Examples are all around us: agencies that rate job applications and provide employment and credit checks (e.g., ‘Experian’); insurance companies that offer better deals to people who are willing to give access to personal data such as fitness trackers (e.g., Vitality’s Fitness Tracker Offers) and tracking devices that record driving patterns (e.g., Allstate or State Farm); landlords who decide on potential tenants based on ‘tenant blacklists’ in the housing rental market; and firms that base their decisions on data bought from ‘data brokers’. The consequences of such ratings can be severe – rejection of a job application or a health insurance policy, or the inability to return a product for customers placed on a ‘store-returns blacklist’ (for those who often return products). The best-known examples are private credit scores in the United States (e.g., FICO) and other states (e.g., Schufa in Germany), which assess the ‘creditworthiness’ of a person based on financial criteria (e.g., payment history, debt burden, credit used, length of credit history, and geosocial data), and whose scores are frequently used by banks, insurance companies, and credit card companies. Such methods are expanding in the West.

Scoring systems are utilised not only in the private sector but also in the public sector. In Israel, for instance, every citizen who serves in the military gets a personal score (a quality index, kaba), which reflects his or her value as a soldier. The numerical figure (41-56) influences placement in military positions, which, in Israel, is likely to affect the future prosperity of the individual. Reported criteria include personal data (education or motivation) as well as the results of a psychometric test and an interview. At Ben Gurion Airport, every passenger is assigned a score according to the security risk he or she presents, and there are ‘blacklists’ and ‘redlists’ according to that risk.[1] Scoring systems are used by public authorities in almost every western country. Not all scoring methods are tech-based, and some of them go unnoticed, but the public sector regularly rates the value, reliability, and credibility of both citizens and non-citizens (think, for instance, of a points-based system for immigration).

At the municipal level, ideas of social credit are on the rise. In recent years, there have been attempts to use technology to motivate ‘good citizenship’ through the creation of a catalogue of ‘good deeds’. Examples of deeds include voting, helping the elderly, taking first-aid courses, organising cultural events, and attending self-employment workshops. Citizens can choose and implement good deeds from a municipal catalogue, which are then recorded and scored; based on the score, each citizen receives a reward from a parallel catalogue of ‘municipal benefits’. Rewards may include free public transportation and bicycle rental, tickets to cultural events, and reduced rates on municipal housing. The idea is to use financial incentives to motivate civic engagement, social solidarity, volunteer work, and social interaction, with the hope that, over time, these incentives will create a culture of communal activism. Different versions of this system exist in European cities (e.g., Innowave CityPoints in Cascais, Portugal).

China and the West

In spite of the rise of social credit systems in the West, the case of China differs significantly for at least four reasons. First, consent and choice: with some exceptions, western rating systems are voluntary, not mandatory. Participation on private platforms such as Airbnb and Uber, and even getting a credit score from a credit card company, is voluntary – one does not have to sign up and can always opt out. In China, however, according to the State Council agenda, the system will become mandatory in 2020, and people will be included in its scope regardless of personal choice (existing regional pilots in China are mostly voluntary). This will create a coercive scoring system in which participation is the default.

Second, scope: rating systems in western societies do not usually score people qua citizens, but qua drivers, landlords, and clients. By targeting people as citizens, China’s Social Credit System does not score them according to a particular, limited profession, but according to their membership in the political community. By scoring people qua citizens, every individual and every single aspect of public life becomes subject to scrutiny. If a person has a low credit score in the United States, she may not be able to get a loan, but this will not affect which schools her child can be admitted to or whether she can travel on high-speed trains. In other words, the Chinese system is not limited to one field but is all-encompassing. As a result, the implications of doing wrong radiate beyond the context of the wrongdoing into other spheres of life. The system is comprehensive for another reason. In western democracies, the rule of law entails that everything which is not forbidden is allowed; this creates a division, not always clear, between law and morality. In China, Social Credit Systems target not only legally impermissible actions but also socially and professionally undesirable ones. By rewarding and sanctioning activities that are not illegal per se but, at most, socially undesirable, China extends ‘citizen-making mechanisms’ to the moral improvement of people.

Third, authority: China is developing a database whose sources are both public and private and whose outcomes will be implemented by both the public and the private sectors. Data sharing between government institutions in China, and between them and private companies – such as Alibaba, Tencent, WeChat, and Baidu – will produce a centralised database in which many aspects of a person’s public and private life are recorded: commercial data (e.g., mobile phone purchases), social data (e.g., social media contacts), and digital data (e.g., internet search history). As an analogy, think of a data-sharing mechanism between the U.S. Government, Facebook, Google, and Amazon. Such a database, and the system of carrots and sticks attached to it, will allow China to extend the system’s logic from ‘good citizenship’ to ‘good personhood’.

Fourth, human rights: having a high or a low score, or being on a blacklist, can have far-reaching implications. The affected human rights are not peripheral but fundamental. If one has a low score as a passenger on Uber, he or she may not be able to get an Uber ride, but can still take a taxi from a different provider, take a bus or a train, or drive a car; except in cases of discrimination, one has no ‘right’ to Uber services or a Facebook account. In China, the affected human rights are fundamental: education, health, and housing, as well as freedom of movement, human dignity, and privacy. Of course, the implications of being blacklisted in the West can also be severe – think of a person who is classified as high-risk at the airport or has a low credit score in the United States – and may infringe upon fundamental human rights. Still, western societies have judicial review, through which a person can challenge the assigned score (and the system as a whole), and citizens can vote in free elections on the future of the system.

This is not a fundamental case against social credit systems as such but, rather, against their design, application, and consequences. In other words, it is not a matter of principle, but of degree. The challenge with which political theorists are likely to struggle in the coming years is how to design a democratic version of social credit. Ultimately, a social credit system mirrors all the good and the bad of the legal system and the socio-economic structure that implements it.

[1] Hasisi, B., Margalioth, Y., & Orgad, L. (2012). ‘Ethnic Profiling in Airport Screening: Lessons from Israel, 1968-2010’, American Law & Economics Review, 14(2), p. 517.