
Function creep: China’s COVID data and surveillance

COVID-19 monitoring data could be used to expand China’s citizen surveillance system, argue Ausma Bernot, Alexander Trauth-Goik and Sue Trevaskes 

Sep 01, 2021
Photo: EPA/Wu Hong

China has used big data to trace and control the outbreak of COVID-19. This has involved a significant endeavour to build new technologies and expand its already extensive surveillance infrastructure across the country.

In our recent study, we show how the State Council, the highest administrative government unit in China, plans to retain some of those new capabilities and incorporate them into the broader scheme of mass surveillance at a national level. This is likely to lead to tighter citizen monitoring in the long term.

This phenomenon of adopting a surveillance system for one purpose and then using it beyond its originally intended aims is known as “function creep”.

In China, this involves the use of big data initially collected to monitor people’s COVID status and movements around the country to keep the pandemic under control. The Chinese government has been quite successful at this, despite recent spikes in infections in eastern China.

But this big data exercise has also served as an opportunity for authorities to patch gaps in the country’s overall surveillance infrastructure and make it more cohesive, using the COVID crisis as cover to avoid citizen backlash.

Mass testing at a factory in Wuhan, where COVID was first detected in 2019. Photo: AP

How China’s COVID surveillance system worked

Two key shifts have occurred to enable more comprehensive surveillance during the pandemic.

First, a more robust system was constructed to collect and monitor big data related to pandemic control.

Second, these data were collated at the provincial level and transferred to a national, unified platform for analysis, which calculated a COVID exposure risk level for every individual.

This is how it worked. Every night, Chinese citizens received a QR code, known as a “health code”, on their mobile phones. To obtain the code, users had to upload personal information to a dedicated app to verify their identity (such as their national ID number and a biometric selfie), along with their body temperature, any COVID symptoms, and their recent travel history.

The system then assessed whether they had been in close contact with an infected person. A green code meant users were free to move about, an orange code mandated seven days of home isolation, and a red code meant 14 days of isolation.
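For readers curious about the mechanics, the colour-coded logic described above can be sketched in a few lines of code. This is an illustration only: the field names, risk rules and isolation periods below are assumptions based on public reporting, not the actual health code system.

    # Illustrative sketch only: approximates the reported green/orange/red logic.
    # All names and rules here are assumptions, not the actual health-code system.
    from dataclasses import dataclass

    @dataclass
    class DailyReport:
        close_contact_with_case: bool   # flagged by contact-tracing data
        visited_high_risk_area: bool    # e.g. recent travel through a hotspot province
        has_symptoms: bool              # self-reported fever, cough, etc.

    def assign_health_code(report: DailyReport) -> tuple:
        """Return (code colour, days of mandated isolation)."""
        if report.close_contact_with_case or report.has_symptoms:
            return "red", 14      # reported as 14 days of isolation
        if report.visited_high_risk_area:
            return "orange", 7    # reported as seven days of home isolation
        return "green", 0         # free to move about

    # Example: a user who recently travelled through a hotspot province
    print(assign_health_code(DailyReport(False, True, False)))  # ('orange', 7)

In practice the real system drew on far richer data, including location histories and payment records, but the basic pattern of feeding nightly self-reports and tracing data into a single risk decision is what made the scheme so easy to repurpose.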

The system was not perfect. Some people suspected their codes remained red because they were from the hotspot province of Hubei, or questioned why their codes unexpectedly turned red for just one day. Others reported the codes incorrectly identified their exposure risk.

Staff checking people’s green ‘health codes’ at the entrance to a park in Shanghai. Photo: Yang Jianzheng/AP

How Chinese people feel about this data collection

Multiple studies suggest that, although the system was intrusive, this state-controlled big data monitoring was supported by the public because it was so effective in containing the epidemic.

A recent study found the public viewed this comprehensive data collection as positive and that it helped strengthen the legitimacy of the Chinese Communist Party.

The Chinese public also viewed the initial criticism from Western countries as unfair and hypocritical, given many subsequently adopted varying forms of big data collection systems themselves.

One scholar, Chuncheng Liu, canvassed Chinese social media and observed a notable social backlash against this type of criticism. After the state of South Australia released a new QR code system, for example, one comment read:

China QR code – ‘invasion of privacy, invasion of human rights’. Australian QR Code – ‘Fantastic new tool’.

On the flip side, there has been some public resistance in China over the potential for health codes to be re-engineered and used for other purposes.

The city of Hangzhou was the first to implement the health codes in February 2020. However, in May 2020 when the municipal government proposed re-purposing the app for other uses after the pandemic (such as mapping people’s lifestyle habits), it was met with strong citizen backlash.

Concerns were further exacerbated when health code data was hacked in Beijing in December 2020. The hackers published the selfies that celebrities had used for biometric identity verification, as well as their COVID testing data.

How these systems can be used for other purposes

When big data systems become as expansive as they are now in China, they can shape, direct and even coerce behaviours en masse. The implications of this in a surveillance state are concerning.

In the Guangxi autonomous region in March 2020, for example, one party member suggested using pandemic surveillance to “search for people that couldn’t previously be found”, effectively turning a health service into a security tool.

Another example is how China’s notorious “social credit system” was revamped during the pandemic.

The system was originally set up before the pandemic to rate myriad “trustworthy” and “untrustworthy” behaviours among individuals and businesses. Good scores came with benefits such as cheaper transportation.

During the pandemic, this system was expanded to reward people for “good pandemic behaviour” and punish “bad pandemic behaviour”. Two academics in the Netherlands found punishments were imposed for selling medical supplies at inflated prices, selling counterfeit supplies, or violating quarantine rules.

Such behaviour could get a person blacklisted, which might bar them from travelling or even from working as a civil servant, among other restrictions.
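The blacklisting mechanism can likewise be sketched as a simple rule check. Again, this is illustrative only: the violation categories and penalties below are assumptions drawn from the behaviours reported above, not the actual social credit rules.

    # Heavily simplified illustration of rule-based blacklisting.
    # Violation types and penalties are assumptions drawn from the behaviours
    # reported above, not the actual social credit system.
    BLACKLIST_VIOLATIONS = {
        "price_gouging_medical_supplies",
        "selling_counterfeit_supplies",
        "quarantine_violation",
    }

    RESTRICTIONS_IF_BLACKLISTED = ["no_air_or_rail_travel", "barred_from_civil_service"]

    def evaluate(person_violations):
        """Return the restrictions applied if any blacklist-triggering violation is recorded."""
        if set(person_violations) & BLACKLIST_VIOLATIONS:
            return RESTRICTIONS_IF_BLACKLISTED
        return []

    print(evaluate({"quarantine_violation"}))  # both restrictions apply
    print(evaluate(set())) 
    # no restrictions

The point of the sketch is that once such rules exist in code, adding a new violation category, pandemic-related or otherwise, is trivial, which is precisely how function creep happens.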

As we argue, it is crucial these surveillance systems embed principles of transparency and accountability within their design. If these systems aren’t thoroughly tested or their potential future uses questioned, people can become habituated to top-down surveillance and function creep.

To what extent these new surveillance systems will direct the behaviours of people in China remains to be seen. A lot depends on how the public reacts to them, especially as they are used for non-health purposes after the pandemic.

Ausma Bernot, PhD Candidate, Griffith University; Alexander Trauth-Goik, University of Wollongong; and Sue Trevaskes, Head of School (Interim), Griffith University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

