The new frontier of control in China is the recognition of emotions

The recognition systems being deployed in China scan not only facial features but also people's emotions and ethnicity

(Photo: STR/Getty Images)

Emotion recognition is the latest evolution of China's surveillance apparatus. Having already made facial recognition mandatory for anyone registering a smartphone, the Chinese authorities now aim to monitor and control people's moods as well. Inevitably, the use of these technologies also involves the mass collection of sensitive personal data, such as ethnicity and mental health.

"Ordinary people here in China aren't happy about this technology, but they have no choice. If the police say there have to be cameras in a community, people will just have to live with it. That demand has always been there and we're here to fulfil it," Chen Wei of Taigusys, a company specializing in emotion recognition technology, told the Guardian.

China leads the world in the use of facial recognition, and its emotion recognition industry is booming. Taigusys systems are installed in around 300 prisons, detention centers, and custody facilities across China, linking together some 60,000 cameras. Beyond the security sector, emotion-monitoring tools have also been installed in schools to keep watch over teachers, pupils, and staff; in nursing homes, to detect changes in residents' emotional state; and in some shopping malls and parking lots.

Public opinion, the Guardian reports, has voiced some criticism of the use of emotion recognition in schools, but there has been hardly any debate about its use by the authorities to monitor the population at large. Chen, though aware of the criticism, insists on the contribution the system could make to preventing incidents that harm people or property. Emotion recognition technologies claim to infer a person's feelings from traits such as facial muscle movements, tone of voice, and body movements, along with other biometric signals, detecting expressions associated with anger, sadness, happiness, or boredom. Collecting this information, the argument goes, would help prevent crime or violent behavior. On the other hand, the same data can just as easily be used to profile and monitor people within an already heavily surveilled Chinese society.

Algorithm errors

Another problem is that these recognition systems are trained on datasets built with actors and actresses posing in what they believe to be expressions of happiness, sadness, anger, or other emotional states. Facial expressions, however, vary widely across cultures, compounding the systems' inaccuracies with ethnic bias. Moreover, Taigusys systems include identifiers such as "Uyghur," referring to the Muslim ethnic minority living in Xinjiang. "In China," Chen told the Guardian, "our recognition systems are used to distinguish Uyghurs from Han Chinese," the country's dominant ethnic group: "If a Uyghur appears, he will be labeled, which will not happen with a Han."

Human rights organizations such as Article 19, a group that studies the social and legal implications of new technologies, contend that these recognition systems rest on pseudoscientific stereotypes and pose serious dangers to privacy and freedom of expression, not only in China but across the rest of the world. The absence of dedicated legislation leaves the development of these technologies, and the collection of staggering amounts of data, entirely unchecked.
