05/31/2022 / By Ramon Tomey
China is now using specialized “emotion recognition” technology to monitor people’s feelings, allowing the Chinese regime to criminalize certain emotions and arrest citizens on that basis. In fact, it has begun using artificial intelligence that scrutinizes whether a person is more nervous than usual for crime prevention purposes, as reported by a state-run publication.
Chinese state-run paper Global Times wrote in a March 4 article that the Chinese Communist Party (CCP) is applying the technology in “various fields” such as health and urban security. It added that Chinese citizens are increasingly becoming accustomed to the everyday use of emotion recognition technology.
The state-run paper gave an example of how emotion recognition technology can fight crime: with artificial intelligence (AI) pre-warning systems observing drivers and passengers, officers can spot people who look unusually nervous inside a car. A subsequent flag-down and search can then reveal illegal drugs hidden in the vehicle.
Global Times remarked that emotion recognition technology in China has achieved “high precision” in practical use. According to EmoKit Tech Co., Ltd. head Wei Qingchen, products based on AI emotion recognition can achieve an accuracy rate of 70 percent to 95 percent. His firm specializes in developing products based on its emotion recognition engine Emokit: The engine “accurately [combines] signals from video, audio and body sensors and [performs] multi-frequency overlay analysis.”
Wei cited an instance of the Emokit engine’s accuracy in a clinical setting: a project done in cooperation with a renowned Beijing psychiatric hospital. The AI system achieved a combined computerized accuracy of 78.8 percent in identifying schizophrenia, almost matching clinical testing performance. He added that the system diagnosed depression with 70 percent accuracy merely by listening to a patient’s voice.
Global Times also noted the use of emotion recognition technology in the Chinese penal system in 2019, adding that it has “contributed to the risk assessment of prisoners in a few regional prisons.” According to sociology professor Ma Ai of the China University of Political Science and Law, the technology helps prison officials evaluate whether a prisoner “presents potential risks … [such as] possible mental problems and violent or suicidal tendencies.” He added that the technology can also estimate a prisoner’s likelihood of recidivism after release.
Ma told the state-run paper that this “most advanced [technology] in the world” is used in five to six prisons in China. (Related: Big Tech propping up China’s police state surveillance system.)
A December 2019 report by the MIT Technology Review estimated the emotion recognition technology market at no less than $20 billion, and growing rapidly. Currently, the technology is used to assess job applicants and criminal suspects. Further tests are underway for other applications, such as virtual reality headsets that gauge the emotional states of gamers.
Breitbart National Security Editor Frances Martel warned that China can use emotion recognition technology for more sinister purposes. She specifically noted that it can be woven into existing systems to surveil Muslim Uighurs in the western autonomous region of Xinjiang. The CCP has installed high-resolution security cameras with facial recognition technology to monitor the movements of the Uighur minority. This is aside from the arbitrary detention and forced sterilization imposed on them – which a number of nations have condemned. (Related: Leaked documents show how China is using AI and mass surveillance to commit “cultural genocide” and put people in concentration camps.)
Aside from these possible malicious uses, privacy concerns have also emerged in the budding industry. Some have voiced discomfort about AI technology capturing and “reading” their emotions. Others have pointed to the possible abuse or leakage of personal information, which could undermine people’s safety.
But a number of Chinese insiders have remarked that emotion recognition technology need not be vilified. Ningbo University neuromanagement expert Ma Qingguo said: “The right to life is a fundamental human right. In some cases, emotion recognition can figure out dangerous or violent persons and help prevent their wrongdoings – which helps save people’s lives.”
Meanwhile, Wei commented that emotion computing is not an evil “mind-reading” technique. He asserted that Chinese companies behind emotion recognition technologies “strictly follow the principle of ‘technology for good’.” The EmoKit head told the Global Times that emotion recognition technologies have the ultimate aim of “better serving people” and “not letting lawbreakers get away with [their crimes].”
Visit Surveillance.news to learn more about the CCP’s foray into monitoring citizens’ emotions.
COPYRIGHT © 2018 ORWELLIAN.NEWS