
Emotions Meet AI

Can an AI tell if you’re happy, sad or angry? Apparently, yes!


A Chinese company, Taigusys, has developed a system that detects people’s facial expressions and compiles reports on how they’re feeling. Taigusys lists multinational corporations such as Huawei among its clients, but it is not yet known whether any of them use the product in the workplace.


How does the system function?

Using AI, the system processes the facial expressions of many people at once, analysing each person’s biometric signals and facial muscle movements. It then sorts the detected emotions into three categories: ‘good’, ‘negative’, or ‘neutral’. ‘Good’ encompasses expressions of happiness and surprise, whilst ‘negative’ includes expressions of anger and sorrow. The software can reportedly even detect when someone is faking a smile.
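
Taigusys has not published how its categorisation actually works, so here is a minimal sketch, assuming a face-analysis model has already produced per-emotion confidence scores for one person. The emotion names, score values and decision rule below are illustrative assumptions, not details from the article:

# Minimal sketch (Python) -- illustrative only, not Taigusys's actual code.
# Assumes a hypothetical face-analysis model has produced per-emotion
# confidence scores for one detected face.

GOOD = {"happiness", "surprise"}      # 'good' category per the article
NEGATIVE = {"anger", "sorrow"}        # 'negative' category per the article

def categorise(scores: dict[str, float]) -> str:
    """Map one person's emotion scores to 'good', 'negative' or 'neutral'."""
    dominant = max(scores, key=scores.get)  # emotion with the highest confidence
    if dominant in GOOD:
        return "good"
    if dominant in NEGATIVE:
        return "negative"
    return "neutral"

# Example: scores for one face in a single video frame (made-up numbers).
person = {"happiness": 0.10, "surprise": 0.05, "anger": 0.20, "sorrow": 0.65}
print(categorise(person))  # -> "negative"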


If a person’s readings skew heavily towards the ‘negative’ category, the software can produce a report and recommend them for 'emotional support'. The system can also flag suspicious behaviour.
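
The reporting logic is not public either; a minimal sketch of how such a flagging rule might work, assuming the system keeps a running list of each person’s categorised observations (the 50% threshold here is an invented example), could look like this:

# Minimal sketch (Python) -- an assumed flagging rule, not the real system.
from collections import Counter

NEGATIVE_SHARE_THRESHOLD = 0.5  # illustrative cut-off, not from the article

def needs_emotional_support(observations: list[str]) -> bool:
    """observations is a sequence of 'good'/'negative'/'neutral' labels over time."""
    if not observations:
        return False
    counts = Counter(observations)
    return counts["negative"] / len(observations) > NEGATIVE_SHARE_THRESHOLD

# Example: one employee's categorised readings over a day (made-up data).
day = ["neutral", "negative", "negative", "good", "negative", "negative"]
if needs_emotional_support(day):
    print("Report: recommend this person for 'emotional support'.")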


As of now, 27 companies in China are developing emotion recognition software.


What do people think about this?

The question of ethics looms large here. Vidushi Marda, for instance, believes this technology conflicts with the legal and ethical rights of employees in the workplace. She states that “Even in the premises of a privately-owned workplace, there's still an expectation of privacy and dignity.” Furthermore, such software can threaten diversity: by pressuring people to behave in ways that appeal to an algorithmic standard, it compromises their ability to think and act freely. She also argues that the systems are based on pseudoscience, meaning the notion that a person’s facial expression reliably indicates their emotional state is unproven.


Another perspective comes from Desmond Ong, an assistant professor at the National University of Singapore's School of Computing. He thinks emotion recognition software could be of great benefit in certain life-or-death scenarios, but cautions that it could also be used to penalize employees based on subjective factors.


So...is it being used?

Right now, prisons are the only places that admit to using the program. The system is installed in 300 prisons in China to keep prisoners docile: by monitoring inmates 24/7, it aims to prevent violent acts and suicides.


Apart from this, though, the fate of emotion recognition technology is still uncertain. We hope its ethical weaknesses can be overcome in some form, since the technology has the potential to revolutionize workplace productivity and well-being for the greater good.



Sources

https://www.insider.com/ai-emotion-recognition-system-tracks-how-happy-chinas-workers-are-2021-6



Written by Amanda Y.

