Should we be using emotion detection AI?
Facial recognition software, stress scoring, and biometric surveillance in hiring and policing are wrong and obnoxious.
These tools are cousins of the keystroke trackers some employers install on their employees’ machines, especially the ones that take screenshots of whatever the employee happens to be working on at the moment.
The problem with these programs, particularly in hiring and management, is that they give the employer a massive, unfair advantage over the employee.
If I walk into an interview after a rough bus ride, stress scoring won’t know what’s causing my stress; it will only register that I’m stressed about something.
Police use of facial recognition is pitched as a way to protect us from terrorists and criminals, but it’s half a step from there to tracking down political opponents or monitoring a spouse’s movements.
Here’s the problem: these technologies already exist, and they are nearly unavoidable in the real world.
For employees and applicants, collective zero tolerance toward this type of technology is the only way to keep it out of the workplace.
As a society, we need to make it clear that we do not want government and police forces defaulting to this technology every time they do police work.
The difficulty, of course, is that the people who will use it for nefarious purposes are never going to consult us about when and where they do so.
How can we deal with things like stress scoring and facial recognition software as a society?
#AI #AIissues #AIethics #business #AISensum #AIagents #technology #responsibletech #AIreplacementtheory
