Artificial Intelligence (AI) is developing at a fast clip, and it seems like the future dreamed about for decades in sci-fi novels, TV, and movies will be here before we know it. Certain “smart” programs like Siri and Alexa have already improved some of our lives in more ways than we could have imagined.
What if, instead of merely serving us, these devices had enough intelligence, agency, and free will to become our friends and companions? What if they could read, and possibly even have, emotions just like you and me?
The key to this lies in facial recognition. If AI can use facial recognition to understand our emotions, that future may be closer than we think.
1. We’ve Developed Technology With A High IQ, But Not EQ
Computers can already beat the best human players at chess and can even compose a better song than most people. That's because we've been creating technology with a high IQ, but it's still lacking in EQ. EQ measures emotional intelligence: the ability to understand your own emotions as well as other people's.
If you see someone crying, you can narrow down the scenario and figure out what the cause might be. What might the future look like if AI is able to do the same?
2. It Starts With Mapping Facial Recognition
The way that AI would be able to read your emotions is by mapping your facial features. Points such as the corners of your eyebrows, mouth, and cheeks are crucial to figuring out human emotion.
We’ve already reached a milestone in creating technology that can map our faces and make assumptions based on the expressions we’re making. There are still some difficulties with that, however.
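To make the idea of "mapping facial features" concrete, here is a minimal sketch in Python. The landmark names, coordinates, and threshold below are illustrative assumptions, not a real facial-recognition model; production systems detect dozens of landmark points automatically and feed them to a trained classifier.

```python
# Hypothetical sketch: inferring a coarse emotion from 2D facial
# landmark points. Coordinates follow image convention (y grows downward).

def mouth_curvature(landmarks):
    """Positive when the mouth corners sit above the mouth center
    (a rough smile signal); negative when they sit below (a frown)."""
    _, left_y = landmarks["mouth_left"]
    _, right_y = landmarks["mouth_right"]
    _, center_y = landmarks["mouth_center"]
    corner_y = (left_y + right_y) / 2
    return center_y - corner_y  # > 0 means corners are higher than center

def guess_emotion(landmarks, threshold=2.0):
    """Map the curvature feature to a crude emotion label.
    The threshold is an arbitrary illustrative value."""
    curve = mouth_curvature(landmarks)
    if curve > threshold:
        return "happy"
    if curve < -threshold:
        return "sad"
    return "neutral"

# Example landmark points for a smiling face (corners above center).
smiling = {
    "mouth_left": (40, 98),
    "mouth_right": (80, 98),
    "mouth_center": (60, 104),
}
print(guess_emotion(smiling))  # prints "happy"
```

Real systems such as facial landmark detectors extract these points from camera frames; the geometry-to-emotion step shown here is where the remaining difficulties live.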
3. Not All Emotion Can Be Cataloged
There is already software that has mapped millions of faces and can accurately guess emotions by combining facial expressions with tone of voice. Not all human emotion can be mapped, however.
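The "combining facial expressions with tone of voice" step can be sketched as a simple weighted average of per-emotion confidence scores. The scores and weights below are made up for illustration; real systems typically learn how to fuse the signals rather than using fixed weights.

```python
# Hypothetical sketch of multimodal fusion: blending per-emotion
# confidence scores from a face model and a voice-tone model.

def fuse_scores(face_scores, voice_scores, face_weight=0.6):
    """Weighted average of two score dictionaries.
    face_weight is an illustrative assumption, not a tuned value."""
    voice_weight = 1.0 - face_weight
    emotions = face_scores.keys() | voice_scores.keys()
    return {
        emotion: face_weight * face_scores.get(emotion, 0.0)
                 + voice_weight * voice_scores.get(emotion, 0.0)
        for emotion in emotions
    }

# Made-up confidence scores: the face looks happy, but the voice is flat.
face = {"happy": 0.7, "neutral": 0.2, "sad": 0.1}
voice = {"happy": 0.3, "neutral": 0.5, "sad": 0.2}

fused = fuse_scores(face, voice)
print(max(fused, key=fused.get))  # prints "happy"
```

Note how the flat voice pulls the "happy" score down without flipping the verdict; this is exactly where sarcasm breaks the model, since a sarcastic speaker's face and voice can agree while the meaning is the opposite.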
What about sarcasm? The phrase "I'm so happy to be here" can be delivered in many different ways and carry very different meanings depending on what the speaker intends. AI might not be able to understand sarcasm, irony, and hyperbole.
4. Education Would Improve
If we could create AI that accurately reads human emotion, we could start improving education. Facial expressions during tests and exams could be mapped to determine whether a student is underwhelmed or overwhelmed, comprehending or confused. The system could then adjust course material to meet each student's needs.
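The adjustment loop described above could look something like the following sketch. The confusion score, thresholds, and step sizes are illustrative assumptions; a real tutoring system would derive the score from the emotion model and tune the thresholds empirically.

```python
# Hypothetical sketch: stepping lesson difficulty based on an estimated
# confusion score (0.0 = fully comprehending, 1.0 = completely lost).

def adjust_difficulty(current_level, confusion, low=0.3, high=0.7):
    """Step difficulty down when the student looks overwhelmed,
    up when they look underwhelmed, and hold it otherwise.
    Levels start at 1; thresholds are illustrative assumptions."""
    if confusion > high:
        return max(1, current_level - 1)  # overwhelmed: ease off
    if confusion < low:
        return current_level + 1          # underwhelmed: challenge more
    return current_level                  # in the productive zone

print(adjust_difficulty(3, 0.85))  # prints 2 (student confused, step down)
print(adjust_difficulty(3, 0.10))  # prints 4 (student bored, step up)
print(adjust_difficulty(3, 0.50))  # prints 3 (hold steady)
```

The design choice here is a dead band between the two thresholds, so the difficulty doesn't oscillate every time the student's expression flickers.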