Smartphones may soon be smart enough to know your emotions too


Smartphones may soon become smarter still, sensing the mood of their users. Researchers at the University of Rochester are working on a project to make this possible.

The researchers have developed an algorithm for speech-based emotion classification, which they will present at the IEEE Workshop on Spoken Language Technology on Dec. 5 in Miami, Florida. The program gauges human emotion from speech while ignoring the meaning of the words, and it does so with greater accuracy than previous approaches.

“We actually used recordings of actors reading out the date of the month – it really doesn’t matter what they say, it’s how they’re saying it that we’re interested in,” Wendi Heinzelman, professor of electrical and computer engineering, said in a statement.

Heinzelman said the program analyzes 12 features of speech, such as volume and pitch, to identify one of six emotions, and that it achieved 81% accuracy.
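To make that pipeline concrete, here is a minimal sketch of speech-based emotion classification in Python, assuming librosa for audio analysis and scikit-learn for the classifier. The article names only volume and pitch among the 12 features, so the specific feature set, the six emotion labels, and the SVM below are illustrative assumptions, not the Rochester team's published method.

```python
# Illustrative sketch: summarize each utterance as 12 prosodic statistics
# and train a classifier. The exact features and labels are assumptions.
import numpy as np
import librosa
from sklearn.svm import SVC

# Hypothetical six emotion classes; the article does not name them.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "neutral"]

def extract_features(path):
    """Reduce an utterance to 12 numbers: 4 statistics over 3 feature tracks."""
    y, sr = librosa.load(path, sr=16000)
    # Volume: frame-level RMS energy.
    rms = librosa.feature.rms(y=y)[0]
    # Pitch: fundamental frequency via the pYIN tracker (NaN on unvoiced frames).
    f0, _, _ = librosa.pyin(y, fmin=65, fmax=400, sr=sr)
    f0 = f0[~np.isnan(f0)] if np.any(~np.isnan(f0)) else np.zeros(1)
    # Zero-crossing rate, standing in for the remaining unnamed features.
    zcr = librosa.feature.zero_crossing_rate(y)[0]
    stats = []
    for track in (rms, f0, zcr):
        stats += [track.mean(), track.std(), track.min(), track.max()]
    return np.array(stats)  # 3 tracks x 4 statistics = 12 features

def train(paths, labels):
    """Fit a simple SVM on labeled clips (classifier choice is an assumption)."""
    X = np.stack([extract_features(p) for p in paths])
    clf = SVC(kernel="rbf")
    clf.fit(X, labels)
    return clf

# Hypothetical usage with your own labeled recordings:
# clf = train(["clip1.wav", "clip2.wav"], ["happiness", "sadness"])
# print(clf.predict([extract_features("new_clip.wav")]))
```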

Na Yang, one of Heinzelman’s graduate students, developed a prototype of such an app during a summer internship at Microsoft Research; it displays a happy or sad face after recording and analyzing the user’s voice.
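The prototype's internals are not described beyond "recording and analyzing," but a toy version of that loop might look like the following, assuming the `sounddevice` and `soundfile` libraries for microphone capture and reusing the hypothetical `extract_features` and `clf` from the sketch above. This is a reconstruction for illustration, not Na Yang's actual app.

```python
# Toy record-then-classify loop, assuming extract_features/clf defined earlier.
import sounddevice as sd
import soundfile as sf

SR = 16000
audio = sd.rec(int(3 * SR), samplerate=SR, channels=1)  # record 3 seconds
sd.wait()                                               # block until done
sf.write("clip.wav", audio, SR)                         # save for analysis
label = clf.predict([extract_features("clip.wav")])[0]  # classify the clip
print(":)" if label == "happiness" else ":(")           # happy or sad face
```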

“The research is still in its early days,” Heinzelman added, “but it is easy to envision a more complex app that could use this technology for everything from adjusting the colors displayed on your mobile to playing music fitting to how you’re feeling after recording your voice.”

The researchers report that accuracy drops from 81% to about 30% when the system is tested on a voice different from the one it was trained on, and they are working to minimize this effect. As Heinzelman said, “there are still challenges to be resolved if we want to use this system in an environment resembling a real-life situation, but we do know that the algorithm we developed is more effective than previous attempts.”
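That gap reflects the difference between speaker-dependent testing (held-out clips from voices the system has heard) and speaker-independent testing (entire voices held out). The sketch below shows how the two evaluation protocols differ in scikit-learn; it runs on synthetic placeholder data, so it illustrates the methodology only and will not reproduce the reported numbers.

```python
# Speaker-dependent vs speaker-independent evaluation on placeholder data.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, StratifiedKFold, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 12))          # 12 features per utterance (synthetic)
y = rng.integers(0, 6, size=120)        # six emotion classes
speakers = np.repeat(np.arange(6), 20)  # six speakers, 20 clips each

clf = SVC()
# Speaker-dependent: every fold mixes clips from all speakers.
dep = cross_val_score(clf, X, y, cv=StratifiedKFold(n_splits=5))
# Speaker-independent: each fold tests on a speaker unseen during training.
indep = cross_val_score(clf, X, y, groups=speakers, cv=LeaveOneGroupOut())
print(f"speaker-dependent accuracy:   {dep.mean():.2f}")
print(f"speaker-independent accuracy: {indep.mean():.2f}")
```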