Various methods and technologies have been used to measure emotions: surveys, natural language processing, facial and speech analysis, body-movement tracking and brainwave recording. Unfortunately, they are not always easy to use or fully accurate. Many require special conditions and do not scale well. Others, like surveys, demand people's time and focus; time is precious and people are often distracted. There is also a bias known as the Hawthorne effect: people change their behavior because they know they are being observed and questioned. Emotions are fast, automatic, frequent and subconscious processes. We need accurate, simple and scalable methods and technologies that do not require people to change their behavior.
According to various industry reports, the emotion-technology market is growing at a remarkable pace: the emotion detection and recognition market is estimated to reach $36 billion by 2021. As for mobile phones, 1.5 billion smartphones were shipped last year, an increase of 1.2% over 2016. At this rate, smartphone shipments will reach 1.7 billion units in 2021. Recent studies have found that 94% of 18-to-24-year-olds own a mobile phone; they open an app every 15 minutes and spend up to 4 hours per day on their phones. Users touch their phones an average of 2,600 times per day.
So, what did we do with all of this, at Emaww?
We collected and analysed hundreds of thousands of users' smartphone touches over a period of two years. Our algorithms recognize eight emotions and their intensities (anger, awe, desire, fear, grief, hate, excitement and love), as well as a neutral, no-emotion state. Their accuracy increases as we collect more data; right now it varies between 91% and 100%, depending on the emotion. We use the power of linguistics and machine learning to label emotions and their intensities, refine our dynamic, self-correcting dictionary and translate it into other languages. Our algorithms also measure other important metrics: Stress, Perception, Arousal, Usability and Dominance. These metrics are valuable across different contexts and use cases. Our invention is patent-pending, and we are in the process of submitting our study and validation tests to a scientific journal. Our API is currently available for beta testing, and it is free to start using. We offer flexible daily, monthly and yearly plans to fit the budgets and needs of small developers and large insight agencies.
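To make the idea concrete, here is a minimal sketch of what a client integration might look like. The payload fields, response shape and session identifier below are illustrative assumptions, not Emaww's documented API; only the list of emotion labels comes from this article.

```python
import json

# Emotion labels from the article (plus the no-emotion state).
EMOTIONS = ["anger", "awe", "desire", "fear", "grief", "hate",
            "excitement", "love", "none"]

def build_touch_payload(session_id, touches):
    """Package raw touch events into a hypothetical API request body.

    Each touch is a dict with x/y coordinates (pixels), a pressure
    value (0-1) and a timestamp in milliseconds. All field names here
    are assumptions for illustration.
    """
    return {
        "session_id": session_id,
        "events": [
            {"x": t["x"], "y": t["y"],
             "pressure": t["pressure"], "ts_ms": t["ts_ms"]}
            for t in touches
        ],
    }

def parse_response(body):
    """Extract the detected emotion and its intensity from a
    hypothetical JSON response."""
    data = json.loads(body)
    assert data["emotion"] in EMOTIONS
    return data["emotion"], data["intensity"]

# Example with a mocked response instead of a live HTTP call:
payload = build_touch_payload("demo-1", [
    {"x": 120, "y": 480, "pressure": 0.6, "ts_ms": 0},
    {"x": 122, "y": 470, "pressure": 0.7, "ts_ms": 16},
])
mock = json.dumps({"emotion": "excitement", "intensity": 0.82})
emotion, intensity = parse_response(mock)
```

The point of the sketch is the shape of the exchange: the client streams lightweight touch events, and the service returns an emotion label and an intensity per interaction window.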
Imagine the future of user experience design... Designers will no longer have to observe users, or ask questions, during and after the test of a new interactive design! By using our API, they will know exactly what users feel, and when: while touching menus, links, images and other content on the interface during the interactive session.
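A hedged sketch of how a UX team might use such per-touch emotion data: aggregate detected emotions by the interface element that was touched. The record format and element names are invented for illustration.

```python
from collections import defaultdict

# Hypothetical session records: each pairs a touched UI element
# with the emotion detected at that moment.
records = [
    {"element": "menu", "emotion": "excitement"},
    {"element": "checkout-button", "emotion": "anger"},
    {"element": "checkout-button", "emotion": "anger"},
    {"element": "menu", "emotion": "awe"},
]

def emotions_by_element(records):
    """Count detected emotions per interface element."""
    counts = defaultdict(lambda: defaultdict(int))
    for r in records:
        counts[r["element"]][r["emotion"]] += 1
    return {el: dict(em) for el, em in counts.items()}

summary = emotions_by_element(records)
# summary["checkout-button"] == {"anger": 2}, flagging a friction point
```

A report like this would show a designer, element by element, where users feel frustration or delight, without a single question being asked.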
Imagine the future of human-computer interactions... Users will no longer have to say or write anything about their emotions. Apps will be able to listen to users' emotions as they touch and interact with content on the screen. A new communication channel will be opened: a portal to the subconscious mind, a touch-board for emotions! After the camera, microphone, keyboard and mouse, our invention is the next input source.
Imagine the future of emotion research and analytics... Researchers will no longer have to test everything in a lab setting. Our API will allow them to connect with subjects in real life, conduct studies via interactive sessions, and correlate emotions with various contextual metrics. They will derive insight and deep understanding at scale.
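As a sketch of the "correlate emotions with contextual metrics" step: a researcher might compute the correlation between detected emotion intensity and a contextual variable, here session duration. The data below is made up for illustration; only the plain Pearson formula is standard.

```python
import math

# Toy data: detected intensity of one emotion per session, paired
# with each session's duration in minutes (illustrative values).
intensities = [0.2, 0.4, 0.5, 0.7, 0.9]
durations = [3.0, 4.5, 5.0, 7.5, 9.0]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(intensities, durations)
# r is close to 1 here: in this toy data, stronger emotion
# intensity goes hand in hand with longer sessions.
```

Swap in any contextual metric (time of day, screen visited, task type) and the same computation turns raw emotion readings into a study finding.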
WE NEED YOU!
At Emaww, our ultimate mission is to empower users with emotional awareness and apps with insight. We are open to working with anyone who wants to help us build the next generation of human-computer interaction.
Emaww APIs enable touchscreen applications to sense and visualize emotions from users' touches while they interact with content.