Technology continues to shape the marketing research industry; we are more agile and efficient than ever, able to carve out new opportunities and capture behavioral data like never before. Beyond traditional methods, mobile and social media now tap into consumer insights. But what about taking it to the next level: collecting consumer emotions? We recently talked with Matt Celuszak, CEO of CrowdEmotion, about the role of emotion in consumer behavior, leveraging mobile and unmasking cultural differences.
Question: We constantly hear the need for change across the marketing research industry. How will CrowdEmotion's technology platform enable change in 2016?
Answer: Quite simply, it unlocks unwritten, unspoken communication. People often don't say what they feel; using sensors, we can now combine subconscious inputs with reported ones. It's like being face to face at scale, without the cost.
Question: Emotional intelligence taps into our ability to use our understanding of emotions. Through your platform (i.e., smart sensors and devices), we can better understand the role of emotion in consumer behavior. How will this data affect advertising and product development?
Answer: Behaviours are driven largely by emotions, which in turn drive decisions. Emotion data provides context for why people do what they do. Broadly, emotion metrics help define people at a personal level - similar to DNA, but less intrusive and more closely tied to decision-making and behaviours.
For advertising, it fundamentally redefines the "right moment" and enables advertisers to truly personalise advertisements out of home, on mobile and, with even finer targeting, on TV. For products, it should force more modularity so that users can personalise them.
Finally, for research overall, it shifts research from being a function separate from delivery, bringing data and delivery much closer together, if not fully integrating them. Imagine videos that react and respond to your reactions: true personalisation at scale.
Question: As audience research extends from local projects to global initiatives, what cultural differences are you unmasking?
Answer: We are seeing that content engagement differs by audience group and by country. For example, one BBC drama saw people cluster emotionally across age and gender groups rather than within them. Another example: dramas in the UK, US and Australia all need to elicit a furrowed brow to achieve positive engagement, but in the US and Australia they also need to make audiences smile and feel surprised. This calls into question how we target our content promotion: by demographics or by mood states?
As we begin to look at data from Latin America and Asia, we are learning the nuances of emotional measurement at a cultural and individual level.
Question: How do emotion insights differ from traditional research metrics? How do you leverage mobile differently?
Answer: The biggest difference is that emotion metrics live on a time series (i.e., they are continuous), whereas other reported metrics are single, static measures. In research, we often report what people said or how they behaved. Feelings, by contrast, change all the time and consist of many different combinations. Our focus as an emotion metric provider is to understand the continuous signal and summarise it so that it can explain and work alongside other metrics in a "static" way.
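To make the continuous-to-static idea concrete, here is a minimal sketch (not CrowdEmotion's actual pipeline; the function name, threshold and metrics are illustrative assumptions) of how a per-frame emotion intensity series might be collapsed into static summary metrics that can sit alongside conventional survey measures.

```python
# Illustrative only: reduce a continuous emotion time series (e.g. smile
# intensity sampled once per video frame, scaled 0.0-1.0) to static metrics.

def summarise_emotion_series(samples, threshold=0.5):
    """Collapse a list of intensity samples into static summary metrics.

    threshold: intensity level counted as an 'engaged' moment (an assumed
    cut-off chosen for illustration, not a published standard).
    """
    if not samples:
        return {"mean": 0.0, "peak": 0.0, "engaged_share": 0.0}
    mean = sum(samples) / len(samples)            # average intensity
    peak = max(samples)                           # strongest moment
    engaged = sum(1 for s in samples if s >= threshold)
    return {
        "mean": mean,
        "peak": peak,
        "engaged_share": engaged / len(samples),  # share of time engaged
    }

# Example: one viewer's smile intensity over ten sampled frames of an advert.
series = [0.1, 0.2, 0.6, 0.8, 0.9, 0.7, 0.4, 0.3, 0.2, 0.1]
print(summarise_emotion_series(series))
```

Summaries like these can then be reported next to static survey metrics, while the raw time series is kept for second-by-second analysis of the content.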
Mobile inherently becomes a critical tool for emotion capture because it comes closest to providing emotion sensory information alongside the person's decisions and actions on the device. This creates a strong feedback loop.
Question: What do you see as the future of emotional understanding, and what new technologies are expected to make a difference in the next two years?
Answer: In the next two years, the consumer and B2B markets will be familiarising themselves with emotion data and how to use it. Sensor technologies will become commonplace across consumer devices, from wearables to home appliances and mobile devices. In use, emotion data is nearly 10,000 times the size of behavioural data, so cloud computing will need to increase its capacity exponentially to process it, network systems will need to transfer data far more efficiently, and mobile devices will need to pre-process the sensory inputs.
The science and R&D communities should be advancing quite a bit in healthcare data, and potentially security data, but I don't expect those markets to be ready for another five years at least.