What would it mean for research if our participants' hardware could share senses?
As published on Greenbook on Tuesday, January 29, 2013. I often reflect on questions like ‘what if we could do…’ to spark my imagination and think about the future of our business. Many times I end up in science-fiction scenes of ‘transporting’, ‘de/rematerializing’, ‘traveling in time’ or becoming ‘invisible’. Often nothing of real use so far, I have to admit. Until I recently read IBM’s 5 in 5 on cognitive computing. It literally and figuratively adds a lot of sense(s) to our profession.
The key line of thinking is ‘what would it mean for our research if our participants’ hardware could share their senses?’ Digital research in particular (online seems too narrow here, as we need to include mobile-based technology) will get another boost and put more pressure on physical methods.
Imagine the consequences for research on packaging, shopper insights, product innovation concepts, even customer experience if the touchscreen and vibration capabilities of consumers’ smartphones could mimic the physical sensation of touching something. No more need to rely purely on audio-visual representation and textual description; we could truly be in touch from a distance.
Computers will be able not only to look at pictures but also to understand them. This opens vast opportunities for ‘visual mining’ to make big data truly illustrative. What can be done with text mining today will be nothing compared to our ability to make sense of pictures in an automated way. Today we have to rely on human tags in, for example, visual ethnography analyses. Soon humans will no longer have to do this: we will be able to show computers examples, and they will recognize and make sense of images themselves. InSites Consulting specializes in consumer insight generation through Research Communities, and such a process often generates thousands of photos in a study… I can hardly wait for cognitive computing to recognize patterns, objects and brands so that we can browse through and quantify this visual capital. And the added value is not in qualitative analysis only. If a computer or mobile device can see and understand what a consumer does as well as the context, imagine how we could detect and observe factual patterns among consumers who upload a photo in a survey, and link those to perceptual and attitudinal data.
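To make the ‘show computers examples’ idea a bit more tangible, here is a toy sketch of example-based auto-tagging. Everything in it is hypothetical (the tag names, the feature vectors, the threshold); it is not any real IBM API, just an illustration of matching a new photo’s feature vector against tag ‘prototypes’ learned from a handful of human-labelled examples:

```python
from math import sqrt

def cosine(a, b):
    # Cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical tag "prototypes": average feature vectors derived
# from a few human-labelled example photos per tag.
prototypes = {
    "kitchen": [0.9, 0.1, 0.2],
    "brand_logo": [0.1, 0.8, 0.3],
    "outdoors": [0.2, 0.2, 0.9],
}

def auto_tag(photo_vector, threshold=0.8):
    # Assign every tag whose prototype is close enough to the photo.
    return [tag for tag, proto in prototypes.items()
            if cosine(photo_vector, proto) >= threshold]

# A new, untagged community photo (feature values are made up).
print(auto_tag([0.85, 0.15, 0.25]))  # -> ['kitchen']
```

Run over thousands of community photos, a loop like this is what would let us quantify the visual capital of a study instead of tagging it by hand.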
There is a lot of discussion in the market research industry today about how to measure consumer emotions. There are ways of doing it through facial pattern recognition, colors, or by means of implicit measures and cognitive load. While a discussion of the pros and cons is beyond the scope here, it seems we will at least be able to complement these insights with computers’ hearing capabilities. Cognitive computing will allow us to detect moods and emotions in voice patterns, as computers will hear what matters. It may also completely change the way our survey interfaces work. If we can bring questions and answers to speech level, beyond text, we will be able to capture the subtleties in ‘how’ consumers respond to a structured question. The same holds for qualitative research, in which the spoken word is key. Hearing recognition will open opportunities for analyzing consumer conversations and storytelling in ways beyond our imagination today.
Computers will be able to invent new recipes, for example based on our recorded taste palettes, to come up with snacks we like and that are healthy for us. At least objectively, according to the machine. It will be interesting to see how perception and branding will kick in and create a gap between what people should like unbranded and what they do like when branded. Such gaps will make marketing and research especially interesting. But there are also opportunities for crowd product development, making fragrance and flavor development even more scalable. Imagine Ben & Jerry’s is looking for new flavors for its portfolio of ice creams. A direct application could be to bring together ‘DIY ice cream makers’ in an online research community and have them make their favorite home-made ice cream. Based on the composition of these creations and an analysis of other consumers’ taste patterns, specific creations could be matched to other consumers and tested. It would also provide an answer as to why people (dis)like certain flavors and tastes.
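The matching step above can be sketched very simply. In this illustration (all names, taste dimensions and numbers are invented for the example), each recipe and each consumer is a vector over taste dimensions, and a recipe is routed to the consumer whose recorded taste pattern it fits best:

```python
def similarity(a, b):
    # Simple inverse-distance score between two taste profiles:
    # identical profiles score 1.0, distant ones approach 0.
    return 1.0 / (1.0 + sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical taste dimensions: (sweet, salty, sour, creamy), scaled 0-1.
recipes = {
    "salted caramel": (0.7, 0.6, 0.1, 0.8),
    "lemon sorbet":   (0.5, 0.0, 0.9, 0.1),
}

consumers = {
    "anna": (0.8, 0.5, 0.2, 0.9),   # sweet-and-creamy profile
    "ben":  (0.4, 0.1, 0.8, 0.2),   # sour-leaning profile
}

def best_match(recipe_name):
    # Return the consumer whose taste pattern best fits the recipe.
    profile = recipes[recipe_name]
    return max(consumers, key=lambda c: similarity(consumers[c], profile))

print(best_match("salted caramel"))  # -> anna
print(best_match("lemon sorbet"))    # -> ben
```

In practice the vectors would come from a machine-learned analysis of taste rather than hand-coded scores, but the routing logic, matching a creation to the people most likely to enjoy it, is the same.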
For market research, smell recognition by computers may be the hardest to translate into concrete research applications (smell transmission to participants via computers would have more obvious applications, but that still seems Star Trek territory…). I can think of hands-on research applications in the fragrances field for developing odors. Perfume brands, for example, may apply this technology to compose, determine and segment the odors people like in different contexts, and come up with new varieties. Just like in the taste example, it would literally be co-created smell.
At first sight these evolutions may seem to have the power to automate a lot of what we do as researchers, but I do not think that is entirely the case. Just as with any innovation, they will lead to creative destruction: some old value will be replaced by new. As with our actual human senses, the power for research will lie in the mix, such that these technologies complement what we do and enrich our human understanding. If cognitive computing takes hold just as IBM predicts, we may expect the next transformational wave of innovation to hit the research and marketing industry. The recent advent of social media and the current hype around big data will fade in comparison.
To end, these are just a couple of the first things that came to my mind when reading through IBM’s material. I am sure I have overlooked aspects, and I would be curious to see what you think. So feel free to add other thoughts… it can only make more ‘sense’ for all of us!