An A.I. takeover is a hypothetical scenario we’ve seen depicted in a number of Hollywood sci-fi films and TV series, from Terminator and Disney’s Smart House to Netflix’s Black Mirror and Altered Carbon.

But the real-world concerns surrounding A.I. and a scenario where it could take over its environment, essentially turning against its creator, primarily center on the workforce and automation. For companies and brands that spend considerable amounts of time and money each year on advertising and marketing to promote their products, A.I. becomes increasingly attractive.

How cool (and effective) would it be to actually see a consumer’s reaction the very first time they come across your business’s promotion?

Grit Daily spoke with Cristina de la Peña about her company, Synapbox, which has managed to alleviate some of these concerns by collecting, measuring, and analyzing individuals’ reactions in real time as they view the promotional material a company wants to launch. Pretty sweet, right?

Last year, de la Peña was named one of the winners of the Innovators Under 35 Latin America 2018 awards granted by MIT Technology Review.

Grit Daily: You’ve had your own adventures before founding Synapbox. Share those.

Cristina De La Peña: It’s been an amazing journey so far. I started as an architect doing research on the application of neuroscience to designing new physical spaces. I had no idea this research and collaboration with neuroscientists would lead me all the way to building what Synapbox is today.

After working on that for a couple of semesters, I got into what’s called neuromarketing, and soon enough I discovered all the inefficiencies in the practice and how marketers were working around a real understanding of human behavior. Long story short, I thought there was a better way, and I committed to finding it.

GD: For those brands just immersing themselves in the world of “smart-marketing” and A.I., can you share how Synapbox becomes relevant?

CDLP: We help brands and creators understand consumer behavior toward content, specifically video and images. In today’s world, being able to cut through the noise and get consumers’ attention is more difficult than ever. We help them understand which elements inside their ads are impacting their consumers and how to increase the efficiency of their creative to increase ROI.

All of this is possible through a mix of data points, like real-time facial recognition of emotions, eye tracking, and visual computing techniques, that pair every reaction with each element inside the video or image. To give you a better idea, we are able to detect whether puppies are engaging millennials in short videos and how that engagement may correlate with better brand recall for a product like Coca-Cola.
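The pairing de la Peña describes can be pictured as joining two streams of data: per-frame emotion readings from viewers, and a timeline of which elements are on screen. The sketch below is a hypothetical illustration of that idea, not Synapbox’s actual system; all names (`Frame`, `Element`, `pair_reactions`) and the data are invented for the example.

```python
# Hypothetical sketch: averaging per-frame emotion scores over the
# time window each on-screen element (e.g. a puppy, a logo) is visible.
# Not Synapbox's actual pipeline -- all names and data are illustrative.
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float   # seconds into the video
    emotions: dict     # e.g. {"happiness": 0.8, "surprise": 0.1}

@dataclass
class Element:
    label: str         # e.g. "puppy", "logo"
    start: float       # seconds when the element appears
    end: float         # seconds when it disappears

def pair_reactions(frames, elements):
    """Average each emotion score over the frames where an element is on screen."""
    results = {}
    for el in elements:
        visible = [f for f in frames if el.start <= f.timestamp < el.end]
        if not visible:
            continue
        totals = {}
        for f in visible:
            for emotion, score in f.emotions.items():
                totals[emotion] = totals.get(emotion, 0.0) + score
        results[el.label] = {e: s / len(visible) for e, s in totals.items()}
    return results

frames = [
    Frame(0.0, {"happiness": 0.2}),
    Frame(1.0, {"happiness": 0.9}),
    Frame(2.0, {"happiness": 0.7}),
]
elements = [Element("puppy", 1.0, 3.0), Element("logo", 0.0, 1.0)]
print(pair_reactions(frames, elements))
```

In a real system, the `frames` stream would come from a facial-expression model running on webcam video, but the join logic — bucketing reactions by what was on screen at each timestamp — is the core idea the interview describes.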

Image courtesy of Synapbox.

GD: How can A.I. really “tell” what emotions we express?

CDLP: Emotions are expressed in many ways; however, the most common way humans communicate feelings is through facial expressions. Our face is like an open book through which we share feelings in real time, and with technology we can get very precise readings of those expressions at a micro level.

GD: Are expressions uniform across cultures? Geographies?

CDLP: This is very interesting. We actually do share universal emotions. All human beings share six basic emotions (happiness, sadness, surprise, anger, disgust, and fear). What differs is the intensity with which we express these emotions and what they mean across different contextual scenarios (like a smile while watching a TV show versus an educational piece of content).

GD: Is it true that every brand wants the emotion of “happiness” in response to their products?

CDLP: Emotions have a unique connection with the subconscious of every person. The products and ads that are able to connect with consumers at this level have a better chance of being remembered and of producing an action. That action could be a purchase or even just a like, but all of this is the ROI that brands are looking for.

If we are able to better identify what generates these emotional engagements, we can become better at replicating them and offer more targeted ads and experiences to consumers.

Cristina De La Peña expressing “neutral” and “happiness” according to the algorithm. Can you spot the “anomaly” in the person participating? Photo credit: Synapbox.

GD: We have to ask: have you deployed this tech on top of deepfakes? What results?

CDLP: It’s an interesting question. Synthetic data can trick an algorithm; however, all our data comes from real consumers who opt to participate in our survey-like studies.

To do so, and to control the quality of data collection, we include a calibration process where we can detect any anomalies in the person participating.

Featured image photo credit: Isaac Alcala