
Experts from Mercedes and beyond discuss their fears and dreams about the intersection of AI and neuroscience

If 2018 was the year that artificial intelligence exploded, and 2019 the year that AI became accessible to everyone, 2020 is shaping up to be the year that we become more mindful about how we use it.

And at the forefront of this work is the discussion around the intersection of AI, neuroscience, and human behavior.

Towards the end of last year, I was invited to take part in an event alongside World Summit AI, hosted by Mercedes EQ. Afterwards, I sat down with a select group of experts (curated by DataSeries) to discuss the implications of these topics and what will come our way in 2020 and beyond.

Of course, while much of our roundtable discussion centered on these topics as a whole, some of it revolved around the car industry and the possibilities for embracing neuroscience in the mobility field.

But before we got to that particular sector, our talk centered on data privacy and security.

AI and machine learning require large datasets to learn from. As we collect ever more behavioral data, including emotions measured from video, audio, and even brain-wave activity, there are inherent dangers and concerns regarding the storage and use of this highly personal information.

“In the U.S. and Europe, there are macro effects of data privacy concerns, which include democratic elections,” NASA Datanaut and CEO at CLC Advisors Cindy Chin told me. “Psychological operations, or psyops as they’re called in a military context, where computer algorithms can be used to predict behavioral patterns and skew results through a targeted campaign, are dangerous. The world has already seen evidence of such occurrences in the U.K. and U.S. elections.”

“It has already been proven that by tracking a player’s gaze within a Virtual Reality (VR) environment, one can not only accurately measure which elements or ads the user is really looking at, but the advertising can, as a result, perform significantly better than advertising placed in a VR environment without this information,” Tom Henriksson, General Partner at OpenOcean, told me.

And eye-tracking is just the start. It’s almost table stakes for VR and AR applications in 2020.

“Imagine a future where the machine knows what the user is actually looking at, and it can measure from the user’s pulse, or even from their brainwaves, how the person is feeling and reacting to the advertising,” Henriksson said. “It certainly brings us one step closer to advertising that is so dynamic, personalized, and well-targeted that it will feel more like a valuable service than advertising. At the same time, the power of this technology, which at least for the pulse and gaze tracking part already exists, also calls for incredibly high standards for user control and data protection.”

One thing is for sure: the more we talk about behavioral data, the more often the conversation turns to regulation.

“Companies and individuals pay large amounts of money to obtain this data, and there are few checks and balances or levels of enforcement against negative use and infringement of privacy,” Chin said. “That said, we have seen positive results in the use of collected datasets in the healthcare industry. Already, accurate diagnostic predictions that lead to proper clinical treatment are being made in areas such as breast cancer and Alzheimer’s disease research. I would like to see a broader international ethics framework or charter where tech companies and governments demonstrate their commitment to the data privacy of global citizens. I would also like to see broader datasets that take into account inclusion and diversity. The inaccuracies in what is being created today in AI and machine learning are alarming.”

That high level of data security is at the forefront of the mind of Steven Peters, Manager of Artificial Intelligence Research at Mercedes-Benz AG.

“We’re known for our commitment to safety when it comes to designing and producing vehicles, and that can’t change when it comes to behavioral data,” Peters said. “Daimler published its AI Principles in September 2019, which guide the way to a human-centric approach to AI in both products and processes. To start a conversation about the future of AI and the developments that shape tomorrow, such as our electric brand Mercedes-Benz EQ, we created the EQ community. Collaboration and empathy for each other, as well as a focus on customer needs, define our way of thinking.”

“I consider data literacy a primary need in our increasingly data-driven society,” concurs Anne Schwerk, Ph.D., Project Manager for AI Health and Intelligent Analytics for Massive Data at DFKI GmbH. “Everyone has to be able to understand data, its usage, and the analytics to a sufficiently high degree. Otherwise, our society will never be able to truly understand the consequences of AI and the digital world. Another very relevant aspect is the need for explainable systems that allow users to trace decisions and be in control of the machinery behind the fancy GUI and the personalized predictions.”

There are many possible use cases for applying behavioral data in a mobility setting. Imagine, for example, that the vehicle is able to identify that the driver is feeling angry or anxious, and it then recommends calming music and adjusts the air conditioning to improve the mood of the driver.
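To make that scenario concrete, here is a minimal, purely illustrative sketch in Python of how such a rule might look. Everything in it, from the emotion labels to the function and playlist names, is a hypothetical assumption for illustration, not a description of any carmaker's actual system.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class CabinAction:
        playlist: str         # calming playlist to queue
        temperature_c: float  # target cabin temperature

    # Simple rule table mapping an inferred driver state to a cabin adjustment.
    # (Hypothetical labels; a real system would be far more nuanced.)
    CALMING_RESPONSES = {
        "angry":   CabinAction(playlist="calm_acoustic", temperature_c=20.0),
        "anxious": CabinAction(playlist="ambient_slow",  temperature_c=21.0),
    }

    def respond_to_driver_state(state: str, confidence: float) -> Optional[CabinAction]:
        # Act only when the upstream emotion classifier (assumed here to
        # analyze in-cabin video or voice) is confident; otherwise do
        # nothing rather than act on a shaky guess about the driver's mood.
        if confidence < 0.8:
            return None
        return CALMING_RESPONSES.get(state)

    action = respond_to_driver_state("angry", confidence=0.92)
    if action:
        print(f"Queueing {action.playlist}, setting cabin to {action.temperature_c} °C")

The confidence threshold is the design choice worth noting: a system acting on inferred emotion should fail silently rather than intervene on the back of uncertain, highly personal signals.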

This AI and neuroscience-driven future is not far from becoming reality, so how can we use behavioral data in a sensitive, positive way to improve products and services for consumers in the future?

“Anyone consuming media ‘for free’ would naturally like to experience more relevant and valuable advertising,” Henriksson said. “To trust technology companies and service providers to utilize such ultra-personal data, where a machine might understand a person’s thoughts or desires before the person does, requires ironclad data protection and strong user control.”

And in an age where trust in the biggest social networks and tech giants is at an all-time low, we need to be extremely clear about what data is being collected, how it is being used, where it is being stored, and what value we’re getting in return.

“New solutions are needed where, for instance, people can decide on and filter both which impulses the systems are allowed to measure and which advertisers are allowed to access user data,” Henriksson said. “Further, user trust in technology companies needs to be completely revamped before this can go mainstream. Unlike in the case of Facebook today, we need to be able to trust 100 percent that all companies handling our deeply personal data adhere to the highest ethical standards, implement the best possible data security, and vigorously anonymize and protect customer data.”

What should happen next in the use of neuroscience, behavioral data, AI, and machine learning, and where are we heading in the near future, especially when it comes to using this highly personal information in something as high-risk as mobility solutions?

“In the area of mobility, location data is an excellent example of the opportunities ahead,” Henriksson said. “Tracking of users’ locations is currently hotly debated within the location technology industry and far beyond, as there are no formal rules or ethical standards on how this sensitive data should be used. Used correctly and with respect, location is a very powerful signal, which can enhance the digital profile of a user with information on what is going on in the real world.”

There are plenty of examples of how this type of data can be used for good, but it always comes back to ensuring companies leverage this information in a respectful way.

“Prediction is exciting and, if used in the right way, with the right level of respect for user privacy and security, can add powerful utility,” Nicolas Roope, co-founder and creative director of Plumen, told me. “If someone is driving at speed, it’s a lot more useful to recognize that they’re about to fall asleep than to recognize the event as it happens. And of course, this happy driver, who is alerted and lives as a result, doesn’t want to subsequently be bombarded with energy drink ads because the car company has shared their data. It’s also nice not to share this event with the insurance company.”

And how can that information be used outside of the world of advertising or vehicle control and safety, and what needs to happen next to make it possible?

“Humans still spend at least 70 percent of the day in the real world, so having more information about this part of our lives definitely enables better services, like ride-hailing, child tracking, and better-planned cities,” Henriksson said. “Recently, the CEO of location data company Foursquare suggested in a New York Times opinion piece that technology companies should adhere to standards similar to the Hippocratic oath of doctors and that Congress should regulate the location technology industry. Perhaps such actions need to be taken to ensure we can enjoy the great mobility solutions of tomorrow.”