
This organization wants regulations to make technology more human

One San Francisco-based organization is looking to promote a new measure of success for big technology companies.

Instead of looking at how much time people spend on a platform and trying to get them to spend more, Tristan Harris wants companies to make decisions based on the real-world benefit they create for the user, especially interpersonal connection.

The Center for Humane Technology is a nonprofit started by Harris, a former Google ethicist, along with Aza Raskin and Randima Fernando.

In a TED talk, Harris pointed to the company CouchSurfing as an example. He said the company measures its success in the number of positive hours of interaction people say they had, then subtracts the time those people spent on the website finding each other.

“Can you imagine how inspiring it would be to come to work every day and measure your success in the actual net new contribution of hours in people’s lives that are positive, that would have never existed if you didn’t do what you were about to do at work today?” Harris said in his TED talk.

Harris’ goal is for companies to help people spend their time well.

In his talk, Harris said his interest lies in smarter technology that benefits people by giving them choices that allow better connections with others, not just more connections.

He also doesn’t believe in fully turning away from technology; instead, he wants to change it.

The building blocks

Harris’ ideas were first heard in 2013, but his efforts have become more concentrated in the two years since the organization was founded.

The Center for Humane Technology pursues its goals by combating what it calls “human downgrading” in several ways. It appeals to the public, but it also formally appeals to legislators.

In June, Harris spoke to a subcommittee of the Senate Committee on Commerce, Science, and Transportation. He implored the members to begin putting regulations in place to temper the tactics that companies use, without limitation, to keep people engaged with their platforms.

As a design ethicist at Google in 2013, Harris first put his ideas on paper — or rather slides. He created a slide presentation for Google employees that outlined what he saw as the ethical problems of technology today.

He said in his statement to the senators that, while still at Google, he tried to see whether the problem could be fixed from the inside but concluded that it could not.

His TED talks and a 2016 interview on 60 Minutes sparked interest in what he had to say from the public, not just his colleagues.

These efforts sparked the Time Well Spent movement.

In 2018, he co-founded the Center for Humane Technology, a collection of “leaders in technology, humanity, mindfulness, philosophy, and education,” according to its website.

That same year, both Apple and Google launched features that show users how much time they spend on their phones and what they spend it doing.

This year, in addition to his appeal to legislators, he has also written an opinion article in The New York Times titled “Our Brains Are No Match for Our Technology.”

The current system

The organization highlights six key areas it considers important in human-device interaction: digital addiction, mental health, breakdown of truth, polarization (especially political), political manipulation and superficiality.

The organization says these are all side effects of the current technology landscape and the way technology interacts with us.

One of the ways Harris said companies keep our attention is through notifications.

He said in a TED talk that companies “plan” our day with the interruptions they introduce into our lives in the form of notifications and emails. Harris used the example of Facebook notifying you that you were tagged in a photo.

“I’m not just going to click ‘see photo.’ What I’m actually going to do is spend the next twenty minutes,” he said.

He said in this way Facebook is planning an interruption into your day.

“The worst part is that I know that this is what is going to happen, and even knowing that that’s what is going to happen doesn’t stop me from doing it again the next time,” he said.