Over the past six years, Tristan Harris has forced us to think differently about the devices and digital services we use every day.
First as a product manager at Google and later as an external critic of the technology industry, he has shined a light on what has been called the attention economy: the way our phones, apps, and web services constantly divert and distract us.
It took years for his criticism to catch on. But catch on it has.
News that Russian-affiliated provocateurs had hijacked Facebook and other social media to spread propaganda during the 2016 election helped raise his profile. So did reports that social media use has led to a significant uptick in depression among children. Since then, Harris has found a ready audience that extends from everyday citizens to heads of state who want to better understand how technology companies are manipulating, or could be used to manipulate, their customers.
Harris, who cofounded the Center for Humane Technology to develop and promote ideas for reforming the technology industry, has already made his mark on the industry. Features such as Apple's Screen Time, which iPhone owners can use to set limits on how much they use their devices and apps, are a direct result of the criticism he has raised.
And more could be on the way. For the first time, policymakers in the United States and around the world, many of whom have consulted Harris and his colleagues, are seriously considering rules to reshape the relationship between technology companies and their customers and the wider society. On Monday, Britain's Information Commissioner's Office told the BBC that it was weighing limits on the amount of data social networks can collect on children, through a series of measures including restricting the use of like buttons.
Business Insider talked to Harris recently about what inspired him to begin his movement and what he feels he has achieved so far. This interview has been edited for length and clarity.
Harris thought the industry was headed in the wrong direction
Troy Wolverton: You have been trying to focus attention on what you call the attention economy, and to get technology companies to address its abuses, since putting together your presentation in 2013.
Tristan Harris: In the beginning, people did not necessarily admit that there was a problem. I mean, the slide show at Google went viral and [it resonated with] people. But there was no action. There was just lots of denialism, lots of "oh, people get hooked on lots of things, cigarettes, alcohol. Isn't this just capitalism?"
And it's like, guys, we've created a very specific form of psychological manipulation and influence that we, the technology industry, are responsible for fixing. And getting people to recognize that took a really long time. We had a hard time getting people to agree that it was a problem that had to be fixed.
And I think what has changed now is that people know, because they have been forced to know, that there is a problem. So now people are talking about what we actually do about it.
What I've heard recently is that, for the first time, executives at Facebook have friends coming up to them and asking, "Which side of history do you want to be on?"
And I think enough public opinion has swung, including among the friends of the people at the top of these companies, that people now realize there is something structural we need to change.
Wolverton: What made you bring the slide show together in 2013?
Harris: I felt like there was basically something wrong with the direction things were headed, which is a really scary thought when you see a whole industry not going in the right direction. Until then, I had thought technology was great.
This is not an anti-technology movement. But what I was really waking up to was … that what my most talented friends and engineers were increasingly doing was getting better and better at playing tricks on the human mind to keep people hooked on their screens.
I just felt that everyone I knew wasn't really doing the kind of great creative thinking people had done in the '90s and early 2000s; instead it had become this race to manipulate the human mind.
Wolverton: But was there a moment that triggered that awareness, some kind of epiphany?
Harris: I did have a little epiphany. I went to the Santa Cruz Mountains for a weekend with my friend Aza Raskin, who is now a cofounder of the Center for Humane Technology. I came back from that weekend, having filled up on nature, and something just hit me deep. I really don't know what came over me.
I just felt like I had to say something. It felt wrong. It felt like no one else would say anything.
I'm not the kind of person who starts revolutions or speaks out. This is something I have had to learn to do.
Heads of state have knocked on his door
Wolverton: How has your understanding of the extent of the problem changed since you compiled your 2013 presentation?
Harris: I had been the CEO of a small Web 2.0 tech startup called Apture. I had an academic background in cognitive science, computer science, linguistics, user interfaces, human-computer interaction, things like that. I was trained to think about building technology products and the human mind.
Since I left Google, and especially after Cambridge Analytica and pairing up with [Silicon Valley venture capitalist] Roger [McNamee] on these problems, the breadth of my understanding and the scope of what I'm looking at have expanded by several orders of magnitude.
The problem has gone from the way a product designer would think about attention and notifications and home screens and finances in the app stores, which is how I started, to now playing 12-dimensional geopolitical chess and seeing how Iran, North Korea, Russia, and China use these platforms for global information warfare. [And it goes from there] all the way down to how these issues affect the daily social pressures and mental lives of teenagers.
We have world governments knocking on our door because they want to understand these issues. Briefing heads of state is something I never thought I'd be doing. It has been wild, and it speaks to the scale and urgency of the problem.
I knew conceptually, back in 2013, that this issue would affect everything. But I didn't have the grounded understanding that I've gained in the past six months, where you actually meet people in countries whose elections are at stake over these issues. Or you meet and talk to parent and teacher groups struggling with these problems daily. So it literally affects everyone and everything. And these platforms hold the pen of human history, which is what I think people underestimate.
Wolverton: If you imagine this process as a curve going from identifying a problem to dealing with it, where do you think we are on that curve?
Harris: Still early innings, I think. I think we are at the beginning of a reckoning.
[Companies such as Facebook and YouTube] will be looked back on as the fossil-fuel companies of the attention economy: they drill deeper and deeper, in a race to the bottom of the brain stem, to extract human attention.
[They're] now, when there is pressure on them, trying to rectify the biggest of the harms that occur, but only because civil-society research groups, usually unpaid or funded by nonprofits, stay up until 3 in the morning scraping Facebook and YouTube and cataloguing the recommendations and disinformation campaigns, and then they tell … the New York Times … and then Facebook or YouTube, if there is enough pressure, after a congressional letter from [Rep.] Adam Schiff or Senator [Mark] Warner or [Sen. Richard] Blumenthal, begins to do something about it.
I think looking back we will say: "Oh my god, we are so glad we woke up from that nightmare and started designing and financing and structuring our technology in such a way that it is aligned with its users and constituencies. It is not an infinite treadmill; it is designed with humane business models that are considerate of human sensitivities and vulnerabilities."
Tech companies have taken only baby steps so far
Wolverton: You've said companies like Facebook, Apple, and Google have taken what you call baby steps toward dealing with these issues, by doing things like allowing people to set limits on the time they spend on their devices. How important are those steps?
Harris: They are celebrated baby steps. I just want to be clear, I'm glad they're taking them, because it sets up a race to the top.
I mean, I had the head of a large tech company you would know say, next to me on stage at a private event, that the whole industry is now in a race to the top for time well spent, which is good. I mean, it's ridiculous. We could turn this from a race to the bottom, from who can best steal attention by pulling on our paleolithic puppet strings, into a race to the top. [Companies are now vying to] prove that they care more about … the individual's well-being, and hopefully in the future the well-being of a whole society and civilization.
But that is why the baby steps are important. They actually get all of these companies to start competing in that direction, and we have to keep that competition going.
Wolverton: With all this focus on how devices and apps demand our attention, I wondered how much time you spend on your phone these days. Here you are raising one of the most important issues facing the world, yet your work means constantly using technology.
Harris: I could pull up my Screen Time app for you if you want; I can now know the answer to that question thanks to the features available on a billion phones.
Let me see. Screen Time, last seven days: the average is 3 hours and 2 minutes per day.