Over the past six years, Tristan Harris has forced us to think differently about the devices and digital services we use every day.
First as a product leader at Google and later as an outside technology critic, he has sounded the alarm about what has been called the attention economy: the way our phones, apps, and web services are constantly diverting and distracting us.
It took years for his criticism to catch on. But catch on it has.
News that Russian-linked provocateurs had hijacked Facebook and other social media to spread propaganda during the 2016 election helped raise his profile. So did reports linking social media to a significant rise in teen depression. Since then, Harris has found a ready audience ranging from everyday citizens to heads of state who want to better understand how tech companies are manipulating, or being used to manipulate, their customers.
Harris, who together with the Center for Humane Technology has developed and promoted ideas for reforming the technology industry, has already made his mark on the industry. Features like Apple's Screen Time, which iPhone owners can use to set limits on how much they use their devices and apps, are a direct result of the criticism he has raised.
And more may be on the way. For the first time, policymakers in the United States and around the world, many of whom have consulted Harris and his colleagues, are seriously considering regulation to reset the relationship between technology companies, their customers, and the wider community. On Monday, the UK's Information Commissioner's Office told the BBC it was serious about limiting the amount of data social networks can collect on children, proposing a series of measures that includes restricting the use of like buttons.
Business Insider spoke to Harris recently about what inspired him to start his movement and what he feels he has accomplished so far. This interview has been edited for length and clarity.
Harris felt that the industry was heading in the wrong direction.
Troy Wolverton: You have been trying to draw attention to, and get the tech companies to grapple with, what you call the attention economy since you put together your presentation in 2013.
Tristan Harris: In the beginning, people did not necessarily want to admit that it was a problem. I think the slide deck went viral at Google and [it resonated with] people. But there was no action. There was just a lot of denial, a lot of "oh, people get addicted to lots of things: cigarettes, alcohol. Isn't this just capitalism?"
And it's like, guys, we have created a very specific form of psychological manipulation and influence that we, the tech industry, are responsible for fixing. Getting people to admit that took a very long time. We had a hard time getting people to agree that there was a problem that needed to be solved.
And now I think what's changed is that people know – because they've been forced to know – that it's a problem. So now people are talking about what we really do about it.
What I have heard recently is that, for the first time, friends of Facebook leaders are now turning to them and saying, "What side of history do you want to be on?"
And now I think, because enough of the public outcry has reached the friends of the people at the top of these companies, people realize that there is something structural we need to address.
Wolverton: What made you put together the slide deck in 2013?
Harris: I felt like there was just something wrong about the direction this was all heading in, which is a very scary thought when you see a whole industry heading in the wrong direction. Because until then, I had thought technology was great.
This is not an anti-technology movement. But what really started to wake me up was … that what my most talented friends and engineers were increasingly doing was getting better and better at playing tricks on the human mind to keep people hooked on their screens.
I just felt that everyone I knew was no longer doing the kind of great creative thinking people had done in the '90s and early 2000s; instead, this race to manipulate the human mind had taken over.
Wolverton: But was there a moment that triggered that awareness, some epiphany?
Harris: I had a little epiphany. I went to the Santa Cruz Mountains for a weekend with my friend Aza Raskin, who is now a co-founder of CHT. I came back from that weekend, after reconnecting with nature, and something profound just struck me. I honestly don't know what came over me.
I just felt like I had to say something. It felt wrong. It felt like nobody else was saying anything.
I'm not the kind of person who starts revolutions or gives speeches. This is something I've had to learn.
Heads of state have knocked on the door
Wolverton: How has your understanding of the scope of the problem changed since you put together your 2013 presentation?
Harris: I had been CEO of a small Web 2.0 tech startup called Apture. I had an academic background in cognitive science, computer science, linguistics, user-interface design, human-computer interaction, things like that. I was trained to think about building technology products and about the human mind.
Since I left Google, and especially after Cambridge Analytica, when I connected with [Silicon Valley venture capitalist] Roger [McNamee] and these issues took off, my understanding of the breadth and scope of what is at stake has expanded by several orders of magnitude.
The scale of the problem has grown from the way a product designer would think about attention and notifications and home screens and the economics of the app stores (that's where I started) to playing 12-dimensional geopolitical chess and seeing how Iran, North Korea, Russia, and China use these platforms for global information warfare. [And it extends from there] right down to the way these issues affect the daily social pressures and the mental lives of teenagers.
We have world governments knocking on our door because they want to understand these issues. Briefing heads of state is something I never thought I would do. This has been wild, and it speaks to the extent and gravity of the problem.
Conceptually, I knew back in 2013 that this problem would affect everything. But I hadn't gained the understanding that I have now, over the past year and a half, where you actually meet the people in countries where elections are at stake because of these issues. Or you meet and talk with parent and teacher groups that wrestle with these issues daily. So it literally affects everyone and everything. And it is the choke point that holds the pen of human history, which is what I think people underestimate.
Wolverton: If you imagine this process as a curve that goes from identifying a problem to adequately addressing it, where do you think we are with it?
Harris: Still early, I think. I think we are in the opening stages of a reckoning.
[Companies such as Facebook and YouTube] will be looked back on like fossil-fuel companies, because in the attention economy they drill deeper and deeper into the bottom of the brain stem to extract attention from human beings.
[They're] now, under pressure, trying to correct for the biggest harms that occur. But only because civil-society research groups, usually unpaid or nonprofit-funded, stay up until three in the morning scraping Facebook and YouTube, documenting the recommendation systems and the disinformation campaigns, and then tell … The New York Times … And then, if there is enough pressure, after a congressional letter from [Rep.] Adam Schiff or Sen. [Mark] Warner or Sen. [Richard] Blumenthal, Facebook or YouTube start doing something about it.
I think, looking back, we will say, "Oh, my god, we are so glad we woke up from that nightmare and started designing and funding and structuring our technology in a way that is cooperative with its users and the constituencies it most affects, not on an infinite-growth treadmill, designed with humane business models that are sensitive to human sensitivities and vulnerabilities."
Tech companies have taken just baby steps so far
Wolverton: You've said companies like Facebook, Apple, and Google have taken what you call baby steps to tackle these issues by doing things like letting people set limits on the time they spend on their devices.
Harris: They are celebrated baby steps. I just want to be fair. I'm glad they are taking them, because it sets up a race to the top.
I mean, I had one of the leaders of a bigger technology company say next to me on stage at a private event that the whole industry is now in a race to the top for time well spent. I mean, it's ridiculous. We managed to turn this around from a race to the bottom, in which companies competed over who could best steal attention by pulling on our paleolithic puppet strings, into a race to the top. [Companies are now vying to] prove that they care more about … the individual's well-being and, hopefully in the future, the well-being of whole societies and civil populations.
But that's why the baby steps are progress. They actually got all the companies racing in that direction, and we have to keep that race going.
Wolverton: With all this focus on how devices and apps demand our attention, I wondered how much time you spend on your phone these days.
Harris: Well, this is one of the most important issues facing the world right now, and I and our organization play such a big role in it that, unfortunately, I work on this problem continuously, which means I use technology all the time.
I can check the Screen Time app for you if you want. I now know the answer to that question thanks to the features available on a billion phones.
Let me see. Screen Time, last seven days: the average is 3 hours and 2 minutes per day.