Written by Lucy Fuggle
We were detached and lacked self-awareness. We were building these products for people who we thought were like us. Now that they have been used by billions of people around the world who are not like us, we’re seeing there are some severe consequences.
With his curious outlook, a valuable skillset at the boundaries between disciplines, and a happily unconventional view of what success looks like, Chris Messina has cultivated a particularly interesting and impactful career in tech.
In 2007, Chris Messina invented the hashtag, proposing it as a solution for vertical/associational grouping of messages, trends, and events on Twitter. He’s also known for his involvement in helping to create the BarCamp, Spread Firefox, and co-working movements. In what some considered an unexpected move, Chris brought his mind to Google as an ‘open web advocate’ and UX designer from 2010 to 2013, and to Uber as Developer Experience Lead from 2016 to 2017.
Chris is now using his hindsight and curiosity to look forward to the next fifteen years and prevent tech from becoming an enemy to our human nature and relationships. Here are Chris’s thoughts on how tech can become a better citizen in our human world, as shared in our recent cross-continental call to New Hampshire.
How can tech work with, not against, our human nature?
Chris tells us that he thinks often about the ethics of social tech. “I feel like we are at this rolling inflection point… if that makes sense,” he says. “When I reflect on the last fifteen years of social technology development, we have been making computers easier and easier for humans to use by making them do more for us and communicating and interacting with us more like we do with each other.”
“The more I think about it, the less I see a meaningful distinction between ourselves as humans and the technology we create and use,” says Chris. “The loop is becoming so fast that imagining the smartphone is not actually a part of us is ignoring some fundamental shift in the way we conceive of ourselves and then connect to and relate to others.”
As tech has evolved, Chris has mixed feelings about its impact: “While I really love the idea of bringing technology to more people in a way that enhances their humanity, we’ve also created systems that exploit the weaknesses in human psychology to a negative effect.”
I don’t know how to balance the bad behaviour of the pharmaceutical or tobacco industry with what the tech industry does, but we’ve engineered a generation of seductive technology…
While there was once scepticism and doubt about the use, utility, and need for social technology (such as in the early days of Twitter), that has certainly changed. “We’ve realised that these are just new and more accessible mediums for more people to be able to express themselves and share their experience,” Chris says. “But along the way, the tech industry has resorted to exploitative tactics to convince people to use the systems more frequently.”
Variable rewards are one such exploitative tactic in social tech. When you open your phone or open Twitter, you don’t know how many notifications you’re going to have or how many messages may be waiting for you. “It’s designed to feel like playing a Las Vegas slot machine,” Chris says.
This type of tactic can be used effectively if you have a goal and it’s aligned with your intentions, admits Chris. If a person wants to lose weight, for instance, you can use variable rewards to support their willpower. But when it’s done subtly and you don’t know that it’s happening to you, or you’re unaware of how these tactics are used, it can become problematic. “We can end up using technology in a way that erodes our relationships, our sense of self, or how we take care of ourselves,” says Chris.
Improving the culture that surrounds founders
For tech to benefit human nature rather than work against it, we need to help founders become more emotionally aware and mentally fit, Chris tells us. This will require a total culture shift, but Chris is focused on helping to trigger this change during his speaker sessions this year. Alongside his Turing Fest talk, Chris will be sharing his emotional toolkit with founders in a workshop, Becoming an emotionally fit entrepreneur – co-hosted with Dr Emily Anhalt.
Chris emphasises that there will need to be work higher up the tech food chain too. By creating programmes, boards, and investors that know how to prioritise and work on the ethical and emotional side of tech products, we can allow this awareness to travel downstream to the leaders setting direction and the employees actually developing the products.
My sense or my hope or my bet is that the products that emotionally aware and mentally fit people create will then have a better and more positive impact when they reach the masses.
I say this because having been in a culture surrounded by the founders of platforms like Instagram and Twitter and so on, I know how analytical we were, how detached we were, how we didn’t have sufficient self-awareness at the time. We were building these products for people who we thought were more or less like us. And now that those products have been used by billions of people around the world who are not like us, we’re seeing unintentional and undesirable consequences as a result.
In the age of machine learning and artificial intelligence, the ethical and emotional instincts of tech companies are going to become much more important. “We’re building these computers that are becoming more like us, are potentially more interesting than us, and are able to remember more of what we tell them about ourselves than our friends can,” says Chris. “And long term, this may interrupt some people’s ability to build and maintain pro-social relationships with other humans. Ultimately that would be a huge loss and something that would be very hard to recover from if we don’t get ahead of it.” This requires fast-moving action from every level of the tech industry.
Because I can look back with 10-15 years of hindsight, I’m now looking forward 10-15 years and thinking about where this is going to go. I imagine we’re going to be pursuing roughly a similar outcome — which includes more global usage, more digitisation, and more cultural consequences.
Chris reminds us that as of 2019, 50% of the world’s population has been connected to the internet. We’re expected to connect the remaining 50% over the next ten years.
We’re connecting cultures and individuals who have possibly barely even had a television, and they’re suddenly going to have this enormous amount of power on their smartphones or other devices they’re using to access or talk to the internet.
It’s very important that we think about the culture that these products create in these new users. They don’t have the same inoculation we’ve had growing up with radio and television and so on.
It concerns me that there’s going to be this enormous culture shock as a result of bringing this technology to people that have lived a very different lifestyle and have a very different culture than a lot of us who are building these products. We can’t just cross our fingers, we have to actually do work on it… now.
To be clear, there is work being done here already. For example, Google’s Next Billion Users initiative, among others. But beyond that work, I still believe that there are deeper cultural changes necessary in the tech industry more broadly.
Our own evolution as consumers — and choosing what to believe
We already know how compelling and engaging social media is. Now a new form of media called synthetic media allows 3D representations of people to be convincingly interwoven into photos of real humans. And the technology that generates these scenes is becoming accessible enough that anyone can quickly create manipulated scenes that trick people into believing they’re real.
We now have the ability to synthesise voice and faces and all the other aspects that we previously thought of as hallmarks of authentic expression. That’s now no longer the case.
It’s going to have an enormous impact on the 2020 election.
With the video of Nancy Pelosi where she was portrayed to be drunk and slurring (a so-called ‘shallow fake’), it’s already happening… the lie makes its way around the world before the truth even has a chance to get up and put its pants on.
The question then is, are we able to bring some mindfulness into the way in which we consume media so that we can have a moment to verify the provenance of information we are receiving?
“I’d love to see the tech world conspire to help people adopt more carefully optimistic (but perhaps not ‘rosy’) perspectives of tech,” says Chris. “Then, we can be more intentional when these technologies filter out into the world as pro-social forces that promote human interconnectedness as opposed to disconnection.”