It was one of big tech’s most ignominious moments. Back in 2015, the search feature of the Google Photos app tagged two Black people as “gorillas,” prompting a toe-curling apology from the tech giant, which blamed machine learning–trained “automatic image labeling.”
While the affair was plainly an unintentional glitch, it was in another sense no mere accident, but a product of an industry dangerously out of touch with its users. From disinformation and voter manipulation to privacy breaches and the new "feudal system" unleashed by the gig economy, tech's toxic fallout emanates essentially from one source: how tech is built. That means the way engineering teams work, the bubble many of them live in, and the pool from which talent is selected.
It’s usually glossed over that tech engineers tend to be recruited from similar backgrounds. In my own experience, backed by academic research, they are also likely to be introverts who—because of their often narrow training and early experiences—have difficulties seeing the world from others’ perspectives, and end up writing code with little appreciation of its impact on users.
Meanwhile, for all its “woke” ideology, Silicon Valley has always been overwhelmingly male, and white or Asian. In 2019, 92% of Facebook employees, 95% of Google employees, 89% of Microsoft employees, and 84% of Apple employees were white or Asian, according to company data. Women represented 21% of employees at Microsoft, 23% at Facebook and Apple, and 26% at Google.
This entrenched underrepresentation of minorities has often resulted in an absence of diverse thinking. It’s also allowed a number of dangerous stereotypes to take root, including something I call “Steve Jobs Syndrome.” Jobs has been associated with a belief that his undoubted genius excused any kind of behavior, even to the point where many believe a founder actually has to be a jerk to be a genius. Other received wisdom includes “Tech companies are meritocratic,” “There is no bias in code,” “Whatever the problem, the answer is always more tech,” and “Disruption is just another word for innovation.” (Spoiler on that last one: It isn’t.)
Now, it must be said that many of these convictions have also helped spur a fast-moving, solution-oriented culture where people do not balk at difficult challenges, enabling small startups to take on the biggest incumbents. Yet at a macro level, this culture impacts the world at large not by solving society's problems, but by creating new ones.
So what can be done?
Unsurprisingly, it starts with increasing diversity. Gender, race, and age aside, I've long argued that the tech giants should also hire people with humanities backgrounds—individuals as familiar with Voltaire and Paine as with Java and Python—and create special career pathways for them in product and engineering. This will bring in more diverse candidates with a wide variety of experience, resulting in teams that are not only smarter (according to research by McKinsey), but also more emotionally intelligent and more innovative.
Beyond increasing diversity (necessary, but not enough on its own), we need to work on instilling a more empathetic approach within engineering teams. The finest engineers I've worked with have all mastered cognitive empathy (the ability to put yourself in someone else's shoes, in their case a user's), and appreciate its pertinence to software development. It starts in undergraduate education, where classes on the ethics of innovation, conscious capitalism, and empathetic tech should be made compulsory for any computer science student.
Sending engineers out of the office to meet users would also help. Engineers, facing relentless deadlines, rarely spend meaningful time with people on the sharp end of their code. Imagine a Facebook engineer being sent to Myanmar to meet genocide victims to understand firsthand how Facebook’s product has been abused. Or consider a Twitter engineer, once a week for a year, sitting across from women who’ve faced rape and death threats on the platform. Odds are that they would return to the office chastened, and pull out all the stops to design more empathetic tech and fix these issues.
It’s also crucial that empathy is embedded within the product and feature development process itself. One way of achieving this would be to handpick a few experienced engineers, chosen for their ability to spot potentially negative impacts on users and wider society, to challenge product and engineering teams on that basis throughout the development process.
For the most strategic features, having an "empathy committee," composed not just of engineering and business people, but also of sociologists, ethicists, and philosophers, would help. I concede that this measure would act as a brake on productivity, but would Google's "gorillas" blunder have happened under this sort of scrutiny?
Clearly, none of the above will remedy overnight a situation that has been brewing for decades. But diversity targets and unconscious bias training courses certainly won't either. And while governments and regulators have much to do to curb big tech's worst excesses, unless the Valley itself faces up to the fact that it won't solve its empathy crisis until it rebuilds internally, the industry's precipitous fall from grace is only set to continue.
Maelle Gavet is a tech executive, entrepreneur, former COO of Compass, and former executive vice president of global operations at Priceline Group. This commentary is adapted from her new book, Trampled by Unicorns: Big Tech’s Empathy Problem and How to Fix It.