There used to be a dream that technology would be the great equalizer: social media democratizing national and global discourse, new technology simplifying elections, computer algorithms cutting through prejudice. But that hasn’t panned out: social media is ripping our social fabric to shreds, elections are under constant siege, and our technology is perpetuating rather than eliminating biases.
You get the sense that we had a bit too much confidence in the cleansing power of the electronic microchip.
As we are discovering more and more, human-built machines immortalize human problems. Voice recognition software isn’t good at identifying higher-pitched (i.e., predominantly women’s) voices. Facial recognition software is far better at identifying white men’s faces than literally anyone else’s. Motion sensors often fail to detect dark skin, a problem that also appears to affect some wearable health monitors. And Amazon famously built an AI recruiting tool that filtered out women.
Let’s zero in on that last one for a moment. Amazon, a famously enormous company that relies heavily on computer engineering talent, built AI software to sort through resumes and identify top applicants. But because of how Amazon had recruited and hired over the previous ten years (the base dataset the AI trained on), the software penalized any mention of “women’s” (as in “women’s tennis” and so on) and disregarded candidates from women’s colleges. Why? Because it based its definition of an optimal candidate on past human hiring decisions. And since tech is so dominated by men, that definition assumed the optimal candidate would be as well.
In short, Amazon automated human bias. Isn’t the future grand?
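The mechanism is worth seeing concretely. Here is a minimal sketch, assuming a toy dataset (the resumes, the hiring labels, and the inspected token are all invented for illustration; this is not Amazon’s actual system): a classifier trained on skewed historical decisions picks up a negative weight for the word “women’s” without ever being told anyone’s gender.

```python
# Toy illustration (hypothetical data): a model trained on biased
# historical hiring decisions learns to penalize "women's" on its own.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Invented "historical" resumes and outcomes. Gender is never a feature;
# the bias lives entirely in the labels.
resumes = [
    "captain of women's chess club, python developer",
    "women's college graduate, software engineer",
    "python developer, hackathon winner",
    "software engineer, open source contributor",
]
hired = [0, 0, 1, 1]  # past human decisions, skewed against the first two

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Inspect the learned weights: the token "women" comes out negative
# purely because it co-occurred with rejections in the training data.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print(weights["women"])  # < 0: the human bias is now automated
```

Scale that toy up to ten years of real resumes and the same thing happens, just with more statistical confidence behind it.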
How does this happen? The people building these tools were hired into a technology culture that undervalues women and people of color (among others). As a result, women and people of color are largely absent from the creation, refinement, and testing of critical tools (or, in Amazon’s case, were so underrepresented in the base dataset that the AI automatically deemed them suboptimal). It’s really all one problem: when you don’t hire diverse candidates, your tools will assume a certain kind of person is preferred (or at least the default) and will perpetuate that assumption, thoughtlessly, for as long as they’re designed to.
Think about it: motion sensors were released to the market that clearly were never tested on a dark-skinned person, which suggests that no dark-skinned people were meaningfully involved in their creation. That’s an enormous oversight, and one that would seem easy to correct. But there are entrenched forces at play much bigger than conscious bias. In fact, this isn’t even just a problem with new technology; film photography was long criticized for using white skin as the metric for skin-color balance, resulting in poorly developed photos of darker-skinned subjects (a problem that was only corrected after companies complained that they were unable to properly photograph dark wood furniture).
A while back, I spoke with Goldman Sachs about its new diversity program, which impressed me with its scope and thoughtfulness. It wasn’t based on quotas but on hard data trends that uncovered why even progressive recruitment out of college wasn’t solving the problem. And what they discovered was telling.
Women and minorities, it turned out, even when hired at the same rates as their white male counterparts, kept falling out of the pipeline. Attrition was enormous; the data had never been examined this way before, so nobody had spotted the trends. Both groups were more likely than their peers to quit finance altogether, and were simultaneously more likely to be replaced by white men moving laterally from other companies. They were also less likely to be promoted, and less likely even to be considered for promotion.
What we’re seeing, in other words, are systemic cultural forces at play, bigger than any one company’s hiring policy. And while Goldman is a finance company, those cultural forces and biases extend well beyond it, deep into the business world. From companies that prize people able to work bewildering hours, to male-focused socializing opportunities, to white hiring managers recruiting from their own social circles, to the overwhelming tech-bro culture of Silicon Valley (where women and people of color don’t get VC capital, don’t get hired, and don’t last when they do), it comes down to this: the machines that fill our lives are built by, and ultimately for, a certain kind of person.
It’s a kind of tunnel vision or myopia: the people building these tools are unable to see beyond their immediate circumstances or to consider that their point of view might not be universal. It is, essentially, culture at work, with human prejudices and biases built into the technology we’re all supposed to use, discriminating automatically. It’s an enormous problem, and one that can only be solved by identifying the forces at play on an ongoing basis, not simply by hiring more women or people of color. Retention and promotion are as important as hiring, and if you aren’t building workplaces where people feel welcome and are genuinely enabled to succeed (often in the intangible ways of mentoring, social connection, and environment), you aren’t going to see these problems worn down to the nub the way they need to be. After all, you can hire all the women you want, but if the company turns a blind eye to sexism, you probably aren’t going to keep them. And that will do more harm than you realize, far beyond the reach of your complimentary buffet-style lunch and office ping-pong table.
Let’s spend time identifying how women and minorities get filtered out, and what makes them drop out, industry by industry. I suspect many of the causes will be the same, but the first step toward eliminating systematized bias in code and machine learning tools is to create the conditions that let people from those groups advance. Companies need to do the work that Goldman Sachs did, as I described above: get the numbers, look at when you’re losing those players and why they’re falling out of the pipeline, and then take steps to rectify it. In Goldman’s case, that meant tying performance bonuses not just to hiring but to maintaining a diverse (and gender-balanced) team, and rewriting job requirements around skill sets rather than specific career histories or experience, which encourages hiring managers to actively recruit outside the usual pool. There are realistic, practical steps you can take to improve retention and promotion without compromising the quality of your team.
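As a hedged sketch of that first “get the numbers” step (the file name and column names below are assumptions for illustration, not Goldman’s actual schema), the initial analysis can be as simple as comparing attrition, promotion, and promotion-consideration rates by group:

```python
# Hypothetical pipeline analysis: where are people falling out, and at
# what rates? All column names are assumed for illustration.
import pandas as pd

# One row per employee per year, exported from an HR system (assumed).
hr = pd.read_csv("hr_records.csv")

pipeline = hr.groupby("demographic_group").agg(
    headcount=("employee_id", "nunique"),
    attrition_rate=("left_company", "mean"),
    promotion_rate=("promoted", "mean"),
    considered_rate=("considered_for_promotion", "mean"),
)
print(pipeline.sort_values("attrition_rate", ascending=False))
```

Numbers like these won’t tell you why people leave, but they show exactly where the pipeline leaks, which is where exit interviews and the incentive changes described above come in.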
We can’t just fill the leaky barrel. We’ve got to plug the holes.