It's not the pipeline problem. It's the hiring process.
Tech's diversity problem is not a secret. Throughout the industry, and especially in technical roles, the majority of employees are white men. This is a problem for a host of reasons, ranging from the obvious, like inclusion and equality, to the economic, like how companies with diverse teams are, on average, more profitable.
Knowing the benefits and importance of diversity, many companies have pledged to build inclusive teams, and yet, many still struggle to do so.
In this piece, we explore three common elements in tech interview loops—the job posting, the initial screen, and the algorithmic interview—and how, when executed incorrectly, they can bias an interview process against candidates from diverse backgrounds.
Your hiring process begins before the phone screen. It begins with your job post. The moment you post a job opening, your potential candidate pool goes from “everyone in tech” to “people with X skills and Y years of experience.”
The goal of your job listing should be to make qualified candidates excited about the opportunity, without discouraging anyone—particularly underrepresented individuals—from applying. The language you use becomes key, as it can affect whether a candidate feels welcome to apply.
As University of Winnipeg professor Danielle Gaucher points out in a summary of her 2011 study of gendered wording in job posts, “Words such as competitive, dominant or leader are associated with male stereotypes, while words such as support, understand and interpersonal are associated with female stereotypes.”
As Gaucher writes, “The naturalistic data suggests that (gendered wording) is common in male-dominated fields, and contributes to the division of traditional gender roles by dissuading women's interest in jobs that are masculine worded.”
Ways to make your job post more inclusive:
First, rewrite any job postings that include words like “compete,” “confident,” or “adventurous.” According to Gaucher's research, gendered words like these make the job less appealing to women.
Next, explicitly invite underrepresented people to apply in the body of your job post. This can be incredibly simple, too. Slack, as one example, consistently outperforms the Silicon Valley average in terms of team member diversity. Slack has numerous strategies and initiatives around diversity, but one simple tactic anyone can borrow is the language in its job posts:
“Slack is an Equal Opportunity Employer and participant in the U.S. Federal E-Verify program. Women, minorities, individuals with disabilities and protected veterans are encouraged to apply. Slack will consider qualified applicants with criminal histories in a manner consistent with the San Francisco Fair Chance Ordinance...
Ensuring a diverse and inclusive workplace where we learn from each other is core to Slack’s values. We welcome people of different backgrounds, experiences, abilities and perspectives. We are an equal opportunity employer and a pleasant and supportive place to work.”
Simply saying that you welcome and value a diverse group of applicants goes a long way—though it's not the only measure you should take.
Also, use a tool like Textio to analyze your job postings for gendered or otherwise noninclusive language. Textio offers language analysis of your company's writings that includes a “Bias Meter,” which reflects how inclusive your language is.
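If a dedicated tool isn't an option, even a crude check can catch the most obvious gendered terms before a post goes live. The sketch below flags words from Gaucher's masculine- and feminine-coded categories; the stem lists here are abbreviated illustrations drawn from the examples above, not the study's full lists, and a tool like Textio is far more sophisticated.

```python
import re

# Abbreviated, illustrative stems based on Gaucher et al. (2011).
# The published study lists many more; these are NOT exhaustive.
MASCULINE_CODED = ["compet", "domina", "lead", "confiden", "adventur"]
FEMININE_CODED = ["support", "understand", "interperson", "collaborat"]

def flag_gendered_terms(job_post: str) -> dict:
    """Return the masculine- and feminine-coded words found in a job post."""
    words = re.findall(r"[a-z]+", job_post.lower())
    found = {"masculine": set(), "feminine": set()}
    for word in words:
        if any(word.startswith(stem) for stem in MASCULINE_CODED):
            found["masculine"].add(word)
        if any(word.startswith(stem) for stem in FEMININE_CODED):
            found["feminine"].add(word)
    return found

post = "We want a confident, competitive engineer who leads projects."
print(flag_gendered_terms(post))
```

A human should still review every flag—stem matching can't judge context—but surfacing the words at all is often enough to prompt a rewrite.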
Resume and phone screens are tactics companies employ in the name of efficiency. Not every applicant can be brought in for a holistic interview, so screens are one way companies attempt to filter out poorly fitting candidates early on.
However, quick screening methods have a tendency to result in candidates being rejected for biased reasons. A 2003 study by the National Bureau of Economic Research found that, subconsciously or not, recruiters tend to discriminate based on candidates' names and assumed race/gender when screening their resumes.
Taking things a step further, by only valuing certain credentials, like an Ivy League degree, recruiters can introduce more bias into their screenings. As Project Include, a nonprofit that uses data to advocate for diversity in tech, writes in its hiring guide, “Pattern matching is a common problem found while reviewing resumes, from universities to previous companies worked. Think about employing a 'distance traveled' metric: Where did a candidate begin their journey? Which achievements were accidents of birth and linked to privilege (e.g. an internship at a family or friend’s company) as opposed to earned in a meritocratic competition?”
Ways to reduce pattern matching:
Train your interviewers.
According to Project Include, companies should “use structured interviews, asking the same standard questions, and making them applicable to the type of role you’re filling...Avoid trivia, single-answer questions, those that rely on esoteric knowledge or photographic recall, or those that don’t reflect the work or culture of a company. Questions shouldn’t be 'gotchas,' but rather opportunities for people to discuss the value they will add to the company.”
Along with running a structured interview, Project Include suggests trying “blind hiring,” in which candidates' names and identifying information are removed from their resumes before they are seen by an interviewer.
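In practice, blind screening can be as mechanical as blanking identifying fields before a reviewer sees the record. A minimal sketch, assuming resumes are stored as simple dictionaries (the field names here are hypothetical, not from any particular ATS):

```python
# Fields commonly redacted in blind screening (hypothetical schema).
IDENTIFYING_FIELDS = {"name", "email", "phone", "photo_url", "address"}

def redact_resume(resume: dict) -> dict:
    """Return a copy of a resume record with identifying fields blanked,
    leaving skills and experience intact for the reviewer."""
    return {
        key: "[REDACTED]" if key in IDENTIFYING_FIELDS else value
        for key, value in resume.items()
    }

resume = {
    "name": "Jordan Smith",
    "email": "jordan@example.com",
    "skills": ["Python", "SQL"],
    "years_experience": 4,
}
print(redact_resume(resume))
```

Note that redaction only removes the most direct identifiers; school names and employer names can still act as proxies, which is why Project Include pairs blind screening with the “distance traveled” framing above.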
Whiteboard interviews, for better or worse, are standard practice in technical interviews. Interviewers like them because they are repeatable, can be done fairly quickly, and, in theory, reflect a candidate's ability to think through technical problems.
In reality, however, algorithmic whiteboard interviews can be biased when used for roles that don't really require those skills. They evaluate candidates' knowledge of a very specific thing—algorithms and data structures—with no consideration of the rest of their skills. If this knowledge is critical to the role you're hiring for, that's great, but using these interviews as a catch-all engineering test biases the outcomes toward people who've studied these topics recently and who have the free time to practice them.
That last requirement is a little more complicated, but important. Performing well in a whiteboard interview requires candidates to have memorized a good number of problems, and as Quincy Larson, founder of freeCodeCamp, has pointed out, that requirement “freezes out many of the people who are under-represented in the software development field. If you’re busy working and raising kids, you want to spend as much of your scarce time as possible learning to code—not performing rote memorization that won’t matter once you start your job.”
Ways to reduce whiteboard bias:
Evaluate whether a whiteboard interview is actually relevant for this role. Is this going to be a role that will require heavy knowledge of algorithms and data structures? If not, you don't need to do a whiteboard interview.
Instead, try formats that more closely mirror the day-to-day work, such as pair programming on a realistic task, a take-home project, or a code review of a real-world sample.
In addition to these strategies, there's also a lot to be said for putting candidates at ease during these high-pressure interviews. Small actions to make the candidate feel welcome—asking about allergies, factoring in time for breaks, encouraging them to ask questions and communicate needs during the interview—all go a long way toward improving the candidate experience.
Broken hiring processes do more than keep under-represented people from joining your team. They push under-represented people to leave tech altogether.
As Michael Connor, executive director at OpenMic, a media company focused on social justice in tech, reported, “People of color who enter the tech industry leave the field at more than 3.5 times the rate of white men.” Similarly, women are twice as likely to leave a career in tech as men.
While the drivers of this attrition among under-represented people are varied—some say they feel stalled in their career, others report “isolation, discrimination, and toxic work environments”—bias remains a consistent factor.
Improving your hiring loop isn't just a good way to build a more inclusive team; it's a way to give underrepresented candidates an equal hiring experience, and to help set an example for the rest of the industry.