How We Made Our Hiring Process Fair & Equitable

Leadership

2021-09-17T18:24:53.176Z


This article was first published in Built In and written by our CTO, Allan Wintersieck.

As an engineering organization, we are always iterating toward success. Little tweaks here and there compound over many years to create better products.

However, our internal hiring process was a different story.

In late 2019, we decided to take a step back and overhaul our engineering hiring process. Throughout 2020, we tested, iterated, and hired new team members under this new system.

And in 2021, we really hit our stride. We now have a hiring process that finds qualified, talented individuals while reducing stress, resource demands, and bias.

Here’s why and how we did it.

Why We Revamped Our Engineering Hiring Process

When Devetry was a small organization, our VP of engineering and I were in charge of all engineering hiring decisions. It mostly worked.

Over the years, though, we had a few hiring misses. Often we would hire based on the feeling we got from a candidate. It was one-sided, subjective, and error-prone.

At the same time, we were growing (from 20 to 30 full-time engineers) and needed to increase our capacity for interviewing and hiring. To facilitate this, we decided to include multiple software engineers in the interview process. However, if we had 10 different participants, then we’d end up with 10 different opinions on each candidate. One would rate a candidate on performance optimization, another on database experience, and so on. We needed structure.

On top of these internal challenges, we knew that the world of software engineering was male-dominated and mostly white. Since we wanted diverse candidates, we thought we could simultaneously improve the way we recruited and assessed to achieve that goal.

After asking ourselves how to get the best talent while being fair, equitable, and efficient, we arrived at our current hiring process.

The Recruitment Stage

Here’s how we adjusted our recruitment methods:

  • Job descriptions: Job descriptions are tricky, but we focus on writing them as objectively and with as little bias as possible by removing overly masculine words. These include words like “quarterback” and “dominate.”
  • Placements: While we post on many standard job sites, we also post on Diversify Tech, a job board for underrepresented candidates looking to work in technology.
  • Goal alignment: For qualified candidates, before inviting them to interview, we like to hop on a quick call to make sure what they’re looking for aligns with what we’re looking for. If a candidate wants to manage people, but the role doesn’t have those responsibilities, it’s a moot point.
  • Skills tests: If things appear aligned, we test their development skills via a challenge on HackerRank. Last year, one of our engineers pointed out that timed online assessments can be biased, so we also offer every candidate the option of extra time if they need it.

The Interview Process

Once all the pre-qualification steps are complete, we bring in our team for traditional interviews. Our interview, conducted in person or via video, starts with a technical interview where the candidate works through multiple coding problems.

At Devetry, a technical interview simulates a day in the life. For this, programming languages don’t matter. It’s all about how a problem is approached and the thought process for solving it. What would they Google (yes, of course, they can Google whatever!)? What ideas do they bounce off the person they’re paired with? How do they solve something when they don’t already know the answer?

Then we ask discussion-based questions. A discussion-based question is a great place to find out a candidate’s working style and thought process for solving the technical challenge. We also take a step back and ask theoretical questions. For example, we may ask them to talk about coupling and cohesion in relation to system design.

How We Assess Candidates

We have two tools to assess and make the ultimate decision on whether someone gets the job.

To start, we created a standardized rubric to streamline and remove biases from the interview process. Rather than arbitrarily saying, “I liked that person,” assessors must evaluate the candidate on predetermined criteria. These are the categories we currently use:

  • Communication: For a junior developer, we want to know that working with them isn’t going to be a frustrating process full of misunderstandings. For a senior role, we want to know that we can leave them alone in a room with a client and not be worried about it.
  • Skills: Do they know anything about the language and framework we use? What about other languages and frameworks?
  • Wisdom and problem-solving: Can they find the right answer when they don’t already know it?
  • Culture fit: Do they share our core values? Beyond that, are they people we want to work with? Note: This is not intended to screen for “do I want to get a beer with this person?”
  • Experience: Has the candidate’s past experience prepared them for a job here? Even if not directly related to this position, which of their experiences are beneficial?

Each team member independently fills out the rubric so no one is influenced by another. The rubrics are submitted to our VP of engineering, who reviews them and tallies the scores.
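The aggregation step above can be sketched in a few lines. This is a hypothetical illustration, not Devetry's actual tool: the category names come from the article, but the 1–5 scale and the `aggregate_rubrics` helper are assumptions.

```python
# Hypothetical sketch: average each rubric category across assessors,
# then compute an overall score. The 1-5 scale is assumed.
CATEGORIES = ["communication", "skills", "wisdom", "culture_fit", "experience"]

def aggregate_rubrics(rubrics):
    """Average each category across assessors, then average the categories."""
    category_means = {
        cat: sum(r[cat] for r in rubrics) / len(rubrics)
        for cat in CATEGORIES
    }
    overall = sum(category_means.values()) / len(CATEGORIES)
    return category_means, overall

# Two assessors' independent rubrics for one candidate
rubrics = [
    {"communication": 4, "skills": 3, "wisdom": 4, "culture_fit": 5, "experience": 3},
    {"communication": 5, "skills": 4, "wisdom": 3, "culture_fit": 4, "experience": 4},
]
means, overall = aggregate_rubrics(rubrics)
print(means["communication"], overall)  # → 4.5 3.9
```

Keeping each rubric separate until this step preserves the independence the process relies on: no assessor sees another's numbers before submitting their own.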

How We Hire Candidates

After the rubric score is completed, we meet and share our final thoughts. In this meeting, the first thing we do is an exercise to combat confirmation bias and herd mentality. Rather than ask, “Should we hire this person?” and take turns sharing, everyone closes their eyes and votes with a thumbs-up or thumbs-down. Once everyone is ready, we open our eyes and compare.

This simple act is incredibly effective. It removes the HiPPO (highest-paid person’s opinion) in the room and gives equal weight to everyone’s opinion.

Plus, there’s no middle ground. You have to lean one way or the other. As engineers, it’s hard to pick a side of the line when you haven’t heard what other people think yet, but it’s essential.

Once everyone’s thumbs are pointing in a direction, we discuss the candidate and the rubric score. If more than 50 percent of the thumbs are up and the candidate has a passing rubric score, we extend an offer.
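The decision rule described here — a strict majority of thumbs up combined with a passing rubric score — can be expressed as a one-liner. This is a sketch under assumptions: the function name and the 3.5 passing threshold are illustrative, not from the article.

```python
def should_extend_offer(thumbs_up, total_votes, rubric_score, passing_score=3.5):
    """Offer requires over 50% thumbs up AND a passing rubric score.

    passing_score defaults to an assumed threshold for illustration.
    """
    return thumbs_up / total_votes > 0.5 and rubric_score >= passing_score

print(should_extend_offer(4, 6, 4.1))  # majority + passing rubric → True
print(should_extend_offer(3, 6, 4.1))  # exactly half is not a majority → False
```

Note that a 3-of-6 split fails the vote: "over 50 percent" is a strict majority, which forces the tie-breaking discussion back to the team.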

Why the Process Works

This process works because it combines the objective and the subjective. The thumbs-up-or-down exercise is subjective: how did each person like this candidate, and would they want them on their team?

The rubric is objective. It gives an equal chance to every candidate, even if they rubbed somebody the wrong way.

Both of these tasks are done before any group discussion because it’s easy to hear a coworker you respect speak positively or negatively about a candidate and change your mind. That’s not always a bad thing, but we think it’s important to form your own opinion before coming together as a team.

We’ve seen great success with this process. While it still takes time and energy to find the right people, it removes tediousness and gives us confidence in every single hire we make.

Depending on what stage of growth you’re currently in, you may or may not be able to implement this. The takeaway is that hiring quality people requires a thoughtful and fair process. I invite you to steal any part of this and test and iterate until you find something that works for you.