Running New Relic’s Ignite Program, Part 1 - Hiring

This is part 1 of a series of blog posts about New Relic’s Ignite program, which I designed and ran for about five years. Ignite was an incubator team for early career engineers. We hired folks into what was usually their first software engineering role, gave them the extra support and training they needed to be successful, and then placed them on engineering teams throughout the company. While the program was eventually closed down due to overall economic conditions (if you’re not hiring, you can’t really run a hiring program), it was extremely successful by every measure we looked at - reported candidate experience, engineer performance and promotion rates, manager satisfaction, employee retention, candidate diversity, and so on. It was easily the most satisfying and also the most impactful work of my career to date. My hope is that sharing these methods will help other companies that would like to run a similar program, though you could also apply most of the material in this post to any engineering hiring pipeline.

Companies often severely undervalue early career engineers. Even when a company does recognize their value, they may struggle to hire, onboard, and place those engineers. You can’t just treat them like senior engineers and hope for the best! Hiring for an early career role is significantly different than hiring for a more senior position. For mid- to lead-level roles, the challenge is finding qualified candidates. You can do a lot of filtering based on resumes to see if they have the needed experience. After that, you’ll likely end up with only a handful of folks that you’d consider bringing in for a final interview. The “best of the best” are almost never on the job market, so working connections and active recruiting are critical.

For early career roles, though, almost everybody is on the open job market. There are far, far more candidates eager for their first engineering job than there are positions to go around. Sifting through applications is like drinking from a firehose, and the challenge is processing the sheer volume of candidates and narrowing it down to the most promising folks. Every time we opened a round of hiring for Ignite, the volume of applicants went up – after running it for a few years, it was around 100 per day.

Before You Start

Take Notes

Before you do anything else, I recommend creating a document for recording notes, observations, and feedback about how the program is going. It’s particularly valuable to track anything that didn’t work well, anything that worked really well, or any ideas you have for ways to improve. That way, when you periodically look back to iterate on the program, you’ll have a list of things to change or adjust going forward.

The Job Description

There’s a lot of great advice out there about writing inclusive job descriptions, so I won’t repeat it here. Make sure that you do some reading on that topic, though, and consider using automated tools to review the text.

Work with Talent Acquisition

Make sure you go over the full process with your Talent Acquisition team (or Recruiting, or whatever your office calls this group). They’ll be the first point of contact with your candidates, and will be setting up interview schedules and such. They need to be able to explain the process, get the right people in the right interviews, and so on.

Set Up an Interview Panel

Speaking of interviews, since engineers aren’t being hired into a specific existing team with a program like this, there’s no automatic group of folks to do the interviewing. For each round of Ignite hiring, we set up a group of engineers and managers to both evaluate take-home exercises and conduct in-person interviews. You could call for volunteers, draw from the teams being considered for placement, hand-pick a specific panel, or otherwise handle this in whatever way seems best for your organization. Make sure you have all the necessary skills represented in your panel (see below for the various sorts of interview sessions we conduct). Once you have your full set of interviewers, it’s good to let them know about how many interviews they should expect, over what period of time. We usually aimed for about 3 interviews per person plus 1 training session, over a period of around 3 weeks, so the burden on any individual was fairly light. It’ll depend on how many applicants you have and how many interviewers for each type of interview session, though. You can start doing this around the time the position becomes available for applications – it’ll take a little time for candidates to get to the interview phase.

The Hiring Process

When creating the Ignite hiring process, the specific traits we designed it to select for are:

In addition, there were a number of principles we strove to incorporate into the process:

Some of the measures we took toward those goals included:

Ignite’s overall hiring process was structured as a funnel, like many others. At each step of the funnel, we aimed to filter out about 50-70% of the remaining candidates, in order to arrive at a manageable number of finalists. We moved the steps that took a lot of time, whether for us to evaluate or for candidates to complete, to later in the process, and front-loaded the less time-intensive exercises, both to keep our workload reasonable and to respect candidates’ time. As such, we settled on the following steps for the Ignite interview process.
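To make the funnel math concrete, here’s an illustrative sketch of how that filtering compounds across steps. The starting volume and per-step pass rates below are hypothetical, chosen only to match the rough 50-70% filtering rate described above:

```python
# Illustrative funnel math. The post describes filtering out roughly 50-70%
# of remaining candidates at each step (i.e. a 30-50% pass rate); the
# starting volume and exact rates here are hypothetical.
def funnel(applicants, pass_rates):
    """Return the approximate number of candidates remaining after each step."""
    remaining = applicants
    counts = []
    for rate in pass_rates:
        remaining = int(remaining * rate)
        counts.append(remaining)
    return counts

# A hypothetical round: 1,000 applications and four filtering steps,
# each passing 40% of the remaining candidates.
print(funnel(1000, [0.4, 0.4, 0.4, 0.4]))  # [400, 160, 64, 25]
```

Even modest per-step pass rates shrink a large applicant pool to a few dozen finalists within a handful of steps, which is why the early, cheap filters carry so much of the load.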

That worked fine for the lifetime of the program. However, as the number of applicants kept growing, we were approaching the point where we would have needed to add additional steps early in the process or become much more strict in our evaluations in order to maintain a manageable volume of candidates in the later (more time intensive) phases.

Application

The initial application consisted of a resume, an optional cover letter, and a short answer question.

Resume

In the Ignite program, candidates were often applying for their first engineering job, and we were specifically looking for candidates from diverse backgrounds. As such, we didn’t weigh resumes as highly for Ignite positions as we might have for more senior roles. That said, there were a few things that would make a resume stand out for us, such as signs of completed software projects, or job experience in other fields that might be relevant or valuable in software engineering. For example, teaching experience is a good sign, anything with math or especially statistics is very good, and any programming-adjacent work such as design or product management is also a plus. None of these things were necessary, though.

Cover Letter

Having a cover letter was not required (at all), but we read them when they were there. Since cover letters take time for the candidate to write, and time for us to read and evaluate, we suggested candidates only include one if there was something specific they really wanted to tell us that couldn’t be said elsewhere in the application. A great cover letter was a small plus, but a generic one was neutral at best, and a poorly written cover letter could be a significant negative. So, most Ignite applicants were better off leaving them out. That said, job seekers should be aware that opinions about cover letters vary widely – for some roles, hiring managers strongly prefer they be included.

Short Answer Question

We asked one short answer question in our application form:

“Tell us about a technical project you’ve worked on. Explain the details of how it worked, and describe at least one tradeoff you considered. Please keep your response to no more than 2-3 paragraphs.”

With this question, we wanted to see that candidates could take complex ideas and break them down clearly in a way that non-experts could understand. After reading the project description, a reviewer should come away knowing what was built, why it was built that way, and roughly what all the moving pieces were. Answers should be three paragraphs at most – a concise style was one of the specific things we were looking for. Also, the project described should be technical in some sense, but didn’t need to be a software project – building a camera or some other complicated project would also be fine, as long as there was some technical depth to the topic. Perfect grammar and punctuation weren’t required, but the ability to clearly express complex ideas was.

In particular, the directions asked for tradeoffs, so we wanted to see at least one tradeoff described. An application that did not include any tradeoffs was unlikely to proceed. A tradeoff should have at least two plausible choices with pluses and minuses on both sides. “We didn’t have time for everything, so we focused on X” is not a tradeoff at all, because only one possibility is mentioned (having infinite time is not an option). “We only had time to implement either feature X or feature Y. We went with X because [reasons]” is a passable tradeoff description, but not a great one. A great description of a tradeoff might look something like this: “We had to choose option X or option Y. X is better at [things], but Y is better for [other things]. [Something] was the most important consideration for us, because [reasons], so we went with X.” This could be about architectural styles, libraries, databases, or any other decision that had multiple reasonable options that the candidate had to choose between. Experienced engineers are almost always thinking about tradeoffs like this as they design and build their systems.

Hiring Manager Screen

The Hiring Manager Screen consisted of a light technical exercise and an opportunity for the candidate to ask any questions they may have about the program. The technical exercise was at about FizzBuzz difficulty, but it wasn’t actually FizzBuzz. Candidates had 15 minutes, could write their solution in any IDE and any language they liked, and were free to use Google or look up API documentation (though we did ask that they not look up the exact literal solution to the exercise).

If a candidate was not able to complete the technical screen, or if there were any behavioral red flags, we explained to them why they would not be continuing to the next step and thanked them for their time.

If a candidate successfully completed the exercise and would be continuing, then we walked through what the rest of the interview process would look like and what kind of timeline to expect. If applicable, we let them know that since we did anonymized batches of code evaluations, we might need to wait for a full batch to come in before evaluating their submission, so they should expect some delay before hearing back.

As a side note, this is probably one of the most stressful steps for candidates, and I would have loved to avoid it if possible. However, for early career engineers coming into their first programming job, we found that we had to test for the fundamental building blocks of programming, such as loops, conditionals, and the most basic data structures like lists and maps. Understandably, it’s very difficult for a new engineer to judge whether their technical skills are ready for that next step into a professional position, and many candidates needed a bit more time to firmly establish those building blocks when they first applied. While we were initially more lenient with this interview, the later technical elements of the hiring process became progressively more challenging, and what we found over the years is that if a candidate struggled at this step, those struggles would increase going forward. It is not a kindness to put somebody in a position where they will not be successful, so I recommend being clear about what is required to advance and remaining firm in those requirements.
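For a sense of what “fundamental building blocks” means in practice, here is a hypothetical exercise at roughly this screening level (the actual Ignite exercise was never published). Solving it requires exactly the pieces mentioned above: a loop, a conditional, and a map:

```python
# Hypothetical screening-level exercise: count how often each word appears
# in a sentence, case-insensitively. This is not the real Ignite exercise,
# just an example at a similar level of difficulty.
def word_counts(sentence):
    counts = {}  # map from word to number of occurrences
    for word in sentence.lower().split():  # loop over the words
        if word in counts:                 # conditional on prior sightings
            counts[word] += 1
        else:
            counts[word] = 1
    return counts

print(word_counts("the quick fox and the lazy dog"))
# {'the': 2, 'quick': 1, 'fox': 1, 'and': 1, 'lazy': 1, 'dog': 1}
```

A candidate who is solid on these fundamentals can write something like this comfortably within a 15-minute window, which is the signal this step is designed to capture.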

Take-Home Exercise

We specifically wanted a diverse set of backgrounds, and unlike most programming jobs, there was no specific type or style of code that we focused on for this role. So, in the Ignite program we offered a choice between two different exercises, allowing candidates to pick the one that best showcased their specific experience and strengths. One of the exercises focused on React and CSS, and the other was a small command-line tool. We had a set of four of each type of exercise, which we rotated through quarterly, so folks could reapply in later quarters.

All exercises were evaluated against a strict rubric with very specific criteria, and the broad outline of those criteria was described in the text of the exercise.

Exercises were also evaluated anonymously, so evaluators had no identifying information about the candidate when they reviewed the code. We had two ways to handle anonymization, depending on the situation. Ideally, one person would anonymize each entry, and somebody else did the evaluation. That way, we could evaluate each submission as it came in. It’s also possible for one person to handle everything themselves by collecting batches of 3-5 submissions, shuffling them together, and then doing all the evaluations. We used a small command-line script to anonymize and deanonymize the submissions when using that approach.
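The actual anonymization script wasn’t shared, but a minimal sketch of the batch approach might look like the following. The function names, ID format, and use of a seeded shuffle are all assumptions for illustration:

```python
# Sketch of a batch anonymization tool in the spirit described above.
# The real Ignite tooling wasn't published; names and formats here are
# assumptions.
import json
import random

def anonymize(candidate_names, seed=None):
    """Assign each candidate an opaque submission ID.

    The evaluator only ever sees the opaque IDs; the returned mapping is
    kept aside (e.g. written to a file) until evaluation is complete.
    """
    rng = random.Random(seed)
    shuffled = list(candidate_names)
    rng.shuffle(shuffled)  # shuffle so IDs don't reflect submission order
    return {name: f"submission-{i:03d}" for i, name in enumerate(shuffled, 1)}

def deanonymize(mapping, anon_id):
    """Look up which candidate an opaque submission ID belongs to."""
    for name, assigned in mapping.items():
        if assigned == anon_id:
            return name
    raise KeyError(anon_id)

# Example: anonymize a batch of three submissions, then store the mapping
# somewhere the evaluators can't see it.
mapping = anonymize(["alice", "bob", "carol"], seed=42)
print(json.dumps(mapping, indent=2))
```

The key property is simply that the person scoring the code never sees the mapping until all evaluations in the batch are done.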

Final Interview

Interviewer Preparation and Logistics

We had a one hour training session with all the interviewers before anybody started interviewing. In this training, we covered the following areas:

In addition to the initial training, we also sent out an email to each interviewer the day before their interviews, which contained links to the Interviewer Guidebook and the feedback form for that specific candidate. The feedback form was a Google Sheets spreadsheet with one page for each interviewer, plus a page with instructions and a scoring guide.

We always aimed to have two interviewers in each session, if possible – one primary and one observer. The observer would mostly just be observing, but if there was something they really wanted to say, or if the candidate wanted to ask them a question, that was fine. Interviewers generally observed at least a couple times before being primary for any given type of session. We also tried to have at least one interviewer in each session who identified as a member of an underrepresented group (as optionally indicated in our interviewer signup form).

After all the interviewers submitted their feedback for a candidate, the hiring manager reviewed all the feedback and ensured it was appropriate to share with the candidate. In other words, all of the written comments were kind, any areas to improve were clearly spelled out, no proprietary information was included, etc. The feedback form was then sent back to the candidate, and we let them know if they would be considered in our final selection. If they were, we let them know the expected timeline for the final decision.

Interview Sessions

We wanted to respect the candidates’ time, as well as our interviewers’, so the final interview stage consisted of three one-hour sessions with a break between each one. Each of these sessions is described below.

Technical Session

We wanted to mitigate the stress of live coding as much as possible, and give our candidates as much opportunity to shine as we could. So, for the technical interview, we started with the code they had already submitted, since it was guaranteed to be a domain and codebase they were familiar with. We also let them choose from several different features to implement, and allowed them to look up whatever resources or documentation they liked online (aside from the literal exact solution to the exercise).

Behavioral Session

This was a standard behavioral interview with “Tell me about a time when…” style questions. We worked with our employee resource groups (ERGs) to develop the specific questions, but they were mostly fairly typical of this style of interview.

We provided the full text of the questions to all candidates - in fact, they were published on our website. To some extent, like many of our practices, this was to mitigate the stress of interviewing. However, we also found that this practice provided a significantly better hiring signal. Typically, folks have a handful of stories they always tell in response to these kinds of questions, and the third or fourth time they tell that story is always more polished than the first. We didn’t want to select for whether they happened to have already been asked a similar question in the past. We also didn’t want to know about just whatever example they could come up with in the heat of the moment. We wanted to know their best example of a given behavior, described as well as they could describe it, and publishing the questions ahead of time helped us achieve that.

Verbal Communication Session

We provided the following prompt to the candidate ahead of time, so they had a chance to think about what topic they’d like to explain and practice walking through it if they liked:

“Please explain a technical topic of your choice. It doesn’t necessarily need to be about computers, but computer related topics are often a good choice. What we’re specifically looking for is your ability to take complex abstract concepts, break them down, and explain them clearly. You’re welcome to use informal visual aids if you like, but please do not have any prepared slides or materials ahead of time. When we used to do these in person, we encouraged folks to use a whiteboard, and you’re welcome to use anything that’s basically equivalent, though it’s also fine if you’d prefer not to have any visuals. This will be a very informal, conversational session, so expect that there will be some back and forth as we stop and ask questions and such. To get an idea of the format, imagine that somebody has just joined your team and you need to explain a system your team owns, or perhaps an executive has asked you to explain some concept to them.”

I always personally conducted all of the Verbal Communication interview sessions (with a secondary interviewer). While we did have a scoring rubric that aimed to be as objective as possible, this session had a higher degree of subjectivity than most. Having one person conduct all the interviews helped to create a more consistent set of evaluations across candidates, and as the hiring manager, I was the only person who was relatively guaranteed to be available to interview all candidates.

Final Selection

To make the final selection, I scheduled a meeting with the recruiters, any interviewers who were interested in attending, and any other parties that wanted to be involved.

Ahead of time, I prepared a spreadsheet of the candidates in a suggested stack-ranked order, including summaries of their feedback at each step and any additional comments or observations that seemed relevant. Generally, the ordering followed the scoring from the interviews, but there were occasionally reasons why I might choose to deviate from that. For example, perhaps a candidate scored well but there were behavioral concerns, or perhaps they didn’t score quite as well as another candidate, but they had relevant non-programming job experience that the other candidate lacked.

At the selection meeting, I laid out the reasoning for the initial ordering, and asked for feedback or other impressions. Based on feedback, I would then adjust the rankings as appropriate. The final decision was mine, but I found that the feedback was almost always insightful and worth taking into account. At the end of the meeting, the selection was fixed and the top candidates were informed that they had been accepted to the program. If some of them ended up accepting other offers or were otherwise no longer available, we’d reach out to the next person on the list. Once we had enough accepted offers to fill out a cohort, then we informed the other finalists that, though it was a close decision, they had not been selected.

Even for folks who didn’t make it into the program, though, that wasn’t necessarily the end of our interactions with them. For one thing, anybody was welcome to apply again, regardless of how far they got on their first try. Also, anybody who made it to the final selection was somebody we thought was exceptional. If a candidate was interested, then when we saw other roles throughout the company that we thought they might be a good fit for, we would reach out to those managers and share our feedback from the Ignite interviews. Several of those candidates did end up being placed in other roles within the company.

With the high volume and very high caliber of candidates that we saw, especially once the program had been running for a few years, final decisions often came down to the thinnest of margins. In all honesty, it would often just depend on how well one candidate or another happened to show up on that particular day for that particular interview. It can be difficult to make such significant decisions on such fine distinctions. However, hiring decisions must be made one way or another, and judging based on criteria that are as fair, objective, and rigorous as we know how to make them is the best way I’ve found to do it.

What’s Next?

In the second article in this series, Part 2 - Onboarding and Rotations, I’ll go over how we handled onboarding a new associate software engineer (ASE) and discuss everything that went into running the bulk of an engineer’s time in the program.