We’re excited to be collaborating with our partners at Applied as part of our #AdasListIWD events. Applied are on a mission to de-bias the hiring process so that everyone gets a fair shot - regardless of gender, race or background.
“Whilst most organisations would like to improve their diversity, real, measurable change starts with a conscious decision to challenge the status quo.” - Applied
This article looks in detail at the ways in which you can #ChooseToChallenge your hiring practices - from the sourcing stage through to interviewing - to allow for a more inclusive hiring process.
Next week, Ada’s List and Applied will be joining forces in a live Q&A webinar, on Tuesday 9th March. We’ll have 2 womxn from the Applied team and 2 Ada’s List members coming together in an online panel event, to talk to the Ada's List Community, Applied employees and customers around the International Women's Day theme of #ChooseToChallenge.
We’d love to feature some of your questions in this Q&A webinar. Read more about this event, sign up and include any questions you have for the speakers panel.
Below, we’ll share what years of behavioural science research and a whole lot of trial and error have taught us about building a fairer, more inclusive hiring process that will attract, hire and retain more women.
We’ll talk you through the ways in which you can choose to challenge your hiring practices - from the sourcing stage through to interviewing.
Everything we recommend below is based on behavioural science research and can be implemented without spending a penny.
Does our approach work?
The proof is in the pudding (or the pie chart, in this case):
Gender diversity at the initial sourcing stage matters.
According to one study, when there’s just one woman in the finalist pool, her chances of being hired are statistically zero.
So, here’s how you can craft job descriptions that will attract a more diverse candidate pool...
The words you use in your job description will have a direct impact on who applies.
Words and phrases convey subconscious meaning.
When we read a job description, we’re assessing whether or not the role would be a good fit for us… and whether or not we’d fit in.
If you’re not careful about how you word your job description, those who feel that they don’t ‘fit the bill’ will qualify themselves out.
Women will be turned off from applying if your job description includes excessive masculine language.
A 2013 study found that people are less likely to apply for jobs whose descriptions contain words biased in favour of the opposite gender.
Why? Because by listing characteristics typically attributed to males, you’re signalling that a man would be a better fit.
Although the impact of gendered language will vary from person to person, the results of our own research reflected the findings of the 2013 study.
There was a higher proportion of female applicants when the job description was feminine-coded (51%) or strongly feminine-coded (54%).
There were fewer female applicants when the job description was masculine-coded (48%) or strongly masculine-coded (44%).
So, generally speaking, feminine-coded job descriptions will increase the odds of women applying and masculine-coded job descriptions will decrease these odds.
We also looked at the gender of those who were actually hired.
We found that women were more likely to be hired when the job description was either feminine or neutral-coded.
Below are a few examples of commonly used gendered terms.
*Although it’s quickly becoming a cliché, many tech companies still use terms like ‘ninja’, ‘rockstar’ and ‘guru.’ Not only are these vague and cringe-inducing, but they’re also likely to deter women in the same way as the masculine-coded terms above.
Whilst you can check for these gendered words manually, we actually built a Job Description Analysis Tool to detect this language and suggest alternatives.
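To make the idea concrete, here’s a minimal sketch of how a gendered-language check could work. The word lists below are small, invented samples for illustration - they are not Applied’s actual lexicon, and a real tool would use far larger, research-backed lists and suggest alternatives.

```python
# Minimal sketch of a gendered-language check for job descriptions.
# The word lists are tiny illustrative samples, not a real lexicon.
import re

MASCULINE_CODED = {"competitive", "dominant", "driven", "ninja", "rockstar"}
FEMININE_CODED = {"collaborative", "supportive", "nurturing", "committed"}

def code_job_description(text):
    """Return ('masculine' | 'feminine' | 'neutral', (m, f) counts)."""
    words = re.findall(r"[a-z]+", text.lower())
    m = sum(w in MASCULINE_CODED for w in words)
    f = sum(w in FEMININE_CODED for w in words)
    if m > f:
        return "masculine", (m, f)
    if f > m:
        return "feminine", (m, f)
    return "neutral", (m, f)

label, counts = code_job_description(
    "We want a driven, competitive rockstar to join our sales team."
)
print(label, counts)  # masculine (3, 0)
```

A production version would also handle word stems (‘competitiveness’, ‘dominates’) and weight strongly coded terms more heavily - but even a crude count like this surfaces the most obvious offenders.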
The number of requirements you list can also have an effect on the diversity of your candidate pool.
LinkedIn's data showed that while both men and women browse jobs online in a similar way, they apply for them differently.
Women are 16% less likely to apply after viewing a job.
And women apply to 20% fewer jobs.
This is likely because women generally tend to apply only when they meet 100% of the criteria, whereas men will apply when they meet just 60%.
This difference in approach hasn’t got anything to do with how women see themselves and their skills.
It's about how they see the hiring process.
As the survey below shows, the most common reason for not applying was “I didn’t think they would hire me since I didn’t meet the qualifications and I didn’t want to waste my time and energy.”
There was also a significant gender difference for one particular reason: “I was following the guidelines about who should apply.”
Only 8% of men had this as their reason for not applying.
If you list an excessive number of ‘requirements’, some candidates will view them as a barrier to entry while others will treat them as desirables.
Ideally, you should stick to the essential requirements needed to do the job and consider removing any 'nice-to-haves'.
While it may well be that your sourcing isn’t effectively reaching women, if you’re using a standard CV-and-interview process there’s a strong possibility that it’s actually your selection process that is preventing you from hiring more women.
If you want to challenge your hiring practices to be more inclusive, ditching CVs is a must.
A study carried out across the English labour market back in 2006 sent job applications (CVs) to open vacancies.
These applications were identical other than the sex of the candidate.
Researchers found that males tend to be favoured for typically male roles and females tend to be favoured for typically female roles.
If a job is dominated by the opposite sex, you’re likely to be penalised.
The problem here is that ‘typically female’ roles tend to be less senior.
According to Mercer’s 2020 report, the higher up the corporate ladder you go, the fewer women you’ll come across.
You can gen up on more research like this via our Gender Bias in Hiring Report.
Although we can’t rule it out entirely, it's fairly safe to assume that for the most part, hirers involved in these studies weren’t part of a wider campaign to oppress women or keep them out of a certain profession.
We see these outcomes because, when reviewing CVs, we open ourselves up to unconscious bias.
We tend to fall back on using our intuitive, mental shortcut-based system of thinking which relies heavily on instinctive associations and stereotypes.
We’re all prone to bias, whether we’re aware of it or not.
You can’t de-bias people - but you can de-bias processes.
That’s why the only way to effectively remove this bias from hiring is to re-design the hiring process itself.
Whilst some organisations have started using anonymised CVs, this only solves part of the equation, even if it is a step in the right direction.
Even when we remove names (that indicate sex) from CVs, the fact remains that women tend to downplay their achievements.
Attributes we may praise, when exhibited by men, tend to be seen as negative when exhibited by women (the most famous example of this is the Heidi Roizen case study).
And on average, women rate their performance less favourably than equally performing men.
If we look at the assessment methods we have at our disposal, we can see that education and experience - staples of the CV - aren’t actually predictive of real-life ability (you can check out the metastudy here).
If we then look at what is predictive, we can see it’s ‘work sample tests’ - these are what we use to screen candidates here at Applied.
Work samples take parts of a role and turn them into questions/tasks.
Education and experience aren’t entirely useless but they’re proxies for skills.
Work samples test for skills directly.
Work samples are created by identifying which skills are needed for the job and then building questions that will test these skills.
The philosophy behind work samples is simple: the most predictive way of testing someone’s ability to do a job is by getting them to perform small parts of it.
Here’s a work sample we used for a recent Community Lead role:
You’ve been invited to be on a panel on hiring & recruitment. You’re the only D&I expert in the room (and possibly the only person there who thinks it’s important). What are your opening lines to the audience to convince and engage them on the subject?
Skills tested: D&I Knowledge, Communication
The closer to real-life your work samples are, the more predictive they’ll be.
Your work samples could be posed as hypothetical scenarios like the example above.
Or you could simply ask candidates to perform a given task, like writing a blog post or drafting an email to a customer.
When applying through the Applied Platform, candidates submit 3-5 work samples anonymously (instead of a CV/cover letter).
When we put work samples to the test, we found that 60% of candidates hired through our process would’ve been missed via a traditional CV screening.
We also found that work samples addressed the lack of female representation in senior roles. Roughly 50% of candidates who applied for senior roles (through our platform) were women and roughly 50% of those who were hired were also women.
When it comes to the interview stage, meeting candidates face to face will always result in an unavoidable degree of bias.
Given that women tend to be viewed more negatively for showing the same attributes that we praise when shown by men, we obviously want to minimise this bias as much as possible.
Similarly to the screening stage, the most success is had when changing the process itself, rather than individuals within it.
Structured interviews entail asking all candidates the same questions in the same order.
We recommend using work sample-style questions at the interview stage too.
Rather than asking candidates about a time when they did X, ask how they would do X if it were to happen.
Remember: we’re testing for skills learned through experience, not experience itself. Since we know that women are less inclined to self-promote than men, the fairest, most accurate way to interview is to simply have candidates show their skills, rather than tell you about them.
Here’s an example question used for the same Community Lead role as above:
There are a lot of businesses that claim to have a community. What does a successful community look like to you? How would you measure it?
Skills tested: Data-driven, Community Knowledge
Questions like this give you an opportunity to see how candidates would think and act should they get the job, without requiring them to be overly confident or to talk up their skills.
Scoring criteria is absolutely essential for fairer, more gender-diverse hiring outcomes.
For every work sample and interview question, you’ll need to give yourself a scale and criteria against which to score answers.
At Applied, we use a simple 1-5 star scale - writing a few bullet points detailing what a good, mediocre and bad answer would include.
Take a look at this example question and its corresponding criteria...
What is your favourite SaaS website and why? How does it encourage inbound leads to get in touch (calls to action, sign-ups, chatbots) and how do they do a good job of this?
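To make the format concrete, here’s a sketch of how a 1-5 star rubric for that question might be structured. The criteria text below is our own invented illustration of the good/mediocre/bad bullet-point format described above, not Applied’s actual rubric:

```python
# Sketch of a 1-5 scoring rubric for a single work-sample question.
# The criteria bullets are illustrative, not an actual rubric.
rubric = {
    "question": (
        "What is your favourite SaaS website and why? "
        "How does it encourage inbound leads to get in touch?"
    ),
    "scale": (1, 5),
    "criteria": {
        "good (4-5 stars)": [
            "Names a specific site and concrete calls to action",
            "Explains why those tactics convert visitors into leads",
        ],
        "mediocre (2-3 stars)": [
            "Names a site but describes its tactics only vaguely",
        ],
        "bad (1 star)": [
            "No specific example, or no link to inbound leads at all",
        ],
    },
}

for band, bullets in rubric["criteria"].items():
    print(band)
    for bullet in bullets:
        print(" -", bullet)
```

Writing the criteria down *before* reviewing any answers is the important part - it stops reviewers from retro-fitting the rubric to a candidate they already like.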
The more you do to remove ‘gut instinct’ the fairer your process will be.
The purpose of the scoring criteria is to make sure candidates are being assessed against something objective and that each of their answers is equally weighted (we have a tendency to look for reasons to keep liking someone once we’ve decided we like them).
When it comes to scoring answers (both work samples and interview questions) we strongly recommend having three reviewers.
We do this due to a phenomenon known as ‘crowd wisdom’ - the general rule that collective judgement is more accurate than that of an individual.
Ideally, you’d have three different colleagues score each round (e.g. screening, interview round 1, interview round 2).
By using multiple reviewers, individual biases will be averaged out by the other reviewers.
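As a simple illustration of how that averaging works in practice, here’s a sketch that combines three reviewers’ 1-5 star scores per question into an overall score per candidate. The candidate names and scores are made up:

```python
# Sketch of averaging three reviewers' scores so that individual
# biases cancel out. All names and scores are invented.
from statistics import mean

# scores[candidate][question] -> one 1-5 star score per reviewer
scores = {
    "candidate_a": {"q1": [4, 5, 3], "q2": [3, 4, 4]},
    "candidate_b": {"q1": [2, 3, 3], "q2": [5, 4, 4]},
}

def candidate_average(question_scores):
    """Average each question across reviewers, then across questions."""
    return mean(mean(reviews) for reviews in question_scores.values())

ranked = sorted(scores, key=lambda c: candidate_average(scores[c]), reverse=True)
for candidate in ranked:
    print(candidate, round(candidate_average(scores[candidate]), 2))
```

Note that one reviewer’s outlier score (the lone 5 or 2 above) moves the final average far less than it would if a single person were scoring alone - which is exactly the ‘crowd wisdom’ effect at work.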
The more diverse the panel, the more objective the scores. So if you’re aiming to hire more women, make sure these panels are gender-diverse.
Culture fit can be tricky.
How fair your assessment of culture fit is depends on how you define ‘culture.’
If you see culture as something fixed and to be protected, then you could run into trouble when attempting to attract/hire more women into your ranks.
If your company was founded and initially staffed by men, then you might find that your culture leans towards being ‘masculine’ - and so hiring women who ‘fit’ this will be difficult.
Interviewers are also influenced by ‘looking-glass merit’ - managers assess a candidate’s future job success based on how closely the candidate mirrors their own life experiences.
If interviewers are predominantly male, you can imagine what sort of outcomes this might result in.
To avoid the pitfalls associated with culture fit, you could switch to ‘culture add’ (an assessment of what people can add to your culture) - but we decided to move away from culture/personality altogether.
Instead, we test for mission and values alignment.
All we want to know is: are candidates as passionate about de-biasing hiring as we are and are they aligned with our team values?
Data collection and tracking are critical for improving diversity of any kind. If you’re not tracking this, you can’t be sure that you’re making improvements or that your process is fair.
If your goal is to attract and hire more women, you need to know how many women enter at the top of the funnel, and how they progress through the hiring process.
Once you can see how candidates move through the process, you can monitor scores for any questions or stages that cause a disproportionate drop-off.
Here’s how we map the hiring funnel in-app (we have charts for each metric of diversity):
It may be that one of your interview questions favours a stereotypically masculine approach (corporate hiring processes in the U.S., for example, have been shown to prefer a masculine style of leadership).
Tracking diversity data will allow you to home in on specific questions and ensure no particular group has an unfair advantage or disadvantage.
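As a rough sketch of what this tracking involves, here’s how you might compute stage-by-stage pass rates broken down by gender and spot a disproportionate drop-off. The stage names and numbers below are invented for illustration:

```python
# Sketch of funnel tracking: candidate counts per stage, split by
# gender, with pass-through rates between consecutive stages.
# All numbers are invented.
funnel = {
    "applied":     {"women": 120, "men": 180},
    "screened":    {"women": 40,  "men": 90},
    "interviewed": {"women": 15,  "men": 30},
    "hired":       {"women": 3,   "men": 5},
}

stages = list(funnel)
for prev, stage in zip(stages, stages[1:]):
    for group in ("women", "men"):
        rate = funnel[stage][group] / funnel[prev][group]
        print(f"{prev} -> {stage}: {group} pass rate {rate:.0%}")
```

In this made-up data, women pass the screening stage at a noticeably lower rate than men (33% vs 50%), which would flag the screening questions as the place to investigate first.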
To start collecting this data, try adding a form at the start of your application.
Understandably, candidates may be sceptical about sharing this sort of information, which is why it’s vital to explain what this data is being used for (to ensure the fairness of the process) and that it will only ever be used at an aggregate level.
Transforming the way you hire is no small feat.
Right now, only 5% of leadership positions in the tech industry are held by women. If we want to change this, we have to do more than just raise awareness…
Just telling people to be less biased doesn’t work. We have to challenge the hiring process itself to achieve tangible results.
A big thank you to our partners at Applied for collaborating with us on this topic. We hope that you can take these actionable points forward and use them in your own hiring practices - help us #ChooseToChallenge the hiring process for women in tech.
Sign up to the live Q&A webinar on Tuesday 9th March.