by Terena Bell

New technologies take aim at IT’s diversity problem

Feature
Nov 22, 2017
Careers | IT Leadership | Staff Management

From the job description to the workplace, female entrepreneurs are designing new tools to mitigate bias in hiring, company culture and decision making.


If your company is having difficulty establishing a diverse workforce, you can stop blaming the pipeline. That’s the collective message sent by new HR technologies that help organizations remove bias from their hiring practices and identify and retain female talent. Coincidentally, these new tools are all designed by women.

First, there’s Talent Sonar. Founder and CEO Laura Mather says Talent Sonar’s talent acquisition platform removes applicant and college names from resumes to help prevent bias when hiring managers see a typically female first name — like Mary or Linda — or read that an applicant attended a historically black college. It also analyzes the language used in job descriptions to make employers more approachable.
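The general approach — redacting names and schools before a hiring manager sees the resume — can be sketched in a few lines. This is an illustration only, not Talent Sonar’s actual implementation; the name and school lists here are hypothetical stand-ins for the much larger indexes a real product would use.

```python
import re

# Hypothetical lists; a production system would draw on large name and
# institution databases, not a handful of hardcoded strings.
FIRST_NAMES = {"Mary", "Linda", "James"}
SCHOOLS = {"Spelman College", "Howard University"}

def redact_resume(text):
    """Replace first names and school names with neutral placeholders."""
    for name in FIRST_NAMES:
        # Word-boundary match so "Maryland" isn't redacted along with "Mary".
        text = re.sub(rf"\b{re.escape(name)}\b", "[CANDIDATE]", text)
    for school in SCHOOLS:
        text = text.replace(school, "[SCHOOL]")
    return text
```

A reviewer then scores `redact_resume(resume)` instead of the original, so gendered or school-based cues never reach them.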

On what it analyzes, or how, Mather is intentionally vague. But competitor Textio isn’t afraid to get specific. CEO and cofounder Kieran Snyder says her company uses natural language processing (NLP) to score job descriptions for gender neutrality from zero to 100. “[The tech’s] really looking top to bottom at all aspects of how the job post shows up to a candidate. Yes, vocabulary is part of it,” she explains. “Various words and phrases may get highlighted as reaching certain audiences or [being] problematic.” But syntax, the inclusion of equal opportunity statements, and even formatting are involved.

Next, Textio’s augmented writing platform makes suggestions for improvement. “If you want to attract men and women to apply for a job … you want your job post to have a third bulleted content,” Snyder says. Go above half and women won’t apply. Below one-fourth and you lose men. “Is it discriminatory to use a lot of bullets in your job post or to use none at all?” she asks. “No, it isn’t. These things are so below the level of consciousness. You and I could make a theory as to why men or women are more likely to respond to these different types of formatting, … but the reality is, whatever the theory, it’s what happens with the data.”
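The formatting rule Snyder describes — aim for about a third bulleted content, because above half women stop applying and below a quarter men do — lends itself to a simple check. This is a hedged sketch of that one heuristic, not Textio’s scoring model; the thresholds come straight from the figures quoted above.

```python
def bullet_ratio(post_lines):
    """Fraction of lines in a job post that are bullet points."""
    if not post_lines:
        return 0.0
    bullets = sum(1 for line in post_lines
                  if line.strip().startswith(("-", "*", "\u2022")))
    return bullets / len(post_lines)

def bullet_feedback(post_lines):
    """Apply the thresholds Snyder cites: >1/2 deters women, <1/4 deters men."""
    r = bullet_ratio(post_lines)
    if r > 0.5:
        return "too many bullets: may deter women"
    if r < 0.25:
        return "too few bullets: may deter men"
    return "bullet ratio in the suggested range"
```

A real augmented-writing platform layers dozens of such signals — vocabulary, syntax, equal-opportunity statements — on top of formatting checks like this one.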

And the data shows it works: Within six months of sign-on, manufacturer Avery Dennison saw a 60 percent spike in female candidates. After combining Textio with other processes, collaboration tools maker Atlassian hired an incoming engineering class that’s 57 percent women.

Tackling language bias

Mixing layout analysis with traditional NLP is new to HR tech. The rudimentary processing of words and phrases is not, especially when it comes to the language used by applicants themselves.

“Applicant tracking systems … have been around for a very long time,” says Josh Bersin, principal at HR research firm Bersin by Deloitte. “All they do is they look for word matches between the resume and the job description and as they see word matches, they score them high.” In other words, if you’re hiring a computer programmer, legacy HR tech ranks someone with this job title higher than someone who’s called a software engineer.
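The keyword matching Bersin describes is essentially set overlap between the resume and the job description. A minimal sketch of that legacy behavior, showing why a “software engineer” can score zero against a “computer programmer” posting:

```python
def ats_score(resume, job_description):
    """Naive legacy-ATS scoring: fraction of job-description words
    that also appear in the resume. No synonyms, no context."""
    resume_words = set(resume.lower().split())
    jd_words = set(job_description.lower().split())
    if not jd_words:
        return 0.0
    return len(resume_words & jd_words) / len(jd_words)
```

Against the posting “computer programmer wanted,” a resume titled “computer programmer” outscores an equally qualified “software engineer” — exactly the failure mode deeper NLP is meant to fix.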

Functionally, the two are the same. And while there’s no data on whether women hold one title more often than men, there is evidence that — in general — the genders don’t use language the same way. Legacy tech can search resumes for both titles, but it takes deeper NLP to analyze duties performed, descriptions, skills sections and other writing on a resume.

Bersin explains, “If your resume for some reason was written in a way that it doesn’t look like it fits the job, you’re not gonna get a call — even though you might be the smartest guy on the planet.” Note he does not say “the smartest gal,” showing how even the most rote of expressions can seem exclusionary.

To men’s credit, female-driven startups don’t have a complete lock on bias-removal tech: Andres Blank is cofounder and CEO of Scout. Unlike Talent Sonar, which removes gender indicators from a resume, Scout intentionally looks for them. Blank explains, “We index tens of thousands of female first names. We get asked a lot for female engineers, for example.” This allows companies that deliberately seek women to pull their applications to the top.

Re-engineering the workplace

When pipeline actually is the problem, Scout offers a solid fix. But as Talent Sonar’s Mather says, the trick is “to actually get more diverse hires — not just get more diverse people applying.”

To move beyond application numbers, you have to become a place where women want to work — and show them this during recruiting. That’s what InHerSight does. Similar to Glassdoor, the company uses crowdsourcing to rate companies. Whereas Glassdoor ranks generic attributes like “comp & benefits,” InHerSight measures “maternity and adoption leave,” “family growth support,” and other female-focused categories such as “equal opportunities for women and men.” Those scores are then backed by data measuring “company responsiveness,” according to founder and CEO Ursula Mead: “How satisfied are you with how well a company responds when you escalate issues [such as not being able to take maternity leave]? … For us, it’s important to understand how well a company responds when you have a problem.”

Companies such as Ericsson, HubSpot, and others advertise on InHerSight to drive job applications and then use platform responses to find areas where they could become more female-friendly. Mead says, “They’re looking at our data and they’re trying to understand what’s working and what isn’t for the women who work for them and what they need to do to be more attractive, not just to new candidates, but also to the women who are currently at their companies.” As sexual harassment and other policies are adopted, clients use InHerSight to measure their success: Did ratings go up after implementation?

Toward a true meritocracy

For an alternate way to track this, there’s Baloonr, a polling platform cofounded by CEO Amanda Greenberg. Greenberg says in her last job, she saw firsthand how gender bias “negatively impacted the bottom line of the company.” No matter how many women her employer hired, their voices were never heard: “Ideas and feedback were evaluated differently based on whose they were.”

So Greenberg built software that keeps the originator of an idea hidden until management has accepted or rejected it on merit alone. Employees log in to the platform, anonymously provide project input, then up-vote ideas they agree with. Only after an idea’s selected does the system show who it came from. “We create an idea meritocracy by using a unique flow, components of anonymity and randomization,” she says.
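The flow Greenberg describes — anonymous submission, randomized presentation, voting, and reveal-after-selection — can be modeled with a small data structure. This is a hypothetical sketch of that pattern, not Baloonr’s software; all class and method names here are invented.

```python
import random

class BlindIdeaPool:
    """Ideas stay anonymous and are shown in random order;
    authorship is revealed only after an idea is selected."""

    def __init__(self):
        self._ideas = []  # each entry: {"author", "text", "votes"}

    def submit(self, author, text):
        self._ideas.append({"author": author, "text": text, "votes": 0})

    def list_ideas(self):
        # Voters see only the text, shuffled, never the author.
        texts = [i["text"] for i in self._ideas]
        random.shuffle(texts)
        return texts

    def upvote(self, text):
        for i in self._ideas:
            if i["text"] == text:
                i["votes"] += 1

    def select_winner(self):
        # Only now, after selection on votes alone, is the author revealed.
        best = max(self._ideas, key=lambda i: i["votes"])
        return best["text"], best["author"]
```

Because `list_ideas` never exposes authorship, voters can’t discount an idea based on who proposed it — the bias-interruption the next paragraphs describe.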

In this meritocracy, business operations become more effective. Greenberg claims blind idea generation “replaces in-person meetings, shortens meeting times. It speeds up product and project work.” Women no longer have to wonder if gender is why their idea was dismissed. Tools like Baloonr also put an end to “manpeating” — a practice where a female employee’s idea is shot down, just to be accepted after a male employee immediately rephrases it.

According to Greenberg, “There are dozens of types of bias that drive down decision making, that stall innovation and creativity. … What looks like a lack of employee knowledge in the employer’s mind may be a lack of safety in the employee’s.” In other words, if a woman isn’t speaking up, don’t assume she doesn’t have anything to say.

“Bias mitigation is the next wave of productivity,” she proclaims. “Drive an inclusive culture and employees will share ideas. Encourage a speak-up culture and you’ll get a more innovative culture.” You’ll also retain more of those female hires you worked so hard to attract. “Bias has an impact on the bottom line,” she concludes. “Enlightened leaders really understand bias has a negative impact on productivity and innovation. From hiring to retention … in every aspect, groups must eliminate anchoring bias and be deliberate in how they think.”