On diversity, bias and decision-making


Biased language: how an algorithm revealed gender preference in Sortable’s job postings

Until recently, few Canadian organizations were expected to publish diversity statistics or take on long-term programming to support an inclusive workplace. Collecting these statistics is labour- and resource-intensive. And in Canada, the laws governing how employers can collect, use, and publicly disclose diversity data differ from US legal standards due to privacy legislation. This recent CBC report on diversity in tech offered a bleak outlook on the industry, but it was also potentially hamstrung by disclosure practices in Canada.

At Sortable, we’re constantly reviewing and improving our hiring processes to be a more inclusive organization. Part of that is pulling back the curtain to show the decision-making process, warts and all. Here’s one recent example of how we’ve approached — and worked to correct — institutional bias in a company of 80 employees.

Decoding language in our job postings. Or, how we might have scared away female devs.

We have a Slack channel called #sortabledev. It’s a place for devs to post what they’re reading, watching, and consuming. A few weeks ago, someone posted a link about a local developer who assessed the gendered language on Communitech’s tech job board. One of our developers plugged in Sortable’s postings.

Here’s how Sortable’s ads were evaluated at the time:

Partner Coordinator: feminine-coded
Engineering Manager: masculine-coded
IT System Administrator: strongly feminine-coded
Software Developer: strongly masculine-coded
Technical Product Manager: masculine-coded
Digital Marketer: strongly masculine-coded
Manager, Publisher Consultants: neutral
Partner Account Manager: strongly masculine-coded

According to the algorithm, coordinator and administrative roles skewed toward encouraging female applicants, while manager titles skewed male.

This set off a half-day discussion in the channel. The titles and job descriptions hadn’t been created by any one person; they’d been looked over by several pairs of eyes, and we had assumed they were pretty reasonable. In particular, it was disheartening to see Software Developer coded as strongly masculine, with certain words, like ‘challenge’, repeated throughout:

Software Developer
Masculine-coded
5 Masculine-coded words: determining, compete, principles, challenge, leadership
4 Feminine-coded words: supportive, response, supporting, collaboration
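
For the curious, decoders like this one typically follow Gaucher, Friesen, and Kay’s 2011 research on gendered wording in job ads: they scan a posting for words drawn from two fixed lists of gender-coded stems and classify it by the balance. Here’s a minimal sketch of that approach in Python; the stem subsets and the cutoff for ‘strongly’ coded below are illustrative assumptions, not the actual tool’s lists.

```python
import re

# Illustrative subsets of gender-coded stems (following Gaucher,
# Friesen & Kay, 2011); the real decoder uses much longer lists.
MASCULINE_STEMS = ["determin", "compet", "principle", "challeng", "lead"]
FEMININE_STEMS = ["support", "respons", "collab", "share", "understand"]

def coded_words(text, stems):
    """Return the words in text that begin with any gender-coded stem."""
    words = re.findall(r"[a-z]+", text.lower())
    return [w for w in words if any(w.startswith(s) for s in stems)]

def decode(posting):
    """Classify a posting by its balance of gender-coded words."""
    masc = coded_words(posting, MASCULINE_STEMS)
    fem = coded_words(posting, FEMININE_STEMS)
    diff = len(masc) - len(fem)
    if diff == 0:
        return "neutral", masc, fem
    lean = "masculine" if diff > 0 else "feminine"
    strength = "strongly " if abs(diff) >= 3 else ""  # illustrative cutoff
    return f"{strength}{lean}-coded", masc, fem

label, masc, fem = decode(
    "You will be determining priorities, competing on principles, and "
    "taking on a leadership challenge with a supportive, collaborative team."
)
print(label)                                      # strongly masculine-coded
print(len(masc), "masculine-coded words:", masc)  # 5 matches
print(len(fem), "feminine-coded words:", fem)     # 2 matches
```

Matching on stems rather than whole words is what lets a single entry like ‘challeng’ catch ‘challenge’, ‘challenges’, and ‘challenging’ alike.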

But unbiasing language is very hard. Some devs tried rudimentary word exercises to illustrate the bias and found they couldn’t come up with words separated from their historical context. There was also an interesting perspective on our coding challenge itself: the very fact that we called it a ‘challenge’ may have inadvertently created barriers for both genders.

Andy, an engineering manager, participated in the conversation and took it back to the hiring committee (these folks, last seen here) to revise our postings for gender-balanced language as part of their ongoing hiring-process improvements. Rather than expect candidates to be excited solely because we framed it as a challenge, we revisited why we include a challenge at all.

(We now call it a take-home project, and we’ve expanded on the reasoning and benefits. It goes like this: we ask for a project so potential team members aren’t stuck in a live interview trying to write code with examiners breathing down their necks. That also mitigates some of the performance anxiety and tension a live setting can cause. Candidates can work on it in their own time, at their own pace, and we get a much more accurate view of the work they’re capable of.)

Here are some additional practices we’ve adopted. It’s not a comprehensive list:

  • Our job descriptions going forward will be functionally neutral on gender bias: we can’t avoid gender-coded words entirely, but we can make sure the balance doesn’t lean heavily in either direction.
  • As mentioned in this hiring discussion, we review take-home projects blind. Submissions are anonymized and then reviewed by a rotation of developers. Once the results are compiled, they’re de-anonymized and passed back to the recruiting team. (A rough sketch of this flow follows the list.)
  • Sortable supports, sponsors, or provides scholarships to conferences, peer groups, and meet-ups that encourage diversity in tech and STEM, including Technical Chats for Women Peer2Peer, Scala Up North, and Star*Con.
  • We hold a local tech startup event called Startups and Beer, which raises money for local community initiatives. We’ve partnered with incredible organizations that deal with gentrification (The Working Centre), violence against women and children (Women’s Crisis Services), and inclusion programs (Extend-A-Family WR). This gives the tech community exposure to non-tech sectors and services.
  • We commit publicly to improving with our Sortable for All statement and with active participation in diversity peer groups.
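
On the blind-review point above, here’s a rough sketch of what that flow can look like, assuming a simple ID mapping held by the recruiting team; the helper names and data here are hypothetical, not our actual tooling.

```python
import secrets

def anonymize(submissions):
    """Replace candidate names with opaque IDs; keep the mapping aside."""
    key, blinded = {}, {}
    for candidate, project in submissions.items():
        anon_id = secrets.token_hex(4)  # e.g. 'a3f9c2d1'
        key[anon_id] = candidate
        blinded[anon_id] = project
    return key, blinded

def deanonymize(key, reviews):
    """After reviews are compiled, attach them back to candidate names."""
    return {key[anon_id]: review for anon_id, review in reviews.items()}

# Recruiting holds the key; reviewers only ever see the blinded dict.
key, blinded = anonymize({"Candidate A": "project.zip",
                          "Candidate B": "project.tar.gz"})
reviews = {anon_id: "review notes ..." for anon_id in blinded}
print(deanonymize(key, reviews))
```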

This is where the conclusion would be, if there were a decisive conclusion.

Right now some of these changes feel fairly subtle, but we’re betting that we’ll look back in a few years’ time and wonder how we were so naive about all this. We’re committed to keeping the conversation going internally around unconscious bias, and to continuing to iterate on hiring and training. There is no magic-bullet solution.

Sortable for All

We believe in hiring smart, nice, curious, and productive people. We believe that diverse perspectives and experiences are a competitive advantage. And that the biggest bias humans have is believing they have no bias. At Sortable, we foster inclusion and diversity in our teams, from improving our hiring processes to constantly re-evaluating our interview experience.

Above all, at Sortable we believe that our workplace should reflect the world we want to live in, and that the potential for greatness has nothing to do with culture, social or economic background, ethnicity, gender, or orientation.
