1. 13

  2. 5

    Awesome approach.

    I’ve considered this before for hiring and thought a blinded approach would be good to help check those pesky biases we all have: if I’m hiring for a software engineer on my team, I’d ask another engineering team to screen the candidate, and then make a sanitized written transcript and resume for me/my team to review. They wouldn’t provide a thumbs up/thumbs down, just a record of what happened in the interview.

    That way the candidate’s age/race/gender/… should be effectively hidden from me until I’ve made the decision to bring them in for an interview.

    I wouldn’t claim something like this is perfect, and I haven’t yet had the chance to put it into practice (though it’s an idea I’ve pitched several times), but I think it’d actually be a fairly low-friction way to ensure that I’m not rejecting candidates outright for bad reasons before I’ve even met them.

    1. 2

      It sounds like an awesome approach, and I’d like to see it applied in more places, but what do you do when it yields 20 white male speakers for a conference, because that’s the vast majority of people who submit talks?

      1. 6

        but what do you do when it yields 20 white male speakers for a conference, because that’s the vast majority of people who submit talks?

        Courtney Stanton’s linked post touches on this. Whether we’re talking about conferences, hiring, or hacker school, hopefully creating an environment that takes blind applications while also encouraging non-white-male applicants will eventually reverse that trend.

        I’ve asked programmer friends who are women, black, trans, etc. about their experiences. To a person, they tell me it’s an incredibly daunting experience to be in the minority, no matter how friendly their companies try to be. Taking Stanton’s point, it’s no wonder so few women are conference speakers when conferences are basically a white guy’s domain. So in your example, what you’re seeing isn’t that you’re picking the 20 best applicants (who happen to be white males) out of a pool of 200 qualified individuals – you’re picking the 20 best white males out of a pool of 200 white male applicants, because the excellent non-white-male candidates didn’t apply to begin with.

        1. 2

          This approach has been getting some traction at CS conferences lately. My impression (possibly wrong?) is that it’s been much more successful with gender than race, though. Conferences that have put in substantial effort at recruiting a broader set of submissions have succeeded somewhat in recruiting white women, but not as much in recruiting, say, black men or women.

          Probably a lot of reasons for that, including worse starting numbers. But perhaps also social distance: one guess is that white women are, while underrepresented, not as socially distant from the dominant culture. A lot of advertising/recruiting happens informally through “networks”, so if your conference starts out with mainly white men, the likely recruits are those who are in the friends-of-friends network of those men. Those typically are much more cross-gender than they are cross-racial, since in general society is much more segregated racially than on gender lines (in many cities black and white people don’t even live in the same geographical areas, don’t attend the same schools, don’t attend the same churches, etc., while men vs. women mix much more on all those axes).

        2. 4

          I’d say you’ve failed at your goals if you got to that point. The intake shouldn’t be passive; otherwise you won’t achieve your goals.

          http://continuousdelivery.com/2013/09/how-we-got-40-female-speakers-at-flowcon/

          You need to ensure that other people submit talks; the side doing the screening/filtering should be checking this. Then, if you get a 50/50 split in the inputs and the outputs still give you all white male speakers, you might have to either accept that they were the best choices or look at how you can get better applications from applicants. Or possibly something else I don’t know.

          If your inputs are diverse and your outputs aren’t but are anonymized, what SHOULD the response be? Drop 50% of them and enforce your goals? I’m the wrong person to ask this question.

      2. 1

        The other option is to positively discriminate when choosing, e.g. “We need 10 new employees; let’s match the US population statistics, so 5 will be female and 5 will be non-white.” You could look at other factors: 1 has to be disabled, 1 has to be non-heterosexual, 5 have to be not classically handsome, 2 have to be lacking confidence, 2 have to be over 60, 5 have to have the skills but not have been to university/college. You can even overcompensate: employ 6/10 women until the company is 50/50.