I think these types of analyses aren’t terribly useful, because they only measure decisions at the end of the interview process. As the article says:
we’re evaluating all of our experiments against our final round interview decisions.
Of course, they do admit:
This does create some danger of circular reasoning (perhaps we’re just carefully describing our own biases).
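To make that circularity concrete, here’s a toy simulation (entirely invented numbers, nothing to do with Triplebyte’s actual data): if final-round interviewers anchor at all on the same signal the screen uses, the screen will look predictive of interview outcomes even when it tells you nothing about the job.

```python
# Toy simulation (invented numbers): a screen score that is pure noise
# with respect to job performance still "predicts" the final-round
# decision if interviewers partly anchor on that same score.
import random

random.seed(0)
trials = 10_000
agree = 0
for _ in range(trials):
    quiz = random.random()                             # screen score; assume it carries no job signal
    decision = quiz + 0.5 * random.random() > 0.75     # interviewers partly anchor on the quiz
    agree += (quiz > 0.5) == decision                  # does the screen "predict" the decision?
print(agree / trials)  # roughly 0.87: far above chance, purely from the anchoring
```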
So until we see the “job performance” results, the analysis is incomplete; those really are the only measurements that matter. After all, when we interview, we’re not doing it to get to a Yes/No, we’re doing it to, y’know, find someone who will actually provide value to the company (preferably a high multiple of their compensation). Of course, measuring “job performance” is an extremely tricky area, so the main outcome I’d be interested in is the longevity of the person hired.
Also, there’s an implicit selection process going on: those who apply to Triplebyte in the first place. Are they representative of the overall population? Who knows?
I guess what they’re trying to do is refine the pre-interview screening process so as to maximize the proportion of positive results from interviews. Which is fair enough: running interviews is time-consuming and distracting, and companies would like to do as few of them as possible. I think that’s their business plan.
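For concreteness, here’s a minimal sketch of that target metric (field names invented, not their actual pipeline): the screen’s precision, i.e. of the candidates it sends onward, how many pass the final round. Note that candidates the screen rejects never generate an outcome at all, which is exactly the selection problem mentioned above.

```python
# Minimal sketch, assuming hypothetical candidate records: the metric is
# the fraction of screen-passers who go on to pass a final-round interview.

def onsite_pass_rate(candidates):
    """Precision of the screen: of those sent onward, how many passed?"""
    sent = [c for c in candidates if c["passed_screen"]]
    if not sent:
        return 0.0
    return sum(c["passed_final_round"] for c in sent) / len(sent)

candidates = [
    {"passed_screen": True,  "passed_final_round": True},
    {"passed_screen": True,  "passed_final_round": False},
    # Screened-out candidates never interview, so no outcome is ever
    # observed for them: the selection effect noted above.
    {"passed_screen": False, "passed_final_round": False},
]
print(onsite_pass_rate(candidates))  # 0.5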
But for the company which is actually hiring, the real risk isn’t a bad interview, but a bad hire. Unfortunately, that’s hard to measure in a sensible time frame.
Flagged, organic advertising for Triplebyte.

As an aside, note that the authors are shocked to discover that candidates do better in technical interviews that demand real coding chops when they’ve first passed an online programming quiz.
In the words of a heroin addict living off of a diet of string cheese: “No shit”.
I’m fine with this kind of advertising and found this story in the process of submitting it myself.
They did a lot of hard work testing hypotheses and reported on the results. Yes, that work is part of a commercial service they provide, but the results are useful to everyone hiring developers and they’re presented informatively rather than in a self-aggrandizing way (“we learned the secret to hiring devs xxx% better!”).
I also disagree that their results were not worthwhile. A lot of research topics that look obvious get knocked down, or only look obvious through hindsight bias; we can’t know unless we run the experiment. I really loved the linked article where a recruiter attempted (pretty roughly, but at least they tested and reported back) to test employers’ abilities to screen resumes for quality candidates. It’s a ubiquitous process with dubious results, and I’m really happy to see it put to the test. Even if the result of a great study is “actually, resume screens are a cheap, highly accurate filter”, I’d be glad to see the standard practice confirmed (once I got over my shock).
I dislike this sort of advertising precisely because it looks useful but isn’t actionable.
Without some idea of what the questions are, what the processes are, what the interviews are, etc., it’s just data wanking. While I may have missed it, I didn’t even see a link to sign up for one of their interviews.
It’s very popular these days to put up these pseudo-informational articles (TED style, grr) that make you come away feeling like you learned something, and yet don’t actually give you any implementation details.