Malcolm Gladwell's book Blink: The Power of Thinking Without Thinking is a fascinating study of decision making and how the availability of information affects the accuracy of those decisions. Over the last couple of years, I've occasionally mulled over how this could be applied to technical hiring.
As I described in a previous blog post ('Hiring is Hard'), after analyzing the results of our hiring process at Ettus Research, we made changes that improved the performance of our hiring pipeline - that is, we more accurately identified, earlier in the process, the candidates to whom we were likely to extend offers. Interestingly, we did this with less information.
One of the case studies in Blink is similar in that regard: the heart attack diagnosis procedure at Cook County Hospital. There, having more information didn't lead to more accurate decisions - it merely made the decision makers more confident in those decisions, and the hospital spent more money making them. When they formalized a decision tree that processed less information but was built on data and targeted experience, the accuracy of the diagnoses improved significantly (from 75-89% to over 95%).
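What's striking about that approach is how little it computes. As a toy illustration - the inputs and thresholds here are my inventions, not the actual clinical criteria - a fixed tree over a couple of high-signal inputs might look like this:

```python
# Hypothetical sketch of a small, fixed decision tree in the spirit of the
# Cook County procedure: a handful of high-signal inputs, and deliberately
# nothing else. Input names and thresholds are illustrative, not clinical.

def triage(ecg_abnormal: bool, risk_factor_count: int) -> str:
    """Classify urgency from just two inputs instead of a full workup."""
    if ecg_abnormal:
        # An abnormal reading plus any risk factor is treated as urgent.
        return "high" if risk_factor_count >= 1 else "medium"
    # With a clean reading, multiple risk factors still warrant a closer look.
    return "medium" if risk_factor_count >= 2 else "low"
```

The point isn't the specific rules - it's that the entire decision fits in a dozen lines, which makes it cheap to evaluate and, crucially, cheap to measure.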
So, this got me wondering whether we could apply the same principle to improve the accuracy of our technical hiring process - specifically, whether it would be possible to identify indicators or simple tests that provide information highly correlated with strong potential hires. Things like "Is the candidate passionate about the tools they use?" and "Has the candidate read books about their craft in the last two years?"
For each question I've thought of, though, I've been able to quickly identify very strong engineers I personally know who would answer in the negative. Many of the best engineers I know, for example, are highly passionate about their tools (e.g., editor, window manager, terminal, keyboard), but I can also name several truly brilliant engineers who don't care about them in the least.
So, certainly none of these questions could stand alone, but perhaps, considered together, the whole might yield something useful. Combining them might also mitigate the discriminatory potential some of the questions carry - for example, perhaps a candidate very much wants to be reading books about their field, but has been a single parent working two jobs while learning to code, and quite simply doesn't have the time.
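To make "considered together" concrete, one minimal way to combine such questions is a simple additive score in which no single "no" answer acts as a veto. The question names and weights below are invented for illustration, not a validated rubric:

```python
# Hypothetical sketch: aggregate several weak yes/no indicators into one
# composite score, so no single negative answer disqualifies a candidate.
# Indicator names and weights are invented for illustration only.

INDICATORS = {
    "passionate_about_tools": 1,
    "reads_about_their_craft": 1,
    "teaches_or_mentors_others": 1,
}

def composite_score(answers: dict) -> int:
    """Sum the weights of the indicators answered 'yes'."""
    return sum(weight for name, weight in INDICATORS.items()
               if answers.get(name, False))
```

Whether such a score carries any real signal is exactly the empirical question - the weights would have to come from data, not intuition.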
To be clear, the goal wouldn't be to make hiring decisions based solely on these questions, but rather to use them as high-value input into the decision. And knowing whether this approach has any value at all would require testing the questions and gathering data on their utility - which means tracking not only whether candidates received offers, but also their contributions once on board. In short, a lot of time and money spent on things that are very difficult to quantify.
So, the long and short of it is that I think this is an interesting concept, but that's about it for now. I'd like to think this is more than just an academic curiosity, but who knows. Of course, if you have ideas, get in touch!