Jeff Atwood - Why Can't Programmers Program?

Jeff Atwood, who co-founded Stack Overflow among other things, wrote a blog post arguing that most programmers can't program.  A lot of people have accepted its conclusions at face value (eg, in this Quora post), but the argument has some serious problems, so I decided to weigh in as well.

The article argues that most people applying for programming jobs can't program.  There are three possible explanations for that (all of which might be partly true):

  1. The statistics underlying the article are poor
  2. Interviews are very different from programming
  3. Some programmers don't know how to program

I'll explore each in order.

The statistics underlying the article are poor

Assume we have 100 candidates.  90 are strong, and 10 are weak.  They're applying for jobs at 10 companies.  A person will keep interviewing until they get a job or until they have exhausted their options.

The strong candidates interview at one company, get a job, and don't interview again.  Now, the industry sees a total of 90 good interviews.

The weak candidates keep getting rejected, so each one interviews at all 10 companies (10 candidates × 10 companies = 100 bad interviews), and then 6 months later they do the whole circuit again.  After that second round, the industry has seen a total of 200 bad interviews.

In other words, in an industry where 90% of the candidates are good, it wouldn't be too hard for two thirds of your interviews (200 out of 290, about 69%) to be with bad candidates.
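If you want to play with the assumptions, here's a minimal sketch of that arithmetic in Python.  The candidate counts, number of companies, and re-interview behavior are just the toy numbers from the model above, not real data.

```python
# Toy model from above: 90 strong candidates, 10 weak candidates, 10 companies.
# All of these numbers are illustrative assumptions, not measurements.
strong_candidates = 90   # each interviews once, gets hired, and stops
weak_candidates = 10     # each interviews at every company, then repeats 6 months later
companies = 10
rounds_for_weak = 2      # the initial circuit plus the repeat half a year later

good_interviews = strong_candidates                          # one interview per strong candidate
bad_interviews = weak_candidates * companies * rounds_for_weak

total = good_interviews + bad_interviews
print(f"good interviews: {good_interviews}")                 # 90
print(f"bad interviews:  {bad_interviews}")                  # 200
print(f"share of interviews with weak candidates: {bad_interviews / total:.0%}")  # 69%
```

Tweak the numbers however you like; as long as each weak candidate sits many more interviews than each strong one, weak candidates stay heavily overrepresented in the interview pool.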

There are plenty of problems with those assumptions (eg, good candidates might try interviewing at a bunch of different companies, and bad candidates might get exhausted and give up before their 20th interview), but some factors cut the other way: some good candidates take an internship and then don't even have to interview to get the full-time offer.

Interviews are very different from programming

Here are a few differences between the real world and interviews:

  • In the real world, I'm easygoing.  In an interview, I am stressed.
  • In the real world, I like long, descriptive variable names.  In an interview, it would take forever to write those out on a whiteboard.
  • In the real world, if I have a hard problem, I'll probably talk it over with my teammates or Google it to see if it has been solved before.  In an interview, if I have a hard problem, I'm expected to solve it all by myself even if that means reinventing the wheel.
  • In the real world, software engineering (good coding style, good modularity, unit testing, etc) is often more important than being able to come up with the perfect algorithm on the spot.  In an interview, folks usually ask questions that emphasize theory and coding something up quickly rather than coding it up well.
  • In the real world, I program in my text editor, so it's easy to move code around, change variable names, add a new line between two other lines, and refactor duplicate code.  In an interview, I'll probably be programming on a whiteboard, so I can't do any of that.  People think differently about a problem when they're out of room and at the bottom of a whiteboard than they do when they have enough room.
  • There is also the phenomenon of embedded cognition.  My knowledge as a programmer comes to me more easily when I have my normal programming toolset in front of me than when I'm at a whiteboard.

These differences are important.  One of my coworkers wanted practice interviewing, so I did a mock interview with him.  He cranks out more code than I do, and it's all well tested, modular, and so on.  He's a great software engineer.  However, he answered my interview question fairly poorly.

Anecdotally, I have interviewed dozens of people, and I have only encountered about four where the interview demonstrated that they were a bad fit (eg, one candidate didn't indent anything; another just gave up on the problem).  A few demonstrate that they're really well prepared for the interview and don't make any mistakes.  But most candidates do moderately well -- they make a few mistakes, but they're all reasonable mistakes to make, and there aren't any red flags indicating that they'd be unprepared for a programming job.

I suspect that much of the negative perception of interview candidates comes from bad interviewers.  The interviewer has thought about their problem a ton, so the solution is obvious to them, and any mistake a candidate makes is glaring.  I have looked at interview packets where an interviewer gave a candidate a bad score because, even though they got the correct solution, they didn't seem confident.  I have looked at interview packets where an interviewer gave a candidate a bad score because they confused size() and length() or forgot a semicolon.

Being a good interviewer is really hard.

And let's not even get into the rat's nest of phone interviews, where connection quality can hurt how well you do, or interviews where the candidate is great but the interviewer misunderstands them because the candidate and the interviewer don't share a native language.

Some programmers don't know how to program

There are clearly some programmers who don't know how to program.

I think that much of this problem is due to schools emphasizing different things.  There are many CS programs in Europe, for instance, that emphasize theory over programming, so you get folks who are great with algorithms and can program in Haskell, but who couldn't solve a simple coding question in an imperative language.

You also might have the opposite problem with self-taught programmers.  Without a formal CS education, you might not get much exposure to theory, or you might not pay much attention to style, preferring short variable names and whatnot.

However, by this point, it should be clear that I think the methodology supporting the argument that programmers don't know how to program is extremely suspect, and that drawing conclusions from a demonstrably poor methodology is a problem in itself.  When anecdotal observations yield a surprising conclusion, it's a good idea to verify that the observations are reasonable.