One of the points the author makes is, I think, worth expanding upon: that there is no clear path for entry into a career as a developer.
Right now, CS covers such a broad range of problems as to be basically useless as an indicator of competence. For some people, their CS education is glorified scripting and enterprise IT work. For others, it’s quite theoretical, bordering on applied math. For still others, it’s software engineering and project management. And for yet others, it’s computer and electrical engineering. Or web app development.
This being the case, it’s really hard to tell new folks “Oh, you want to do that, you should start here” where here is anything other than “Go fuck around with computers for a decade and come back to me when you’ve been exposed to enough that you’ve pattern matched on what you fancy.”
Worse, because of the social biases that the author alludes to (which kinda do need more exploration elsewhere, and I’d love to read them), nobody outside of our field(s) really bothers to make a distinction between “Oh, you’ve implemented a functional language” and “oh, you can get my printer on the wifis”. :\
I only have my perspective on the issue (I can program; I haven’t taught anybody how to program), but I don’t like the claim that someone’s affinity for programming is innate. It’s a very comfortable, fixed-mindset approach that protects everybody involved (students, instructors) from any sort of blame. It’s easy to discern between groups of students who have this ‘natural aptitude’ for programming, but I don’t think it’s fair to say that this aptitude can’t be acquired. There are many cases of people who failed math and science in school and went on to make incredible contributions in those fields later in life.
That said, I do agree that the vast number of beginner programming courses are unlikely to make up for the expected skills shortage in CS jobs. I think that those introductory courses filter for people with the ‘natural aptitude’ for programming, rather than teach people without the natural aptitude how to acquire it. I think that people without this aptitude can acquire it by learning how to think like a computer. Given a group of people, a set of instructions (e.g. you can ask one person how much change they have in their pockets), and a goal, such as “Find the person with the most change in his pockets”, a person will figure out to ask each person in the group, and keep track of the maximum until every person has answered. This type of problem can be translated to pseudocode, and then translated into a programming language. You can teach somebody assignment, conditional branching, and loops without a computer, and then have them try to apply it to a programming problem.
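The change-in-pockets exercise maps directly onto a simple accumulator loop. A minimal sketch in Python (the names and amounts here are made up purely for illustration):

```python
# Each entry pairs a person with the change in their pockets
# (hypothetical names and amounts, just to make the exercise concrete).
pockets = {"Alice": 0.75, "Bob": 1.40, "Carol": 0.35}

# "Ask" each person in turn, keeping track of the maximum seen so far --
# the same procedure a person would work out for the group exercise.
richest = None
most_change = -1.0
for person, change in pockets.items():
    if change > most_change:
        richest = person
        most_change = change

print(richest, "has the most change:", most_change)
```

Each statement corresponds to a step someone could perform without a computer: assignment (recording the running maximum), conditional branching (is this amount bigger?), and a loop (repeat for every person in the group).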
I think the fact that people without the ‘natural aptitude’ for programming can cook a meal for themselves, or devise a multiple-step plan for performing a series of complex tasks (go to store, buy groceries, go home, cook meal) should be evidence enough that anybody can learn to program. I think the problem is that some people have difficulty crossing a particular barrier, whether it’s finding analogues between CS concepts and objects in reality, formulating a solution in terms of a restricted set of instructions, or trying to express the instructions in terms of statements and expressions in a particular programming language.
That said, while anybody might be able to learn to sing or run fast, most people won’t become professional singers or runners. It’s possible that professional programming will continue to be a career for people with the ‘natural aptitude’ for it, but I don’t think it’s fair to say that the field is limited exclusively to people based on some innate factor.
I’m curious to hear other people’s opinions. Does anybody with experience teaching programming, or anybody who has tried to learn programming and didn’t succeed, have anything to add?
I think part of the argument is that lots of coders-in-training (including people with degrees in CS fields) lack skills that employers are looking for, such as project management and effective use of version control systems. People may be able to comprehend program structure and flow, but when you bring higher-level concepts like objects and compilation into the mix, I think most people just don’t have the patience to work through it all.
This whole article is just a series of bad and uninformed arguments. I don’t even fully understand what the article is ranting about. Should nobody learn to code because it’s hard and they may not be good at it?
First, I’m not so sure programming (or coding, software development, or whatever you want to call it) is any different than any other profession. Some people will be better at it than others, but there’s still a lot of room for mediocre and below average people. Not everybody needs to be a “rockstar” or “ninja” or whatever trendy word startups are using nowadays.
Second, I’m not sure any of the “learn to code in [some short period of time]” type programs are really aiming to train people to be professional developers in a day, a week, or a year, but that doesn’t mean the programs are useless. There’s a lot of low hanging fruit in computer software, and a lot of gains to be had by “amateurs” writing their own programs to automate bits of their life. Read some books or articles from the 70s and 80s, and most of them are written from the perspective that in the future most average people will interact with computers and write (at least a little) code to get things done more efficiently.
Third, a lot of self-selecting CS students sucking at writing code doesn’t say anything about how well people in the general population can code. It’s entirely possible that the personality traits that make people select a CS major make them particularly bad programmers. I don’t know, the author doesn’t know, and I don’t think anybody’s studied it.
Fourth, being self-taught at something doesn’t mean a person can’t suck at that thing. I’ve seen a lot of self-taught people waste a lot of time doing things poorly or inefficiently in fields as diverse as writing software, skiing, bike repair, cooking, etc. I’d even say self-taught people are more likely to be incorrect or do things wrong than people who’ve gone through some kind of formal training.
The examples of Googling how to become a doctor or lawyer are just stupid, IMO. Steps like “Complete a medical residency” are even more vague than “Learn a programming language.” He’s also ignoring the fact that a lot of people who go to school for medicine, law, and other professions end up being bad at those professions and often end up doing something else.
CS degrees are definitely all over the place on what they teach and how well they teach it, but it’s still a young profession. Doctors, lawyers, and some of the engineering disciplines have been around hundreds of years (if not thousands of years), and they’ve had a lot more time to figure out how to train and educate newbies. I’ll be more worried if the degree is still all over the place in 50 or 100 years.
While I agree that learning to code is hard, I’m not convinced by the professional-body approach; an exam to separate the wheat from the chaff just weeds out those who are not good at exams.
The ability to continue to learn is really important in any profession. Learning and teaching are not easy, and there is a lack of evidence on how to make teaching effective. In medicine they have double-blind trials to prove whether therapies are effective; no such thing happens in education, so we have no idea what actually works…
I’d imagine people said the same sort of things about algebra, or even literacy.
The bottom line for learning any skill is that everybody has got to start somewhere, so let’s make sure that everyone has the opportunity for a starting point. Everything else is marketing, which is a necessary evil in success in any worthwhile endeavor.
This relays an important but socially unacceptable message: not everyone can do what we do, and the percentage of people who can do it well is rather small, and we ought to be better at converting that into social status.
Professional athletics are the same way, in terms of low-frequency natural abilities having a major impact. No one would pay a dime to see me play professional football… except perhaps for my enemies, in order to see me humiliated. I wasn’t born with the genes for it; 99% of us weren’t. Hence, the people who are able to become very good at it (through a combination of natural ability and effort) get paid a lot and enjoy high status in society. No one says, “Learn to play professional football: it’s easier than you think”, and it’s not socially unacceptable to admit that the vast majority of people could never get to the point of being paid to play a sport, because it’s the obvious truth.
Yet programmers allow MBAs to think that they could do our jobs if they put in the time to learn our skills, while we couldn’t do theirs– and, of course, the truth is the absolute reverse of that. This lowers our status, and it floods the market with unskilled, negatively-productive commodity programmers. Unfortunately, MBA Culture has such a hard-on for headcount– managing a team of three elite people who do the work of 100 is deemed less impressive than having “an organization” with 75 people doing the work of five– that this shows no signs of abating. I don’t see the open-plan offices full of untrained brogrammers whose parents dropped $15,000 on a bootcamp to get them to move out of their basements going away, any time soon, because that’s just what startup culture is, these days.
Here’s what I consider to be the painful (from a feel-good liberal perspective) truth. Learning “to code” isn’t hard, but that’s not what makes good programmers worth their salt. We’re problem solvers. We can think through a problem and solve it. We recognize that the compiler doesn’t hate us, we know how to design interfaces, we can apply knowledge from one discipline or problem type to another, and we can think about systems in a reliable, methodical way. That doesn’t come naturally to anyone, and some people become fluent in it while others don’t. Is it the only meaningful kind of intelligence? No. There are several “intelligences” that matter in human society. Is it very important for what we do, and rare (2-4 percent) at the levels necessary to do it well? Yes. I don’t know whether it’s innate, epigenetic, environmental, or otherwise. People who challenge themselves tend to go up in this kind of intelligence with age, and people who don’t, decline; so it seems to have some plasticity. “Need for cognition” seems to be a lifestyle trait that can improve a person’s often-thought-innate intelligences (“IQ” or “g”) between childhood and midlife. What I do know is that it’s not something that can be acquired in a few weeks. Maybe 5 years, and it takes so much work and lifestyle change that in the best case it’s comparable in difficulty to losing a lot of weight and keeping it off: i.e., hard as hell but doable.
The upshot of this is that we can use this information, as much as it offends social acceptability to admit that those-who-can in this field have a rare ability that is either innate or extremely difficult to attain. I’m not exactly sure how to use it– most of us suffer in companies run by MBA-type founders and executives who think they could do our jobs, which is so ridiculous that it hurts– but all information has an angle.
I think that this article raises some interesting questions, but I take some issue with:
“The current fad for short learn-to-code courses is selling people a lie and will do nothing to help the skills shortage for professional programmers.”
If you believe that some quality enabling one to become a professional programmer is unevenly distributed, then, unless you also believe that the systems we currently have in place for finding those people are functioning optimally, it seems to me that exposing more people to a selection process is exactly what needs to happen.