1. 3
  1.  

  2. 3

    Although I think Dijkstra’s version of this rant was better than Joel’s, these articles make the mistake of confusing a small problem (what is good for teaching computer science) with a big one: the traditional educational system is not good for teaching programming.

    In a semester-by-semester educational setting, you get sixteen weeks. During those sixteen weeks, a three-credit course has a total class time of 48 hours, not considering time wasted for course bookkeeping and exams. I am a self-taught programmer; most of the people I work with are self-taught programmers; and even the people I know who have degrees knew how to program before they got to a university.

    Can you imagine being taught to program in 48 class hours?

    Of course, there’s homework and the book, but here the university has another disadvantage compared with simple self-teaching: when you’re directing the exploration yourself, you can push as hard as you need to, whereas a university must make a course passable by a relatively large portion of its students.

    You arrive at the point where the university work is really just a formality, a minimum bound, while the people who are legitimately capable go home and teach themselves anyway, because the educational model cannot cope with the volume and variety of information they can learn on their own. That’s great, and it means your top-level autodidacts are going to become successful, skilled programmers, but turning out five or ten good programmers who would’ve been good programmers anyway doesn’t look good for the metrics.

    So those courses that must be passable? They end up being the baseline for a bunch of barely competent grinders who get through via hours of memorization, cheating, and “Gentleman’s C” grading. This happened with Java, but it would eventually have happened with any teaching language, because administrators decide which programs and courses are on offer, and administrators have a magic number of students they want to pass. Anything that’s “too hard” will disappear unless it’s in a prestige professional school.

    1. 2

      By ‘programming’, do you mean the vocational, CRUD & ETL, logging-and-metrics, case-vs-if body of knowledge, or do you mean one level up, where people are taught the art of taste, sensibility, good design, and so on? It’s not clear that the first thing is the job of a university, which is trying to instill core principles and the ability to learn. The second thing is the job of the humanities, which were devastated in the ’80s and show no signs of being any less trampled upon.

      1. 1

        Both. Neither is taught well by the current educational system. Instead, you get an attempt at computer science – but most college graduates I’ve worked with or interviewed don’t seem to have a good grasp of that, either.

      2. 1

        I learned programming in college, and it took a lot longer than that. There are two entry-level, nuts-and-bolts, braces-and-semicolons classes that focus on getting the computer to do something without worrying about program design. Then there are two more classes on practical data structures and architecture.

        After that come the more serious classes, which expect you to know how to program. My operating systems class was all C/C++ (as opposed to Java), which I think is what Joel is referring to, but that was quite some time after the first intro class.

        1. 1

          True, there’s more than one class – but one class is 48 hours. The rigor of the core courses also varies by institution. Of course, at a top-tier computer science school the rigor will be excellent, but at a “normal” college it will be variable. You might get a fantastic professor for data structures and algorithms, then, for operating systems, get a guy who thinks Windows NT was the beginning of “servers”.