1. 47
    1. 21

      I used to work in academia, and this is an argument that I had many times. “Teaching programming” is really about teaching symbolic logic and algorithmic thinking, and any number of languages can do that without the baggage and complexity of C++. I think, if I was in a similar position again, I’d probably argue for Scheme and use The Little Schemer as the class text.

      1. 10

        This is called computational thinking. I’ve found the topic to be contentious in universities, where many people are exposed to programming for the first time. Idealists will want to focus on intangible, fundamental skills with languages that have a simple core, like Scheme, while pragmatists will want to give students more marketable skills (e.g. Python/Java/MATLAB modeling). Students also get frustrated (understandably) at learning “some niche language” instead of the languages requested on job postings.

        Regardless, I think we can all agree C++ is indeed a terrible first language to learn.

        1. 9

          Ironically, if you’d asked me ten years ago I would’ve said Python. I suppose I’ve become more idealist over time: I think those intangible, fundamental skills are the necessary ingredients for a successful programmer. I’ve worked with a lot of people who “knew Python” but couldn’t think their way through a problem at all; I’ve had to whiteboard for someone why their contradictory boolean condition would never work. Logic and algorithms matter a lot.
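A minimal sketch (in Python, with a made-up `is_valid` check) of the kind of contradictory boolean condition I mean:

```python
# A contradictory conjunction: no number can be both
# less than 5 and greater than 10 at the same time,
# so this branch is dead code and the function never
# returns True for any input.
def is_valid(x):
    if x < 5 and x > 10:
        return True
    return False
```

Knowing Python syntax doesn’t help here; spotting that the condition is unsatisfiable is pure logic.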

          1. 9

            I think Python is a nice compromise. The syntax and semantics are simple enough that you can focus on the fundamentals, and at the same time it gives students a base to explore more practical aspects if they want.

        2. 7

          Students also get frustrated (understandably) at learning “some niche language” instead of the languages requested on job postings.

          Yeah, I feel like universities could do a better job at setting the stage for this stuff. They should explain why the “niche language” is being used, and help the students understand that this will give them a long term competitive advantage over people who have just been chasing the latest fads based on the whims of industry.

          Then there is also the additional problem of industry pressuring universities into becoming job training institutions, rather than places for fostering far-looking, independent thinkers, with a deep understanding of theory and history. :/

          1. 3

            I’ve been thinking about this a bit lately, because I’m teaching an intro programming languages course in Spring ‘19 (not intro to programming, but a 2nd year course that’s supposed to survey programming paradigms and fundamental concepts). I have some scope to revise the curriculum, and want to balance giving a survey of what I think of as fundamentals with picking specific languages to do assignments in that students will perceive as relevant, and ideally can even put on their resumes as something they have intro-level experience in.

            I think it might be getting easier than it has been in a while to square this circle though. For some language families at least, you can find a flavor that has some kind of modern relevance that students & employers will respect. Clojure is more mainstream than any Lisp has been in decades, for example. I may personally prefer CL or Scheme, but most of what I’d teach in those I can teach in Clojure. Or another one: I took a course that used SML in the early 2000s, and liked it, but it was very much not an “industry” language at the time. Nowadays ReasonML is from Facebook, so is hard to dismiss as purely ivory tower, and OCaml on a resume is something that increasingly gets respect. Even for things that haven’t quite been picked up in industry, there are modernish communities around some, e.g. Factor is an up-to-date take on stack languages.

            1. 3

              One way you can look at it is: understanding how to analyse the syntax and semantics of programming languages can help you a great deal when learning new languages, and even when learning new frameworks (Rails, RSpec, Ember, React, NumPy, regexes, query builders, etc. could all be seen as domain-specific PLs embedded in a host language). Often they have weird behaviours, but it really helps to have a mental framework to quickly understand new language concepts.

              Note that I wouldn’t recommend this as a beginner programming language course - indeed I’d probably go with TypeScript, because if all else fails they’ll have learned something that can work in many places, and sets them on the path of using types early on. From the teaching languages Pyret looks good too, but you’d have to prevent it from being rejected. But as soon as possible I think it’s important to get them onto something like Coursera’s Programming Languages course (which goes from SML -> Racket -> Ruby, and shows them how to pick up new languages quickly).

        3. 7

          I started college in 1998, and our intro CS class was in Scheme. At the time, I already had done BASIC, Pascal, and C++, and was (over)confident in all of them, and I hated doing Scheme. It was different, it was impractical, I saw no use in learning it. By my sophomore year I was telling everyone who would listen that we should just do intro in Perl, because you can do useful things in it!

          Boy howdy, was I wrong, and not just about Perl. I didn’t appreciate it at the time, and I didn’t actually appreciate it until years later. It just sorta percolated up as, “Holy crap, this stuff is in my brain and it’s useful.”

        4. 3

          I hear this reasoning about teaching tangible skills, but even two or three quarters of Python is not enough for a job; at least, it shouldn’t be. If it is, then employers are totally OK with extremely shallow knowledge.

        5. 1

          I didn’t even realize I had read this a month ago, never mind that I had commented on it, before I wrote my own post on the topic. Subconscious motivation at its finest.

    2. 12

      I wish I had had Python or Scheme as my first language. A mix of C64 BASIC, GW-BASIC, QuickBASIC, and Pascal, followed by C/C++, was an extremely confusing way to learn programming.

      1. 3

        QB and VB6 here. It was a confusing way for me to try to learn C/C++. However, what it (plus Common Lisp) did teach me is that crash/hack-prone software that compiles slowly, breaking mental flow, was just bad design in C/C++. That kept me looking at alternatives, like Delphi, that preserved lightning-fast, less problem-prone development with great GUI support. The 4GLs, more BASIC-like, also taught me that many problems should be solvable with compact, readable code that auto-generates boilerplate.

        These things I learned are still true, with three languages (Go, Nim, and Julia) preserving some of these traits while appealing to mainstream audiences. Go already has massive adoption, too. So the kind of thinking behind languages like BASIC and Pascal just failed for a while before some people, including a C inventor, helped redo and promote it. :)

        1. 3

          Delphi was amazing. It took Microsoft many years for Visual Studio to catch up; I think only in 2003, with C#, could you approach the speed that Delphi boasted in 1996.

          Funnily enough, most of modern front-end web development still struggles to reach Delphi levels of accessibility and power. Though one could say the context, technology and environments were vastly different, Windows in the late 90s was just as ubiquitous as the web is today, and we’re still doing the same shitty applications with buttons, images, event handlers and forms.

          edit: worth noting that C# came from the designer of Delphi, Anders Hejlsberg

      2. 1

        Interesting that you mention those two.

        In the Programming 101 course that I give, I start with Racket (at that level there’s no essential difference to Scheme) and then switch to Python. This gives the student two approaches to compare; functional and imperative.

    3. 7

      I agree. I like to think teaching languages like Java and C++ to newcomers is the reason today’s software world largely consists of “library-based programming”.

      1. 7

        “library-based programming”.

        I see this first hand with students I work with. Whenever there’s a problem the first thing they reach for is a library that claims to solve it.

        Often these things are simple problems, or the library creates more problems than it solves. I wish students were taught to assess a library before dragging it into their projects.

    4. 5

      As I read the comments here I see most of the posters thinking along the lines of “Set a high bar and make them WORK for it because otherwise they won’t REALLY know how to code”.

      OK. If we’re talking about training software developers, these points have great merit. However, what if we’re looking to teach a school child who may never even WANT to write software for a living, but for whom having the basic ideas of programming as conceptual building blocks will accelerate their development as thinking human beings existing in a society where technology runs ABSOLUTELY EVERYTHING?

      For these students, I’d argue that the “Set a high bar” argument sinks like a brick.

      So, as always, broad generalizations aren’t the most useful.

      1. 1

        Scratch and Pyret (with Bootstrap) are making headway with that group.

        1. 2

          Definitely! And Mu and Pygame Zero and and and :)

          The point is that trying to make generalizations about “You should/shouldn’t use tool X to teach programming” is a nearly pointless assertion unless we set some parameters around who we’re educating and what our goals are.

          1. 2

            Good point!

    5. 5

      C++ was not my first language, but it was my second. I taught myself programming when I was 11 and C++ at 13. I learned it just as the first standard was created in 1998.

      Thanks to the new standards, it’s arguably much easier to learn now than when I did it.

      A great language to learn for beginners is Smalltalk, but it will spoil them because the environment is so complete.

      1. 9

        Having high standards and expectations of your tools is a good thing. It forces them to become stronger.

      2. 1

        Thanks to the new standards, it’s arguably much easier to learn now than when I did it.

        I suspect this will depend a lot on how good your teacher is. If your teacher knows ‘modern C++’[1] and provides plenty of guidance to keep you using the good bits, you might be ok. If you’re largely abandoned to explore the language yourself, who knows what horrors you’ll dredge up without knowing. This is a big problem with C++: by maintaining backwards compatibility, it can never actually escape its past.

        [1] I understand there are plenty of people in academia still teaching programming languages as they learnt them decades ago.

    6. 4

      I feel that if a curriculum is actively teaching any programming language, they are already doing the right thing. This article evokes strong feelings and bad memories.

      To put it in context, my university did not teach programming languages at all in their Computer Science curriculum (1994-1997); it was expected that the students would learn on their own time. The courses were taught with examples in whichever language the professor preferred, and the programming assignments were generally expected to be done in a particular language (Fortran 90 for the introductory programming course, C for Operating Systems and Database courses, C/assembly for Computer Architecture, C++ for the Data Structures and Graphics courses). There was no consistency, which was both good and bad: horses for courses.

      Unfortunately, the professor for my Introduction to Object-Oriented Programming course did not know C++, so he used Fortran 77 in his lectures, we were expected to use C++ in our lab assignments, and the exam was Pascal-based blue-book - the course was incoherent and we were all dumber after taking it. I spent many late nights in the Engineering building trying to stay ahead of the lab assignments, as they ended up being the only real basis for a grade.

      Ultimately, I learned C++ from the compiler and a few minutes a week with a graduate student during the lab assignments. I just picked it up again a few weeks ago, and still had some trouble with pointers vs. references and stack vs. heap. I think C++ is a language that needs to be taught, but I’ll agree that it would be traumatic as a first language.

    7. 3

      Suggest that teaching programming is viewed as an applied skill by many, like teaching how to be an auto mechanic with four-cylinder imports, so the choice of C++ is equally bland for wide applicability.

      Suggest a better approach is, from the start, always teaching how to write a good, understandable program. It’s no different from writing a good essay, math proof, or sonnet. What I find is that even those who don’t really want to continue to learn/apply more seem to be versatile enough to “shift language gears” to another, because they recognize “something’s missing”, locate it, and restore the “form”, because that is what was practiced from this “habit”.

      To this end Python has been of great help. It’s got some dumb stuff, like for/while loops with else, so it’s not perfect. But it’s concise enough that it coaxes out a form of obvious style regardless, because beginners will “blow off” style guides (they really don’t see the point, just “make-work”). And if one of my students craps out some Python, all I need to do is pick out another student’s adequate job (maybe four of them); then they get embarrassed and resubmit, because it doesn’t take much to meet appearances.
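For anyone who hasn’t hit the for/else quirk, here’s a minimal sketch (the `find` function is a made-up example):

```python
# Python's for/else: the else block runs only when the loop
# finishes WITHOUT hitting `break` -- which surprises beginners
# who read it as "else = the loop body didn't run".
def find(needle, haystack):
    for item in haystack:
        if item == needle:
            break
    else:
        # Reached only when no `break` occurred, i.e. nothing matched.
        return "not found"
    return "found"
```

So `find(3, [1, 2, 3])` returns `"found"`, while `find(9, [1, 2, 3])` returns `"not found"`.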

      I’ve also taught scientists to forgo IDL (and Fortran) in favor of Python, fighting the institutional bias. Most scientific code is horribly written, which is too easy with such languages, to the point that they even forget why they coded something the way they did as the obtuseness mounts up. It’s annoying for them at first, but the dividends pile up fast, because in proving the science results, clarity of form adds to the confirmation that they are on the right track. The irony is that

      After this, “form” is followed by function - thinking algorithmically. A sense of the machine matters, as does the algorithm itself.

      C and C++ no longer do this well, because the machine’s complexity/variability has grown too high.

      An even worse choice is the language Go. It’s a sloppy way of getting a lot of crap code running quickly/performantly with a less-trained staff; this is exactly what it was intended for. But like Fortran and scientific code, the swampy mess that results may take a while to get into a finished state. It’s as if they took the worst of C/C++ and built on top of that.

    8. 3

      I’d argue that any degree still teaching C++ as a first language mainly shows how out-of-date the degree is.

      I have worked/studied in several universities around Europe and you can clearly see when a syllabus was properly updated or when it was slightly tweaked.

    9. 2

      Who the hell learned C++ as their first language anyway? I thought I was the only one, due to a fluke in 1998 when the AP Computer Science programme for US students decided to use C++ for that one year. I think they used it for one more year and then switched to Java.

      Like, really, is teaching C++ to absolute beginners really a thing? Where? Anyone out there who can commiserate with me?

      1. 1

        Well, I was already a programmer by then, but I took an “intro to programming” class at the college for an easy A… and it used C++. That was 2007.

        But I did C++ early in my programming anyway while self-teaching… but it was my third language, after BASIC on the TI-83 calculator and assembly language….

    10. 1

      One good thing about C/C++ is that you’re closer to making a mental model of how CPU hardware works. I think there’s value in that.

      Of course, by this argument, schools should teach assembly first.

      1. 3

        I agree that a mental model of how a CPU works is important. I disagree that C/C++ is actually close to how hardware works, or that a useful mental model of computers has to be close to the metal.

        C/C++ is as far from how hardware works as pretty much any other imperative language. The distance between C and Java or Python is a lot smaller than the distance between C and actual circuits.

        The machine that C/C++ pretends exists is an abstraction, and that is the machine it is good to have a mental model of. But every other imperative language works on top of that machine anyway. Given that, I think it makes more sense to choose something with a higher signal-to-noise ratio. I had Pascal in college; Python is also a good choice, in my opinion. JavaScript has a worse signal-to-noise ratio (where signal is expressing basic ideas of algorithms and data structures, and noise is specific syntax and semantics), but has the upside of being everywhere there is a browser.

        Update: This is a longer article about this and some other things that says in more (and better) words what I mean.

      2. 2

        C/C++ seems like a good balance between low level assembly and higher level languages that do all the memory management for you (among other things).

      3. 1

        It’s closer than some, but it’s also an abstract machine with its own oddities. I’ve always been a fan of educational languages that get down to the essence of it. So, after they learn computational thinking with an easy language, they can learn more about low-level programming with a simple, imperative language (example) with pointers, modules, and compound types. Teach them pointers, stacks, heaps, and caches with coding examples in both the easy language and the lower-level language. Teach them about temporal errors like use-after-free, along with ways to prevent and/or detect them. Then, follow up with concurrency, mentioning memory models and atomic instructions. Then, parallel programming with basic multicore and SIMD, maybe covering parallel languages. Show assembly for each of these.

        Then, they’ll have a mental model of how CPUs work vs. how C wants them to work. Plus, how to code for them.

        EDIT: Looking at their syllabus, I found out that computational thinking that @mikelui brought up is the first goal they mention. I might have to look into the C0 work more closely.

    11. 1

      One thing that comes to mind reading this post and the comments is that it’s fair to assume that most software developers know jack shit about education.

      Sure, we all learned how to program, lots of us even learned it in college, some of us might even have taught it. But I’d wager very few developers actually studied education at an academic level or even have extensive experience teaching.

      So, maybe we need fewer takes (hot or not) and more proper multidisciplinary research into how to better teach programming and computational thinking at various levels.

      By the way, I’m not saying this post is a ~~hot take~~, I think it’s actually pretty sensible, but it’s not research, it’s anecdotal at best, and so are most comments here (mine included). I guess my point is: we don’t know the answer, and most of us don’t even know the proper questions to ask, so maybe we shouldn’t have strong opinions on this (or at least hold on to them lightly).