1. 41

When teaching someone programming for the first time, is it better pedagogy for them to learn a lower level language or a higher level language first? Both in terms of understanding and motivating them to continue learning to code.

I was inspired to ask this question after listening to Casey Muratori talk on a podcast about his experience learning Basic and how it was closer to the actual control flow of assembly, with goto <==> jmp (compared with a kid who grows up learning Python today and doesn’t gain that intuition), as well as Bryan Cantrill’s hatred of Logo and his embrace of lower-level programming after he found it more enjoyable. On the flip side, SICP famously uses Scheme and has been tremendously popular. Curious what people here think, and for anecdotes from your lives.


  2. 71

    I think it depends heavily on who you are trying to teach. I strongly believe the goal should be to ignite curiosity and empower someone to follow where that curiosity leads them. For one person, this might be writing some C to run on a microprocessor to read data from a sensor. For another, it might be building a Django app. Figuring that out is the hard part, in my opinion. Once you do that, things will fall into place for a while.

    1. 15

      Adding to that, motivation is always key when learning anything, and working on stuff you want to work on is the best way to be motivated.

      1. 8

        A corollary here is that the easier it is to find an engaging project to write in a language, the easier it is to learn a language. That’s why I feel like the best way to proceed when teaching kids is to start with making a game. Of course, every kid is different; some kids may be able to sustain interest just on the basis of a fascination with logic alone (like Bryan Cantrill in the OP) but if you assume everyone thinks this way going in you’re just going to end up with frustration. Building a game is a starting point that will resonate with a much wider audience!

        1. 3

          Some random musings:

          My first program was to sort the list of games I had; every floppy disk we had contained a bunch of games, something like 10-20 of them. And “I want to play game foo” meant going through the hundreds of disks we had to find it. I numbered the disks and wrote down which games each had, but a handwritten notebook isn’t actually all that much easier, whereas a sorted list is.

          I remember typing ZZZZZZZZZZZZZ at some point; it was slow as hell, probably somewhere between O(omg) and O(fucked). But it did work and solved a real practical problem I had, and 12-year-old me was very proud of it and happy.

          In spite of all the new languages, the internet, a plethora of books and courses, etc. I sometimes feel that getting started is, in a way, harder today than it was when I was young all those years ago. You just don’t really run into these kinds of fairly simple practical problems any more: you’d just use a spreadsheet or something like that today. Of course that’s not a bad thing as such, but it does mean there are far fewer problems to solve. I sometimes wonder if I ever would have gotten into programming if I had been born 25 years later.

          After we got a “proper” Windows 95 PC I stopped programming, as I didn’t really know how to and it was so much harder to get started. The MSX just booted in a BASIC environment and you could program. I didn’t know about Python or Perl and such, and mucking about with Visual Studio was complicated and I didn’t really understand it (it didn’t help that the “Teach yourself C++ in 10 minutes” book I got from the local bookstore was beyond horrible). I got back into it a few years later by making some mods for Unreal and Deus Ex (which resulted in avoiding OOP, and inheritance in particular, for years to come), and after I started using Linux and FreeBSD I discovered Perl and Python and that you didn’t need an expensive (pirated of course) copy of Visual Studio to program on modern machines. Things have certainly improved in that area.

          1. 3

            I have a similar story where I got started on QBasic and loved the immediacy but I hated DOS. When I learned to program a “proper” GUI for my Mac I got so bummed out by the tedium that I stopped programming for years.

            That’s part of why I’m excited about environments like TIC-80. They may never ship with the OS, but they have all the immediacy and “batteries included”, and none of the “faff around with the environment before you can get it working” problems that most beginners face nowadays.

          2. 3

            A corollary here is that the easier it is to find an engaging project to write in a language, the easier it is to learn a language.

            For some people (i.e. me) this can be tricky. My first steps in programming were in Basic on a ZX Spectrum. Once we got a PC, it was natural to continue with QBasic or something like that there. But I actually stopped for a while then. Because while I could write programs, they would need to run in the Basic interpreter. And therefore, they were not “real” programs, like the other ones I had on the PC. And if it was not “real” then I didn’t want to waste time on it. It felt more like making toys, in a playground called Basic.

            Thinking about it now, that was apparently the first manifestation of the programming me that wants to avoid lock-ins and resents unnecessary dependencies.

        2. 8

          This is roughly what I came to say. I dropped out of a C++-focused CS program because the projects weren’t engaging enough to justify suffering through long projects while working nights.

          Later, I crawled through glass to pick up php for some projects that interested me.

          I imagine not everyone has the luxury of finding that driving project to learn through… But I still suspect it’s best to follow curiosity if at all possible?

          1. 4

            Same - I dropped out of my C++ classes; in fact I despised coding until ten years after college. A combination of luck and opportunity changed my life, and I learned Rails. I actively despise the approach in Rails apps now, but I feel like, for that place and time, learning a web framework which forced me to “just build a cool website” was a life-changing and positive experience. I’m now working for a big tech company and life is more or less ok.

            1. 4

              I have a very similar story. Grew up with BASIC, “learned” C++/Java at university, hated that, stopped programming for maybe 5 years, got sucked back in by Python, became a professional programmer nine years after graduating with a CS degree.

            2. 1

              Just great…. C++ is my next class!!!! I still keep asking myself if I enjoy programming or not, I’ve only written one “real” program in my entire life and it was in BASIC back in the Commodore 64 days. I was an avid radio scanner buff (still am to this day) and spent I don’t know how many painstaking hours writing out a database from a copy of Police Call all so I could query my computer vs. look it up in a book!

              I have personally struggled in this space. I’ve spent countless hours and money on books, courses, etc. trying to learn C/C++, but it just doesn’t jive with me for whatever reason, and it seems like I’m not alone. Perhaps it will “click” for me someday.

              1. 1

                I mean, I wouldn’t say it has to be bad.

                It wasn’t the C++ that I didn’t like, it was being run ragged by long debug sessions on things I didn’t care about like a program to play tic-tac-toe. If I had cared about programming or the language for its own sake, that might have changed the dynamic a little.

                ~7 years later (after the PHP project opened my eyes a bit to how code could fit into my projects) the language I ended up really cutting my teeth on was LPC, a C-alike used to write LD/LPMuds. It took a while for things to really click, so it helped to be working on something I found fun.

            3. 5

              Adding to that, it is good to begin solving a real problem from the get-go. Working on an imaginary problem may not spark authentic and sustainable curiosity.

            4. 20

              I’ve thought about this a lot actually, and I think the answer is probably high level at the very start and low level as soon as it’s feasible.

              It can be so empowering just to be able to run a one-line script and see the computer do something, anything at all. I wrote a few medium-sized Ruby programs when I first started, but then I wondered pretty quickly about the internals of all that and felt compelled to learn about C pretty soon after. There’s just so much extra information you need to at least be aware of to understand how and why lower-level languages do what they do and require e.g. header files and boilerplate, but if you’ve got some kind of hook into it (the scripting up front) it can be a lot easier.

              I think the exact structure of a course of learning is highly dependent on the individual and also up for debate in general, but I strongly think “both as soon as possible” is ultimately the correct approach.

              1. 2

                I came here to say similar things!

                I’m on the younger end of Gen X, and my own path followed this sort of approach. The first exposures I ever had to programming at all were with the BASIC language on Apple ][ machines at school and on an 8088-based PC (Tandy 1000 HX!) at home. The school exercises were pretty trivial (but important), and most of my BASIC on the PC at home was copying down and slightly-modifying listings out of 3-2-1 Contact (a print magazine for kids!). My next step after that, by chance, was diving straight down to x86 assembly (in hex no less, I didn’t have a macro/symbolic assembler until quite some time later!).

                I stuck with just x86 ASM for years up through high school: making graphics demos, writing text-mode GUI programs with pop-up dialogs and such, etc. I learned a lot about how to organize that asm code into files, modules, subroutines, etc out of necessity just to manage the complexity, and I learned a lot about hardware. Along the way I was also learning basic analog and digital electronics (first on my own from Radio Shack kits and Forrest Mims books when I was younger, then later it was offered in high school as an elective I took for three years), which also dovetailed nicely with the hardware side of things I was getting from the assembly language and “build your own PC” perspective.

                Near the end of high school, I took another elective course that taught programming in Pascal. I didn’t like it much at the time, but it was valuable! Shortly after that the explosion of the early Internet was happening (mid-90’s) and I was installing early Linux, learning Perl (my first “real” higher-level language) to write CGI scripts, etc. My first exposure to really using OO concepts was implementing them for myself on top of non-OO languages. Working in the *nix/Internet industry and on these types of software eventually led me to needing to do patchwork and bug-hunting in C source code; it came pretty naturally and I eventually developed pretty decent C skills. Of course, over time I was exposed to and absorbed many higher-level languages (e.g. C++, Ruby, Python, etc) and eventually became the sort of language-neutral programmer that’s willing to take a shot at any language on the fly as necessary, if I can find the documentation for it.

                I really think coming at it from both ends in a back and forth kind of process that eventually meets in the middle was critical to whatever successes I’ve had in this field. Starting with a higher-level language for the very first introductory experiences is probably easiest, though!

              2. 18

                IMHO (and in my very limited teaching experience) the most productive approach is:

                • If you’re trying to teach someone just enough programming to know what it means to program a computer, then a higher-level language is sufficient
                • If you’re trying to teach someone programming with a long-term goal of that person becoming a professional programmer, then you should teach both, and I don’t think it’s too relevant which one you start with

                That’s because, hidden under the friendly name of “basic programming”, there are actually five distinct things that you are teaching:

                1. “Algorithmic” thinking (formulating explicit descriptions of computational actions in unambiguous terms – IMHO the “give explicit instructions that a machine can follow” approach is a very bad analogy that sets us back tremendously, but that’s a whole other discussion).
                2. Using formal, non-human languages (i.e. describing computation in a language other than English or the programmer’s native language – a language that is inherently less flexible and more obtuse than spoken languages).
                3. How computers work (i.e. how a machine runs your program, and how it obtains and presents its output).
                4. How to reason about programs (i.e. how to structure your code, how to adequately describe the data you’re using, how to think about solving a problem in terms of data structures and algorithms).
                5. The actual act of programming or “coding” – how to use a text editor, an IDE, a compiler, how to manage projects and so on.

                Thing is, there’s only so much stuff you can teach at once. 1) and 2) are the biggest obstacles, so that’s where most of the teaching effort is going to go at first. 5) is ultimately no different than learning e.g. Word or Excel. While 3) is absolutely required if you’re ever going to write programs worth running, an intuitive understanding of these things (which is fairly common in most developed countries, where children kinda grow up with computers nowadays) is sufficient at first. And if you’re just starting out, and you’re just curious, it’s unlikely that you’ll ever write programs complex enough to hit 4), at least not to a degree that’s relevant enough to be worth spending too much time on it.

                Most higher-level languages tend to place less of a cognitive burden in terms of 5), mostly manage to avoid 3), and while they anecdotally take some discipline for 4), they’re also more forgiving and you have bigger fish to fry when you’re a beginner, anyway. This is a good thing: it enables you to teach people about functions, loops and classes without having to take nasty detours and tell them about the stack and the heap and explain what vtables are. All these things are essential if programming is what you do to put bread on the table, but they are ultimately implementation details – it’s important to understand some of them so that you can then keep up with all the other implementation details our industry keeps coming up with.

                If your long-term purpose is to help someone become a programmer, that’s somewhat different. Much like a broken clock is still correct twice a day, a programmer who doesn’t understand how computers work and can’t reason about programs will occasionally write good programs, but mostly by accident, as their ability to understand what’s relevant for a good program and what isn’t will be very limited.

                How exactly you teach them these things is a whole other story though. Students enrolled in a CompEng program have years to pour into studying these things, and you can generally approach it from several angles at the same time (i.e. you teach more than one class each semester). Finding “the right order” in which to teach all these things is probably NP-complete :). And I don’t think it’s that important or relevant, either. Ultimately, handling things that you don’t completely understand at every level will be part of any programming job, and always has been, really – I doubt that the EDVAC’s programmers truly understood all the murky details of mercury delay lines, for example, as there’s only so much you can get to know about fluid dynamics without eventually becoming an expert in fluid dynamics, rather than computer programming. Dealing with this abyss under your feet is an essential programming skill, too – the earlier you teach it, the better :).

                There’s always a “murky” ground – what about people who just want to learn enough programming to get an entry-level programming job? Unfortunately, most of these “code camp” jobs are in large corporations that insist on demanding university degrees before hiring someone and assigning them high-school-level tasks. I have no good solution for this problem. I think our industry handles this very poorly and most people who want this kind of job would be far better served going into a field that handles it better. Seriously, if you’re considering this kind of career change, becoming an electrician, for example, is far better, not only for your mental sanity, but often in terms of money and opportunities, too.

                1. 11

                  I think it is worth teaching a high-level language first, because the focus is on solving problems, not on the von Neumann architecture. And I think that is the reason most of the courses I took and books I read on algorithms use pseudo-code, not even actual programming languages.

                  Thus, the small victories of solving increasingly difficult problems tend to come more easily in a high-level programming language than in a low-level one.

                  If we compare to other fields, such as natural languages, one also learns high-level concepts first (words, then classifying them as nouns and adjectives, then verbs, and much, much later, grammatical details such as the direct object), and only then goes to the fundamentals of the language. That even happens for languages where the fundamental structure is utterly important to proper communication, such as those with grammatical cases.

                  However, there are situations in which knowing the underlying machinery (what I previously referred to as the von Neumann architecture) helps to solve a problem optimally.

                  In my experience, though, they usually come up later in the learning path of programming.

                  1. 7

                    Perhaps it makes sense to look at the person’s overall experience.

                    I made a mistake earlier in my career of teaching adults programming the same way I learned programming (I have a CS degree; my learning was very much formal computer science covering logic, computer architecture, algorithms, etc).

                    This was a big mistake.

                    Now, I think, the right approach is to evaluate the person’s interests and overall professional or scholastic experience, and integrate those into the overall approach.

                    If a person studied literature and humanities, I would look at programming ecosystems that support the so-called ‘digital humanities’, and select the language and tools that are the ‘highest level’ in that area.

                    If a person is into games, I would look for something in that area.

                    If a person worked in accounting, then spreadsheet-centric systems might be better…

                    and so on.

                    Low-level programming (e.g. at the level where hardware needs to be understood) would suit beginners who have a background in things like CNC machines and other physical equipment.

                    I would call this style of teaching a ‘connected style of learning’; there is probably already a better name for it, though.

                    1. 7

                      Literally whatever the student thinks is cool about computers. If they love the web, start with web technologies. If they love games, start with games. Go after the real thing. Their excitement is the limiting factor, not your choice of material.

                      When you really have an intrinsic drive to learn something, no teacher has to coerce you into learning or threaten you with bad grades. Instead the teacher’s job is just to make space for learning and keep the A’s coming. You want to foster that drive by prompting them to follow their nose, and supporting them wherever that leads.

                      After practice takes hold of them, offer the benefits of theory. Theory expands their creative options for practice. Having practiced, they’ll understand what’s valuable about theory. Then you guide them to stay hungry for both and remain in that virtuous cycle.

                      1. 6

                        You might want to have a look at The Programmer’s Brain by Felienne Hermans, which talks about how newcomers learn and what approach to take.

                        The author has done a lot of work in this area and might provide you with some answers that you’re looking for.

                        1. 4

                          Both ends of the abstraction spectrum lead to some kind of deformation, and such a mindset is harmful if you have to do a job on the other end of the spectrum. To overcome this deformation you have to dig through all the layers of IT and get real experience with various levels of abstraction… this is a long-term journey.

                          Students should start with the level of abstraction that is closest to the job they will do. And then they have the rest of their lives to learn the rest of IT, as they need it.

                          1. 4

                            Personally, I don’t think it’s that important whether you’re teaching C or Python or JavaScript from a language perspective. The languages aren’t that different; a beginner can get going in any of them (except for the really low-level languages like assembly, and I suspect having structured control flow is an advantage).

                            However, I think it’s very important to have simple, interesting input and output. Beginners often don’t want to learn programming for the sake of it; they often want to actually make something. The learning environment should make it easy to draw shapes to the screen and read the state of keys on the keyboard. It’s extremely satisfying, I think, to make a “game” where you have a circle on the screen which you can move around with WASD, and it’s unfortunate that most environments don’t really let you do that without an absolute ton of complexity.

                            I think your most important job as a programming teacher is to find something for your student to do, something they’re interested in. And that usually involves something graphical. Show them how to create an HTML canvas element and draw to it from JavaScript. Or the basics of how to use Processing. Or how to create a window and draw to it in PyGame. Don’t just show them how to print to the terminal or read lines of text from the terminal; it’s extremely hard to capture someone’s imagination that way.

                            But regarding language, I think JavaScript is the right choice. Not because of anything particular to the language, but because it’s very easy to show other people what you’ve made. A person who has just learned programming can make a small “game” in JavaScript (drawing to an HTML canvas element), and you can help them put their site somewhere publicly accessible and the student suddenly has a link to show their friends. I think that’s very appealing, and might drive your student to do more and more interesting things, because they feel like they’re making something “real” which other people can also use.
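
                            For concreteness, here’s roughly what that first canvas “game” could look like: a minimal sketch in TypeScript (plain JavaScript is nearly identical), assuming an HTML page that contains <canvas id="game" width="400" height="300"> (the id is just my example name):

                              // A circle you can move around with WASD, drawn on an HTML canvas.
                              // Assumes a <canvas id="game"> element exists on the page.
                              const canvas = document.getElementById("game") as HTMLCanvasElement;
                              const ctx = canvas.getContext("2d")!;

                              let x = 200, y = 150;            // circle position
                              const keys = new Set<string>();  // keys currently held down

                              addEventListener("keydown", e => keys.add(e.key));
                              addEventListener("keyup", e => keys.delete(e.key));

                              function frame() {
                                if (keys.has("a")) x -= 3;
                                if (keys.has("d")) x += 3;
                                if (keys.has("w")) y -= 3;
                                if (keys.has("s")) y += 3;

                                // Redraw the circle at its new position every animation frame.
                                ctx.clearRect(0, 0, canvas.width, canvas.height);
                                ctx.beginPath();
                                ctx.arc(x, y, 20, 0, 2 * Math.PI);
                                ctx.fill();

                                requestAnimationFrame(frame);
                              }
                              frame();

                            That’s the whole program; a beginner can see every moving part, and there’s an obvious next step (add a second circle, change colors, keep score).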

                            1. 3

                              I remember trying to learn C++ for Win32 when I was a kid. The amount of boilerplate and int WINAPI wWinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance, PWSTR pCmdLine, int nCmdShow);, just to get “Hello, World!” going made the threshold too high. After QBasic and Turbo Pascal, I started learning 16-bit and 32-bit Assembly instead, and it was much easier.

                              If I were to teach programming to a beginner today, I would start with Crystal, because:

                              • It has a human friendly Ruby-like syntax.
                              • It compiles to native. It’s conceptually satisfying to be able to go from a short program to an actual executable.
                              • No boilerplate is needed to get going with a real program.
                              • No #!/usr/bin/env python and # -*- coding: utf-8 -*-. No if __name__ == "__main__" (which are not strictly needed, but commonly used).
                              • No weird operator overloaded std::cout << ... << std::endl;.
                              • No (cdr (cons 'SUBJECT 'VERB)). Heads attached to tails is not an intuitive way to think about a list of three numbers.
                              • No abbreviated printf("%2$d %2$#x; %1$d %1$#x",16,17) (which is handy later on, but cryptic to a beginner).
                              • Going up one abstraction level in Crystal, to if or def, does not require keeping track of indentation, curly brackets or parentheses. The block just ends with end.

                              After that, I would set up a fun domain that pointed to a Linode or Cloud server that was running a simple Crystal web server. Then I would set up a GitLab git repository with CI, so that when the kid made changes to the simple Crystal web server and typed git push, the domain would reflect the changes or they could check the CI web page if the test failed. This way they could experiment with the code and also easily share the results with friends and family, just by giving them the domain (“kids-name-says-hello.com”).

                              They would also learn modern development practices, but with the low threshold of only having to:

                              • Edit a very short Crystal source file.
                              • Learn to type git commit -a -m “Message” and git push.
                              • Check the CI pipeline web page for any issues.

                              The next step could be to teach them very basic HTML (no CSS) and how to add images to the web page, and then how to play sounds if images are clicked on. Add silly noises and it’s bound to be a hit.

                              The next goal after that would be to draw colorful circles on that web page by using TypeScript.

                              With basic Crystal, HTML, git usage and TypeScript under their belt, they should have a solid start. Once they were ready for it, creating a small game with Crystal and SFML, or in TypeScript, would be a possibility.

                              Of course, all kids are different and are motivated by different things, but this is one of many possible ways I would envision that someone could get into programming and find it interesting and fun, with an emphasis on being able to share the results.

                              1. 1

                                No #!/usr/bin/env python and # -*- coding: utf-8 -*-. No if __name__ == "__main__".

                                I agree these things are weird but you absolutely don’t need them to start hacking around in Python.

                                1. 2

                                  I agree, but having to keep track of indentation levels is still a disadvantage of Python. At least in the very beginning, IMO.

                              2. 2

                                If I had to start a boot camp right now for software engineers, I would teach Go with a relentless emphasis on test driven development.

                                Go strikes the best balance between environment, package handling, git versioning, and CLI of any comparable low-level language, in my opinion. It comes with a robust but simple test framework out of the box, it’s static and compiled so students learn about binaries and types and lots of high-level CS concepts with boots on the ground, and it has good reasons for all of its opinions, which don’t need to be understood while learning but can be appreciated later as good solutions.

                                I wasted most of my first job in software not doing test-driven development, and regretted it once I discovered it and realized its power.

                                1. 2

                                  The thing is that BASIC is kinda high-level in its own weird way. Compare PRINT and INPUT with printf() and scanf(). I think that helps, but unlike many high-level languages its execution model is still really simple: execute each numbered statement unless you’re told to do otherwise (by GOTO etc).

                                  Similarly, one of the benefits of teaching using pure functional programming is that you can teach the substitution execution model, and I’ve found that having students manually step through code is a great way to get the lightbulbs to come on.
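
                                  For example, something like this (sketched in TypeScript just for illustration; in a Scheme course the function and the trace would look much the same):

                                    // Sum of 1..n as a pure recursive function.
                                    function sumTo(n: number): number {
                                      return n === 0 ? 0 : n + sumTo(n - 1);
                                    }

                                    // A student can evaluate sumTo(3) on paper purely by substitution:
                                    //   sumTo(3)
                                    // = 3 + sumTo(2)
                                    // = 3 + (2 + sumTo(1))
                                    // = 3 + (2 + (1 + sumTo(0)))
                                    // = 3 + (2 + (1 + 0))
                                    // = 6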

                                  1. 2

                                    I think it’s the wrong question.

                                    A beginner should be taught a language that is simple enough they can learn essentially all of the language itself (or at least a useful subset) and exactly what the constructs mean as quickly and easily as possible, so that they can then spend their time learning how to program. Ideally you’d like to be able to teach the language itself in an hour.

                                    You also want a language that you can write arbitrarily large and complex programs in without bashing your head against the wall.

                                    Scheme is good. It’s very easy to teach just a handful of constructs which can be combined together in any way and it will always work as expected. It’s easy to describe the semantics using a substitution model.

                                    Some real machine/assembly languages are simple enough to do this.

                                    • 6502 or 6800 – 6800 is simpler but more annoying and is basically dead at this point, while the 6502 community is thriving and you can buy brand new 65C02s for under $10.

                                    • AVR. Tons of real hardware available cheap.

                                    • RISC-V RV32I. Less hardware but some of it is very cheap (e.g. Longan Nano board $5 including a small LCD display). Lots of software support, lots of simulators including GUI ones.

                                    • Maybe a simple flavour of 32 bit ARM. Thumb suffers from arbitrary restrictions and has toooo many instruction formats (as does RISC-V C extension) while original ARMv2 - ARMv4 has very few instruction formats (which is good) but the presence of conditional execution in every instruction and a shifted/rotated operand in most of them complicates matters right from the start. Obviously there are a huge number of boards available.

                                    • MIPS. Popular in the past, but zero reason to use it now there’s RISC-V with the same good features but none of the bad features.

                                    The difficulty with machine/assembly languages is that while it’s easy to understand exactly what any given instruction will do, it’s difficult to understand how to put them together to achieve something useful. The 8 bit ones are worse for this than the 32 bit ones both because it’s hard to deal with 16 bit or 32 bit data and because the shortage of registers (except AVR) forces you to learn about RAM early. Managing register allocation is one of the worst points.

                                    FORTH is easy to explain, but hard to use. Postscript might be better – the model for defining named variables and functions is easier, and you can draw pretty things with it. Logo and Smalltalk fall into this space too.

                                    BASIC meets the “can teach it in an hour” standard but … ugh. You hit the complexity wall in your own program code very quickly.

                                    Unix shell might not be a bad place to start. Arithmetic is annoying, but pipes are a powerful construct and it’s all very useful stuff to know anyway.

                                    Writing custom filters to fit into pipes is a good motivating example to use your assembly language (or other) code for. On Linux you can use binfmt_misc with qemu or another emulator to transparently run ARM or MIPS or RISC-V or other machine code as if it was native code (just five or so times more slowly, which is often unnoticeable).
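
                                    For illustration, the general shape of such a filter (sketched here in TypeScript on Node rather than assembly; the pipe interface is the same either way: read lines from stdin, write transformed lines to stdout):

                                      import * as readline from "node:readline";

                                      // A toy filter: upper-case every line it receives on stdin.
                                      const rl = readline.createInterface({ input: process.stdin });
                                      rl.on("line", (line: string) => {
                                        process.stdout.write(line.toUpperCase() + "\n");
                                      });

                                    You’d use it like any other filter, e.g. ls | node filter.js (after compiling with tsc).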

                                    1. 2

                                      I don’t think you can answer that properly for everyone right now, or it would be some kind of common knowledge.

                                      Let’s assume you take first-year computer science students, who might be beginner coders. What’s the official goal here? To do some research, so the language they learn will not matter. Maybe it won’t even matter how well they code, depending on what they will do; maybe more math :P What is the actual goal? Produce an employable software developer. How high are the chances they will work with assembler versus some sort of high-level language? I’m betting on the high-level language. (Last I checked, universities were also going back and forth between imperative and functional, so maybe it’s between 3 possibilities and not 2.)

                                      I am not sure if there have been a lot of actual studies of cohorts of students doing this or that; that would be the only thing I’d base some trust on, because every coder you ask will have an opinion and maybe some selection bias because it “worked for them”. But I think in the last 20 years there are fewer and fewer people who started bottom-up (with assembler or C) and most started top-down (with JS or Python or PHP). Are they worse programmers? Could they have learned better? I don’t know.

                                      1. 2

                                        What is the actual goal? Produce an employable software developer.

                                        At least in the case of Germany, the goal of Computer Science education is not to produce software developers, because if it were then what they do would be woefully inadequate. What they try to teach you is to become a computer scientist. Of course, many computer scientists become software developers, but those with little to no experience outside university aren’t particularly better skilled than those that chose a different education path (like FH, self-study, etc).

                                        So what would be the learning goal for a computer scientist? It sort of depends on what the subject is:

                                        1. Technical computer science: Things like allocation algorithms etc. obviously make more sense in lower-level languages.
                                        2. Theoretical computer science: Things like recursion, complexity classes etc make more sense in higher-level languages.

                                        It really depends what you actually want to teach.

                                        1. 2

                                          Thus my distinction. I firmly believe that even the universities have grasped the reality that 90%? 75%? of people who finish will not end up in academia.

                                          Also I’m not even saying they’re doing a good job of either “goal”, but it is what it is.

                                          1. 1

                                            Much much higher than that if you’re talking about undergrads; generally CS grad students will know how to program coming in.

                                            1. 1

                                              Should’ve clarified I come from an ancient time where we didn’t have those things at all German unis ;) I have one of the last diplomas, a 9-12 semester degree which is like a BSc+MSc. So I never think about a Master’s when talking about “graduate students”.

                                      2. 1

                                        Why not both?

                                        Even high-level languages leak low-level details like 0.1 + 0.2 != 0.3.
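
                                        For example, in a browser console or Node:

                                          console.log(0.1 + 0.2);          // 0.30000000000000004
                                          console.log(0.1 + 0.2 === 0.3);  // false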

                                        BASIC was interesting in that it had no extensibility or libraries. If you wanted to do anything not built in, it required raw assembly. That was fun!

                                        1. 1

                                          I say both.

                                          Learning high level programming gives you power, expressiveness and swiftness. Learning low level gives you certainty, insight and coherence.

                                          Solely learning high-level programming is like learning to read without learning the alphabet. You can try to directly memorize the shape of each word, but it’s much easier if you also learn what the characters mean and how they play together.

                                          1. 1

                                            I think Scheme and Lisp are more the exception to the rule, because they make you think in terms of the lambda calculus, which can be really eye-opening, and I think one should at least have looked into that anyway.

                                            As someone who started out with higher-level languages, I’d argue a bit for lower-level languages, because on average they are simpler (but maybe not easier), and being lower level they tend to cause you to learn about computers in a more practical manner.

                                            A lot of the CS you will come across you may have learned about in a theoretical way, and I think if you also pick it up on the side by understanding a language, you might avoid wasting a lot of time at some point mentally mapping that theory to practice.

                                            But whatever you do, you cheat yourself if you only end up learning some library instead of some language. With lower-level languages you might still need to grasp more of the non-library stuff, but it depends.

                                            So whatever you do, the important thing is not to go beyond the fundamentals without grasping them – really understanding them, not just “makes sense”. The reason is that if you build on misconceptions there, it can be very hard and annoying to unwind that, and at times it might really kill your motivation to learn anything. Keep in mind it’s all Turing machines, no matter how far up you are.

                                            But minds work differently so what works for you might not work for another person.

                                            1. 1

                                              I’m of the mind a mix of both is good. The first programming class I took used Python, which made it easy to translate my thoughts into a working program. Then, pretty early on in my CS track one of my classes had several assignments where we had to interpret assembly (I forget the instruction set). That got me to “think like a computer” and understand how my programs worked more than any other class I’ve taken.

                                              1. 1

                                                In one-on-one or in very small groups, I think the best choice is to teach a language that will best help the learners in their goals and interests. In larger groups or classroom settings I lean toward teaching a language that has a concept of pointers and manual memory management, but in the end I think the language doesn’t really matter.

                                                I’ve been a teaching assistant for my university’s intro CS class for nearly 3 years now. We currently teach C++. For the majority of the students the course is their first exposure to programming. As a teaching assistant I focus on answering student questions, so every day I see what topics the students struggle with. Every year the biggest conceptual hurdles are functions, classes/objects, and pointers. But by the end of the semester, nearly every student figures the hard topics out.

                                                I started programming by teaching myself JavaScript, Python, and ActionScript, then APCS with Java in high school. It wasn’t until I took this intro C++ class a few years ago that some concepts in Java and Python suddenly made sense to me. Having an understanding of pointers and memory helped me make more sense of the higher-level languages that I had learned previously.

                                                I wish I had learned C or C++ sooner. I think some concepts in Python and Java would have been more clear to me had I known about memory management. But memory is only one part of programming. For understanding control flow and expressing intent in a computer language, I don’t think the programming language matters. The learners will always learn something useful about programming that will make learning their next language easier.

                                                1. 1

                                                  OTOH, it wasn’t until I learned some assembly that I actually understood pointers in C.

                                                2. 1

                                                  I personally think it is much more helpful to teach them “computer stuff” in general first, without any specific language. By which I mean networking and operating systems concepts (Linux-specific ones are probably the most helpful). I think it is a very missing element of a lot of new programmers’ knowledge. It does not have to be a CS-degree deep dive, but understanding the environment that computers run in goes a very long way, since after all, they are learning to program a computer.

                                                  1. 1

                                                    In all things, whatever gets to the desired result the quickest. Expressed another way, get an end-to-end process going as quickly as possible. Go for the Dopamine hit.

                                                    Here, the desired result is “teach programming” so avoiding the incantation and ritual of compilation is probably a good idea.

                                                    I learned programming first with BASIC and then more seriously with PHP before entering college, where most of the first two years were taught in Java, then MIPS assembly, before language-hopping based on coursework (C, Scheme, Smalltalk, Prolog, Python, Bash). Nowadays, I’d advocate teaching Ruby or Python for basic principles, then expanding into compiled languages by going to Crystal to stay in the bracketless world, or to Java or C# to build ALGOL-family language familiarity while teaching concepts such as static typing, build workflows, etc.

                                                    1. 1

                                                      I worry that “high-level” and “low-level” are unhelpful terms here, at best. They also mislead people into thinking that they must specialize in a certain direction. Instead, teach neophytes a comprehensive selection of many different languages. The traditional selection is a Forth, a Smalltalk, a Prolog, an ML, and an ALGOL descendant. When I did this, I used GNU Forth, GNU Smalltalk, GNU Prolog, Haskell, and Python 2; I think that today I would recommend Factor, E, miniKanren, OCaml, and Python 3.

                                                      1. 1

                                                        either is fine

                                                        1. 1

                                                          I think it simply depends on the person.

                                                          I learned assembly when I was a kid (the syntax is trivial), and I think it has helped me to understand how computers work in general. This in turn has helped me to understand higher-level languages. Some high-level concepts are still hard for me to grasp. I don’t think I would be able to learn it the other way around. I find it easier to follow the path of how things evolved (e.g. this particular set of details creates a situation where some general thing happens, and that’s why we need to change those details so the thing doesn’t happen anymore).

                                                          When going the other way around, from high level to low level, it is often the case that we need to throw away our assumptions completely and start over, because some abstraction has been created for the sole purpose of creating an illusion of something. I would find that discouraging, because when learning something about an abstraction, I would never be sure I wouldn’t have to throw it all away.

                                                          But I know that some people want to know the purpose first, and the mechanism later. So I guess the OP’s question is pretty subjective, and both styles of teaching should be available.

                                                          1. 1

                                                            I think low level. If you are just the slightest bit serious about becoming a programmer, there is no possible way to skip the fundamentals, which are very much embodied in lower-level languages. There’s no point in starting from the end, only to be repeatedly required to dig deeper in an unstructured and random manner.

                                                            Finding an interesting project is important, and if higher-level languages are better at this, then maybe learn them in parallel. Or go through the fundamentals as quickly as possible so you can jump on the engaging project.