1. 7

    Down under the level of assembly language is the level of hardware design diagrams. And under those are transistors. And down under that is the level of magnetic currents and fluctuations. And if you go down much further we don’t have language for the levels and the problems any more — my understanding bottoms out even before “quantum physics” — but even lower levels still exist.

    I like the article but this bugged me. It’s largely irrelevant what’s below that which can be achieved with assembly language, because that isn’t programming. You can’t change the hardware design with code (FPGAs aside…) so it’s not relevant to the discussion.

    1. 8

      It really depends on the context. If you’re a programmer for an embedded system, selecting an appropriate MCU for the task is arguably a fundamental responsibility of the job depending on how your company is organized.

      If you’re locked into the x86 world, you can report issues to AMD and Intel and it’s frequently possible for a fix to be delivered in a microcode patch.

      Edit: Also, to the fundamental argument of “hardware isn’t programming so it’s not important,” you should understand, at a minimum, one layer below the lowest layer that you work on. Otherwise, how will you know if your layer or the layer below it is misbehaving when you’re debugging a problem?

      1. 4

        I concur that understanding layer N-1 is a good idea. That, and having a deep understanding of what can go on beneath the application code. It’s very helpful to understand what sorts of “magic” can happen, and how to de-mystify them.

      2. 4

        I like the article but this bugged me. It’s largely irrelevant what’s below that which can be achieved with assembly language, because that isn’t programming. You can’t change the hardware design with code (FPGAs aside…) so it’s not relevant to the discussion.

        That can still be relevant: how CPU caches behave matters a lot for performance, yet you can’t change how they behave. And even beyond that, Rowhammer showed us that very low-level details like how closely chips are packed can be influenced all the way up from high-level languages, even JavaScript. Even something as low-level as cosmic rays striking your RAM and flipping bits can crop up from time to time (and you could error-detect/correct it in software), especially if you’re writing code to run on spacecraft.
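        To make the error-detect/correct idea concrete, here’s a toy Hamming(7,4) sketch in Python. It’s purely illustrative (real ECC uses wider codes and usually lives in hardware), but it shows how a single flipped bit, cosmic-ray or otherwise, can be located and repaired in software:

```python
# Toy Hamming(7,4): 4 data bits -> 7-bit codeword; any single bit flip
# can be located and corrected. Illustrative only, not flight software.

def encode(d):
    """d is a list of 4 data bits; returns the 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # parity over positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # parity over positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def correct(c):
    """c is a received 7-bit codeword; fixes at most one flipped bit
    and returns the 4 recovered data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3    # 0 = clean; else 1-based error position
    if syndrome:
        c[syndrome - 1] ^= 1           # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]
```

        Spacecraft-grade ECC (e.g. SECDED over whole memory words, plus periodic scrubbing) works on the same principle, just with more parity bits.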

        1. 2

          Sure, but they’re not really relevant to the points that are “You should learn C”, “You should learn asm”, “You should write an OS” etc. There’s no “You should build an x86 machine from scratch”, because it’s not about programming, that’s electronic engineering. I thought about cache optimisation as part of that which can be achieved with assembly, which is relevant; hardware design diagrams are, for 99.99% of cases, not.

          1. 4

            Our main technical university teaches all those things. In my country you can’t become a (formally educated) web dev without learning about electronic engineering (it’s in the name even).

            1. 1

              It sounds like your country is doing it wrong.

              1. 1

                Sounds right to me

                1. 1

                  Well the school isn’t for web developers, it’s for engineers and computer scientists. However it just so happens that a lot of them become web developers because that’s relatively lucrative and a popular avenue at the moment.

                  The school does some things wrong but overall it’s a great, intensive program.

                2. 1

                  What university is that?

                  1. 2

                    https://www.fer.unizg.hr/en

                    I should mention I was wrong, there are some more vocational schools here where you indeed don’t learn about most of that stuff, they just teach you to code in something practical.

            2. 3

              Realistically, you can’t change the kernel or the web browser either.

              In a sense, it is useful to define software as something you can change, reclassifying the kernel and the web browser as part of hardware. Depending on your situation, Django or Rails also can be the hardware. While it is sometimes useful to know about the hardware, I agree it is mostly irrelevant for people working on software.

              1. 7

                You can realistically change the kernel and/or the browser, or more realistically write patches and improvements for the existing code. That’s not a realistic option with your processor.

                1. 0

                  No, you can’t. Maybe you are submitting your application to App Store, maybe your web app needs to run on iOS Safari. I guess you could apply to Apple, but then, you could also apply to Intel.

                  1. 7

                    Have you never compiled your own kernel or run Greasemonkey?

                    You very much can change both the browser and kernel. A locked-down platform means the platform is wrong, not the person attempting to modify it.

                    1. 1

                      If you feel like it, read iOS instead of the kernel and Safari instead of the web browser. My point still stands.

                      iOS and Safari are as unmodifiable as an Intel CPU for me. Whether one is hardware and the other software matters little.

                      1. 3

                        Using a CPU you can’t modify is unavoidable for anyone using a computer. Avoiding using iOS or Safari is very easy in comparison.

                        1. 10

                          You two are talking past each other: one of you is speaking corporate engineer at work and the other is speaking FOSS zealot. Both perspectives are valid, but neither is complete.

                    2. 3

                      All kinds of apps and updates get into the App Store. It’s a multi-billion dollar business generated by 3rd parties. Intel’s silicon, by contrast, is made entirely by them, with no 3rd-party patches. Your comparison might work for microcode. They seem to make those changes on their own. Sparingly, too.

                      One can definitely get a valuable update into a web browser or onto Apple phone before making a change to pre-deployment or post-deployment silicon. Software shouldn’t be thought of like hardware unless it’s software nobody can change or understand. Closest comparisons are mission-critical, legacy systems in enterprises and governments.

                2. 2

                  But there are machines, like the PERQ, where you can change the machine code.

                1. 40

                  A note to our younger readers, please ignore lists like these. People who’ve been around the block will know to just ignore this nonsense, but younger people may take it seriously.

                  The only reasonably accurate statements about life, in this context, are

                  1. The world is full of people giving free advice. The advice may even have worked for them at some point, and not just by chance.
                  2. You have to figure out your strengths, weaknesses and favorites yourself. This is a lifelong process. They change all the time. You should keep looking.
                  3. You have to keep learning to ensure you remain relevant.
                  4. You will always find someone who will help you, but you have to ask.
                  1. 17

                    I feel like we read different lists, because this one didn’t read as any prescriptive advice or something that people shouldn’t take seriously. It was basically telling students they don’t know as much as they think they do and they can learn from people in the industry. I would say that is 10000% good advice.

                    1. 14

                      The problem with this format is that it says a lot of what’s wrong but doesn’t provide any way to learn what’s “right”, or even a clue about how to get that experience. I agree it reads like someone who’s angry, whether the author intended it that way or not.

                      1. 2

                        this format is that it says a lot of what’s wrong but doesn’t provide any way to learn what’s “right”

                        That’s just the format of this sort of post, exemplified by the original(?) “falsehoods programmers believe about time”.

                        1. 14

                          The original was “falsehoods programmers believe about names”, which gets a pass for being the first. The format is kinda junk for actually learning about the problem domain, so if you want to make a good Falsehoods you really should explain why.

                          1. 4

                            imo in addition to the lack of explanation, “falsehoods X believe about Y”-style articles are toxic because they’re a convenient way to snipe at others. With this post, the author can hide behind “being helpful” while ascribing a bunch of naive/incorrect/otherwise bad beliefs to a huge class of people.

                            1. 4

                              I think they should at the very least include pointers for learning why it’s false.

                              Like in my ultimate snowclone, a list of falsehoods about falsehoods. ;)

                            2. 7

                              Sure I understand that, but I’m saying it’s a bad format for conveying information.

                              I think the original may have been pithy and effective. But it got mimicked, and it’s become ineffective.

                              If the goal is to appear smug and create clickbait, then it’s effective :)

                          2. 2

                            My opinions

                            1. A list of things you shouldn’t believe is advice
                            2. This list is full of absolutes that shouldn’t be taken seriously.
                          3. 5

                            People who’ve been around the block will know to just ignore this nonsense, but younger people may take it seriously.

                            Unfortunately this list is pretty accurate when it comes to what undergrads (in my experience) are being taught along with other falsehoods like “MD5 is a good password hashing algorithm” if your classes even get that far.

                          1. 3

                              “lots of wimpy CPUs is just wimpy” is definitely context dependent. I did some work in HPC (a lot of C, C++, and Fortran programs may be considered “legacy” by some), and much of it involved the Xeon Phi CPUs. These are just “wimpy” 1.4 GHz cores, but with 64-72 of them and loads of memory on each die; if your application needs lots of communication between parallel threads, they aren’t particularly wimpy at all.
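                              As a back-of-the-envelope sketch (every number below is made up for illustration, not a benchmark), Amdahl’s law shows when a pile of wimpy cores beats a few fast ones:

```python
# Amdahl's law: many slow cores win only when the workload is
# overwhelmingly parallel. Clocks and core counts are illustrative.

def speedup(parallel_fraction, cores):
    """Speedup over a single core of the same type."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

def throughput(clock_ghz, cores, parallel_fraction):
    """Crude proxy: single-core clock scaled by the Amdahl speedup."""
    return clock_ghz * speedup(parallel_fraction, cores)

many_wimpy = throughput(1.4, 64, 0.99)   # Phi-like: 64 slow cores
few_fast   = throughput(3.5, 8, 0.99)    # desktop-like: 8 fast cores
```

                              At a 99% parallel fraction the 64 slow cores come out ahead; drop the parallel fraction to 50% and the few fast cores win comfortably. The crossover depends entirely on the workload, which is the whole point.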

                            1. 2

                                BlueGene and Cell were both composed of “wimpy” cores and simultaneously very successful in certain HPC applications.

                              I think what architects learned from that, though, is that faster single-threaded performance is easier, and therefore more productive, for the vast majority of programmers.

                              1. 3

                                yes, most desktop/mobile/web software is “wait for someone to press something then do the work for them”, until they get lots of customers when it’s “do this independent work for as many people as possible”. Those benefit from non-wimpy cores and don’t need great synchronisation.

                              2. 1

                                  Tilera has long been in that space, too. Graphics cards used for general-purpose computing could probably count. Adapteva shipped some, too. Moore et al. have their Forth chip for doing it low-energy.

                                It does depend on context, though.

                              1. 26

                                Something clearly got this author’s goat; this rebuttal feels less like a reasoned response and more like someone yelling “NO U” into Wordpress.

                                Out of order execution is used to hide the latency from talking to other components of the system that aren’t the current CPU, not “to make C run faster”.

                                Also, attacking academics as people living in ivory towers is an obvious ad hominem. It doesn’t serve any purpose in this article and, if anything, weakens it. Tremendous amounts of practical CS come from academia and professional researchers. That doesn’t mean it should be thrown out.

                                1. 10

                                  So, in context, the bit you quote is:

                                  The author criticizes things like “out-of-order” execution which has led to the Spectre sidechannel vulnerabilities. Out-of-order execution is necessary to make C run faster.

                                  The author was completely correct here, and substituting in JS/C++/D/Rust/Fortran/Ada would’ve still resulted in a correct statement.

                                  The academic software preference (assuming that such a thing exists) is clearly for parallelism, for “dumb” chips (because computer science and PLT are cooler than computer/electrical engineering, one supposes), for “smart” compilers and PL tricks, and against “dumb” languages like C. That appears to be the assertion the author here would make, and I don’t think it’s particularly wrong.

                                  Here’s the thing, though: none of that has been borne out in mainstream usage. In fact, the big failure the author mentioned here (the Sparc Tx line) was not alone! The other big offender of this you may have heard of is the Itanic, from the folks at Intel. A similar example of the philosophy not really getting traction is the (very neat and clever) Parallax Propeller line. Or the relative failure of the Adapteva Parallela boards and their Epiphany processors.

                                  For completeness’ sake, the only chips with massive core counts and simple execution models are GPUs, and those are only really showing their talent in number crunching and hashing–and even then, for the last decade, somehow limping along with C variants!

                                  1. 2

                                    One problem with the original article was that it located the requirement for ILP in the imagined defects of the C language. That’s just false.

                                    Weird how nobody seems to remember the Terra.

                                    1. 3

                                      In order to remember you would have to have learned about it first. My experience is that no one who isn’t studying computer architecture or compilers in graduate school will be exposed to more exotic architectures. For most technology professionals, working on anything other than x86 is way out of the mainstream. We can thank the iPhone for at least making “normal” software people aware of ARM.

                                      1. 4

                                        I am so old that I remember reading about the Tomasulo algorithm in Computer Architecture class and wondering why anyone would need that on a modern computer with a fast cache - like a VAX.

                                      2. 1

                                        For those of us who don’t, what’s Terra?

                                        1. 2

                                          Of course, I spelled it wrong.

                                          https://en.wikipedia.org/wiki/Cray_MTA

                                    2. 9

                                      The purpose of out of order execution is to increase instruction-level parallelism (ILP). And while it’s frequently the case that covering the latency of off chip access is one way out of order execution helps, the other (more common) reason is that non-dependent instructions that use independent ALUs can issue immediately and retire in whatever order instead of stalling the whole pipeline to maintain instruction ordering. When you mix this with good branch prediction and complex fetch and issue logic, then you get, in effect, unrolled, parallelized loops with vanilla C code.

                                      Whether it’s fair to say the reasoning was “to make C run faster” is certainly debatable, but the first mainstream out of order processor was the Pentium Pro (1996). Back then, the vast majority of software was written in C, and Intel was hellbent on making each generation of Pentium run single-threaded code faster until they hit the inevitable power wall at the end of the NetBurst life. We only saw the proliferation of highly parallel programming languages and libraries in the mainstream consciousness after that, when multicores became the norm to keep the marketing materials full of speed gainz despite the roadblock on clockspeed and, relatedly, single-threaded performance.
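                                      A toy sketch of the ILP point (hypothetical, nowhere near a real microarchitecture model): a greedy scheduler that, each cycle, issues any instruction whose inputs are ready, up to a fixed number of ALUs:

```python
# Independent instructions overlap; a dependency chain serializes,
# no matter how many ALUs exist. Single-cycle latency assumed.

def cycles(deps, alus):
    """deps[i] lists the instructions that must complete before i issues.
    Returns the number of cycles to finish them all."""
    done = [None] * len(deps)          # cycle in which each instruction issued
    cycle = 0
    while None in done:
        ready = [i for i in range(len(deps))
                 if done[i] is None and all(done[d] is not None for d in deps[i])]
        for i in ready[:alus]:         # issue at most `alus` per cycle
            done[i] = cycle
        cycle += 1
    return cycle

independent = [[], [], [], []]         # four unrelated adds
chain       = [[], [0], [1], [2]]      # each add consumes the previous result
```

                                      Four independent adds retire in one cycle on four ALUs, while the four-deep chain still takes four cycles however many ALUs you add; that gap is the parallelism an out-of-order core mines out of plain sequential code.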

                                      1. 1

                                        the first mainstream out of order processor was the Pentium Pro (1996).

                                        Nope.

                                    1. 3

                                      Christ, this discussion is way beyond me. I’d still really like to understand it though. Anyone available to compsplain?

                                      1. 4

                                        Back in the day, there was a great series of articles by Jon Stokes on Ars Technica that covered a lot of microarchitectural concepts like this. You can find them with google or buy them compiled into a book: https://nostarch.com/insidemachine.htm

                                        1. 4

                                          I managed to find a collection of his explanatory articles, of which the two-part series “Understanding the Microprocessor” seemed to best match what you described.

                                      1. 9

                                        This post follows the rule of headlines perfectly: when a headline asks a question, the answer is always no.

                                        I am not completely convinced that the article’s stated reason is “the reason”, or that the solution is “the solution”. But it’s a good start. And it’s one of my specific pet peeves, so I’m going to share. My general pet peeve is when people of all stripes behave as though they’re unique snowflakes and the rules that govern all other work do not and cannot apply to them. My specific pet peeve is when programmers play this “we do black magic and therefore need to be treated specially” game.

                                        It drives me nutty.

                                        The solution presented here is a good start. We should all endeavour to implement this.

                                        1. 2

                                          This post follows the rule of headlines perfectly: when a headline asks a question, the answer is always no.

                                          In case you’re wondering, it’s called Betteridge’s law of headlines.

                                          1. 1

                                            You might consider why this is your pet peeve and it “drives [you] nutty.” Does it bother you when other people ask for understanding and accommodation in how they work because you aren’t getting your needs met at work?

                                            Also, I’m downvoting this as trolling and will do the same for any other comment I see on lobste.rs that uses “special/unique snowflake” because we’re all intelligent and articulate enough to communicate without obvious epithets, and the use seems to always precede or be interlaced in some insensitive/dismissive rant.

                                            1. 2

                                              Interesting. I hadn’t considered “snowflake” to be an epithet necessarily, more tongue in cheek. Thanks for pointing this perspective out; I will keep it in mind.

                                              1. 1

                                                I should also answer your question, because I think it’s a good one.

                                                It drives me nutty because my (especially recent) experience is that the people who are demanding Complete Silence because they are doing Very Serious Work are also doing at least one (but usually both) of the following: 1) over-engineering and making complicated messes of fairly simple things, and/or 2) implying (or worse) that those who do not demand Complete Silence are not doing Very Serious Work.

                                                I realize that not everyone will share this experience, but we’re all shaped by our experiences to some extent, and this is mine.

                                            1. 4

                                               I’m skeptical, but I think they can pull it off.

                                               In the end, they only need to reach half of Intel’s performance, as benchmarks suggest that macOS’s performance is roughly half of Linux’s when running on the same hardware.

                                              With their own hardware, they might be able to get closer to the raw performance offered by the CPU.

                                              1. 7

                                                they only need to reach half of Intel’s performance, as benchmarks suggest that macOS’ performance is roughly half of Linux’ when running on the same hardware

                                                I’m confused. Doesn’t that mean they need to reach double Intel’s performance?

                                                1. 11

                                                   It was probably worded quite poorly; my calculation went like this:

                                                  • Raw Intel performance = 100
                                                  • macOS Intel performance ~= 50
                                                  • Raw Apple CPU performance = 50
                                                   • macOS Apple CPU performance ~= 50

                                                  So if they build chips that are half as fast as “raw” Intel, but are able to better optimize their software for their own chips, they can get way closer to the raw performance of their hardware than they manage to do on Intel.
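                                                   The same arithmetic, spelled out (every number here is illustrative, straight from the bullets above):

```python
# Hypothetical numbers from the comment above: a chip with half the raw
# performance reaches parity if the OS stops wasting half of it.

raw_intel      = 100
macos_on_intel = raw_intel * 0.5   # assumed macOS overhead on Intel
raw_apple      = 50                # chip half as fast as Intel's
macos_on_apple = raw_apple * 1.0   # assumed near-perfect optimization

assert macos_on_apple == macos_on_intel == 50
```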

                                                2. 7

                                                   Why skeptical? They’ve done it twice before (68000 -> PowerPC and PowerPC -> Intel x86).

                                                  1. 4

                                                    And the PPC → x86 transition was within the past fifteen years and well after they had recovered from their slump of the ‘90s, and didn’t seem to hurt them. They’re one of the few companies in existence with recent experience transitioning microarchitectures, and they’re well-positioned to do it with minimal hiccups.

                                                    That said, I’m somewhat skeptical, too; it’s a huge undertaking even if everything goes as smoothly as it did with the x86 transition, which is very far from a guarantee. This transition will be away from the dominant architecture in its niche, which will introduce additional friction which was not present for their last transition.

                                                    1. 2

                                                      They also did ARM32->ARM64 on iOS.

                                                      1. 3

                                                        That’s not much of a transition. They did i386 -> amd64 too then.

                                                        (fun fact, I also did that, on the scale of one single Mac - swapped a Core Duo to a Core 2 Duo in a ’06 mini :D)

                                                        1. 1

                                                          My understanding is that they’re removing some of the 32-bit instructions on ARM. Any clue if that’s correct?

                                                          1. 1

                                                            AArch64 processors implement AArch32 too for backwards compatibility, just like it works on amd64.

                                                            1. 1

                                                              As of iOS 11, 32-bit apps won’t load. So if Apple devices that come with iOS 11 still have CPUs that implement AArch32, I’d guess it’s only because it was easier to leave it in than pull it out.

                                                              1. 1

                                                                Oh, sure – of course they can remove it, maybe even on the chip level (since they make fully custom ones now), or maybe not (macOS also doesn’t load 32-bit apps, right?). The point is that this transition used backwards compatible CPUs, so it’s not really comparable to 68k to PPC to x86.

                                                                1. 1

                                                                  I of course agree that this most recent transition isn’t comparable with the others. To answer your question: the version of macOS they just released a few days ago (10.13.4) is the first to come with a boot flag that lets you disable loading of 32-bit applications to, as they put it, “prepare for a future release of macOS in which 32-bit software will no longer run without compromise.”

                                                    2. 3

                                                      I didn’t know this. Do you know which benchmarks show macOS at half of Linux performance?

                                                      1. 3

                                                        Have a look at the benchmarks Phoronix has done. Some of them are older, but I think they show the general trend.

                                                        This of course doesn’t take GPU performance into account. I could imagine that they take an additional hit there, as companies (that don’t build on AAA game engines) would rather do …

                                                        Application → Vulkan API → MoltenVK → Metal

                                                        … than write a Metal-specific backend.

                                                        1. 1

                                                          I guess you’re talking about these? https://www.phoronix.com/scan.php?page=article&item=macos-1013-linux

                                                          Aside from OpenGL and a handful of other outliers for each platform, they seem quite comparable, with each being a bit faster at some things and a bit slower at others. Reading your comments I’d assumed they were showing Linux as being much faster in most areas, usually ending up about twice as fast.

                                                      2. 3

                                                        The things they’re slow at don’t seem to be particularly CPU architecture specific. But the poor performance of their software doesn’t seem to hurt their market share.

                                                      1. 17

                                                        Honest question, if not Stack Overflow, where to get help from? Sometimes I don’t have to post anything, the existing questions already solve my problem. I can’t think of any other community where I can get help from. Reddit works sometimes, but not always. Related IRCs work, but get lost in other noise. So, where?

                                                        1. 15

                                                          That email is from the openbsd-misc mailing list. Lots of open source projects have their own lists where you can get help. C++ people still use usenet (comp.lang.c++).

                                                          1. 36

                                                            From my experience (not talking about OpenBSD), a lot of those mailing lists don’t really serve a user/developer support role, and are often far more toxic than StackOverflow.

                                                            And in a lot of cases, emails or posts simply go unanswered in dedicated project support channels.

                                                            You can say what you want about StackOverflow, and a lot of the problems mentioned here and in other discussions are real and serious problems, but they still have a huge body of useful information for a lot of problems people encounter.

                                                            1. 3

                                                              Usually a busy-ish open source project will have several mailing list channels. One is typically devs only, chatting about patches and the like; one is for announcements only; and one is for users to chit-chat.

                                                              Make sure you choose the right one. “How do you do this?” type questions always go to the user one.

                                                              If it’s a “I think there is a bug in…” make sure you have a good repeatable shortest possible test case in hand and then try the devs list.

                                                              Even better than a nice neat repeatable test case, is a nice neat repeatable test case and a (small) patch off the mainline that fixes it.

                                                              If you say something like, “Your code is crap. It doesn’t work in my company’s million lines of proprietary spaghetti which you can’t look at”… Yup. Count yourself lucky if your question goes unanswered. Sometimes the toxins are there to kill stupid.

                                                              Always show some signs that you have, indeed, Read The Fine Manual, such as it exists, and maybe the unit tests for the functionality you’re using.

                                                              I have pretty much a 99% success rate in getting excellent answers from every open source mailing list I have interacted with.

                                                              Be prepared to read code, some of the best answers come in the form, “Ah, I think that’s handled somewhere such and such a file… Have a look at the comments and the test cases for function …”

                                                              Be prepared for the answer to be, yup, it’s fixed in version x.y

                                                              1. 1

                                                                That all sounds like a lot of mental load just to get a quick answer to something that’s blocking my work.

                                                                1. 1

                                                                  He who asks low (or no) effort questions should expect low (or no) effort answers.

                                                                  However, friction and entropy exist in everything so that should be…

                                                                  He who asks low (or no) effort questions should expect very low effort answers if they’re lucky, snarks if they aren’t.

                                                          2. 4

Stack Overflow is often referenced more than official documentation, and it’s way, way better than what we previously had: Experts Exchange (which had the answers at the bottom, but was set up so it looked like you had to pay to see them).

                                                            They might have their issues, but I still have found the Stack Exchange sites really useful. Until I read this post, I wasn’t even aware of the massive deletion problem. I don’t think any of my posts have been deleted, but there’s no way to know for sure.

                                                            1. 29

As someone who was in the position of the child not so long ago - please don’t do this*. Giving children who have no explicit interest in these things gifts meant to spark that interest will fail for both sides 90% of the time. Sure, most people here would have loved (or were fortunate enough) to be given technological gifts as children, but that’s easy to say now. If, on the other hand, you had been given something you had absolutely no interest in, or no capacity to learn at that age - say a dictionary of ancient Greek, an introduction to advanced arctic geology, the collected works of Hegel, or socks - and you knew on some level that the person giving you the present was hoping for you to be as happy about it as they think they would have been - well, that kind of “pressure” (for lack of a better word) is not really a nice present, even if unintentional. And from the side of the person who gave the gift, unless you enjoy disappointment, you won’t feel much better either.

*: I’d like to clarify that I’m not trying to universally condemn any gift intended to boost a child’s interest in some subject - just be sure that he or she has the potential to understand it, and know him or her well enough to be sure that they are the kind of person to be interested in it. Not every present is appropriate for every child. Thinking about it twice will keep you from becoming the person who tries to force his interests on children, and your present from just disappearing in a cupboard indefinitely.

                                                              1. 7

                                                                +1 to this, as the parent of a 6-year-old.

                                                                We have Robot Turtles (as mentioned in another thread) and we’ve played it quite a few times, and she simply doesn’t find it compelling. This is not intended as a knock on the game, I’m sure it’s great for a lot of kids, but different kids like different things. I bought the card game SET, and she gets it and will grudgingly play it with me but insists that it’s boring and that she’d rather do something else. I bought her “No Stress Chess” and she learned how the game works and how the pieces move but decided she would rather act out little dramas with the king and queen and such.

                                                                I’ll keep trying more things, but you can’t force kids into any of this stuff. (Or at least you shouldn’t, is my belief.)

                                                                I would love it if she wanted to learn coding, but this year for Christmas she really wants a Barbie that turns into a mermaid and also into a fairy, so that’s what she’s getting. Maybe next year.

                                                                1. 2

                                                                  Edit: Previous comment didn’t really move the discussion forward, so here’s a new one.

                                                                  Can you make your comment more constructive? Answers to some of the following would really help.

                                                                  • Was there a certain approach, attitude, or expectation that put you off?
                                                                  • How was “gift to try and initiate interest” conveyed? If it had been conveyed differently, like “toy that might resonate with deeper interest”, would you have had a better experience? What would each of these approaches look like to you?
                                                                  • Is there a certain kind of gift/kit/etc that was too complicated/specialized/specifically about learning?
                                                                  • Was there an interest of yours that had been mistaken for an interest in programming?
                                                                  • Were there redeeming parts of your experience that could be illustrative for a better approach?
                                                                  • Any specific input on what “he or she has a potential to understand it” means as it relates to your experience, or that would surprise a casual observer?

Surely there are ways to go about giving gifts that involve learning (not necessarily as a primary focus) that aren’t “pressure”.

                                                                  I feel like you have an interesting perspective to share, but it’s all hidden behind a dismissive post. Even if your experience was an unmitigated disaster, there is something you could offer beyond “don’t even think about doing this”.

                                                                  1. 4

                                                                    Was there a certain approach, attitude, or expectation that put you off?

Not really “put me off” - but I’d say there was often an expectation that I already understood more than I did. In my case it was an electronics kit, but I didn’t know (and nobody told me, or at least I didn’t understand it if anyone did) that electricity needs to flow in a circuit - and why should it? There’s only one wire from the plug hole to a lamp, why would this be any different?

                                                                    How was “gift to try and initiate interest” conveyed?

To give an opposite example from my previous one - my grandfather, who was a professor of physics, once bought me some game (I can’t remember what it actually was, I was 5 y/o) that had to do with motors, moments, mechanics, etc. And he wanted to explain it all to me, but - not that I didn’t like it per se - I just wasn’t interested in the physical stuff. There were little cut-out mammoths I found great delight in, and I remember my grandfather being disappointed, to put it mildly, that I didn’t want to play with the actual things…

                                                                    If it had been conveyed differently, like “toy that might resonate with deeper interest”, would you have had a better experience?

… so it’s not really a problem of intention, or at least that’s not what I meant (I’m sorry if I was misunderstood). The issue was just that back then I had, e.g., more interest in ancient animals than in the laws of mechanics. So maybe it would have been different if I had had an interest in physics, but for that I would have needed a basic understanding of the subject - without that, if all these things stay “mystical”, “magical” ideas beyond comprehension, I believe not much can be done to help the child develop an interest. So again, make sure the child is curious and capable (age- and education-wise) of engaging with the subject you want to introduce them to.

                                                                    Is there a certain kind of gift/kit/etc that was too complicated/specialized/specifically about learning?

                                                                    I’ve given examples already from my childhood, but for the most part I’d recommend not to give toolkits as first gifts. If one doesn’t have any idea what to do with it, or how to use it, it will either be forgotten or broken before one actually learns to use it properly.

                                                                    Was there an interest of yours that had been mistaken for an interest in programming?

Well, in my case it wasn’t programming; I had to teach myself all of that. Interestingly enough, I did always have a greater interest in things related to computers, but I guess my family were less interested in it, so they didn’t feel like supporting it. So the tip here would be to transcend one’s own interests and actually try to support something the child actually likes.

                                                                    Were there redeeming parts of your experience that could be illustrative for a better approach?

None that I can think of spontaneously; I might edit the post later on if something comes to mind.

                                                                    Any specific input on what “he or she has a potential to understand it” means as it relates to your experience, or that would surprise a casual observer?

“If the toy says ages 9-16, don’t give it to a 5-year-old” would be a good guideline. I’ve already implied it, but I’ll say it again: make sure the child’s first exposure to the subject isn’t this toy - 95% of the time this will go wrong, especially with younger children.

Surely there are ways to go about giving gifts that involve learning (not necessarily as a primary focus) that aren’t “pressure”.

Of course, the pressure I was talking about doesn’t (or at least in my case didn’t) come from the presents themselves, but from the expectation of the people who gave them to me that I would flourish or immediately develop a profound interest in the subject. I guess you could see this more as an attitude problem on the part of the gift-giver, but (depending on the child) he or she can feel that too. That’s the uncomfortable part I really want children to be spared from.

                                                                    I feel like you have an interesting perspective to share, but it’s all hidden behind a dismissive post. Even if your experience was an unmitigated disaster, there is something you could offer beyond “don’t even think about doing this”.

I apologize if my first comment was a bit too dismissive; I hoped my last paragraph would give the whole thing a positive turn, which is why I added the footnote after the first sentence. But I hope I could clarify a few things now, and help you and anyone reading this come to an informed choice when thinking about giving gifts with good intentions. Again, if it’s the right gift for the right person, it’s fantastic - but it’s not that easy to make sure that that is the case!

                                                                    1. 4

if you are lucky enough to be able to work with the child and the gift, or you know their parents will be supportive, then you might create an interest; otherwise @zge’s comment is unfortunately the likely outcome - unless you know that they already have an interest in that area.

however, if the gift is fun and doable by the child then it can be a real success - although the age on the tin is not helpful; my youngest is 7 years younger than her older siblings and she has always played with age-inappropriate toys :~)

                                                                      my 2 pence worth from the perspective of a being a Dad :~)

                                                                      1. 4

                                                                        if you are lucky enough to be able to work with the child and the gift

                                                                        I wish I could edit the OP as this is exactly the case, and there has been expressed interest.

                                                                        1. 4

                                                                          If it’s practical to do so, why not take the child somewhere where sciency toys are on display and see what he/she gravitates towards? I think if the learning is initiated by curiosity in the child then it’s more likely to have lasting effects.

                                                                          I started taking guitar lessons when I was five years old because my granddad saw me staring at a guitar and he asked me if I wanted to learn (and I did). I don’t know how I would have reacted if I was just given an instrument as a gift without anyone asking beforehand what I thought about it.

                                                                  1. 1

                                                                    I generally agree with the observations except this one:

                                                                    Second, IT engineers by nature tend to be optimists, as reflected in the common acronym SMOP: “simple matter of programming.”

Maybe I just hang out with the cynical crowd, but I’ve never heard a professional programmer use that phrase in a non-snarky way. How could we get such a character as Gilfoyle if IT people are generally optimists?

                                                                    1. 2

                                                                      I agree, I’ve only seen SMOP used sarcastically—but there’s still an optimism under it. It might be “we can do that in six months, not two weeks,” but there is always the concept of of course we can do it… underlying it.

                                                                    1. 3

                                                                      Really great article. I am reminded of an older post “Taco Bell Programming”[1] that trolls a bit but does a good job of getting the Unix philosophy point across…and can easily lead to the dangers elaborated in this post.

                                                                      I have indeed committed the sin of parsing stdout and stderr in ways I should not have from programs that were not meant for such things. I have been bitten. But I wasn’t crazy enough to put it in production!

                                                                      http://widgetsandshit.com/teddziuba/2010/10/taco-bell-programming.html

                                                                      1. 6

                                                                        I don’t see it that way. The author never makes a convincing case that Qt Creator hanging has anything to do with passing text between Qt Creator and gdb. You can pass error codes in any number of ways and you can set network timeouts even on command line tools. It seems like there was just an unfortunate bug in QtCreator that set him off on a rant.

                                                                        Also this:

                                                                        You see, on UNIX there’s GDB. That’s the debugger. That’s the only debugger. It’s very old and has had a lot of work put into it, and as a result it usually works pretty well, at least in terms of functionality. But on every other metric you measure software by, it kinda sucks.

                                                                        So by the metrics of completely being open source, costing 0 dollars, and supporting tons of targets, gdb “kinda sucks”?

                                                                        Despite every computer made in the past 40 years having a graphical display, GDB lives in a parallel universe where the framebuffer was never invented and we all still use teletype printers.

                                                                        gdb lives in a parallel universe where having a limited debug server that can run on lots of targets is really useful.

                                                                        1. 2

I wasn’t arguing that point - gdb is a fine tool. My argument was a bit more general. There are many tools out there that don’t output in a way meant for automated consumption, and the output is subject to change at any moment because a contract was never declared. Unless formalized and specified outright and beforehand, text messages and log data are not an API and should not be treated as such. Those who write software around command line tools that don’t have output specifications available should be wary of using the tools that way. The strange thing is that even though these tools are rewritten as open source in Linux and BSD, many go untouched and are treated as black boxes. Why not crack open ssh and hook into lib calls instead? And if the code is not amenable to that, why not fork it and make it so?
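The fragility is easy to demonstrate. Here’s a minimal Python sketch - the tool and both of its error strings are invented for illustration - showing how a scraper written against one release’s wording silently stops matching when the wording changes, precisely because the text was never a contract:

```python
import re

# Two releases of a hypothetical CLI tool whose stderr wording was never
# a documented contract. Both strings are invented for illustration.
OLD_OUTPUT = "error: connection to host db1 failed (timeout)"
NEW_OUTPUT = "db1: connection failure - timed out"

def failed_host(output):
    """Naive scraper written against the old wording."""
    m = re.match(r"error: connection to host (\S+) failed", output)
    return m.group(1) if m else None

print(failed_host(OLD_OUTPUT))  # the scrape works today: db1
print(failed_host(NEW_OUTPUT))  # ...and silently returns None after an upgrade
```

A tool that offers a machine-readable mode (say, a JSON output flag) turns this from scraping into an actual API call.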

                                                                          1. 3

As mentioned by myfreeweb, GDB does have a full terminal interface (although enabling it is an obscure command I can never remember). Also, GDB has a documented protocol to communicate with it—frontends don’t have to parse the human-oriented text output.
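For the curious: GDB’s machine interface (GDB/MI, enabled with `gdb --interpreter=mi`) emits structured result records rather than console text. A rough Python sketch of what a frontend consumes—the record below is a hand-written, trimmed example, not captured output:

```python
import re

# A trimmed GDB/MI-style result record of the kind emitted after a
# breakpoint is set (hand-written example, not a real capture).
record = '^done,bkpt={number="1",type="breakpoint",func="main",line="42"}'

def mi_fields(rec):
    """Pull the flat key="value" pairs out of an MI result record."""
    return dict(re.findall(r'(\w+)="([^"]*)"', rec))

fields = mi_fields(record)
print(fields["func"], fields["line"])  # -> main 42
```

A real frontend would of course use a full MI parser (records can nest), but the point stands: the machine interface is versioned and documented, unlike the console output.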

The post is a rant about bad error reporting, but it doesn’t acknowledge that error reporting is both tedious and hard. For example, a routine to copy a file, given a source filename and a destination filename. First point of failure—can’t open the source file. Second point of failure—can’t open the destination file. Hard problem number one—if the destination exists, is that an error? Or not? Do you allow the option to overwrite the destination? Okay, still can’t open the destination file; hard problem number two—you need to close the open source file (else you leak an open file descriptor), but the close can fail. If it fails, do you report that error? Or the original failure? Can the language you use even allow multiple error codes to be returned? Hard problem number three—how do you return which file had the error? Can your language do that? It doesn’t matter whether your language uses return codes or exceptions, it’s still the same issue—what if your exception handler throws an exception (and not via an explicit “throw”)?

                                                                            And I never did get to the actual copying of data either …
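Those decision points can be made concrete with a minimal Python sketch. The function and its list-of-errors return convention are my own invention here, and the actual data copy is elided:

```python
import os

def copy_file(src, dst, overwrite=False):
    """Sketch of the error-reporting decisions hiding in a 'simple' copy.

    Returns a list of (step, path, error) tuples: collecting every failure
    means a failed close can't silently mask the original error, and the
    caller learns *which* file was involved.
    """
    errors = []
    try:
        src_fd = os.open(src, os.O_RDONLY)
    except OSError as e:
        return [("open source", src, e)]              # failure point 1
    flags = os.O_WRONLY | os.O_CREAT
    flags |= os.O_TRUNC if overwrite else os.O_EXCL   # policy: existing dst = error?
    try:
        dst_fd = os.open(dst, flags, 0o644)
    except OSError as e:
        errors.append(("open destination", dst, e))   # failure point 2
    else:
        # ... the actual data copy, elided, has its own failure modes ...
        try:
            os.close(dst_fd)
        except OSError as e:
            errors.append(("close destination", dst, e))
    # cleanup must happen even after an earlier failure, and can itself fail
    try:
        os.close(src_fd)
    except OSError as e:
        errors.append(("close source", src, e))
    return errors

# e.g. copy_file("a.txt", "b.txt") -> [] on success, or
# [("open destination", "b.txt", FileExistsError(...))] if b.txt exists.
```

Even this toy version has to take a position on overwrites and on close failures; a language with single return values or simple exceptions makes reporting *all* of it genuinely awkward.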

                                                                      1. 18

Every time I talk to a recent grad I hear a variation of the phrase “I know how to code, I can code in anything”.

                                                                        The other way this fails is languages not derived from ALGOL. I had this attitude a few years into programming when I knew HyperTalk, Visual Basic, PHP, and Python. I was corrected by diving into SQL, assembly, PostScript, Haskell, esolangs…

                                                                        1. 7

                                                                          Algol-derived is a big one, but bigger is whether the language assumes mutable state is the default state of existence, and immutability is either inexpressible or tacked-on.

                                                                          For example, Lisp isn’t Algol derived. However, Algol and Lisp are more similar than different on the level of how data moves through a program, because they both assume mutable state is the default, and an unmarked default at that, with relatively weak (if any) support for immutable state tacked on later, if ever. The essential similarity between Lisp and Algol is really pointed up by Scheme, on one side, and Python, on the other.

                                                                          OTOH, declarative languages, such as SQL, and functional languages, such as Haskell, really break brains because their model of data is very different. Spreadsheets are another data paradigm: Automated data flow from cell to cell, in an implicitly parallel environment.
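The “unmarked mutable default” is easy to see in a quick Python illustration (Python standing in for the Algol/Lisp family here; both functions are hypothetical):

```python
def add_interest_mut(balances, rate):
    # Mutable default: the caller's list is changed in place,
    # and nothing in the signature marks that.
    for i in range(len(balances)):
        balances[i] *= 1 + rate
    return balances

def add_interest_pure(balances, rate):
    # Functional style: the same computation, but immutability has to
    # be opted into explicitly by building a new value.
    return tuple(b * (1 + rate) for b in balances)

accounts = [100.0, 200.0]
add_interest_mut(accounts, 0.5)
print(accounts)            # the caller's data changed: [150.0, 300.0]

accounts2 = (100.0, 200.0)
result = add_interest_pure(accounts2, 0.5)
print(accounts2, result)   # original untouched: (100.0, 200.0) (150.0, 300.0)
```

In a language like Haskell the second style is the only one available; in Python and most Lisps it is a convention you must choose to follow.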

                                                                          1. 3

                                                                            I was surprised when I learned that XSLT is a pure functional language, mainly because I didn’t find it that hard at all (my website is generated from an XML file via XSLT). Verbose, hell yes. Hard? Not really. But in retrospect, I can see the functional nature of XSLT.

                                                                            1. 2

I wonder how hard it would be, then, to switch from Common Lisp to Clojure, considering their completely different takes on mutability.

                                                                            2. 3

                                                                              APL derivatives, Lisplikes, Autohotkey, LabVIEW, Prolog, Excel… I’ve wondered what it would be like to construct a list of “basis” languages that cover all of the different forms of programming, and if such a list is even possible.

                                                                              1. 5

                                                                                Have you looked into programming language genealogy? I find the family tree diagrams especially fascinating. http://rigaux.org/language-study/diagram.html

                                                                                1. 3

@xmodem Is there a field of learning with literature for this? Here is another programming language family tree that I see more often.

                                                                                  I too am fascinated by genealogy and the events that shape programming languages!

                                                                                  1. 4

Concepts, Techniques, and Models of Computer Programming covers several programming paradigms. There’s a diagram from the book that shows their taxonomy.

                                                                                  2. 3

                                                                                    Me too! I created an ascii diagram for fun a few years ago: https://gist.github.com/ChadSki/f0be01dd2556f04753b1

                                                                                    1. 1

Where are COBOL and BASIC in this? They each had a huge impact. The main concept was that programming could be almost as easy as pseudocode for basic applications. A flawed idea, but all kinds of people did productive things with it.

                                                                                      1. 2

                                                                                        COBOL is on line #4.

                                                                                        I think I omitted BASIC because it doesn’t influence enough other languages. Wikipedia lists Visual Basic, VB.Net, RealBasic (Xojo), and AutoHotKey. There aren’t any interesting crossovers with e.g. Lisp or ML.

                                                                                        1. 1

Darn, I don’t know how I overlooked it. Apologies. I guess BASIC could be omitted on that criterion, as COBOL already covered the Code Like English concept. It’s overall a nice tree of languages. +1 for text art. :)

                                                                                  3. 2

My attempt at that was to look at Wikipedia’s list of programming paradigms to find key languages for each, filter out those that don’t have a FOSS implementation, and pick the best of them in terms of stated complexity vs. available learning materials. You then have something close to the list you’re looking for.

                                                                                    1. 3

In true Larry Wall style (Laziness, Impatience, Hubris):
Care to share your results?

                                                                                1. 11

If you read between the lines, it appears that management was content to lay problems at Rick’s doorstep, and didn’t care that Rick and/or the team didn’t take time to document the problem and/or resolution.

                                                                                  …..

Instead of tackling the root cause of the issue (hey man, what’s eating you?), they opted for the quick and easy fix (Hey Rick, GTFO!). Par for the course, as far as I can tell.

If you read the actual text, you’d see that this was something the company had already thought of:

                                                                                  I agree that the situation that came about was also his manager’s fault. He never should have been allowed to take on so much. If it gives comfort to anyone else reading this, the manager went first because ultimately management bears responsibility, always.

                                                                                  They then followed up with:

                                                                                  Rick rejected months of overtures by leadership. He refused to take time off or allow any work to be delegated. He also repeatedly rejected attempts to introduce free open source frameworks to replace hard-to-maintain bespoke tools.

As I mention in a comment on the original post, I’m surprised at how many people are kneejerk defending Rick. In this case, I’m embarrassed for this poster, who not only kneejerk defended him but claimed additional insight into the story, all while ignoring the wealth of info provided by the original author.

                                                                                  Could he have provided this info in the original post? Sure. Why should he have to? What is so special about this particular “we fired a toxic team member” story that everyone is instantly certain they did it wrong? And unwilling to do even the least bit of additional reading about it?

                                                                                  Why does this story of Rick prompt such irrational, emotional responses?

                                                                                  1. 22

                                                                                    Why does this story of Rick prompt such irrational, emotional responses?

                                                                                    Explaining why someone was terminated within a company is a really delicate task. Doing so on the internet requires even more tact.

                                                                                    Comparing the terminated employee with a narcissistic, nihilistic, and downright crazy cartoon character doesn’t demonstrate much respect for the terminated employee or the seriousness of the situation. I think that’s the main reason the original article left a bad taste in my mouth.

                                                                                    1. 5

                                                                                      Thanks @davidholman, it’s bizarre to me that someone could think the comparison or even the title of the original blog post are any acceptable way for a manager to discuss other colleagues.

                                                                                    2. 8

Hey, thanks for the reply. I did apparently miss something in the original - likely as it was hidden underneath the blanket of scapegoating. The “actual text” link from you is, however, a completely different article that I had not yet seen.

                                                                                      To answer the question you pose at the bottom: it’s because many of us have been there. Either directly involved or on the sidelines. We’ve seen the personalities and the egos and the mismanagement. It’s a difficult subject. However I wouldn’t call the responses “irrational”. Emotional, yes, but those empathetic enough will relive their own personal experiences and react. I worked for a toxic company for several years, and saw some bad shit. I saw crazy nepotistic owners oversell the world and then fire those that they used after burning them out to the core. Those who survive take away an insight that we shouldn’t need. Ask me how many times a day I get to say “no” to some absurd request now :)

                                                                                      1. 3

                                                                                        The “actual text” link from you is a completely different article however

                                                                                        It’s a comment on the original article. Medium treats it as an additional document, but it’s eminently findable on the original article page.

                                                                                        However I wouldn’t call the responses “irrational”.

                                                                                        How is it rational? A rational response to “somebody I don’t know got fired” might be something like “did he deserve to be fired? I’ll look into that”, or “something sounds fishy about this story. If I feel the need to post my own essay response, it will be asking those questions and examining different ways they could be answered”.

                                                                                        Not “I’m now going to post a kneejerk rant against imagined management problems, because Rick deserved better!”. That seems textbook irrational to me.

                                                                                        those empathetic enough will relive their own personal experiences and react.

                                                                                        I did. I’ve been burnt by Ricks before. I suspect I’ll be burnt by Ricks again. My response is similar to the original article author’s: fix systemic problems where possible, train toxic people when possible, but fire toxic people who insist on remaining toxic.

                                                                                      2. 8

                                                                                        Could he have provided this info in the original post? Sure. Why should he have to? What is so special about this particular “we fired a toxic team member” story that everyone is instantly certain they did it wrong? And unwilling to do even the least bit of additional reading about it?

                                                                                        The guy who wrote the original article, which is 99% scapegoating “Rick”, is a manager who seriously thinks literally months of 12-hour days 7 days a week is a good idea: https://medium.freecodecamp.org/our-team-broke-up-with-instant-legacy-releases-and-you-can-too-d129d7ae96bb :

                                                                                        It took eight months of seven-day weeks and twelve-hour days to complete our last legacy system overhaul.

                                                                                        No wonder “Rick” burnt out.

                                                                                        1. 6

                                                                                          The title of the story is about firing Rick, and being proud of it, not “we fucked up bad and we unfortunately had to fire someone”. The content of the article is 99% about how Rick was to blame for everything. I can’t even find the link you gave off the front page; I assume it’s nested somewhere in the content. So your claim that you just have to read the actual text and everyone is freaking out over nothing doesn’t jibe with reality:

                                                                                          The author, as management, does not take responsibility in the original post.

                                                                                        1. 4

                                                                                          Interestingly, this is one of the questions that has a precise mathematical solution (when it is idealized, of course). Essentially, if there is a fixed cost to own a thing versus a recurring cost to rent it each time period, you can break even (on average) by buying the thing just after you have spent as much on rent as it would have cost to buy it in the first place.
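
                                                                                          This idealized tradeoff is usually called the “ski rental” problem (a name I’m supplying; the comment doesn’t use it). A minimal sketch of the break-even strategy described above, with made-up prices:

                                                                                          ```c
                                                                                          #include <stdio.h>

                                                                                          /* Made-up numbers for illustration: buying costs PRICE up front,
                                                                                             renting costs RENT per period. The strategy below rents until the
                                                                                             total rent paid reaches the purchase price, then buys -- so its
                                                                                             worst-case cost is at most twice what a clairvoyant buyer pays. */

                                                                                          #define PRICE 300
                                                                                          #define RENT   10

                                                                                          /* cost of the break-even strategy if the thing is used for n periods */
                                                                                          static int break_even_cost(int n)
                                                                                          {
                                                                                              int rent_periods = PRICE / RENT;    /* rent 30 times, then buy */
                                                                                              if (n <= rent_periods)
                                                                                                  return n * RENT;                /* never needed to buy */
                                                                                              return rent_periods * RENT + PRICE; /* paid PRICE in rent, then bought */
                                                                                          }

                                                                                          /* cost with perfect foresight: rent the whole time, or buy on day one */
                                                                                          static int optimal_cost(int n)
                                                                                          {
                                                                                              return n * RENT < PRICE ? n * RENT : PRICE;
                                                                                          }

                                                                                          int main(void)
                                                                                          {
                                                                                              for (int n = 10; n <= 60; n += 10)
                                                                                                  printf("n=%2d  strategy=%3d  optimal=%3d\n",
                                                                                                         n, break_even_cost(n), optimal_cost(n));
                                                                                              return 0;
                                                                                          }
                                                                                          ```

                                                                                          For n = 60 the strategy pays 600 against an optimal 300, which is exactly the factor-of-two bound the break-even rule guarantees.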

                                                                                          1. 4

                                                                                            The problem you linked is trading off buying versus renting when the future use is uncertain, but most people expect to either own a home or pay rent every month until they die. The NYT calculator is mainly about calculating the NPV of two streams of payments and trading off the opposing opportunity costs (investing your down payment vs missing out on rising home values).

                                                                                          1. 4

                                                                                            Stupid question: can I just put the local prices (I don’t live in the USA) there and have some meaningful results? In other words: is there anything US specific?

                                                                                            1. 2

                                                                                              In the US, one can deduct mortgage interest paid on their primary residence from their income. The calculator factors this in, so you could set the marginal tax rate slider to 0% and that should remove it from the calculations if you don’t have an equivalent deduction in your country.
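
                                                                                              A rough sketch of that arithmetic (every number here is invented for illustration): deductible interest effectively costs (1 - marginal rate) per dollar, so setting the rate to 0% removes the effect entirely:

                                                                                              ```c
                                                                                              #include <stdio.h>

                                                                                              int main(void)
                                                                                              {
                                                                                                  double annual_interest = 12000.0; /* hypothetical interest paid per year */
                                                                                                  double marginal_rate   = 0.24;    /* hypothetical US marginal bracket    */

                                                                                                  /* with the deduction, each dollar of interest costs (1 - rate) dollars */
                                                                                                  printf("effective cost with deduction: %.2f\n",
                                                                                                         annual_interest * (1.0 - marginal_rate));   /* 9120.00 */

                                                                                                  /* slider at 0%: the deduction drops out of the comparison */
                                                                                                  marginal_rate = 0.0;
                                                                                                  printf("effective cost without:        %.2f\n",
                                                                                                         annual_interest * (1.0 - marginal_rate));   /* 12000.00 */
                                                                                                  return 0;
                                                                                              }
                                                                                              ```

                                                                                              (As the replies below note, the deduction only helps in practice if your itemized total beats the standard deduction.)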

                                                                                              1. 3

                                                                                                I mean you can, but I haven’t paid enough interest to justify itemizing for years at this point. I think, like… two or maybe three years of my mortgage generated enough. Another year of my business generated enough. Mostly, though, it hasn’t been worth doing any deductions.

                                                                                                1. 3

                                                                                                  If you live in a state with state income tax then that deduction alone can put you over the standard deduction to start with, so the home mortgage interest will be added on top even if it’s small on its own.

                                                                                                  1. 1

                                                                                                    Wait, you can deduct state income tax? Do you know if online services like TurboTax take that into account when recommending whether you should itemize or not? I’ve never even attempted to itemize because I assumed it wouldn’t be worthwhile…

                                                                                                    1. 4

                                                                                                      Every few years, get a CPA to do your taxes. Find out what you’ve been doing wrong. Re-file. Then use the program for a few more years.

                                                                                                      1. 3

                                                                                                        Every tax program should handle that.

                                                                                              1. 4

                                                                                                Such was the state of C programming tutorials in 2007. Plenty of lies about heap, stack, global variables, and other made-up features not defined in the C spec. No mention of undefined behavior.

                                                                                                1. 6

                                                                                                  But those are all terms that experienced C programmers use commonly. Anyone wanting to learn the language would need to understand this to communicate with people who already know C and have used it for years.

                                                                                                  Also, a lie is an intentional falsehood. Accusing someone of putting intentional falsehoods in their free tutorial seems like an unnecessarily damning accusation to throw around.

                                                                                                  1. 3

                                                                                                    I would have phrased it differently, but I think there’s a big difference between learning C programming as a concept, and then noting there are some practical implications for real world implementation, vs trying to teach a particular implementation. People who focus on “what really happens” ironically seem to make the most mistakes. More chances to go astray I think. For instance, calling static variables heap variables seems very error prone given the common advice to also free heap memory.

                                                                                                    1. 2

                                                                                                      But those are all terms that experienced C programmers use commonly. Anyone wanting to learn the language would need to understand this to communicate with people who already know C and have used it for years.

                                                                                                      Of course. So instead of abusing terms, what a good tutorial might do is properly explain these terms and how they relate to typical implementations of C. Having explained said terms is no excuse for not also explaining and then using actual defined C concepts such as scopes, storage durations and linkage. There’ll be fewer lies to unlearn.
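
                                                                                                      For instance, the standard’s own vocabulary fits in a few lines (a sketch of my own, not taken from any tutorial):

                                                                                                      ```c
                                                                                                      #include <stdio.h>
                                                                                                      #include <stdlib.h>

                                                                                                      int file_scope_var = 1;     /* static storage duration, external linkage */
                                                                                                      static int private_var = 2; /* static storage duration, internal linkage */

                                                                                                      void counter(void)
                                                                                                      {
                                                                                                          static int calls = 0;   /* static storage duration, no linkage:
                                                                                                                                     persists across calls, but is NOT
                                                                                                                                     "heap" and must never be free()d */
                                                                                                          int local = 0;          /* automatic storage duration: typically a
                                                                                                                                     stack slot, though the standard never
                                                                                                                                     says "stack" */
                                                                                                          calls++;
                                                                                                          local++;
                                                                                                          printf("calls=%d local=%d\n", calls, local);
                                                                                                      }

                                                                                                      int main(void)
                                                                                                      {
                                                                                                          int *p = malloc(sizeof *p); /* allocated storage duration: the one
                                                                                                                                         case where free() is required */
                                                                                                          if (!p)
                                                                                                              return 1;
                                                                                                          *p = file_scope_var + private_var;

                                                                                                          counter();                        /* prints calls=1 local=1 */
                                                                                                          counter();                        /* prints calls=2 local=1 */
                                                                                                          printf("allocated: %d\n", *p);    /* prints allocated: 3 */

                                                                                                          free(p);
                                                                                                          return 0;
                                                                                                      }
                                                                                                      ```

                                                                                                      Teaching it this way makes the footgun from the sibling comment obvious: only the malloc’d object needs freeing, however much a “heap variables” framing suggests otherwise.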

                                                                                                    2. 1

                                                                                                      Where would you point people for a C tutorial relevant in 2017?

                                                                                                      1. 12

                                                                                                        I’m not aware of one I’d really want to endorse.

                                                                                                        “Modern C” gets more things right than your typical C tutorial – which go to the greatest lengths to avoid using standard terminology and make up their own nonsense instead. Unfortunately Modern C is also way more verbose than it needs to be (and does a poor job of getting to the point), and comes with plenty of dogma. It’s not entirely free of nonsense either. But it’s probably among the best of the bunch.

                                                                                                        http://icube-icps.unistra.fr/index.php/File:ModernC.pdf

                                                                                                        1. 1

                                                                                                          I have this one on the queue; it seems pretty clear headed. But you’re right, it is very long winded.

                                                                                                          1. 1

                                                                                                            One approach I’ve recommended to people in the past is to just pick up whatever tutorial they need to get started with (basic syntax and concepts, a few examples), then grab a copy of the C standard drafts and just start working with real code. Look things up as you go. Read the man pages (especially from OpenBSD) for any library function you encounter. Search for dowd_ch06.pdf and read that carefully. Expert C Programming is a decent read too, once you’ve got things rolling.

                                                                                                            It’ll take a while to get all the details right, but C is ultimately a fairly simple language. It shouldn’t need a huge exposition like Modern C; and if a newcomer starts by reading one, chances are they’ll forget most of the details anyway, or fail to appreciate their significance. For reference material, one might as well go straight to N1256 (or preferred version).

                                                                                                        2. 1

                                                                                                          Learn C The Hard Way, which is a paid book now :( The online version was taken down… but you can still find it

                                                                                                          1. 5

                                                                                            I’ve worked about a third of the way through the free version. Shaw is sort of hard to take, especially when I don’t know enough about what he’s writing about to judge for myself. His opinions on Python 2 vs. 3 soured me on using him as an initial tutorial for, well, pretty much anything.

                                                                                                      1. 3

                                                                                                        This isn’t “the hard thing” about software development, as if there were only one.

                                                                                                        First of all, employers want to make regular coding work a commodity. They’ve succeeded. The race to the bottom, as described in the article, exists. The product is junk, but (a) you can always hire better engineers later, right? and (b) no middle manager gets fired for buying cheap and squeezing down.

                                                                                                        The issue is that people who are smart enough to solve tough problems (i.e., to do the work that commodity rent-a-coders can’t do) are generally averse to being typecast as business subordinates. They’ll only care about executive-level business concerns if they’re paid and treated at least as well as the execs, if not better (since a half-decent programmer– okay, my bar for “half-decent” might be high and someone else’s “quite good”– is smarter than 95% of non-technical VPs).

                                                                                                        Employers want contradictory things from programmers. They want them to care about the business and think proactively about the business’s needs, but they also want for programmers to be subordinates. You really don’t get both, not from a smart person.

                                                                                                        1. 2

                                                                                                          It’s not just an unwillingness to be subordinate. Many companies (if not the vast majority) have an openly hostile environment for engineers that want to understand and contribute to the business side of things. Stepping out of line creates drama that many technical people don’t have the emotional self-management skills to tolerate.

                                                                                                          This is where strong engineering management matters. If you’re a good EM that protects talented people, then they stick around and grow. If you let them get kicked around by insecure product managers who can’t tolerate anyone else understanding how the business works, then as soon as someone progresses past the junior engineer stage, they’ll be looking for a new job where their opinions matter.

                                                                                                          The fact that the author can’t find good senior engineers might say something about the culture he’s immersed in.

                                                                                                        1. 1

                                                                                                          Every single CEO of any IT company wants to build software faster. Time is the most expensive and valuable resource. You can’t waste it on re-work, refactoring, meetings, physical activities.

                                                                                                          I have literally never worked for anyone with this belief. Stability, brand reputation, security, and financial correctness have always taken priority over time to market. I am certain there are very, very rare exceptions, but they rarely pass the sniff test.

                                                                                                          I think the whole culture around delivery speed is a natural and logical business reaction to the uncertainty inherent in a part-creative discipline like software. Anyone who is realistically concerned about such things is in desperate need of a talented manager.

                                                                                                          1. 5

                                                                                                            Stability, brand reputation, security, and financial correctness have always taken priority over time to market.

                                                                                                            Based on what I’ve read and experienced it seems like you’ve had exceptional leadership where you’ve worked. Can you share the company or if there’s some way to identify a workplace like this? You can ask directly in an interview but everyone claims they care about quality…

                                                                                                            1. 2

                                                                                                              the one example I think I can safely share was AMZN. (this was over 7 years ago, so adjust numbers accordingly) when touching anything in the ordering pipeline, the first question was “is this stable” and the last metric viewed was “did the order rate dip”. when you are processing 2k - 20k orders/second and the average user won’t retry after a code 500, a 60 second outage could cost a million bucks.

                                                                                                            2. 3

                                                                                              Totally agreed. If a CEO wants speed and you explain the trade-offs that will be made, and he or she still wants speed, it’s then a matter of strategy. People are not all evil, and when you explain to them why quality has a cost, they understand it and, from my limited experience, play with it a few times and find the right gauge.

                                                                                              I recently listened to an old DevOps Café podcast episode that talked about this issue. The interviewee explained that they put a speed scale on the wall for the Product Managers, explaining that faster is possible but has costs, and then let the PMs change the speed. Interestingly, the PMs understood pretty quickly that setting it above 80% wasn’t a good idea.