1. 13

    Ah they tricked me with this one, it’s a Medium article hidden behind another domain.

    (Whenever I see “medium.com” next to lobsters articles I know not to click, since the result will be a weak thinkpiece by a frontend developer, wrapped in obtrusive markup.)

    1.  

      i had literally the exact same response. “Ah, a medium article….about frontend dev……(tab closed)”.

      1. 3

        Interesting ‘hot take’!

        You judge people based on the ‘medium’ that they use.

        1. 8

          “The medium is the message” ;)

          I have to admit though that seeing a medium link is generally a negative signal for me. Still click on many of them.

          1. 7

            I think Medium’s original USP was “only quality content”.

            Predictably, that didn’t scale.

            1.  

              Many confuse Marshall McLuhan’s original meaning of that phrase. It didn’t really mean that the way a message was delivered was part of the message itself. It actually meant that the vast majority of messages were medium or average.

              It would have been better said, “meh, the message is average.”

              1.  

                Interesting interpretation. I am not sure how he originally came to that phrase, but his book certainly spent a lot of time and effort arguing for the now prevalent meaning.

                1.  

                  This didn’t really make sense to me, so I looked it up, and I don’t think that’s right. The original meaning is exactly what we’ve come to understand it as:

                  The medium is the message because it is the medium that shapes and controls the scale and form of human association and action. The content or uses of such media are as diverse as they are ineffectual in shaping the form of human association. Indeed, it is only too typical that the “content” of any medium blinds us to the character of the medium. (Understanding Media: The Extensions of Man, 1964, p.9)

                  I wonder where you’ve heard your interpretation?

                  1.  

                    This comment is obviously a troll. Fitting, given that McLuhan himself was a troll.

            1. 3

              I post rarely, and there’s a lot of far-left politics in there as well: https://flowing.systems/

              1. 6

                What are the actual benefits of compiling to wasm rather than JS? One drawback is that, at least for the near future, a wasm module has its own heap rather than participating in the garbage-collected heap of the JS engine. It’s also likely that wasm means a larger runtime footprint.

                1. 3

                  Betting on future states of technology maybe?

                  1. 1

                    Garbage collection

                    1. 1

                      How is that a benefit? I would agree if compilation to WebAssembly avoids GC, but Grain seems to require GC anyway.

                      1. 1

                        Yes, but you get to pick and ship the GC for your application.

                    2. 1

                      Fun? Experimentation?

                      1. 0

I see no benefit. I said the same thing about the Go WebAssembly backend; there is already GopherJS.

                      1. 1

                        My criticism of Stallman, think it was posted here before: http://flowing.systems/2016/09/24/a-short-critique-of-stallmanism/

                        1. 30

Another important political aspect of Material Design (and some other UI/web styles that are popular now) is “minimalism”. Your UI should have few buttons. The user should have no choices. The user should be a consumer of content, not a producer. Having play and pause buttons is enough. The user should have few choices in how and what to consume — the recommender system (“algorithmic timeline”, “AI”) should tell them what to consume. This rhetoric is repeated over and over in web and mobile dev blogs.

Imagine a graphics editor or a DAW with “material design”. It’s nearly impossible. The style is suitable only for scroll-feed consumption and “personal information sharing” applications.

Also, it’s “mobile-first”, because Google controls mobile (80% market share or something like that). Some pages on Google itself (e.g. account settings) look on desktop like I’m viewing them on a giant handset.

P.S. Compared with the “hipster” modernist things of ~2010, which were often nice and “warm”, Material Design looks really creepy to me even considering only the visual appearance.

                          1. 10

                            A potentially interesting challenge: What does a design language for maker-first applications look like?

                            1. 17

Not sure if such design languages exist, but from what I’ve seen, I have a feeling that every “industry” has its own conventions and guidelines, and everything is very inconsistent.

• Word processors: lots of toolbar buttons (still lots of them now, but in “ribbons”, which are just tabbed widgets). Use of ancient features like the Scroll Lock key. Other types of apps usually put actions in menus or in searchable “run” dialogs, not a toolbar button for each feature.
• Graphics editors: narrow toolbars with very small buttons (popularized by both Adobe and Macromedia, I think). Various non-modal dialogs have widgets of nonstandard small size. Dark themes.
• DAWs: lots of insane skeuomorphism! Everything should look like real synths and effects, with lots of knobs and skinning. Dark themes. Nonstandard widgets everywhere. A single program may have many different styles of widgets (e.g. Reason, Fruity Loops).
• 3D: complicated window splits, use of all 3 mouse buttons, also dark themes. Nonstandard widgets, again. The UI has heritage from Silicon Graphics workstations and maybe the Amiga.

I thought UI guidelines for desktop systems (as opposed to cellphone systems) had lots of recommendations for such data-editing programs, but it seems not: they mostly describe how to place standard widgets in dialogs. The macOS guidelines are based on the programs included with macOS, which are mostly for regular consumers or “casual office” use. The Windows and GNOME guidelines even try to combine desktop and mobile into one thing.

Most “editing” programs ignore these guidelines and have a non-native look and feel (often the same look and feel across OSes).

                              1. 3

3D: complicated window splits, use of all 3 mouse buttons, also dark themes. Nonstandard widgets, again. The UI has heritage from Silicon Graphics workstations and maybe the Amiga.

                                Try Lisp machines. 3D was a strong market for Symbolics.

                              2. 9

I’d suggest–from time spent dealing with CAD, programming, and design tools–that the biggest thing is having common options right there, and not having an overly spiffy UI. Ugly Java Swing and MFC apps have shipped more content than pretty interfaces with notions of UX (notable exceptions tend to be music tools and DAW stuff, for reasons incomprehensible to me). A serious tool-user will learn their tooling and extend it if necessary, if the tool is powerful enough.

                                1. 0

                                  (notable exceptions tend to be music tools and DAW stuff, for reasons incomprehensible to me)

                                  Because artists demand an artsy-looking interface!

                                2. 6

                                  We had a great post about two months back on pie menus. After that, my mind goes to how the Android app Podcast Addict does it: everything is configurable. You can change everything from the buttons it shows to the tabs it has to what happens when you double-click your headset mic. All the good maker applications I’ve used give me as much customization as possible.

                                  1. 2

                                    It’s identical to the material design guidelines but with a section on hotkeys, scripts, and macros.

                                  2. 5

                                    P.S. compared with “hipster” modernist things of ~2010

What do you mean by this?

                                    1. 4

                                      Stuff like Bootstrap mentioned there, early Instagram, Github. Look-and-feels commonly associated with Silicon Valley startups (even today).

These things usually have the same intentions and sins mentioned in this article, but at least they don’t look as cold and dead as Material Design.

                                      1. 3

                                        Isn’t this like… today? My understanding was: web apps got the material design feel, while landing pages and blogs got bootstrappy.

                                        I may be totally misinterpreting what went on though

                                      2. 3

                                        Bootstrap lookalikes?

                                    1. 2

                                      This makes sense. I adopted WSL recently because my corporate overlords will let me remote in using a Windows 10 AWS workspace.

                                      WSL overall is an impressive piece of work but the Windows Console is… Challenging. It’s the single biggest fly in my WSL ointment.

Tools like ConEmu and Cmder help, but they can’t address the underlying slowness of the APIs in question. I suspect it may be the fact that the Windows console isn’t just a text entity. With things like ConEmu you can actually embed “simple GUI apps” like PuTTY into the console, AFAICT.

The key blocker for me, which I hope they fix/improve, is the basic inability to select multiple pages of text from the console to paste elsewhere.

                                      This is a real pain for me since our workflow is - run command, copy&paste output into a web form, lather, rinse, repeat.

This post provided a usable workaround - mainly piping to clip.exe - and I’ve been hobbling by with that, but I definitely do miss Terminal.app/iTerm2!

                                      1. 3

                                        FWIW, I pipe to xclip even on Linux (and pbcopy on macOS). It’s not like copy&paste is pleasant elsewhere.

                                        1. 2

                                          It’s just Ctrl+Shift+C/V on Linux/Mac. Always was pleasant to me…

                                          1. 1

Nit: It’s Cmd-C/Cmd-V on a Mac :)

                                            1. 1

                                              Right! Is regular copy/paste that as well?

                                              1. 1

                                                Yes. That is “regular” copy/paste.

                                          2. 1

                                            So, yeah I see where you’re coming from, but cutting & pasting in iTerm2 / Terminal is LIGHTNING fast and super responsive. I can bulk select several pages of text, cut and paste. Boom. No problem.

                                            Try that in a Windows console, sadness will ensue.

Your post gave me an idea though: I can use the logging functions of screen to best advantage. Start logging (C-a H), run the invocation, stop logging (C-a H again), suspend that screen session, copy the log to the WSL host, then delete it. Then on the WSL side just cat log | clip.exe and delete. Lather, rinse, repeat.

                                        1. 2

                                          The use of discrimination to meet quotas at the expense of merit is exactly the opposite of what made diversity initiatives laudable in the first place.

                                          This is exactly the type of novel, interesting, and may I add brave writing that I have come to expect from Quillette.

                                          1. 4

Not to spoil the article, but the ending quote in bold really sort of scares me. I’ve spent most of my life hearing stories about how machines will take human jobs. The reality of that has played out much less scary (so far) than they’d have had us believe 20 or 30 years ago. It had never really occurred to me, though, that in another 20 or 30 years my job as a programmer might be obsoleted as well. It’s like The Matrix: funny ha ha, but for real.

                                            Thankfully I hope to be retired 30 years from now :P

                                            1. 4

                                              I think this is the natural progression of things, isn’t it? Programming isn’t immune to the effects of automation - just the opposite, in fact. It’s like boiling a frog - things are automated so often and so incrementally that programmers no longer notice when jobs that would have taken 10x longer a few years ago are basically instantaneous today.

                                              1. 11

                                                Programming will be the last thing to be automated, because it is itself automation - once you have automated programming you just have to run your automated programmer and then you’ve automated everything.

                                                1. 2

                                                  …No. The only thing that will save programming from being automated NEXT is… wait, I see what you did there. “Your keys are always found in the last place you look.” :)

                                                  On a serious note, regarding future job prospects, I think programming will not be the last available job. Some job that isn’t an attractive candidate for automation will be the last available job. Programming, with all its expense, is a prime target.

                                                  1. 4

                                                    Once you can automate programming you can automate everything else at approaching 0 cost, so it’s moot.

                                                    1. 1

Can you? I would imagine lots of jobs rely on intrinsically tacit, “local” intuition, and not merely knowledge and cognitive function, which seems to me to be all that “solving programming” automatically entails.

                                                      1. 1

                                                        Programming often relies on intrinsically tacit, local intuition. I mean think of the last time you received feedback from the customer about how they felt the software should work.

                                                        1. 1

Good point, I didn’t think about that end of the situation.

                                                2. 2

Hopefully, this allows them (and me) to do their (and my) jobs more efficiently, and to focus on other, more important things. Of course, other stuff will eventually fall into obsolescence, but don’t we have graveyard keepers working on decrepit technologies for sizeable amounts of money? COBOL experts, where art thou?

                                                  1. 2

All very true. I think the reality just sort of startled me.

                                                  2. 5

                                                    This is why it’s important to move past capitalism ASAP: it’s more and more immoral to couple the ability to get a job with the ability to stay alive and retain dignity. Once all labor is automated, there shouldn’t be any jobs (coerced or obligatory labor), and we should all be rejoicing.

                                                    1. 0

Will there still be a free market? Or will what we consume be planned by the machines? At which point, without the ability to decide what I want - or the illusion thereof - my job as a human is done too…

                                                      1. 5

                                                        woe to those who think their job as humans is to consume

                                                        1. 1

                                                          I eat, therefore I am.

                                                        2. 1
                                                          1. We all make the world;
                                                          2. define “free market”.
                                                          1. 0

There is a medium of exchange (please, not barter) and a market for goods and services. I have goods/services to offer and I have goods/services I need. I have markets where I can sell and buy these. The market is not controlled by a commissariat which determines how much toothpaste I get and what color tube it comes in, because, for reasons most people cannot fathom, I like to choose.

                                                            1. 1

you can choose what color tube your toothpaste comes in?

                                                              1. 0

                                                                In capitalist America, toothpaste color chooses you!

                                                              2. 0

What is available in these markets? What is not? How are their dynamics damped, to avoid bubbles and crashes? How are negative externalities, like advertising or air pollution, accounted for? You throw around the “free” as though its interpretation were obvious, when the devil is in the details, and the details are everything.

                                                                1. 0

                                                                  This is strawman nonsense, and nowhere do I imply central planning. What you’re really saying is, “I want freedom of choice for consumption and production,” which doesn’t require capitalism, though you’re strongly implying you think it does.

                                                                  1. 0

                                                                    You need to elaborate your scheme then. Every time I’ve heard someone say “I hate capitalism and I have an alternative for it” what they really have is state capitalism (AKA communism in practice as opposed to the silly theory of communism written down somewhere).

                                                                    1. 0

                                                                      The universal means of production (automated labor), universally distributed.

                                                                      1. 0

                                                                        Who decides resource allocation?

                                                                        1. 0

                                                                          Who decides it now?

                                                                          1. 0

                                                                            The market

                                                                            1. 0

                                                                              How’s that workin’ out.

                                                                              1. 0

                                                                                Better than anything else people have tried.

                                                                                1. 0

                                                                                  Citation needed.

                                                                                  1. 0

                                                                                    Also, punch cards were better than anything that came before, and then we had better ideas that were enabled by advancing technology. It’s time we did the same for meeting basic human needs.

                                                                                    1. -1

                                                                                      You haven’t actually said what the replacement is for free markets and capitalism.

                                                                                      1. 0

                                                                                        Start with democratic socialism. End with technological post-scarcity.

                                                                                        1. 0

                                                                                          All countries with governments are socialist, not all are democratic, and not all have free markets. So that doesn’t add anything new.

Post-scarcity is another way of saying we have no plan for how to deal with resource contention, which is the hard problem.

                                                              3. -1

                                                                it’s more and more immoral to couple the ability to get a job with the ability to stay alive and retain dignity.

                                                                What dignity is possible once you’re livestock to be taken care of?

The truth of the matter is that there’s an ongoing demographic implosion. If they wait it out a while, there won’t be that many people who have to have the universal income or whatever it is you’re arguing for.

                                                                1. 3

                                                                  You’re assuming that dignity and purpose are only possible under conditions of coerced labor. Your premise is false.

                                                                  I’m not arguing for UBI. I’m arguing for democratic access to the means of universal production (robotic labor, molecular nanotech, etc.), removing the need for things like “income”.

                                                            1. 2

                                                              I’m a student of philosophy, biology, and mathematics.

                                                              quotes Deleuze in an article about functional programming

                                                              Are you me lmao. Although I’m a math student:) How did you get into Deleuze and Guattari?

                                                              1. 3

                                                                Accidentally :) I wrote my undergrad thesis on Levinas, so I was exposed to the French thinkers and naturally read bits of Foucault and D&G.

                                                                1. 3

                                                                  Deleuze is my favorite :D

                                                                  1. 1

Wow, that’s so great to hear! :D Deleuze is one of the primary reasons behind the gust of motivation for mathematics I’ve gained in the past year. It’s really great how he made me sprawl in so many directions, from dynamical systems through formal logic to abstract algebra. It was also really awesome to see these intersect when, in the middle of studying for my analysis 2 exam, I stumbled upon an article connecting Taylor series and major analysis topics to algebraic types. :D

                                                                1. 3

                                                                  This is a monad tutorial tutorial.

                                                                  1. 2

                                                                    I doubt that was the intention? It’s more like a personal experience report.

                                                                  1. 10

                                                                    No, you don’t need C aliasing to obtain vector optimization for this sort of code. You can do it with standards-conforming code via memcpy(): https://godbolt.org/g/55pxUS
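For reference, the memcpy() pattern looks roughly like this (a sketch with names of my own choosing, not the code from the godbolt link): instead of casting the byte pointer to `uint32_t *`, you memcpy() each word into a local, which compilers recognize and lower to a plain load.

```c
#include <stdint.h>
#include <string.h>

/* Sum a byte buffer as 32-bit words without pointer aliasing: memcpy()
 * into a local uint32_t instead of casting the pointer. Compilers
 * recognize this idiom and emit a direct (often vectorized) load; no
 * actual copy survives optimization. */
uint32_t sum32(const unsigned char *buf, size_t len)
{
    uint32_t acc = 0;
    for (size_t i = 0; i + 4 <= len; i += 4) {
        uint32_t w;
        memcpy(&w, buf + i, sizeof w);  /* well-defined at any alignment */
        acc += w;
    }
    return acc;  /* trailing bytes (len % 4) are ignored in this sketch */
}
```

The same shape works for stores (memcpy() out of a local), which is why it serves as the standards-conforming replacement for type-punning casts.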

                                                                    1. 2

                                                                      Wow, it’s actually completely optimizing out the memcpy()? While awesome, that’s the kind of optimization I hate to depend on. One little seemingly inconsequential nudge and the optimizer might not be able to prove that’s safe, and suddenly there’s an additional O(n) copy silently going on.

                                                                      1. 2

                                                                        memset/memcpy get optimized out a lot, hence libraries making things like this: https://monocypher.org/manual/wipe
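The usual trick behind such wipe functions (a sketch, not Monocypher’s actual implementation) is to store through a volatile-qualified pointer, which the optimizer must treat as an observable side effect:

```c
#include <stddef.h>

/* A wipe the optimizer can't delete: writes through a volatile pointer
 * are observable side effects, so dead-store elimination won't remove
 * them the way it removes a plain memset() just before free(). */
void secure_wipe(void *buf, size_t len)
{
    volatile unsigned char *p = buf;
    while (len--)
        *p++ = 0;
}
```

C11 also added memset_s() with the same guarantee, though its availability is spotty, which is why libraries keep shipping their own.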

                                                                        1. 1

                                                                          Actually it’s not optimizing it out, it’s simply allocating the auto array into SIMD registers. You always must copy data into SIMD registers first before performing SIMD operations. The memcpy() code resembles a SIMD implementation more than the aliasing version.

                                                                        2. 1

You can - and thanks for the illustration - but the memcpy is antithetical to the C design paradigm, in my always humble opinion. And my point was not that you need aliasing to get the vector optimization, but that aliasing does not interfere with the vector optimization.

                                                                          1. 8

                                                                            I’m sorry but the justifications for your opinion no longer hold. memcpy() is the only unambiguous and well-defined way to do this. It also works across all architectures and input pointer values without having to worry about crashes due to misaligned accesses, while your code doesn’t. Both gcc and clang are now able to optimize away memcpy() and auto vars. An opinion here is simply not relevant, invoking undefined behavior when it increases risk for no benefit is irrational.

                                                                            1. -1

Au contraire. As I showed, the C standard does not need to graft on a clumsy and painful anti-alias mechanism, and programmers don’t need to go through stupid contortions with allocating buffers that disappear under optimization, because the compiler does not need it. My code doesn’t have alignment problems. The justification for the pointer-aliasing rules is false. The end.

                                                                              1. 10

                                                                                There are plenty of structs that only contain shorts and char, and in those cases employing aliasing as a rule would have alignment problems while the well-defined version wouldn’t. It’s not the end, you’re just in denial.
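To make the alignment point concrete, here’s a minimal sketch (function name mine) of loading a word from an arbitrary byte offset, which the memcpy() form handles and the cast form doesn’t:

```c
#include <stdint.h>
#include <string.h>

/* Well-defined at any byte offset, even on strict-alignment targets;
 * compiles to a single (possibly unaligned) load where the ISA allows.
 * The aliasing equivalent, *(const uint32_t *)p, is undefined behavior
 * when p isn't 4-byte aligned and can trap on e.g. classic ARM/SPARC. */
uint32_t load32(const unsigned char *p)
{
    uint32_t w;
    memcpy(&w, p, sizeof w);
    return w;
}
```

A struct packed with shorts and chars routinely leaves wider fields at odd offsets, which is exactly where the cast version breaks and this version doesn’t.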

                                                                                1. -2

In those cases, you need to use an alignment modifier or sizeof. No magic needed. There is a reason that both gcc and clang have been forced to support -fno-strict-aliasing, and both now support may_alias. The memcpy trick is a stupid hack that can easily go wrong - e.g. one is not guaranteed that the compiler will optimize away the buffer, and a large buffer could overflow the stack. You’re solving a non-problem by introducing complexity and opacity.

                                                                                  1. 10

                                                                                    In what world is memcpy() magic and alignment modifiers aren’t? memcpy() is an old standard library function, alignment modifiers are compiler-specific syntax extensions.

memcpy() isn’t a hack; it’s always well-defined, while aliasing can never be well-defined in all cases. Promoting aliasing as a rule is like promoting the equality operator between floats – it can never work in all cases, though it may be possible to define meaningful behavior in specific cases. Promoting aliasing as a rule promotes the false idea that C is a thin layer above contemporary architectures; it isn’t. Struct memory is not necessarily the same as array memory, not every machine that C supports can dereference an int32 inside of an int64, and not every machine can dereference an int32 at any offset. Do you want C to die with x86_64 or do you want C to live?

Optimizations don’t need to be guaranteed when the code isn’t even correct in the first place. First make sure your code is correct, then worry about optimizing. You talk about alignment modifiers, but they are rarely used, and usually only after a bug has already occurred. Code should be correct first, and memcpy() is the rule we should be promoting, since it is always correct. Optimizers can meticulously add aliasing in specific cases once a bottleneck has been demonstrated. You’re solving a non-problem by indulging in premature optimization.

                                                                                    1. 3

                                                                                      Do you want C to die with x86_64 or do you want C to live?

                                                                                      Heh I bet you’d get quite varied answers to this one here

                                                                                      1. 0

The memcpy hack is a hack because the programmer is supposed to write a copy of A to B and then back to A and rely on the optimizer to skip the copy and delete the buffer. So unoptimized, the code may fault on stack overflows for data structures that exist only to make the compiler writers happier. And with a novel architecture, if the programmer wants to take advantage of a new capability - say, 512-bit SIMD instructions - she can wait until the compiler has added it to its toolset and be happy with how it is used.

                                                                                        As for this not working in all cases: big deal. C is not supposed to hide those things. In fact, the compiler has no idea if the memory is device memory with restrictions on how it can be addressed, or memory with copy-on-write semantics, or …. You want C to be Pascal or Java, and then announce that making C look like Pascal or Java can only be solved at the expense of making C unusable for low-level programming. Which programming communities are asking for such insulation? None. C works fine on many architectures. C programmers know the difference between portable and non-portable constructs. C compilers can take advantage of SIMD instructions without requiring C programmers to give up low-level memory access – one of the key advantages of programming in C. Basically, people who don’t like C are trying to turn C into something else and are offended that few are grateful.

                                                                                        1. 4

                                                                                          You aren’t writing a copy of a buffer back and forth. In your example, you are reducing an encoding of a buffer into a checksum. You are only copying one way, and that is for the sake of normalization. All SIMD code works that way; you always must copy into SIMD registers before doing SIMD operations. In your example, the aliasing code doesn’t resemble SIMD code, either syntactically or semantically, as much as the memcpy() code does, and in fact requires a smarter compiler to transform.

                                                                                          The chance of overflowing the stack is remote, since stacks now automatically grow and structs tend to be < 512 bytes, but if that is a legitimate concern you can do what you already do to avoid that situation: either use a static buffer (jeopardizing reentrancy) or use malloc().

                                                                                          By liberally using aliasing, you are assuming a specific implementation or underlying architecture. My point is that in general you cannot assume arbitrary internal addresses of a struct can always be dereferenced as int32s, so in general that should not be practiced. In specific cases you can alias, but those are the exceptions not the rule.

                                                                                          1. 1

                                                                                            All copies on some architectures reduce to: load into register, store from register. So what? That is why we have a high-level language which can translate *x = *y efficiently. The pointer-alias code directly shows programmer intent. The memcpy code does not. The “sake of normalization” is just another way of saying “in order to cooperate with the fiction that the inconsistency in the standard produces”.

                                                                                            In many contexts, stacks do NOT automatically grow. Again, C is not Java. OS code, drivers, embedded code, even many applications for large systems – all need control over stack size. Triggering stack growth may even turn out to be a security failure for encryption code, which is almost universally written in C because in C you can assure time invariance (or you could, until the language lawyers decided to improve it). Your proposal that programmers not only use a buffer, but use a malloced buffer, in order to allow the optimizer (they hope) not to use it, is ridiculous and is a direct violation of the C model.

                                                                                            “3. C code can be non-portable. Although it strove to give programmers the opportunity to write truly portable programs, the Committee did not want to force programmers into writing portably, to preclude the use of C as a “high-level assembler;” the ability to write machine-specific code is one of the strengths of C. It is this principle which largely motivates drawing the distinction between strictly conforming program and conforming program.” ( http://www.open-std.org/jtc1/sc22/wg14/www/docs/n2021.htm)

                                                                                            Give me an example of an architecture where a properly aligned structure with sizeof(struct x) % sizeof(int32) == 0 cannot be accessed via int32s? Maybe the Itanium, but I doubt it. Again: every major OS turns off strict aliasing in the compilers and they seem to work. Furthermore, the standard itself permits aliasing via char* (as another hack). In practice, more architectures have trouble addressing individual bytes than addressing int32s.

                                                                                            I’d really like to see more alias analysis optimization in C code (and more optimization from static analysis) but this poorly designed, badly thought through approach we have currently is not going to get us there. To solve any software engineering problem, you have to first understand the use cases instead of imposing some synthetic design.

                                                                                            Anyways, off to the airport. Later. vy

                                                                                            1. 2

                                                                                              I’m willing to agree with you that the aliasing version more clearly shows intent in this specific case but then I ask, what do you do when the code aliases a struct that isn’t properly aligned? There are a lot of solutions but in the spirit of C, I think the right answer is that it is undefined.

                                                                                              So I think what you want is for the standard to define one specific instance of previously undefined behavior. I think in this specific case it’s fair to ask for locally aliasing an int32-aligned struct pointer to an int32 pointer to be explicitly defined by the standards committee. What I think you’re ignoring, however, is all the work the standards committee has already done to weigh the implications of defining behavior like that. At the very least, it’s not unlikely that there will be machines in the future where implementing the behavior you want will be non-trivial. Couple that with the burden of a more complex standard. So maybe the right answer, to maximize global utility, is to leave it undefined and to let optimization-focused coders use implementation-defined behavior when it matters but, as I’m arguing, use memcpy() by default. I tend to defer to the standards committees because I have read many of their feature proposals and accompanying rationales; they are usually pretty thorough and rarely miss things that I would catch.

                                                                                              Everybody arguing here loves C. You shouldn’t assume the standards committee is dumb or that anyone here wants C to be something it’s not. As much as you may think otherwise, I think C is good as it is and I don’t want it to be like other languages. I want C to be a maximally portable implementation language. We are all arguing in good faith and want the best for C, we just have different ideas about how that should happen.

                                                                                              1. 1

                                                                                                what do you do when the code aliases a struct that isn’t properly aligned? There are a lot of solutions but in the spirit of C, I think the right answer is that it is undefined.

                                                                                                Implementation dependent.

                                                                                                Couple that with the burden of a more complex standard.

                                                                                                The current standard on when an lvalue works is complex and murky. WG14 discussion on how it applies shows that it’s not even clear to them. The exception for char pointers was hurriedly added when they realized they had made memcpy impossible to implement. It seems as if malloc can’t be implemented in conforming C (there is no method of changing storage type to reallocate it).

                                                                                                C would benefit from more clarity on many issues. I am very sympathetic to making pointer validity more transparent and well-defined. I just think the current approach has failed, and the C89 error has not been fixed but made worse. Also, restrict has been fumbled away.

                                                                                            2. 1

                                                                                              The chance of overflowing the stack is remote, since stacks now automatically grow and structs tend to be < 512 bytes, but if that is a legitimate concern you can

                                                                                              … just copy the ints out one at a time :) https://godbolt.org/g/g8s1vQ

                                                                                              The compiler largely sees this as a (legal) version of the OP’s code, so there’s basically zero chance it won’t be optimised in exactly the same way.

                                                                                        2. 2

                                                                                          You don’t need a large buffer. You can memcpy the integers used for the calculation out one at a time, rather than memcpy’ing the entire struct at once.

                                                                                          Your designation of using memcpy as a “stupid hack” is pretty biased. The code you posted can legitimately go wrong – it invokes undefined behaviour, after all – and is more of a hack than using memcpy is. You’ve made it clear that you think the aliasing rules should be changed (or shouldn’t exist), but this “evidence” you’ve given has clearly been debunked.
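
                                                                                          For illustration, “memcpy the integers out one at a time” can look like this (the struct here is hypothetical, since the original example isn’t reproduced in this thread):

                                                                                          ```c
                                                                                          #include <stdint.h>
                                                                                          #include <string.h>

                                                                                          /* Hypothetical header struct; stands in for the struct in the
                                                                                           * original example, which isn't shown in this thread. */
                                                                                          struct header {
                                                                                              uint32_t words[4];
                                                                                          };

                                                                                          /* Sum the 32-bit words by memcpy'ing each one out individually:
                                                                                           * no large temporary buffer and no aliasing violation, since the
                                                                                           * object representation is read through unsigned char. */
                                                                                          uint32_t checksum(const struct header *h) {
                                                                                              const unsigned char *p = (const unsigned char *)h;
                                                                                              uint32_t sum = 0;
                                                                                              for (size_t i = 0; i < sizeof h->words / sizeof (uint32_t); i++) {
                                                                                                  uint32_t w;
                                                                                                  memcpy(&w, p + i * sizeof w, sizeof w);
                                                                                                  sum += w;
                                                                                              }
                                                                                              return sum;
                                                                                          }
                                                                                          ```

                                                                                          Compilers treat each small memcpy as a plain load, so this tends to vectorise just like the pointer-cast version.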

                                                                                          1. 0

                                                                                            Funny use of “debunked”. You are using circular logic. My point was that this aliasing method is clearly amenable to optimization and vectorization - as seen. Therefore the argument for strict alias in the standard seems even weaker than it might. Your point seems to be that the standard makes aliasing undefined so aliasing is bad. Ok. I like your hack around the hack. The question is: why should C programmers have to jump through hoops to avoid triggering dangerous “optimizations”? The answer: because it’s in the standard, is not an answer.

                                                                                            1. 3

                                                                                              Funny use of “debunked”. You are using circular logic. My point was that this aliasing method is clearly amenable to optimization and vectorization - as seen

                                                                                              You have shown a case where, if the strict aliasing rule did not exist, some code could [edit] still [/edit] be optimised and vectorised. That I agree with, though nobody claimed that the existence of the strict aliasing rule was necessary for all optimisation and vectorisation, so it’s not clear what you think this proves. Your title says that the optimisation is BECAUSE of aliasing, which is demonstrably false. Hence, debunked. Why is that “funny”? And how is your logic any less circular than mine?

                                                                                              The question is: why should C programmers have to jump through hoops to avoid triggering dangerous “optimizations”?

                                                                                              Characterising optimisations as “dangerous” already implies that the code was correct before the optimisation was applied and that the optimisation can somehow make it incorrect. The logic you are using relies on the code (such as what you’ve posted) being correct - which it isn’t, according to the rules of the language (which, yes, are written in a standard). But why is using memcpy “jumping through hoops” whereas casting a pointer to a different type of pointer and then de-referencing it not? The answer is, as far as I can see, because you like doing the latter but you don’t like doing the former.

                                                                                      2. 1

                                                                                        The end.

                                                                                        The internet has no end.

                                                                                1. 0

                                                                                  So far I’ve only found one solution that is actually robust. Which is to manually check that the value is not nil before actually using it.

                                                                                  This seems reasonable to me. If anything, I’d consider knowing how and when to use this kind of check part of basic language competency, as it is how Go was designed.

                                                                                  1. 9

                                                                                    We expect people to be competent enough to not crash their cars, but we still put seatbelts in.

                                                                                    That’s perhaps a bad analogy, because most people would say that there are scenarios where you being involved in a car crash wasn’t your fault. (My former driver’s ed teacher would disagree, but that’s another post.) However, the point remains that mistakes happen, and can remain undiscovered for a disturbingly long period of time. Putting it all down to competence is counter to what we’ve learned about what happens with software projects, whether we want it to happen or not.

                                                                                    1. 9

                                                                                      I wish more languages had patterns. Haskell example:

                                                                                      data Named = Named { name :: Text } deriving Show
                                                                                      
                                                                                      greeting :: Maybe Named -> Text
                                                                                      greeting (Just thing) = "Hello " <> name thing
                                                                                      greeting _ = ""
                                                                                      

                                                                                      You still have to implement each pattern, but it’s so much easier, especially since the compiler will warn you when you miss one.

                                                                                      1. 3

                                                                                        Swift does this well with Optionals

                                                                                        1. 5

                                                                                          You can even use an optional type in C++. It’s been a part of the Boost library for a while and was added to the language itself in C++17.

                                                                                          1. 4

                                                                                            You can do anything in C++ but most libraries and people don’t. The point is to make these features integral.

                                                                                            1. 1

                                                                                              It’s in the standard library now so I think it’s integral.

                                                                                              1. 4

                                                                                                If it’s returned as the exception rather than as the rule throughout the standard library, it doesn’t matter though. C++, both the stdlib and the wider ecosystem, relies primarily on error handling outside of the type system, as do many languages with even more integrated Maybe types

                                                                                          2. 2

                                                                                            Yep. Swift has nil, and by default no type can hold a nil. You have to annotate them with ? (or ! if you just don’t care, see below).

                                                                                            var x: Int = nil // error
                                                                                            var x: Int? = nil // ok
                                                                                            

                                                                                            It’s unwrapped with either if let or guard let

                                                                                            if let unwrapped_x = x {
                                                                                                print("x is \(unwrapped_x)")
                                                                                            } else {
                                                                                                print("x was nil")
                                                                                            }
                                                                                            
                                                                                            guard let unwrapped_x = x else {
                                                                                                print("x was nil")
                                                                                                return
                                                                                            }
                                                                                            

                                                                                            Guard expects that you leave the surrounding block if the check fails.

                                                                                            You can also force the unwraps with !.

                                                                                            let x_str = "3"
                                                                                            let x = Int(x_str)! // would crash at run-time if the conversion didn't succeed
                                                                                            

                                                                                            Then there’s implicit unwraps, which are pretty much like Java objects in the sense that if the object is nil when you try to use it, you get a run-time crash.

                                                                                            let x: Int! = nil
                                                                                            
                                                                                        2. 7

                                                                                          Hey, I’m the author of the post. And indeed that does work, which is why I’m doing it currently. However, as I try to explain further in the post, this has quite a few downsides. The main one is that it can easily be forgotten. The worst part is that if you did forget, you will likely find out only via a runtime panic. Which, if you have some bad luck, will occur in production. The point I try to make is that it would be nice for this to be a compile-time failure.

                                                                                          1. 1

                                                                                            Sure, and that point came across. I think you’d agree that language shortcomings - and certainly this one - are generally excused (by the language itself) by what I mentioned?

                                                                                        1. 3

                                                                                          This article is mostly fluff, but I did laugh out loud at the line about “hacker news and reddit commenters are smarter than average”.

                                                                                          1. 1

                                                                                            I’m pretty sure that they are. Not because of some virtue that belonging to those communities inherently confers, but because of pre-selection for people who are slightly more invested than average.

                                                                                            HN is sketchier though, while I was active there it seemed to have more business people than actual industry workers…

                                                                                            1. 1

                                                                                              I can see why you would think that, based only on the messages that are posted, without being able to see the ones that weren’t. But this is more like an iceberg: the smartest and most experienced take one look at these communities and nope away forever. (Yes, I already know what that says about me.) :)

                                                                                          1. 0

                                                                                            A list of beliefs about programming that I maintain are misconceptions.

                                                                                            1. 3

                                                                                              Small suggestion: use a darker, bigger font. There are likely guidelines somewhere but I don’t think you can fail with using #000 for text people are supposed to read for longer than a couple of seconds.

                                                                                              1. 3

                                                                                                Current web design seems allergic to any sort of contrast. Even hyper-minimalist web design calls for less contrast for reasons I can’t figure out. Admittedly, I’m a sucker for contrast; I find most programming colorschemes hugely distasteful for the lack of contrast.

                                                                                                1. 6

                                                                                                  I think a lot of people find the maximum contrast ratios their screens can produce physically unpleasant to look at when reading text.

                                                                                                  I believe that people with dyslexia in particular find reading easier with contrast ratios lower than #000-on-#fff. Research on this is a bit of a mixed bag but offhand I think a whole bunch of people report that contrast ratios around 10:1 are more comfortable for them to read.

                                                                                                  As well as personal preference, I think it’s also quite situational? IME, bright screens in dark rooms make black-on-white headache inducing but charcoal-on-silver or grey-on-black really nice to look at.

                                                                                                  WCAG AAA asks for a contrast ratio of 7:1 or higher in body text which does leave a nice amount of leeway for producing something that doesn’t look like looking into a laser pointer in the dark every time you hit the edge of a glyph. :)

                                                                                                  As for the people putting, like, #777-on-#999 on the web, I assume they’re just assholes or something, I dunno.

                                                                                                  Lobsters is #333-on-#fefefe which is a 12.5:1 contrast ratio and IMHO quite nice with these fairly narrow glyphs.

                                                                                                  (FWIW, I configure most of my software for contrast ratios around 8:1.)
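
                                                                                                  For anyone curious where those ratios come from: WCAG defines contrast as (L1 + 0.05) / (L2 + 0.05) over relative luminances. A quick sketch (function names are mine) that reproduces the 12.5:1 figure for greys:

                                                                                                  ```c
                                                                                                  #include <math.h>

                                                                                                  /* WCAG 2.0: linearize one sRGB channel given as 0-255. */
                                                                                                  static double srgb_linear(double c) {
                                                                                                      c /= 255.0;
                                                                                                      return c <= 0.03928 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4);
                                                                                                  }

                                                                                                  /* Relative luminance; for a grey (r == g == b) the weighted
                                                                                                   * sum 0.2126 R + 0.7152 G + 0.0722 B reduces to the channel
                                                                                                   * value itself. */
                                                                                                  static double grey_luminance(int grey) {
                                                                                                      double v = srgb_linear(grey);
                                                                                                      return 0.2126 * v + 0.7152 * v + 0.0722 * v;
                                                                                                  }

                                                                                                  /* Contrast ratio, lighter luminance over darker. */
                                                                                                  static double contrast_ratio(double a, double b) {
                                                                                                      double hi = a > b ? a : b, lo = a > b ? b : a;
                                                                                                      return (hi + 0.05) / (lo + 0.05);
                                                                                                  }
                                                                                                  ```

                                                                                                  contrast_ratio(grey_luminance(0xfe), grey_luminance(0x33)) comes out around 12.5, and #000-on-#fff gives exactly 21:1, the maximum.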

                                                                                                  1. 2

                                                                                                    Very informative, thank you!

                                                                                              2. 3

                                                                                                I think the byte-order argument doesn’t hold when you mentioned ntohs and htons which are exactly where byte-order needs to be accounted for…

                                                                                                1. 2

                                                                                                  If you read the byte stream as a byte stream and shift them into position, there’s no need to check endianness of your machine (just need to know endianness of the stream) - the shifts will always do the right thing. That’s the point he was trying to make there.
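
                                                                                                  A minimal sketch of that shift-based read (the function name is illustrative) – it returns the same value on little- and big-endian hosts, because the shifts operate on values rather than on memory layout:

                                                                                                  ```c
                                                                                                  #include <stdint.h>

                                                                                                  /* Read a 32-bit big-endian value from a byte stream. No host
                                                                                                   * endianness check needed: p[0] is the most significant byte
                                                                                                   * by definition of the stream format, and the shifts do the
                                                                                                   * rest. */
                                                                                                  uint32_t read_be32(const unsigned char *p) {
                                                                                                      return ((uint32_t)p[0] << 24) | ((uint32_t)p[1] << 16)
                                                                                                           | ((uint32_t)p[2] << 8)  |  (uint32_t)p[3];
                                                                                                  }
                                                                                                  ```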

                                                                                                  1. 2

                                                                                                    ntohs and htons do that exact thing, and you don’t need to check the endianness of your machine, so the comment about not understanding why they exist makes me feel like the author is not quite grokking it. Those functions/macros can be implemented to do the exact thing linked to in the blog post.

                                                                                              1. 2

                                                                                                It seems when they say “functional” they mean “purely functional”, or “Haskell”.

                                                                                                The functional style of programming, in contrast, represents programs as relationships between mathematical expressions which are based on dependencies

                                                                                                the leading functional language (Haskell)

                                                                                                Is Haskell the leading functional language? It seems like there are several ahead of it: Scheme, ML, Erlang, maybe Prolog, maybe Swift, depending on your definition of functional.

                                                                                                Lastly, I’m not sure it’s fair to compare C++, a language without a GC, with Haskell, which does have a GC. Certainly, if you were really interested in comparing functional vs. imperative languages, I’d want to see a larger variety in the mix.

                                                                                                1. 1

                                                                                                  Referential transparency is a pretty central concept in FP. Every modern language can let you have basic pattern-matching and first-class functions. But that’s not functional programming in and of itself.

                                                                                                  You certainly can program in a functional style in, say, Scala, but it ends up looking like really ugly Haskell code anyway. Other times you’re usually just using first-class functions as someone might drop in English terms when speaking their native non-English language.

                                                                                                  1. 1

                                                                                                    It’s pretty common to think of Haskell as the functional language, since it’s both pure and widely used. The fairness is questionable. Of the imperative languages, C++ is the go-to choice for programming in the large that has to be efficient. It’s flexible enough to accommodate a lot of different styles. The fact that Haskell has a GC is a language-design choice on their part. C++ offers several options. If the design slows it down, then that’s just what they chose. That Rust has some features of a functional language without a GC might make it a nice alternative, though. Might even be able to do both imperative and functional versions in it.

                                                                                                  1. 4

                                                                                                    It is very nice that people are trying to compare imperative and functional languages using the measurements that people are mostly interested in: lines of code and performance.

                                                                                                    I just skimmed through the text and found some downsides, namely:

                                                                                                    • use of an infinite list comprehension to replace a loop mutating three variables. Given that a list is a linked list, I tend to believe they’re comparing algorithms of different orders (O(1) vs. O(n)), although GHC possibly optimises it;
                                                                                                    • they do not provide the source code of the programs.

                                                                                                    Still, I’m glad to see an effort to compare the implementations from a “standard programmer”.

                                                                                                    1. 1

                                                                                                      O(n) random access isn’t a problem if you never access the structure randomly; recursion over a linked list is the same as looping.
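A minimal illustration of that point (my sketch, not from the paper): a tail-recursive traversal only ever looks at the head of the remaining list, so it does the same O(n) total work as a loop, and the list’s O(n) random access never comes into play:

```haskell
-- Tail-recursive sum over a linked list: one O(1) head/tail step
-- per element, exactly like advancing a loop index.
sumLoop :: Int -> [Int] -> Int
sumLoop acc []       = acc
sumLoop acc (x : xs) = sumLoop (acc + x) xs

main :: IO ()
main = print (sumLoop 0 [1 .. 100])  -- 5050
```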

                                                                                                      1. 1

                                                                                                        Yeah, I couldn’t tell how they implemented the Haskell algorithms, so it was hard to deduce whether the Haskell was slower due to using improper constructs or not. It can be really easy to make things slow in Haskell if you don’t understand the transformations behind things. It’s also hard to tell why it’s slow without a dump of the assembly.

                                                                                                        I’m not a fan of papers that don’t provide source; it doesn’t make any of the findings all that useful. It just looks like a filler paper if I can’t poke around.

                                                                                                      1. 4

                                                                                                        I thought this wasn’t supposed to be like HN

                                                                                                        (Okay needless snark on my side since it was flagged multiple times apparently, good)

                                                                                                          1. -2

                                                                                                            That domain name is the worst thing ever, so many hyphens

                                                                                                            1. 9

                                                                                                              No, that’s this one.

                                                                                                          1. 7

                                                                                                            I always laugh when people come up with convoluted defenses for C and the effort that goes into that (even writing papers). Their attachment to this language has caused billions if not trillions worth of damages to society.

                                                                                                            All of the defenses that I’ve seen, including this one, boil down to nonsense. Like others, the author calls for “improved C implementations”. Well, we have those already, and they’re called Rust and Swift; and for the things C is not needed for (i.e., outside systems programming), yes, even JavaScript is better than C.

                                                                                                            1. 31

                                                                                                              Their attachment to this language has caused billions if not trillions worth of damages to society.

                                                                                                              Their attachment to a language with known but manageable defects has created trillions if not more in value for society. Don’t be absurd.

                                                                                                              1. 4

                                                                                                                [citation needed] on the defects of memory unsafety being manageable. To a first approximation every large C/C++ codebase overfloweth with exploitable vulnerabilities, even after decades of attempting to resolve them (Windows, Linux, Firefox, Chrome, Edge, to take a few examples.)

                                                                                                                1. 2

                                                                                                                  Compared to what? Name the widely used, large codebase, in whatever language, for whatever application, that accepts and parses external data and yet has no exploitable vulnerabilities. BTW: http://cr.yp.to/qmail/guarantee.html

                                                                                                                  1. 6

                                                                                                                    Your counterexample is a smaller, low-feature mail server written by a math and coding genius. I could cite Dean Karnazes doing ultramarathons on how far people can run. That doesn’t change that almost all runners would drop before 50 miles, especially before 300. Likewise with C code: citing the best of the secure coders doesn’t change what most will do or have done. I took the author’s statement “to a first approximation every” to mean “almost all” but not “every one.” It’s still true.

                                                                                                                    Whereas, Ada and Rust code have done a lot better on memory-safety even when non-experts are using them. Might be something to that.

                                                                                                                    1. 2

                                                                                                                      I’m still asking for the non-C, widely used, large-scale system with significant parsing that has no errors.

                                                                                                                      1. 3

                                                                                                                        That’s cheating, saying “non-C” and “widely used.” Most of the no-error parsing systems I’ve seen use a formal grammar with autogeneration. They usually extract to OCaml. Some also generate C just to plug into the ecosystem, since it’s a C/C++-based ecosystem. It’s incidental in those cases: it could be any language, since the real programming is in the grammar and generator. An example of that is the parser in the Mongrel server, which was doing a solid job when I was following it. I’m not sure if they found vulnerabilities in it later.

                                                                                                                    2. 5

                                                                                                                      At the bottom of the page you linked:

                                                                                                                      I’ve mostly given up on the standard C library. Many of its facilities, particularly stdio, seem designed to encourage bugs.

                                                                                                                      Not great support for your claim.

                                                                                                                      1. 2

                                                                                                                        There was an integer overflow reported in qmail in 2005. Bernstein does not consider this a vulnerability.

                                                                                                                    3. 3

                                                                                                                      That’s not what I meant by attachment. Their interest in C certainly created much value.

                                                                                                                    4. 9

                                                                                                                      Their attachment to this language has caused billions if not trillions worth of damages to society.

                                                                                                                      Inflammatory much? I’m highly skeptical that the damages have reached trillions, especially when you consider what wouldn’t have been built without C.

                                                                                                                      1. 12

                                                                                                                        Tony Hoare, null’s creator, regrets its invention and says that just inserting the one idea has cost billions. He mentions it in talks. It’s interesting that language creators reckon the mistakes they’ve made in terms of billions of dollars in damages.

                                                                                                                        “I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn’t resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years.”

                                                                                                                        If the billion-dollar mistake was the null pointer, the C gets function is a multi-billion-dollar mistake that created the opportunity for malware and viruses to thrive.

                                                                                                                        1. 2

                                                                                                                          He’s deluded. You want a billion dollar mistake: try CSP/Occam plus Hoare Logic. Null is a necessary byproduct of implementing total functions that approximate partial ones. See, for example, McCarthy in 1958 defining a LISP search function with a null return on failure. http://www.softwarepreservation.org/projects/LISP/MIT/AIM-001.pdf
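For what it’s worth, the option-type camp would answer that a partial function like McCarthy’s search can be totalized without a null reference. A sketch in Haskell (my illustration, not from either side’s sources):

```haskell
import Data.List (find)

-- A search is partial: the element may be absent. Maybe encodes
-- the failure case in the return type, so callers are forced to
-- handle it, where a null reference lets them forget.
firstEven :: [Int] -> Maybe Int
firstEven = find even

main :: IO ()
main = do
  print (firstEven [1, 3, 4, 5])  -- Just 4
  print (firstEven [1, 3, 5])     -- Nothing
```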

                                                                                                                          1. 3

                                                                                                                            “ try CSP/Occam plus Hoare Logic”

                                                                                                                            I think you meant formal verification, which is arguable. They could’ve wasted a hundred million easily on the useless stuff. Two out of three are bad examples, though.

                                                                                                                            Spin has had a ton of industrial success, easily knocking out problems in protocols and hardware that are hard to find via other methods. With hardware, the defects could’ve caused recalls like the Pentium bug. Likewise, Hoare-style logic has been doing its job in Design-by-Contract, which knocks time off the debugging and maintenance phases, the most expensive ones. If anything, not using tech like this can add up to a billion-dollar mistake over time.

                                                                                                                            Occam looks like it was a large waste of money, especially in the Transputer.

                                                                                                                            1. 1

                                                                                                                              No. I meant what I wrote. I like spin.

                                                                                                                          2. 1

                                                                                                                            Note what he does not claim is that the net result of C’s continued existence is negative. Something can have massive defects and still be an improvement over the alternatives.

                                                                                                                          3. 7

                                                                                                                            “especially when you consider what wouldn’t have been built without C.”

                                                                                                                            I just countered that. The language didn’t have to be built the way it was or persist that way. We could be building new stuff in a C-compatible language with many benefits of HLLs like Smalltalk, LISP, Ada, or Rust, with the legacy C getting gradually rewritten over time. If that had started in the ’90s, we could have the equivalent of a LISP machine for C code, an OS, and a browser by now.

                                                                                                                            1. 1

                                                                                                                              It didn’t have to, but it was, and it was then used to create tremendous value. Although I concur with the numerous shortcomings of C, and it’s past time to move on, I also prefer the concrete over the hypothetical.

                                                                                                                              The world is a messy place, and what actually happens is more interesting (and more realistic, obviously) than what people think could have happened. There are plenty of examples of this inside and outside of engineering.

                                                                                                                              1. 3

                                                                                                                                The major problem I see with this “concrete” winners-take-all mindset is that it encourages whig history which can’t distinguish the merely victorious from the inevitable. In order to learn from the past, we need to understand what alternatives were present before we can hope to discern what may have caused some to succeed and others to fail.

                                                                                                                                1. 2

                                                                                                                                  Imagine if someone created Car2 which crashed 10% of the time that Car did, but Car just happened to win. Sure, Car created tremendous value. Do you really think people you’re arguing with think that most systems software, which is written in C, is not extremely valuable?

                                                                                                                                  It would be valuable even if C was twice as bad. Because no one is arguing about absolute value, that’s a silly thing to impute. This is about opportunity cost.

                                                                                                                                  Now we can debate whether this opportunity cost is an issue. Whether C is really comparatively bad. But that’s a different discussion, one where it doesn’t matter that C created value absolutely.

                                                                                                                            2. 8

                                                                                                                              C is still much more widely used than those safer alternatives; I don’t see how laughing off a fact is better than researching its causes.

                                                                                                                              1. 10

                                                                                                                                Billions of lines of COBOL run mission-critical services of the top 500 companies in America. Better to research the causes of this than laugh it off. Are you ready to give up C for COBOL on mainframes, or do you think the popularity of both was caused by historical events/contexts with inertia taking over? I’m in the latter camp.

                                                                                                                                1. 7

                                                                                                                                  Are you ready to give up C for COBOL on mainframes, or do you think the popularity of both was caused by historical events/contexts with inertia taking over? I’m in the latter camp.

                                                                                                                                  Researching the causes of something doesn’t imply taking a stance on it, if anything, taking a stance on something should hopefully imply you’ve researched it. Even with your comment I still don’t see how laughing off a fact is better than researching its causes.

                                                                                                                                  You might be interested in laughing about all the cobol still in use, or in research that looks into the causes of that. I’m in the latter camp.

                                                                                                                                  1. 5

                                                                                                                                    I think you might be confused at what I’m laughing at. If someone wrote up a paper about how we should continue to use COBOL for reasons X, Y, Z, I would laugh at that too.

                                                                                                                                    1. 3

                                                                                                                                      COBOL has some interesting features(!) that make it very “safe”. Referring to the 85 standard:

                                                                                                                                      X. No runtime stack, no stack overflow vulnerabilities
                                                                                                                                      Y. No dynamic memory allocation, impossible to consume heap
                                                                                                                                      Z. All memory statically allocated (see Y); no buffer overflows
                                                                                                                                      
                                                                                                                                      1. 3

                                                                                                                                        We should use COBOL with contracts for transactions on the blockchains. The reasons are:

                                                                                                                                        X. It’s already got compilers big businesses are willing to bet their future on.

                                                                                                                                        Y. It supports decimal math instead of floating point. No real-world to fake, computer-math conversions needed.

                                                                                                                                        Z. It’s been used in transaction-processing systems that have run for decades with no major downtime or financial losses disclosed to investors.

                                                                                                                                        λ. It can be mathematically verified by some people who understand the letter on the left.

                                                                                                                                        You can laugh. You’d still be missing out on a potentially $25+ million opportunity for IBM. Your call.
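Point Y is easy to demonstrate in any language (a quick sketch of mine, not COBOL): binary floating point cannot represent most decimal fractions exactly, while exact decimal or rational arithmetic, which COBOL’s fixed-point types give you, can:

```haskell
import Data.Ratio ((%))

main :: IO ()
main = do
  -- 0.1 and 0.2 have no exact binary representation, so the
  -- sum drifts from 0.3:
  print (0.1 + 0.2 == (0.3 :: Double))             -- False
  -- Exact rationals (what decimal arithmetic amounts to) don't:
  print (1 % 10 + 2 % 10 == (3 % 10 :: Rational))  -- True
```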

                                                                                                                                        1. 1

                                                                                                                                          Your call.

                                                                                                                                          I believe you just made it your call, Nick. $25+ million opportunity, according to you. What are you waiting for?

                                                                                                                                          1. 4

                                                                                                                                            You’re right! I’ll pitch IBM’s senior executives on it the first chance I get. I’ll even put on a $600 suit so they know I have more business acumen than most coin pitchers. I’ll use phrases like vertical integration of the coin stack. Haha.

                                                                                                                                      2. 4

                                                                                                                                        That makes sense. I did do the C research. I’ll be posting about that in a reply later tonight.

                                                                                                                                        1. 10

                                                                                                                                          I’ll be posting about that in a reply later tonight.

                                                                                                                                          Good god man, get a blog already.

                                                                                                                                          Like, seriously, do we need to pass a hat around or something? :P

                                                                                                                                          1. 5

                                                                                                                                            Haha. Someone actually built me a prototype a while back. Makes me feel guilty that I don’t have one, instead of the usual lazy or overloaded.

                                                                                                                                              1. 2

                                                                                                                                                That’s cool. Setting one up isn’t the hard part. The hard part is doing a presentable design, organizing the complex activities I do, moving my write-ups into it, adding metadata, and so on. I’m still not sure how much I should worry about the design. One’s site can be considered a marketing tool for people who might offer jobs and such. I’d go into more detail but you’d tell me “that might be a better fit for Barnacles.” :P

                                                                                                                                                1. 3

                                                                                                                                                  Skip the presentable design. Dan Luu’s blog does pretty well even though it’s not working hard to be easy on the eyes. The rest of that stuff you can add as you go - remember, perfect is the enemy of good.

                                                                                                                                                  1. 0

                                                                                                                                                    This.

                                                                                                                                                    Hell, Charles Bloom’s blog is basically an append-only textfile.

                                                                                                                                                  2. 1

                                                                                                                                                    ugh okay next Christmas I’ll add all the metadata, how does that sound

                                                                                                                                                    1. 1

                                                                                                                                                      Making me feel guilty again. Nah, I’ll build it myself likely on a VPS.

                                                                                                                                                      And damn, time has been flying. Doesn’t feel like several months have passed on my end.

                                                                                                                                            1. 1

                                                                                                                                              Looking forward to reading it :)

                                                                                                                                      3. 4

                                                                                                                                        Well, we have those already, and they’re called Rust, Swift, ….

                                                                                                                                        And D maybe too. D’s “better-c” is pretty interesting, in my mind.

                                                                                                                                        1. 3

                                                                                                                                          Last I checked, D’s “better-C” was a prototype.

                                                                                                                                        2. 5

                                                                                                                                          If you had actually made a serious effort at understanding the article, you might have come away with an understanding of what Rust, Swift, etc. are lacking to be a better C. By laughing at it, you learned nothing.

                                                                                                                                          1. 2

                                                                                                                                            the author calls for “improved C implementations”. Well, we have those already, and they’re called Rust, Swift

                                                                                                                                            Those (and Ada, and others) don’t translate to assembly well. And they’re harder to implement than, say, C90.

                                                                                                                                            1. 3

                                                                                                                                              Is there a reason why you believe that other languages don’t translate to assembly well?

                                                                                                                                              It’s true those other languages are harder to implement, but it seems to be a moot point to me when compilers for them already exist.

                                                                                                                                              1. 1

                                                                                                                                                Some users of C need an assembly-level understanding of what their code does. With most other languages that isn’t really achievable. It is also increasingly less possible with modern C compilers, and said users aren’t very happy about it (see various rants by Torvalds about braindamaged compilers etc.)

                                                                                                                                                1. 4

                                                                                                                                                  “Some users of C need an assembly-level understanding of what their code does.”

                                                                                                                                                  Which C doesn’t give them, due to compiler differences and the effects of optimization. Aside from spotting errors, it’s why folks in safety-critical are required to check the assembly against the code. The C language is certainly closer to assembly behavior, but it doesn’t by itself give assembly-level understanding.

                                                                                                                                            2. 2

                                                                                                                                              So true. Every time I use the internet, the solid engineering of the Java/Jscript components just blows me away.

                                                                                                                                              1. 1

                                                                                                                                                Everyone prefers the smell of their own … software stack. I can only judge by what I can use now based on the merits I can measure. I don’t write new services in C, but the best operating systems are still written in it.

                                                                                                                                                1. 5

                                                                                                                                                  “but the best operating systems are still written in it.”

                                                                                                                                                  That’s an incidental part of history, though. People who are writing, say, a new x86 OS with a language balancing safety, maintenance, performance, and so on might not choose C. At least three chose Rust, one Ada, one SPARK, several Java, several C#, one LISP, one Haskell, one Go, and many C++. Plenty of choices are being explored, including languages C coders might say aren’t good for OSes.

                                                                                                                                                  Additionally, many choosing C or C++ say it’s for existing tooling, tutorials, talent, or libraries. Those are also incidental to its history rather than advantages of its language design. They’re worthwhile reasons to choose a language for a project, but they shift the argument away from the language itself, implying the authors had better languages in mind that weren’t yet usable for that project.

                                                                                                                                                  1. 4

                                                                                                                                                    I think you misinterpreted what I meant. I don’t think the best operating systems are written in C because of C. I am just stating that the best current operating system I can run a website from is written in C; I’ll switch as soon as it is practical and beneficial to switch.

                                                                                                                                                    1. 2

                                                                                                                                                      Oh OK. My bad. That’s a reasonable position.

                                                                                                                                                      1. 3

                                                                                                                                                        I worded it poorly; I won’t edit it, though, for the sake of context.

                                                                                                                                              1. 7

                                                                                                                                                No, the Google manifesto is a resentful political document. Attempts to claim it is scientific are dishonest hackery. This is how the document starts:

                                                                                                                                                Google’s political bias has equated the freedom from offense with psychological safety, but shaming into silence is the antithesis of psychological safety. This silencing has created an ideological echo chamber where some ideas are too sacred to be honestly discussed. The lack of discussion fosters the most extreme and authoritarian elements of this ideology.

                                                                                                                                                People who can claim such nonsense to be science either don’t understand science or are being openly dishonest.

                                                                                                                                                1. 2

                                                                                                                                                  At least some scientists find that memo pretty scientifically accurate (as already posted somewhere in this thread):

                                                                                                                                                  The author of the Google essay on issues related to diversity gets nearly all of the science and its implications exactly right.

                                                                                                                                                  For what it’s worth, I think that almost all of the Google memo’s empirical claims are scientifically accurate.

                                                                                                                                                  As a woman who’s worked in academia and within STEM, I didn’t find the memo offensive or sexist in the least. I found it to be a well thought out document, asking for greater tolerance for differences in opinion, and treating people as individuals instead of based on group membership.

                                                                                                                                                  1. -1

                                                                                                                                                    What a collection of worthless hacks.

                                                                                                                                                    1. 3

                                                                                                                                                      Jordan Peterson, in his interview with the author of that memo, also thinks the facts in it are essentially correct.

                                                                                                                                                      Can you clarify why you call the scientists quoted in my previous message “worthless hacks”?

                                                                                                                                                      1. 3

                                                                                                                                                        Jordan Peterson, a guy who’s famous for Jungian psychology and for misunderstanding pretty much every philosophical position he touches (“postmodernist neo-Marxism”), is somehow relevant in a discussion of whether something is scientific?

                                                                                                                                                        The guy who literally uses Nazi conspiracy theories (cultural Marxism) is your go-to guy to show that something is not a right-wing rant?

                                                                                                                                                        It’s extremely funny how Americans (presumably, or anglos) discard Freud as unscientific without giving it a second thought, but Jung, a total unapologetic mystic, is suddenly OK because he has been shoehorned into alt-right ideology. I think this is what those damn postmodernists mean when they imply that science is political.

                                                                                                                                                        1. 2

                                                                                                                                                          The manifesto begins with

                                                                                                                                                          Google’s political bias has equated the freedom from offense with psychological safety

                                                                                                                                                          None of that is at all concerned with science. It is just a hackneyed wingnut political opinion. Your first “scientist” has made a career of explaining that stereotypes are right, and he wades right in by declaring the manifesto’s tedious political assertions scientifically valid. He’s a fake scientist attempting to promote a political ideology as if it were blessed by the science gods.