1. 2

    Off-topic but relevant to the link: WaPo’s new method of nudging folks with ad-blockers is pretty anti-user. It appears to push the root of the site onto the history, so a reload of the page loads the front page instead of the story. The way to stop it? Stop the page from loading as soon as the body text appears: the gatekeeper is JavaScript and among the last things to load.

    1. 4

      I’m using uBlock Origin and uMatrix and I don’t see anything.

      1. 1

        Pihole, here.

        1. 2

          I’ve always found the granularity of running my adblock/noscript in-browser to far exceed any convenience advantage enjoyed by a pihole. DNS-level blocking just isn’t enough, especially vs “modern” sites with tons of JS.

    1. 1

      It turns out there are no cross-platform data-directory libraries for C. I was looking for something to implement XDG support and the equivalents on Windows/macOS, but all I found was appdirs. Hence, c-appdir. I currently have a Linux implementation piggy-backing off of libxdg-basedir, but I’m working on getting access to macOS and Windows development environments to implement those back-ends.
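
      For context, the rule the Linux back-end implements is essentially the XDG Base Directory fallback; here’s a rough shell illustration of it (the “myapp” name is only a placeholder):

      # per-user data dir: $XDG_DATA_HOME if set, otherwise ~/.local/share
      $ echo "${XDG_DATA_HOME:-$HOME/.local/share}/myapp"
      /home/user/.local/share/myapp

      # config follows the same pattern with $XDG_CONFIG_HOME and ~/.config
      $ echo "${XDG_CONFIG_HOME:-$HOME/.config}/myapp"
      /home/user/.config/myapp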

      1. 5

        Stable (even if “preview”) rustfmt!

        1. 1

          Is this a rewrite of the old rustfmt?

          1. 6

            Nah, it’s the same one.

            What happened was that rustfmt was moved into the distribution, but this process made it nightly-only for a while. It’s back now.

        1. 9

          The official upload for consistency: https://www.youtube.com/watch?v=NSemlYagjIU

          1. 1

            And it has balanced audio too(!) Thanks.

          1. 8

            Is anyone aware of the kind of “slow and subtle” niches the author hopes for? Any slow and simple and modern design philosophies, techniques, tutorials, communities, etc?

            1. 16

              I’ve been doing simple web development on the side off and on. I’ve heard of the handmade network. I also like how the Go community tends to build things, from an operational standpoint. You have the big corporate things, like Docker, Kubernetes, and Mattermost, but then you have smaller things, like linx, Hugo, or the various things built on boltdb.

              I’d start by removing “modern” as a hard requirement and look at how older, stable tech works.

              Any community where web development has mostly been in one place over the last 7 years probably has a lot of the basics covered already, and isn’t likely to be piling on the complexity in droves. My experiments in doing websites in Erlang (as opposed to Elixir) suggest that is one place where things seem to move slower. I suspect Python web development is similar.

              Learn some old, mature tools, like make, bash, python, awk, graphviz or TCL/tk, and use them when you have a task that fits them.

              Build websites with as little JS as possible, just to see how far you can take forms and pages. Take time to evaluate how it works out over time, how the maintenance holds up, and so on.

              You might see what sorts of things the modern stacks are trying to compensate for, or realize that there is an easier way to solve a certain set of problems. Mining the past for ideas allows you to get a notion of the fundamental vs incidental complexity.

              1. 5

                I’d start by removing “modern” as a hard requirement and look at how older, stable tech works.

                “Modern” means having options anyway.

                Back in the Old Days, large computers were the only option. They were great for large institutions with large problems or, at least, large endowments to effectively allow young programmers to build games on them, but if you wanted a small computer for something like process control or home office work or similar, you didn’t have that option. Vacuum tubes, and then discrete transistors, only got so small.

                ICs allowed us to make small computers, but they also allowed us to make big computers more efficient, because some problems require big computers, large amounts of compute concentrated in one location to grind away at a task, or large amounts of RAM or storage to handle huge amounts of data, or both.

                Similarly with graphics: We’ve gone from having command lines as the only option to having command lines coexisting with GUIs, each allowing the other to do what it does.

                Do some older practices get replaced? Yes, particularly in hardware. Nobody but a hobbyist would build a computer out of vacuum tubes these days. Cardstock is no longer a valid input mechanism. CRTs are no longer RAM. But progress is more about having options.

                1. 8

                  To be fair, I’m not against using modern tech, just that if you want something simpler, sometimes older stuff is where it’s at. Not always, of course, but by comparing the old against the new, one can find where real progress has been made, and where something might just be the fad of the year.

                  React, for example, seems to be acknowledged as a real improvement in the state of SPA in JS. It is certainly unlike most things that came before it on the JS side of things, though it seems kinda similar to IMGUI, and some people have drawn similarities to old Win32 code.

                  1. 2

                    Thanks, I’ll check these out.

                  2. 3

                    I’ve heard of the handmade network.

                    What does “handmade” mean? I read their manifesto, but I have no idea what criteria would be used to define which software falls under their umbrella. Is metafont handmade? Linux? The front page of Google? What about wordpress the software (and not the innumerable wordpress blogs)? All of these were created by a large amount of human effort over years of development. What would disqualify one but not the others?

                    1. 2

                      The way I’ve understood it is as a loose analogue for “artisanal”: projects that can be created by a small team rather than a massive one. But I don’t speak for the group, so YMMV.

                1. 6

                  Employers think the elder employees have families and want more work-life balance, so they won’t work over-time without complaint like fresh graduates.

                  It’s true. I’m 34 and I do care more about getting a weekend hike in or working on my book than rendering additional hours, for free, unto capitalism. Even in terms of career investment, I’m generally more adept at investing in my own career than some duplicitous corporate manager who says he has my back but is really out for himself.

                  Older people know more. They’ve made their bad decisions already. (Some of us have made more than enough for two or three lifetimes.) All that stuff makes us better at everything… but harder to take advantage of.

                  The reason the pre-20th-century geniuses like Keats and Galois peaked so early is… they died. Before 1900, the age of 50 was fairly old and you were very lucky if you got to 60 with your health intact. (It happened; it wasn’t common.) We live in a different era and the intellectual peak seems to be quite late– at least 30, probably around 50– with the decline being extremely slow (if not nonexistent) in people who stay in good health.

                  1. 9

                    As a 24 year old, I hate the fact that enough people my age work long hours such that it is almost expected of me. Everyone is free to do what they want, but I can’t imagine not having enough hobbies so you willingly fill time doing work. Even with my strict 8 hour schedule I feel like I don’t have enough time to do what I want!

                    Also, almost every young programmer I’ve seen put long hours to “impress” bosses has failed. Software doesn’t work like that and most spend the extra time tabbing in and out of Reddit anyway.

                    1. 3

                      Everyone is free to do what they want, but I can’t imagine not having enough hobbies so you willingly fill time doing work.

                      :( as a 25-year old who doesn’t have enough hobbies and fills his time sometimes doing work, ouch.

                      1. 4

                        I relate to that (as a 26yo that was doing 12+ h/day). I read some books about work-life balance (Off Balance) and others (The Dream Manager, The Rhythm of Life) from Matthew Kelly, and then tried to apply them.

                        When I changed job recently, I decided to add a challenge and directly told my future manager that I won’t work more than the legal 8 hours per day. He was comprehensive and even if he overworks a lot, I don’t feel pressured to do the same.

                        At the beginning coming home earlier was a pain, I wondered what to do, and spent hours reading HN/Lobsters/The Guardian, watching YouTube/Netflix etc…

                        Then I started to cook a bit more complex stuff and challenged myself to impress my girlfriend with it, I started to contribute on Github (very small things but I cleared myself from computers at home so I do my PRs from an iPad now!), I’m reading more than ever, I now sleep much better, probably because I’m out of screens earlier, so I can wake up at 6 and go to the gym…

                        I really think that if you try to un-focus from work, you’ll find things to do. Play a musical instrument, help a local community, or if you still want to dev, dev for yourself or the community!

                        1. 1

                          He was comprehensive and even if he overworks a lot, I don’t feel pressured to do the same.

                          That’s an interesting choice of words.

                          1. 2

                            Oh sorry, got messed up in the translation! I meant understanding!

                  1. 4

                    I’m particularly partial to Simon Tatham’s version, which only gives you solvable puzzles. There is definitely something reassuring in that you always know a puzzle is solvable, so it’s just you against it. (There is a desktop version on the main page for those so inclined).

                    1. 16

                      If folks actually read this story, they’d see that Firefox is working pretty hard to make this a non-invasive, non-privacy-compromising feature change, and that they’re also opening themselves up for public comment.

                      Consider voicing your objections rather than simply jumping ship. Having a viable open source option is important for the web ecosystem IMO.

                      1. 15

                        If folks actually read this story, they’d see that Firefox is working pretty hard to make this a non-invasive, non-privacy-compromising feature change, and that they’re also opening themselves up for public comment.

                        i just want a freaking browser engine with the possibility of enhancements via extensions. i don’t want to turn off magick features. i just want a browser which displays websites. the new firefox engine is really great, but i fear that they’ll now slowly fill firefox with crappy features until it’s slow again.

                        1. 3

                          What happens on the “New Tab” page has zero effect on page load times. If you don’t like what the New Tab page looks like, customize it. Some of your options are:

                          • set it to blank
                          • install an extension that takes over
                          • customize the current experience

                          For the last option, click the little gear icon in the top right and you should see this https://imgur.com/a/1p47a where you can turn off any section that you don’t like.

                          1. 7

                            yes, i know. i still don’t want these features shipped and active by default. if i want pocket, i could install it as extension. heck, i wouldn’t mind if they said “hey, we have this great extension named pocket, care to try it out?” on their default new page, with a link to install it. but not shipped by default.

                            1. 4

                              What happens on the “New Tab” page has zero effect on page load times.

                              I don’t care so much about page load times; sites which care are already fast (e.g. no JS BS), whilst those which don’t will soon bloat up to offset any increase in browser performance.

                              My main complaints with Pocket/Hello/Looking Glass/pdf.js/etc. are code bloat, install size, added complexity, attack surface, etc.

                              1. 1

                                You can’t do that on mobile.

                          1. 3

                            Typewriters were optimized for, well, typing.

                            Within significant technological constraints of metallurgy, plastics, and mechanics that now offer vastly different tradeoffs. Even the language is different: when’s the last time you saw a semicolon outside of code?

                            1. 17

                              One of my goals in life is to be able to properly use a semicolon in the normal course of my writing; it’s not as hard as you would think.

                              1. 4

                                I often use them in SMS; I see it as a kind of challenge.

                                1. 1

                                  A colon or plain period would be more appropriate than a semicolon here.

                              2. 3

                                I use them all the time; much better than comma splices.

                                1. 3

                                  Ironically this would not have been a comma splice, and a comma would be more appropriate.

                                  1. 1

                                    What are your criteria for a comma over a semicolon? It looks good to me.

                                    1. 8

                                      The clauses aren’t independent. Actually that form is specifically used for dependent clauses. For example, “I use them all the time, more than before.” The base “I use them” applies to both parts: “I use them all the time” and “I use them more than before.” But “I use them much better than comma splices” doesn’t make any sense, so that’s not what’s happening here. Forty-Bot is omitting “they are”—typically handled with an emdash, or parentheses if the additional content has only minor significance.

                                      For a semicolon to apply, “they are” must be included to create a second independent clause:

                                      I use them all the time; they’re much better than comma splices.

                                      Using a comma instead of an emdash is mildly incorrect, but widely accepted in conversational writing. Since Forty-Bot explicitly called out the comma, I only pointed out the comma would be more appropriate. Though an emdash would be most appropriate. Semicolons see little use because conversational writing favors such omissions.

                                      1. 2

                                        Thank you for the reply; this is very informative.

                                        1. 1

                                          Semicolons are also useful for: separating list elements, when they contain commas; showing off, often in language discussions :)

                                1. 4

                                  Ever typed anything like this?

                                  $ grp somestring somefile
                                  -bash: grp: command not found
                                  

                                  Sigh. Hit ‘up’, ‘left’ until at the ‘p’ and type ‘e’ and return.

                                  Yeah, but I find using “up” “Ctrl-a” “Ctrl-d” “grep” easier, especially as an emacs user.

                                  Generally speaking, I would say that’s the biggest “hidden” feature of bash: emacs bindings by default. And that’s not only limited to movement commands like C-a, C-e, C-p, M-b, etc. You can kill lines or words with C-k or M-d, and yank them back in when needed with C-y. There’s even an “infinite kill-ring” (by far the coolest name for an editor feature), letting you replace the last yanked section with the next item in the kill ring. Of course, not everything is implemented, so there’s no hidden mail client or M-x butterfly, but if you already are used to the default emacs editor bindings, you get used to this quickly. And afaik, all tools that use GNU readline can do this. I just tested it out with Python, and it seemed to work.

                                  I also recall reading something about vi-bindings in the bash manpage, but I can’t testify on how useful, harmful, annoying, or useless they are.
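
                                  If you want to poke at this yourself, readline will happily tell you what’s bound; a quick sketch in bash (output trimmed):

                                  # emacs mode is the default; vi mode is one command away
                                  $ set -o emacs     # or: set -o vi
                                  # list a few of the bindings mentioned above
                                  $ bind -p | grep -E 'kill-line|yank-pop|backward-word'
                                  "\C-k": kill-line
                                  "\ey": yank-pop
                                  "\eb": backward-word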

                                  1. 6

                                    Emacs bindings by default is also one of the biggest hidden features of MacOS: the bindings work in all GUI text fields.

                                    1. 1

                                      Wow, I learned something new today. Prompted by your comment, I found this table comparing emacs bindings, OSX’s version of emacs bindings, and the more traditional Mac-style bindings for various operations.

                                      Looks like emacs’s M- bindings are mapped to ctrl-opt- on MacOS, which isn’t super convenient (e.g. I don’t see myself getting in the habit of using ctrl-opt-f over opt-rightarrow to move forward a word), but most of the C- bindings are convenient enough.

                                      1. 1

                                        I just discovered this a few days ago by accident, because I have it set in GTK so I can use the bindings in my browser. I had to use a colleague’s browser (he’s on macOS), used the bindings without thinking, and only later realised ‘hey, wait a minute, why did that work?’.

                                        1. 1

                                          This is a major reason why I stay on OS X. I’m pretty sure I could reconfigure some Linux to get most of this, but probably not all of the niceness of text fields.

                                          Would love to be proven wrong though

                                          1. 3

                                            I haven’t used GNOME for a while now, but I remember there being an option somewhere to use emacs keybindings. And it seems that all you need nowadays is to install the GNOME Tweak Tool to activate it.

                                            (Alternatively, just use Emacs as your OS, heard they’ve got good support for emacs keybindings)
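
                                            If memory serves, the tweak tool is just flipping a gsettings key under the hood, so something like this should also work from a terminal (quoted from memory, untested):

                                            $ gsettings set org.gnome.desktop.interface gtk-key-theme "Emacs"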

                                            1. 2

                                              Just FYI: That page is outdated, being written for 2.x era Gnome. Now the Emacs Input toggle is under the Keyboard & Mouse section.

                                        2. 3

                                          Yeah, won’t disagree. Occasionally, I find myself reaching for the caret because it comes to mind first.

                                          I’m an avid vi user, but the vi bindings on the command line never take for me. I always go back to the emacs ones.

                                          1. 3

                                          I use vi bindings and love them! I also never use ^ because I prefer interactive editing.

                                            It’s really nice that they work in Python and R as well as bash (because Python and R both use readline).

                                            In fact I think a large part of the reason that my OCaml usage trailed off is that the REPL utop doesn’t support readline. It only has emacs bindings!

                                            For those who don’t know, here is the beginning of my .inputrc:

                                            $ cat ~/.inputrc 
                                            set editing-mode vi
                                            
                                            set bell-style visible    # no beep
                                            
                                            1. 2

                                              Deleting words with C-w is also very helpful ime.

                                              1. 1

                                              I use fc for that. It opens your $EDITOR with the last command in a file; the edited command will be run.
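
                                              A rough sketch of both forms, reusing the grp example from up-thread:

                                              $ grp somestring somefile
                                              -bash: grp: command not found
                                              $ fc               # opens the last command in $EDITOR; it runs when you save and quit
                                              $ fc -s grp=grep   # or do a quick substitution without opening the editor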

                                              1. 2

                                                Does it fake accesses to the vdso?

                                                1. 0

                                                  You don’t want to manually write makefiles. Use Autotools instead: https://autotools.io/index.html

                                                  1. 14

                                                  Why not? It’s completely fine to write “simple” makefiles for “simple” projects. I think the musl makefile is a good example of a makefile that isn’t trivial but is still simple.

                                                  To me, autotools-generated makefiles are hell to debug, just slightly less hellish than debugging cmake.
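
                                                  For a small C project, “simple” really can mean something on this order (a generic sketch, not any particular project’s build; recipe lines are tab-indented):

                                                  $ cat Makefile
                                                  CC     = cc
                                                  CFLAGS = -O2 -Wall
                                                  OBJS   = main.o util.o

                                                  myprog: $(OBJS)
                                                  	$(CC) $(CFLAGS) -o $@ $(OBJS)

                                                  %.o: %.c
                                                  	$(CC) $(CFLAGS) -c -o $@ $<

                                                  .PHONY: clean
                                                  clean:
                                                  	rm -f myprog $(OBJS)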

                                                    1. 5

                                                      The musl makefile is one of the cleanest production makefiles I’ve ever seen. But I note even its opening comment says, “This is how simple every makefile should be… No, I take that back - actually most should be less than half this size.”

                                                      I count, at least, 3 languages used:

                                                      1. GNU dialect of make
                                                      2. shell
                                                      3. sed

                                                      And hacks like this:

                                                      obj/musl-gcc: config.mak
                                                      	printf '#!/bin/sh\nexec "$${REALGCC:-$(WRAPCC_GCC)}" "$$@" -specs "%s/musl-gcc.specs"\n' "$(libdir)" > $@
                                                      	chmod +x $@
                                                      
                                                      obj/%-clang: $(srcdir)/tools/%-clang.in config.mak
                                                      	sed -e 's!@CC@!$(WRAPCC_CLANG)!g' -e 's!@PREFIX@!$(prefix)!g' -e 's!@INCDIR@!$(includedir)!g' -e 's!@LIBDIR@!$(libdir)!g' -e 's!@LDSO@!$(LDSO_PATHNAME)!g' $< > $@
                                                      	chmod +x $@
                                                      

                                                      Local legend @andyc of Oil Shell fame pushes the idea that Shell, Awk, and Make Should Be Combined. IMHO, the musl example is persuasive empirical evidence for his position.

                                                      (I’m hoping @stefantalpalaru is being sarcastic about Autotools and we’re all falling to Poe’s Law.)

                                                      1. 2

                                                        (I’m hoping @stefantalpalaru is being sarcastic about Autotools and we’re all falling to Poe’s Law.)

                                                        No, I’m not. I’ve worked with hand written Makefiles, Autotools and CMake on complex projects and I honestly think that Autotools is the lesser of all evils.

                                                        Local legend @andyc of Oil Shell fame

                                                        Now I hope you’re the one being sarcastic. Who exactly uses the Oil Shell?

                                                        1. 3

                                                          Now I hope you’re the one being sarcastic.

                                                          I was not being sarcastic. @andyc’s Oil Shell posts consistently do well in voting here.

                                                          Who exactly uses the Oil Shell?

                                                          Who uses a less than a year old shell that explicitly isn’t for public use yet? I’m hoping very few people.

                                                          What does the number of Oil Shell users have to do with his argument?

                                                    2. 7

                                                      Err. Last time I touched autotools it was a crawling horror.

                                                      Makefiles are fine. Just don’t write ones that call other makefiles (recursive make considered harmful and all that).

                                                      1. 3

                                                        Just don’t write ones that call other makefiles (recursive make considered harmful and all that).

                                                        Clearly someone needs to write “Make: The Good Parts” 😂

                                                        1. 2

                                                          Isn’t non-recursive make also considered harmful?

                                                          1. 2

                                                            I think the ideal is a makefile that includes parts from all over your source tree, so that there’s one virtual Makefile (no recursive make invocation O(n^2) issues) but changes can be made locally. Best of both worlds!
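
                                                          Something like this is what I have in mind: each sub-makefile only appends to shared variables, and a single make invocation sees the whole dependency graph (paths and names made up for illustration):

                                                          $ cat Makefile
                                                          SRCS :=
                                                          include src/module.mk
                                                          include lib/module.mk

                                                          OBJS := $(SRCS:.c=.o)

                                                          app: $(OBJS)
                                                          	$(CC) -o $@ $(OBJS)

                                                          $ cat src/module.mk
                                                          SRCS += src/main.c src/parser.c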

                                                        2. 5

                                                          I’m no expert on Make or Autotools, but my response to building automation on top of an automation tool is: “Please, god, no!”

                                                        If your automation tool lacks the expressiveness to automate the problem you’re solving, the solution should never be to bolt another automation tool on top of it. It’s time to start over.

                                                          1. 2

                                                            I’m going to take a guess that you’re not a DevOps engineer.

                                                            1. 1

                                                              “Let’s just use another layer of automation!” is clearly the DevOps version of solving every problem with another layer of indirection! :)

                                                            2. 1

                                                              But why not? One tool (cmake, meson) checks dependencies and works on high-level concepts such as “binary” and “library”. The other tool (make, ninja) is low-level and operates on building individual files. It’s a sane separation of concerns.

                                                              Make, however, tries to be high-level (at least GNU make, which even has built-in Scheme), but it isn’t high-level enough, and it might be better at the low level (that’s why ninja was invented).

                                                              Monolithic build tools like Bazel might handle invalidation better, but this class of tools is not explored enough (existing tools are designed for specific Google/Facebook use cases).

                                                            3. [Comment removed by author]

                                                              1. 3

                                                                Autotools is terrible.

                                                                Yes, but all the alternatives are worse.

                                                              2. 1

                                                                Disagree

                                                                https://varnish-cache.org/docs/2.1/phk/autocrap.html

                                                                And look how complicated this configure script is which doesn’t take 60 seconds to run:

                                                                https://github.com/bsdphk/Ntimed/blob/master/configure

                                                                1. 2

                                                                  This is a really cool project. However, even though he is using old hardware for an old process, it still seems absurdly outside of my budget (as do almost all hardware projects). I suppose that’s why I keep to software: “free” is a hard price-point to beat.

                                                                  1. 1

                                                                    Cool textbook, helped fill in a few gaps I had after taking statistics. I wish it didn’t so heavily rely on the TI-8x line of calculators, however. Although it’s practical, it reinforces dependence on one vendor and (I think) diminishes understanding of the underlying maths a bit.

                                                                    1. 21

                                                                Compiling Firefox on an 8-core Ryzen targeting the host arch takes between 10 and 15 minutes.

                                                                      1. 10

                                                                  Wow, that is fast; it takes ~2h on a build server that needs 7 hours for chromium. All the recent rust stuff really slowed it down.

                                                                        1. 6

                                                                          All the recent rust stuff really slowed it down.

                                                                          Oof, yeah, I bet. Rust is notoriously slow to compile. Luckily, incremental compilation is in the nightly compiler right now. I’ve been using it wherever I can and it really does make a difference. But I suppose it wouldn’t do much for an initial compilation of a project. :(
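
                                                                    For anyone who wants to try it: on a nightly toolchain it’s currently opt-in via an environment variable, roughly:

                                                                    $ rustup default nightly          # incremental compilation is nightly-only for now
                                                                    $ CARGO_INCREMENTAL=1 cargo build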

                                                                          1. 4

                                                                      In this case a large chunk of this is just bindgen; we need to generate bindings, so we throw a libclang-based bindings generator at all of the header files. Twice (complicated reasons for this).

                                                                      It’s also pretty single-threaded (codegen units will help with this, but currently don’t, and I have to investigate why).

                                                                            Incremental compilation and working codegen units and cleaning up the bindgen situation will help a lot. Going to take a while, though.

                                                                            1. 3

                                                                              But I suppose it wouldn’t do much for an initial compilation of a project.

                                                                              Also not going to help packagers who use only temporary compilation environments which are discarded after a package is built.

                                                                              1. 6

                                                                                Package managers (and regular builds also) should not be starting from scratch every time. Even if we insist on doing all source editing in ASCII, we need to be delivering modules as fully-typed, parsed ASTs.

                                                                                This insistence on going back to plain source code every chance we get and starting over is easily wasting more energy than the bitcoin bubble.

                                                                                1. 9

                                                                                  Package managers (and regular builds also) should not be starting from scratch every time.

                                                                                  They should if you want reproducible builds.

                                                                                  1. 3

                                                                            These are completely orthogonal. There’s nothing stopping reproducible builds where you run the entire pipeline, if you insist on comparing the hash of the source and the hash of the output. And you would still get the same benefit by comparing source<->AST and AST<->binary.

                                                                                    1. 1

                                                                                      Yeah, ideally a compiler would take a difference in source to a difference in output. Compiler differentiation?

                                                                                    2. 2

                                                                            I believe you can do a decent amount of caching in this space? Like MSFT and co. have compile servers that will store incrementally compiled stuff for a lot of projects, so you’re only compiling changes.

                                                                                2. 7

                                                                          My non-beefy Lenovo X-series laptop takes ~45 minutes for a complete recompile, ~12 min for average changes w/ ccache etc., and ~3 min for JS-only changes (which you can do as artifact builds, so they’re always ~3 min unless you need to build C++/Rust components).

                                                                                1. 2

                                                                          Personally, I really like having a staging area by default. I like my commits to have higher granularity (i.e. do one thing in a commit). When editing code, I will often make several commits’ worth of modifications before finalizing what I want to change. I also usually have some modifications (i.e. debug printfs or minor build file modifications) which I don’t want to commit, but I may need to commit other parts of those files. Having a staging area makes it easier to pick and choose what parts of what files go into each commit. It could be argued that what I really want is just git add -p, and that the staging area is irrelevant to that. However, I like the interactivity that a two-phase commit process affords (and the ability to easily undo changes). In the article, Szorc argues that this shouldn’t be a default feature, but this seems to be a very common use case (especially once multiple people start working on a project and you want a clean commit history).
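
                                                                          A typical session for me looks roughly like this; the index is what lets the debug noise stay behind (the commit message is just a placeholder):

                                                                          $ git add -p            # pick only the hunks that belong in this commit
                                                                          $ git diff --cached     # review what's actually staged
                                                                          $ git commit -m "one logical change"
                                                                          $ git diff              # the debug printfs are still here, unstaged and uncommitted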

                                                                                  1. 2

                                                                            Initially, I too thought that this was the argument that he was making, but if you think it through, and understand the nature of what happens when you git add -p, it’s exactly the same as what happens when you commit, without the commit message. Git creates a copy of the file in the objects folder, takes the hash, and names it appropriately based on the hash. The commit is just putting a label on the stuff that you put into the index (/staging area). Conceptually, there’s no difference between writing a label on an envelope and then putting one or more things in it vs putting one or more things in it and then writing a label on it. I’m a decently experienced git user and can see how the simplification of removing the staging area would make understanding git easier.

                                                                                    One of the places that it might make things harder is in resets - i.e. I’ve committed and I want to reset --soft HEAD~1 to redo the commit. It would be difficult to keep the existing changes in my workdir and the reset files separate. Perhaps there’s an obvious way around this though?

                                                                                  1. 2

                                                                                    Currently reading Faust by Goethe. It’s very interesting: he employs (or rather, the translator employs) a wide vocabulary, and makes a lot of references esp. to antiquity. Although researching everything makes for slower going, it is well-written and better than I’d expected.

                                                                                    1. 2

                                                                                      Humor on the mailing list. There’s also some interesting follow-up discussion on git:// vs https://.

                                                                                      1. 4

                                                                                        This looks like a nice intro to lockless concurrency - that being said, I think that the main thing for programmers to know about lockless concurrency is to fear it :P

                                                                                        1. 3

                                                                                          I think that the main thing for programmers to know about lockless concurrency is to fear it :P

                                                                                      The problem with rolling your own concurrency is not the complexity of the individual components, but the exponential nature of their interactions. Unfortunately, at the systems level there is often not much one can do to avoid it. I like the approach Rust has taken here by encoding concurrency guarantees in the type system, but there isn’t much help for C programmers.

                                                                                          1. 2

                                                                                            In the past, the limitation made people adopt a CSP-like style for programming where the global interactions were explicit. Then, they’d check a model of that with something like SPIN. These days, there’s more effort on static analysis among those who haven’t given up on C concurrency entirely. ;) Found another one just now:

                                                                                            https://lobste.rs/s/hiwfqh/locksmith_practical_static_race