1. 0

    So far I’ve found only one solution that is actually robust: manually checking that the value is not nil before actually using it.

    This seems reasonable to me. If anything, I’d consider knowing how and when to use this kind of check part of basic language competency, since it is how Go was designed.
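
    For anyone skimming along, this is the kind of manual guard I mean. A minimal sketch, with a made-up Config type:

    package main

    import (
        "errors"
        "fmt"
    )

    // Config is a stand-in type for illustration.
    type Config struct{ Name string }

    // greet dereferences c only after an explicit nil check; nothing in the
    // type system forces the caller (or the author) to remember this.
    func greet(c *Config) (string, error) {
        if c == nil {
            return "", errors.New("nil Config")
        }
        return "Hello " + c.Name, nil
    }

    func main() {
        if _, err := greet(nil); err != nil {
            fmt.Println("caught:", err)
        }
    }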

    1. 9

      We expect people to be competent enough to not crash their cars, but we still put seatbelts in.

      That’s perhaps a bad analogy, because most people would say that there are scenarios where you being involved in a car crash wasn’t your fault. (My former driver’s ed teacher would disagree, but that’s another post.) However, the point remains that mistakes happen, and can remain undiscovered for a disturbingly long period of time. Putting it all down to competence is counter to what we’ve learned about what happens with software projects, whether we want it to happen or not.

      1. 9

        I wish more languages had patterns. Haskell example:

        {-# LANGUAGE OverloadedStrings #-}
        import Data.Text (Text)

        data Named = Named { name :: Text } deriving Show

        greeting :: Maybe Named -> Text
        greeting (Just thing) = "Hello " <> name thing
        greeting _ = ""
        

        You still have to implement each pattern, but it’s so much easier, especially since the compiler will warn you when you miss one.

        1. 3

          Swift does this well with Optionals

          1. 5

            You can even use an optional type in C++. It’s been a part of the Boost library for a while and was added to the standard library in C++17.

            1. 4

              You can do anything in C++ but most libraries and people don’t. The point is to make these features integral.

              1. 1

                It’s in the standard library now so I think it’s integral.

                1. 4

                  If it isn’t returned as the rule, rather than the exception, throughout the standard library, it doesn’t matter though. C++, both the stdlib and the wider ecosystem, relies primarily on error handling outside of the type system, as do many languages with even more integrated Maybe types.

            2. 2

              Yep. Swift has nil, and by default no type can hold a nil. You have to annotate them with ? (or ! if you just don’t care, see below).

              var x: Int = nil // error
              var x: Int? = nil // ok
              

              It’s unwrapped with either if let or guard let

              if let unwrapped_x = x {
                  print("x is \(unwrapped_x)")
              } else {
                  print("x was nil")
              }
              
              guard let unwrapped_x = x else {
                  print("x was nil")
                  return
              }
              

              Guard expects that you leave the surrounding block if the check fails.

              You can also force the unwraps with !.

              let x_str = "3"
              let x = Int(x_str)! // crashes at run time if the conversion fails
              

              Then there are implicitly unwrapped optionals, which are pretty much like Java object references in the sense that if the value is nil when you try to use it, you get a run-time crash.

              let x: Int! = nil  // declaring it is fine
              print(x + 1)       // run-time crash: x is force-unwrapped here and it is nil
              
          2. 7

            Hey, I’m the author of the post. And indeed that does work, which is why I’m doing that currently. However, as I try to explain further in the post, this approach has some real downsides. The main one is that it can easily be forgotten, and the worst part is that if you do forget, you will likely find out only through a runtime panic, which, with a bit of bad luck, will occur in production. The point I’m trying to make is that it would be nice for this to be a compile-time failure.
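
            To make the failure mode concrete, here is a tiny sketch with a made-up Config type; the version without the check compiles just as happily as the one with it, and only blows up once a nil actually arrives at run time:

            package main

            import "fmt"

            type Config struct{ Name string }

            // greet forgets the nil check; the compiler has no objection.
            func greet(c *Config) string {
                return "Hello " + c.Name
            }

            func main() {
                // panics at run time: invalid memory address or nil pointer dereference
                fmt.Println(greet(nil))
            }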

            1. 1

              Sure, and that point came across. I think you’d agree that language shortcomings - and certainly this one - are generally excused (by the language itself) by what I mentioned?

          1. 28

            It’s what I’d do even if I weren’t getting paid. It’s my primary hobby, my profession, and what I’ve wanted to do since I can remember.

            1. 6

              This is exactly my feeling. I love solving problems using software I write, I love knowing people are using what I make, and I love reducing the amount of boring and error-prone repetition in my life using my own software.

              I think people who don’t enjoy coding belong in the industry just as much as anyone else. At that level, it’s a job, and we don’t expect people in any other field to love it, necessarily. But it’s a very odd, specialized job which is definitely not for everyone, and anyone smart enough to be a programmer is probably smart enough to make a living in some other field which doesn’t involve the obscure pains programming can expose you to.

              1. 4

                That’s awesome that your hobby and profession align 🙂

                Do you have any advice for people who currently don’t enjoy doing what they do for a profession and how to get there?

                1. 11

                  Well, I guess my first advice would be to ask, what do you love? There’s absolutely nothing wrong with coding to pay the bills and doing what you love with your free time. My other hobby is Western religious history, which definitely is not a moneymaking industry. :)

                  I’d also say, coding is a huge field, like “talking”. There are a million languages, a million techniques. You might find that you like the theoretical side more (math, complexity theory, category theory, etc). You might find out that you really like writing parsers. Maybe you hate Java but you end up loving Erlang. There’s a lot out there.

                  But, again, there’s nothing wrong with just having a day job. Life is meant to be enjoyed so don’t worry about what you “should” like, do what you do like however you can.

                  1. 1

                    I like to ice skate and tinker, mostly with keyboards and other input devices, but the sky’s the limit really.

                    Unfortunately neither pay well so here I am 🙂

                    1. 5

                      Has it always been the case, or did you enjoy programming in the beginning?

                      If you used to like it, but not anymore, it’s worth trying to investigate why. This is something that has also happened to me, and it took me a while to finally find an answer. At first, I thought this was a matter of abstract vs concrete activities and I tried to find concrete activities that I could find interesting. Though I did start cooking (and discovered I enjoy it), this still wasn’t the real issue. You said you like ice skating and tinkering, so maybe this is a good place to start thinking about it.

                      Later, I realized that what I really miss from the early days is having everything under control: relatively small code bases that I understand well, none or just a few third party libraries, etc. In contrast, most professional projects I have taken are the complete opposite: large code bases that no one really understands with lots of external libraries. I still don’t know what exactly I can do about this, but just from knowing where my discomfort comes from, I already feel a bit better about programming overall.

                      This has been my quest so far to get back to enjoying coding as much as I used to. You’ll probably find other topics that I haven’t mentioned, but that are important to you. This is an introspective exercise I think you should do if you also wish to get back to programming with a refreshed feeling.

                      1. 4

                        I used to like it. I had a really good job at the start of my career with an amazing mentor. However, since that job, I haven’t liked it much.

                        I’ll try to reflect on exactly why, but I have a suspicion that it’s when I realized that agile and startups are irrevocably intertwined with coding. Both I can’t stand.

                        This is an introspective exercise I think you should do if you also wish to get back to programming with a refreshed feeling.

                        That sounds amazing. Hopefully, I can get there. 🙂

                  2. 4

                    Not the parent commenter, but how about looking into other programming paradigms? e.g. if you only do OOP right now, try looking into functional programming, there’s a lot of interesting and beautiful stuff there that might appeal to you.

                    Or perhaps have a look at formal methods, either “lightweight” ones like TLA, or the more heavyweight ones like Coq and Lean. I’ve found writing specifications above the level of abstraction that most programming languages provide, and then model checking them or writing proofs for them, to be quite an interesting and enlightening experience, and it’s one of my favourite things to do.

                    Lastly, I think generally identifying pain points in your workflows and writing little programs to automate them and scratch your itch could be another way to find joy in programming. For me that’s writing little Haskell scripts or elisp (Emacs Lisp) tidbits here and there.

                    1. 2

                      Thanks for the reply! I know this is a “me problem” but when I have to learn new technologies I find it very frustrating. Recently I’ve been on a Scala project and FP hasn’t done anything for me.

                      It wasn’t always that way though. I used to get excited about new technologies, but after a while I just came to see it for the revolving door it is.

                      1. 2

                        […] but when I have to learn new technologies I find it very frustrating.

                        I can relate to that :)

                        Recently I’ve been on a Scala project and FP hasn’t done anything for me.

                        Interesting. I’d say in that specific case, the language and the machinery it offers probably influence that as well. Like, for instance I’d take writing Haskell over Scala any day. But oh well, that’s me.

                        It wasn’t always that way though. I used to get excited about new technologies, but after a while I just came to see it for the revolving door it is.

                        Right. Same here actually. I don’t go looking for shiny new things to use just because they’re shiny and new. But I would still gladly consider things that could help improve the quality and correctness of programs that I write, and see if they’d be worth the investment of time for learning/integrating them into my workflow. And that bar has certainly increased over the last couple years.

                1. 1

                  This is a rather… uh… factually-challenged ideological piece, and isn’t very interesting because of that.

                  1. 28

                    This “there are no full stack devs” meme is horseshit. I’ll accept that keeping up with web frontend (especially JS) is very challenging lately, and requires a substantial investment of time. But I have experience with every single skill on the “impossible” list, even having used most of them at the same company. We’re not unicorns, we just have more than five years in the field.

                    My take on it is that full-stack development is really only relevant on very small teams (1-5 devs), and that specialization happens from there, and that’s a good thing. Looking for full-stack devs on a team that’s bigger than this is usually the result of lazy resource planning. But when you don’t have a lot of hands on keyboards, full-stack or “T-shaped” developers are a great asset.

                    1. 10

                      I find it pretty funny to see this article on Lobsters, a site where I personally exercise every skill in the “impossible” list.

                      (OK, except for a front-end framework because we don’t need one, though I’ve worked with React.)

                      1. 8

                        Agreed. I’m really sick of this meme too. There are a LOT of apps/sites out there that are small, simple, and serve a limited audience very well. Not every app needs tons of developers. It’s really insulting to us full-stack people who take pride in keeping those apps running, to imply or outright say we must be bad at our jobs.

                        1. 4

                          It is rare for me to hear a dev say what they work on is ‘simple,’ even if it is. I’ve felt that a lot of programming in industry is somewhat simple with mountains of incidental complexity brought on by inexperience, poor practices, bad languages and paradigms, and unrelenting schedules. But devs seem unable to separate these things from one another, so it feels taboo to say something like that.

                          1. 3

                            At least half of what we all do is data shoveling. Simple doesn’t mean easy though, it just means uncomplicated. Digging a six foot hole is simple.

                        2. 4

                          “Keeping up with the frontend” is a bit rough, right?

                          Like, the products we work on don’t magically fall apart every time a new framework comes out–we do this to ourselves.

                          1. 2

                            I’ve done all those things as well, and web isn’t really my field. I really thought this would be something along the lines of what @technomancy said, in which case I would not yet qualify. And I fully agree with @hwayne’s comment, and simply consider myself a generalist.

                            1. 2

                              Agreed. There definitely are full stack developers, and while they need to have irons in a lot of dumpster fires to remain up to date on all of the fads, the collection of moving parts is fairly small really: some database knowledge, some SOA knowledge, and some presentation knowledge.

                              I think there’s some nuance missing in your “full-stack development is for small teams” idea: I agree with that part of it, but the part where growing the team means adding specialisation seems to imply that growing a team is natural and inevitable, so that small generalist teams evolve into large specialist teams. Either is a way to staff a software team, and teams have probably succeeded or failed using either approach; a well-performing small team of generalists will probably continue to deliver successfully without adding specialists, and a well-performing large team of specialists will probably continue to deliver successfully, too.

                              1. -3

                                This “there are no full stack devs” meme is horseshit.

                                Saying it’s horseshit is, itself, horseshit unless you have a syllabus which, when mastered, will make someone a full-stack developer.

                                Until then, it’s equivalent to being “Cool”:

                                What makes someone “Cool”? Being “Cool”.

                                OK… who’s “Cool”? Not you…

                                1. 4

                                  There’s no single syllabus; it all depends on the business requirements, which drive decisions about the software stack. If you can solve every problem encountered with that software stack, congratulations – to that business, you are a full-stack developer.

                              1. 2

                                Seems to be down? I get an error “Too many redirections”

                                1. 2

                                  “Due to the GDPR, you have to upgrade your product.”

                                  Ha! If you’ve gotta do something, you might as well turn a profit. Next step: Blame random outages on the GDPR, as I’m sure has already happened. This is such a gift for certain companies.

                                  1. 0

                                    Can the admins merge this story with that one?

                                    1. 8

                                      I’d rather not have the Medium link, thanks.

                                    1. 27

                                      What are the advantages to making it federated over the current setup?

                                      1. 7

                                        In terms of content and moderation, each instance would be kind of like a “view” over the aggregate data. If you want stricter moderation you could sign up for one instance over another. Each instance could also cater to a different crowd with different focuses, e.g. Linux vs. BSD vs. business-friendly non-technical vs. memes vs. …. Stories not fitting an instance could be blocked by the instance owner. Of course you could also get the catch-all instance where you see every type of story; it might feel like HN.

                                        The current Lobsters has a very specific focus and culture, and is also locked into a specific moderation style. Federating it would allow a system closer to Reddit and its subreddit system, where each instance has more autonomy, yet the content from the federated instances would all be aggregated.

                                        So of course such a system wouldn’t be a one-to-one replacement for Lobsters but a superset. Ideally an individual instance could be managed and moderated such that it would feel like the Lobsters of today.

                                        1. 18

                                          The current Lobsters has a very specific focus and culture, and is also locked into a specific moderation style. Federating it would allow a system closer to Reddit and its subreddit system, where each instance has more autonomy, yet the content from the federated instances would all be aggregated.

                                          If federation results in a reddit-like site, I’d much rather that lobste.rs doesn’t federate. It’s a tech-news aggregator with comments; there’s no real benefit in splitting it up, especially at its current scale.

                                          1. 6

                                            I get what you’re saying. I think OP framed the idea wrong. People come to Lobsters because they like Lobsters. The question is whom would the federated Lobsters benefit – it would mostly benefit people who aren’t already Lobsters users.

                                            It’s just that the Lobsters code base is open source and actively developed, and much simpler than Reddit’s old open source code. So it’s not unreasonable to want to build a federated version on top of Lobsters’ code rather than start somewhere else.

                                            1. 3

                                              it would mostly benefit people who aren’t already Lobsters users.

                                              Well that was my point. Any spammer or shiller can create and recreate reddit and hacker-news accounts, thereby decreasing the quality and the standard of the platform, and making moderation more difficult. This is exactly what the invite tree-concept prevents, which is quite the opposite of (free) federation.

                                              1. 8

                                                We do have one persistent fellow who created himself ~20 accounts to submit and upvote his SEO spam. He’s still nosing around trying to re-establish himself on Lobsters. I’m very glad not to be in an arms race with him trying to prevent him from abusing open signups.

                                                1. 1
                                          2. 2

                                            Based on my experience in community management, including here on Lobsters, I do not believe it’s possible for an individual instance in a system like you describe to have a coherent culture which is different from the top-level culture in substantial ways, unless you’re okay with participants feeling constantly under siege. The top-level culture always propagates downward, and overriding it takes an enormous amount of resources and constant effort.

                                            1. 1

                                              Have you used Mastodon at all? If that’s used as a model, it seems each instance can have a distinct personality, as Mastodon instances do today. Contrast with traditional forums, and Reddit to some extent, which do more-or-less have a tree structure and where your concern definitely applies. With federation there doesn’t necessarily need to exist a top-down structure, even if that might be the easiest to architect (although I don’t know if it is the easiest).

                                              1. 1

                                                I have used Mastodon, but not enough to have a strong opinion on it. It’s been a challenge for me to pay enough attention to it to keep up with what’s happening; it’s kind of an all-or-nothing thing, and right now Twitter is still taking the attention that I would have to give to Mastodon.

                                          3. 7

                                            Biggest argument in favor is probably for people that want to leech off of the quality submissions/culture here but who don’t want to actively participate in the community or follow its norms. That and the general meme today of “federated and decentralized is obviously better than the alternative”.

                                            Everybody wants the fruit of tilled gardens, but most people don’t want to put in the effort to actually do the work required to keep them running.

                                            The funny thing is that we’d probably just end up with a handful (N < 4) of lobster peers (after the novelty wears off), probably split along roughly ideological lines:

                                            • Lobsters for people that want a more “open” community (signups, etc.) and with heavier bias towards news and nerdbait
                                            • Lobsters for social-justice and progressive people
                                            • Lobsters for edgelords and people who complain about “social injustice”
                                            • Lobsters Classic, this site

                                            And sure, that’d scratch some itches, but it’d probably just result in fracturing the community unnecessarily and creating the requirement for careful monitoring of what gets shared between sites. As a staunch supporter of Lobsters Classic, though, I’m of course biased.

                                            1. 3

                                              So “federation” is what the cool kids are calling “forking” nowadays? Good to know ;)

                                            2. 2

                                              I’d be quite interested to see lobsters publish as ActivityPub/OStatus (so I could, for instance, use a mastodon account to follow users / tags / all stories). I don’t see any reason to import off-site activity; one of the key advantages of lobsters is that growth is managed carefully.

                                              1. 1

                                                Lobsters actually already does this with Twitter, so that seems both entirely straightforward to add and in line with existing functionality.

                                                (Note that I don’t use Twitter, so I can’t speak to how well that feed actually works.)

                                                1. 1

                                                  The feeds already exist, just have to WebSub enable them…

                                                2. 1

                                                  It won’t go away entirely if the one, special person who happens to own this system decides to make it go away for whatever reason of their own. It won’t die off if this specific instance gets sold or given to someone who can’t handle it and who runs it into the ground.

                                                1. 4

                                                  TLDR: Most of the keyboard shortcuts here work in basically any software that reads text input.

                                                  I use CTRL+L (clear screen) and CTRL+U (cut everything before the cursor) daily in my shell.

                                                  1. 2

                                                    I use Ctrl-W (delete previous word) Ctrl-A (go to beginning of line) and Ctrl-R (search previous lines) a lot.

                                                    1. 2

                                                      I used exactly these for 15 years until I discovered that I could switch vi mode on. Now the first line that I type when logged in on a (foreign) Linux box is “set -o vi”. I wish all terminal REPL applications used readline so that I could use vi-mode line editing everywhere, but that isn’t the case everywhere.

                                                  1. 10

                                                    I enjoyed this, but it did make me wonder – what would a true low-level language designed today actually look like? I’ll hang up and take your answers over the air.

                                                    1. 5

                                                      If I’m reading the article’s premise properly, the author doesn’t even consider assembly language to be ‘low level’ on modern processors, because the implicit parallel execution performed by speculative execution is not fully exposed as controllable by the programmer. It’s an interesting take, but I don’t think anybody other than the author would use “low level” to mean what he does.

                                                      That said, if we were to make a language that met the author’s standards (let’s say “hardware-parallelism-transparent” rather than “low-level”), we’d probably be seeing something that vaguely resembled Erlang or Miranda in terms of how branching worked – i.e., a lot of guards around blocks of code rather than conditional jumps (or, rather, in this case, conditional jumps with their conditions inverted before small blocks of serial code).

                                                      People later in the thread are talking about threading & how there’s no reason threading couldn’t be put into the C standard, but threading doesn’t appear to be the kind of parallelism the author is concerned about exposing. (To be honest, I wonder if the author has a similarly spicy take on microcode, or on firmware, or on the programmability of interrupt controllers!)

                                                      He seems to be saying that, because we made it easy to ignore certain things that were going on in hardware (like real instructions being executed and then un-done), we were taken off-guard by the consequences when a hole was poked in the facade in the form of operations that couldn’t be hidden in that way. I don’t think that’s a controversial statement – indeed, I’m pretty sure that everybody who makes compatibility-based abstractions is aware that such abstractions become a problem when they fail.

                                                      He suggests that the fault lies in not graduating to an abstraction closer to the actual operation of the machine, which is probably fair, although chip architectures in general and x86 in particular are often seen as vast accumulations of ill-conceived kludges and it is this very bug-compatibility that’s often credited with x86’s continued dominance even in the face of other architectures that don’t pretend it’s 1977 when you lower the reset pin and don’t require trampolining off three chunks of arcane code to go from a 40 bit address bus and 16 bit words to 64 bit words.

                                                      People don’t usually go as far as to suggest that speculative execution should be exposed to system programmers as something to be directly manipulated, and mechanisms to prevent this are literally part of the hardware, but it’s an interesting idea to consider, in the same way that (despite their limitations) it’s interesting to consider what could be done with one of those PPC chips with an FPGA on the die.

                                                      The quick and easy answer to what people would do with such facilities is the same as with most forms of added flexibility: most people will shoot themselves in the foot, a few people would make amazing works of art, and then somebody would come along and impose standards that limit how big a hole in your foot you can shoot and it’d kill off the artworks.

                                                      1. 4

                                                          Probably a parallel/concurrent-by-default language like ParaSail or Chapel, with a C-like design as a base to plug into an ecosystem designed for it. Macros for DSL’s, too, since they’re popular for mapping stuff to specific hardware accelerators. I already had a Scheme + C project in mind for sequential code. When brainstorming on the parallel part, mapping stuff from the above languages onto C was the idea. Probably start with something simpler like Cilk to get my feet wet, though. That was the concept.

                                                        1. 1

                                                          Or maybe it would look like Rust.

                                                          1. 8

                                                              The article’s point is that things are parallel by default at multiple levels: there are different memories with different performance based on locality, orderings with consistency models, and so on. The parallel languages assume most of that, given they were originally designed for NUMAs and clusters. They force you to address it, with sequential stuff being an exception. They also use compilers and runtimes to schedule that, much like compiler + CPU models.

                                                            Looking at Rust, it seems like it believes in the imaginary model C does that’s sequential, ordered, and so on. It certainly has some libraries or features that help with parallelism and concurrency. Yet, it looks sequential at the core to me. Makes sense as a C replacement.

                                                            1. 3

                                                              But, Rust is the only new low-level language I’m aware of, so empirically: new low-level languages look like Rust.

                                                              Looking at Rust, it seems like it believes in the imaginary model C does that’s sequential, ordered, and so on.

                                                                To be fair, the processor puts a lot of effort into letting you keep imagining that model. Maybe the reason we don’t have languages that look more like the underlying chip is that it’s very difficult to reason about.

                                                              Talking out of my domain here: but the out of order stuff and all that the processor gives you is pretty granular, not at the whole-task level, so maybe we are doing the right thing by imagining sequential execution because that’s what we do at the level we think at. Or, maybe we should just use Haskell where order of execution doesn’t matter.

                                                              1. 3

                                                                How does rust qualify as “low level”?

                                                                1. 1

                                                                  From my understanding, being low-level is one of the goals of the project? Whatever “low-level” means. It’s certainly trying to compete where one would use C and C++.

                                                                  1. 3

                                                                    But does rust meet the criteria for low level that C does not (per the link)?

                                                                    1. -1

                                                                      The Rust wikipedia page claims that Rust is a concurrent language, which seems to be a relevant part of the blog. I don’t know if Rust is a concurrent language, though.

                                                                      1. 3

                                                                        I think you’re probably putting too much faith in Wikipedia. With that said, I must confess, I have no insight into the decision procedure that chooses the terms to describe Rust in that infobox.

                                                                        One possible explanation is that Rust used to bake lightweight threads into its runtime, not unlike Go. Go is also described as being concurrent on Wikipedia. To that end, the terms are at least consistent, given where Rust was somewhere around 4 years ago. Is it possible that the infobox simply hasn’t been updated? Or perhaps there is a turf war? Or perhaps there are more meanings to what “concurrent” actually signifies? Does having channels in the standard library mean Rust is “concurrent”? I dunno.

                                                                        Rust has stuff in the type system to eliminate data races in safe code. Separate from that, there are some conveniences that help avoid deadlock (e.g., you typically never explicitly unlock a mutex). But concurrency is definitely not built into the language like it is for Go.

                                                                        (I make no comment on Rust’s relevance to the other comments in this thread, mostly because I don’t give a poop. This navel gazing about categorization is a huge unproductive waste of time from my perspective. ’round and ’round we go.)

                                                                        1. 1

                                                                          Pretty sure having a type system designed to prevent data races makes Rust count as “concurrent” for many (including me).

                                                                          1. 3

                                                                            The interesting bit is that the type system itself wasn’t designed for it. The elimination of data races fell out of the ownership/aliasing model when coupled with the Send and Sync traits.

                                                                            The nomicon has some words on the topic, but that section gets into the weeds pretty quickly.

                                                                            1. 1

                                                                              I see where you are going with that. The traditional use of it was expressing concepts in a concurrent way. It had to make that easier. The type system eliminates some problems. It’s a building block one can use for safe concurrency with mutable state. It doesn’t by itself let you express things in a concurrent fashion easily. So, they built concurrency frameworks on top of it. A version of Rust where the language worked that way by default would be a concurrent language.

                                                                                Right now, it looks to be a sequential, multi-paradigm language with a type system that makes building concurrency easier. Then, the concurrency frameworks built on top of it may be thought of as something like concurrent DSLs. With that mental model, you’re still using two languages: a concurrent one along with a non-concurrent base language. This is actually common in high assurance, where they simultaneously wrote formal specs in something sequential like Z alongside CSP for the concurrent stuff. Concurrent-by-default languages are the rare thing, with sequential and concurrent usually treated separately in most tools.

                                                                  2. 2

                                                                    If exploring such models, check out HLL-to-microcode compilers and No Instruction Set Computing (NISC).

                                                                  3. 1

                                                                    Interestingly, the Rust wikipedia page makes a big deal about it trying to be a “concurrent” language. Apparently it’s not delivering, if that is the major counter you gave.

                                                                    1. 2

                                                                      Occam is an example of a concurrency-oriented language. The core of it is a concurrency model. The Rust language has a design meant to make building safe concurrency easier. Those frameworks or whatever might be concurrency-oriented. That’s why they’re advertised as such. Underneath, they’re probably still leveraging a more sequential model in base language.

                                                                      Whereas, in concurrency- or parallelism-first languages, it’s usually the other way around or sequential is a bit more work. Likewise, the HDL’s the CPU’s are designed with appear to be concurrency-first with them beating the designs into orderly, sequential CPU’s.

                                                                      So, I’m not saying Rust isn’t good for concurrency or can’t emulate that well. Just that it might not be that at its core, by default, and as the easiest style to use. Some languages are. Does that make more sense?

                                                                      1. 0

                                                                        Yes, I know all that; my point was that the wikipedia page explicitly states Rust is a concurrent language, which, if true, means it fits into the idea of this post.

                                                                  4. 3

                                                                    Does Rust do much to address the non-sequential nature of modern high-performance CPU architectures, though? I think of it as a modern C – certainly cleaned up, certainly informed by the last 50 years of industry and academic work on PLT, but not so much trying to provide an abstract machine better matched to the capabilities of today’s hardware. Am I wrong?

                                                                    1. 3

                                                                      By the definitions in the article, Rust is not a low level language, because it does not explicitly force the programmer to schedule instructions and rename registers.

                                                                      (By the definitions in that article, assembly is also not a low level language.)

                                                                      1. 1

                                                                        Ownership semantics make Rust higher-level than C.

                                                                        1. 3

                                                                          I disagree:

                                                                          1. Parallelism would make whatever language higher-level than C too but the point seems to be that a low-level language should have it.
                                                                          2. Even if true, ownership is purely a compile-time construct that completely disappears at run-time, so there is no cost, so it does not get in the way of being a low-level language.
                                                                          1. 2

                                                                            Parallelism would make whatever language higher-level than C too but the point seems to be that a low-level language should have it.

                                                                            This premise is false: Parallelism which mirrors the parallelism in the hardware would make a language lower-level, as it would better mirror the underlying system.

                                                                            Even if true, ownership is purely a compile-time construct that completely disappears at run-time, so there is no cost, so it does not get in the way of being a low-level language.

                                                                            You misunderstand what makes a language low-level. “Zero-cost” abstractions move a language to a higher level, as they take the programmer away from the hardware.

                                                                    2. 2

                                                                      I came across the X Sharp high-level assembler recently, I don’t know if it’s low-level enough for you but it piqued my interest.

                                                                      1. 2

                                                                        There’s no point in a true low-level language, because we can’t access the hardware at that level. The problem (in this case) isn’t C per se, but the complexity within modern chips that is required to make them pretend to be a gussied-up in-order CPU circa 1993.

                                                                      1. 1

                                                                        As an American, I was really confused by the date of this article. I kept thinking to myself, “Wow, this post is from January and it just now made it to lobste.rs?” Then I clicked on the News homepage to see what other news they had, and promptly realized they’re using the European format (01.05.2018) on the article, but a less ambiguous format (May 01, 2018) for the News homepage.

                                                                        1. 22

                                                                          It’s not the “European” format. It’s the international format. The US, of course, needs to be a snowflake.

                                                                          1. 18

                                                                            YYYY-MM-DD is the one true international date format! :-)

                                                                              DMY is definitely more widespread than MDY, I’ll agree, but besides the US, it also isn’t used in most of East Asia. People in countries that don’t use either of those often find it ambiguous whether a year-last date was intended as a “European-style” or “American-style” date (which in my limited experience is what Japanese and Chinese call those two formats), since both styles are foreign. You can even find examples of all three styles on Chinese universities’ English-language pages…

                                                                            1. 5

                                                                              Going by user population size, by international standards, and by rationality (sort lexicographically!), YYYY-MM-DD is probably the only format that deserves to be called international. It’s also much less ambiguous than month-first and date-first, given that the US and Europe do the opposite thing but write it the same way. I suppose someone could write YYYY-DD-MM but I don’t remember having seen this, while I definitely am confused about whether someone is writing in the European/US style from time to time.

                                                                              This is as an American, born and raised. :) I still prefer to write MM/DD, though, because we speak dates that way. Maybe it’s different in other languages.

                                                                              EDIT: Actually, according to Wikipedia, DMY is used by the most people! https://en.m.wikipedia.org/wiki/Date_format_by_country
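
                                                                                To make the “sort lexicographically” point concrete, a quick sketch in Go (the sample dates are arbitrary):

                                                                                package main

                                                                                import (
                                                                                    "fmt"
                                                                                    "sort"
                                                                                )

                                                                                func main() {
                                                                                    // A plain string sort on YYYY-MM-DD is also chronological order.
                                                                                    iso := []string{"2018-05-01", "2017-12-31", "2018-01-09"}
                                                                                    sort.Strings(iso)
                                                                                    fmt.Println(iso) // [2017-12-31 2018-01-09 2018-05-01]

                                                                                    // The same sort on MM/DD/YYYY strings scrambles the years.
                                                                                    us := []string{"05/01/2018", "12/31/2017", "01/09/2018"}
                                                                                    sort.Strings(us)
                                                                                    fmt.Println(us) // [01/09/2018 05/01/2018 12/31/2017]
                                                                                }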

                                                                              1. 4

                                                                                Other than ISO 8601, I prefer DMY with the month written as a three-letter abbreviation. ex: 01 May 2018. It prevents the confusion over whether 01 is the first day of the month or the first month of the year, and reads in the order one typically cares about while preserving the rank order of the components. When I need a checksum I put the day of the week in front: Tue 01 May 2018. That lets me be confident I didn’t make a transcription error and lets the person I’m communicating with check my work if they need to.
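
                                                                                  For what it’s worth, a rough Go sketch of that format, using Go’s reference-date layout (the sample date is arbitrary):

                                                                                  package main

                                                                                  import (
                                                                                      "fmt"
                                                                                      "time"
                                                                                  )

                                                                                  func main() {
                                                                                      t := time.Date(2018, time.May, 1, 0, 0, 0, 0, time.UTC)
                                                                                      // Day-of-week checksum plus DD MMM YYYY, as described above.
                                                                                      fmt.Println(t.Format("Mon 02 Jan 2006")) // Tue 01 May 2018
                                                                                      // ISO 8601 for comparison.
                                                                                      fmt.Println(t.Format("2006-01-02")) // 2018-05-01
                                                                                  }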

                                                                                1. 2

                                                                                  Good point, I definitely think the day of the week as checksum is underused. I always try to include it in scheduling emails in case I mistype a number.

                                                                                  1. 2

                                                                                    MDY and DMY are equally unambiguous when the month is written as an abbreviation, but a numeric month papers over language differences: It doesn’t matter if you call it “Aug” or “八月”, it’s 8.

                                                                                    (That requires everyone to standardize on the Hindu-Arabic numerals, but, in practice, that seems like it’s happened, even in places which don’t use the Latin alphabet.)

                                                                                  2. 3

                                                                                      In Hungary, though we are in Europe, we don’t use the “European format”. The Hungarian standard format is “YYYY. MM. DD.”. I prefer the ISO format for anything international, as it is easy to recognize from the dashes, and avoids confusion. (In my heart I know that our format is the one true format, but I’m happy the ISO has also recognized it! 😉)

                                                                                    Edit: To me the D M Y format can be justified, though for me Y M D seems more logical. (specifying a time instance from the specific to the generic, or from the generic to the specific range can both be ok) What I cannot grasp is how the M D Y format appeared.

                                                                                    1. 3

                                                                                      What I cannot grasp is how the M D Y format appeared.

                                                                                      The tentative progression I pieced together last time I looked into it, though note that this is definitely not scientific grade historical research, is something like this:

                                                                                      1. When talking about a date without the year, English has for centuries used both “May 1st” and “1st May” (or “1st of May”), unlike some languages where one or the other order strongly predominates. Nowadays there’s a strong UK/US split on that one, but in 18th-19th century England they were both common;

                                                                                      2. it seems to have been common for authors to form a fully qualified date by just tacking on the year to however they normally wrote the month/day, so some wrote “May 5th, 1855” and others “5th May, 1855”;

                                                                                      3. fairly early on, the “May 5th” and “May 5th, 1755” forms seem to have become dominant in the US for whatever reason; and finally

                                                                                      4. much later, when writing dates in fully numerical format became a thing, Americans kept the same MDY order that they had gotten used to for the written-out dates.

                                                                                2. 1

                                                                                    In my mind, if it’s not the American standard it must be the European standard, even if it encompasses more than Europe. I understand that’s probably not the best way to think of things.

                                                                                  1. 6

                                                                                    As an Australian, I get pretty annoyed every time I read a US article and have to deal with the mental switch. Even worse because I work for a US company and people throw around “we’re doing this 6/5”, and that doesn’t even look like a date to my eyes — we never just do D/M, so “number/number” looks like a fraction. once I work out it’s a date, I realise it’s an American thing and realise it must be M/D.

                                                                                  2. 1

                                                                                      I use YYYY-MM-DD for no reason other than that it sorts files nicely in a folder.

                                                                                1. 2

                                                                                  The big value of the education part of an undergraduate education is knowing what all the sub-headings are in a given field, and where to look in those sub-headings to find information on how to solve a specific problem. There’s a Hard Problem of Knowledge Organization, analogous to the Hard Problem of AI, and we’ve tried to solve it in multiple ways, but most enduring is to split fields into sub-fields, recursively, and turn people into generalists who go on to specialize, either in academia or in their careers; it isn’t impossible to jump fields, but it requires time and mistakes to get good at a new field, and humans only have so much of either in them.

                                                                                  Therefore, expecting People What Know-All And Do-All is contrary to both our educational system and basic human biology, and expecting it at junior dev salaries is even less sane.

                                                                                  1. 4

                                                                                    Is there anyone who can review a distro without reviewing some desktop manager?

                                                                                    Is there anyone who understands that desktop managers are independent of distros?

                                                                                    1. 5

                                                                                        Distros are mostly the same under the hood: Linux, systemd, and deb/rpm packages.

                                                                                        The interesting parts are things like “will it destroy itself during distro upgrades?”, but those are rarely included in reviews.

                                                                                    1. 4

                                                                                        My favourite falsehood about Unicode — toUpper/toLower does not change the length of a string. At least when measured in graphemes. For latin-1. — is sadly coming to an end nowadays.

                                                                                        Previously, the uppercase equivalent of “ß” was “SS”, it compared equal to “SZ” and “SS”, and the lowercase equivalent of “SS” was “ss”, but “ss” and “ß” were not equal.

                                                                                        Now that ẞ officially exists as the uppercase form for ß and has been approved for use since 2017, the next version of Unicode is going to standardize ẞ.
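
                                                                                        To make the length change concrete, here is a rough Go sketch; the simple per-rune mapping and the full Unicode case mapping (via the golang.org/x/text/cases package) disagree on exactly this point:

                                                                                        package main

                                                                                        import (
                                                                                            "fmt"
                                                                                            "strings"

                                                                                            "golang.org/x/text/cases"
                                                                                            "golang.org/x/text/language"
                                                                                        )

                                                                                        func main() {
                                                                                            // Simple per-rune mapping: ß has no single-rune uppercase, so it is left alone.
                                                                                            fmt.Println(strings.ToUpper("straße")) // STRAßE

                                                                                            // Full case mapping: ß uppercases to SS, so the string gains a grapheme.
                                                                                            fmt.Println(cases.Upper(language.German).String("straße")) // STRASSE
                                                                                        }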

                                                                                      1. 1

                                                                                        The capital sharp s existed long before Unicode:

                                                                                        Historical typefaces offering a capitalized eszett mostly date to the time between 1905 and 1930. The first known typesets to include capital eszett were produced by the Schelter & Giesecke foundry in Leipzig, in 1905/6. Schelter & Giesecke at the time widely advocated the use of this type, but its use remained very limited.

                                                                                        … and it’s been in Unicode since 2008:

                                                                                        Capital ß (ẞ) was introduced as part of the Latin Extended Additional block in Unicode version 5.1 in 2008 (U+1E9E ẞ LATIN CAPITAL LETTER SHARP S).

                                                                                        The only thing that changed in 2017 was the opinion of the Council for German Orthography.

                                                                                        https://en.wikipedia.org/wiki/Capital_%E1%BA%9E

                                                                                        1. 2

                                                                                          Yes, I realize that – but it’s expected that the next version of Unicode is going to standardize the new capitalization rules, which they haven’t yet.

                                                                                      1. 20

                                                                                          While I’m not going to argue against their findings per se, you do have to wonder if the drop-off in contributors is in large part simply due to the completeness of the wiki. Not that Wikipedia EN actually encompasses all of human knowledge, but that 90% of the low-hanging fruit has been plucked already. As time goes on, contributions will only become more and more specialized. I don’t think this is a bad thing at all, and I believe it actually mimics the growth of software projects.

                                                                                        1. 6

                                                                                          One should also keep in mind that the bigger and more mature a wiki becomes, contributing becomes a bit more difficult for newcomers, since they either might not know where or whether to add something or they might not be familiar enough with the user’s/admin’s etiquette, thus “scaring” them away from contributing.

                                                                                          1. 3

                                                                                            I was an editor during 2003–06 and the culture was just toxic but, amazingly, most of the articles (95–99 percent) were excellent, even then. If a topic wasn’t controversial, you’d find something decent. The product itself, I think, has even gotten better. I don’t see a slight decline in the number of editors as necessarily a bad thing, although it will have to bring in new talent.

                                                                                            I do think Wikipedia was easier to get into, in 2003, probably because the standard to which one had to write an article to make it fit in was lower. These days, articles tend to have pictures with captions, footnotes, and a lot of other adornments that make the pages better, no doubt, but possibly make the editing process more intimidating. That said, I hope the toxic hostility I encountered in the mid-2000s has abated.

                                                                                            1. 7

This might be unpopular, but I find articles on controversial topics to be high-quality and well-sourced precisely because of the edit wars those articles endure: if you can’t edit a paragraph into a sloppy form because people are actively monitoring the article to ensure that old fights aren’t starting back up, the paragraphs are going to be pretty tight; similarly, they’re not going to be one-sided, because both sides know the system well enough to start a resolution process when well-sourced material is being kept out.

                                                                                              1. 2

                                                                                                I don’t edit in particularly controversial areas, but I haven’t found the bar for contributing lately to be all that high. The main way it’s gotten higher is that sources aren’t optional anymore. In the early days it was common to just write a bunch of text without necessarily citing anything, with the expectation that it could be properly referenced later if necessary (more of a classic wiki style of working, like how things were done on the old C2 wiki). Now, if you’re creating a new article, it’s expected you’ll cite some decent sources for it. But like one or two decent sources and one paragraph of text is fine. I create a lot of short biographies of historical figures, and short articles on archaeological sites, and nobody has ever complained about them as far as I remember.

                                                                                                1. 2

I got interested in the unusual architectures in the GCC tree and wrote some articles with sources. They were swiftly deleted for “not being notable”.

                                                                                                  1. 2

The notability requirement has always seemed bonkers to me… it made sense in a paper encyclopedia, but Wikipedia can have as many articles as it wants…

                                                                                                    1. 3

                                                                                                      Notability is still useful because you have to draw a line somewhere, lest Wikipedia becomes an indiscriminate collection of information.

                                                                                                      1. 1

                                                                                                        To me, having an upstream GCC port automatically makes an architecture notable (there are only 49 of them at the moment), but Guardians of Wikipedia (TM) seem to disagree.

                                                                                                      2. 2

                                                                                                        It’s probably to block self-promotion. A lot of people used to write articles about themselves.

On reflection, now that I am borderline “notable”, the absolute last thing I would want is a Wikipedia article about me. There’s likely to be one after my book comes out (mid-2019) and I’m dreading the thought.

                                                                                                        1. 2

                                                                                                          For currently living people, the standards have been tightened up (especially around sourcing) partly because a lot of people share that view: they’d rather have no article about them than a bad one. The subject still doesn’t get a veto over the article, but there are some interests to balance there. I mean any bad article is bad in some way just because it spreads disinformation. But a bad article about a specific person who’s currently alive, especially if they aren’t even a major public figure, is bad in a more personal way in that it can harm the reputation, job prospects, etc. of that person directly.

                                                                                              1. 1

This wavers a bit between properties of names (“People’s first names and last names are, by necessity, different.”) and properties of technology (“People’s names are written in any single character set.”), which is a bit odd, and makes it something of a rant about how limited text handling still is: yes, there are characters which do not exist in Unicode, or in any character encoding standard, and I’m sure some people write their names with them. However, that issue would come up any time those characters are used, in a name or not, and it will cease to be a problem eventually, given that Unicode continues to expand.

                                                                                                1. 2

Still makes me sad that even in UTF-8 there are invalid code points, i.e. you have to double-inspect every damn byte if you’re doing data mining.

Typically in data mining you are presented with source material. It’s not your material; it’s whatever is given to you.

If somebody has screwed up the Unicode encoding, you can’t fix it. You have to work with whatever hits the fan, and everything else in your ecosystem is going to barf if you throw an invalid code point at it, even if it was just going to ignore it anyway.

So you first have to inspect every byte to see whether it’s part of a valid code point, and then squash the invalid ones on the fly to the special replacement thingy, i.e. double work for each byte, and you can’t just mmap the file.

Ah for The Good Old Bad Old Days of 8-bit ASCII.
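For what it’s worth, that squash-everything-to-the-replacement-character pass is at least short to write. A minimal Go sketch of the idea (sanitize is a made-up helper; the standard unicode/utf8 package does the per-byte checking):

```go
package main

import (
	"fmt"
	"unicode/utf8"
)

// sanitize scans the input and replaces every invalid UTF-8 sequence with
// U+FFFD. This is the "inspect every byte" pass described above: the raw
// bytes can't be handed to the rest of the pipeline until it has run.
func sanitize(b []byte) string {
	var out []rune
	for len(b) > 0 {
		r, size := utf8.DecodeRune(b)
		// For an invalid byte, DecodeRune returns (utf8.RuneError, 1), so the
		// bad byte is consumed and comes out as U+FFFD automatically.
		out = append(out, r)
		b = b[size:]
	}
	return string(out)
}

func main() {
	fmt.Println(sanitize([]byte{'h', 'i', 0xFF, '!'})) // hi�!
	// Since Go 1.13 the standard library offers the same thing in one call:
	// bytes.ToValidUTF8(data, []byte("\uFFFD"))
}
```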

                                                                                                  1. 6

Still makes me sad that even in UTF-8 there are invalid code points, i.e. you have to double-inspect every damn byte if you’re doing data mining.

I disagree. It’s an amazing feature of UTF-8 because it allows me to exclude UTF-8 with certainty from the list of possible encodings a body of text might have. No other 8-bit encoding has that feature. A blob of bytes that happens to be text encoded in ISO-8859-1 looks exactly the same as a blob of bytes encoded in ISO-8859-3, but it can’t be UTF-8 (at least when it uses anything outside the ASCII range).

Ah for The Good Old Bad Old Days of 8-bit ASCII.

If you need to make sense of the data you have mined, the Old Days were as bad as the new days are: you’re still stuck guessing the encoding by interpreting the blob of bytes as different encodings and then trying to see whether the text makes sense in any of the possible languages that could have been used in conjunction with your candidate encoding.

                                                                                                    This is incredibly hard and error-prone.
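Ruling UTF-8 in or out really is cheap, though. A minimal Go sketch (the byte values are just an illustrative ISO-8859-1 sample):

```go
package main

import (
	"fmt"
	"unicode/utf8"
)

func main() {
	// "café" in ISO-8859-1: é is the lone byte 0xE9, which UTF-8 would only
	// accept as the start of a three-byte sequence, so validation fails.
	latin1 := []byte{'c', 'a', 'f', 0xE9}
	fmt.Println(utf8.Valid(latin1)) // false: UTF-8 can be excluded outright

	// The same text encoded as UTF-8 validates, as expected.
	fmt.Println(utf8.Valid([]byte("café"))) // true
}
```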

                                                                                                    1. 1

I guess I’d like a Shiny New Future where nobody tries to guess encodings, because standards bodies and software manufacturers insist on making them explicit, and all software by default splats bad code points to the replacement character instead of doing something really stupid like throwing an exception…

                                                                                                      Sigh.

I guess for decades to come I’ll still fondly remember the Good Old Bad Old Days of everything-is-ASCII (and if it wasn’t, we carried on anyway)… I’m not going to hold my breath waiting for a sane future.

                                                                                                    2. 2

Ah for The Good Old Bad Old Days of 8-bit ASCII.

                                                                                                      It wasn’t ASCII, and that’s the point: There was no way to verify what encoding you had, even if you knew the file was uncorrupted and you had a substantial corpus. You could, at best, guess at it, but since there was no way to disprove any of your guesses conclusively, that wasn’t hugely helpful.

                                                                                                      I remember playing with the encoding functionality in web browsers to try to figure out what a page was written in, operating on the somewhat-optimistic premise that it had a single, consistent text encoding which didn’t change partway through. I didn’t always succeed.

                                                                                                      UTF-8 is great because absolutely nothing looks like UTF-8. UTF-16 is fairly good because you can usually detect it with a high confidence, too, even without a BOM. UCS-4 is good because absolutely nobody uses it to store or ship text across the Internet, as far as I can tell.

                                                                                                    1. 3

                                                                                                      That’s amazingly cool. Is it possible to download a disk image with a ready-to-go system installed?

                                                                                                      1. 6

http://multicians.org/simulator.html has links and instructions for running it yourself. While you shouldn’t expect it to easily support hundreds or thousands of users out of the box on eight or more front-end network processors, etc., in the QuickStart configuration, it will include most of the software that my system does.

I do my best to report all the issues I’ve run into (and often I have no choice but to defer to the Multicians for help), so you can be sure that the next release of the Multics distribution (MR12.6g or MR12.7) will contain all of the fixes for the issues I’ve run into while stress-testing this large configuration.

FYI - I have set up a virtual MIKSD (Multics Internet Kermit Service Daemon - analogous to an FTP site) which will be used to provide an easily accessible archive of new and user-supported software and ports for other Multics sites over the Internet, without having to resort to manual, albeit simulated, tape operations, or to IMFT/X.25 simulation. I’m still working on finalizing the distribution archive format, but the infrastructure is there and running.

                                                                                                      1. 17

                                                                                                        Pointfree style in ML-family languages (e.g. Haskell) lets you avoid naming things, though the style was designed more for general conciseness than that specifically.

                                                                                                        1. 9

                                                                                                          Similarly, pipelines in shell scripts let you avoid naming intermediates.

                                                                                                          (Pipelines are a combination of pointfree style and array language style, in that well-designed pipelines are side-effect-free and the programs in the pipeline mostly implicitly iterate over their input.)

                                                                                                          1. 2

There’s also a tool called Pointfree which automatically converts code into “pointfree style” - e.g.

                                                                                                            \a b c -> b * c
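-- which Pointfree rewrites as: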
                                                                                                            
                                                                                                            const (*)
                                                                                                            

                                                                                                            And there’s also Pointful, for the other way around.

                                                                                                            1. 1

                                                                                                              I think Haskell strikes a very fine balance here. Both point-free and pointful styles are very ergonomic, so you tend to name precisely those things that would be better off named.

                                                                                                            1. 17

Knowing German, I read this very differently from what the title is initially trying to convey.

                                                                                                              1. 8

For the non-German folks here, du is the German personal pronoun for you, so the title reads: Like you, but more intuitive :)

                                                                                                                1. 2

                                                                                                                  It’s a bit more complex than that: German retained the T-V distinction, which means it has two forms of singular second person pronoun, one for people you’re close to and one for people you’re not close to. Sie is the pronoun for people you’re not close to, du is the one for people you are close to. It also has two forms of the second-person plural pronoun, ihr for people you’re close to and, again, Sie for people you’re not close to.

                                                                                                                  1. 2

Still, it translates to the same word, and I don’t know of any way to preserve that intent in English.

I always thought the du/Sie distinction makes German very formal, but it also seems very ingrained in the culture. The distinction also existed in Swedish, but it disappeared there, and it’s so rare in Denmark that I can’t remember when I last saw it. Something I couldn’t imagine happening in Germany.

                                                                                                                    1. 3

                                                                                                                      Sweden is such a small country that a reform of this type, made in the heady days of the 60s, got traction very easily.

                                                                                                                      As a bank cashier in the late 80s I’d sometimes refer to customers using “ni” and occasionally get pushback from people of the “68 generation”.

                                                                                                                2. 5

                                                                                                                  Also “dust” means dork or idiot in Norwegian.