1. 1

    Skimming quickly through this, it seems to me that it must be quite an old article, because the RAM of the workstations is in the range of 256MB to 1GB. However, I didn’t see the date of publication.

    1. 3

      Yeah, was thinking the same. I can see the pkgsrc documentation was accessed in October 2004 so I guess the document must be from around that timeframe?

      Update: Ah, found this related presentation by Jan Schaumann and it’s from EuroBSDCon 2004.

      1. 3

        The real question is how well it all holds up two decades later. Is that system even still in use? I wonder.

        1. 2

          Whoa, a KDE 2.x screenshot! That brings back memories!

      1. 3

        Fast forward to a couple of months ago when I decided to learn how to touch type, which is a story for another day, and realized that touch typing aligned very well with vim

        I can really relate to this. I think touch typing is a joy enhancer for VI. I don’t know if I actually would use VI without touch typing.

        1. 13

          I find it somewhat surprising that someone can sit in front of a computer all day writing code and not know how to touch-type. Kudos to them for making a conscious decision to learn it.

          1. 5

            My father has been a programmer for 40 years and he still hunts and pecks with two fingers. I’ve been touch-typing for 20 years. The difference between us is that I also played computer games since I was little, and the impetus to learn to type quickly came from needing to both type out information and heal my party at the same time.

            1. 2

              I like to think I can type fast (85-90 wpm) and type a bit wonky, not by hunting and pecking but definitely not touch typing. What would you say is the best way to learn how to actually touch type properly?

              1. 4

                If you’re just using muscle memory and not looking at the keyboard, I think it’s still touch-typing. It’s harder to unlearn years of muscle memory, but what they had us do when I was in school was put a blind over the keyboard and do typing tests. There are a bunch of simple games out there to help you practice typing; those help.

                1. 3

                  Thanks, yeah I guess I hadn’t thought of it that way. I occasionally look at the keyboard but I’d wager 95% of the time I don’t, so that’s good enough in my book. I basically just want to learn real touch typing for use with Vim, but I should probably actually start using Vim regularly first.

                  1. 2

                    I don’t do “real” touch-typing either, technically. I rarely use the right ctrl/shift/alt keys and still have to glance at my keyboard for the top number row and some symbols on occasion.

                2. 2

                  When I was 17, I broke my right wrist and so had to type everything one-handed for a bit. I learned to type quite quickly using only my left hand. I am now completely incapable of using any split keyboard, because I use my left hand for anything to the left of the J key, though some things in the middle I will type with either hand, depending on which hand typed the previous character. It makes people who learned to touch type properly very uncomfortable if they watch.

              2. 2

                On the other hand, I have never understood why the movement keys are HJKL. They’re one key off from the rest position of the right hand, which is “JKL;” and feels much more natural to me, so… I doubt that these keys were chosen by an experienced touch typist.

                Edit: Looks like HJKL is even older than vi: https://catonmat.net/why-vim-uses-hjkl-as-arrow-keys

              1. 2

                I used a similar solution to back up a physical Windows installation and put it inside a VM. Since then I have been using Linux all the time, and when I need Windows (for conf calls and presentations) I just spin up the Windows VM.

                I only remember that it took a bit of time to get Microsoft to activate it. In the end, after I sent them a copy of the invoice and some photos to prove that I had legitimately purchased the computer with Windows, I had to let them onto my Windows machine remotely, and they activated it.

                No need to reboot into Windows since then. Haven’t looked back!

                Maybe someone will find this useful.

                1. 1

                  For me the path to something similar was to start using a terminal multiplexer. I am using screen, because that is what is preinstalled on most servers, but tmux might be at least as good. With a scrollback buffer of 1000 lines I have as many terminals open as I want, in one single window, and I can copy and paste using the screen shortcuts (Ctrl-A + [ to enter copy mode, vi navigation keys to move around, Space to mark the start and end of the range to copy, and Ctrl-A + ] to paste, most of the time after switching to another terminal window inside screen).
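                  For anyone wanting to replicate this, the non-default part of such a setup fits in a couple of ~/.screenrc lines (a sketch; the copy/paste bindings mentioned above are screen’s defaults):

                  ```
                  # ~/.screenrc -- sketch of the setup described above
                  defscrollback 1000    # keep 1000 lines of scrollback per window
                  startup_message off   # skip the license splash on startup
                  ```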

                  The reason I started to use screen was console work on clients’ servers, limited to a single SSH session. But once I had learned it, I started using it even on my desktop, which resolved a huge issue I had before: quite often my work requires spinning up a Windows VM for conf calls and such, and from that Windows VM I can SSH into the host (the desktop machine) and have all my sessions open. Once I am done with Windows, I just detach the screen session in Windows, close the VM, and go back to working inside GNOME Terminal.

                  I am a vi person, using vi wherever possible: (1) in Vim itself; (2) in Bash, with set -o vi to edit command lines, which sometimes get so long that pressing ‘v’ spins up Vim for more comfort; (3) in Emacs for Org-Mode via Evil-Mode, in GHCi (configurable in $HOME/.haskeline), and in all terminal programs with readline support, where you can switch to vi mode with Ctrl-Alt-j.
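                  For reference, the bash and readline bits are one-liners each (a sketch of the standard settings):

                  ```shell
                  # in ~/.bashrc: vi editing mode for the bash command line
                  set -o vi
                  # equivalently, for every readline-based program, put this in ~/.inputrc:
                  #   set editing-mode vi
                  # sanity check: list shell options and confirm vi mode is on
                  set -o | grep '^vi'
                  ```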

                  Maybe someone will find this method useful.

                  The only problem that has popped up is that Emacs in the terminal doesn’t show the nice Unicode symbols (status line and org-mode decorations) when I connect to the machine via SSH and attach the screen session. I have tried different settings for the LC_* and LANG variables, but so far have not been able to find a solution. If anyone has suggestions I would be very grateful.

                  1. 2

                    Personally I connect via (multiple) terminals to the remote server and edit files in Emacs via TRAMP. This works well enough for my use case, and I have configured ssh to reuse the connection, i.e. this is at the end of my ~/.ssh/config:

                    Host *
                      ControlMaster auto
                      ControlPath ~/.ssh/ssh_mux_%h_%p_%r

                    In general, with Emacs you’d try to set things up so you don’t need the shell. If you, for example, want to compile something on the remote server, you could use compile-mode for that, which will even give you output linked to file and line. This is basically how Emacs sucks you in: you can do more and more of your workflow within it, which reduces the need for a shell.

                    However, I’ve found that for large swathes of text the shell modes - pick any - do not work too well and slow down. Hence I’m still using the dual approach with emacs + shell to run interactive sessions, which works reasonably well.

                  1. 12

                    A while back I bought two of these USB Thinkpad keyboards, using the old (good) keyboard layout: https://www.newegg.com/lenovo-thinkpad-usb-wired/p/N82E16823218006

                    I have used the crap out of them. They are the absolute best.

                    Internally it’s just a USB controller attached to the same keyboard that shipped in older Thinkpads, so I’ve already fixed up at least one keyboard with parts from eBay.

                    Despite things like Vimium or i3 or other ways to reduce mouse usage, most folks still need a mouse from time to time. Reducing the travel time from your keyboard to your mouse seems really high-value to me, and I’m at a loss as to why most of these custom or fancy keyboard people don’t focus on having a nearby mouse of some kind. I’m not the OP of this thread, but I highly empathize: https://www.reddit.com/r/MechanicalKeyboards/comments/626sga/how_about_trackpoints/

                    These Thinkpad trackpoint keyboards are perfect. The mouse is right there.

                    1. 10

                      I love my TEX Shinobi, a mechanical homage to the ThinkPad design: https://tex.com.tw/products/shinobi

                      1. 4

                        Just got mine yesterday. Such a pleasure to have some key travel again, and to feel the fingers match the keys. Really nice to alternate with the laptop keyboard (X1E Gen1), and it’s an incentive to work more at the desk with a big screen. For me the trackpoint on the Shinobi is much more precise and easier to use. I was expecting a little more resistance from the keys, but in the end I think it is quite comfortable. It’s really nice, too, that the keycaps have a deeper mold. It was expensive, but I’m definitely happy with this purchase.

                        1. 4

                          oh my gosh i’ve never seen this before, this is amazing!

                          1. 4

                            Woah! This is the first keyboard I’ve seen in years that tempts me…

                            1. 3

                              How are the printed key legends holding up? I got mine a week ago and I’m already noticing L-Ctrl, Esc, and frequent letters fading. It’s not a big deal since I don’t really look, but I’m surprised.

                              1. 3

                                I’ve been using mine daily for ~9 months, and while it’s true that some letters started fading very quickly, they seem to have reached a “plateau”. The discolouring has definitely slowed its pace, or the keycaps would be blank by now.

                                1. 2

                                  Same here, fading on frequently used keys. Been using it since last November.

                              2. 5

                                Thank you for your comment. I feel the same way about trackpoints, and your comment made me order a ThinkPad USB keyboard :)

                                I really like the newer chiclet design, so I’ve picked a more recent model. Luckily they seem to be designed with a similar concept: reuse of the existing laptop keyboard design (see https://dontai.com/wp/2018/09/06/thinkpad-wired-usb-keyboard-with-trackpoint-0b47190-disassembly-and-cleaning/ for a disassembly). The number of key rows doesn’t really bother me, and for all I’ve tried, I don’t feel comfortable on keyboards with mechanical switches. Too many hours on a ThinkPad, I think.

                                1. 4

                                  i am very happy lenovo is still making these keyboards, even if it’s the new layout

                                2. 4

                                  I have one of these and I love it! I’m a sucker for the trackpoint and I love the pre-chiclet key design. It’s super portable too - I can easily throw it in my backpack with my laptop if I’m going to be out of the (home) office all day.

                                  It’s a little sad that this version seems to be so hard to find these days :(

                                  1. 4

                                    I’d recommend the ThinkPad TrackPoint Keyboard II because it is wireless - via Bluetooth or a wireless nano USB dongle.

                                    1. 5

                                      I own the first generation as the wired version, and the micro-USB socket is absolute garbage. Two out of three keyboards lose the USB connection when the cable is moved slightly. But this problem can be fixed pretty easily by disassembling the keyboard, bending the socket back into its normal shape, and then adding a large solder blob to the socket case so that it can’t bend as easily anymore. I reliably fixed both keyboards with this procedure.

                                  1. 5

                                    Wow. Half-ish of the users who responded use evil mode or similar to have vi key bindings. That surprises me even though I frequently use emacs that way myself.

                                    I guess the old joke that “Emacs is a great system. It just needs a decent text editor” had more truth to it than I imagined.

                                    1. 1

                                      Indeed, I believe I would not have stuck with Emacs if there were no vi support. I use Emacs only for org-mode, but have done so for years now. I remember that at the beginning I just installed standard Emacs and researched how to enable evil-mode. Later I switched to Spacemacs. This works for my case, though for coding and console work I still mostly reach for Vim.

                                      1. 1

                                        It looks like it correlates with the Doom + Spacemacs user numbers, and as far as I remember they have Vim keybindings as defaults, or at least suggest them as defaults. I know they helped ease the Emacs adoption curve for me as a long-time Vim user.

                                      1. 1

                                        Very interesting, and it made me discover more about what happened before C.

                                        My understanding after reading this article:

                                        CPL [1]       -> BCPL [3] -----\
                                        ALGOL 60 [2]  -> SMALGOL [4] --|
                                                                       \-> B [5] -> NB [6] -> C [7]

                                        The text in the references consists of quotes extracted from the article:

                                        [1]: Born because Cambridge University got a new computer, a stripped-down
                                        version of the Ferranti Atlas, dubbed “Titan”, and the trio of David Wheeler,
                                        David Barron and Hartley were convinced that it was necessary to develop a new
                                        language. This new programming language was dubbed CPL for Cambridge
                                        Programming Language, later revised to Combined Programming Language and
                                        later, after Christopher Strachey took over the project, came to mean
                                        “Christopher’s Programming Language” for those associated with it.
                                        The specifications of the language were largely finished, but there was no
                                        compiler available. The working group had made CPL so complicated that early
                                        attempts at writing a compiler resulted in machine code that was incredibly
                                        inefficient. See the CPL reference manual [8].

                                        [2]: Excerpts from Wikipedia about ALGOL 60 [9]: It followed ALGOL 58 [..]
                                        introduced code blocks [..] nested function definitions with lexical scope.
                                        [..] ALGOL 60 was used mostly by research computer scientists [..]. Its use
                                        in commercial applications was hindered by the absence of standard
                                        input/output facilities in its description and the lack of interest in the
                                        language by large computer vendors.

                                        [3]: BCPL: Martin Richards had joined the CPL project. He set to work
                                        developing a limited version of CPL that could be made available to users.
                                        This “BCPL”, the ‘B’ standing for “Basic”, would need to have an effective
                                        compiler. [..] in 1966, Richards brought BCPL [..] to Massachusetts [..]. BCPL
                                        is a “bootstrap” language because its compiler is capable of self-compiling.
                                        [..] a small chunk of the BCPL compiler was written in assembly or machine
                                        code, and the rest of the compiler would be written in a corresponding subset
                                        of BCPL. The section of the compiler written in BCPL would be fed into the
                                        section written in assembly code, and the resultant compiler program could be
                                        used to compile any program written in BCPL. [..] Bootstrapping compilers
                                        dramatically simplifies the process of porting a language from one computer
                                        [..] to another. [..] While Richards was working on the BCPL compiler at MIT,
                                        the institute was engaged in the Multics project with Bell Labs and GE. [..]
                                        He (Ken Thompson from Bell Labs) downloaded it (BCPL) to a Bell Labs mainframe
                                        and began working with it. [..] The PDP-7 had 8192 “words” of memory [..].
                                        Unix took up the first 4k, leaving 4k free for running programs. Thompson took
                                        his copy of BCPL [..] and further compressed it so that it would fit into the
                                        available 4k of memory on the PDP-7.

                                        [4]: SMALGOL: [..] Thompson took his copy of BCPL [..] and further compressed
                                        it so that it would fit into the available 4k of memory on the PDP-7. In the
                                        course of doing this, he borrowed from a language he had encountered while a
                                        student at the University of California, Berkeley. That language, “SMALGOL”,
                                        was a subset of ALGOL 60 designed to run on less powerful computers.

                                        [5]: B: The language that Thompson eventually ended up using on the PDP-7 was,
                                        as he described it to Ars, “BCPL semantics with a lot of SMALGOL syntax”,
                                        meaning that it looked like SMALGOL and worked like BCPL. Because this new
                                        language consisted only of the aspects of BCPL that Thompson found most useful
                                        and that could fit into the rather cramped PDP-7, he decided to shorten the
                                        name “BCPL” to just “B”.

                                        [6]: NB: [..] when Bell Labs purchased a PDP-11 for the department in 1971,
                                        Thompson decided it was time to rewrite Unix in a high-level programming
                                        language [..] At the same time, Dennis Ritchie had adopted B and was adapting
                                        it to run on more powerful computers. One of the first things that he added
                                        back into B was the ability to “type” variables. [..] Ritchie dubbed this
                                        modified language NB for “New B” [..] It was installed on the mainframes in
                                        the computing center at Murray Hill, which made it available to users
                                        throughout Bell Labs.

                                        [7]: C: [..] when Bell Labs purchased a PDP-11 for the department in 1971,
                                        Thompson decided it was time to rewrite Unix in a high-level programming
                                        language [..] he started with NB. His first tries ended in failure and, “being
                                        an egoist, I blamed it on the language”, Thompson recalled at VCF with a
                                        chuckle. With each failure, Ritchie added features back into NB in a manner
                                        that made sense to him and Ken, and once he added structures [..] Thompson was
                                        able to write Unix in this new language. Ritchie and Thompson saw the addition
                                        of structures, which could be found nowhere in B, SMALGOL, BCPL or CPL, as a
                                        change significant enough to warrant renaming the programming language, and B
                                        became C.

                                        1. 5

                                          This was an entirely seamless upgrade for me. nixos-rebuild switch --upgrade + 5 minutes waiting == new shiny. I haven’t had this easy of a time upgrading since I left BSD-land.

                                          1. 1

                                            I’m doing this at the moment, after having read your comment. But it has now been an hour and it still isn’t finished. It compiles quite a lot of stuff that it didn’t in 20.03 - I could see Thunderbird and LibreOffice just by glancing at the console from time to time. I hope it finishes soon; honestly, I hope future updates will be much quicker.

                                            Another difference I have seen is that it doesn’t allow running plasma5 (KDE) and gnome3 at the same time - it reported a conflict, so I removed plasma5. That’s strange, because in 20.03 they worked together.

                                            EDIT: The upgrade from 20.03 to 20.09 finally finished. It took ca. 3 hours. I don’t know why, but a lot of programs that are normally downloaded pre-built had to be recompiled this time.

                                          1. 2

                                            I think this is interesting, and should perhaps be applied to programming languages as well.

                                            Hyper-inefficient programming languages like Ruby, Python, Haskell, etc. produce far more CO2 than, for example, C.

                                            1. 5

                                              Hyper-inefficient programming languages like Ruby, Python, Haskell

                                              I hope you realize that Haskell’s performance is much closer to C’s than to Python’s. Haskell usually ranks around the likes of Java in language benchmarks. Either way you look at it, it doesn’t deserve to be called “hyper-inefficient”…

                                              1. 3

                                                I think it depends on how much energy is used developing and compiling code vs energy used during all times the program is run. I expect that equivalent C and Haskell programs take similar amounts of energy to run, and that the Haskell one takes a lot more energy to compile, but less time (and therefore less idle-time energy) to develop. This would make them similarly energy-expensive for most use-cases.

                                                Scripting languages may require less develop-time energy, but more run-time energy. If run only a few times, they’d use less energy than would be spent writing, compiling, debugging, and running a C program. Run many times, they would lose out to the finished C program.
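                                                That trade-off can be made concrete with a toy model; all figures below are invented purely for illustration, not measurements:

                                                ```python
                                                # Toy model: total energy = one-time development/compile energy
                                                # plus per-run energy times number of runs. All figures are made up.
                                                def total_energy_kwh(dev_kwh, runs, run_kwh):
                                                    return dev_kwh + runs * run_kwh

                                                # Hypothetical: the compiled program costs more up front, less per run.
                                                compiled = total_energy_kwh(dev_kwh=50.0, runs=1_000_000, run_kwh=0.0001)
                                                script   = total_energy_kwh(dev_kwh=10.0, runs=1_000_000, run_kwh=0.001)

                                                print(compiled)  # 150.0
                                                print(script)    # 1010.0
                                                ```

                                                With these made-up numbers the script wins when run only a handful of times and loses badly at a million runs, which is the crossover argument above in miniature.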

                                                1. 3

                                                  That’s actually a very relevant point. My first reaction to your comment was, “but who cares, you build only once”, but that’s not true. I have a beefy laptop that I’ve bought specifically to support a comfortable Haskell development experience. The IDE tooling continuously compiles your code behind the scenes. Then my team also has a very beefy EC2 instance that serves as a CI environment, it builds all the branches all the time. Then we’re also employing various ways of deploying the application and that also means it gets built in various ways per release image. All of that probably adds up to an energy consumption amount that’s comparable to a significant number of users running the application.

                                                  1. 2

                                                    Then we should include maintenance cost as well. I believe that over the lifetime of a program the energy put into the initial development is only a part, most probably a smaller part, of the energy needed to maintain it: bug fixing, updates, etc. In this case, theoretically, Haskell should have an advantage, because the language, due to its type-safety restrictions, will force you to make fewer mistakes, both in design and in terms of bugs. I don’t have any numbers to support these claims, it’s just a gut feeling, so don’t take it too seriously.

                                                2. 4

                                                  There actually have been studies on that question, eg: https://greenlab.di.uminho.pt/wp-content/uploads/2017/10/sleFinal.pdf

                                                  1. 1

                                                    I love that paper. If you’re looking for a quick heuristic, energy efficiency strongly correlates with performance. Compare those numbers to these: https://benchmarksgame-team.pages.debian.net/benchmarksgame/which-programs-are-fastest.html

                                                1. 2

                                                  I have just discovered this document and found it a fantastic read (roughly 1 hour). It triggered, at least for me, the following remarks and questions:

                                                  • It seems that in 1974 user groups did not yet exist: I take it from the following quote: “Also given for new files is a set of seven protection bits. Six of these specify independently read, write, and execute permission for the owner of the file and for all other users.” If that’s the case, it would be interesting to know when the concept of user groups was introduced and what use cases pushed for it.

                                                  • It is said that the set-user-ID seems to solve the MOO accounting problem: Does anybody know what this accounting problem was about?

                                                  • Is it possible that quotas had not yet been introduced? I take this from the following quote: “The simplest reasonably fair algorithm seems to be to spread the charges equally among users who have links to a file. The current version of UNIX avoids the issue by not charging any fees at all.” Questions this raises: When were quotas introduced? Is quota functionality standardized among modern UNIXes? If so, what are the limits and inconsistencies of standardized quotas?

                                                  • The error stream stderr seems not to have been invented yet in 1974. I take this from the quote: “Programs executed by the Shell, however, start off with two open files which have file descriptors 0 and 1.”

                                                  • Evolution through hackability, expressiveness and source code availability: The authors emphasize the importance of hackability, expressiveness and availability of source code for the evolution, or as they call it “maintenance”, of the system. Hackability quote: “First, since we are programmers, we naturally designed the system to make it easy to write, test, and run programs.” Expressiveness quotes: “Given the partially antagonistic desires for reasonable efficiency and expressive power, the size constraint has encouraged not only economy but a certain elegance of design.” “Another important aspect of programming convenience is that there are no “control blocks” with a complicated structure partially maintained by and depended on by the file system or other system calls.” Quote about availability of source code: “Since all source programs were always available and easily modified online, we were willing to revise and rewrite the system and its software when new ideas were invented, discovered, or suggested by others.”

                                                  • Stability/reliability: Quote: “The longest uninterrupted up time was about two weeks. Service calls average one every three weeks, but are heavily clustered. Total up time has been about 98 percent of our 24-hour, 365-day schedule.”

                                                  • Is there any document that describes in such a concise way what principal changes have been introduced in Plan9?

                                                  1. 3

                                                    It is said that the set-user-ID seems to solve the MOO accounting problem: Does anybody know what this accounting problem was about?

                                                    MOO was a simple number guessing game. What made it interesting in its original computer implementation was that it maintained a high-score table. When a user guessed right, the high score table had to be modified.

                                                    The problem was that the high score table was a file not owned by the user. For everyone to be able to update it, it had to be writable by everyone, which made cheating trivial.

                                                    Setuid allowed the file to be owned by the “MOO user” and writable only by that user, but the command to update it could be run by anyone.

                                                    Basically it was the same mechanism that passwd uses to update user passwords, and that various multi-user games have used on UNIX since time immemorial.
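                                                    As a side note, the mechanism is easy to poke at on any modern UNIX; setting the bit on a file you own requires no special privileges (the game/scores names below are just illustrative, not from the paper):

                                                    ```shell
                                                    touch game            # stand-in for the game binary
                                                    chmod u+s,a+rx game   # set the setuid bit plus execute permission
                                                    ls -l game            # owner execute slot shows 's': -rwsr-xr-x
                                                    # In the MOO setup the binary would be owned by a dedicated user,
                                                    # and the score file made writable only by that user, e.g.:
                                                    #   chown moo game scores; chmod 600 scores
                                                    ```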

                                                    I remember reading about the MOO example specifically in some “hardening UNIX” book/article at some point. I can’t remember exactly where, sorry.

                                                  1. 14

                                                    Why did Haskell’s popularity wane so sharply?

                                                     What is the source for the claim that Haskell’s popularity is declining so sharply? Is there really any objective evidence for this, I mean numbers, statistics, etc.?

                                                     It’s anecdotal and just my personal impression from observing the Haskell reddit 1 for 10 years, but I have never seen so many Haskell resources, conferences, books and even job postings as now. I do not at all have the impression that the language is dying. It has accumulated cruft, has some inconsistencies, and is struggling to get a new standard proposal out, but other than that I have the impression that it attracts quite a few people who come up with new ideas.

                                                    1. 2

                                                      Haskell had its glory days when SPJ/Marlow were traveling to various conferences talking about the new language features. Milewski’s posts, LYAH, Parsec, STM, and Lenses are from that era. The high-brow crowd was of course discussing Lenses. Sure, these things drove adoption, and there’s a little ecosystem for the people who got on the Haskell bandwagon back then.

                                                      What innovation has it had over the last 5 years? The community couldn’t agree on how to implement any of the features of a respectable dependent-type system, so they invented a bunch of mutually incompatible flags, and destroyed the language. Thanks to the recent hacking, GHC is plastered with band-aids.

                                                      It’s true that you can’t explain these things with some points on a pretty graph, but that doesn’t make it anecdotal. Look at the commits going into ghc/ghc, and look at the activity on the bread-and-butter Haskell projects: lens, trifecta, cloud-haskell. Maintenance mode. Where are the bold new projects?

                                                      1. 23

                                                        These assertions about Haskell are all simply false. There are plenty of problems with Haskell, we don’t need to add ones that aren’t true.

                                                        The community couldn’t agree on how to implement any of the features of a respectable dependent-type system, so they invented a bunch of mutually incompatible flags, and destroyed the language. Thanks to the recent hacking, GHC is plastered with band-aids

                                                        The reason GHC didn’t just turn on all flags by default is that many of them are mutually incompatible, so your individual .hs file has to pick a compatible set of language features it wants to work with.

                                                        You keep saying this in multiple places, but it’s not true. Virtually no GHC extensions are incompatible with one another. You have to work hard to find pairs that don’t get along and they involve extremely rarely used extensions that serve no purpose anymore.

                                                        The community is also not divided on how to do dependent types. We don’t have two camps and two proposals to disagree about. The situation is that people are working together to figure out how to make them happen. GHC also doesn’t contain bad hacks for dependent types, avoiding this is exactly why building out dependent types is taking time.

                                                        That being said, dependent types work today with singletons. I use them extensively. It is a revolution in programming. It’s the biggest step forward in programming that I’ve seen in 20 years and I can’t imagine life without them anymore, even in their current state.
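
                                                        To make that concrete, here is a minimal sketch of the kind of type-level programming that singletons mechanizes, using only stock GHC extensions; the `Nat`/`Vec` names here are my own illustration, not taken from the singletons library itself:

                                                        ```haskell
                                                        {-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

                                                        -- Type-level naturals, promoted to the kind level by DataKinds.
                                                        data Nat = Z | S Nat

                                                        -- A vector that carries its length in its type.
                                                        data Vec (n :: Nat) a where
                                                          VNil  :: Vec 'Z a
                                                          VCons :: a -> Vec n a -> Vec ('S n) a

                                                        -- A total head function: the empty case is excluded by the
                                                        -- type, not by a runtime check.
                                                        vhead :: Vec ('S n) a -> a
                                                        vhead (VCons x _) = x

                                                        main :: IO ()
                                                        main = print (vhead (VCons (1 :: Int) (VCons 2 VNil)))
                                                        ```

                                                        Here `vhead VNil` is rejected at compile time; that is the flavor of bug prevention being claimed, with singletons extending it much further.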

                                                        Look at the commits going into ghc/ghc, and look at the activity on the bread-and-butter Haskell projects: lens, trifecta, cloud-haskell. Maintenance mode. Where are the bold new projects?

                                                        Haskell is way more popular today than it was 5 years ago, and 10 years ago, and 20 years ago. GHC development is going strong; for example, we just got linear types, a huge step forward. There’s been significant money lately from places like cryptocurrency startups. For the first time I regularly see Haskell jobs advertised. What is true is that the percentage of Haskell questions on Stack Overflow has fallen, but not the amount: the size of Stack Overflow exploded.

                                                        Even the community is much stronger than it was 5 years ago. We didn’t have Haskell Weekly news for example. Just this year a category theory course was taught at MIT in Haskell making both topics far more accessible.

                                                        Look at the commits going into ghc/ghc

                                                        Let’s look. Just in the past 4 years we got: linear types, a new low-latency GC, compact regions, deriving strategies & deriving via, much more flexible kinds, all sorts of amazing new plugins (type plugins, source plugins, etc.) that extend the language and provide reliable tooling that was impossible 5 years ago, much better partial type signatures, visible type applications (both at the term level and the type level), injective type families, type in type, strict by default mode. And much more!

                                                        This totally changed Haskell. I don’t write Haskell the way I did 5 years ago, virtually nothing I do would work back then.

                                                        It’s not just GHC. Tooling is amazing compared to what we had in the past. Just this year we got HLS so that Haskell works beautifully in all sorts of editors now from Emacs, to vscode, to vim, etc.

                                                        look at the activity on the bread-and-butter Haskell projects: lens, trifecta, cloud-haskell. Maintenance mode. Where are the bold new projects?

                                                        lens is pretty complete as it is and is just being slowly polished. Haskell packages like lens are based on a mathematical theory and that theory was played out. That’s the beauty of Haskell, we don’t need to keep adding to lens.

                                                        I would never use trifecta today, megaparsec is way better. It’s seen a huge amount of development in the past 5 years.

                                                        There are plenty of awesome Haskell packages. Servant for example for the web. Persistent for databases. miso for the frontend. 5 years ago I couldn’t dream of deploying a server and frontend that have a type-checked API. For bold new ideas look at all the work going into neural network libraries that provide type safety.

                                                        I’m no fanboy. Haskell has plenty of issues. But it doesn’t have the issues you mentioned.

                                                        1. 1

                                                          Right. Most of my Haskell experience is dated: from over five years ago, and the codebase is proprietary, so there are few specifics I can remember. I’m definitely not the best person to write on the subject. In any case, I’ve rewritten the Haskell section of the article, with more details. Thanks.

                                                          1. 6

                                                            By my definition, a “dying language” is one that is losing popularity or losing interest. For Haskell this is absolutely not clear. Also, your section is about “why Haskell is bad”, not “why it is dying”. People do not talk about Haskell the way they used to, in my opinion, but I still see a lot of activity in the Haskell ecosystem. And it doesn’t really look like it’s dying.

                                                            I think it is easier to agree about Clojure dying looking at Google trends for example: https://trends.google.com/trends/explore?cat=5&date=all&geo=US&q=haskell,clojure

                                                            But Haskell looks more like a language that will never die but still probably never become mainstream.

                                                            1. 5

                                                              I’m definitely not the best person to write on the subject. In any case, I’ve rewritten the Haskell section of the article, with more details. Thanks.

                                                              Great! Although there are still many issues that are factually untrue.

                                                              I think this is just a sign that you’ve been away from the community for many years now, and don’t see movement on the things that were hot 5-10 years ago. Like “The high-brow crowd was obsessed with transactional memory, parser combinators, and lenses.” Well, that’s over. We figured out lenses and have great libraries; we figured out parser combinators and have great libraries. The problems people are tackling now for those packages are engineering problems, not so much science problems. Like: how do we have lenses and good type errors? And there, we’ve had awesome progress lately with custom error messages https://kodimensional.dev/type-errors that you would not have seen 5 years ago.
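
                                                              As a sketch of what such custom error messages look like under the hood, here is the stock `TypeError` mechanism from GHC.TypeLits (not the linked library itself), applied to the classic “Show for functions” example:

                                                              ```haskell
                                                              {-# LANGUAGE DataKinds, TypeOperators #-}
                                                              {-# LANGUAGE FlexibleInstances, UndecidableInstances #-}

                                                              import GHC.TypeLits (ErrorMessage (..), TypeError)

                                                              -- An intentionally unsatisfiable instance: trying to `show` a
                                                              -- function now fails at compile time with this message instead
                                                              -- of the generic "No instance for (Show (a -> b))" wall of text.
                                                              instance TypeError ('Text "Functions cannot be printed."
                                                                                  ':$$: 'Text "Perhaps you applied too few arguments?")
                                                                    => Show (a -> b) where
                                                                show = error "unreachable"

                                                              main :: IO ()
                                                              main = putStrLn (show (42 :: Int))  -- ordinary Show instances still work
                                                              ```

                                                              Writing `show id` here produces the custom two-line message at compile time, which is the building block the linked post elaborates on.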

                                                              The science moved on to other problems.

                                                              The issue is that different extensions interact in subtle ways to produce bugs, and it’s very difficult to tell if a new language extension will play well with the others (it often doesn’t, until all the bugs are squashed, which can take a few years).

                                                              This still isn’t true at all. As for the release cadence of GHC, again, things have advanced amazingly. New test environments and investments have resulted in regular GHC releases. We see several per year now!

                                                              In Atom, the Haskell addon was terrible, and even today, in VSCode, the Haskell extension is among the most buggy language plugins.

                                                              That was true a year ago, it is not true today. HLS merged all efforts into a single cross-editor package that works beautifully. All the basic IDE functionality you would want is a solved problem now, the community is moving on to fun things like code transformations.

                                                              Then there’s Liquid Haskell that allows you to pepper your Haskell code with invariants that it will check using Z3. Unfortunately, it is very limited in what it can do: good luck checking your monadic combinator library with LH

                                                              Not true for about 3 years. For example: https://github.com/ucsd-progsys/liquidhaskell/blob/26fe1c3855706d7e87e4811a6c4d963d8d10928c/tests/pos/ReWrite7.hs

                                                              The worst case plays out as follows: the typechecker hangs or crashes, and you’re on the issue tracker searching for the issue; if you’re lucky, you’ll find a bug filed using 50~60% of the language extensions you used in your program, and you’re not sure if it’s the same issue; you file a new issue. In either case, your work has been halted.

                                                              In 15 years of using Haskell I have never run into anything like this. It is not the common experience. My code is extremely heavy and uses many features only available in the latest compiler, with 20-30 extensions enabled. Yet this just doesn’t happen.

                                                              There is almost zero documentation on language extensions. Hell, you can’t even find the list of available language extensions with some description on any wiki.

                                                              Every single version of GHC has come with a list of the extensions available, all of which have a description, most of which have code: https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/glasgow_exts.html You can link to the manual that neatly explains everything, rather than to the git repo.

                                                              Looking at the big picture: first, this is a poor way to do software development; as the number of language extensions increases, your testing burden increases exponentially.

                                                              This is only true if you can’t prove how extensions interact, or more fundamentally, that they don’t interact.

                                                              Second, the problem of having a good type system is already solved by a simple dependent type theory; you study the core, and every new feature is just a small delta that fits in nicely with the overall model.

                                                              That’s totally untrue. There is no such general-purpose language today. We have no idea how to build one.

                                                              As opposed to having to read detailed papers on each new language extension. And yes, there’s a good chance that very few people will be able to understand your code if you’re using some esoteric extensions.

                                                              Again, that’s just not true. You don’t need to know how the extensions are implemented. I have not read a paper on any of the extensions I use all the time.

                                                              In summary, language extensions are complicated hacks to compensate for the poverty of Haskell’s type system.

                                                              That’s just the wrong way to look at language extensions. Haskell adds features with extensions because the design is so good. Other languages extend the language forcing you into some variant of it because their core is too brittle and needs fundamental changes. Haskell’s core is so solid we don’t need to break it.

                                                              However, PL research has shifted away from Haskell for the most part

                                                              That’s again totally factually untrue. Just look at Google Scholar, the number of Haskell papers per year is up, not down. The size of the Haskell workshop at ICFP is the same as 5 years ago.

                                                              Moreover, there are no tools to help you debug the most notorious kind of bug seen in a complicated codebase: memory blowups caused by laziness.

                                                              Again, that’s not factually true.

                                                              We have had a heap profiler for two decades, in the past few years we got ThreadScope to watch processes in real time. We have systematic ways to find such leaks quickly, you just limit the GC to break when leaks happen. https://github.com/ndmitchell/spaceleak We also got stack traces in the past few years so we can locate where issues come from. In the past few years we got Strict and StrictData.
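
                                                              For readers who have not hit one: the textbook laziness leak and its fix look like this (a minimal self-contained sketch; `leakySum`/`strictSum` are my own names):

                                                              ```haskell
                                                              {-# LANGUAGE BangPatterns #-}

                                                              import Data.List (foldl')

                                                              -- Lazy foldl builds a chain of a million (+) thunks before
                                                              -- forcing any of them: the classic heap blowup that shows up
                                                              -- as a spike in the heap profiler.
                                                              leakySum :: [Int] -> Int
                                                              leakySum = foldl (+) 0

                                                              -- A strict accumulator keeps the fold in constant space;
                                                              -- Data.List.foldl' (or StrictData for fields) does the same job.
                                                              strictSum :: [Int] -> Int
                                                              strictSum = go 0
                                                                where
                                                                  go !acc []       = acc
                                                                  go !acc (x : xs) = go (acc + x) xs

                                                              main :: IO ()
                                                              main = print (strictSum [1 .. 1000000])  -- prints 500000500000
                                                              ```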

                                                              As for the code examples: I can pick 2 lines of any language out of context and you’ll have no idea what they do.

                                                              Who cares what every extension does for every example? That’s the whole point! I have literally never looked at a piece of Haskell code and wondered what an extension does. I don’t need to know. GHC tells me when I need to add an extension and it tells me when an extension is unused.

                                                              How many more language features are missing?

                                                              Extensions are not missing language features.

                                                            2. 1

                                                              GHC also doesn’t contain bad hacks for dependent types, avoiding this is exactly why building out dependent types is taking time.

                                                              Honestly, I’d much rather prefer a simple core model, like that of HoTT.

                                                              1. 3

                                                                Honestly, I’d much rather prefer a simple core model, like that of HoTT.

                                                                I’d love that too! As would everyone!

                                                                But the reality is, we don’t know how to do that. We don’t even know how to best represent computations in HoTT. It might be decades before we have a viable programming language. We do have dependent types that work well in Haskell today, that I can deploy to prod, and that prevent countless bugs while making code far easier to write.

                                                                1. 1

                                                                  I think HoTT with computations is “cubical type theory”? It’s very active currently.

                                                                  As for the dependent types as the backend for advanced type level features, I think it’s what Dotty/scala 3 is about. It’s definitely not the only way to do it, but it’s also not decades away. Idris 2 is also an interesting effort.

                                                            3. 4

                                                              Dependent types aren’t that useful for production software, and full blown dependent types are really contrary to the goals of Haskell in a lot of ways. Any language that’s >20 years old (basically 30) is gonna have some band-aids. I’m not convinced that Haskell is waning in any meaningful way except that people don’t hype it as much on here/hn. Less hype and more doing is a good thing, imho.

                                                              1. 3

                                                                Reminds me of the days when people said FP and complete immutability weren’t useful for production software. It is true that there is no decent general-purpose language that implements dependent types, but that’s beside the point.

                                                                It’s true, hype is a poor measure.

                                                                1. 4

                                                                  Yeah, that’s an interesting comparison, but I think it’s a totally different situation. Immutability and dependent types are both things you do to make certain assumptions about your code: immutability lets you know that some underlying value won’t change, while dependent types let you make more general statements/proofs of some invariant.

                                                                  The big difference is that immutability is a simplification. You’re removing complexity by asserting some assumption throughout your code. Dependent types, generally, are adding complexity: you either have to provide proofs of some statement externally, or you have to build the proof of your invariants intrinsically into your constructions. IMHO, that’s a huge difference in the power-to-weight ratio of these two tools. Immutability is really powerful and fairly lightweight. Dependent types are not really that powerful and incredibly heavy.

                                                                  I’m not saying dependent types are worthless. Sometimes you really, really want that formal verification (e.g. compilers, cryptography, etc.). But the vast majority of code doesn’t need it, and you’re just adding complexity, something I think should be avoided in production software.

                                                                  1. 3

                                                                    TL;DR: I have a good amount of experience with dependently typed languages, and I write Haskell for a living. After all of my experience, I have come to the conclusion that dependent types are overhyped.

                                                                    1. 1

                                                                      I’ve started writing a post on dependent types. Here’s early draft: https://artagnon.com/articles/dtt

                                                                    2. 3

                                                                      What about Ada?

                                                              1. 6

                                                                This is by far my favorite version control system. Unlike git, I find branching in darcs intuitive and easy to understand.

                                                                1. 6

                                                                  Darcs doesn’t support branching in the same sense as Git, so I’m not sure that’s a fair comparison.

                                                                  The “every repo is a branch” and “Master and Working Repositories” workflows also work in Git: clone the upstream repo, then clone local “branch” repos from it. When you’re done, push to the local “master”, and then eventually back to upstream. You’d miss out on a lot of Git functionality, but it should work.
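
                                                                  Sketched out with purely local placeholder repositories (assumes Git ≥ 2.28 for `init -b`; `receive.denyCurrentBranch updateInstead` is the real setting that allows pushing into a checked-out non-bare repo):

                                                                  ```shell
                                                                  set -e
                                                                  tmp=$(mktemp -d); cd "$tmp"

                                                                  # A local "master" repository, standing in for the clone of upstream.
                                                                  git init -q -b main master
                                                                  git -C master -c user.name=me -c user.email=me@example.com \
                                                                      commit -q --allow-empty -m "initial"
                                                                  # Allow pushes into its currently checked-out branch.
                                                                  git -C master config receive.denyCurrentBranch updateInstead

                                                                  # A darcs-style "branch": just another clone sitting next to it on disk.
                                                                  git clone -q master feature-x
                                                                  git -C feature-x -c user.name=me -c user.email=me@example.com \
                                                                      commit -q --allow-empty -m "work on feature-x"
                                                                  git -C feature-x push -q origin main   # push back into the local "master"

                                                                  git -C master rev-list --count HEAD    # both commits arrived
                                                                  ```

                                                                  From there, pushing the local “master” back to the real upstream is an ordinary `git push`.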

                                                                  Personally, I think the Darcs way of doing it is really annoying. I like being able to keep around experimental branches, WIP features and bug fixes, etc. without cluttering my file system. And I like being able to push them all somewhere and clone them in different places with a single “git clone …”. I know under the hood Git’s keeping all that data around, but it’s hidden away in .git where I don’t have to think about it.

                                                                  1. 5

                                                                    I’d love to hear more about how darcs’ branching is different than git’s. Care to give us some more insight?

                                                                    1. 2

                                                                      Indeed. Unlike git, I never find myself needing to rm -rf . my darcs repo.

                                                                      1. 9

                                                                        I’ve never needed to do that to a git repo either

                                                                        1. 1

                                                                          Neither have I; hopefully I can at least encourage people to learn/use git reflog. It’s rare that you need it, but it comes in super handy when you’ve made a mistake.

                                                                      2. 1

                                                                        Do you by chance know how darcs handles big non-text files?

                                                                        1. 3

                                                                          It’s been a while, but as far as I remember, Darcs generally works a bit differently from git: it downloads all patch metadata, figures out which patches it needs to reconstruct the current work tree, and then downloads the data. So binaries are at least not part of the bundle you usually work with.

                                                                          AFAIK, it works quite well with them. I know it’s been a promoted strong-point of pijul.

                                                                      1. 7

                                                                        Looking at the changelog for the latest Darcs version, 2.16.1 (emphasis mine):

                                                                        Preliminary UNSTABLE support for a new patch theory named “darcs-3”, largely based on the pioneering work of Ian Lynagh for ‘camp’.

                                                                        Please note that this format is not yet officially supported: some features (like conversion from older formats) are still missing, and we have not yet finalized the on-disk format. You should NOT use it for any serious work yet.

                                                                        The new theory finally solves all the well-known consistency problems that plagued the earlier ones, and thus fixes a number of issues (including issue1401 and issue2605) that have been outstanding for many years. It also reduces the worst case asymptotic runtime for commutation and merging from exponential to merely quadratic in the number of patches involved.

                                                                        One of the reasons we are confident this new theory and its implementation is sound, i.e. respect all required properties, is that we have improved our test case generator for sequences of patches. It now generates all possible conflict scenarios. Since the new theory no longer has worst case exponential runtime, we can and did test all required properties and invariants with a large number of generated test cases (up to 100000).

                                                                        Can somebody report how these problems manifest in real life, and what the workarounds are? Does it mean that using darcs now, i.e. before the “darcs-3” patch theory becomes mature, is dangerous?

                                                                        1. 6

                                                                          Wait, this seems like huge news to me!?

                                                                          largely based on the pioneering work of Ian Lynagh for ‘camp’.

                                                                          Can’t find anything more recent than 2011… is there a paper?

                                                                          1. 3

                                                                            Darcs had/has a “poison patch” issue where one patch may trigger very long checkout times, up to a point where it would render your repo useless. Most people would never have that problem, but it did happen to me back in 2004, and I then switched to Mercurial after trying Monotone and Git. You may want to check out Pijul, as it’s the spiritual successor of Darcs.

                                                                            1. 2

                                                                              Uh, I’m trying to remember. If I remember right, darcs had an issue with branches that drifted too far away from each other. As darcs tracks which patches are needed to apply a certain patch, this led to problematic behaviour. From my experience, that was unlikely, but likely enough that it became an issue, especially on large repositories. There are fixes for that, but that’s a space where you need deep knowledge of darcs, which runs against the ethos of the project of going the extra mile to make its technology accessible. (For the last part: I was lurking on darcs for a while and tried contributing 1-2 patches.)

                                                                              1. 2

                                                                                There are two issues:

                                                                                1. When a conflict between Alice and Bob is solved in two different ways by Alice and by Bob, this generates a new conflict. If they keep doing that n times, applying the patches in Darcs used to take time 2^n. Solving that is really cool, because AFAIK no one really understood until recently what Darcs was doing in that case.

                                                                                2. Conflicts are a little strange in Darcs, since there is no notion of “state”, so coming back to a conflicting situation (with e.g. darcs revert, or darcs rollback) is always going to be a little bit weird.

                                                                                Solving point 1 is big news.

                                                                              1. 2

                                                                                As someone who is in V’s Discord every day being constantly blown away at the progress being made, I am shocked at the level of dishonesty that this strangely anti-V hit piece achieves.

                                                                                In particular, the degree of cherry-picking (and often then still misrepresenting) a few facts in order to make V appear in the worst possible light is truly breathtaking.

                                                                                She cites vlang.io saying

                                                                                V can be bootstrapped in under a second by compiling its code translated to C with a simple

                                                                                cc v.c

                                                                                No libraries or dependencies needed.

                                                                                then argues against it, preposterously, by saying in part,

                                                                                Git is a dependency, which means perl is a dependency, which means a shell is a dependency, which means glibc is a dependency, which means that a lot of other things (including posix threads) are also dependencies. …

                                                                                Downloading a .c source file requires git? Does this person know what a “dependency” is? Should JavaScript developers include depending upon the laws of physics in package.json?

                                                                                Amusingly, the documentation still claims that memory management is both a work in progress and has perfect accuracy for cleaning up things at compile time.

                                                                                No, the documentation correctly says that memory management is a work in progress, and also that, once completed, will clean up after itself in much the way that Rust does.

                                                                                An Honest Depiction of Progress

                                                                                Here are the combined release notes from all of the V releases since December:

                                                                                Release 0.1.23:

                                                                                - [Direct x64 machine code generation](https://github.com/vlang/v/issues/2849). Hello world being built in 3 milliseconds.
                                                                                - Bare metal support via the `-freestanding` flag, allowing to build programs without linking to libc.
                                                                                - Prebuilt V packages for Linux, macOS, and Windows.
                                                                                - `string.index()` now returns `?int` instead of `int/-1`.
                                                                                - Lots of fixes in Generics.
                                                                                - vweb framework for developing web applications is back.
                                                                                - Vorum, the forum/blogging software written in V/vweb, can now be compiled and has been added to CI.
                                                                                - REPL, `v up` have been split up into separate applications to keep the core V compiler small.
                                                                                - V now enforces short enum syntax (`.green` instead of `Color.green`) when it's enough.
                                                                                - V UI for macOS.
                                                                                - Interfaces have been rewritten. `[]interface` support.
                                                                                - `os.cp()` for copying files and directories.
                                                                                - Additional compile-time flags: `$if clang, msvc, mingw, x32, x64, big_endian, little_endian {`.
                                                                                - All C functions now have to be declared, all missing C functions have been defined.
                                                                                - Global variables (only with the `--enable-globals` flag) for low level applications like kernels and drivers.
                                                                                - Nothing can be cast to bool (previously code like `if bool(1) {` worked).
                                                                                - `<<` and `>>` now work with all integer types.
                                                                                - V detects Cygwin and shows an error. (V supports Windows natively)
                                                                                - Improved type checking of some operators (`%, |, &` etc).
                                                                                - Windows 7 support.
                                                                                - `println(true)` now prints `true` instead of `1`.
                                                                                - `os.exec()` now uses `CreateProcess` on Windows.
                                                                                - fast.vlang.io website for monitoring the performance of V after every commit.
                                                                                - On Windows Visual Studio is now used automatically if GCC is not installed.
                                                                                - vfmt!
                                                                                - Lots of cleaning up in the compiler code.
                                                                                - Multi-level pointers in unsafe code (`****int`).
                                                                                - MSVC backtrace.
                                                                                - `$if os {` blocks are now skipped on a different OS.
                                                                                - C string literals (`c'hello'`).
                                                                                - AlpineLinux/musl fixes + added to CI.
                                                                                - Inline assembly.
                                                                                - Clipboard module (Windows, macOS, X).
                                                                                - `foo()?` syntax for error propagation.
                                                                                - Docs have been migrated from HTML to `doc/docs.md`.
                                                                                - `eventbus` module.
                                                                                - Haiku OS support.
                                                                                - `malloc/free` on bare metal.
                                                                                - `utf8` helper functions (`to_lower()`, `to_upper()`, etc).
                                                                                - Optimization of `for c in str {`.
                                                                                - `string/array.left/right/slice/substr` were removed (`[a..b]` slicing syntax should be used instead).

                                                                                Release 0.1.24:

                                                                                - A new parser/generator built on top of an AST that simplifies the code greatly and allows new
                                                                                  backends to be implemented much faster.
                                                                                - Sum types (`type Expr = IfExpr | MatchExpr | IntegerLiteral`).
                                                                                - B-tree map (sped up the V compiler by ~10%).
                                                                                - `v fmt -w`.
                                                                                - The entire code base has been formatted with vfmt.
                                                                                - Generic structs.
                                                                                - SDL module.
                                                                                - Arrays of pointers.
                                                                                - os: `is_link()`, `is_dir()`, `exists()`.
                                                                                - Ranging through fixed size arrays.
                                                                                - Lots of fixes in ORM and vweb.
                                                                                - The first tutorial: [building a simple web application with vweb](https://github.com/vlang/v/blob/master/tutorials/building-a-simple-web-blog-with-vweb.md).
                                                                                - Match expressions now must be exhaustive.
                                                                                - freestanding: `malloc()`/`free()`.
                                                                                - `++` is now required instead of `+= 1` for consistency.
                                                                                - Interpolated strings now allow function calls: `println('val = $get_val()')`.
                                                                                - `string.replace_each([])` for an efficient replacement of multiple values.
                                                                                - More utf8 helper functions.
                                                                                - `-prealloc` option for block allocations.
                                                                                - `type` aliases.
                                                                                - Running `v` with an unknown command will result in an error.
                                                                                - `atof` implementation in pure V.
                                                                                - Enums can now have negative values.
                                                                                - New `filepath` module.
                                                                                - `math.factorial`.
                                                                                - `ftp` module.
                                                                                - New syntax for casting: `val as Type`.
                                                                                - Fewer libc functions used (soon V will have no dependency on libc).

                                                                                Release 0.1.27:

                                                                                - `vfmt` has been re-written from scratch using the new AST parser. It's much faster, cleaner, and can format
                                                                                files with compilation errors.
                                                                                - `strconv`, `sprintf`, and `printf` in native V, without any libc calls.
                                                                                - Interfaces are now a lot more stable and have all expected features.
                                                                                - Lots of x64 backend improvements: function calls, if expressions, for loops, local variables.
                                                                                - `map()` and `filter()` methods can now be chained.
                                                                                - New `[]int{cap:cap, len:len}` syntax for initializing array length and capacity.
                                                                                - New `is` keyword for checking the type of sum types and interfaces.
                                                                                - `as` can now be used to cast interfaces and sum types.
                                                                                - Profiling with `-profile`. Prints a nice table with detailed information about every single function call:
                                                                                number of calls, average time per call, total time per function.
                                                                                - `import(xxx)` syntax has been removed in favor of `import xxx` for simplicity and greppability.
                                                                                - Lots of fixes and improvements in the type checker.
                                                                                - `time.StopWatch`
                                                                                - `dl` module for dynamic loading.
                                                                                - Automatic `str()` method generation for every single type, including all arrays and fixed size arrays.
                                                                                - Short struct initialization syntax for imitating named function args: `foo(bar:0, baz:1)`.
                                                                                - New operator `!in`.
                                                                                - Performance improvements in critical parts of the builtin data structures (array, map).
                                                                                - Higher-order function improvements (functions can now be returned, etc).
                                                                                - Anonymous functions that can be defined inside other functions.
                                                                                - Built-in JSON module is back.
                                                                                - Closures.
                                                                                - Lots and lots of new tests added, including output tests that test error messages.
                                                                                - Multiple errors are now printed, the compiler no longer stops after the first error.
                                                                                - The new JS backend using the AST parser (almost complete).
                                                                                - Variadic functions.
                                                                                - `net.websocket` module (early stage).
                                                                                - `vlib` is now memory leak free, lots of `autofree` improvements.
                                                                                - Simplified and cleaned up `cmd/v`, `v.builder`.
                                                                                - V UI was updated to work with the new backend.

                                                                                After she COMPLETELY ignores the MASSIVE progress made (more than 3000 commits’ worth) from a brilliant and fiercely dedicated team – and judges the current state of V based exclusively on misunderstandings, nitpicks, and on its memory management status after acknowledging that it’s not done yet and that the language is in an alpha state – she snarkily ends with:

                                                                                Overall, V looks like it is making about as much progress as I had figured it would.

                                                                                This is almost as bad as the quote she ended with in her previous post on V:

                                                                                Don’t ever, ever try to lie to the Internet, because they will catch you. …

                                                                                Honesty, Please!

                                                                                If you want to know how well V is actually progressing, try it yourself, check out the Discord, look on GitHub, but whatever you do, do not focus on ignorant, dishonest, cherry-picked commentary from haters; that doesn’t serve anyone well, and is incredibly unfair to those who are pouring themselves into this important project.

                                                                                The Brilliance of V

                                                                                After my 11 years of programming, including 9.5 of programming in Go (which is the most similar language to V), I consider V to easily be the best-designed programming language that exists.

                                                                                Yes, it’s learned a lot from Go and C, and maybe Lisp people prefer Lisp, but V successfully combines the simplicity of Go, the programmer ergonomics of Python, the speed of C, and almost as many safety guarantees as Rust (once V has finished implementing these latter aspects, of course!).

                                                                                What I thought would take the V team 5 years to implement has taken less than 1 year. Alex (V’s creator) thought it would take even less time, and now he’s being unfairly raked over the coals for setting extremely ambitious timelines while the same naysayers and bullies ignore everything that has been done.

                                                                                V Resources

                                                                                Website (including code examples): https://vlang.io/

                                                                                GitHub: https://github.com/vlang/v

                                                                                Wiki page explaining why C is used as an intermediate representation rather than LLVM (another brilliant move that allows V to build on the shoulders of giants and avoid reinventing the wheel in order to bootstrap a new language, but a move that is misunderstood and absurdly used to argue against V for doing things differently/better): https://github.com/vlang/v/wiki/On-the-benefits-of-using-C-as-a-language-backend

                                                                                1. 26

                                                                                  I understand that you have strong feelings for your language of choice. Nonetheless, language designers are not entitled to a community, nor are they entitled to shelter from criticism. Indeed, one of the most important parts of programming language design is rejecting new languages based on showstoppingly-unacceptable design choices.

                                                                                  V does not offer any compelling design choices. Its advertised features can be sorted into libraries, compiler/toolchain offerings, and roughly the level of safety advertised in the 80s when memory-safety was still controversial. Just like Go, V has not learned many lessons, and refuses to offer programmers a more interesting way to express themselves. Even if V were literally Go but better, this would be sufficient to damn it.

                                                                                  I understand that you might not like it when people point out that the release dates keep slipping; I think it’s curious that you are willing to link to V’s wiki and source code, but not to bumping release dates.

                                                                                  As a language designer, I think that it is important to not advertise what you don’t yet have written. Monte has had one release, a developer preview, and we are intending to complete another iteration of bootstrapping before even considering another release. We know that almost every feature that typical end users will want is not yet written, and so we are not loudly advertising our offering as usable for everyday general-purpose programming, regardless of how much everyday general-purpose programming I or anybody else actually achieves with it.

                                                                                  I consider V to easily be the best-designed programming language that exists.

                                                                                  What’s your favorite ML? I have lately played with OCaml. There are entire universes of language designs which I suspect that you have yet to explore.

                                                                                  1. -4

                                                                                    Just like Go, V has not learned many lessons, and refuses to offer programmers a more interesting way to express themselves.

                                                                                    FP diehards will never understand why Go has been so wildly successful – and V will be even more successful than Go.

                                                                                    V is like Go but fixes all ~10 things wrong with it, providing a lot more flexibility due to its generic functions, generic structs, generic channels (still in the works), sum types, and TypeScript-style interfaces (also still partially in the works).

                                                                                    Plus there’s the raw speed factor; V is translated to C before being compiled to machine code, cleverly piggybacking on decades of speed optimizations made by gcc/clang/tcc/etc.

                                                                                    The simplicity of Go or Python + almost as much safety as Rust + almost exactly as much speed as C + a flexible type system + familiar syntax == a winning combination, I insist!

                                                                                    1. 17

                                                                                      The simplicity of Go or Python + almost as much safety as Rust + almost exactly as much speed as C + a flexible type system + familiar syntax == a winning combination, I insist!

                                                                                      Except all of these are “some time in the future”, and widely incompatible with one another. There’s nothing to support any of these claims. What’s the design for “almost as much safety as rust” (without GC, of course)? The whole thing only just got an AST, and we’re supposed to believe it’s going to be revolutionary? There’s been a lot of grand promises with release dates being pushed back repeatedly, but nothing specific about how the promises will actually be achieved. Making a good language is hard, it takes years (if not decades), and you can’t just magically make something both simple, fast, safe, gc-free, etc. in a field where it’s known that some tradeoffs are inevitable.

                                                                                      1. -3

                                                                                        Except all of these are “some time in the future”, and widely incompatible with one another.

                                                                                        Nope, totally wrong. The simplicity is there, the speed is there, the flexible type system is there, and the familiar syntax is there. A safe subset of C is generated then compiled but not all the safety guarantees are implemented yet.

                                                                                        There’s been a lot of grand promises with release dates being pushed back repeatedly

                                                                                        V is the first software project in history to be finished later than originally intended ;-).

                                                                                        The whole thing only just got an AST

                                                                                        Completely false; V has always had an AST. The AST-related thing that’s new is representing the generated C code as an AST before outputting it.

                                                                                        … you can’t just magically make something both simple, fast, safe, gc-free, etc. in a field where it’s known that some tradeoffs are inevitable.

                                                                                        The big “a-ha” moment for me was this: I now realize that I had falsely assumed that just because prior languages took certain trade-offs that it was impossible to check all these boxes at once. But I was wrong.

                                                                                        The only inherent tension between any of the things I listed is between simplicity and flexibility. But as I said in the comment you’re replying to,

                                                                                        V is like Go but fixes all ~10 things wrong with it, providing a lot more flexibility due to its generic functions, generic structs, generic channels (still in the works), sum types, and TypeScript-style interfaces (also still partially in the works).

                                                                                        The limiting factor is not some innate impossibility of making a language that is simple, fast, and safe. The limiting factor is creativity. But V has learned much from Go, Rust, Python, and other languages, and has unique insights of its own (like its error handling!).

                                                                                        New things are, in fact, possible… and spelled out in detail on the website and in the docs, in this case. See for yourself: https://github.com/vlang/v/blob/master/doc/docs.md .

                                                                                        1. 9

                                                                                          the flexible type system is there

                                                                                          hydraz below convincingly demonstrated that when a function call involves generic types, the types are not checked at all(!) in current V. How can you say the type system is “there”? I guess it is “there” in terms of code generation, but if you are not checking types, saying the type system is there is at best deceptive.

                                                                                          1. -4

                                                                                            How can you say type system is “there”?

                                                                                            …because there are types you can define and instantiate and do all the usual things that programming languages let you do with types…

                                                                                            hydraz said,

                                                                                            type errors for parameters in functions with a <T> slapped on them are still silently ignored…

                                                                                            Silently ignored? If you use a generic type in a way that’s invalid then the program won’t compile (yes, during the C -> machine code step).

                                                                                            1. 9

                                                                                              I think you need to read my comments - and indeed, the compiler code that I linked - again. V has roughly no type system at all. The function, foo, that I wrote, isn’t generic!

                                                                                              • It does have a <T>, but there’s nothing to infer that T from (This should be a type error. It isn’t)
                                                                                              • It takes a string, but I can give it an int, and this should be an error, but the compiler has code specifically for silently ignoring these errors.
                                                                                          2. 8

                                                                                            spelled out in detail

                                                                                            Let’s see memory management: there’s no explanation, just claims there are (or will be, it’s wip after all) no leaks, no gc, no refcounting, but also no manual memory management (it’s hardly leak free, after all, even in Rust). What magic justifies that? Is there just no dynamic allocation? Otherwise I’d like to see the papers and subsequent Turing award for solving memory management once and for all.

                                                                                            As for the deadlines: the author of V has made ridiculous deadlines so many times, for no good reason (why promise something in a few weeks or months instead of just waiting for it to be polished?!). It’s not like open source projects are tied to pointy haired boss deadlines.

                                                                                        2. 13

                                                                                          Interestingly, I’m not an “FP diehard”; I come from an object-based tribe, and I work on object-based designs.

                                                                                          None of the listed improvements to V over Go are related to what makes Go bad; I have a thread from last year exploring the worst of Go’s flaws. In short, the problem isn’t even an upper limit on abilities, but a lower limit on how much code is required to do even the bare minimum, and a surprising lack of safety in common situations.

                                                                                          As the original post author and several others have repeatedly indicated throughout current and past discussion about V, the speed claims simply aren’t being substantiated in an open and reproducible configuration which the community can examine and test themselves. Simply changing the host language does not grant speed, unfortunately, because of interpretative overhead, and the only cure is putting effort into the compiler.

                                                                                          At the same time, it is toolchains and not languages that are fast, and so any novel speed improvements in V should be explicable to the rest of us. For example, in Monte, we use nanopass design and ANF style, originally explored in Scheme. We have a JIT in the Smalltalk/Java tradition, using an off-the-shelf toolkit, RPython.

                                                                                          As an aside, I would imagine that V would be able to more efficiently take advantage of GCC/LLVM/etc. by picking just one backend, and emitting code just for that backend. This would be due to C’s relatively poor guarantees about how memory will be used.

                                                                                          1. 5

                                                                                            V is translated to C before being compiled to machine code

                                                                                            That, right there, is enough for me to question any safety guarantee V offers (and I like C).

                                                                                            1. 4

                                                                                              Nim compiles to C and it’s garbage collected. I believe the reasons they do that are runtime reach and whole program optimization.

                                                                                                If you can statically guarantee safety it shouldn’t be a problem. (However, it’s not necessarily a trivial thing to suggest.)

                                                                                              1. 3

                                                                                                  ATS compiles to C too, if I understand it correctly. And there have been Haskell compilers that compiled to C too, as well as many other programming languages that provide aspects of safety that the underlying C language, like machine code, does not provide.

                                                                                                1. 4

                                                                                                    Why does using C as an intermediate language in the compilation process necessarily imply that a language’s safety guarantees are bad? Compilers that compile to some kind of bytecode - like rustc compiling to LLVM bitcode, or JVM languages’ compilers compiling to JVM bytecode - are perfectly capable of being safe, even though they output code in an unsafe language (which may or may not be the final compilation output - it is (I think) in the JVM case, but LLVM bitcode is further transformed into machine-specific machine code). I don’t see why C should be any different in this respect.
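
                                                                                                    One concrete way a front-end can keep its guarantees while emitting C is to compile its own checks into the output. Here’s a minimal sketch (hypothetical generated code, invented for illustration – not actual V or Nim output) where every array access goes through a bounds check the compiler inserted:

                                                                                                    ```c
                                                                                                    #include <stdio.h>
                                                                                                    #include <stdlib.h>

                                                                                                    /* Hypothetical C a safe front-end might emit: every indexing
                                                                                                     * operation goes through a bounds check inserted by the compiler,
                                                                                                     * so an out-of-range index aborts the program instead of reading
                                                                                                     * arbitrary memory. The safety lives in the generated code, not
                                                                                                     * in C itself. */
                                                                                                    static int checked_index(const int *arr, int len, int i)
                                                                                                    {
                                                                                                        if (i < 0 || i >= len) {
                                                                                                            fprintf(stderr, "index %d out of range (len %d)\n", i, len);
                                                                                                            abort();
                                                                                                        }
                                                                                                        return arr[i];
                                                                                                    }

                                                                                                    int main(void)
                                                                                                    {
                                                                                                        int xs[3] = {10, 20, 30};
                                                                                                        printf("%d\n", checked_index(xs, 3, 2)); /* prints 30 */
                                                                                                        return 0;
                                                                                                    }
                                                                                                    ```

                                                                                                    The C compiler never has to “know” the source language was safe; it just compiles the checks like any other code.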

                                                                                                  1. 3

                                                                                                    I don’t know the guarantees of LLVM or JVM, but at the language level, C has a ton of unspecified and undefined behaviors. Skipping the dangers around pointers and arrays, you still have the following undefined behavior as outlined in the C standard:

                                                                                                    • shifting an integer by a negative value
                                                                                                    • shifting an integer more than its size in bits
                                                                                                    • left shifting a negative value
                                                                                                    • signed integer representation (sign-magnitude, 1s complement, 2s complement [1])
                                                                                                    • (quoting from the C99 standard for this one): Whether certain operators can generate negative zeros and whether a negative zero becomes a normal zero when stored in an object
                                                                                                    • signed integer trap representations
                                                                                                    • signed integer wrap semantics
                                                                                                    • padding value
                                                                                                    • padding in general
                                                                                                    • reading a union member that wasn’t the last one written to

                                                                                                    Now, it seems that V is targeting GCC/clang, but even so, you’ll get differences in behavior across different architectures, specifically with shifting (some architectures will mask the shift count, some won’t). When I see “safety” applied to a computer language, I expect these issues to be spelled out so you know what to expect.

                                                                                                    [1] In my research, there aren’t many systems in use today that are not 2s complement. They are:

                                                                                                    • Unisys 1100/2200
                                                                                                    • Unisys ClearPath A
                                                                                                    • IBM 700/7000 series

                                                                                                    I know one of the Unisys systems is still being produced today and has a C compiler (which one I don’t recall; I think the 1100/2200).

                                                                                                    1. 2

                                                                                                      you do realize that source code gets compiled to machine code, which is not safe by definition.

                                                                                                      The generated C code doesn’t use any of these, and doesn’t have to.

                                                                                                      1. 3

                                                                                                        Then what’s your definition of “safe”? There is far less that’s undefined in assembly than in C. Give me an architecture, and I can look up what it does for the above list. The reason C has so much undefined behavior is precisely because it runs on many architectures and the designers of C didn’t want to favor one over the other. Different languages can make different trade-offs.

                                                                                                2. 5

                                                                                                  FP diehards will never understand why Go has been so wildly successful – and V will be even more successful than Go.

                                                                                                  Do you? Go succeeded because it was created and backed by veteran Bell Labs people and Google, to solve existing problems. I’m not talking about marketing only, but also the level of sophistication and simplicity those people were able to bring in.

                                                                                                  It also succeeded because it didn’t promise anything it didn’t deliver.

                                                                                                  1. -4

                                                                                                    Yes. I spotted Go as great technology in November of 2010. Go is simple and fairly powerful considering that simplicity.

                                                                                                    The original version of V was written in Go, V has learned many lessons from Go, both from its strengths that V builds on and the weaknesses it shores up with generic functions, generic structs, sum types, and more.

                                                                                              2. 14

                                                                                                I’d be interested in seeing what these “lots of fixes in Generics” are, because as far as I can tell from reading the compiler source code, type errors for parameters in functions with a <T> slapped on them are still silently ignored…

                                                                                                  if !c.check_types(typ, arg.typ) {
                                                                                                    // str method, allow type with str method if fn arg is string
                                                                                                    if arg_typ_sym.kind == .string && typ_sym.has_method('str') {
                                                                                                      // note: str method can return anything. will just explode in the C compiler -- hydraz
                                                                                                    if typ_sym.kind == .void && arg_typ_sym.kind == .string {
                                                                                                    if f.is_generic {
                                                                                                      // ignore errors in functions with a <T> -- hydraz
                                                                                                    if typ_sym.kind == .array_fixed {

                                                                                                Try this code:

                                                                                                fn foo<T>(y string) int {
                                                                                                  return 0
                                                                                                }

                                                                                                fn main() {
                                                                                                  foo(123)
                                                                                                }
                                                                                                1. -3

                                                                                                  You had foo take a string then passed in an int :-)

                                                                                                  EDIT: This works, for example:

                                                                                                  fn foo<T>(y string) int {
                                                                                                    return 0
                                                                                                  }

                                                                                                  fn main() {
                                                                                                    foo('hello')
                                                                                                  }
                                                                                                  1. 23

                                                                                                    … Yes, that’s my point. I passed an int to a string parameter, and the V compiler didn’t give a type error: the C compiler did.

                                                                                                    % make
                                                                                                    cd ./vc && git clean -xf && git pull --quiet
                                                                                                    cd /var/tmp/tcc && git clean -xf && git pull --quiet
                                                                                                    cc  -g -std=gnu11 -w -o v ./vc/v.c  -lm -lpthread
                                                                                                    ./v self
                                                                                                    V self compiling ...
                                                                                                    make modules
                                                                                                    make[1]: Entering directory '/home/abby/Projects/v'
                                                                                                    #./v build module vlib/builtin > /dev/null
                                                                                                    #./v build module vlib/strings > /dev/null
                                                                                                    #./v build module vlib/strconv > /dev/null
                                                                                                    make[1]: Leaving directory '/home/abby/Projects/v'
                                                                                                    V has been successfully built
                                                                                                    V 0.1.27 b806fff
                                                                                                    % ./v test.v
                                                                                                    /home/abby/.cache/v/test.tmp.c: In function ‘main’:
                                                                                                    /home/abby/.cache/v/test.tmp.c:9476:2: error: implicit declaration of function ‘foo’ [-Werror=implicit-function-declaration]
                                                                                                     9476 |  foo(123);
                                                                                                          |  ^~~
                                                                                                    /home/abby/.cache/v/test.tmp.c: In function ‘vcalloc’:
                                                                                                    /home/abby/.cache/v/test.tmp.c:4597:1: warning: control reaches end of non-void function [-Wreturn-type]
                                                                                                     4597 | }
                                                                                                          | ^
                                                                                                    /home/abby/.cache/v/test.tmp.c: In function ‘byte_is_white’:
                                                                                                    /home/abby/.cache/v/test.tmp.c:7227:1: warning: control reaches end of non-void function [-Wreturn-type]
                                                                                                     7227 | }
                                                                                                          | ^
                                                                                                    (Use `v -cg` to print the entire error message)
                                                                                                    builder error: 
                                                                                                    C error. This should never happen.
                                                                                                2. 17

                                                                                                  Author of the post here, let me see if I can try to clear some things up.

                                                                                                  She cites vlang.io saying

                                                                                                  V can be bootstrapped in under a second by compiling its code translated to C with a simple

                                                                                                  cc v.c

                                                                                                  No libraries or dependencies needed.

                                                                                                  then argues against it, preposterously, by saying in part,

                                                                                                  Git is a dependency, which means perl is a dependency, which means a shell is a dependency, which means glibc is a dependency, which means that a lot of other things (including posix threads) are also dependencies. …

                                                                                                  Downloading a .c source file requires git? Does this person know what a “dependency” is? Should JavaScript developers include depending upon the laws of physics in package.json?

                                                                                                  Okay I was being a bit unfair, but if we look at the makefile we see that it has the following dependencies:

                                                                                                  • make (which depends on perl, glibc, autotools and all that nonsense)
                                                                                                  • git (which depends on perl (even at runtime), glibc, autotools and all that nonsense)
                                                                                                  • gcc (which depends on perl, glibc, autotools, automake, autoconf and more libraries than I care to list right now)

                                                                                                  So if you want to be completely honest, even if you cut out the make and git steps (which I care about as someone who builds packages for Linux boxes using the unmodified build system as much as possible, so I can maintain whatever shred of sanity I have left), it still depends on a C compiler to get bootstrapped. That is a dependency.

                                                                                                  Then you have to download the bootstrap file from somewhere, which requires dependencies in terms of root certificates and the compiler to bootstrap with (not to mention the server that hosts the bootstrap file both existing and serving the right file back). Given that V in its current form requires you to download files from the internet in order to build it, it mathematically cannot be dependency-free. (This actually precludes it from being packageable in NixOS, because NixOS doesn’t allow package builds to access the network; all of the tarballs/assets need to be explicitly fetched outside the build with fetchgit, fetchurl, and similar.) Pedantically, requiring someone to have an internet connection is a dependency.

                                                                                                  Pedantically, the v binary lists the following dynamically linked dependencies when checked with ldd(1):

                                                                                                  $ ldd ./v
                                                                                                          linux-vdso.so.1 (0x00007fff2d044000)
                                                                                                          libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f2fb3e4c000)
                                                                                                          libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f2fb3a5b000)
                                                                                                          /lib64/ld-linux-x86-64.so.2 (0x00007f2fb4345000)

                                                                                                  If the binary was truly dependency-free, the ldd output would look something like this:

                                                                                                  $ ldd $HOME/bin/dhall
                                                                                                          not a dynamic executable

                                                                                                  This leads me to assume that the v binary has dependencies that the runtime system will need to provide, otherwise the program will not be able to be loaded by the Linux kernel and executed. Binaries produced by v have similar limitations:

                                                                                                  $ ldd ./hello
                                                                                                          linux-vdso.so.1 (0x00007ffdfdff2000)
                                                                                                          libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fed25771000)
                                                                                                          /lib64/ld-linux-x86-64.so.2 (0x00007fed25d88000)

                                                                                                  Additionally, I am banned from the V discord and GitHub. The V programming language has censored a neuro-diverse trans woman from being able to contribute to the project in any capacity. I would love to be able to contribute to things at least to make the documentation and website not filled with misleading statements, but I cannot. This is why I ask other people to make issues for me in my posts.

                                                                                                  I realize things might look snarky when the article is viewed from a certain worldview/lens, but there is a total scalar lack of snark intended in that article when it was written. If you cannot realize that, then I am sorry that my intended tone didn’t have the effect I wanted and I will use this feedback to further refine my writing ability.

                                                                                                  Direct x64 machine code generation

                                                                                                  In my testing I was unable to get this working on my Ubuntu server. It still used gcc.

                                                                                                  1. 8

                                                                                                    You ended your piece (which utterly trashes V) by saying

                                                                                                    Overall, V looks like it is making about as much progress as I had figured it would.

                                                                                                    I criticized your snark, and you replied with

                                                                                                    I realize things might look snarky when the article is viewed from a certain worldview/lens, but there is a total scalar lack of snark intended in that article when it was written.

                                                                                                    Do you really expect anyone to believe that?

                                                                                                    Additionally, I am banned from the V discord and GitHub. The V programming language has censored a neuro-diverse trans woman from being able to contribute to the project in any capacity.

                                                                                                    Do you really think that’s why you were banned? Does Alex even know you’re trans? You don’t think you were banned for your vicious and misleading attacks on this project?

                                                                                                    1. 6

                                                                                                      You’re just ignoring the points she made here, talking past her, and banging on about the article.

                                                                                                      1. 6

                                                                                                        @cadey didn’t say they were banned for who they are, they just stated they were X and were banned.

                                                                                                        You don’t think you were banned for your vicious and misleading attacks on this project?

                                                                                                        If my memory serves me right, @cadey was banned because of their disagreements with V, such as those voiced in this and the previous (https://christine.website/blog/v-vvork-in-progress-2020-01-03) blog post. I could be wrong. Also “vicious” is being overly dramatic and frankly not productive.

                                                                                                        1. 4

                                                                                                          @cadey didn’t say they were banned for who they are, they just stated they were X and were banned.

                                                                                                          Then why bring it up?

                                                                                                          If my memory serves me right, @cadey was banned because of their disagreements with V, … . I could be wrong.

                                                                                                          Is that actually true?

                                                                                                          Also “vicious” is being overly dramatic …

                                                                                                          It doesn’t sound like you’re paying very close attention… Two other people criticized her for bullying in this very thread before I even got here. If you read my comments here then I’d hope you would change your mind about how unfairly harsh she has been to Alex, to his project, and to the V team.

                                                                                                          Consider starting here: https://lobste.rs/s/nfjifq/v_update_june_2020#c_vuofat

                                                                                                          1. 22

                                                                                                            You almost never comment except in V threads.

                                                                                                            Further, many of those comments seem to be today, in last year’s thread.

                                                                                                            Please just let this be. Flag the submission if you must and move on.

                                                                                                            1. 6

                                                                                                              I have removed the post as of this commit. In a few minutes the article will be gone, but a tombstone of the former article will remain.

                                                                                                              1. 19

                                                                                                                I don’t think you did anything wrong by posting it, I don’t think you made any mistakes, and I enjoyed reading it. Not saying this to convince you to put the essay back; I just know that I personally feel awful when I get really harsh criticism, even when I don’t respect or care about the person giving it. So wanted to provide a bit of positivity to balance it out ツ

                                                                                                              2. 8

                                                                                                                Then why bring it up?

                                                                                                                I can’t answer that question. While I’m not sure what the benefit is of bringing it up, I don’t see the harm either; it was pretty clear from the comment they were not saying they were banned because of who they are.

                                                                                                                Is that actually true?

                                                                                                                Feel free to show otherwise. I can’t, because I’m not the person who banned @cadey; nor am I in contact with them.

                                                                                                                It doesn’t sound like you’re paying very close attention

                                                                                                                Please do not make such assumptions. It’s an unproductive way to start an argument, as well as factually incorrect.

                                                                                                                Two other people criticized her for bullying in this very thread before I even got here.

                                                                                                                Only one person said it’s starting to look like bullying (https://lobste.rs/s/nfjifq/v_update_june_2020#c_cdxvwk). Other comments mentioning “bullying” either state they are not sure, or don’t see it as bullying. I don’t see anybody else mentioning this is bullying. Am I perhaps overlooking something?

                                                                                                                If you read my comments here then I’d hope you would change your mind about how unfairly harsh she has been to Alex, to his project, and to the V team.

                                                                                                                I agree the tone in the blog post is not the most productive. While some parts of the post are a bit pedantic, overall I think it’s not unfairly harsh. V made many big claims both before and after its release. Here are just a few examples:

                                                                                                                • https://github.com/vlang/v/issues/35
                                                                                                                • Translating C++ to V, which now has the note “TODO: translating C to V will be available in V 0.3. C++ to V will be available later this year.”
                                                                                                                • Claiming V has pure functions when they can still have side-effects. “Pure” has a well defined meaning. If you mean “pure but with IO”, then call it something else; otherwise it’s just misleading. This was brought up in this issue, which was closed, but I can’t find any mention of this in the docs here.
                                                                                                                • Various claims about certain components not having dependencies, only to have dependencies; despite the website advertising “Compiles to native binaries without any dependencies”
                                                                                                                • Advertising “V is written in V and compiles itself in under a second.”, when according to https://fast.vlang.io/ compiling V with optimisations (if I’m reading the table correctly) takes over one second most of the time.

                                                                                                                There is a lot more from past discussions (e.g. those on Hacker News), so I suggest taking a look at those.

                                                                                                                With that all said, I really do hope V succeeds and wish the authors the best of luck. But the authors really need to make sure that what they advertise is actually implemented as advertised, or make it very clear what the current state is. Don’t go around saying “We support X” followed by “oh by the way we’ll release that end of this year”.

                                                                                                      1. 33

                                                                                                        When content recommendation becomes the most important highlight of a privacy-friendly browser’s new release.

                                                                                                  I love Firefox, but sheesh

                                                                                                        1. 13

                                                                                                    It is infuriating that the developers of a web browser consider it acceptable to implement any “content recommendation” in their program.

                                                                                                          1. 7

                                                                                                            Why? As stated up-thread, browsing data is never uploaded. Content recommendation happens locally only. What is so wrong with this?

                                                                                                            1. 7

                                                                                                          Because it’s a browser. It should empower me to search the internet for the stuff I want to see, not what they think I want to see. It adds nothing to my user experience. It’s a slippery slope downhill from any recommendation system, no matter how privacy-friendly they claim it to be.

                                                                                                              1. 6

                                                                                                            Imagine your new FM radio “recommended” which station to tune to when you turned it on. Would it really be any comfort if the manufacturer assured you that this recommendation had nothing to do with your own preferences, because they don’t know and definitely don’t care? Bookmarks have been part of browsers since the beginning. This is something else.

                                                                                                                People only put up with this nonsense because it’s free-as-in-beer. That’s why I’d be happy to pay for a fork that treated me like a paying customer rather than a set of eyeballs to sell through some convoluted scheme papered over with a lot of patronizing rhetoric.

                                                                                                                1. 4

                                                                                                                  Maybe a car analogy is useful for you. It’s as if your car “recommended” which restaurant to go to when you drive it on a Saturday afternoon, and actually drove you there without asking until you overrode it. A minimal amount of fuel would be lost at the beginning of the journey; no problem, you can override it at any time. Would you be OK with that?

                                                                                                                  I do not want my web browser to make any network request when I open it, unless I ask for it explicitly. As other posts in this thread explain, this is actually impossible with Firefox. This is what infuriates me.

                                                                                                                  1. 0

                                                                                                                    Though they say the browsing data is never uploaded, it’s trivial to match the IP and the time at which the content was recommended. That info can then be correlated with the site serving the recommended content. There are several ways to deanonymize browsing history.

                                                                                                                2. 10

                                                                                                                  One of the first things I do with a new FF installation, besides installing uBlock and friends, is disabling all these spurious ‘services’: content suggestion, dangerous-content warnings, and the various telemetry bits apart from bug reports. Those I do send, seeing as I run Nightly and as such can provide usable reports.

                                                                                                                  1. 4

                                                                                                                    I would be grateful if you could share the configuration changes you make. I am asking so that I can note them down and set them too.

                                                                                                                    I would like NixOS to provide Firefox configuration options that could be set centrally, or per user, so that every upgrade reapplies them.

                                                                                                                    1. 3

                                                                                                                      home-manager allows you to declaratively configure which Firefox add-ons to install (although you still need to enable them manually the first time you start Firefox, for security reasons). And you can set Firefox options declaratively using their enterprise policies.
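                                                                                                                      A minimal sketch of such a module, assuming a recent home-manager (the `programs.firefox.policies` option name and the Mozilla policy names should be verified against your home-manager version and Mozilla’s policy templates):

                                                                                                                      ```nix
                                                                                                                      {
                                                                                                                        programs.firefox = {
                                                                                                                          enable = true;
                                                                                                                          # Mozilla enterprise policies, applied declaratively.
                                                                                                                          policies = {
                                                                                                                            DisableTelemetry = true;
                                                                                                                            DisablePocket = true;
                                                                                                                          };
                                                                                                                        };
                                                                                                                      }
                                                                                                                      ```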

                                                                                                                      1. 2

                                                                                                                        I don’t use much magic to configure it; most is done by hand. The only ‘automatic’ thing I do is install a policies.json file (in distribution/policies.json in the FF install directory, in my case /opt/APPfirefox/bin/distribution/policies.json) which disables automatic updates, since I handle those using a script. I do not want to have binaries writeable by users, so these automatic update policies are out of the question. The update script pulls new nightlies from the server and installs them, installs the policies.json file in the correct location, and sets ownership and permissions so that regular users can execute, but not modify, the distribution.

                                                                                                                        I used to run a FF sync server when that was still a thing, but eventually it got too hard to keep ‘old’ sync support working. I do not have, nor do I want to have, a ‘Firefox account’, since I avoid such external services whenever I can. I might look into building a ‘new’ FF sync server some time, but other matters are more important for now. Until such time I will simply install the following extensions:

                                                                                                                        • uBlock origin (set to ‘expert user’ mode)
                                                                                                                        • uMatrix (disabled by default)
                                                                                                                        • Nuke Anything (to get rid of annoying overlays which uBlock can not filter out)
                                                                                                                        • Open With (to open e.g. media files through a local script)
                                                                                                                        • Containers with Transitions (to always open certain sites in site-specific container tabs)
                                                                                                                        • Foxyproxy Standard (disabled, sometimes used to redirect sites through a local Tor node)
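                                                                                                                        The policy file described above could be sketched roughly like this (a hypothetical install script; `DisableAppUpdate` is the documented Mozilla policy name, but the directory used here is a stand-in for illustration, not the real install path):

                                                                                                                        ```shell
                                                                                                                        # Stand-in for the real Firefox install directory.
                                                                                                                        FF_DIR="${FF_DIR:-/tmp/firefox-demo}"
                                                                                                                        mkdir -p "$FF_DIR/distribution"
                                                                                                                        # Write a minimal policies.json that disables automatic updates.
                                                                                                                        printf '%s\n' \
                                                                                                                          '{' \
                                                                                                                          '  "policies": {' \
                                                                                                                          '    "DisableAppUpdate": true' \
                                                                                                                          '  }' \
                                                                                                                          '}' > "$FF_DIR/distribution/policies.json"
                                                                                                                        # Readable by everyone, writable only by the owner (root in practice).
                                                                                                                        chmod 644 "$FF_DIR/distribution/policies.json"
                                                                                                                        ```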
                                                                                                                    2. 13

                                                                                                                      It seems to me it’s just a “here are the most popular articles”-list; don’t see anything wrong with that, or any fundamental privacy-concerns. Also from the expanded announcement on it:

                                                                                                                      Recommendations are drawn from aggregate data and neither Mozilla nor Pocket receives Firefox browsing history or data, or is able to view the saved items of an individual Pocket account. A Firefox user’s browsing data never leaves their own computer or device.

                                                                                                                      And from the FAQ:

                                                                                                                      [N]either Mozilla nor Pocket ever receives a copy of your browser history. When personalization does occur, recommendations rely on a process of story sorting and filtering that happens locally in your personal copy of Firefox.

                                                                                                                      1. 11

                                                                                                                        I see something wrong with that: it gives the user an experience they have not asked for and have no control over. Also, which news sites, and which selection of news shown to the user, are trustworthy in a general sense?

                                                                                                                        I feel about it as if I got public broadcasting on my new tab page: not something I want, nor am I interested in.

                                                                                                                        1. 9

                                                                                                                          that being giving the user an experience that they have not asked for

                                                                                                                          How can you be so sure? I’m a Firefox user, and I find those articles occasionally useful.

                                                                                                                          nor had any control over

                                                                                                                          You can switch it off easily in preferences or directly on the New Tab page (three dots in the upper right corner).
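                                                                                                                            For what it’s worth, the same toggle can also be flipped from a user.js; the pref name below is from memory, so verify it in about:config before relying on it:

                                                                                                                            ```js
                                                                                                                            // Disables the Pocket/“Recommended by” stories on the New Tab page.
                                                                                                                            user_pref("browser.newtabpage.activity-stream.feeds.section.topstories", false);
                                                                                                                            ```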

                                                                                                                          1. 4

                                                                                                                            “nor had any control over” is a terrible way to word it (it is your computer and you are definitely in control). My first reaction was “this person is entitled as heck”.

                                                                                                                            However, there is an implied social contract (because firefox existing makes it socially/politically almost impossible to get an alternative off the ground). I still disagree with lich, but their argument has legs.

                                                                                                                          2. 3

                                                                                                                            I see something wrong with that, that being giving the user an experience that they have not ask for nor had any control over. Also, what news site, and what collection of news given to the user is trustworthy in a general sense?

                                                                                                                            I don’t feel that’s a fair characterization. Any new feature can be described as giving the user an experience they did not ask for. And as other commenters note, it can be disabled. Which grants control.

                                                                                                                            As to a user experience, I have lobsters show up in my recommended list, probably because I visit it so often. It does make some sense that I would be recommended what I like to habitually visit.

                                                                                                                            I even removed the suggestion a few times and timed how long and how many visits made it reappear. For me, it learned the association in a day and ten visits to the front page because my habit is to close the tab after quickly reviewing the stories posted.

                                                                                                                          3. 6

                                                                                                                            Where does the aggregate data come from?

                                                                                                                          4. 5

                                                                                                                            I cannot use Firefox and feel safe without ghacks user.js. It is kind of absurd that there is no real community-led option for browsers. You could put the blame on standards bodies for creating bloated standards, but now more than ever they are just a facade commanded by corporate interests. I don’t know much about it, but Project Gemini (along with gopher) seems to be closer to achieving the goals of free software and the “original dream of the web” (whatever that means).

                                                                                                                            Edit: typo

                                                                                                                            1. 1

                                                                                                                              I totally get your point. Would something like Pale Moon feel better to use?

                                                                                                                          1. 4

                                                                                                                            Where can I find more information about why Plan 9 is amazing, especially how it compares/contrasts to Linux or Unixes?

                                                                                                                            1. 10

                                                                                                                              I found this paper to be a wonderful walkthrough. I highly recommend getting a copy of 9front running and going through some of the exercises in the paper.

                                                                                                                              It’s very long, but definitely a great way to get a feel for how some of the concepts in Plan 9 are applied.

                                                                                                                              Edit: Since I was reminded how much I like this paper, I decided to submit it as a story.

                                                                                                                              1. 5

                                                                                                                                Some goodies from Plan 9 were ported to *nixes, for example procfs; unfortunately not all of them (e.g. Plan 9-like process namespaces).

                                                                                                                                1. 7

                                                                                                                                  Unfortunately, the best piece of plan 9 is impossible to port: A unified, interposable way of doing everything, so you don’t have to think about huge numbers of special cases and strange interactions between features. 9p is, more or less, the only way to talk to the OS, and the various interfaces that are exposed over it can be transparently swapped out with namespaces, allowing you to replace, mock out, or redirect across the network any part of the system that you want.

                                                                                                                                  1. 4

                                                                                                                                    Well, Linux namespaces got 90% of the way there, although they certainly didn’t get their ergonomics.

                                                                                                                                  2. 3

                                                                                                                                    I just watched the video https://www.youtube.com/watch?v=3d1SHOCCDn0

                                                                                                                                    I found the way the presenter explained the core concept very understandable. It is 40 minutes long.

                                                                                                                                  1. 5

                                                                                                                                    As long as it isn’t as unstable as the Ubuntu x Thinkpad partnership.

                                                                                                                                    1. 2

                                                                                                                                      How is that these days? It’s been about 6 years since I last ran Ubuntu on a Thinkpad.

                                                                                                                                      1. 2

                                                                                                                                        For what my experience is worth, I have a high-resolution X1 Extreme. It was a nightmare to get it working. External displays didn’t work; with the BIOS set to hybrid graphics, it didn’t work either.

                                                                                                                                        In the end it worked, but I have sacrificed so much of my time that I would wish to use for something else.

                                                                                                                                        And when it finally worked, I had the worst input lag since the beginning of my Linux experience (’95). Just typing in a terminal window was so bad that I was constantly making typos! I guess it was Ubuntu’s switch to GNOME plus using 3D where good 2D would be enough.

                                                                                                                                        The battery time is terrible. I am happy if I can make it through 2 hours on battery.

                                                                                                                                        I switched now to NixOS. I’m fighting the hardware issues there too, but my hope is that once I’ve fought through this, I will be at peace for some time. Ubuntu served me for 15 years and I am grateful for that. It’s time for something better.

                                                                                                                                        1. 3

                                                                                                                                          That’s concerning. I’m considering an X1 Carbon if Apple doesn’t raise the 13” MacBook Pro memory to 32 GB this year. I’ve used a Mac laptop for myself for 12 years now and for work for 8 of the last 10 — two years on a Thinkpad with “Open Client for Debian Community” i.e. IBM’s Ubuntu spin — with much adoration. 16 GB is slowing me down. I’m too much of a multitasker these days and find myself more often using my 32 GB desktop gaming rig running Windows or my work laptop that is a 15-in MacBook Pro 32 GB.

                                                                                                                                          1. 3

                                                                                                                                            That being said, Fedora (stock) on an X1 Carbon works very well (I’m using it to type this, and at $dayjob).

                                                                                                                                            1. 1

                                                                                                                                              I didn’t have a single hardware issue on my ThinkPad with Fedora. It’s not an X1, but in my experience, Fedora and ThinkPad are a good match.

                                                                                                                                            2. 2

                                                                                                                                              Don’t get me wrong, I have been using ThinkPads for over 20 years now and I will continue; I will just avoid ThinkPads with NVidia inside. However, when I decided to buy the X1E (1st gen) I did it partially because it was certified to be Ubuntu compatible! I already had doubts; I actually had one ThinkPad with NVidia before, but on that one I could use the Intel GPU for external displays, so the NVidia was just a waste of weight and energy, but at least I could work. On this one, however, I cannot use external screens without somehow getting this NVidia/Hybrid/Bumblebee/Prime whatever stuff working. It steals so much of my time.

                                                                                                                                              1. 1

                                                                                                                                                X1 carbon is great. The extreme has Nvidia graphics which are a tire fire.

                                                                                                                                                1. 1

                                                                                                                                                  I thought Nvidia + Linux was <3? Did that change in the twenty teens?

                                                                                                                                        1. 17

                                                                                                                                          God save me from Lenovo pre-installed software…

                                                                                                                                          1. 11

                                                                                                                                            I think the real benefit here is that Lenovo now seems to give a shit about making their hardware work well with Linux, and maybe, just maybe, they’ll push that hardware support upstream so folks who don’t want to run the distro this ships with can still benefit from it.

                                                                                                                                            1. 3

                                                                                                                                              Exactly my thoughts. I’m using a T540p at work with Debian on it, but the hardware support is pretty shitty. Lots of power management problems, display driver had problems in the beginning and so on. It also took years for Debian on my x230 to support the built-in microphone. So maybe with this, that fabled “good Linux support” will finally become actually true for these models.

                                                                                                                                              1. 4

                                                                                                                                                RHEL7 on my T540p “just worked“, years ago. Although I hated the laptop so didn’t use it for very long. When people have problems with a particular Linux on a particular laptop, sometimes it’s the laptop, but sometimes perhaps the distribution. Or somewhere in the middle.

                                                                                                                                                1. 2

                                                                                                                                                  That’s weird, you’d expect this to be more or less the same across distros (as long as they use the same kernel version and X drivers)

                                                                                                                                                  1. 3

                                                                                                                                                    You’ve put your finger on it: they probably aren’t using the same kernel and X drivers.

                                                                                                                                                  2. 1

                                                                                                                                                    You hated a T540p? Don’t answer.

                                                                                                                                                    1. 2

                                                                                                                                                      Not OP but I had to use one for a little while. I hated it due to keyboard and touchpad. Lenovo touchpads of the era were so terrible they should have just left them off and stuck to the trackpoint. A janky touchpad with half-assed palm rejection degraded the experience.

                                                                                                                                                      And I just can’t deal with the off-center typing that 10key forces on a laptop that size.

                                                                                                                                                      Otherwise it was great, but I couldn’t get past those two things.

                                                                                                                                                  3. 2

                                                                                                                                                    I hope for this too! Having an X1 Extreme with Ubuntu 18.04 for a year has been the worst Linux experience in my 25 years of using Linux. I have to say that until this I avoided hardware that didn’t have good Linux support. I will boycott Nvidia for the rest of my life.

                                                                                                                                                    I just would have hoped that Lenovo had a notebook with the same form factor and physical aspects (15”, centered keyboard, high resolution) but without Nvidia.

                                                                                                                                                1. 1
                                                                                                                                                  • Learning to touch type, later realizing that sticking to a US keyboard would give me peace of mind - I started in IT in Germany, then moved to France and traveled quite a bit. For accents/umlauts I use AltGr International combinations
                                                                                                                                                  • Use VI commands everywhere I can (configuring shell and REPLs to VI mode)
                                                                                                                                                  • Unix command line with a big history size, using the available Unix commands. It saved me a lot of time and made me more productive by combining tasks into commands; when in the end they can’t be optimized further and I still use them, I turn them into a script. I still discover new commands from time to time. Makes me happy.
                                                                                                                                                  • Emacs with org-mode and evil (VI emulation) to take work notes, clock time and make reports for invoices
                                                                                                                                                  • Haskell, for programs that do complicated enough things, where it makes sense to get the basis right and avoid mistakes. Made me as well much more humble about programming. As well I know that I will never end learning this language and it makes me happy. There is so much new research going on.
                                                                                                                                                  • git: works reliably and has solutions for weird work-flows
                                                                                                                                                  • screen: Discovered it late - a tool that has been around for many years, but I found it only when I was constrained to working on different servers. It’s often preinstalled. Now I use it even locally on the laptop.
                                                                                                                                                  • Regression tests that can be automated.
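                                                                                                                                                  The “combine tasks into commands, then make a script” habit might look roughly like this (a hypothetical example; the script name, directory, and log-filtering pipeline are all made up for illustration):

                                                                                                                                                  ```shell
                                                                                                                                                  # Promote an often-reused pipeline from shell history into a script.
                                                                                                                                                  BIN_DIR="${BIN_DIR:-/tmp/demo-bin}"
                                                                                                                                                  mkdir -p "$BIN_DIR"
                                                                                                                                                  printf '%s\n' \
                                                                                                                                                    '#!/bin/sh' \
                                                                                                                                                    '# Show the ten most frequent lines containing "error" in the log $1.' \
                                                                                                                                                    'grep -i error "$1" | sort | uniq -c | sort -rn | head -n 10' \
                                                                                                                                                    > "$BIN_DIR/top-errors"
                                                                                                                                                  chmod +x "$BIN_DIR/top-errors"
                                                                                                                                                  ```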

                                                                                                                                                  Techniques that I learned only after I needed them and would have liked to know them earlier:

                                                                                                                                                  • Writing without seeing the text, i.e. the same font and background color (black-on-black or white-on-white). It lets me “speak out” my mind in an intimate and honest way, and helps beautifully to calm down under emotional stress.
                                                                                                                                                  • Meditation. Will not describe my impressions, because it has too many facets and the revelations are changing. But it was a precious discovery and gives me often a very nice road trip into myself and often helps to take decisions, unblock situations.
                                                                                                                                                  1. 4

                                                                                                                                                    Can’t GenodeOS work as the userland for seL4?

                                                                                                                                                    1. 5

                                                                                                                                                      Genode is nice and all, but it is Affero GPL licensed. This is likely seen as a huge liability.

                                                                                                                                                      1. 5

                                                                                                                                                        Specifically because they want hardware/software businesses to pay for using it. So, they should probably think of that combo as seL4 plus a commercial product. Most won’t use it as you predicted.

                                                                                                                                                        1. 4

                                                                                                                                                          Ooooh, now that is something I totally missed, thanks!

                                                                                                                                                        2. 2

                                                                                                                                                          From their documentation:

                                                                                                                                                          Genode can be deployed on a variety of different kernels including most members of the L4 family (NOVA, seL4, Fiasco.OC, OKL4 v2.1, L4ka::Pistachio, L4/Fiasco). Furthermore, it can be used on top of the Linux kernel to attain rapid development-test cycles during development. Additionally, the framework is accompanied with a custom microkernel that has been specifically developed for Genode and thereby further reduces the complexity of the trusted computing base compared to other kernels.

                                                                                                                                                          1. 2

                                                                                                                                                            But seL4 is single-core only, so it’s not much use outside of embedded or single-purpose equipment :(

                                                                                                                                                            1. 1

                                                                                                                                                              Uh, oh :/ this is something I didn’t realize :(

                                                                                                                                                              1. 1

                                                                                                                                                                They have an unverified implementation, which is roughly as secure as a normal operating system.