1. 5

    This was an entirely seamless upgrade for me. `nixos-rebuild switch --upgrade` + 5 minutes waiting == new shiny. I haven’t had this easy of a time upgrading since I left BSD-land.

    1. 1

      I’m doing this at the moment, after having read your comment, but it’s been an hour now and it still isn’t finished. It’s compiling quite a lot of stuff that it didn’t in 20.03 - I could see Thunderbird and LibreOffice just by glancing at the console from time to time. I hope it finishes soon. Honestly, I hope future updates will be much quicker.

      Another difference I have seen is that it no longer allows running plasma5 (KDE) and gnome3 at the same time - it reported a conflict, so I removed plasma5. That’s strange, because in 20.03 they worked together.

      EDIT: Update - the upgrade from 20.03 to 20.09 finally finished. It took ca. 3 hours. I don’t know why, but a lot of programs that are normally downloaded pre-compiled had to be compiled locally this time.

    1. 2

      I think this is interesting, and should perhaps be applied to programming languages as well.

      Hyper-inefficient programming languages like Ruby, Python, Haskell, etc. produce far more CO2 than, for example, C.

      1. 5

        Hyper-inefficient programming languages like Ruby, Python, Haskell

        I hope you realize that Haskell’s performance is much closer to C’s than to Python’s. Haskell usually ranks around the likes of Java in language benchmarks. Either way you look at it, it doesn’t deserve to be called “Hyper-inefficient”…

        1. 3

          I think it depends on how much energy is used developing and compiling code vs energy used during all times the program is run. I expect that equivalent C and Haskell programs take similar amounts of energy to run, and that the Haskell one takes a lot more energy to compile, but less time (and therefore less idle-time energy) to develop. This would make them similarly energy-expensive for most use-cases.

          Scripting languages may require less develop-time energy, but more run-time energy. If run only a few times, they’d use less energy than would be spent writing, compiling, debugging, and running a C program. Run many times, they would lose out to the finished C program.

          1. 3

            That’s actually a very relevant point. My first reaction to your comment was, “but who cares, you build only once”, but that’s not true. I have a beefy laptop that I’ve bought specifically to support a comfortable Haskell development experience. The IDE tooling continuously compiles your code behind the scenes. Then my team also has a very beefy EC2 instance that serves as a CI environment, it builds all the branches all the time. Then we’re also employing various ways of deploying the application and that also means it gets built in various ways per release image. All of that probably adds up to an energy consumption amount that’s comparable to a significant number of users running the application.

            1. 2

              Then we should include maintenance cost as well. I believe that over the lifetime of a program, the energy put into the initial development is only a fraction - most probably the smaller fraction - of the total, with the rest going into maintaining it: bug fixing, updates, etc. In this case, theoretically, Haskell should have an advantage, because the language, due to its type-safety restrictions, will force you to make fewer mistakes, both in design and in terms of bugs. I don’t have any numbers to support these claims, it’s just a gut feeling, so don’t take it too seriously.

          2. 4

            There actually have been studies on that question, eg: https://greenlab.di.uminho.pt/wp-content/uploads/2017/10/sleFinal.pdf

            1. 1

              I love that paper. If you’re looking for a quick heuristic, energy efficiency strongly correlates with performance. Compare those numbers to these: https://benchmarksgame-team.pages.debian.net/benchmarksgame/which-programs-are-fastest.html

          1. 2

            I discovered this document just now and found it a fantastic read (roughly 1 hour). It triggered, at least for me, the following remarks and questions:

            • It seems that in 1974 user groups did not yet exist. I take this from the following quote: “Also given for new files is a set of seven protection bits. Six of these specify independently read, write, and execute permission for the owner of the file and for all other users.” If that’s the case, it would be interesting to know when the concept of user groups was introduced and what use cases pushed for it.

            • It is said that the set-user-ID seems to solve the MOO accounting problem: Does anybody know what this accounting problem was about?

            • Is it possible that quota was not yet introduced? I take this from the following quote: “The simplest reasonably fair algorithm seems to be to spread the charges equally among users who have links to a file. The current version of UNIX avoids the issue by not charging any fees at all.” Questions that this raises: When was quota introduced? Is the quota functionality standardized among modern UNIXes? If so what are the limits and inconsistencies of standardized quotas?

            • The error stream stderr seems not to have been invented yet in 1974. I take this from the quote: “Programs executed by the Shell, however, start off with two open files which have file descriptors 0 and 1.”

            • Evolution through Hackability, Expressiveness and Source Code Availability: the authors emphasize the importance of hackability, expressiveness and availability of source code for the evolution, or as they call it “maintenance”, of the system. Hackability quote: “First, since we are programmers, we naturally designed the system to make it easy to write, test, and run programs.” Expressiveness quotes: “Given the partially antagonistic desires for reasonable efficiency and expressive power, the size constraint has encouraged not only economy but a certain elegance of design.” “Another important aspect of programming convenience is that there are no “control blocks” with a complicated structure partially maintained by and depended on by the file system or other system calls.” Quote about availability of source code: “Since all source programs were always available and easily modified online, we were willing to revise and rewrite the system and its software when new ideas were invented, discovered, or suggested by others.”

            • Stability/reliability: Quote: “The longest uninterrupted up time was about two weeks. Service calls average one every three weeks, but are heavily clustered. Total up time has been about 98 percent of our 24-hour, 365-day schedule.”

            • Is there any document that describes in such a concise way what principal changes have been introduced in Plan9?

            1. 3

              It is said that the set-user-ID seems to solve the MOO accounting problem: Does anybody know what this accounting problem was about?

              MOO was a simple number guessing game. What made it interesting in its original computer implementation was that it maintained a high-score table. When a user guessed right, the high score table had to be modified.

              The problem was that the high score table was a file not owned by the user. For everyone to be able to update it, it had to be writable by everyone, which made cheating trivial.

              Setuid allowed the file to be owned by the “MOO user” and writable only by that user, but the command to update it could be run by anyone.

              Basically it was the same mechanism that passwd uses to update user passwords, and that various multi-user games have used on UNIX since time immemorial.

              I remember reading about the MOO example specifically in some “hardening UNIX” book/article at some point. I can’t remember exactly where, sorry.

            1. 14

              Why did Haskell’s popularity wane so sharply?

              What is the source for the claim that Haskell’s popularity is declining so sharply? Is there really any objective evidence for this - numbers, statistics, etc.?

              It’s anecdotal and just my personal impression from observing the Haskell subreddit for 10 years, but I have never seen so many Haskell resources, conferences, books, and even job postings as now. I do not at all have the impression that the language is dying. It has accumulated cruft, has some inconsistencies, and is struggling to get a new standard proposal out, but other than that I have the impression that it attracts quite a few people who come up with new ideas.

              1. 2

                Haskell had its glory days when SPJ/Marlow were traveling to various conferences talking about the new language features. Milewski’s posts, LYAH, Parsec, STM, and Lenses are from that era. The high-brow crowd was of course discussing Lenses. Sure, these things drove adoption, and there’s a little ecosystem for the people who got on the Haskell bandwagon back then.

                What innovation has it had over the last 5 years? The community couldn’t agree on how to implement any of the features of a respectable dependent-type system, so they invented a bunch of mutually incompatible flags, and destroyed the language. Thanks to the recent hacking, GHC is plastered with band-aids.

                It’s true that you can’t explain these things with some points on a pretty graph, but that doesn’t make it anecdotal. Look at the commits going into ghc/ghc, and look at the activity on the bread-and-butter Haskell projects: lens, trifecta, cloud-haskell. Maintenance mode. Where are the bold new projects?

                1. 23

                  These assertions about Haskell are all simply false. There are plenty of problems with Haskell, we don’t need to add ones that aren’t true.

                  The community couldn’t agree on how to implement any of the features of a respectable dependent-type system, so they invented a bunch of mutually incompatible flags, and destroyed the language. Thanks to the recent hacking, GHC is plastered with band-aids

                  The reason GHC didn’t just turn on all flags by default is that many of them are mutually incompatible, so your individual .hs file has to pick a compatible set of language features it wants to work with.

                  You keep saying this in multiple places, but it’s not true. Virtually no GHC extensions are incompatible with one another. You have to work hard to find pairs that don’t get along and they involve extremely rarely used extensions that serve no purpose anymore.
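
                  For what it’s worth, this is all that “picking” amounts to in practice: a module opts in with pragmas at the top, and the commonly used extensions compose without conflict. A minimal sketch (the function and names are made up):

                  ```haskell
                  {-# LANGUAGE DataKinds           #-}
                  {-# LANGUAGE ScopedTypeVariables #-}
                  {-# LANGUAGE TypeApplications    #-}

                  -- Extensions are enabled per module; these three coexist happily.
                  import Data.Proxy   (Proxy (..))
                  import GHC.TypeLits (KnownSymbol, symbolVal)

                  greeting :: forall s. KnownSymbol s => Proxy s -> String
                  greeting p = "hello, " <> symbolVal p

                  main :: IO ()
                  main = putStrLn (greeting (Proxy @"world"))
                  ```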

                  The community is also not divided on how to do dependent types. We don’t have two camps and two proposals to disagree about. The situation is that people are working together to figure out how to make them happen. GHC also doesn’t contain bad hacks for dependent types, avoiding this is exactly why building out dependent types is taking time.

                  That being said, dependent types work today with singletons. I use them extensively. It is a revolution in programming. It’s the biggest step forward in programming that I’ve seen in 20 years and I can’t imagine life without them anymore, even in their current state.
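
                  To give a flavour of what I mean, here is a hand-rolled sketch (the singletons package generates most of this boilerplate for you; the names are just for illustration):

                  ```haskell
                  {-# LANGUAGE DataKinds      #-}
                  {-# LANGUAGE GADTs          #-}
                  {-# LANGUAGE KindSignatures #-}

                  data Nat = Z | S Nat

                  -- The singleton: a runtime value that mirrors a type-level Nat.
                  data SNat (n :: Nat) where
                    SZ :: SNat 'Z
                    SS :: SNat n -> SNat ('S n)

                  -- A vector whose length lives in its type.
                  data Vec (n :: Nat) a where
                    VNil  :: Vec 'Z a
                    VCons :: a -> Vec n a -> Vec ('S n) a

                  -- Total head: calling this on an empty vector is a compile-time error.
                  vhead :: Vec ('S n) a -> a
                  vhead (VCons x _) = x

                  -- The singleton lets runtime code follow the type-level length.
                  vreplicate :: SNat n -> a -> Vec n a
                  vreplicate SZ     _ = VNil
                  vreplicate (SS n) x = VCons x (vreplicate n x)
                  ```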

                  Look at the commits going into ghc/ghc, and look at the activity on the bread-and-butter Haskell projects: lens, trifecta, cloud-haskell. Maintenance mode. Where are the bold new projects?

                  Haskell is way more popular today than it was 5 years ago, and 10 years ago, and 20 years ago. GHC development is going strong; for example, we just got linear types, a huge step forward. There’s been significant money lately from places like cryptocurrency startups. For the first time I regularly see Haskell jobs advertised. What is true is that the percentage of Haskell questions on Stack Overflow has fallen, but not the absolute number - the size of Stack Overflow exploded.

                  Even the community is much stronger than it was 5 years ago. We didn’t have Haskell Weekly news for example. Just this year a category theory course was taught at MIT in Haskell making both topics far more accessible.

                  Look at the commits going into ghc/ghc

                  Let’s look. Just in the past 4 years we got: linear types, a new low-latency GC, compact regions, deriving strategies & deriving via, much more flexible kinds, all sorts of amazing new plugins (type plugins, source plugins, etc.) that extend the language and provide reliable tooling that was impossible 5 years ago, much better partial type signatures, visible type applications (both at the term level and the type level), injective type families, type in type, strict by default mode. And much more!

                  This totally changed Haskell. I don’t write Haskell the way I did 5 years ago, virtually nothing I do would work back then.
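
                  To make two of those concrete, here is roughly what DerivingVia and visible type applications look like (a toy sketch; names invented):

                  ```haskell
                  {-# LANGUAGE DerivingStrategies #-}
                  {-# LANGUAGE DerivingVia        #-}
                  {-# LANGUAGE TypeApplications   #-}

                  import Data.Monoid (Sum (..))

                  -- DerivingVia: borrow the Semigroup/Monoid behaviour of an existing wrapper.
                  newtype Score = Score Int
                    deriving stock (Show)
                    deriving (Semigroup, Monoid) via (Sum Int)

                  total :: [Score] -> Score
                  total = mconcat

                  main :: IO ()
                  main = do
                    print (total [Score 1, Score 2, Score 3])  -- Score 6
                    print (read @Int "42")                     -- visible type application
                  ```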

                  It’s not just GHC. Tooling is amazing compared to what we had in the past. Just this year we got HLS, so Haskell now works beautifully in all sorts of editors, from Emacs to VS Code to Vim.

                  look at the activity on the bread-and-butter Haskell projects: lens, trifecta, cloud-haskell. Maintenance mode. Where are the bold new projects?

                  lens is pretty complete as it is and is just being slowly polished. Haskell packages like lens are based on a mathematical theory and that theory was played out. That’s the beauty of Haskell, we don’t need to keep adding to lens.
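
                  For anyone who hasn’t used it, day-to-day lens looks like this and has looked like this for years, which is exactly the point (record and field names invented):

                  ```haskell
                  {-# LANGUAGE TemplateHaskell #-}

                  import Control.Lens

                  data Address = Address { _city :: String }                      deriving Show
                  data Person  = Person  { _name :: String, _address :: Address } deriving Show

                  makeLenses ''Address
                  makeLenses ''Person

                  alice :: Person
                  alice = Person "Alice" (Address "Paris")

                  main :: IO ()
                  main = do
                    putStrLn (alice ^. address . city)             -- "Paris"
                    print    (alice & address . city .~ "Berlin")  -- updated copy; alice is untouched
                  ```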

                  I would never use trifecta today, megaparsec is way better. It’s seen a huge amount of development in the past 5 years.

                  There are plenty of awesome Haskell packages. Servant for example for the web. Persistent for databases. miso for the frontend. 5 years ago I couldn’t dream of deploying a server and frontend that have a type-checked API. For bold new ideas look at all the work going into neural network libraries that provide type safety.
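
                  To make the type-checked API point concrete, a rough servant sketch (the route, record, and port are invented, and error handling is omitted):

                  ```haskell
                  {-# LANGUAGE DataKinds     #-}
                  {-# LANGUAGE DeriveGeneric #-}
                  {-# LANGUAGE TypeOperators #-}

                  import Data.Aeson               (ToJSON)
                  import Data.Proxy               (Proxy (..))
                  import GHC.Generics             (Generic)
                  import Network.Wai.Handler.Warp (run)
                  import Servant

                  data User = User { name :: String, age :: Int } deriving Generic
                  instance ToJSON User

                  -- The API is a type; a handler that doesn't match it is a compile error,
                  -- and the same type can drive a generated client for the frontend.
                  type API = "users" :> Get '[JSON] [User]
                        :<|> "users" :> Capture "name" String :> Get '[JSON] User

                  server :: Server API
                  server = pure [User "alice" 30]
                      :<|> \n -> pure (User n 0)

                  main :: IO ()
                  main = run 8080 (serve (Proxy :: Proxy API) server)
                  ```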

                  I’m no fanboy. Haskell has plenty of issues. But it doesn’t have the issues you mentioned.

                  1. 1

                    Right. Most of my Haskell experience is dated: from over five years ago, and the codebase is proprietary, so there are few specifics I can remember. I’m definitely not the best person to write on the subject. In any case, I’ve rewritten the Haskell section of the article, with more details. Thanks.

                    1. 6

                      By my definition, a “dying language” is one that is losing popularity or losing interest. For Haskell this is absolutely not clear. Also, your section is about “why Haskell is bad”, not “why it is dying”. People do not talk about Haskell as much as they used to, in my opinion, but I still see a lot of activity in the Haskell ecosystem, and it doesn’t really look like it’s dying.

                      I think it is easier to agree about Clojure dying looking at Google trends for example: https://trends.google.com/trends/explore?cat=5&date=all&geo=US&q=haskell,clojure

                      But Haskell looks more like a language that will never die but still probably never become mainstream.

                      1. 5

                        I’m definitely not the best person to write on the subject. In any case, I’ve rewritten the Haskell section of the article, with more details. Thanks.

                        Great! Although there are still many claims that are factually untrue.

                        I think this is just a sign that you’ve been away from the community for many years now, and don’t see movement on the things that were hot 5-10 years ago. Like “The high-brow crowd was obsessed with transactional memory, parser combinators, and lenses.” Well, that’s over. We figured out lenses and have great libraries; we figured out parser combinators and have great libraries. The problems people are tackling now for those packages are engineering problems, not so much science problems. Like: how do we have lenses and good type errors? And there, we’ve had awesome progress lately with custom error messages (https://kodimensional.dev/type-errors) that you would not have seen 5 years ago.
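
                        The mechanism behind those custom messages ships with GHC itself (GHC.TypeLits.TypeError); a tiny sketch of the idea:

                        ```haskell
                        {-# LANGUAGE DataKinds            #-}
                        {-# LANGUAGE FlexibleContexts     #-}
                        {-# LANGUAGE TypeOperators        #-}
                        {-# LANGUAGE UndecidableInstances #-}

                        import GHC.TypeLits (ErrorMessage (..), TypeError)

                        -- Replace "No instance for (Num Bool)" with a domain-specific message.
                        instance TypeError ('Text "Booleans are not numbers."
                                            ':$$: 'Text "Did you mean (==) or (&&)?")
                              => Num Bool

                        -- Uncommenting this makes GHC print the custom message above:
                        -- oops = True + False
                        ```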

                        The science moved on to other problems.

                        The issue is that different extensions interact in subtle ways to produce bugs, and it’s very difficult to tell if a new language extension will play well with the others (it often doesn’t, until all the bugs are squashed, which can take a few years).

                        This still isn’t true at all. As for the release cadence of GHC, again, things have advanced amazingly. New test environments and investments have resulted in regular GHC releases. We see several per year now!

                        In Atom, the Haskell addon was terrible, and even today, in VSCode, the Haskell extension is among the most buggy language plugins.

                        That was true a year ago, it is not true today. HLS merged all efforts into a single cross-editor package that works beautifully. All the basic IDE functionality you would want is a solved problem now, the community is moving on to fun things like code transformations.

                        Then there’s Liquid Haskell that allows you to pepper your Haskell code with invariants that it will check using Z3. Unfortunately, it is very limited in what it can do: good luck checking your monadic combinator library with LH

                        Not true for about 3 years. For example: https://github.com/ucsd-progsys/liquidhaskell/blob/26fe1c3855706d7e87e4811a6c4d963d8d10928c/tests/pos/ReWrite7.hs
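
                        For readers who haven’t seen it, LH refinements are ordinary special comments checked by an SMT solver; the canonical beginner example (not the monadic-combinator case being argued about here) looks roughly like:

                        ```haskell
                        -- Refinements live in {-@ ... @-} comments and are checked with Z3.
                        {-@ type NonZero = {v:Int | v /= 0} @-}

                        {-@ safeDiv :: Int -> NonZero -> Int @-}
                        safeDiv :: Int -> Int -> Int
                        safeDiv x y = x `div` y

                        ok :: Int
                        ok = safeDiv 10 2        -- accepted by the checker

                        -- bad = safeDiv 10 0    -- rejected: 0 does not satisfy v /= 0
                        ```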

                        The worst case plays out as follows: the typechecker hangs or crashes, and you’re on the issue tracker searching for the issue; if you’re lucky, you’ll find a bug filed using 50~60% of the language extensions you used in your program, and you’re not sure if it’s the same issue; you file a new issue. In either case, your work has been halted.

                        In 15 years of using Haskell I have never run into anything like this. It is not the common experience. My code is extremely heavy and uses many features only available in the latest compiler, with 20-30 extensions enabled. Yet this just doesn’t happen.

                        There is almost zero documentation on language extensions. Hell, you can’t even find the list of available language extensions with some description on any wiki.

                        Every single version of GHC has come with a list of the extensions available, all of which have a description, most of which have code: https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/glasgow_exts.html You can link to the manual that neatly explains everything, rather than to the git repo.

                        Looking at the big picture: first, this is a poor way to do software development; as the number of language extensions increase, your testing burden increases exponentially.

                        This is only true if you can’t prove how extensions interact, or more fundamentally, that they don’t interact.

                        Second, the problem of having a good type system is already solved by a simple dependent type theory; you study the core, and every new feature is just a small delta that fits in nicely with the overall model.

                        That’s totally untrue. There is no such general-purpose language today. We have no idea how to build one.

                        As opposed to having to read detailed papers on each new language extension. And yes, there’s a good chance that very few people will be able to understand your code if you’re using some esoteric extensions.

                        Again, that’s just not true. You don’t need to know how the extensions are implemented. I have not read a paper on any of the extensions I use all the time.

                        In summary, language extensions are complicated hacks to compensate for the poverty of Haskell’s type system.

                        That’s just the wrong way to look at language extensions. Haskell adds features with extensions because the design is so good. Other languages extend the language forcing you into some variant of it because their core is too brittle and needs fundamental changes. Haskell’s core is so solid we don’t need to break it.

                        However, PL research has shifted away from Haskell for the most part

                        That’s again totally factually untrue. Just look at Google Scholar, the number of Haskell papers per year is up, not down. The size of the Haskell workshop at ICFP is the same as 5 years ago.

                        Moreover, there are no tools to help you debug the most notorious kind of bug seen in a complicated codebase: memory blowups caused by laziness.

                        Again, that’s not factually true.

                        We have had a heap profiler for two decades, in the past few years we got ThreadScope to watch processes in real time. We have systematic ways to find such leaks quickly, you just limit the GC to break when leaks happen. https://github.com/ndmitchell/spaceleak We also got stack traces in the past few years so we can locate where issues come from. In the past few years we got Strict and StrictData.
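
                        The textbook example of the kind of blowup being discussed, and the boring fix (numbers arbitrary; the flags in the comment are just plain GHC profiling):

                        ```haskell
                        import Data.List (foldl')

                        -- Lazy foldl builds ten million (+) thunks before anything is forced:
                        -- the classic laziness-induced memory blowup.
                        leaky :: Int
                        leaky = foldl (+) 0 [1 .. 10000000]

                        -- foldl' (or BangPatterns / StrictData on accumulator fields) keeps the
                        -- accumulator evaluated, so the heap stays flat.
                        fine :: Int
                        fine = foldl' (+) 0 [1 .. 10000000]

                        main :: IO ()
                        main = print fine
                        -- Heap profiles come from GHC itself: build with -prof and run the
                        -- binary with +RTS -hc to get a heap profile.
                        ```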

                        As for the code examples. I can pick 2 lines of any language out of context and you’ll have no idea what they do.

                        Who cares what every extension does for every example? That’s the whole point! I have literally never looked at a piece of Haskell code and wondered what an extension does. I don’t need to know. GHC tells me when I need to add an extension and it tells me when an extension is unused.

                        How many more language features are missing?

                        Extensions are not missing language features.

                      2. 1

                        GHC also doesn’t contain bad hacks for dependent types, avoiding this is exactly why building out dependent types is taking time.

                        Honestly, I’d much rather prefer a simple core model, like that of HoTT.

                        1. 3

                          Honestly, I’d much rather prefer a simple core model, like that of HoTT.

                          I’d love that too! As would everyone!

                          But the reality is, we don’t know how to do that. We don’t even know how to best represent computations in HoTT. It might be decades before we have a viable programming language. We do have dependent types that work well in Haskell today, that I can deploy to prod, and that prevent countless bugs while making code far easier to write.

                          1. 1

                            I think HoTT with computations is “cubical type theory”? It’s very active currently.

                            As for the dependent types as the backend for advanced type level features, I think it’s what Dotty/scala 3 is about. It’s definitely not the only way to do it, but it’s also not decades away. Idris 2 is also an interesting effort.

                      3. 4

                        Dependent types aren’t that useful for production software, and full blown dependent types are really contrary to the goals of Haskell in a lot of ways. Any language that’s >20 years old (basically 30) is gonna have some band-aids. I’m not convinced that Haskell is waning in any meaningful way except that people don’t hype it as much on here/hn. Less hype and more doing is a good thing, imho.

                        1. 3

                          Reminds me of the days when people said FP and complete immutability weren’t useful for production software. It is true that there is no decent general-purpose language that implements dependent types, but that’s beside the point.

                          It’s true, hype is a poor measure.

                          1. 4

                            Yeah, that’s an interesting comparison, but I think it’s a totally different situation. Immutability and dependent types are both things you use to make certain assumptions about your code. Immutability lets you know that some underlying value won’t change; dependent types allow you to make more general statements/proofs of some invariant. The big difference is that immutability is a simplification: you’re removing complexity by asserting some assumption throughout your code. Dependent types, generally, are adding complexity: you have to provide proofs of some statement externally, or you have to build the proof of your invariants intrinsically into your constructions.

                            IMHO, that’s a huge difference in the power-to-weight ratio of these two tools. Immutability is really powerful and fairly lightweight. Dependent types are not really that powerful and incredibly heavy. I’m not saying dependent types are worthless - sometimes you really, really want that formal verification (e.g. compilers, cryptography, etc.) - but the vast majority of code doesn’t need it, and you’re just adding complexity, something I think should be avoided in production software.
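
                            To put a concrete (if toy) Haskell-flavoured picture on that extra weight, with made-up names: the happy path below type-checks for free because the recursion mirrors the type family, but the moment you write, say, an accumulator-based reverse, you owe GHC proofs of facts like n + 'Z ~ n, and that proof obligation is exactly the added complexity being described:

                            ```haskell
                            {-# LANGUAGE DataKinds      #-}
                            {-# LANGUAGE GADTs          #-}
                            {-# LANGUAGE KindSignatures #-}
                            {-# LANGUAGE TypeFamilies   #-}
                            {-# LANGUAGE TypeOperators  #-}

                            data Nat = Z | S Nat

                            data Vec (n :: Nat) a where
                              Nil  :: Vec 'Z a
                              Cons :: a -> Vec n a -> Vec ('S n) a

                            type family (m :: Nat) + (n :: Nat) :: Nat where
                              'Z   + n = n
                              'S m + n = 'S (m + n)

                            -- Accepted as-is: the "length of the result is m + n" invariant is free,
                            -- because the value-level recursion mirrors the type family.
                            append :: Vec m a -> Vec n a -> Vec (m + n) a
                            append Nil         ys = ys
                            append (Cons x xs) ys = Cons x (append xs ys)
                            ```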

                            1. 3

                              Tl;dr: I have a good amount of experience with dependently typed languages, and I write Haskell for a living. After all of my experience, I have come to the conclusion that dependent types are overhyped.

                              1. 1

                                I’ve started writing a post on dependent types. Here’s early draft: https://artagnon.com/articles/dtt

                              2. 3

                                What about Ada?

                        1. 6

                          This is by far my favorite version control system. Unlike git, I find branching in darcs intuitive and easy to understand.

                          1. 6

                            Darcs doesn’t support branching in the same sense as Git, so I’m not sure that’s a fair comparison.

                            The “every repo is a branch” and “Master and Working Repositories” workflows also work in Git: clone the upstream repo, then clone local “branch” repos from it. When you’re done, push to the local “master”, and then eventually back to upstream. You’d miss out on a lot of Git functionality, but it should work.

                            Personally, I think the Darcs way of doing it is really annoying. I like being able to keep around experimental branches, WIP features and bug fixes, etc. without cluttering my file system. And I like being able to push them all somewhere and clone them in different places with a single “git clone …”. I know under the hood Git’s keeping all that data around, but it’s hidden away in .git where I don’t have to think about it.

                            1. 5

                              I’d love to hear more about how darcs’ branching is different than git’s. Care to give us some more insight?

                              1. 2

                                Indeed. Unlike git, I never find myself needing to rm -rf . my darcs repo.

                                1. 9

                                  I’ve never needed to do that to a git repo either

                                  1. 1

                                    Neither have I. Hopefully I can at least encourage people to learn/use git reflog. It’s rare that you need it, but it comes in super handy when you’ve made a mistake.

                                2. 1

                                  Do you by chance know how darcs supports big non-text files?

                                  1. 3

                                    It’s been a while, but as far as I remember, Darcs generally works in a way a bit different from git: it downloads all patch metadata, figures out which patches it needs to reconstruct the current work tree and then downloads the data. So binaries are at least not part of the bundle you usually work with.

                                    AFAIK, it works quite well with them. I know it’s been a promoted strong-point of pijul.

                                1. 7

                                  Looking at the changelog for the latest Darcs version, 2.16.1 (emphasis mine):

                                  Preliminary UNSTABLE support for a new patch theory named “darcs-3”, largely based on the pioneering work of Ian Lynagh for ‘camp’.

                                  Please note that this format is not yet officially supported: some features (like conversion from older formats) are still missing, and we have not yet finalized the on-disk format. You should NOT use it for any serious work yet.

                                  The new theory finally solves all the well-known consistency problems that plagued the earlier ones, and thus fixes a number of issues (including issue1401 and issue2605) that have been outstanding for many years. It also reduces the worst case asymptotic runtime for commutation and merging from exponential to merely quadratic in the number of patches involved.

                                  One of the reasons we are confident this new theory and its implementation is sound, i.e. respect all required properties, is that we have improved our test case generator for sequences of patches. It now generates all possible conflict scenarios. Since the new theory no longer has worst case exponential runtime, we can and did test all required properties and invariants with a large number of generated test cases (up to 100000).

                                  Can somebody report how these problems manifest in real life and what the workarounds are? Does it mean that using darcs now, i.e. before the “darcs-3” patch theory becomes mature, is dangerous?

                                  1. 6

                                    Wait, this seems like huge news to me!?

                                    largely based on the pioneering work of Ian Lynagh for ‘camp’.

                                    Can’t find anything more recent than 2011… is there a paper?

                                    1. 3

                                      Darcs had/has a “poison patch” issue where one patch may trigger very long checkout times, up to a point where it would render your repo useless. Most people would never have that problem, but it did happen to me back in 2004, and I then switched to Mercurial after trying Monotone and Git. You may want to check out Pijul, as it’s the spiritual successor of Darcs.

                                      1. 2

                                        Uh, I’m trying to remember. If I remember right, darcs had an issue with branches that drifted too far away from each other. As darcs tracks which patches are needed to apply a certain patch, this led to problematic behaviour. From my experience, that was unlikely, but likely enough that it became an issue, especially on large repositories. There are fixes for that, but it’s a space where you need deep knowledge of darcs, which kind of goes against the ethos of the project of going the extra mile to make its technology accessible. (For the last part: I lurked around darcs for a while and tried contributing 1-2 patches.)

                                        1. 2

                                          There are two issues:

                                          1. When a conflict between Alice and Bob is solved in two different ways by Alice and by Bob, this generates a new conflict. If they keep doing that n times, applying the patches in Darcs used to take time 2^n. Solving that is really cool, because AFAIK no one really understood until recently what Darcs was doing in that case.

                                          2. Conflicts are a little strange in Darcs, since there is no notion of “state”, so coming back to a conflicting situation (with e.g. darcs revert, or darcs rollback) is always going to be a little bit weird.

                                          Solving point 1 is big news.

                                        1. 0

                                          As someone who is in V’s Discord every day being constantly blown away at the progress being made, I am shocked at the level of dishonesty that this strangely anti-V hit piece achieves.

                                          In particular, the degree of cherry-picking (and often then still misrepresenting) a few facts in order to make V appear in the worst possible light is truly breathtaking.

                                          She cites vlang.io saying

                                          V can be bootstrapped in under a second by compiling its code translated to C with a simple

                                          cc v.c

                                          No libraries or dependencies needed.

                                          then argues against it, preposterously, by saying in part,

                                          Git is a dependency, which means perl is a dependency, which means a shell is a dependency, which means glibc is a dependency, which means that a lot of other things (including posix threads) are also dependencies. …

                                          Downloading a .c source file requires git? Does this person know what a “dependency” is? Should JavaScript developers include depending upon the laws of physics in package.json?

                                          Amusingly, the documentation still claims that memory management is both a work in progress and has perfect accuracy for cleaning up things at compile time.

                                          No, the documentation correctly says that memory management is a work in progress, and also that, once completed, will clean up after itself in much the way that Rust does.

                                          An Honest Depiction of Progress

                                          Here are the combined release notes from all of the V releases since December:

                                          Release 0.1.23:

                                          - [Direct x64 machine code generation](https://github.com/vlang/v/issues/2849). Hello world being built in 3 milliseconds.
                                          - Bare metal support via the `-freestanding` flag, allowing to build programs without linking to libc.
                                          - Prebuilt V packages for Linux, macOS, and Windows.
                                          - `string.index()` now returns `?int` instead of `int/-1`.
                                          - Lots of fixes in Generics.
                                          - vweb framework for developing web applications is back.
                                          - Vorum, the forum/blogging software written in V/vweb, can now be compiled and has been added to CI.
                                          - REPL, `v up` have been split up into separate applications to keep the core V compiler small.
                                          - V now enforces short enum syntax (`.green` instead of `Color.green`) when it's enough.
                                          - V UI for macOS.
                                          - Interfaces have been rewritten. `[]interface` support.
                                          - `os.cp()` for copying files and directories.
                                          - Additional compile-time flags: `$if clang, msvc, mingw, x32, x64, big_endian, little_endian {`.
                                          - All C functions now have to be declared, all missing C functions have been defined.
                                          - Global variables (only with the `--enable-globals` flag) for low level applications like kernels and drivers.
                                          - Nothing can be cast to bool (previously code like `if bool(1) {` worked).
                                          - `<<` and `>>` now work with all integer types.
                                          - V detects Cygwin and shows an error. (V supports Windows natively)
                                          - Improved type checking of some operators (`%, |, &` etc).
                                          - Windows 7 support.
                                          - `println(true)` now prints `true` instead of `1`.
                                          - `os.exec()` now uses `CreateProcess` on Windows.
                                          - fast.vlang.io website for monitoring the performance of V after every commit.
                                          - On Windows Visual Studio is now used automatically if GCC is not installed.
                                          - vfmt!
                                          - Lots of cleaning up in the compiler code.
                                          - Multi-level pointers in unsafe code (`****int`).
                                          - MSVC backtrace.
                                          - `$if os {` blocks are now skipped on a different OS.
                                          - C string literals (`c'hello'`).
                                          - AlpineLinux/musl fixes + added to CI.
                                          - Inline assembly.
                                          - Clipboard module (Windows, macOS, X).
                                          - `foo()?` syntax for error propagation.
                                          - Docs have been migrated from HTML to `doc/docs.md`.
                                          - `eventbus` module.
                                          - Haiku OS support.
                                          - `malloc/free` on bare metal.
                                          - `utf8` helper functions (`to_lower()`, `to_upper()`, etc).
                                          - Optimization of `for c in str {`.
                                          - `string/array.left/right/slice/substr` were removed (`[a..b]` slicing syntax should be used instead).
                                          

                                          Release 0.1.24:

                                          - A new parser/generator built on top of an AST that simplifies code greatly and allows to implement new
                                            backends much faster.
                                          - Sum types (`type Expr = IfExpr | MatchExpr | IntegerLiteral`).
                                          - B-tree map (sped up the V compiler by ~10%).
                                          - `v fmt -w`.
                                          - The entire code base has been formatted with vfmt.
                                          - Generic structs.
                                          - SDL module.
                                          - Arrays of pointers.
                                          - os: `is_link()`, `is_dir()`, `exists()`.
                                          - Ranging through fixed size arrays.
                                          - Lots of fixes in ORM and vweb.
                                          - The first tutorial: [building a simple web application with vweb](https://github.com/vlang/v/blob/master/tutorials/building-a-simple-web-blog-with-vweb.md).
                                          - Match expressions now must be exhaustive.
                                          - freestanding: `malloc()`/`free()`.
                                          - `++` is now required instead of `+= 1` for consistency.
                                          - Interpolated strings now allow function calls: `println('val = $get_val()')`.
                                          - `string.replace_each([])` for an efficient replacement of multiple values.
                                          - More utf8 helper functions.
                                          - `-prealloc` option for block allocations.
                                          - `type` aliases.
                                          - Running `v` with an unknown command will result in an error.
                                          - `atof` implementation in pure V.
                                          - Enums can now have negative values.
                                          - New `filepath` module.
                                          - `math.factorial`.
                                          - `ftp` module.
                                          - New syntax for casting: `val as Type`.
                                          - Fewer libc functions used (soon V will have no dependency on libc).
                                          

                                          Release 0.1.27:

                                          - `vfmt` has been re-written from scratch using the new AST parser. It's much faster, cleaner, and can format
                                          files with compilation errors.
                                          - `strconv`, `sprintf`, and `printf` in native V, without any libc calls.
                                          - Interfaces are now a lot more stable and have all expected features.
                                          - Lots of x64 backend improvements: function calls, if expressions, for loops, local variables.
                                          - `map()` and `filter()` methods can now be chained.
                                          - New `[]int{cap:cap, len:len}` syntax for initializing array length and capacity.
                                          - New `is` keyword for checking the type of sum types and interfaces.
                                          - `as` can now be used to cast interfaces and sum types.
                                          - Profiling with `-profile`. Prints a nice table with detailed information about every single function call:
                                          number of calls, average time per call, total time per function.
                                          - `import(xxx)` syntax has been removed in favor of `import xxx` for simplicity and greppability.
                                          - Lots of fixes and improvements in the type checker.
                                          - `time.StopWatch`
                                          - `dl` module for dynamic loading.
                                          - Automatic `str()` method generation for every single type, including all arrays and fixed size arrays.
                                          - Short struct initialization syntax for imitating named function args: `foo(bar:0, baz:1)`.
                                          - New operator `!in`.
                                          - Performance improvements in critical parts of the builtin data structures (array, map).
                                          - High order functions improvements (functions can now be returned etc).
                                          - Anonymous functions that can be defined inside other functions.
                                          - Built-in JSON module is back.
                                          - Closures.
                                          - Lots and lots of new tests added, including output tests that test error messages.
                                          - Multiple errors are now printed, the compiler no longer stops after the first error.
                                          - The new JS backend using the AST parser (almost complete).
                                          - Variadic functions.
                                          - `net.websocket` module (early stage).
                                          - `vlib` is now memory leak free, lots of `autofree` improvements.
                                          - Simplified and cleaned up `cmd/v`, `v.builder`.
                                          - V UI was updated to work with the new backend.
                                          

                                          After she COMPLETELY ignores the MASSIVE progress made (more than 3000 commits’ worth) from a brilliant and fiercely dedicated team – and judges the current state of V based exclusively on misunderstandings, nitpicks, and on its memory management status after acknowledging that it’s not done yet and that the language is in an alpha state – she snarkily ends with:

                                          Overall, V looks like it is making about as much progress as I had figured it would.

                                          This is almost as bad as the quote she ended with in her previous post on V:

                                          Don’t ever, ever try to lie to the Internet, because they will catch you. …


                                          Honesty, Please!

                                          If you want to know how well V is actually progressing, try it yourself, check out the Discord, look on GitHub, but whatever you do, do not focus on ignorant, dishonest, cherry-picked commentary from haters; that doesn’t serve anyone well, and is incredibly unfair to those who are pouring themselves into this important project.


                                          The Brilliance of V

                                          After my 11 years of programming, including 9.5 of programming in Go (which is the most similar language to V), I consider V to easily be the best-designed programming language that exists.

                                          Yes, it’s learned a lot from Go and C, and maybe Lisp people prefer Lisp, but V successfully combines the simplicity of Go, the programmer ergonomics of Python, the speed of C, and almost as many safety guarantees as Rust (once V has finished implementing these latter aspects, of course!).

                                          What I thought would take the V team 5 years to implement has taken less than 1 year. Alex (V’s creator) thought it would take even less time, and now he’s being unfairly raked over the coals for setting extremely ambitious timelines while the same naysayers and bullies ignore everything that has been done.


                                          V Resources

                                          Website (including code examples): https://vlang.io/

                                          GitHub: https://github.com/vlang/v

                                          Wiki page explaining why C is used as an intermediate representation rather than LLVM (another brilliant move that allows V to build on the shoulders of giants and avoid reinventing the wheel in order to bootstrap a new language, but a move that is misunderstood and absurdly used to argue against V for doing things differently/better): https://github.com/vlang/v/wiki/On-the-benefits-of-using-C-as-a-language-backend

                                          1. 25

                                            I understand that you have strong feelings for your language of choice. Nonetheless, language designers are not entitled to a community, nor are they entitled to shelter from criticism. Indeed, one of the most important parts of programming language design is rejecting new languages based on showstoppingly-unacceptable design choices.

                                            V does not offer any compelling design choices. Its advertised features can be sorted into libraries, compiler/toolchain offerings, and roughly the level of safety advertised in the 80s when memory-safety was still controversial. Just like Go, V has not learned many lessons, and refuses to offer programmers a more interesting way to express themselves. Even if V were literally Go but better, this would be sufficient to damn it.

                                            I understand that you might not like it when people point out that the release dates keep slipping; I think it’s curious that you are willing to link to V’s wiki and source code, but not to bumping release dates.

                                            As a language designer, I think that it is important to not advertise what you don’t yet have written. Monte has had one release, a developer preview, and we are intending to complete another iteration of bootstrapping before even considering another release. We know that almost every feature that typical end users will want is not yet written, and so we are not loudly advertising our offering as usable for everyday general-purpose programming, regardless of how much everyday general-purpose programming I or anybody else actually achieves with it.

                                            I consider V to easily be the best-designed programming language that exists.

                                            What’s your favorite ML? I have lately played with OCaml. There are entire universes of language designs which I suspect that you have yet to explore.

                                            1. -4

                                              Just like Go, V has not learned many lessons, and refuses to offer programmers a more interesting way to express themselves.

                                              FP diehards will never understand why Go has been so wildly successful – and V will be even more successful than Go.

                                              V is like Go but fixes all ~10 things wrong with it, providing a lot more flexibility due to its generic functions, generic structs, generic channels (still in the works), sum types, and TypeScript-style interfaces (also still partially in the works).

                                              Plus there’s the raw speed factor; V is translated to C before being compiled to machine code, cleverly piggybacking on decades of speed optimizations made by gcc/clang/tcc/etc.

                                              The simplicity of Go or Python + almost as much safety as Rust + almost exactly as much speed as C + a flexible type system + familiar syntax == a winning combination, I insist!

                                              1. 16

                                                The simplicity of Go or Python + almost as much safety as Rust + almost exactly as much speed as C + a flexible type system + familiar syntax == a winning combination, I insist!

                                                Except all of these are “some time in the future”, and widely incompatible with one another. There’s nothing to support any of these claims. What’s the design for “almost as much safety as rust” (without GC, of course)? The whole thing only just got an AST, and we’re supposed to believe it’s going to be revolutionary? There’s been a lot of grand promises with release dates being pushed back repeatedly, but nothing specific about how the promises will actually be achieved. Making a good language is hard, it takes years (if not decades), and you can’t just magically make something both simple, fast, safe, gc-free, etc. in a field where it’s known that some tradeoffs are inevitable.

                                                1. -3

                                                  Except all of these are “some time in the future”, and widely incompatible with one another.

                                                  Nope, totally wrong. The simplicity is there, the speed is there, the flexible type system is there, and the familiar syntax is there. A safe subset of C is generated then compiled but not all the safety guarantees are implemented yet.

                                                  There’s been a lot of grand promises with release dates being pushed back repeatedly

                                                  V is the first software project in history to be finished later than originally intended ;-).

                                                  The whole thing only just got an AST

                                                  Completely false; V has always had an AST. The AST-related thing that’s new is representing the generated C code as an AST before outputting it.

                                                  … you can’t just magically make something both simple, fast, safe, gc-free, etc. in a field where it’s known that some tradeoffs are inevitable.

                                                  The big “a-ha” moment for me was this: I now realize that I had falsely assumed that, just because prior languages made certain trade-offs, it was impossible to check all these boxes at once. But I was wrong.

                                                  The only inherent tension between any of the things I listed is between simplicity and flexibility. But as I said in the comment you’re replying to,

                                                  V is like Go but fixes all ~10 things wrong with it, providing a lot more flexibility due to its generic functions, generic structs, generic channels (still in the works), sum types, and TypeScript-style interfaces (also still partially in the works).

                                                  The limiting factor is not some innate impossibility of making a language that is simple, fast, and safe. The limiting factor is creativity. But V has learned much from Go, Rust, Python, and other languages, and has unique insights of its own (like its error handling!).

                                                  New things are, in fact, possible… and spelled out in detail on the website and in the docs, in this case. See for yourself: https://github.com/vlang/v/blob/master/doc/docs.md .

                                                  1. 9

                                                    the flexible type system is there

                                                    hydraz below convincingly demonstrated that if function calls involve generic types, types are not checked at all(!) in current V. How can you say the type system is “there”? I guess it is “there” in terms of code generation, but if you are not checking types, saying the type system is there is at best deceptive.

                                                    1. -4

                                                      How can you say type system is “there”?

                                                      …because there are types you can define and instantiate and do all the usual things that programming languages let you do with types…

                                                      hydraz said,

                                                      type errors for parameters in functions with a <T> slapped on them are still silently ignored…

                                                      Silently ignored? If you use a generic type in a way that’s invalid then the program won’t compile (yes, during the C -> machine code step).

                                                      1. 9

                                                        I think you need to read my comments - and indeed, the compiler code that I linked - again. V has roughly no type system at all. The function, foo, that I wrote, isn’t generic!

                                                        • It does have a <T>, but there’s nothing to infer that T from (This should be a type error. It isn’t)
                                                        • It takes a string, but I can give it an int, and this should be an error, but the compiler has code specifically for silently ignoring these errors.
                                                    2. 8

                                                      spelled out in detail

                                                        Let’s see memory management: there’s no explanation, just claims that there are (or will be, it’s WIP after all) no leaks, no GC, no refcounting, but also no manual memory management (it’s hardly leak-free, after all, even in Rust). What magic justifies that? Is there just no dynamic allocation? Otherwise I’d like to see the papers and the subsequent Turing award for solving memory management once and for all.

                                                      As for the deadlines: the author of V has made ridiculous deadlines so many times, for no good reason (why promise something in a few weeks or months instead of just waiting for it to be polished?!). It’s not like open source projects are tied to pointy haired boss deadlines.

                                                  2. 13

                                                    Interestingly, I’m not an “FP diehard”; I come from an object-based tribe, and I work on object-based designs.

                                                    None of the listed improvements to V over Go are related to what makes Go bad; I have a thread from last year exploring the worst of Go’s flaws. In short, the problem isn’t even an upper limit on abilities, but a lower limit on how much code is required to do even the bare minimum, and a surprising lack of safety in common situations.

                                                    As the original post author and several others have repeatedly indicated throughout current and past discussion about V, the speed claims simply aren’t being substantiated in an open and reproducible configuration which the community can examine and test themselves. Simply changing the host language does not grant speed, unfortunately, because of interpretative overhead, and the only cure is putting effort into the compiler.

                                                    At the same time, it is toolchains and not languages that are fast, and so any novel speed improvements in V should be explicable to the rest of us. For example, in Monte, we use nanopass design and ANF style, originally explored in Scheme. We have a JIT in the Smalltalk/Java tradition, using an off-the-shelf toolkit, RPython.

                                                    As an aside, I would imagine that V would be able to more efficiently take advantage of GCC/LLVM/etc. by picking just one backend, and emitting code just for that backend. This would be due to C’s relatively poor guarantees about how memory will be used.

                                                    1. 5

                                                      V is translated to C before being compiled to machine code

                                                      That, right there, is enough for me to question any safety guarantee V offers (and I like C).

                                                      1. 4

                                                        Nim compiles to C and it’s garbage collected. I believe the reasons they do that are runtime reach and whole program optimization.

                                                        If you can statically guarantee safety it shouldn’t be a problem. (However, it’s not necessarily a trivial thing to suggest.)

                                                        1. 3

                                                          ATS compiles to C too, if I understand it correctly. There have also been Haskell compilers that compiled to C, along with many other programming languages that provide some aspects of safety that the underlying C language, like the machine code beneath it, does not provide.

                                                          1. 4

                                                            Why does using C as an intermediate language in the compilation process necessarily imply that a language’s safety guarantees are bad? Compilers that compile to some kind of bytecode - like rustc compiling to LLVM bitcode, or JVM languages’ compilers compiling to JVM bytecode - are perfectly capable of being safe, even though they output code in an unsafe language (which may or may not be the final compilation output - it is (I think) in the JVM case, but LLVM bitcode is further transformed into machine code for the specific target). I don’t see why C should be any different in this respect.
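
                                                            One way to see this: a front end can do all of its checking before any C is emitted, and whatever it cannot prove statically it can emit as explicit checks in the generated C. The sketch below is not V’s actual output, just a hand-written illustration of the general technique a compiler targeting C might use for bounds-checked indexing (the helper names are hypothetical):

                                                              #include <stdio.h>
                                                              #include <stdlib.h>

                                                              /* Hypothetical runtime helper such a compiler might ship with its output. */
                                                              static void panic_bounds(int idx, int len) {
                                                                  fprintf(stderr, "index %d out of range for array of length %d\n", idx, len);
                                                                  abort();
                                                              }

                                                              /* What a bounds-checked `a[i]` in the source language could lower to. */
                                                              static int checked_get(const int *a, int len, int idx) {
                                                                  if (idx < 0 || idx >= len) panic_bounds(idx, len);
                                                                  return a[idx];
                                                              }

                                                              int main(void) {
                                                                  int a[3] = {10, 20, 30};
                                                                  printf("%d\n", checked_get(a, 3, 2)); /* fine, prints 30 */
                                                                  printf("%d\n", checked_get(a, 3, 7)); /* aborts instead of reading past the array */
                                                                  return 0;
                                                              }

                                                            The unsafety of C is still underneath, but the emitted code never reaches it; the safety property lives in the front end, just as it does for compilers targeting LLVM bitcode or JVM bytecode.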

                                                            1. 3

                                                              I don’t know the guarantees of LLVM or the JVM, but at the language level, C has a ton of unspecified and undefined behaviors. Skipping the dangers around pointers and arrays, you still have the following undefined or implementation-defined behavior, as outlined in the C standard:

                                                              • shifting an integer by a negative value
                                                              • shifting an integer more than its size in bits
                                                              • left shifting a negative value
                                                              • signed integer representation (sign-magnitude, 1s complement, 2s complement [1])
                                                              • (quoting from the C99 standard for this one): Whether certain operators can generate negative zeros and whether a negative zero becomes a normal zero when stored in an object
                                                              • signed integer trap representations
                                                              • signed integer wrap semantics
                                                              • padding value
                                                              • padding in general
                                                              • reading a union member that wasn’t the last one written to

                                                              Now, it seems that V is targeting GCC/clang, but even so, you’ll get differences in behavior across different architectures, specifically with shifting (some architectures will mask the shift count, some won’t). When I see “safety” applied to a computer language, I would expect these issues to be spelled out so you know what to expect.
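
                                                              To make the shift point concrete, here is a minimal C sketch (the values are made up; the whole problem is that the result depends on the compiler and target):

                                                                #include <stdint.h>
                                                                #include <stdio.h>

                                                                int main(void) {
                                                                    uint32_t x = 1;
                                                                    unsigned n = 32; /* equal to the width of the type */

                                                                    /* Undefined behaviour in C: shifting by at least the width of the
                                                                       (promoted) operand. x86 masks 32-bit shift counts to 5 bits, so you
                                                                       may see 1; other targets may give 0; an optimizer may assume the
                                                                       expression is never evaluated at all. */
                                                                    printf("%u\n", (unsigned)(x << n));

                                                                    /* One defined alternative: guard the count explicitly. */
                                                                    uint32_t y = (n < 32) ? (x << n) : 0;
                                                                    printf("%u\n", (unsigned)y);
                                                                    return 0;
                                                                }

                                                              A language that claims safety on top of C either has to rule cases like this out in its own checker or document what its generated C actually does; the C standard alone won’t answer it.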

                                                              [1] In my research, there aren’t many systems in use today that are not 2s complement. They are:

                                                              • Unisys 1100/2200
                                                              • Unisys ClearPath A
                                                              • IBM 700/7000 series

                                                              I know one of the Unisys systems is still produced today and has a C compiler (which one, I don’t recall; I think the 1100/2200).

                                                              1. 2

                                                                you do realize that source code gets compiled to machine code, which is not safe by definition.

                                                                The generated C code doesn’t use any of these, and doesn’t have to.

                                                                1. 3

                                                                  What’s your definition of “safe”, then? There is far less that’s undefined in assembly than in C. Give me an architecture, and I can look up what it does for the above list. The reason C has so much undefined behavior is precisely because it runs on many architectures and the designers of C didn’t want to favor one over the other. Different languages can make different trade-offs.

                                                          2. 5

                                                            FP diehards will never understand why Go has been so wildly successful – and V will be even more successful than Go.

                                                            Do you? Go succeeded because it was created and backed by veteran Bell Labs people and Google, to solve existing problems. I’m not talking about marketing only, but also the level of sophistication and simplicity those people were able to bring in.

                                                            It also succeeded because it didn’t promise anything it didn’t deliver.

                                                            1. -4

                                                              Yes. I spotted Go as great technology in November of 2010. Go is simple and fairly powerful considering that simplicity.

                                                              The original version of V was written in Go, and V has learned many lessons from Go: both from its strengths, which V builds on, and from the weaknesses it shores up with generic functions, generic structs, sum types, and more.

                                                        2. 14

                                                          I’d be interested in seeing what these “lots of fixes in Generics” are, because as far as I can tell from reading the compiler source code, type errors for parameters in functions with a <T> slapped on them are still silently ignored…

                                                            if !c.check_types(typ, arg.typ) {
                                                              // str method, allow type with str method if fn arg is string
                                                              if arg_typ_sym.kind == .string && typ_sym.has_method('str') {
                                                                // note: str method can return anything. will just explode in the C compiler -- hydraz
                                                                continue
                                                              }
                                                              if typ_sym.kind == .void && arg_typ_sym.kind == .string {
                                                                continue
                                                              }
                                                              if f.is_generic {
                                                                // ignore errors in functions with a <T> -- hydraz
                                                                continue
                                                              }
                                                              if typ_sym.kind == .array_fixed {
                                                              }
                                                          

                                                          Try this code:

                                                          fn  foo<T>(y string) int {
                                                            return 0
                                                          }
                                                          
                                                          fn main() {
                                                            foo(123)
                                                          }
                                                          
                                                          1. -4

                                                            You had foo take a string then passed in an int :-)

                                                            EDIT: This works, for example:

                                                            fn foo<T>(y string) int {
                                                              return 0
                                                            }
                                                            
                                                            fn main() {
                                                              println(foo<int>('hi'))
                                                            }
                                                            
                                                            1. 23

                                                              … Yes, that’s my point. I passed an int to a string parameter, and the V compiler didn’t give a type error: the C compiler did.

                                                              % make
                                                              cd ./vc && git clean -xf && git pull --quiet
                                                              cd /var/tmp/tcc && git clean -xf && git pull --quiet
                                                              cc  -g -std=gnu11 -w -o v ./vc/v.c  -lm -lpthread
                                                              ./v self
                                                              V self compiling ...
                                                              make modules
                                                              make[1]: Entering directory '/home/abby/Projects/v'
                                                              #./v build module vlib/builtin > /dev/null
                                                              #./v build module vlib/strings > /dev/null
                                                              #./v build module vlib/strconv > /dev/null
                                                              make[1]: Leaving directory '/home/abby/Projects/v'
                                                              V has been successfully built
                                                              V 0.1.27 b806fff
                                                              
                                                              % ./v test.v
                                                              ==================
                                                              /home/abby/.cache/v/test.tmp.c: In function ‘main’:
                                                              /home/abby/.cache/v/test.tmp.c:9476:2: error: implicit declaration of function ‘foo’ [-Werror=implicit-function-declaration]
                                                               9476 |  foo(123);
                                                                    |  ^~~
                                                              /home/abby/.cache/v/test.tmp.c: In function ‘vcalloc’:
                                                              /home/abby/.cache/v/test.tmp.c:4597:1: warning: control reaches end of non-void function [-Wreturn-type]
                                                               4597 | }
                                                                    | ^
                                                              /home/abby/.cache/v/test.tmp.c: In function ‘byte_is_white’:
                                                              /home/abby/.cache/v/test.tmp.c:7227:1: warning: control reaches end of non-void function [-Wreturn-type]
                                                               7227 | }
                                                                    | ^
                                                              ...
                                                              ==================
                                                              (Use `v -cg` to print the entire error message)
                                                              
                                                              builder error: 
                                                              ==================
                                                              C error. This should never happen.
                                                              
                                                          2. 17

                                                            Author of the post here, let me see if I can try to clear some things up.

                                                            She cites vlang.io saying

                                                            V can be bootstrapped in under a second by compiling its code translated to C with a simple

                                                            cc v.c

                                                            No libraries or dependencies needed.

                                                            then argues against it, preposterously, by saying in part,

                                                            Git is a dependency, which means perl is a dependency, which means a shell is a dependency, which means glibc is a dependency, which means that a lot of other things (including posix threads) are also dependencies. …

                                                            Downloading a .c source file requires git? Does this person know what a “dependency” is? Should JavaScript developers include depending upon the laws of physics in package.json?

                                                            Okay I was being a bit unfair, but if we look at the makefile we see that it has the following dependencies:

                                                            • make (which depends on perl, glibc, autotools and all that nonsense)
                                                            • git (which depends on perl (even at runtime), glibc, autotools and all that nonsense)
                                                            • gcc (which depends on perl, glibc, autotools, automake, autoconf and more libraries than I care to list right now)

                                                            So if you want to be completely honest, even if you cut out the make and git steps (which I care about as someone who builds packages for Linux boxes using the unmodified build system as much as possible, so I can maintain whatever shred of sanity I have left), it still depends on a C compiler to get bootstrapped. This is a dependency. Then you have to download the bootstrap file from somewhere, which requires dependencies in terms of root certificates and the compiler to bootstrap with (not to mention the server that hosts the bootstrap file both existing and serving the right file back). Given that V in its current form requires you to download files from the internet in order to build it, mathematically it cannot be dependency-free (this actually precludes it from being packaged in NixOS, because NixOS doesn’t allow package builds to access the network; all tarballs/assets need to be explicitly fetched outside the build with fetchgit, fetchurl and similar). Pedantically, requiring someone to have an internet connection is a dependency.

                                                            Pedantically, the v binary lists the following dynamically linked dependencies according to ldd(1):

                                                            $ ldd ./v
                                                                    linux-vdso.so.1 (0x00007fff2d044000)
                                                                    libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f2fb3e4c000)
                                                                    libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f2fb3a5b000)
                                                                    /lib64/ld-linux-x86-64.so.2 (0x00007f2fb4345000)
                                                            

                                                            If the binary were truly dependency-free, the ldd output would look something like this:

                                                            $ ldd $HOME/bin/dhall
                                                                    not a dynamic executable
                                                            

                                                            This leads me to assume that the v binary has dependencies that the runtime system will need to provide; otherwise the program cannot be loaded by the Linux kernel and executed. Binaries produced by v have similar limitations:

                                                            $ ldd ./hello
                                                                    linux-vdso.so.1 (0x00007ffdfdff2000)
                                                                    libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fed25771000)
                                                                    /lib64/ld-linux-x86-64.so.2 (0x00007fed25d88000)
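
                                                            For comparison, a fully static build is what “no dependencies” actually looks like to the loader. A rough sketch with a plain C toolchain (details vary by libc; glibc in particular still warns about some features when linked statically):

                                                              $ cc -static -o hello hello.c
                                                              $ ldd ./hello
                                                                      not a dynamic executable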
                                                            

                                                            Additionally, I am banned from the V discord and GitHub. The V programming language has censored a neuro-diverse trans woman from being able to contribute to the project in any capacity. I would love to be able to contribute to things at least to make the documentation and website not filled with misleading statements, but I cannot. This is why I ask other people to make issues for me in my posts.

                                                            I realize things might look snarky when the article is viewed from a certain worldview/lens, but there is a total scalar lack of snark intended in that article when it was written. If you cannot realize that, then I am sorry that my intended tone didn’t have the effect I wanted and I will use this feedback to further refine my writing ability.

                                                            Direct x64 machine code generation

                                                            In my testing I was unable to get this working on my Ubuntu server. It still used gcc.

                                                            1. 8

                                                              You ended your piece (which utterly trashes V) by saying

                                                              Overall, V looks like it is making about as much progress as I had figured it would.

                                                              I criticized your snark, and you replied with

                                                              I realize things might look snarky when the article is viewed from a certain worldview/lens, but there is a total scalar lack of snark intended in that article when it was written.

                                                              Do you really expect anyone to believe that?

                                                              Additionally, I am banned from the V discord and GitHub. The V programming language has censored a neuro-diverse trans woman from being able to contribute to the project in any capacity.

                                                              Do you really think that’s why you were banned? Does Alex even know you’re trans? You don’t think you were banned for your vicious and misleading attacks on this project?

                                                              1. 6

                                                                You’re just ignoring the points she made here, and keep talking past her and banging on about the article.

                                                                1. 6

                                                                  @cadey didn’t say they were banned for who they are, they just stated they were X and were banned.

                                                                  You don’t think you were banned for your vicious and misleading attacks on this project?

                                                                  If my memory serves me right, @cadey was banned because of their disagreements with V, such as those voiced in this and the previous (https://christine.website/blog/v-vvork-in-progress-2020-01-03) blog post. I could be wrong. Also “vicious” is being overly dramatic and frankly not productive.

                                                                  1. 4

                                                                    @cadey didn’t say they were banned for who they are, they just stated they were X and were banned.

                                                                    Then why bring it up?

                                                                    If my memory serves me right, @cadey was banned because of their disagreements with V, … . I could be wrong.

                                                                    Is that actually true?

                                                                    Also “vicious” is being overly dramatic …

                                                                    It doesn’t sound like you’re paying very close attention… Two other people criticized her for bullying in this very thread before I even got here. If you read my comments here then I’d hope you would change your mind about how unfairly harsh she has been to Alex, to his project, and to the V team.

                                                                    Consider starting here: https://lobste.rs/s/nfjifq/v_update_june_2020#c_vuofat

                                                                    1. 22

                                                                      You almost never comment except in V threads.

                                                                      Further, many of those comments seem to be from today, in last year’s thread.

                                                                      Please just let this be. Flag the submission if you must and move on.

                                                                      1. 6

                                                                        I have removed the post as of this commit. In a few minutes the article will be gone, but a tombstone of the former article will remain.

                                                                        1. 19

                                                                          I don’t think you did anything wrong by posting it, I don’t think you made any mistakes, and I enjoyed reading it. Not saying this to convince you to put the essay back; I just know that I personally feel awful when I get really harsh criticism, even when I don’t respect or care about the person giving it. So wanted to provide a bit of positivity to balance it out ツ

                                                                        2. 8

                                                                          Then why bring it up?

                                                                          I can’t answer that question. While I’m not sure what the benefit is of bringing it up, I don’t see the harm either; it was pretty clear from the comment they were not saying they were banned because of who they are.

                                                                          Is that actually true?

                                                                          Feel free to show otherwise. I can’t, because I’m not the person who banned @cadey; nor am I in contact with them.

                                                                          It doesn’t sound like you’re paying very close attention

                                                                          Please do not make such assumptions. Starting an argument that way is unproductive, as well as factually incorrect.

                                                                          Two other people criticized her for bullying in this very thread before I even got here.

                                                                          Only one person said it’s starting to look like bullying (https://lobste.rs/s/nfjifq/v_update_june_2020#c_cdxvwk). Other comments mentioning “bullying” either state they are not sure, or don’t see it as bullying. I don’t see anybody else mentioning this is bullying. Am I perhaps overlooking something?

                                                                          If you read my comments here then I’d hope you would change your mind about how unfairly harsh she has been to Alex, to his project, and to the V team.

                                                                          I agree the tone in the blog post is not the most productive. While some parts of the post are a bit pedantic, overall I think it’s not unfairly harsh. V made many big claims both before and after its release. Here are just a few examples:

                                                                          • https://github.com/vlang/v/issues/35
                                                                          • Translating C++ to V, which now has the note “TODO: translating C to V will be available in V 0.3. C++ to V will be available later this year.”
                                                                          • Claiming V has pure functions when they can still have side-effects. “Pure” has a well defined meaning. If you mean “pure but with IO”, then call it something else; otherwise it’s just misleading. This was brought up in this issue, which was closed, but I can’t find any mention of this in the docs here.
                                                                          • Various claims about certain components not having dependencies, only to have dependencies; despite the website advertising “Compiles to native binaries without any dependencies”
                                                                          • Advertising “V is written in V and compiles itself in under a second.”, when according to https://fast.vlang.io/ compiling V with optimisations (if I’m reading the table correctly) takes over one second most of the time.

                                                                          There is a lot more from past discussions (e.g. those on Hacker News), so I suggest taking a look at those.

                                                                          With that all said, I really do hope V succeeds and wish the authors the best of luck. But the authors really need to make sure that what they advertise is actually implemented as advertised, or make it very clear what the current state is. Don’t go around saying “We support X” followed by “oh by the way we’ll release that end of this year”.

                                                                1. 33

                                                                  When content recommendation becomes the most important highlight of a privacy-friendly browser’s new release.

                                                                    I love Firefox, but sheesh

                                                                  1. 13

                                                                      It is infuriating that the developers of a web browser consider it acceptable to implement any “content recommendation” in their program.

                                                                    1. 7

                                                                      Why? As stated up-thread, browsing data is never uploaded. Content recommendation happens locally only. What is so wrong with this?

                                                                      1. 7

                                                                          Because it’s a browser. It should empower me to search the internet for the stuff I want to see, not what they think I want to see. It adds nothing to my user experience whatsoever. It’s a slippery slope from any recommendation system, no matter how privacy-friendly they claim it to be.

                                                                        1. 6

                                                                            Imagine your new FM radio “recommended” which station to tune to when you turned it on. Would it really be any comfort if the manufacturer assured you that this recommendation had nothing to do with your own preferences, because they don’t know and definitely don’t care? Bookmarks have been part of browsers since the beginning. This is something else.

                                                                          People only put up with this nonsense because it’s free-as-in-beer. That’s why I’d be happy to pay for a fork that treated me like a paying customer rather than a set of eyeballs to sell through some convoluted scheme papered over with a lot of patronizing rhetoric.

                                                                          1. 4

                                                                              Maybe a car analogy is useful for you. It’s as if your car “recommended” which restaurant to go to when you drive it on a Saturday afternoon, and actually drove you there without asking, until you overrode it. A minimal amount of fuel would be lost at the beginning of the journey; that’s no problem, you can override it at any time. Would you be OK with that?

                                                                              I do not want my web browser to make any network request when I open it, unless I ask for it explicitly. As other posts in this thread explain, this is actually impossible with Firefox. This is what infuriates me.

                                                                            1. 0

                                                                                Though they say the browsing data is never uploaded, it’s trivial to match the IP and the time at which the content was recommended. That info can then be correlated with the site serving the recommended content. There are several ways to deanonymize browsing history like this.

                                                                          2. 10

                                                                              One of the first things I do with a new FF installation - besides installing uBlock and friends - is disable all these spurious ‘services’: content suggestions, dangerous-content warnings, and the various telemetry bits apart from bug reports; those I do send, seeing as I run Nightly and as such can provide usable reports.

                                                                            1. 4

                                                                                I would be grateful if you could share the configuration changes you make. I am asking so that I can note them down and apply them too.

                                                                                I would like it if NixOS provided Firefox configuration options that could be set centrally, or per user, to make sure that every upgrade reapplies them.

                                                                              1. 3

                                                                                  home-manager allows you to declaratively configure which Firefox add-ons to install (although you still need to enable them manually the first time you start Firefox, for security reasons). And you can set Firefox options declaratively using their enterprise policies.
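
                                                                                  For reference, such an enterprise-policy file is just JSON dropped into distribution/policies.json in the install directory; a minimal sketch (policy names taken from Mozilla’s policy templates, trim to taste and check them against your Firefox version):

                                                                                    {
                                                                                      "policies": {
                                                                                        "DisableAppUpdate": true,
                                                                                        "DisableTelemetry": true,
                                                                                        "DisablePocket": true,
                                                                                        "DisableFirefoxStudies": true
                                                                                      }
                                                                                    }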

                                                                                1. 2

                                                                                    I don’t use much magic to configure it; most is done by hand. The only ‘automatic’ thing I do is install a policies.json file (in distribution/policies.json in the FF install directory, in my case /opt/APPfirefox/bin/distribution/policies.json) which disables automatic updates, since I handle those using a script. I do not want binaries to be writeable by users, so these automatic update policies are out of the question. The update script pulls new nightlies from the server and installs them, installs the policies.json file in the correct location, and sets ownership and permissions so that regular users can execute, but not modify, the distribution. I used to have a FF sync server when that was still a thing, but eventually it got too hard to reinstate ‘old’ sync support. I do not have, nor do I want, a ‘Firefox account’, since I do not use such external services if I can in any way avoid them. I might look into building a ‘new’ FF sync server some time, but other matters are more important for now. Until such time I will simply install the following extensions:

                                                                                  • uBlock origin (set to ‘expert user’ mode)
                                                                                  • uMatrix (disabled by default)
                                                                                  • Nuke Anything (to get rid of annoying overlays which uBlock can not filter out)
                                                                                  • Open With (to open e.g. media files through a local script)
                                                                                  • Containers with Transitions (to always open certain sites in site-specific container tabs)
                                                                                  • Foxyproxy Standard (disabled, sometimes used to redirect sites through a local Tor node)
                                                                              2. 13

                                                                                  It seems to me it’s just a “here are the most popular articles” list; I don’t see anything wrong with that, or any fundamental privacy concerns. Also, from the expanded announcement on it:

                                                                                Recommendations are drawn from aggregate data and neither Mozilla nor Pocket receives Firefox browsing history or data, or is able to view the saved items of an individual Pocket account. A Firefox user’s browsing data never leaves their own computer or device.

                                                                                And from the FAQ:

                                                                                [N]either Mozilla nor Pocket ever receives a copy of your browser history. When personalization does occur, recommendations rely on a process of story sorting and filtering that happens locally in your personal copy of Firefox.

                                                                                1. 11

                                                                                    I see something wrong with that: it gives the user an experience that they have not asked for nor had any control over. Also, what news site, and what collection of news given to the user, is trustworthy in a general sense?

                                                                                    I feel about it as if I got public broadcasting in my new tab - not something I want or am interested in.

                                                                                  1. 9

                                                                                    that being giving the user an experience that they have not ask for

                                                                                    How can you be so sure? I’m a Firefox user, and I find those articles occasionally useful.

                                                                                    nor had any control over

                                                                                    You can switch it off easily in preferences or directly on the New Tab page (three dots in the upper right corner).

                                                                                    1. 4

                                                                                      “nor had any control over” is a terrible way to word it (it is your computer and you are definitely in control). My first reaction was “this person is entitled as heck”.

                                                                                      However, there is an implied social contract (because Firefox’s existence makes it socially/politically almost impossible to get an alternative off the ground). I still disagree with lich, but their argument has legs.

                                                                                    2. 3

                                                                                      I see something wrong with that, that being giving the user an experience that they have not ask for nor had any control over. Also, what news site, and what collection of news given to the user is trustworthy in a general sense?

                                                                                      I don’t feel that’s a fair characterization. Any new feature can be described as giving the user an experience they did not ask for. And as other commenters note, it can be disabled. Which grants control.

                                                                                      As to a user experience, I have lobsters show up in my recommended list, probably because I visit it so often. It does make some sense that I would be recommended what I like to habitually visit.

                                                                                      I even removed the suggestion a few times and timed how long and how many visits made it reappear. For me, it learned the association in a day and ten visits to the front page because my habit is to close the tab after quickly reviewing the stories posted.

                                                                                    3. 6

                                                                                      Where does the aggregate data come from?

                                                                                    4. 5

                                                                                      I cannot use Firefox and feel safe without ghacks user.js. It is kind of absurd that there is no real community-led option for browsers. You could put the blame on standards bodies for creating bloated standards, but now more than ever they are just a facade commanded by corporate interests. I don’t know much about it, but Project Gemini (along with gopher) seems to be closer to achieving the goals of free software and the “original dream of the web” (whatever that means).

                                                                                      Edit: typo

                                                                                      1. 1

                                                                                        I totally get your point. Would something like Pale Moon feel better to use?

                                                                                    1. 4

                                                                                      Where can I find more information about why Plan 9 is amazing, especially how it compares/contrasts to Linux or Unixes?

                                                                                      1. 9

                                                                                        I found this paper to be a wonderful walkthrough. I highly recommend getting a copy of 9front running and going through some of the exercises in the paper.

                                                                                        It’s very long, but definitely a great way to get a feel for how some of the concepts in Plan 9 are applied.

                                                                                        Edit: Since I was reminded how much I like this paper, I decided to submit it as a story.

                                                                                        1. 5

                                                                                          Some goodies from Plan 9 were ported to *nixes - procfs, for example - but unfortunately not all of them (such as Plan 9-style process namespaces).

                                                                                          1. 7

                                                                                            Unfortunately, the best piece of Plan 9 is impossible to port: a unified, interposable way of doing everything, so you don’t have to think about huge numbers of special cases and strange interactions between features. 9p is, more or less, the only way to talk to the OS, and the various interfaces that are exposed over it can be transparently swapped out with namespaces, allowing you to replace, mock out, or redirect across the network any part of the system that you want.

                                                                                            1. 4

                                                                                              Well, Linux namespaces got 90% of the way there, although they certainly didn’t get their ergonomics.

                                                                                            2. 3

                                                                                              I just watched the video https://www.youtube.com/watch?v=3d1SHOCCDn0

                                                                                              I found the way the presenter explained the core concepts very understandable. It is 40 minutes long.

                                                                                            1. 5

                                                                                              As long as it isn’t as unstable as the Ubuntu x Thinkpad partnership.

                                                                                              1. 2

                                                                                                How is that these days? It’s been about 6 years since I last ran Ubuntu on a Thinkpad.

                                                                                                1. 2

                                                                                                  For what my experience is worth, I have an X1 Extreme with the high-resolution display. It was a nightmare to get it to work. External displays didn’t work. When the BIOS was set to hybrid graphics, it didn’t work either.

                                                                                                  In the end it worked, but I sacrificed so much time that I would rather have spent on something else.

                                                                                                  And when it finally worked, I had the worst input lag since the beginning of my Linux experience (’95). Just typing in a terminal window was so bad that I was constantly making typos! I guess it was Ubuntu’s switch to Gnome plus using 3D where good 2D would be enough.

                                                                                                  The battery life is terrible. I am happy if I can make it through 2 hours on battery.

                                                                                                  I have now switched to NixOS and am fighting hardware issues there too, but my hope is that once I’ve fought through this, I will be at peace for some time. Ubuntu served me for 15 years and I am grateful for that. It’s time for something better.

                                                                                                  1. 3

                                                                                                    That’s concerning. I’m considering an X1 Carbon if Apple doesn’t raise the 13” MacBook Pro memory to 32 GB this year. I’ve used a Mac laptop for myself for 12 years now and for work for 8 of the last 10 — two years on a Thinkpad with “Open Client for Debian Community” i.e. IBM’s Ubuntu spin — with much adoration. 16 GB is slowing me down. I’m too much of a multitasker these days and find myself more often using my 32 GB desktop gaming rig running Windows or my work laptop that is a 15-in MacBook Pro 32 GB.

                                                                                                    1. 3

                                                                                                      That being said, Fedora (stock) on an X1 Carbon works very well (I’m using it to type this, and at $dayjob).

                                                                                                      1. 1

                                                                                                        I didn’t have a single hardware issue on my ThinkPad with Fedora. It’s not an X1, but in my experience, Fedora and ThinkPad are a good match.

                                                                                                      2. 2

                                                                                                        Don’t get me wrong, I have been using ThinkPads for over 20 years now and I will continue; I will just avoid ThinkPads with Nvidia inside. However, when I decided to buy the X1E (1st gen) I did it partially because it was certified to be Ubuntu compatible! I already had doubts, and I actually had a ThinkPad with Nvidia before, but on that one I could use the Intel GPU for external displays, so the Nvidia was just a waste of weight and energy, and at least I could do my work. On this one, however, I cannot use external screens without somehow getting this Nvidia, hybrid, Bumblebee, PRIME, whatever stuff working. It steals so much of my time.

                                                                                                        1. 1

                                                                                                          The X1 Carbon is great. The Extreme has Nvidia graphics, which are a tire fire.

                                                                                                          1. 1

                                                                                                            I thought Nvidia + Linux was <3? Did that change in the twenty teens?

                                                                                                  1. 17

                                                                                                    God save me from Lenovo pre-installed software…

                                                                                                    1. 11

                                                                                                      I think the real benefit here is that Lenovo now seems to give a shit about making their hardware work well with Linux, and maybe, just maybe, they’ll push that hardware support upstream so folks who don’t want to run the distro this ships with can still benefit from it.

                                                                                                      1. 3

                                                                                                        Exactly my thoughts. I’m using a T540p at work with Debian on it, but the hardware support is pretty shitty. Lots of power management problems, display driver had problems in the beginning and so on. It also took years for Debian on my x230 to support the built-in microphone. So maybe with this, that fabled “good Linux support” will finally become actually true for these models.

                                                                                                        1. 4

                                                                                                          RHEL7 on my T540p “just worked“, years ago. Although I hated the laptop so didn’t use it for very long. When people have problems with a particular Linux on a particular laptop, sometimes it’s the laptop, but sometimes perhaps the distribution. Or somewhere in the middle.

                                                                                                          1. 2

                                                                                                            That’s weird, you’d expect this to be more or less the same across distros (as long as they use the same kernel version and X drivers)

                                                                                                            1. 3

                                                                                                              You’ve put your finger on it: they probably aren’t using the same kernel and X drivers.

                                                                                                            2. 1

                                                                                                              You hated a T540p? Don’t answer.

                                                                                                              1. 2

                                                                                                                Not OP but I had to use one for a little while. I hated it due to keyboard and touchpad. Lenovo touchpads of the era were so terrible they should have just left them off and stuck to the trackpoint. A janky touchpad with half-assed palm rejection degraded the experience.

                                                                                                                And I just can’t deal with the off-center typing that 10key forces on a laptop that size.

                                                                                                                Otherwise it was great, but I couldn’t get past those two things.

                                                                                                                1. 2

                                                                                                                  Ok!

                                                                                                            3. 2

                                                                                                              I hope for this too! Having an X1 Extreme with Ubuntu 18.04 for one year has been the worst Linux experience in my 25 years of using Linux. I have to say that until this I avoided hardware that didn’t have good Linux support. I will boycott Nvidia for the rest of my life.

                                                                                                              I just would have hoped that Lenovo offered a notebook with the same form factor and physical aspects (15”, centered keyboard, high resolution) but without Nvidia.

                                                                                                          1. 1
                                                                                                            • Learning to touch type, and later realizing that sticking to the US keyboard layout gives me peace of mind - I started in IT in Germany, then moved to France and traveled quite a bit. For accents/umlauts I use AltGr International combinations
                                                                                                            • Use VI commands everywhere I can (configuring shell and REPLs to VI mode)
                                                                                                            • The Unix command line with a big history size, using the available Unix commands. It saved me a lot of time and made me more productive by combining tasks into commands; when in the end they can’t be optimized further and I still use them, I make a script. I’m still discovering new commands from time to time. Makes me happy.
                                                                                                            • Emacs with org-mode and evil (VI emulation) to take work notes, clock time and make reports for invoices
                                                                                                            • Haskell, for programs that do complicated enough things that it makes sense to get the basis right and avoid mistakes. It also made me much more humble about programming. I also know that I will never finish learning this language, and that makes me happy. There is so much new research going on.
                                                                                                            • git: works reliably and has solutions for weird work-flows
                                                                                                              • screen: a tool that has been around for many years, but I discovered it only late, when I was constrained to working on different servers. It’s often preinstalled. Now I use it even locally, on the laptop.
                                                                                                            • Regression tests that can be automated.

                                                                                                            Techniques that I learned only after I needed them and would have liked to know them earlier:

                                                                                                              • Writing without seeing the text, i.e. with the same font and background color (black-on-black or white-on-white). It lets me “speak” my mind in an intimate and honest way, and helps beautifully to calm down when under emotional stress.
                                                                                                              • Meditation. I won’t describe my impressions, because it has too many facets and the revelations keep changing. But it was a precious discovery; it often gives me a very nice road trip into myself and often helps me make decisions and unblock situations.
                                                                                                            1. 4

                                                                                                              Can’t GenodeOS work as the userland for seL4?

                                                                                                              1. 5

                                                                                                                Genode is nice and all, but it is Affero GPL licensed. This is likely seen as a huge liability.

                                                                                                                1. 5

                                                                                                                  Specifically because they want hardware/software businesses to pay for using it. So, they should probably think of that combo as seL4 plus a commercial product. Most won’t use it as you predicted.

                                                                                                                  1. 4

                                                                                                                    Ooooh, now that is something I totally missed, thanks!

                                                                                                                  2. 2

                                                                                                                    From their documentation:

                                                                                                                    Genode can be deployed on a variety of different kernels including most members of the L4 family (NOVA, seL4, Fiasco.OC, OKL4 v2.1, L4ka::Pistachio, L4/Fiasco). Furthermore, it can be used on top of the Linux kernel to attain rapid development-test cycles during development. Additionally, the framework is accompanied with a custom microkernel that has been specifically developed for Genode and thereby further reduces the complexity of the trusted computing base compared to other kernels.

                                                                                                                    1. 2

                                                                                                                      But seL4 is single core only, so it’s not much use outside of embedded or single-purpose equipment :(

                                                                                                                      1. 1

                                                                                                                        Uh, oh :/ this is something I didn’t realize :(

                                                                                                                        1. 1

                                                                                                                        They have an unverified multicore implementation, which is roughly as secure as a normal operating system.

                                                                                                                    1. 1

                                                                                                                      Good read. I would like to know if and how BTRFS guarantees that the data on the disk is the data that was written. It would also be interesting to know how much these guarantees impact performance. I would be grateful if somebody who has answers to this could shed some light.

                                                                                                                      1. 2

                                                                                                                        (You can find the same answer on the blog, in the comment section.)

                                                                                                                        Let’s start by saying: “I’m not an expert in BTRFS”.

                                                                                                                        That said, first of all, BTRFS uses a b-tree instead of a Merkle tree. They decided not to keep the checksum of a node in the node above it. Instead, they checksum the level of the block and the block number where the block is supposed to live. This allows them to detect misplaced writes/reads on the media.

                                                                                                                        Everything that points to a tree block also stores the transaction id (the generation field) it expects that block to have, and this makes it possible to detect phantom writes.

                                                                                                                        The difference from ZFS is that ZFS checks the checksum of the block instead of the transaction id.

                                                                                                                        Source: https://btrfs.wiki.kernel.org/index.php/Btrfs_design
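
                                                                                                                        As a purely conceptual sketch (my own illustration, not BTRFS’s actual on-disk format; the names and types here are made up), the idea is that whatever points to a block records where that block should live and which generation it should carry, so misplaced and phantom writes show up when the pointer is followed:

                                                                                                                            import Data.Word (Word64)

                                                                                                                            -- Hypothetical, simplified types; real BTRFS metadata is far richer.
                                                                                                                            data BlockPtr = BlockPtr
                                                                                                                              { expectedBlockNr :: Word64  -- where the child block is supposed to live
                                                                                                                              , expectedGen     :: Word64  -- transaction id the child is expected to carry
                                                                                                                              }

                                                                                                                            data Block = Block
                                                                                                                              { blkBlockNr  :: Word64      -- block number stored (and checksummed) in the block itself
                                                                                                                              , blkGen      :: Word64      -- generation (transaction id) the block was written in
                                                                                                                              , blkChecksum :: Word64      -- checksum over the block's own contents
                                                                                                                              }

                                                                                                                            -- Following a pointer fails if the block landed in the wrong place
                                                                                                                            -- (misplaced write) or carries a stale generation (phantom write).
                                                                                                                            followPtr :: (Word64 -> Maybe Block) -> BlockPtr -> Either String Block
                                                                                                                            followPtr readBlock ptr =
                                                                                                                              case readBlock (expectedBlockNr ptr) of
                                                                                                                                Nothing -> Left "block not found (lost write)"
                                                                                                                                Just blk
                                                                                                                                  | blkBlockNr blk /= expectedBlockNr ptr -> Left "misplaced write"
                                                                                                                                  | blkGen blk /= expectedGen ptr -> Left "phantom write (stale generation)"
                                                                                                                                  | otherwise -> Right blk

                                                                                                                            main :: IO ()
                                                                                                                            main = putStrLn (either id (const "ok") (followPtr lookupBlock examplePtr))
                                                                                                                              where
                                                                                                                                examplePtr = BlockPtr 7 42
                                                                                                                                lookupBlock n
                                                                                                                                  | n == 7    = Just (Block 7 41 0xdeadbeef)  -- stale generation on purpose
                                                                                                                                  | otherwise = Nothing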

                                                                                                                        1. 1

                                                                                                                          Thank you for the reply!

                                                                                                                      1. 5

                                                                                                                        Can anyone explain why Clear Linux is consistently winning on benchmarks over Fedora/Ubuntu?

                                                                                                                        https://clearlinux.org/ says “Highly tuned for Intel platforms where all optimization is turned on by default.” - is that what this boils down to (-O3 -mtune= all the way, Gentoo style), or are the developers doing other clever things in the kernel/desktop env/etc?

                                                                                                                        1. 8

                                                                                                                          Have you heard about any benchmarks other than those done by Phoronix? They seem to be the only ones talking about the distro…

                                                                                                                          1. 3

                                                                                                                            You could try it yourself; it’s just an ISO you can download. My experience matches up pretty closely with the Phoronix benchmarks, but package support is severely lacking, so I don’t use Clear anymore.

                                                                                                                          2. 4

                                                                                                                            It’s a combination of compiler flags like the ones you mentioned and setting the CPU governor to “performance”.

                                                                                                                            It also sprinkles in several other minor optimizations, but those two get you 95% of the way there and can be done on any source-based distro.

                                                                                                                            1. 2

                                                                                                                              Aren’t they testing it using an AMD CPU?

                                                                                                                              1. 2

                                                                                                                                Yes, but perhaps it’s only important that they’re compiling for modern CPUs?

                                                                                                                                It looks like they’re probably not compiling with ICC, or the performance on AMD would likely be worse than Ubuntu’s with GCC or Clang.

                                                                                                                                1. 1

                                                                                                                                  AMD also makes CPUs for Intel platforms. In fact, that’s probably what they are most known for.

                                                                                                                                  1. 1

                                                                                                                                    Are you talking about x64 (AKA x86-64)?

                                                                                                                                    1. 1

                                                                                                                                      Yes, which in an ironic twist I call amd64, to separate it from Intel’s IA-64.

                                                                                                                                      1. 1

                                                                                                                                        So that’s not really an “Intel platform”… unless you were using the term to refer to the x86 line.

                                                                                                                                        1. 1

                                                                                                                                          Which is clearly how it was used in the context we’re discussing.

                                                                                                                                          1. 1

                                                                                                                                            Clear to you, yes.

                                                                                                                                2. 2

                                                                                                                                  Copying part of a comment [1] from the article:

                                                                                                                                  well it is worthwhile to have a look at their github repo - there is more ongoing eg. plenty of patches adding avx support to certain packages.

                                                                                                                                  [1] - https://www.phoronix.com/forums/forum/phoronix/latest-phoronix-articles/1157948-even-with-a-199-laptop-clear-linux-can-offer-superior-performance-to-fedora-or-ubuntu

                                                                                                                                  1. 1

                                                                                                                                    I don’t know about their Intel optimizations, but if it’s only that, it may be interesting to see how Clear Linux compares with a mainstream distribution on which the kernel has been compiled with all flags set.

                                                                                                                                  1. 1

                                                                                                                                    The title is a bit of a stretch, because even though the author doesn’t look at a system’s source code, he writes code for a model checker.

                                                                                                                                    Still, it was a good talk for me because it showed how a model checker is actually used.

                                                                                                                                    1. 8

                                                                                                                                      Wow, that is a very unusual introduction to Haskell — going straight into imperative programs (everything’s a do!) and concurrency. And then it just…stops!

                                                                                                                                      1. 6

                                                                                                                                        It’s a phrasebook. It gives a way to do something in a language you don’t really know.

                                                                                                                                        It isn’t idiomatic, it’s just getting you to have something to show for it as quickly as possible.

                                                                                                                                        1. 6

                                                                                                                                          It’s a work in progress:

                                                                                                                                          We have launched the Phrasebook with 14 demonstrations of topics ranging from if-then-else expressions to transactional concurrency, and there is a lot more to come.

                                                                                                                                          1. 2

                                                                                                                                            In… a good way? Bad way?

                                                                                                                                            1. 5

                                                                                                                                              I don’t know! Well, it’s not good that it just stops. But I wonder what a Haskell book would be like that started with the imperative and concurrent stuff like “normal” languages have, and ended with the higher-order functions and so on, instead of the other way around, as a Haskell book normally does.

                                                                                                                                              Like, you would start off thinking it was like Go or something, just with weird syntax. You’d get sucked in that way, but then things would start getting more abstract and powerful, and by the end you’d be using pure functions and GADTs and free monads before you knew what hit you.

                                                                                                                                              1. 3

                                                                                                                                                Like, you would start off thinking it was like Go or something, just with weird syntax. You’d get sucked in that way, but then things would start getting more abstract and powerful, and by the end you’d be using pure functions and GADTs and free monads before you knew what hit you.

                                                                                                                                                I suspect you might give up, thinking, “what’s the point of this weirdness” before you got to any real motivation or reason to keep learning.

                                                                                                                                              2. 4

                                                                                                                                                I like it, and I am waiting for it to provide more examples. I went through several books and am still reading and still trying to learn, but I have already written programs that I use for my work and that are helpful to me. I still mostly reach for shell scripting, because shell scripts grow naturally by combining commands; I wish I used some Haskell shell for my daily work that would easily let me put together Haskell programs at some point.

                                                                                                                                                I like how they show ghcid early (how long did it take me to settle on ghcid, how many editor/IDE tools did I try), and I like that ghci is introduced. It’s pragmatic.

                                                                                                                                                I hope it will go on with many examples.

                                                                                                                                                1. 0

                                                                                                                                                  In… a good way? Bad way?

                                                                                                                                                  Definitely a bad way.

                                                                                                                                                  All the weirdness and higher order stuff is there to give you all kinds of guarantees which can be extremely useful.

                                                                                                                                                  In fact: if you are not using the higher-order stuff, you might just as well use another language that requires you to jump through fewer hoops, because you are missing the whole point of what Haskell is about.

                                                                                                                                                  You should start with the higher-order stuff and then bolt this phrasebook on as an afterthought, not the other way around. If you start with this phrasebook, you will essentially be writing a bad code base.

                                                                                                                                                  Please keep in mind that I have actually reviewed assignments from a “Functional Programming” course, which used Haskell as its primary subject of study.

                                                                                                                                                  1. 9

                                                                                                                                                    You are gate-keeping, and this behaviour is definitely worse for the community.

                                                                                                                                                    I’m one of those developers who had no computer science education, and started programming essentially by banging rocks together, trying to pay the bills with WordPress and jQuery.

                                                                                                                                                    I learned Haskell the trial-and-error way, and the imperative way. My first foray into Haskell was from the book Seven Languages in Seven Weeks, which necessarily doesn’t go very deep into the languages it exhibits. I got some of the basics there, but otherwise trial-and-error, Google, IRC, etc. My first web apps in Haskell were pretty terrible, but I needed to just get something working for me to be more invested in the technology. Everyone sucks at something before they’re good at it anyway. There’s still an enormous amount of Haskell for me to learn. I see that as compelling, not a hurdle.

                                                                                                                                                    This has not “destroyed my reputation”, as you asserted. If anything it’s only improved it, especially among people who are interested in Haskell but are discouraged by people like you.

                                                                                                                                                    Now I run three businesses on Haskell, and employ other Haskellers who have worked at Haskell companies you have heard of.

                                                                                                                                                    1. 6

                                                                                                                                                      you will essentially be writing a bad code base.

                                                                                                                                                      But you WILL be writing a code base.

                                                                                                                                                      1. 1

                                                                                                                                                        But you WILL be writing a code base.

                                                                                                                                                        You will be writing a codebase that will force the next competent Haskell developer to throw out all your work and start over. Also: it will destroy your reputation.

                                                                                                                                                        Honestly, it’s better not to write anything at all if this is your starting point. Just use something else like Python, C/C++, Java or C#. This is simply not how Haskell should be written, and it will probably also perform worse than the alternatives.

                                                                                                                                                        Why? Because if you use Haskell the right way, the compiler can throw in all kinds of optimizations, like lazy evaluation and memoization, for free. If you write Haskell in the way that is proposed in the Phrasebook, you essentially lose all those perks without gaining anything. In fact your code will be much, much slower (about a factor of 10, actually) than it would be if you’d just started out by using a different language.

                                                                                                                                                        For an elaborate example, you can look at The evolution of a Haskell programmer. Note that the Juniors and the first Senior developer’s solutions are in fact perfectly valid and viable.

                                                                                                                                                        However, the second senior (who uses foldl) makes a critical mistake which costs him the “lazy evaluation perk”, which means that his best-case and worst-case performance are both O(n), whereas the senior who uses foldr will have O(1) best-case and O(n) worst-case performance.
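
                                                                                                                                                        As a toy illustration of the laziness difference I mean (my own example, not the one from that article): a foldr with a lazy combining function can stop early, while foldl' has to walk the whole list first.

                                                                                                                                                            import Data.List (foldl')

                                                                                                                                                            -- foldr can short-circuit: once the predicate holds, the rest of
                                                                                                                                                            -- the list is never forced, so the best case is O(1).
                                                                                                                                                            anyR :: (a -> Bool) -> [a] -> Bool
                                                                                                                                                            anyR p = foldr (\x acc -> p x || acc) False

                                                                                                                                                            -- foldl' must traverse the entire list before producing a result,
                                                                                                                                                            -- so best and worst case are both O(n).
                                                                                                                                                            anyL :: (a -> Bool) -> [a] -> Bool
                                                                                                                                                            anyL p = foldl' (\acc x -> acc || p x) False

                                                                                                                                                            main :: IO ()
                                                                                                                                                            main = do
                                                                                                                                                              print (anyR even (2 : error "never forced"))  -- True, stops at the first element
                                                                                                                                                              print (anyL even [1, 3, 4, 5])                -- True, but only after the whole list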

                                                                                                                                                        And it goes downhill from there. However the Haskell code I see in the Phrasebook is similar to what the “Beginning graduate Haskell programmer” would do.

                                                                                                                                                        The “right” way to do it is the “Tenured professor” way at the bottom. It doesn’t matter that product uses foldl’ internally in this case, which also sacrifices lazy evaluation; it’s about a general way of doing things, where you rely on the implementations of libraries getting better. This phrasebook also throws a lot of those perks out by manually taking control over nearly the entire control flow (which is something you should do as little as possible when writing Haskell).

                                                                                                                                                        That is the kind of “bad codebase you would be writing” we are talking about here. If you find yourself in the situation where you need this phrasebook to get started, you are simply out of your league. The situation is really not unlike the software engineering team that programmed the flight computers of the 737 MAX 8. You should step away and say: “No, I am not up to this task right now. I need at least 120 hours (but 240 hours is a more reasonable estimate) of study before I can do this”.

                                                                                                                                                        But if you did invest the hours upfront and are using this Phrasebook as an afterthought… sure; Sure! Go ahead! You should now know where the pitfalls in these examples are.

                                                                                                                                                        1. 7

                                                                                                                                                          One of the authors of this Phrasebook is also an author of Haskell Programming from First Principles, which starts from the lambda calculus. I think she’s deliberately exploring as different an approach as possible. There isn’t a single Right way to teach; the readers’ varied backgrounds and motivations lead them to really different results.

                                                                                                                                                          1. 1

                                                                                                                                                            One of the authors of this Phrasebook is also an author of Haskell Programming from First Principles, which starts from the lambda calculus. I think she’s deliberately exploring as different an approach as possible. There isn’t a single Right way to teach; the readers’ varied backgrounds and motivations lead them to really different results.

                                                                                                                                                            The approach the author is taking now defeats the main purpose of Haskell: its type system and a relatively smart compiler that exploits it through lazy evaluation. Because of this, I simply do not agree with this statement.

                                                                                                                                                            A pilot needs to learn at least some basic meteorology and aerodynamics; the same applies here, because if you don’t take the time to properly understand the type system and lazy evaluation, you are basically an unlicensed pilot who knows how to get an airplane off the ground, keep it in the air and land it again, but without any contact with air traffic control.

                                                                                                                                                            I would not want to fly with such a pilot, nor do I want to use an aircraft he/she has flown. In reality we have systems in place to stop this from happening, and the pilot will be told to stay on the ground and “pilot” something (like a car, for example) that he/she knows how to pilot. In the software world, we do not have any system in place to prevent this from happening, other than our own sound judgement.

                                                                                                                                                            So please: learn Haskell’s fundamentals first and then add this phrasebook to the mix afterwards, or choose an entirely different technology. Everyone who is currently next to you or who comes after you will thank you for it.

                                                                                                                                                            1. 3

                                                                                                                                                              Hopefully Haskell can be many things to many people. I think it makes for a pretty good imperative language.
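
                                                                                                                                                              For example, something like this (a toy sketch of mine, not from the Phrasebook) reads almost like the imperative equivalent in another language would: mutable state, a loop, and a forked worker, all inside a do block.

                                                                                                                                                                  import Control.Concurrent (forkIO)
                                                                                                                                                                  import Control.Concurrent.MVar (newMVar, newEmptyMVar, putMVar, takeMVar, readMVar, modifyMVar_)
                                                                                                                                                                  import Control.Monad (forM_)

                                                                                                                                                                  main :: IO ()
                                                                                                                                                                  main = do
                                                                                                                                                                    counter <- newMVar (0 :: Int)
                                                                                                                                                                    done    <- newEmptyMVar
                                                                                                                                                                    _ <- forkIO $ do
                                                                                                                                                                           -- increment the shared counter ten times in a worker thread
                                                                                                                                                                           forM_ [1 .. 10 :: Int] $ \_ -> modifyMVar_ counter (pure . (+ 1))
                                                                                                                                                                           putMVar done ()
                                                                                                                                                                    takeMVar done            -- wait for the worker to finish
                                                                                                                                                                    total <- readMVar counter
                                                                                                                                                                    putStrLn ("count: " ++ show total)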

                                                                                                                                                          2. 6

                                                                                                                                                            I’m currently training a team of engineers to write Scala. We’re experiencing the “no code” problem right now. I prefer people write bad (but functional) code than no code.

                                                                                                                                                            1. 1

                                                                                                                                                              I’m currently training a team of engineers to write Scala. We’re experiencing the “no code” problem right now. I prefer people write bad (but functional) code than no code.

                                                                                                                                                              I would agree with you if this was about any other programming language, but Haskell really is a different beast in this regard.

                                                                                                                                                              I pose you this question: Would you rather spend some time training your engineers or would you rather have them dive in without them knowing what they are doing?

                                                                                                                                                              Since you are training a team, you’ve probably chosen the first approach, which is exactly what I am proposing you should do with Haskell as well. You do not hand a pilot the keys to an airplane without making sure they’ve had some proper training. The same applies here (see below). Most other programming languages are like cars or trucks, but Haskell really is more of an aircraft.

                                                                                                                                                              1. 8

                                                                                                                                                                I think this type of elitist gatekeeping dissuades people from trying to learn Haskell and reflects poorly on the community. Furthermore, the creators of the Haskell Phrasebook clearly know a lot about Haskell and have built a business around teaching it to people. Do you think it’s possible for them to have a compelling reason to create a resource like this?

                                                                                                                                                                @argumatronic: I’ve seen people do similar with Haskell, starting with very imperative-style Haskell, but in the meantime: I can understand you, thank you for making effort to learn a new language, welcome.

                                                                                                                                                                1. 0

                                                                                                                                                                  I think this type of elitist gate keeping dissuades people trying to learn Haskell and reflects poorly on the community.

                                                                                                                                                                  Actually, I disagree. There is nothing elitist about it. It’s about using a hammer to drive in a screw.

                                                                                                                                                                  Furthermore the creators of the Haskell Phrasebook clearly know a lot about Haskell and have built a business around teaching it to people.

                                                                                                                                                                  The fact that someone builds a business around something doesn’t mean they are doing things the right way. Teaching people things the wrong way has a tendency to stick around. Oh, and by the way, I also earned money teaching Haskell (and cryptography and security) to people during my studies at an accredited university, with the oversight of a professor leading in the development of the language… So I am no lightweight either… And what I see here makes me cringe and would have earned any student a non-passing grade, with approval.

                                                                                                                                                                  Do you think it’s possible for them to have a compelling reason to create a resource like this?

                                                                                                                                                                  Yes I do. In fact, they state the same reason as I suspected on the Twitter feed you mention:

                                                                                                                                                                  IME, people start to write more Haskelly Haskell as they get comfortable with it, but we have the tools to write imperative-style Haskell as a bridge, no shame in using them.

                                                                                                                                                                  And:

                                                                                                                                                                  Eventually, by doing that a lot, I became quite fluent in Japanese. And I’ve seen people do similar with Haskell, starting with very imperative-style Haskell, but in the meantime: I can understand you, thank you for making effort to learn a new language, welcome.

                                                                                                                                                                  And like I said, there is nothing wrong with using the phrasebook, but you have to use it after you have at least a firm grasp of the basic concepts. Doing it the other way around will give the community and the language itself a bad name. If nothing else, the Haskell ecosystem will turn into a hack fest similar to Python or Node.js, with the decrease in quality and performance of everything that comes with it.

                                                                                                                                                                  That’s what I am worried about, and it’s also why I disagree: you want people who write Haskell to write it in a completely different way than you’d write an imperative language.

                                                                                                                                                  1. 20

                                                                                                                                                    Lately I find myself writing fewer and fewer programs, because I can do a lot in the shell, and it’s usually a couple of lines at worst. Saves me time, saves me work… UNIX as an IDE is incredibly satisfying to use. Someone shared with me a lovely Unix koan:

                                                                                                                                                    Master Foo once said to a visiting programmer: “There is more Unix-nature in one line of shell script than there is in ten thousand lines of C.”

                                                                                                                                                    The programmer, who was very proud of his mastery of C, said: “How can this be? C is the language in which the very kernel of Unix is implemented!”

                                                                                                                                                    Master Foo replied: “That is so. Nevertheless, there is more Unix-nature in one line of shell script than there is in ten thousand lines of C.”

                                                                                                                                                    The programmer grew distressed. “But through the C language we experience the enlightenment of the Patriarch Ritchie! We become as one with the operating system and the machine, reaping matchless performance!”

                                                                                                                                                    Master Foo replied: “All that you say is true. But there is still more Unix-nature in one line of shell script than there is in ten thousand lines of C.”

                                                                                                                                                    The programmer scoffed at Master Foo and rose to depart. But Master Foo nodded to his student Nubi, who wrote a line of shell script on a nearby whiteboard, and said: “Master programmer, consider this pipeline. Implemented in pure C, would it not span ten thousand lines?”

                                                                                                                                                    The programmer muttered through his beard, contemplating what Nubi had written. Finally he agreed that it was so.

                                                                                                                                                    “And how many hours would you require to implement and debug that C program?” asked Nubi.

                                                                                                                                                    “Many,” admitted the visiting programmer. “But only a fool would spend the time to do that when so many more worthy tasks await him.”

                                                                                                                                                    “And who better understands the Unix-nature?” Master Foo asked. “Is it he who writes the ten thousand lines, or he who, perceiving the emptiness of the task, gains merit by not coding?”

                                                                                                                                                    Upon hearing this, the programmer was enlightened.

                                                                                                                                                    1. 7

                                                                                                                                                      I’ve been going in the other direction. I now use libraries like lens and lens-aeson inside of ghci more and more often. These provide a consistent API for whatever data I’m working with.

                                                                                                                                                      I can work with YAML, TOML, JSON, CSV, XML, etc. with the exact same functions. No need to learn separate tools like jq, yq, xmlstarlet, etc.
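
                                                                                                                                                      For example, a rough sketch of what such a ghci session can look like with aeson’s lens support (the JSON document and field names here are made up):

                                                                                                                                                          {-# LANGUAGE OverloadedStrings #-}
                                                                                                                                                          import Control.Lens ((^..), (^?))
                                                                                                                                                          import Data.Aeson.Lens (key, values, _String)
                                                                                                                                                          import qualified Data.ByteString.Lazy.Char8 as BL

                                                                                                                                                          main :: IO ()
                                                                                                                                                          main = do
                                                                                                                                                            let doc = BL.pack "{\"users\": [{\"name\": \"ada\"}, {\"name\": \"lin\"}]}"
                                                                                                                                                            -- roughly the jq query `.users[].name`, but with composable optics
                                                                                                                                                            print (doc ^.. key "users" . values . key "name" . _String)  -- ["ada","lin"]
                                                                                                                                                            print (doc ^? key "users" . values . key "name" . _String)   -- Just "ada"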

                                                                                                                                                      1. 1

                                                                                                                                                        This is my wish too. From time to time I reach for ghci, but I have not been able to build a toolbox that would let me replace the shell for daily operations (navigating directories, copying, etc., good history). But learning Haskell is a journey in itself. Maybe I should start refactoring some of my shell scripts in Haskell to get more used to it?

                                                                                                                                                        What libraries can you suggest for the mundane Unix tasks, like navigating directories, operating on files, and using other processes? I remember trying some in past years, but none convinced me.

                                                                                                                                                        1. 2

                                                                                                                                                          I usually just use the directory, process, etc. packages. They’re a bit unergonomic for a shell, but I’m familiar enough with them.

                                                                                                                                                          I’ve used the Turtle package; it’s probably a better tool for the things I do, and I should probably use it more often.
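
                                                                                                                                                          Something at roughly this level of ergonomics is what I mean (a rough sketch; the backup/ path and the uname call are just examples):

                                                                                                                                                              import Data.List (isSuffixOf)
                                                                                                                                                              import System.Directory (copyFile, createDirectoryIfMissing, listDirectory)
                                                                                                                                                              import System.Process (readProcess)

                                                                                                                                                              main :: IO ()
                                                                                                                                                              main = do
                                                                                                                                                                -- roughly: mkdir -p backup && cp ./*.log backup/
                                                                                                                                                                createDirectoryIfMissing True "backup"
                                                                                                                                                                files <- listDirectory "."
                                                                                                                                                                mapM_ (\f -> copyFile f ("backup/" <> f)) (filter (".log" `isSuffixOf`) files)
                                                                                                                                                                -- roughly: uname -a
                                                                                                                                                                putStr =<< readProcess "uname" ["-a"] ""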

                                                                                                                                                      2. 4

                                                                                                                                                        What I’ve observed is that sometimes these shell pipelines are only executed once, and then writing them saves a lot of time. But sometimes someone wants to run them again. Or even a few times. Occasionally, someone puts them in a crontab somewhere to be run repeatedly until further notice. And then maybe, just maybe, they become part of a production system and they are supposed to run in various environments.

                                                                                                                                                        At some point along this chain of events, it would have been easier from a maintenance perspective to just write the program to begin with. But I’m still terrible at judging ahead of time whether my shell pipelines are truly one-off tasks, or if they’re going to grow into a part of the production system.

                                                                                                                                                        “But why don’t you just write the shell pipeline first, and then port it to a proper program at some point in the future?” I hear you ask. That’s a perfectly valid solution. It’s also a really boring task to translate a shell pipeline into a program – all the fun of the implementation design work has already been done, and what’s left is a grudging mechanical task of translation. So I tend to put it off…

                                                                                                                                                        1. 9

                                                                                                                                                          At some point along this chain of events, it would have been easier from a maintenance perspective to just write the program to begin with. But I’m still terrible at judging ahead of time whether my shell pipelines are truly one-off tasks, or if they’re going to grow into a part of the production system.

                                                                                                                                                          If you truly believe shell pipelines do not have a place in your production system, go talk to one of your systems administrators/ops/SREs or whatever you call them - your eyes will be opened ;-)

                                                                                                                                                          Alternative approach for you to consider: why is a shell pipeline not a program?

                                                                                                                                                          1. 7

                                                                                                                                                            At some point along this chain of events, it would have been easier from a maintenance perspective to just write the program to begin with.

                                                                                                                                                            But, would it?

                                                                                                                                                            What stops a well-written pipeline from being run again, by a cron or any other program?

                                                                                                                                                            1. 2

                                                                                                                                                              True, I missed some important context in my comment: I am the only person in the company who is anywhere near capable of writing something that approaches a well-written pipeline. I guess, now that I write it out, there’s an argument that training everyone else in shell scripting would be the more efficient long-term strategy, though.

                                                                                                                                                        1. 8

                                                                                                                                                          The comment field there doesn’t permit editing and correcting typos…..

                                                                                                                                                          So let me try again here…

                                                                                                                                                          In a galaxy far far away….

                                                                                                                                                          Larry Wall wondered why he needed to learn three pretty bad languages - sh, awk, sed… - and devised Perl as the Grand Unifying Language.

                                                                                                                                                          Perl sadly borrowed too much from its inspirations, and wasn’t much more readable.

                                                                                                                                                          Then Matz came along and resolved to borrow the best from Perl and Scheme and… make something more powerful than them all, yet more readable.

                                                                                                                                                          It’s called Ruby.

                                                                                                                                                          And yes, everything you can do in bash, awk, sed, jq, perl… you can do in Ruby, in one line if you must, but in a more powerful and maintainable form.

                                                                                                                                                          All this has been available for decades, why are we (still) bashing (pun intended) our heads against the Lowest Common Denominator?

                                                                                                                                                          1. 8

                                                                                                                                                            serious question: what does “doing some awk in Ruby” look like? This might be a pretty big motivator for me to finally figure out Ruby for scripting (I’m more of a Python guy myself but awk works nicely for small scripts on line-oriented stuff when I want a one-liner)

                                                                                                                                                            1. 8

                                                                                                                                                              Compare:

                                                                                                                                                              # Official way of naming Go-related things:
                                                                                                                                                              $ grep -i ^go /usr/share/dict/* | cut -d: -f2 | sort -R | head -n1
                                                                                                                                                              goldfish
                                                                                                                                                              

                                                                                                                                                              Versus Ruby:

                                                                                                                                                              puts(Dir['/usr/share/dict/*-english'].map do |f|
                                                                                                                                                                File.open(f)
                                                                                                                                                                  .readlines
                                                                                                                                                                  .select { |l| l[0..1].downcase == 'go' }
                                                                                                                                                              end.flatten.sample.chomp)
                                                                                                                                                              

                                                                                                                                                              Simple example, but I think it demonstrates that doing various basic and common tasks is quite a bit more complex in Ruby than in the shell.

                                                                                                                                                              That doesn’t mean I’m always in favour of shell scripts – I got that example from an article I wrote saying you shouldn’t use shell scripts – but there are definitely reasons shell scripting persists, even though we have things like Perl and Ruby.

                                                                                                                                                              In that article I wrote “I regret writing most shell scripts [..] and my 2018 new year’s pledge will be to not write any more”. I’ve mostly failed at that new year’s pledge, and have happily continued shelling about. I have started rewriting shell script prototypes in other languages at the first sign of them getting hairy, though, and that seems like a middle ground that is working well for me (I should update/amend that article).

                                                                                                                                                              1. 5

                                                                                                                                                                To be fair, it looks like most of the additional complexity in the Ruby code comes from reading files: the first command in the pipeline, grep -i ^re glob, is what becomes

                                                                                                                                                                Dir[glob].map do |f|
                                                                                                                                                                  File.open(f)
                                                                                                                                                                    .readlines
                                                                                                                                                                    .select { |l| l[0..1].downcase == re }
                                                                                                                                                                end.flatten
                                                                                                                                                                

                                                                                                                                                                The rest of the script contributes very little to the Ruby code.

                                                                                                                                                                I suspect this is a recurring theme when trying to replace shell pipelines with programs. Only Perl avoids some of this additional complexity for reading files, I think.

                                                                                                                                                                1. 5
                                                                                                                                                                  puts Dir['/usr/share/dict/*-english'].
                                                                                                                                                                    flat_map { |f| File.readlines(f).grep(/^go/i) }.
                                                                                                                                                                    sample
                                                                                                                                                                  
                                                                                                                                                                  1. 6

At least with Ruby I don’t have to constantly cross-reference the man page and my cargo-culted knowledge of Unix’s multitude of text-manipulation DSLs, all unalike. It’s pretty obvious what it’s doing.

                                                                                                                                                                    1. 1

                                                                                                                                                                      Actually you used very little shell there in your first example.

                                                                                                                                                                      You also used grep, cut, sort and head.

Why do you assume the backtick operator and the | operator for I/O don’t exist in Ruby? In fact, why do people assume shell and jq cease to exist if you use Ruby?

Personally I tend to reduce the number of tools involved, to reduce the cognitive load of needing to understand each tool in order to understand the one-liner.

I balance that against considerations like how IO.read("|sort -u fileName") can be a huge performance boost.
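As a minimal sketch of mixing the two (assuming a /usr/share/dict/*-english wordlist and a readable /etc/hosts exist on the system), backticks capture a command’s stdout as a string, and IO.popen streams a command’s output back into Ruby:

    # Backticks run a command and capture its stdout as a String.
    words = `grep -ih '^go' /usr/share/dict/*-english`.lines
    puts words.sample&.chomp unless words.empty?

    # IO.popen lets an external sort do the heavy lifting,
    # then Ruby reads the result line by line.
    IO.popen(['sort', '-u', '/etc/hosts']) do |io|
      puts io.first
    end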

Anyhoo… some examples of Ruby one-liners:

                                                                                                                                                                      http://reference.jumpingmonkey.org/programming_languages/ruby/ruby-one-liners.html

                                                                                                                                                                    2. 7

Because code in sed or awk that worked a decade ago (or, hell, two years ago) still works. Ruby code seems to bit rot faster than any other language I’ve used for nontrivial work.

                                                                                                                                                                      Also, with awk, I could put it down for a year, then use it again, and everything I’d need to be productive fits in a small man page. (The same seems to be true of sed, though I don’t use it much.) The Ruby ecosystem moves a lot faster, and if you haven’t been following it closely, catching up will add extra work. (Whether it’s actually going anywhere is neither here nor there.)

                                                                                                                                                                      Yes, awk is a more limited language, but that’s a trade-off – there are upsides, and I know which I’d prefer.

                                                                                                                                                                      1. 1

                                                                                                                                                                        Not true.

The awk scripts I wrote decades ago were written in Solaris awk, which is not quite the same thing as GNU awk.

Well-thought-out growth in a language is good.

I find the maintenance burden of rolling Ruby code forward across language versions is very low.

                                                                                                                                                                        Doubly so since rubocop will often autocorrect stuff.

                                                                                                                                                                      2. 6

                                                                                                                                                                        I don’t know Ruby. But for me these are the reasons why I am writing more and more bash programs:

• Bash is my command line. So I am doing a lot of small steps: file modifications, comparing, searching, analysing. At some point I can see that some of the steps can be composed, so I pull them out of the history, try them out on the console, and eventually put them into a script. If Ruby had a REPL in which I could do all the operations I do on the command line with less typing and more comfort, I would maybe give it a try.

                                                                                                                                                                        • Bash is on every Linux box. Ruby is not.

                                                                                                                                                                        1. 4

                                                                                                                                                                          Ruby does have a REPL. It’s called IRB and it comes with every Ruby installation. I use it exactly as you describe, for composing small programs iteratively.
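A hypothetical IRB session, growing the dictionary example from further up the thread step by step (the glob is an assumption about the system):

    $ irb
    irb(main):001:0> files = Dir['/usr/share/dict/*-english']
    irb(main):002:0> lines = files.flat_map { |f| File.readlines(f) }
    irb(main):003:0> lines.grep(/^go/i).sample

Each expression is evaluated immediately, so a pipeline can be refined interactively before it ever lands in a script.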

                                                                                                                                                                          1. 1

Are you using the Ruby REPL as your everyday console, or only when you already have a program in mind? I am asking honestly, because I do not know anything about Ruby or its REPL, and I am quite interested in how good it is as a replacement in daily life.

My point is that shell scripts are a by-product of using the shell for manual tasks. I get better and better at my shell usage, and even after 20 years I am still discovering new features or more efficient ways to do something. The shell language is really ugly, but its succinctness, the composition of Unix commands, the history, the prompt customization, the possibility of vi mode for editing (and I have probably forgotten a lot of features) all make the shell such an efficient tool.

                                                                                                                                                                            1. 2

                                                                                                                                                                              Well, no, not as my daily shell. I dislike shell scripting enough that I switch to Ruby pretty quickly if I’m having to spend any amount of time or effort on a task, but it’s not meant to be a replacement for bash/zsh/fish.

                                                                                                                                                                          2. 3

                                                                                                                                                                            Bash is on every Linux box. Ruby is not.

                                                                                                                                                                            Let’s not limit ourselves here. For those not using Bash and/or Linux, how about this:

                                                                                                                                                                            • Bourne-compatible $SHELL is on every Unix box. Ruby is not.
                                                                                                                                                                            1. 2

                                                                                                                                                                              Bash is on every Linux box. Ruby is not.

                                                                                                                                                                              So is ed.

                                                                                                                                                                              However sudo apt install ruby solves that problem.

                                                                                                                                                                              And yes, ruby does have a REPL.

                                                                                                                                                                              1. 2

                                                                                                                                                                                apt: command not found.

                                                                                                                                                                                sudo: permission denied

                                                                                                                                                                                $

                                                                                                                                                                                1. 2

                                                                                                                                                                                  Have fun with ed then, it’s the Standard!

                                                                                                                                                                                  https://www.gnu.org/fun/jokes/ed-msg.html

                                                                                                                                                                                  1. 1

                                                                                                                                                                                    I have written scripts in ed before to do some sufficiently tricky text manipulation. It’s a good tool.

                                                                                                                                                                            2. 5

                                                                                                                                                                              Mostly, because picking up enough jq, awk and sed to be useful is faster than learning the ins and outs of Ruby?

                                                                                                                                                                              I suppose you could make a similar argument about learning Ruby one-liners, but by the time I’m writing a very long bash script, I’m probably writing a larger program anyway, either in Go or Python. Ruby as a language doesn’t have much appeal to me, at least at the moment.

Awk, at least, fits very nicely into a small space right next to regex. jq is a bit fiddlier to pick up, but very nice for basic stuff. Sed I still don’t have down very well, but it is also nicely regex-adjacent.

                                                                                                                                                                              1. 3

I regularly write sed one-liners to do refactorings on my Ruby code. Usually the sed call is fed by the result of grep or find. I could write a Ruby one-liner to do the same, but it would be a much longer line and escaping would be much more difficult. Ruby is simply not a replacement for the convenience of sed.

                                                                                                                                                                                And maintainability is a red herring here: the whole point of something like sed is that you use it for one-off commands.

                                                                                                                                                                                1. 2

                                                                                                                                                                                  I’m not that experienced with jq, but when it comes to awk (and sed), one of their benefits is that you can easily write a program in the shell, since they act as glue between pipe operations.

For example, to filter out all lines that have fewer than five characters, all you have to write is

                                                                                                                                                                                  ... | awk 'length >= 5' | ...
                                                                                                                                                                                  

no imports or types required. It was made for stuff like this, which makes it easy to use. I’ve only read a book about Ruby a few years ago, but processing stdin/stdout this way should require a bit more overhead, shouldn’t it?
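For comparison, a rough Ruby one-liner doing the same filter might look like this (a sketch, using ruby -n to wrap the expression in a read loop over stdin):

    ... | ruby -ne 'print if $_.chomp.length >= 5' | ...

So the overhead is mostly the flag-and-quoting ceremony rather than imports or types.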

                                                                                                                                                                                  1. 1

                                                                                                                                                                                    One part of your history lesson is missing: Paul McCarthy and Steve Russell saw what was going to happen and pre-emptively invented Lisp. And yes, you can do everything in Lisp, in one line if you must, that you can do in bash, awk, sed, jq, perl… but in a more powerful and maintainable form.

                                                                                                                                                                                    ;)

                                                                                                                                                                                    1. 2

                                                                                                                                                                                      s/Paul/John/

This gotta be one of my most common brainfarts…

                                                                                                                                                                                      1. 2

                                                                                                                                                                                        It was Yoko’s fault.

                                                                                                                                                                                      2. 1

                                                                                                                                                                                        Ruby equivalents of the basic awk and sed examples from the article, as examples of Ruby one-liner structure:

                                                                                                                                                                                        • AWK: awk '{print $1}' logs.txt
                                                                                                                                                                                          • Ruby: cat logs.txt | ruby -ne 'puts $_.split[0]'
                                                                                                                                                                                          • Ruby: cat logs.txt | ruby -ane 'puts $F[0]'
• sed: sed 's/^[^ ]*//' logs.txt | sed 's/"[^"]*"$//'
                                                                                                                                                                                          • Ruby: cat logs.txt | ruby -ne 'puts $_.gsub(/^[^ ]*/, "").gsub(/"[^"]*"$/, "")'
                                                                                                                                                                                      1. 3

ctrl-A and ctrl-X increment and decrement the next number in the line

                                                                                                                                                                                        A good idea is to remap these to - and +. It’s normal mode, what else should these keys do? :)

                                                                                                                                                                                        1. 1

I like this idea, but I think that key combination is reserved for zooming on my machine, even in the terminal application and in the web browser.

                                                                                                                                                                                          1. 1

                                                                                                                                                                                            Go to the beginning of the next or previous line, of course.