1. 4

    Edited: I judged too harshly. Retracted my previous criticism.

    1. 6

      It does look like an ad for teleport, but I would not reduce it to “just”.

      1. 4

        how so? i found it a very good general discussion of the ux issues around exposing a terminal in a web browser.

        1. 3

          On the contrary, I was just about to comment that this post is a really good example of a company having the courage to talk about the internal decisions made on a product.

        1. 5

          I can’t emphasize enough just how critical this project is for adoption of Rust in companies with large C++ codebases: this either requires a CTO-level buy-in into Rust and a serious commitment, or projects like this to prove that Rust can be a good citizen of an existing C++ codebase. C FFIs are okay for isolated projects/libraries, but the usual enterprise code is rarely organized like this and interfacing C++ and Rust via C is not a good fit.

          If you want Rust to succeed in replacing C++, this is absolutely a project to contribute to.

          1. 4

            “a scruffy professor telling about touring machines or something”

            — that was good!

            1. 1

              Another edition? I love rust, but this is starting to feel silly…

              1. 15

                What? This is only the second edition. Did you think there was only going to be one edition?

                1. 14

                  This one will also be coming out 3 years from the last one. Hardly seems too fast; this is the same pace for new versions as C++.

                  1. 6

                    C++ is a famously overcluttered language; I’d prefer not to use it as a complexity benchmark for Rust. However, I also don’t object to the three-year edition cycle.

                    1. 1

                      3 years is likely not even enough time for some large projects to migrate to the previous edition. With a 3-year cycle there will be pressure for any large Rust user to effectively constantly be updating. 10 years would be a better minimum.

                      1. 8

                        Edition migration is a job for one afternoon, not 10 years. It’s trivial. If your code compiled without warnings, it’s just a matter of prefixing a few things with r# or crate::. cargo fix does it automatically anyway.

                        Don’t confuse edition-incompatible changes (which are few and small) with adopting new functionality added over these years. You don’t need to rewrite your code to use fancy new async/await just to enable a new compilation mode.

                        Also it’s not an all-or-nothing switch for entire projects and their dependencies, like Python 2->3 was. Rust allows mixing editions, so you update one crate at a time.
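                        To make the r# point concrete, here’s a hypothetical sketch of what such a migration fix looks like: code that used async as an ordinary variable name before the 2018 edition made it a keyword, with the raw-identifier prefix (the kind of change cargo fix applies automatically) keeping it compiling unchanged:

                        ```rust
                        // Hypothetical pre-2018 code that used `async` as a plain variable name.
                        // After the 2018 edition made `async` a keyword, the migrated code spells
                        // the same variable as the raw identifier r#async: no behavioral change.
                        fn main() {
                            let r#async = 2 + 2; // still just a variable named `async`
                            println!("r#async = {}", r#async);
                        }
                        ```

                        That rename is the entire extent of a typical edition-incompatible change; the surrounding logic is untouched.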

                    2. 1

                      No, but I hoped for a more reasonable speed to introduce breaking changes. It’s only 3 years since the last one!

                      1. 3

                        From https://doc.rust-lang.org/edition-guide/editions/index.html:

                        When a new edition becomes available in the compiler, crates must explicitly opt in to it to take full advantage. This opt in enables editions to contain incompatible changes, like adding a new keyword that might conflict with identifiers in code, or turning warnings into errors. A Rust compiler will support all editions that existed prior to the compiler’s release, and can link crates of any supported editions together.

                        In other words, there are no breaking changes, there are changes one can opt in to enjoy the new shiny things, and the editions are promised to be supported and compatible “indefinitely”.

                        1. 1

                          enables editions to contain incompatible changes

                          It’s right there in the quote :) Yes, a large project could stay on the previous edition, but there will be pressure to move, and more pressure as each new edition comes out and they are perceived as falling “further and further behind”. This is the same thing that happens with other opt-in language version ecosystems (Haskell, C, C++, probably others).

                          1. 7

                            The pressure to stay up to date with language idioms and latest dependencies is real, but it’s a separate thing from enabling an edition flag in Rust.

                            Editions are different from C++ versions! In C++ new features aren’t added to old versions, e.g. C++98 is frozen in time and hasn’t changed since 1998. This is not the case in Rust! Every new feature in Rust is added to all past Rust editions, as long as it’s possible (i.e. it doesn’t require a keyword added later). For example, the new 2021 const generics feature that will be released this month is available in the Rust 2015 edition.

                            Rust editions are closer to "use strict"; in JS. There’s one Rust with one feature set, one standard library, and there are different levels of warnings vs errors and minor syntax variations to keep old programs compiling with no changes.

                            Switching on Rust 2018 edition just boils down to tiny things like “I’m not using async or try keywords for my functions/variables”. You just rename variables if you did, and off you go!

                  1. 3

                    This is a valiant effort, but I am very pessimistic as long as Linux is written in C. I am also skeptical building Linux with LLVM will help much: it will help, but not much.

                    1. 3

                      Vulnerabilities will always exist regardless of choice of language.

                      1. 0

                        As soon as you port from C or C++ to Rust, far fewer vulnerabilities will exist, far fewer than you’d get from playing with the LLVM build.

                        1. 1

                          The devil is in the details: “as soon as” here means more than a decade, even for a very intensive rewrite of Linux in anything else.

                    1. 6

                      I did not participate in FOSDEM actively, but one thing I learned from it is that Matrix and Element are really good these days. I remember looking at riot.im two years ago and being underwhelmed; now I am impressed. It feels like a wonderful evolution of opensource messaging, teeming with life, seamlessly bridging Freenode and gitter.im; it’s like I am back in the wonderful days of Jabber.

                      1. 1

                        I might be a curmudgeon here, but I don’t think this project will live for long. The power of emacs comes from the ability to modify everything using a simple language - lisp. Adding javascript just increases complexity (ok, performance might be better) and adds an inconsistent language to the mix.

                        1. 4

                          I’ll be very interested to see how far it goes. If there is one language that could be used in a project like this and survive, Javascript would be it.

                          1. 3

                            Legitimately curious what would prevent a lisp front-end over an underlying typescript / javascript interpreter from interoperating with existing elisp.

                            1. 3

                              I tried writing a simple elisp-to-js transpiler some time ago, it’s doable and mostly straightforward (macros are tricky, but doable). I lost interest after I’d gotten the basics working (functions, variables, simple data types, macros). Maybe I should go back to that little project, it was fun.

                              1. 1

                                Sounds fun and like it could have practical/research benefit to this project!

                              2. 1

                                Same, it’s something I’ve been wondering about for a long time. I genuinely don’t understand why, as an example, I can’t use python’s regex functions (which I already know) to manipulate strings. Apart from performance concerns, of course, but often there aren’t any.

                                For me, the power of emacs usually comes from

                                • advice system
                                • scoping rules: sucks that lexical scope isn’t the default, but it’s nice when you can easily override any global variable over the course of a single function call
                                • integration with the runtime, e.g. REPL, evaluating arbitrary form in the file, being able to jump to any function source, etc.

                                I don’t see why this all can’t be achieved in other languages (but I’m happy to be convinced!).

                            1. 6

                              This person’s stream-of-consciousness style drives me crazy 🤯. I’m not going to spend 20 minutes reading a post with no idea what the point is eventually going to be. I reserve that for writers like Thomas Pynchon, whom I will allow to digress for 20 pages about Turkic languages, or colonialism in South Africa, because he’s Thomas Fscking Pynchon. This guy, I’m not so sure.

                              He had a decent, if obvious, point about compile-time checks saving time over the program’s lifecycle, but then I gave up after a few screenfuls about SSH attacks and HTTP parsing. I assume that’s going to tie into his grand theme later on, but I’d rather go find something that doesn’t bury the lede in a coal mine.

                              Plus, his cartoon asides aren’t as funny as Why The Lucky Stiff’s were.

                              1. 6

                                Style and comics are a matter of taste, I will give you that. What remains, to me, is that Amos lists three statements:

                                1. Programming in Rust requires you to think differently
                                2. It is harder to write any code at all in Rust
                                3. It is easier to write “correct” code in Rust

                                Most of the article is centered on proving the last point, by going through a lengthy example with NodeJS and Go only to come back to Rust in the end to demonstrate said point. I for one appreciate long form, in-depth content like this.

                                1. 1

                                  Programming in Rust requires you to think differently

                                  I hate this, because it presumes a lot about how the reader thinks in the first place. And if the author of such a statement is going to make sweeping assumptions about how I think, then they’re likely to make sweeping assumptions elsewhere too.

                                  1. 0

                                    If they’re aiming at correctness, a language like Haskell is better; Rust is, ultimately, a compromise with correctness and speed on current machines, and it dumps correctness too often to be safe in any real sense. Rust’s focus on “types” which merely specify the size in bits (making them fail to be types, as they do nothing to catch invalid operations) proves that it’s a systems programming language, as opposed to one with any emphasis on correctness beyond the superficial.

                                    1. 12

                                      Rust’s focus on “types” which merely specify the size in bits (making them fail to be types, as they do nothing to catch invalid operations)

                                      Um, where did you hear this? That’s not right at all.

                                      1. 6

                                        Did you read the article?

                                        1. 4

                                          It took me a bit to understand where you might be standing (and unflag the comment).

                                          Still, I vehemently disagree: Rust’s type system might be weaker than Haskell’s (yes, no HKTs or any way to do monads, none of the tons of advanced type system extensions), but it’s so much more than just specifying bit sizes. It’s the first time we have ADTs in a semi-mainstream systems programming language; traits are a direct translation of Haskell typeclasses. The borrow checker is a unique development that is only now finding its way into GHC, in a much more experimental form.

                                          Haskell, on the other hand, has quite a few non-typesafe footguns, like non-exhaustive pattern matching, partial functions, and laziness (which is not reflected in the type system at all, poisons it with bottom everywhere, explodes into space leaks, sneaks in lazy I/O, and makes reasoning about runtime behavior a nightmare), and it is still not expressive enough to avoid occasional Template Haskell or to verify monad laws mechanically.

                                      2. 4

                                        Every one of his articles reads like this. I’m digging it for like a page or two, and then he goes into some extremely specific corner case that just reads like he found a bug and has a massive axe to grind about it. I want to like his writing more than I do; he seems to be pointed in the right direction.

                                      1. 19

                                        Plan 9 is the peak of operating system design, so let’s assume that’s the basis. Anyone who is thinking about OS design and who hasn’t used Plan 9 has insufficient credentials for the job. Let’s not fork the code, though, the LPL is bad. Also, a micro- or hybrid-kernel is the right call in $CURRENTYEAR.

                                         Per-process namespaces, union filesystems, and the kernel interface should come along for the ride. 9P is good for network transparency, and that would be wise to keep, but it also bears acknowledging that shared computing has lost and making some design changes to improve single-system performance. These features make Plan 9 do containers 10x better than any of the others (be it Linux, or even Solaris or BSD), at 1/10th the complexity, and with much more utility.

                                        Factotum is good, but let’s expand the concept. Require FDE, prompt for a username and password on early boot, use it to decrypt the disks, and also use it to handle login and opening up a keyring. It would also be nice to expand Factotum with separate agents which grok various protocols, perform authentication for them, and hand the connection off to a client - so that they never have to handle the user’s secrets at all. ndb is also my preferred way of configuring networks - and the file format is a vast well of untapped potential for other applications - but it should be modernized to deal with ad-hoc networks more easily. Planned networks should come back into style, though.

                                         The rc shell is excellent, but can be streamlined a bit. The C API is not great - something a little bit closer to POSIX (with the opportunity to throw a bunch of shit out, refactor like mad, fill in some gaps, etc) would be better. The acid debugger has amazing ideas which have had vanishingly little penetration into the rest of the debugger market, and that ought to be corrected. gdb, strace, and dtrace could all be the same tool, and its core could be just hundreds of lines of code. The marriage of dtrace and acid alone would probably make an OS the most compelling choice in the market.

                                        ZFS is really good and something like it should be involved, and perhaps generalized and expanded upon.

                                         Introduce a package manager - my favorite is apk, but if you can come up with a much simpler approach to what nix and guix are trying to do, then it might be a good idea.

                                         The graphical-first, mouse-driven paradigm has been shown, in my experience, to be a poor design. Ditch it. Make the command line and keyboard operation first class again, and then let’s try something other than rio/acme/etc for the UI. Neat experiments, but not good in practice.

                                        1. 3

                                          @ac tries to “come up with a much simpler approach to what nix and guix are trying to do” with janet-based hermes & hpkgs.

                                          1. 3

                                            If your new OS needs a package manager, your new OS is broken by design. Package managers are made to handle accidental complexity which the original design failed to handle.

                                            1. 13

                                              What? You’ll have to expand on that a bit.

                                              1. 2

                                                I think that the comment above is coming from the macOS point of view: “just unpack a new app into /Applications/”. It might make sense to package end-user applications in another way than system-wide libraries and utilities.

                                                Dependencies and reusing system libraries are definitely not accidental complexity in my view.

                                                1. 2

                                                  macOS is able to do this only because it bundles gigabytes of libraries with the OS. It gets away without a package manager by, effectively, putting most of the dependencies for all applications in a single package that’s updated centrally. Even then, it doesn’t work so well for suites of applications that want to share libraries or services, which end up providing their own complex updater programs that manage library dependencies for that specific program.

                                                2. 1

                                                  On the contrary, I think you need to expand on why a package manager is needed, then identify what problems it solves, and then eliminate those problems. Running and installing programs is an integral task of an operating system.

                                                3. 4

                                                   Package managers allow users to install curated packages while also (in an ideal world) never messing up the system. These are nice properties.

                                                4. 2

                                                  How about FIDO keys instead of U/Ps?

                                                  1. 2

                                                     I like that you can store a U/P in your brain (the master password, at least), and I don’t like that a FIDO key can be stolen. But a mix would be cool.

                                                  2. 1

                                                    I’m not sure about your idea of dropping graphics and the mouse. Visual selection plus keyboard shortcuts seem like an annoying way to run/plumb. Or maybe you just want Vim?

                                                    1. 3

                                                       I don’t want to drop graphics and the mouse – I want to drop Plan 9’s single-minded obsession with graphics and the mouse being the only way to use the system.

                                                  1. 1

                                                     Vim lost me with its default blue text on black, rendering it impossible to read. Nice default.

                                                     Also, jumping back to the line I was last at in a file is annoying as hell, along with its inane idea that a line inserted after a comment should also be a comment.

                                                     It’s clearly tailor-made to annoy me so much that I end up having to give up and just build a saner BSD vi with no stupid “features”, a vi that just is vi.

                                                    1. 1

                                                       Sounds like you have an awful distribution; upstream vim out of the box isn’t that bad at all (though terminal colors can be iffy regardless). But I actually do think this is a good counterpoint to the claim that it can always be used anywhere… I frequently ssh into computers and find godawful vim configurations that just frustrate me.

                                                       That auto-comment thing is one I’ve commonly seen turned on that just drives me nuts.

                                                      I like my vim, other people’s vims tend to be broken trash. Frequently arrow keys randomly fail too, like come on, why would they set it up like that?

                                                      1. 1

                                                        Is this vim’s fault, though? set bg=dark would fix this instantly.

                                                        There is a disconnect between traditional terminal emulation world (vt100, terminal, etc) and the GUI world. There are no standard ways for a CLI application to figure out colors used by its GUI rendering. This causes quite a lot of user experience problems, indeed.

                                                      1. 3

                                                         Oh, this is very nice. Missing the newer micro-frameworks built around the new/standardized async - like fastapi or sanic.

                                                        I feel my frustration around a flask project at work is somewhat vindicated :/

                                                        1. 3

                                                          +1. I did enjoy proper async in Sanic instead of the mess of handler methods called in who-knows-what order in Tornado. The author’s interpretation of “some boilerplate” in Tornado was rather charitable.

                                                        1. 66

                                                          It would be terrible if this repo was replicated across the Internet.

                                                          Remember kids, don’t copy that floppy!

                                                          1. 12
                                                            1. 10

                                                              Codeberg is hosted in Germany, I wouldn’t count on this repo staying up.

                                                              1. 2

                                                                 I wouldn’t be so sure. In Germany, youtube-dl would probably be seen as a tool for making a personal copy, which is allowed in Germany.

                                                                1. 2

                                                                  Historically, why is Germany so anal about copyright compared to other countries?

                                                                  1. 10

                                                                     Easy money. Cease & Desist letters in Germany come with a fee attached even if you comply with them: you need to pay the lawyer’s fees (~800+ EUR). This is pretty unique. So there are “cease & desist” mills that trawl for people, e.g. off bittorrent and all the other networks. This means that some of these cases will end up at a court. As all things digital have no place of service or occurrence, the filing side can pick any court to go to, which is usually Hamburg or Cologne; these tend to be the most eager to stretch the law in the rights holder’s favor. But that’s actually not the intended process: what they want is people to be frightened into paying the lawyer’s fee on the first letter. They will even lower it if you so much as look like you might defend yourself.

                                                                    1. 2

                                                                      They, like the US, have a highly information based economy. If you can put a value amount on copyright, you can put a penalty amount. You can decide if a lawyer is worth it. You can make laws about it.

                                                                      1. 2

                                                                         Germany somehow seems even worse than the US though, despite producing less media, so I don’t understand that discrepancy.

                                                                        1. 2

                                                                          This is unsurprising. Most international media companies here are basically “importing” and rarely “exporting”, which means (distribution) licensing, so all their local orgs have a high focus on rights management and lobbying for better terms. And if you have staff lawyers around all the time… you might as well use them?

                                                                      2. 1

                                                                        I think that whole sentence can be substituted as “Historically, why is Germany so anal about everything compared to other countries?”…

                                                                        (Just kidding, sorry Germans!)

                                                                        1. 1

                                                                           Der Freud wegen. (“Because of Freud.”)

                                                                    2. 7

                                                                       The original repo is on the Wayback Machine 1400 times, with the most recent capture being 5 days ago. If you’re forking just so a copy lives on, and not to continue development, I would just snag it there to be sure of the source.

                                                                      Edited to make it clear I was linking to the original repo.

                                                                      1. 3

                                                                        https://github.com/plredmond/yt-download

                                                                        please do not click the fork button

                                                                        1. 1

                                                                           Do not save copies outside GitHub either. Wait until GitHub takes down all the forks in one go (it’s not hard to do technically).

                                                                      1. 4

                                                                         Isn’t the complicated code in GNU libc related to locales? See: https://github.com/gliderlabs/docker-alpine/issues/144, https://www.openwall.com/lists/musl/2014/08/01/1.

                                                                        Maybe someone with more knowledge can comment.

                                                                        1. 11

                                                                           The funny thing is that isalnum cannot take Unicode codepoints, according to its specification. What’s the point of using locales for unsigned char? 8-bit encodings? But I’d be surprised if it handled Cyrillic in e.g. CP1251.

                                                                           Update: Ok, I am surprised. After generating a uk_UA.cp1251 locale, isalnum actually handles Cyrillic:

                                                                          $ cat > isalnum.c
                                                                          #include <ctype.h>
                                                                          #include <stdio.h>
                                                                          #include <locale.h>
                                                                          
                                                                          int main() {
                                                                              setlocale(LC_ALL, "uk_UA.cp1251");
                                                                               printf("isalnum = %s\n", isalnum((unsigned char)'\xE0') ? "true" : "false");
                                                                          }
                                                                          
                                                                          $ sudo localedef -f CP1251 -i uk_UA uk_UA.cp1251
                                                                          $ gcc isalnum.c && ./a.out
                                                                          isalnum = true
                                                                          

                                                                          On the practical side, I have never used a non-utf8 locale on Linux.

                                                                          1. 17

                                                                             I guess you haven’t been around in the ’90s then, which incidentally is also likely when the locale-related complexity was introduced in glibc.

                                                                            Correctness often requires complexity.

                                                                             When you are privileged enough to only need to deal with English input, to software running on servers in the US, then yes, all the complexity of glibc might feel over the top and unnecessary. But keep in mind others might not share that privilege, and while glibc provides a complicated implementation, at least it provides a working implementation that fixes your problem.

                                                                            musl does not.

                                                                            1. 10

                                                                              Correctness often requires complexity.

                                                                              In this case, it doesn’t – it just requires a single:

                                                                              _ctype_table = loaded_locale.ctypes;
                                                                              

                                                                              when loading up the locale, so that a lookup like this would work:

                                                                              #define	isalpha(c)\
                                                                                      (_ctype_table[(unsigned char)(c)]&(_ISupper|_ISlower))
                                                                              

                                                                              Often, correctness is used as an excuse for not understanding the problem, and mashing keys until things work.

                                                                              Note, this one-line version is almost the same as the GNU implementation, once you unwrap the macros and remove the complexity. The complexity isn’t buying performance, clarity, or functionality.

                                                                              1. 7

                                                                                Interestingly, this is exactly how OpenBSD’s libc does it, which I looked up after reading this article out of curiosity.

                                                                              2. 4

                                                                                Locale support in standard C stuff is always surprising to me. I just never expect anything like that at this level. My brain always assumes that for unicode anything you need a library like ICU, or a higher level language.

                                                                                1. 4

                                                                                  Glibc predates ICU (which traces its ancestry to Sun) by 12 years.

                                                                                  1. 2

                                                                                    It probably does. A background project I’ve been working on is a Pascal 1973 Compiler (because the 1973 version is simple) and I played around with using the wide character functions in C99. When I use the wide character version (under Linux) I can see it loads /usr/lib/gconv/gconv-modules.cache and /usr/lib/locale/locale-archive. The problem I see with using this for a compiler though (which I need to write about) is ensuring a UTF-8 locale.

                                                                                  2. 2

                                                                                    you haven’t been around in the 90ies then

                                                                                    – true

                                                                                    only needing to deal with English input to software running on servers in the US

                                                                                    – I’ve been using and enjoying uk_UA.utf8 on my personal machines since 2010, that is my point

                                                                                    The rest of your assumptions do not apply to me in the slightest (“only English input”, “servers in the US”). I agree that I just missed the time when this functionality made sense.

                                                                                    Still, I think that the standard library of C feels like a wrong place to put internationalization into. It’s definitely a weird GNUism to try to write as much end-user software in C as possible (but again, it was not an unreasonable choice at the time).

                                                                                    1. 3

                                                                                      Locale support is part of the POSIX standard and the glibc support for locales was all there was back in the 90ies; ICU didn’t exist yet.

                                                                                      You can’t fault glibc for wanting to be conformant to POSIX, especially when there was no other implementation at the time.

                                                                                      1. 4

                                                                                        Locale support is part of the POSIX standard

                                                                                        Locales are actually part of C89, though the standard says any locales other than "C" are implementation-defined. The C89 locale APIs are incredibly problematic because they are unaware of threads and a call to setlocale in one thread will alter the output of printf in another. POSIX2008 adopted Apple’s xlocale extension (and C++11 defined a very thin wrapper around POSIX2008 locales, though the libc++ implementation uses a locale_t per facet, which is definitely not the best implementation). These define two things, a per-thread locale and _l-suffixed variants of most standard-library functions that take an explicit locale (e.g. printf_l) so that you can track the locale along with some other data structure (e.g. a document) and not the thread.

                                                                                        1. 2

                                                                                          Oh. Very interesting. Thank you. I know about the very problematic per-process setlocale, but I thought this mess was part of POSIX, not C itself.

                                                                                          Still. This of course doesn’t change my argument that we shouldn’t be complaining about glibc being standards-compliant, at least not when we’re comparing the complexities of two implementations where one is standards-compliant and one isn’t.

                                                                                          Not doing work is always simpler and considerably speedier (though in this case it turns out that the simpler implementation is also much slower), but if not doing work also means skipping standards compliance, then doing the work shouldn’t be counted as a bad thing.

                                                                                          1. 3

                                                                                            I agree. Much as I dislike glibc for various reasons (the __block fiasco, the memcpy debacle, and the refusal to support static linking, for example), it tries incredibly hard to be standards compliant and to maintain backwards ABI compatibility. A lot of other projects could learn from it. I’ve come across a number of cases in musl where it half implements things (for example, __cxa_finalize is a no-op, so if you dlclose a library with musl then C++ destructors aren’t run, nor are any __attribute__((destructor)) C functions, so your program can end up in an interesting undefined state), yet the developers are very critical of other approaches.

                                                                                  3. 5

                                                                                    What’s the point of using locales for unsigned char? 8-bit encodings?

                                                                                    Yes, locales were intended for use with extended forms of ASCII.

                                                                                    They were a reasonable solution at the time and it’s not surprising that there are some issues 30 years later. The committee added locales so that the ANSI C standard could be adopted without any changes as an ISO standard. This cost them another year of work but eliminated many of the objections to reusing the ANSI standard.

                                                                                    If you’re curious about this topic then I would recommend P.J. Plauger’s The Standard C Library.
                                                                                    He discusses locales and the rest of a 1988 library implementation in detail.

                                                                                    1. 2

                                                                                      Yes, locales were intended for use with extended forms of ASCII.

                                                                                      Locales don’t just include character sets, they include things like sorting. French and German, for example, sort characters with accents differently. They also include number separators (e.g. ',' for thousands, '.' for decimals in English, ' ' for thousands, ',' for the decimal in French). Even if everyone is using UTF-8, 90% of the locale code is still necessary (though I believe that putting it in the core C/C++ standard libraries was a mistake).

                                                                                      1. 2

                                                                                        Date formatting is also different:

                                                                                        $ LC_ALL=en_US.UTF-8 date
                                                                                        Tue Sep 29 12:37:13 CEST 2020
                                                                                        
                                                                                        $ LC_ALL=de_DE.UTF-8 date
                                                                                        Di 29. Sep 12:37:15 CEST 2020
                                                                                        
                                                                                  4. 1

                                                                                    My post elsewhere explains a bit about how the glibc implementation works.

                                                                                  1. 39

                                                                                    It’s an interesting update, but I’d be happier simply forgetting it existed along with everyone else.

                                                                                    1. 10

                                                                                      My fear about this is that it can actually capture some programming domain and we’ll forget about this just to wake up with another PHP on our hands.

                                                                                      This is not a baseless fear: both V’s author and community are relentlessly and aggressively self-promotional, they push a narrative of a perfect language non-stop (just look at the comments below), even though its incoherence and empty promises have been easily demonstrated many times.

                                                                                      A bit of negative hype might be justified.

                                                                                      1. 6

                                                                                        Potentially valid, but it seems like the more effective way would be to make better tools and promote them better. I am still not aware of a good plug-and-play system that can do some of the things that PHP can quite as simply.

                                                                                        Case study: I made the Rust crate ggez because the main community game engine at the time was piston. piston was so damn bad that I wrote an entire game framework to avoid it, initially basically as an attempt to kill piston, and I’m pretty pleased at how it’s worked out. Could be better, but I don’t think I’m being too self-congratulatory in saying that ggez is, if not the #1, then the #2 2D pure Rust game lib out there. I’d put a decent bet on it having more actual games and game experiments written using it than piston does. There are at least a couple Rust gamedev channels out there where the default answer to newbies saying “help I’m having trouble with piston” is people other than me saying “piston isn’t very good, why not try ggez?”… and these conversations invariably end with said newbie saying “wow, ggez is way nicer, thanks!”

                                                                                        Now don’t get me wrong, I used to really hate piston. I felt it was a turd fragment stuck to the toilet bowl of the universe and I acted accordingly. But honestly, that never got me anywhere good and I’m personally happier taking a step back from worrying about it. So, negative hype has its place, but is only one part of a tool set. If you do it well and objectively, as this post does, then it serves as a useful reference point for follow-up conversations; if someone asks “well why DO you hate piston” and isn’t satisfied with a basic answer I aim them at my attempt at objective assessment. But it is only one part of the conversation, and it’s not the first part to aim people at.

                                                                                        I’m hardly an expert on this, but so far these are my guidelines for gently smothering crap systems with better ones:

                                                                                        • Be the example you want. Make the alternative system better in every way. It’s useless to have superior code if your docs are crap, it’s useless to have superior docs if nobody can find them, it’s useless to have a well-organized project that nobody knows about.
                                                                                        • Market gently. Make announcements of major events in appropriate public places like reddit or mailing lists, but don’t try to overwhelm the hype of others. Just say “hey, we’re still here, doing cool stuff, here’s examples”. You don’t need to say “we’re better than X for these reasons”, just let your work speak for itself. Your stuff is inevitably cooler than your competition’s, and the people who matter will notice that.
                                                                                        • Never, ever, ever badmouth the competition. Never say “X is crap” or “Y is stupid”, even if it’s true. This just makes you look biased. People remember who you are, if they’re interested in this topic, and developing a reputation for having a personal hate-on for a system is not useful.
                                                                                        • Never compete with good systems. There are plenty of other 2D game libs out there for Rust, with various upsides and downsides. I’d happily recommend ggez over quicksilver for the things it’s better at, but quicksilver does useful things that ggez doesn’t as well. So I’m now buddies with the creators of most of these systems, we hang out and talk gamedev, and it’s usually useful for everyone.
                                                                                        • If you want to market one system to a user over another one, always do it invitingly. Never say “you should use X, Y is crap”, you say “why did you choose X over Y?” Often the answer is “X was the first thing I picked up” or “I have a use case that Y doesn’t solve”. If someone still chooses X over better alternatives, you shrug and walk away because they don’t know what they’re doing and their project is probably doomed to failure anyway.
                                                                                        • Target your effort strategically. I just checked crates.io, and piston has 160k downloads while ggez has 40k. Irksome, but why is this? Now, it looks like the main user of the piston crate is piston_window, which is used mainly by the plotters crate, which is a (novel and rather nice looking) plotting lib. plotters’s main reverse dependency is criterion, which is an awesome benchmarking crate – so that makes total sense. criterion is awesome and well-known, it uses plotters, and so piston gets tons of downloads. But plotters isn’t a game engine, it presumably doesn’t really care about piston, it just wants a handy window it can draw on without too much fuss. ggez is not a good option for that, but it’s a perfectly reasonable thing to want to do. So, if I wanted to cut into piston’s mindshare, the best way isn’t to update my objective assessment rant and point out all of piston’s flaws, but rather to submit a PR to plotters saying “here’s a new window backend using something else, it has advantages A, B, and C over the current model”. If I can’t do that… then the problem isn’t that piston exists, the problem is that it’s the best solution for a valid use case.

                                                                                        Anyone with suggestions and tips on doing this better, please let me know.

                                                                                        You’re playing the long game here. But so far I’ve seen this sort of approach work pretty well, albeit on one data point in my fairly small and niche domain. You don’t have to murder the bad system, you just have to arrange it so that the better system is set up for inevitable success and then let things take their course. Which is hard; the world is full of WorseIsBetter systems that exist mainly because of external factors that set up the bad system for inevitable success. But a young, open-source project that fills a specific need that isn’t otherwise being filled is probably the easiest context to do this in. As an analogy, Go didn’t succeed (only) because it was backed by Google, and node.js didn’t succeed (only) because it was written in Javascript. Both succeeded because, in 2010, the main alternatives for writing server backends were Python/Ruby, Java/C#, or C/C++. Go and Node filled a niche for high-performance, low-friction systems of a particular kind that was not being served well by existing solutions.

                                                                                        1. -6

                                                                                          A bit of negative hype might be justified.

                                                                                          Misrepresenting the project and its impressive progress is not an appropriate way to do that! See my main comment: https://lobste.rs/s/nfjifq/v_update_june_2020#c_vuofat

                                                                                      1. 13

                                                                                        I really appreciate what NetSurf has accomplished: it’s nearly sufficient for my day-to-day browsing. But I do miss my userscripts, extensions, and most of all vi-style keybindings.


                                                                                        sent from netsurf 3.10, btw :)

                                                                                        1. 1

                                                                                          I really wish I could give it try but last time I used it you couldn’t even switch tabs with the keyboard. Have they added keyboard support in recent releases?

                                                                                          1. 15

                                                                                            Hi, thanks for trying our little browser. I did rework the GTK frontend for this release, but some keyboard navigation shortcuts may still be missing.

                                                                                            Always happy to receive feature requests in the https://bugs.netsurf-browser.org tracker

                                                                                            Please do remember there are only a handful of us developing the browser for seven toolkits across eleven operating systems (https://ci.netsurf-browser.org/jenkins/view/Categorized/job/netsurf/), so if we do not get to it quickly it is not that we are not interested, just that we are stretched a bit thin.

                                                                                            1. 1

                                                                                              Thanks; I’ll give the new version a look. It’s a shame it’s written in C because I might be interested in contributing otherwise. Anyway I wish you the best of luck.

                                                                                              1. 9

                                                                                                A shame? I think that’s part of the appeal.

                                                                                                1. 7

                                                                                                  Why is it a shame?

                                                                                                  For C++ and Rust, there are established browser projects. This one is in C.

                                                                                                  1. 2

                                                                                                    I did apt install netsurf on Ubuntu 20.04 and tried to browse a bunch of my old-school sites (nothing fancy by modern standards: https, basic HTTP auth, some redirects here and there: gitea, static pages, etc) and netsurf segfaults on more pages than it is able to open.

                                                                                                    The authors did a great job, but C is not really an option here.

                                                                                                    1. 2

                                                                                                      3.9 hasn’t segfaulted on me, not even once.

                                                                                                      Arch packages.

                                                                                                      I wouldn’t be surprised if Ubuntu’s packages were just stolen outright from Debian, and forced to run with incompatible linkages. I do not trust derivative distributions.

                                                                                                    2. -1

                                                                                                      It’s very difficult to trust a project written in an unsafe language that exists primarily to view untrusted content, even when it has millions of dollars behind it. When the only contribution comes from unpaid enthusiasts it’s even more troubling. I can understand a huge company like Google or Microsoft erring on the side of conservative technology, but if you’re a ragtag band going up against Goliath you’ve got to make better choices to have a chance at keeping up.

                                                                                                      Also I just don’t have that much free time, and I prefer to spend it coding in enjoyable languages. I wouldn’t code without a repl unless I was getting paid quite a lot.

                                                                                                      Edit: not to show any disrespect; I’m just giving my reasons for declining to contribute personally.

                                                                                                      1. 3

                                                                                                        The problem is there’s little else that’s widely available (particularly on the marginalized platforms that NetSurf courts) that’s as portable (for better or for worse, we live in Unix and C’s shadow) and performant. Not even Ada (as sibling comment says) can be trusted to be available.

                                                                                                        1. 4

                                                                                                          Also include the fact that they target RiscOS, Amiga, Haiku, etc and you’re even more limited, without doing some yak shaving to get language support.

                                                                                                          That all being said, the project is relatively small. An ambitious programmer with some free time could port it all to Rust, or D, or whatever other safe language for their platform of choice. And, since it’s already C, they could do it incrementally and have a working browser the whole time. Not saying that anyone should, mind you, but it would be pretty neat. :)

                                                                                                        2. 4

                                                                                                          It’s very difficult to trust a project written in an unsafe language that exists primarily to view untrusted content

                                                                                                          People do more insane things, like using systems affected by the confused deputy problem, when seL4 exists.

                                                                                                          I frankly can’t blame netsurf for using a language that’s old, lightweight, well-understood and available for the platforms they target over, say, some immature experimental language that’s barely 5 years old and has very little in terms of successful projects made with it to show.

                                                                                                          Also I just don’t have that much free time, and I prefer to spend it coding in enjoyable languages. I wouldn’t code without a repl unless I was getting paid quite a lot.

                                                                                                          You’re right in doing whatever you want with your time, yet nobody has asked you to contribute, either.

                                                                                                          1. 2

                                                                                                            There’s a lightweight, widely supported, safe, and proven language. It’s called Ada. It’s also a part of GCC.

                                                                                                            1. 4

                                                                                                              Sure. If you want to write a browser in Ada, be my guest. But you’re surely not living in a bubble where you don’t know which language the parent was all about. It isn’t Ada, and there’s already a browser utilizing it, albeit only partially.

                                                                                                              Personally, I find the real problem to be the browser acting as TCB. The web standards have got so complex it is impossible in practice to write a safe browser.

                                                                                                              On a good design, exploiting a browser should yield no benefit. I would focus on that. Capabilities (such as implemented by seL4, the whitepaper of which is linked above) are a good building block to achieve that.

                                                                                              1. 1
                                                                                                • bin - a place to put scripts and dump .iso files and some installations like IntelliJ. Maybe it should be a symlink to .local/bin.
                                                                                                • code - contains my projects at the top level, also:
                                                                                                  • code/go - my $GOPATH
                                                                                                  • code/src - cloned git repositories, mostly from github, roughly organized by topic (lang, os, sys, util, etc)
                                                                                                • downloads as a messy dumping ground for files from the internet
                                                                                                • Sync for the Syncthing directory
                                                                                                • the usual stuff: desktop, docs, music, pics, video (yes, I edit ~/.config/user-dirs.dirs to get the shorter lower-case names)
                                                                                                • oh, I’ve just noticed snap in my home directory.
                                                                                                • a bunch of hidden directories, some of them more interesting than others: .local, .config, .cabal, .cargo, .hoogle, .rustup, .sbt, .stack, .VirtualBox and others.
                                                                                                1. 5

                                                                                                  Sorry if this is the wrong forum for this question, but what would it take for Plan 9 to become a viable option on servers or desktops today? It seems like many people are enthusiastic about the concepts it is built on and its potential, and have been for years, yet it seems as elusive as the GNU Hurd. Technically it may still be under development, but I’ve never encountered a machine, virtually or in person, that is running Plan 9.

                                                                                                  Why isn’t Plan 9 more popular? Is it an organizational problem, where there is no clear leader (person, corporation, or non-profit) pushing Plan 9 forward? Are there too many competing “forks” diluting what development effort exists? Is it a lack of good documentation/tutorials helping people get started developing Plan 9? Is there some licensing issue? Is it lack of hardware support, making it impossible to run on modern hardware? (Then why not run it in a virtual machine / emulator, as Redox OS does while it’s being developed?) Is it a lack of software written for it, or lack of a killer app that makes people want to run it instead of BSD or any other niche OS? Is it a sheer lack of publicity, so that fewer people are aware of its existence than I think? Is Plan 9 actually obsolete, so that people who really look at its design give up and go do something else with their time?

                                                                                                  1. 12

                                                                                                    I think there are a few factors:

                                                                                                    • The developers have very strong opinions. On things like obsessive adherence to the Unix philosophy (check the source for the plan 9 coreutils), syntax highlighting, mouse use, and so on. The nature of the system seems to make a lot of those opinions much harder to disagree with than other systems. Read the mailing lists or cat-v to get an idea of what I mean. I don’t think that this is a bad thing, but it is polarising.
                                                                                                    • The mouse is central. This stems from above, but I think it’s an issue in itself. A lot of the people who are likely to be interested are also likely to be invested in programs such as Vim or Emacs, and telling these people that they have no choice but to use a mouse isn’t going to go down well. Also, the prevalence of laptops these days means that people are less likely to always be able to use a mouse in the efficient way that is required. Furthermore, the mouse should be three-buttoned, and modern mice rarely are; the scroll wheel doesn’t work as a suitable alternative.
                                                                                                    • It works best together. Plan 9 is designed as a distributed operating system, and comes into its own when used on more than just a single personal machine. The fact no one uses it makes it hard for this to be achieved - a chicken and egg situation.
                                                                                                    • It’s ugly. Personally I quite like the aesthetic, but it does look like it’s from the 90s, and that’s going to turn a lot of people off. The interface is spartan, and many of its programs don’t come with easy ways to change the colour schemes to what the user might prefer.
                                                                                                    • There isn’t a good browser. I hate that I need one as much as the next person, but unfortunately that’s the case.

                                                                                                    These are the main things that have stopped me from using the system, and I’ve wanted to make it my main OS on a couple of occasions. Some people have switched, but others have moved to modern systems, bringing the killer apps with them.

                                                                                                    That’s the situation as I’ve experienced it anyway.

                                                                                                    [edit - just a heads up, apologies if this sounds a bit rushed, I wrote it once and then accidentally C-w’d my tab at the last moment (damn browsers!) and my thought process was a bit scattered the second time around.]

                                                                                                    1. 1

                                                                                                      Sorry to pick out this one thing, but why won’t the scroll wheel work as a 3rd button? Can’t you just push it without scrolling?

                                                                                                      1. 2

                                                                                                        The scroll wheel does work as a 3rd button. I wouldn’t call it unsuitable; it’s just less ergonomic than a real middle button.

                                                                                                        1. 1

                                                                                                          You can, but that’s not really what it’s designed for. Maybe you do and it works for you, but I find it frustrating because it feels like a wheel, not a button. Even when I disable scrolling with it, it feels like a wheel which is broken, so that’s even worse. I used to have a three-button mouse and it just felt better in that regard (it also had a ball, so it was worse in other regards).

                                                                                                      2. 9

                                                                                                        @twee answered in detail why Plan 9 as a whole doesn’t get much adoption. But a lot of pieces of Plan 9 have been inspirational to other more popular OSes, and as a research OS, I think that counts as success for Plan 9. Obvious examples are UTF-8, which is everywhere, and the /proc filesystem on Linux that gives “everything is just a file” access to all sorts of kernel internals. A less obvious one is the 9p file server protocol, which made a recent appearance in Windows Subsystem for Linux of all places!
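                                                                                                        As a tiny illustration of that “everything is just a file” idea (these paths assume a Linux system with /proc mounted), kernel and per-process state can be read with ordinary text tools:

                                                                                                        ```shell
                                                                                                        # Plan 9-style file interfaces as adopted by Linux's /proc:
                                                                                                        # kernel state is plain text files, no special syscalls needed.

                                                                                                        # Name and PID of the current process, straight from a file:
                                                                                                        grep -E '^(Name|Pid)' /proc/self/status

                                                                                                        # System uptime in seconds - also just a file:
                                                                                                        cut -d' ' -f1 /proc/uptime
                                                                                                        ```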

                                                                                                        1. 4

                                                                                                          Drivers are a huge obstacle for every alternative OS. Even on Linux, the situation is rough.

                                                                                                          The ubiquity of virtualization software (and somewhat consistent virtual hardware drivers) has been a boon to alt-OS usage and popularity.

                                                                                                          1. 4

                                                                                                            Plan 9 was and still is a beautiful experiment, an example of a research OS developed coherently, with a clear vision, while having a surprisingly decent userspace. However, it’s dangerous to confuse its aesthetic beauty and conceptual simplicity with actual utility for end users. The reason operating systems exist is to provide a hardware abstraction layer and to run end-user applications. It turns out that it’s possible to run the world on something as bloated and messy as Linux, and even, gasp, Windows. As long as the OS does not fall apart (like Windows 98) and has drivers for its target hardware, it’s good enough. Plan 9 clearly steered too far into “pure aesthetics” territory without being an order-of-magnitude improvement for end users.

                                                                                                            Already in the mid-90s, Plan 9 started to fall out of touch with the mainstream OSes, and it never found a niche where it was a winner. The other comments have already mentioned the archaic UI and the mouse-centric workflow that requires a middle button; it is opinionated and rather hostile to the workflows of most power and casual users. As for the organizational problems: a successful OS needs the backing of at least one big corporation (OpenBSD is the most successful OS that I still consider genuinely community-driven).

                                                                                                            To be honest, I prefer Plan 9 to stay in history in its pure form, as the myth of a perfect operating system, instead of watching it become something like what Linux is becoming today under the pressure of the needs of big enterprises.

                                                                                                          1. 1

                                                                                                            The oldie but goodie is “TCP/IP Illustrated” by Richard Stevens. It finally made networking click for me, and the later editions are quite good too.

                                                                                                            1. 3

                                                                                                              Running an absurd amount, finishing some work on my desktop refresh, and cleaning my desk. Maybe some building/3D printing.

                                                                                                              1. 1

                                                                                                                Is this absurd amount equal to a marathon, by any chance?

                                                                                                                1. 1

                                                                                                                  Was actually planning on more, and on a trail no less, but thanks to the chaos that was this week, I wasn’t able to work up to it.

                                                                                                              1. 13

                                                                                                                If your needs can be served by data that can be captured in server-side logs, then I recommend https://goaccess.io/

                                                                                                                1. 1

                                                                                                                  I tried to make goaccess work for me, but I am still not satisfied.

                                                                                                                  Maybe I’m just using it wrong: I have a cronjob that generates HTML reports for different logs in /var/log/nginx. It produces a lot of information that I cannot drill down into. The time intervals for different websites are uneven: some nginx logs span a long time, some just one day.

                                                                                                                  Another consideration is that it’s written in C (despite what its name might suggest), and I am a little concerned about munging strings from the internet in a C program.

                                                                                                                  1. 1

                                                                                                                    They have an official docker container if you want to run it in isolation. You can mount your logs as read-only into the container.
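                                                                                                                    For what it’s worth, a sketch of that setup - the image name, mount paths, and log format here are assumptions, so check the goaccess docs for the current official image and its entrypoint:

                                                                                                                    ```shell
                                                                                                                    # One-shot HTML report from an nginx log, with the logs mounted
                                                                                                                    # read-only so the C parser never gets write access to them.
                                                                                                                    # The report lands in ./report on the host.
                                                                                                                    docker run --rm \
                                                                                                                      -v /var/log/nginx:/srv/logs:ro \
                                                                                                                      -v "$PWD/report":/srv/report \
                                                                                                                      allinurl/goaccess \
                                                                                                                      /srv/logs/access.log \
                                                                                                                      --log-format=COMBINED \
                                                                                                                      -o /srv/report/index.html
                                                                                                                    ```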

                                                                                                                    I’ve only run the console program, and really just to see live traffic. For long term stats, I use awstats.