Threads for Sirikon

  1. 4

Earthly is multi-stage Dockerfiles with a marketing budget, then?

    1. 37

tl;dr: Tor, but paid. Just two hops, the company (Invisiv) and a partner (Fastly).

      Invisiv sees your IP. Fastly sees where you’re connecting to (and the content if the destination is unencrypted). Only thing keeping this information uncorrelated is both companies agreeing to not share data.

      Sounds weak. Better than a regular VPN, but not much.

EDIT: Closed source (or at least the sources aren’t disclosed on the webpage). The app is distributed through Google Play. Release an update and the app could act as a regular VPN without users noticing anything.

      1. 2

Correct, though we think the closest equivalent to this is Apple’s iCloud Private Relay, in pretty much every way (same architecture, protocol, and infrastructure), except that it’s for Android and it works for most/all apps rather than just specific apps. (Also, as I mention in another comment, we will be open sourcing it very soon.)

        1. 4

          Apple uses three exit node providers though, not just one.

          1. 4

            Yes and no – they do use three globally, but in each location they don’t use all three:

            https://arxiv.org/pdf/2207.02112.pdf

            1. 3

              But they don’t use just one either, they generally use two, a fact listed in section 4.2, paragraph “Geo Distribution” of the linked paper and easily verified by checking the assignment of addresses from https://mask-api.icloud.com/egress-ip-ranges.csv.

Where I live, I constantly switch between Akamai and Cloudflare.

              1. 4

                Where I am, I’ve seen it stick with Akamai 90% of the time, sometimes go to Cloudflare. But yes, that’s a matter of resilience that a big company like Apple needs. We may add a second soon.

                1. 1

                  Unrelated: why don’t you support iOS?

                  1. 2

                    Tricky question on multiple levels, but the main reason is that we figured, since Apple already has Private Relay on iOS, that there wouldn’t be that much interest from users (even though ours has some differences). We could port it if there were enough interest.

      1. 6

I think a framework like this will help improve adoption (in the end, you need some product to drive the adoption of a language, like RoR for Ruby or Flutter for Dart), but I am sad that the only “official” way so far to deploy your code is using their hosting platform.

        1. 6

Yes, that’s what I thought too. The new open source feels like it’s becoming the freemium model. “Freemium source”, you heard it here first, folks.

          But, all sarcasm aside, I do also feel glad that the money goes to the people who actually build the things and write the code.

          1. 4

            I believe one can just point the deno binary at a thing and run it. There’s no magic, you could make a Dockerfile with like 3 lines and host it on fly.io or whatever
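As a sketch of that “three-line Dockerfile” idea (the image tag, entrypoint file, and permission flag here are made up, so check them against the real Deno Docker images):

```dockerfile
# Hypothetical minimal image for a Deno app
FROM denoland/deno:alpine
COPY . /app
CMD ["deno", "run", "--allow-net", "/app/main.ts"]
```

From there, any container host (fly.io, a plain VPS, whatever) can run it.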

            1. 2

There’s no vendor lock-in anywhere. Just run the Deno runtime wherever, however you want.

              1. 2

I think they meant Deno Deploy, from the example project.

            1. 1

              Configuring my new Macbook @ work, required for working on iOS/React Native. I already hate every aspect of it.

              1. 6

                But wait, isn’t there that one nonguix project that allows you to install a normal kernel and Steam?

                Yeah, but talk about that in the main #guix channel and you risk getting banned. GG. You just have to know that it exists and you can’t learn that it exists without knowing someone that tells you that it exists under the table.

                Has this actually happened? Getting banned for talking about nonguix?

                1. 9

                  Not sure about getting banned, per se, but it’s explicitly discouraged. The second paragraph of nonguix’s readme:

                  Please do NOT promote this repository on any official Guix communication channels, such as their mailing lists or IRC channel, even in response to support requests! This is to show respect for the Guix project’s strict policy against recommending nonfree software, and to avoid any unnecessary hostility.

                  1. 12

                    even in response to support requests

                    Holy shit, that’s extremely disrespectful to users.

                    1. 2

I would recommend actually reading the help-guix archives to see how often support issues are created and how many of the issues users have are ignored or told they are out of place.

                    2. 12

                      I admit I fucked up and misunderstood the rules. My complaint now reads:

                      Yeah, but talk about that in the main #guix channel and you get told to not talk about it. You just have to know that it exists and you can’t learn that it exists without knowing someone that tells you that it exists under the table, like some kind of underground software drug dealer giving you a hit of wifi card firmware. This means that knowledge of the nonguix project (which may contain tools that make it possible to use Guix at all) is hidden from users that may need it because it allows users to install proprietary software. This limits user freedom from being able to use their computer how they want by making it a potentially untrustable underground software den instead of something that can be properly handled upstream without having to place trust in too many places.

                    3. 9

                      That’s made up, like most of that article, it’s full of misconceptions. Can’t tell whether or not this has been written in good faith.

                      But hey, outrage is good to attract attention. Proof to the point: I’m commenting this out of outrage.

                      1. 6

                        But hey, outrage is good to attract attention.

                        Hehe, yeah, the FSF and SFC use outrage constantly! I get emails all the time telling me that Microsoft and Apple are teaming up to murder babies or whatever. It’s pretty much all they have left at this point, and I say this as someone who donated and generally supported their mission for many, many years (which is why I still get the emails).

                        1. 3

Hyperbole and untruths are like pissing in your shoes to stay warm; they backfire once the initial heat is gone.

                      2. 6

                        When I wrote that bit I made the assumption that violating the rules of the channel could get you banned. I admit that it looks wrong in hindsight, so I am pushing a commit to amend it.

                        1. 2

                          Not to my knowledge. No. I’ve seen it tut-tutted but I’ve yet to see someone get banned.

                          1. 1

                            That’s 100% messed up if true.

                          1. 30

                            Unpopular opinion: I use bash all the way up til I need more than associative arrays, then I use Rust. It works surprisingly well for scripting tasks.

                            1. 15

                              Other than the rust part, I suggest this is the popular opinion ;-)

                              1. 7

                                Julialang.org is especially good. But I need to google how to use it all the time.

And JS, in contrast, is my “native” language. I remember it even at 3 a.m. So scripting in JS is, let’s say, much easier for me than in bash.

But some things are just easier in bash, like cat | wc -l. zx helps me to combine the power of the two.

                                I also believe it’s true for lots of people. This is why zx saw such an increase in popularity.

                                1. 4

                                  You’re absolutely right. That’s why it’s an unpopular opinion. 😁

                                  I’m fully aware that I’m far more willing to write far more bash (and zsh) than most people.

                                  1. 1

Julia is tough for me as well; so much of it feels very magical and I don’t really know how to operate it… but when it works it’s very cool.

                                  2. 3

                                    Ditto, but I reach for Ruby because other folks in my team/servers have the runtime installed.

                                    1. 3

                                      For scripting inside a project I tend to include an executable task file with this Python starter: https://gist.github.com/sirikon/d4327b6cc3de5cc244dbe5529d8f53ae

                                      1. 1

                                        What is the advantage of this as opposed to just write whatever code you need directly and run the script?

                                        1. 1

                                          With the Python starter I get the basic, repeating stuff done:

• A simple help command is generated (triggered by running just ./task); after a week or so without touching a project I just forget the commands and want quick help.
                                          • Working directory switched to the task file’s directory, handy when running the script from another place like a subdirectory.
                                          • A cmd function as a pass-thru to subprocess.run, but forcing check=True.

                                          Here’s the task file from one of my projects, as an example: https://github.com/sirikon/bilbostack-app/blob/master/task
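As a rough illustration of the kind of starter described above (this is a hypothetical sketch, not the actual gist; the task names are made up):

```python
#!/usr/bin/env python3
"""Sketch of a project 'task' file: generated help, working directory
pinned to this file's location, and a checked subprocess wrapper."""
import os
import subprocess
import sys

# Always run relative to this file, even when invoked from a subdirectory.
os.chdir(os.path.dirname(os.path.abspath(__file__)))


def cmd(*args: str) -> None:
    """Pass-through to subprocess.run, but forcing check=True so a
    failing command aborts the task instead of being silently ignored."""
    subprocess.run(args, check=True)


def build() -> None:
    cmd("echo", "building...")


def deploy() -> None:
    cmd("echo", "deploying...")


TASKS = {"build": build, "deploy": deploy}

if __name__ == "__main__":
    if len(sys.argv) > 1 and sys.argv[1] in TASKS:
        TASKS[sys.argv[1]]()
    else:
        # Running plain ./task prints the available commands.
        print("usage: ./task [" + "|".join(TASKS) + "]")
```

Running `./task` with no arguments prints the command list; `./task build` runs the named task.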

                                      2. 2

                                        /bin/sh to perl for me. I’d rather try (un)icon and scsh than javascript.

                                        1. 2

I usually use Python since it’s available, though I entirely understand the sentiment, given Rust’s built-in std::process and the clap crate.

                                          1. 5

The moment I need more than bash, I just look at which language is available and has the right libraries; it can be Rust, can be Python…

                                            1. 1

                                              Have you tried xonsh?

                                          1. 3

                                            For anyone wondering, the interesting part starts at 13 minutes.

                                            1. 13

                                              Deno is an impressive project. But importing URLs rather than some abstraction (package names, “@vendorname/packagename”, reverse DNS notation, anything) is a no-go for me. I 100% do not want a strict association between the logical identifier of a package and a concrete piece of internet infrastructure with a DNS resolver and an HTTP server. No thank you. I hate that this seems to be a trend now, with both Deno and Go doing it.

                                              1. 8

Package names, “@vendorname/packagename”, and reverse DNS notation, whether in systems like Maven or npm, are just abstractions over DNS resolvers and HTTP servers, but with extra roundtrips and complexity. The idea is to get rid of all those abstractions and provide a simple convention: whatever the URL is, it should never change its contents, so the toolchain can cache it.

                                                Any http server with static content can act as an origin for getting your dependencies. It could be deno.land/x, raw.githubusercontent.com, esm.sh, your own nginx instance, any other thing, or all of those options combined.

                                                1. 21

Package identifiers are useful abstractions. With the abstraction in place, the package can be provided by a system package manager, or the author can change their web host and no source code (only the name -> URL mapping) needs to change. As an author, I don’t want to promise that a piece of software will always, forevermore, be hosted on some obscure git host, or that I will always keep a particular web server alive at a particular domain with a particular directory structure; I want the freedom to move to a different code hosting solution in the future. But if every user has the URL in every one of their source files, I can’t do that. As a result, nobody wants to take the risk of using anything other than GitHub as their code host.

                                                  With a system which uses reverse DNS notation, I can start using a library com.randomcorp.SomePackage, then later, when the vendor stops providing the package (under that name or at all) for some reason, the code will keep working as long as I have the packages with identifier com.randomcorp.SomePackage stored somewhere. With a system which uses URLs, my software will fail to build as soon as randomcorp goes out of business, changes anything about their infrastructure which affects paths, stops supporting the library, or anything else which happens to affect the physical infrastructure my code has a dependency on.

                                                  The abstraction does add “complexity” (all abstractions do), but it’s an extremely useful abstraction which we should absolutely not abandon. Source code shouldn’t unnecessarily contain build-time dependencies on random pieces of physical Internet infrastructure.

                                                  That’s my view of things anyways.

                                                  1. 8

                                                    As an author I don’t want to promise that a piece of software will always, forevermore, be hosted on some obscure git host, or to promise that I will always keep a particular web server alive at a particular domain with a particular directory structure. I want the freedom to move to a different code hosting solution in the future.

The same applies to Maven and npm: repositories are coded into the project (or the default repository is defined by the package manager itself). If a host dies and you need to use a new one, you’ll need to change something.

                                                    What happens if npm or jcenter.bintray.com stops responding? Everyone will have to change their projects to point at the new repository to get their packages.

                                                    but if every user has the URL in every one of their source files I can’t do that. As a result, nobody wants to take the risk to use anything other than GitHub as their code host.

                                                    In Deno you can use an import map (And I encourage everyone to do so): https://deno.land/manual/linking_to_external_code/import_maps so all the hosts are in a single place, just one file to look at when a host dies, just like npm’s .npmrc.
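For illustration, an import map might look like this (the module names and version numbers are made up; see the linked manual page for the real syntax):

```json
{
  "imports": {
    "std/": "https://deno.land/std@0.120.0/",
    "oak/": "https://deno.land/x/oak@v10.1.0/"
  }
}
```

Source files then import via the bare prefixes (e.g. `import { Application } from "oak/mod.ts";`), and only this one file mentions concrete hosts.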

                                                    There are lockfiles, too: https://deno.land/manual@v1.18.0/linking_to_external_code/integrity_checking#caching-and-lock-files.

And another note: it’s somewhat typical for companies to have an internal repository that works as a proxy for npm/Maven/etc. and caches all the packages in case some random host dies, so that the company release pipeline isn’t affected. Depending on the package manager and ecosystem, you’ll need very specific software to implement this (Verdaccio for npm, for example). But with Deno, literally any off-the-shelf HTTP caching proxy will work, something way more familiar to systems people.
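As a sketch of that caching-proxy idea, assuming nginx and a made-up internal hostname, the whole thing could be little more than:

```nginx
# Cache module fetches so builds survive the upstream host disappearing.
proxy_cache_path /var/cache/nginx/deps keys_zone=deps:10m max_size=10g
                 inactive=365d;

server {
    listen 80;
    server_name deps.internal.example;   # hypothetical internal host

    location / {
        proxy_pass https://deno.land;    # or any other module host
        proxy_set_header Host deno.land;
        proxy_cache deps;
        proxy_cache_valid 200 365d;      # URLs are immutable, cache hard
        proxy_ignore_headers Cache-Control Expires;
    }
}
```

Since the convention is that a URL’s contents never change, the proxy can cache essentially forever.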

                                                    Source code shouldn’t unnecessarily contain build-time dependencies on random pieces of physical Internet infrastructure.

That’s right, but there are only two ways to build from source code without needing random pieces of physical internet infrastructure, and this applies to all package management solutions:

                                                    • You have no dependencies at all
                                                    • All your dependencies are included in the repository

The rest of the solutions are just variations of dependency caching.

                                                  2. 5

                                                    Although this design is simpler, it has a security vulnerability which seems unsolvable.

                                                    The scenario:

                                                    1. A domain expires that was hosting a popular package
                                                    2. A malicious actor buys the domain and hosts a malicious version of the package on it
                                                    3. People who have never downloaded the package before, and therefore can’t possibly have a hash/checksum of it, read blog posts/tutorials/StackOverflow answers telling them to install the popular package; they do, and get compromised.

                                                    It’s possible to prevent this with an extra layer (e.g. an index which stores hashes/checksums), but I can’t see how the “URL only” approach could even theoretically prevent this.

                                                    1. 2

                                                      I think the weak link there is people blindly copy-pasting code from StackOverflow. That opens the door to a myriad of security issues, not only for Deno’s approach.

There are plenty of packages in npm with names very similar to legit, popular packages, differing by maybe just a letter, an underscore, or a number. That’s enough for many people to install the wrong, malicious package and get compromised.

The same applies to domain names. Maybe someone buys den0.land and just writes it as DEN0.LAND in a forum online, because domains are case-insensitive anyway and the zero can hide better.

                                                      Someone could copy some random Maven host from StackOverflow and get the backdoored version of all their existing packages in their next gradle build.

Sure, in that sense, Deno is more vulnerable because of the decentralisation of package-hosting domains. It’s easier for everyone to know that the “good” domain is github.com or deno.land. If any host could be a good host, any host could mislead and become a bad one, too.

For npm, we depend entirely on the fact that the domain won’t get taken over by a malicious actor without anyone noticing. I think people will end up organically doing the same and getting their dependencies mostly from well-known domains such as github.com or deno.land, but I think it’s important to have the option to not follow this rule strictly and have some freedom.

                                                      EDIT:

Apart from the “depending on very well-known and trusted centralised services” strategy, something more could be done to address the issue. Maybe there’s something about that in the Deno 2 roadmap when it gets published. But fighting against blind StackOverflow copy-pastes is hard.

                                                      1. 1

                                                        What about people checking out an old codebase on a brand new computer where that package had never been installed before?

                                                        I dunno, this just feels wrong in so many ways, and there are lots of subtle issues with it. Why not stick to something that’s been proven to work, for many language ecosystems?

                                                        1. 1

                                                          What about people checking out an old codebase on a brand new computer where that package had never been installed before?

                                                          That’s easily solved with a lockfile, just like npm does: https://deno.land/manual/linking_to_external_code/integrity_checking

                                                          Why not stick to something that’s been proven to work, for many language ecosystems?

Well, the centralized model has many issues. Currently, every piece of software that runs on Node has to be hosted, to be distributed, by a private company owned by Microsoft. That’s a single point of failure, and a whole open source ecosystem relying on a private company and a private implementation of the registry.

                                                          Also, do you remember all the issues with youtube_dl on GitHub? Imagine something similar in npm.

                                                          Related to the topic: https://www.youtube.com/watch?v=MO8hZlgK5zc

                                                          1. 3

Good points, those. I hadn’t considered that!

                                                            The single point of failure is not necessarily inherent to the centralized model though. In CHICKEN, we support multiple locations for downloading eggs. In Emacs, the package repository also allows for multiple sources. And of course APT, yum and Nix allow for multiple package sources as well. If a source goes rogue, all you have to do is remove it from your sources list and switch to a trustworthy source which mirrored the packages you’re using.

                                                      2. 2

                                                        Seems like you might want to adopt a “good practice” of only using URL imports from somehow-trusted sources, e.g. npmjs.com or unpkg or whatever.

                                                        Could have a lint rule for this with sensible, community-selected defaults as well.

                                                        1. 2

If the code that contains the URL also includes a hash of the content then, assuming the hash isn’t broken, it avoids this problem.

                                                          i.e.:

                                                          import mypackage.org/code/mypackage-v1.1#HASH-GOES-HERE
                                                          

                                                          You get the URL and the hash, problem solved. You either get the code as proved by the hash or you don’t get the code.

                                                          The downside is, you then lose auto-upgrades to some new latest version, but that is usually a bad idea anyway.
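As a tiny illustration of that “URL plus content hash” scheme (the helper names and URL format here are hypothetical, not any real package manager’s API):

```python
# Sketch: fetch a dependency and refuse to use it unless its SHA-256
# matches the digest pinned next to the URL in the source code.
import hashlib
import urllib.request


def verify(body: bytes, expected_sha256: str) -> bytes:
    """Return body unchanged, or raise if its SHA-256 doesn't match."""
    digest = hashlib.sha256(body).hexdigest()
    if digest != expected_sha256:
        raise ValueError("hash mismatch: got " + digest)
    return body


def fetch_pinned(url: str, expected_sha256: str) -> bytes:
    """Fetch the pinned URL; either you get the exact bytes proved by
    the hash, or you get an error -- never silently-swapped content."""
    with urllib.request.urlopen(url) as resp:
        return verify(resp.read(), expected_sha256)
```

A domain takeover then can’t hurt existing references: the attacker’s replacement file fails the hash check.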

Regardless, I’m in favour of one level of indirection, so in X years when GitHub goes out of business (because we all moved on to new source control tech), people can still run code without having to hack up DNS and websites and everything else just to deliver some code to the runtime.

                                                          1. 1

                                                            This is a cool idea, although I’ve never heard of a package management design that does this!

                                                            1. 1

Agreed, I don’t know of anyone that does this either. In HTTPS land, we have SRI, which is in the same ballpark, though I imagine the # of sites that use SRI can be counted on one hand :(

                                                          2. 1

It’s sort of unlikely to happen often in Go, because most users use github.com as their URL. It could affect people using their own custom domain, though. In that case, it would only affect new users: existing users would continue to get the hash-locked versions they saved before from the Go module server, but new users might be affected. The Go module server does, I think, have some security checks built in, so if someone noticed the squatting, new users could be protected by blacklisting the URL in the Go module server. (https://sum.golang.org says to email security@golang.org if it comes up.)

                                                            So, it could happen, but people lose their usernames/passwords on NPM and PyPI too, so it doesn’t strike me as a qualitatively more dangerous situation.

                                                          3. 2

Whatever the URL is, it should never change its contents, so the toolchain can cache it.

                                                            Does Deno do anything to help with that or enforce it?

                                                            In HTML you can apparently use subresource integrity:

                                                            https://developer.mozilla.org/en-US/docs/Web/Security/Subresource_Integrity

                                                            <script src="https://example.com/example-framework.js"
                                                                    integrity="sha384-oqVuAfXRKap7fdgcCY5uykM6+R9GqQ8K/uxy9rx7HNQlGYl1kPzQho1wx4JwY8wC"
                                                                    crossorigin="anonymous"></script>
                                                            

                                                            It would make sense for Deno to have something like that, if it doesn’t already.

                                                            1. 2

                                                              Deno does have something like that. Here’s the docs for it: https://deno.land/manual@v1.18.0/linking_to_external_code/integrity_checking.

                                                              1. 2

                                                                OK nice, I think that mitigates most of the downsides … including the problem where someone takes over the domain. If they publish the same file it’s OK :)

                                                          4. 2

While I agree with your concern in theory, in the Go community this problem is greatly mitigated by two factors:

                                                            • Most personal projects tend to just use their GitHub/GitLab/… URLs.
                                                            • The Go module proxy’s cache is immutable, meaning that published versions cannot be changed retrospectively.

                                                            These two factors combined achieve the same level of security as a centralized registry. It is possible that Deno’s community will evolve in the same direction.

                                                            1. 1

                                                              change your hosts file

                                                            1. 31

The article doesn’t refute any of the arguments in the blog it’s responding to. Yet it has a point: Gemini seems to have taken off as a subculture.

People rally around this protocol as a backlash against the complexity of the web, even though a non-complex website can be made with HTTP and HTML, and it can be read on low-powered computers with oldschool browsers like lynx, or slightly more modern ones like links, dillo, or netsurf. Gemini is not about any of the technical features (or even the anti-features, if you will) but more about the emotion and ethos surrounding it.

                                                              1. 11

                                                                It’s simpler than that: Gemini is about having fun!

                                                                1. 7

                                                                  Gemini is not about any of the technical features (or even the anti-features, if you will) but more about the emotion and ethos surrounding it.

                                                                  Gemini is an NFT.

                                                                  1. 1

                                                                    I can’t really see the similarity myself.

                                                                    1. 5

                                                                      It’s a technical solution to a social problem with a committed bunch of supporters who are Extremely Online.

                                                                      Thankfully there is no money involved however.

                                                                  2. 7

                                                                    It does address the charge of exclusionism. I would even respond to that charge in stronger terms: if you require every new project to be 100% inclusive from day 1, you’re a useful idiot for rich monopolists who have large departments devoted to marketing how inclusive and accessible they are[1].

                                                                    Reading pages on Gemini requires installing a program. We all used to do this, back in the day! It blows my mind that this is considered exclusionist.

                                                                    [1] Except they’re not really if you actually focus on the details. Why the fuck is the author of that other blog post not complaining about how exclusionist Twitter is? They can’t render 280 fucking characters without Javascript.

                                                                    1. 18

                                                                      Reading pages on Gemini requires installing a program. We all used to do this, back in the day! It blows my mind that this is considered exclusionist.

                                                                      People install apps on their phones and laptops all the time (lol you should see my “Messenger” app group.) Gemini isn’t exclusionist for asking for an app install, it’s exclusionary for being mostly text based, for emphasizing keyboard navigation, and for eschewing accessibility for minimalism. If I were ever interested in sharing math on Gemini it would be pretty much impossible; there’s no way to represent the markup. In theory you can share other formats, like HTML or images, but in practice the community strongly wants to stick to text/gemini.

                                                                      There’s also a decent amount of purity politics. See https://github.com/makeworld-the-better-one/amfora/issues/199 for example. It’s a set of cultural values that wants to exclude and circumscribe by default. There’s nothing wrong with this, but it makes the community by definition exclusionary.

                                                                      1. 10

                                                                        There’s also a decent amount of purity politics. See https://github.com/makeworld-the-better-one/amfora/issues/199 for example.

                                                                        I’ve been intrigued by the idea behind Gemini for a while now (and Gopher before that), but reading through that conversation just made me absolutely certain that I never want anything to do with Gemini. To be fair, I now also doubt that they would want me involved in their community either :-)

                                                                        1. 14

                                                                          I mean, it’s mainly just Drew Devault who’s the issue in that exchange. If it makes you feel any better, he’s already been banned from Lobsters :)

                                                                          1. 1

                                                                            I haven’t been here for a few months so this is news to me. Whoah.

                                                                          2. 9

                                                                            That github issue made me so angry that I added a gemini title grabber to my IRC bot that would also fetch the favicon.txt file, just to spite that asshole.

                                                                          3. 4

                                                                            Yeah, that’s totally valid. “Accessible” is a more precise keyword here than “inclusive”, and one that isn’t talked about in either the OP or the post it’s responding to. It’s true that plain text isn’t accessible.

                                                                            I’ve been agonizing about this because my project is terminal-based. I’d characterize my current position as very/genuinely reluctantly excluding some people. I’d love to brainstorm ways to get around it with an accessibility expert.

                                                                            1. 8

                                                                              I’d love to brainstorm ways to get around it with an accessibility expert.

                                                                              Yeah, it’s something I’m a bit sensitive to because I have some really bad RSI issues. Personally, I’ve always learned much better with text than drawings (since I was a child), and when I found text interfaces on computers, I found them much easier to navigate than graphical interfaces. Unfortunately I had a sports injury in my wrist when I was young, and years of coding have now made my RSI pretty bad. There are days when I get by using only my left hand on an ambidextrous trackball. On those days using the terminal is a gigantic chore, and I feel super bummed when I read the fashionable online TUI maximalism in tech spaces. And I’m relatively lucky and privileged; I wasn’t even born with an actual disability. I can only imagine what it’s like for folks with other accessibility issues.

                                                                              I recall in the ’90s (though I may be conflating trends, so this might be more of a haphazard connection than a true connection) a desire to have the Web contain text and rich media to accommodate the information acquisition style most beneficial to the reader/viewer. By ideologically sticking to text the way Gemini does, I see Gemini making a strong statement that Geminauts should learn a certain way and that other types of learners (say graphical or aural learners) are not really considered. The Web as an open, welcoming technology then feels very different than the closed, specific community of Gemini.

                                                                              That said, as a personal project, you can’t “fix the world”. Focusing on a niche is fine IMO. We all have finite time in our lives and we do what we can with our time.

                                                                              1. 8

                                                                                other types of learners (say graphical or aural learners

                                                                                Just as a side note here: the idea that people have a “learning style”, i.e. that one person will learn best with audio while another learns best with visuals, has been widely refuted.

                                                                                1. 3

                                                                                  Want to +1 your comment. I’m aware, but I didn’t add that to my post, and I don’t want folks to think my statement is a claim about pedagogy in education at large.

                                                                                2. 2

                                                                                  Thanks! If you ever get around to trying out my thing, I’d love to hear any advice you have, whether on accessibility or anything else.

                                                                                3. 1

                                                                                  There’s a big difference between a terminal-based application for people to explore and ‘the web is for normies and if you don’t join our clique “you’re a useful idiot for rich monopolists”’ – which is a rather exclusionary thing to say.

                                                                                  1. 1

                                                                                    I don’t actually use Gemini much! I don’t hang out on the mailing list, I don’t know anybody there. If I’m in a clique, it’s a clique of 1.

                                                                                    If you require every new project to be 100% inclusive from day 1, you’re a useful idiot for rich monopolists who have large departments devoted to marketing how inclusive and accessible they are

                                                                                    I stand by this statement in all particulars. But I’m not sure who I’m excluding from what by saying it.

                                                                                    1. 5

                                                                                      If you require every new project to be 100% inclusive from day 1, you’re a useful idiot for rich monopolists who have large departments devoted to marketing how inclusive and accessible they are

                                                                                      That statement implicitly assumes that projects will later be extended to be more inclusive.

                                                                                      The problem is, Gemini seems pretty hostile to any extensions made after day 1, and this seems to be a specific goal of the project. This means if they ever want to include accessibility, it has to be planned in from day 1.

                                                                                      1. 2

                                                                                        I’m not sure what accessibility it needs to include? It’s trivial to create an audio-only, reading only, braille-only, large print, high contrast, translated to any language, version of any gemini capsule. Lacking support for mathematical notation or music isn’t about accessibility, it’s about content.

                                                                                        1. 5

                                                                                          Questions I’d have:

                                                                                          • How would a screen reader know what language a capsule is? What if you use multiple languages in your capsule?
                                                                                          • How does a screen reader know what to parse and what not to (e.g. images)?
                                                                                          • Those Figlet ASCII art images are absolute garbage for a screen reader

                                                                                          It’s trivial to create an audio-only, reading only, braille-only, large print, high contrast, translated to any language, version of any gemini capsule

                                                                                          So it’s been said since the beginning of the protocol, but all I’ve seen are TUI clients, an Emacs client, and a handful of GUI clients that still aren’t thinking about accessibility at all.

                                                                                          1. 5
                                                                                            • The MIME type of the resource requested is included in the response, and it’s there that one can include a language tag (it’s even specified for text/gemini). At best, you can set a language per file; if the document includes multiple languages, you are out of luck. But to be fair, does anyone actually tag foreign-language words or phrases in HTML? I know I do (via lang attributes) but I think I might be the only one.
                                                                                            • Images (like GIFs and JPEGs) aren’t displayed inline. Yes, it’s an issue that you don’t know a link is to an image (and what type of image) until it’s requested.
                                                                                            • The spec for text/gemini allows for “alt text” after the pre-formatted block marker, but there is no standard for how it should work, nor what it should contain. There’s been an insane amount of talk about the issue, but rarely (if ever) does someone even bother with a “proof of concept” to see how it might look or work (my biggest gripe with the Gemini community—mostly talk, no experimentation, because actual working code is hard work and who wants to do that?)

                                                                                            Disclaimer: I wrote the first available Gemini server which helped with identifying the worst bits of the protocol (I did not bother much with text/gemini). I pretty much left the community because of the community, but I still run my Gemini site (gemini://gemini.conman.org/ in case anyone is interested).
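                                                                                            For readers unfamiliar with the protocol details above: a Gemini response begins with a single header line, <STATUS><SPACE><META><CR><LF>, and for text/gemini the META field may carry a lang parameter, which is the per-file language tag being discussed. A minimal sketch of reading that header (the helper name is mine, not from any Gemini library):

```python
# Parse a Gemini response header line, e.g. "20 text/gemini; lang=en\r\n".
# Status 20 means success; META is then the MIME type of the body,
# optionally with parameters such as lang.
def parse_gemini_header(header):
    header = header.rstrip("\r\n")
    status, _, meta = header.partition(" ")
    mime, *params = [p.strip() for p in meta.split(";")]
    lang = None
    for param in params:
        key, _, value = param.partition("=")
        if key.strip().lower() == "lang":
            lang = value.strip()
    return int(status), mime, lang

print(parse_gemini_header("20 text/gemini; lang=en\r\n"))
# → (20, 'text/gemini', 'en')
```

                                                                                            As noted above, this gives at best one language per document; there is nothing like HTML’s per-phrase lang tagging.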

                                                                                            1. 1
                                                                                              • A screen reader would know the language the same way anyone ever does. Language detection is a reasonably well-solved problem, certainly for long-ish passages.
                                                                                              • What images?
                                                                                              • That’s a screen reader problem (actually an ASCII art problem), not a protocol accessibility problem. The web is no better at dealing with bad actors.

                                                                                              Feel free to write an Alexa client. It’d be pretty easy. (Like, legitimately so. It actually sounds kind of fun. I might try.)

                                                                                              1. 1

                                                                                                Those Figlet ASCII art images are absolute garbage for a screen reader

                                                                                                The standard allows for an optional “alt text” to be attached to preformatted sections.

                                                                                                Edit

                                                                                                How would a screen reader know what language a capsule is? What if you use multiple languages in your capsule?

                                                                                                The server I run (gemserv) has an optional directive indicating the language. I don’t know if it communicates this to the client though.
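                                                                                                To make the alt-text mechanism concrete: in gemtext, a line starting with ``` toggles preformatted mode, and any text after the opening ``` is the optional alt text, which a screen reader could announce instead of reading ASCII art character by character. A rough sketch of how a client might collect those (my own illustration, not code from any Gemini client):

```python
# Collect the alt texts of preformatted blocks in a gemtext document.
# A line beginning with ``` toggles preformatted mode; any text after
# the opening ``` is the (optional) alt text.
def preformatted_alt_texts(gemtext):
    alts = []
    preformatted = False
    for line in gemtext.splitlines():
        if line.startswith("```"):
            if not preformatted:
                alts.append(line[3:].strip())  # "" when no alt text given
            preformatted = not preformatted
        # while preformatted is True, a screen reader could skip the art
    return alts

doc = "```ASCII art of a rocket\n /\\\n/  \\\n```\nSome prose.\n"
print(preformatted_alt_texts(doc))
# → ['ASCII art of a rocket']
```

                                                                                                Since the spec leaves the contents of the alt text undefined, what a client does with the collected strings is still up for grabs, which is exactly the “no standard for how it should work” complaint above.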

                                                                                    2. 2

                                                                                      I’m not familiar with the insider politics of Gemini. Now I’m regretting wading into this. I’d have kept quiet if either blog post said, “the people on the mailing list are rude,” something I’m not qualified or interested to debate.

                                                                                      1. 2

                                                                                        it’s exclusionary for being mostly text based

                                                                                        Like the Amish.

                                                                                        If people want to live without electricity, who am I to tell them otherwise?

                                                                                        If I were ever interested in sharing math on Gemini it would be pretty much impossible;

                                                                                        Yes yes, all your calculus I’m sure is quite good, but there are other things too, and you might say nothing quite captures the beauty of a rose like a picture of a rose, and that music is best heard not talked about, and to say nothing of the medium of games and interactivity, where even being able to see all the code can spoil the ending!

                                                                                        We already have something perfectly mediocre at representing all of those things, but we don’t have anything really great at just doing text and links besides Gemini.

                                                                                        There’s nothing wrong with this, but it makes the community by definition exclusionary.

                                                                                        I disagree wholeheartedly: A meetup for blind people isn’t exclusionary if a sighted person can join. You are so welcome! You’re free to consume or produce whatever content you want, but so are they, and your inability to share your math with me speaks more of your abilities than mine for simply lacking the eyes with which to “read” it.

                                                                                        1. 14

                                                                                          Like the Amish.

                                                                                          The Amish don’t proselytize. In fact I suspect they’d be relieved if the outside world stopped being fascinated by them. This does not describe most Gemini evangelists.

                                                                                          We already have something perfectly mediocre at representing all of those things, but we don’t have anything really great at just doing text and links besides Gemini.

                                                                                          Well, views differ. I see the lack of semantic content for emphasis as crippling, for text. And I really cannot see any reason for this to be the case apart from the rigid “each line should be parseable as a unit” argument, and “it’s up to the client to decide how to present”.

                                                                                          Quoting myself from here:

                                                                                          italics and boldface are good, I am still mad

                                                                                          You can’t reproduce most prose works without having fugly underscores or asterisks all over. This is defacing our cultural heritage. Asking the user/client to download on the side and using an external reader is a cop-out.

                                                                                          It’s increasingly apparent to me that gemtext is well suited for one thing: writing about and discussing gemtext and Gemini. A bargain-basement Sapir-Whorf theory, in other words.

                                                                                          Gemtext is designed by someone who thought that the pinnacle of human communication is a 1990s Usenet post. Gutenberg and Aldus Manutius wept.

                                                                                          1. 2

                                                                                            The Amish don’t proselytize.

                                                                                            I think that depends on what you mean by proselytize; many Amish vote, they have Amish lobbyists, and some are even quite mission-oriented in how they talk of their faith (e.g. NOA). Some have certain rules that you have to follow if you want to participate in their community (like, for example, visiting an Amish church), and both (to me) seem quite tolerant of the existence of other faiths.

                                                                                            In any event, I don’t follow exactly what it is about Gemini “evangelism” that has anything to do with your inability to express yourself to blind people. To me, it’s like you’re telling people to remove the braille from elevators. Why? If you don’t want to use Gemini, who is forcing you?

                                                                                            I see the lack of semantic content for emphasis as crippling. … I really cannot see any reason for this.

                                                                                            That’s too bad. Again, why do you care that something exists that isn’t for you? Do you bring your cat to dog-walking clubs as well? A child to the pub? Do you think everyone should like the things you like?

                                                                                            1. 3

                                                                                              I dunno man, I’m just here, working on my gemsite, participating on Antenna, hanging out in the #gemini IRC and generally having a good time. I’m sorry I’m not comporting myself befitting a member of the Church of Gemini.

                                                                                      2. 8

                                                                                        Twitter used to have a nice mobile site. It worked beautifully with text-mode browsers. I used to use it with edbrowse. They killed it in December 2019, a couple weeks after I quit twitter.

                                                                                        And then there’s twitter’s crusade against third-party clients. Third-party clients happen to be very popular with the blind.

                                                                                    1. 1

                                                                                      I’ve been writing Crystal in the recent past, right before 1.0. Macros and API docs are lovely most of the time, but the main issues for me were:

                                                                                      • Compilation was too slow. Including slightly big libraries or frameworks in a project could mean 5-10 extra seconds of compilation every time. I was forced to use few or no dependencies to keep compilation times at sane levels. I know this version introduces an interpreter, but by their numbers (times reduced by 75% in the best scenarios), it’s not enough.
                                                                                      • Integration with VS Code for checking-while-writing was slow and inaccurate, too.
                                                                                      1. 9

                                                                                        tldr: There’s a list at the end of the article saying which clients will not support the new CA. If those matter to you, you’re in trouble.

                                                                                        Keep everything up to date and renew certificates.

                                                                                        1. 2

                                                                                          I’ve caught myself staring into space, correlating politics and social structures with architectural problems in software, and how fixing one could translate to the other.

                                                                                          Or seeing how other mundane stuff maps to a technical concept just because it has a similar name.

                                                                                          I can’t come up with an example, but this 100% happens to me a lot.

                                                                                          1. 42

                                                                                            Of course it requires apt! Because not only do we all run Linux, but we all run a specific distribution of Linux with a specific package manager.

                                                                                            I feel like we’re just skipping over the elephant in the makefile here. Why the hell is a Makefile running apt install in the first place?

                                                                                            1. 3

                                                                                              Why not? It’s ensuring the dependencies are in place.

                                                                                              1. 30

                                                                                                It’s rather unconventional for the build step to be doing system-wide changes. Having an additional recipe, e.g. make deps, which handled installing the dependencies and could optionally be run, would be reasonable, but historically and conventionally make by itself handles the build step for the application, and just the build step.
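                                                                                                For illustration, a hypothetical Makefile along those lines (target and package names are mine, not from the project under discussion):

```makefile
# Installing dependencies is opt-in: run `make deps` explicitly.
.PHONY: all deps

# The default target only builds; it never touches the system.
all:
	go build -o myapp .

deps:
	sudo apt-get install -y build-essential libsqlite3-dev
```

                                                                                                A plain make still only builds, and anyone on a non-Debian system can install the equivalents by hand and skip deps entirely.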

                                                                                                1. 8

                                                                                                  In this particular case, a lot of the “build dependencies” aren’t even that: they’re cross-build dependencies, e.g. if you want to create an ARM binary on your amd64 machine. You probably don’t want to do that when you’re compiling it for just yourself though.

                                                                                                  I’m not sure any of those dependencies are needed, actually: it installs the SQLite3 headers, but the go-sqlite3 package already bundles these, so they shouldn’t be needed. The only thing you should need is a C compiler (which I assume is included in the build-essential package?)

                                                                                                  None of this may be immediately obvious if you’re not very familiar with Go and/or compiling things in general; it’s adding a lot of needless friction.

                                                                                                  That Makefile does a number of other weird things: it runs the clean target on the build, for example, which deletes far more than you might expect, such as the DB, config, and log files, and a number of other things. That, much more than the apt-get usage, seems a big 😬 to me. It can really destroy people’s data.

                                                                                                  1. 2

                                                                                                    Having an additional recipe, e.g. make deps, which handled installing the dependencies and could optionally be run, would be reasonable

                                                                                                    That’s what I meant: it’s a reasonable way of running apt within make. I didn’t mean it as the default procedure when running make.

                                                                                                    EDIT: I know that historically and conventionally make doesn’t do that, but, you know, it’s two lines and it’s about getting the dependencies required for building… I don’t think it’s that much of a stretch.

                                                                                                    1. 3

                                                                                                      Oh yeah, definitely. If it’s there just not the default, that’s great and I’d totally +1 it. It’s handy!

                                                                                                      Just please no system-wide changes by running just make :(

                                                                                                  2. 3

                                                                                                    It only does that on one particular flavor of Linux. Even if we ignore BSDs, not everyone is Debian-derived.

                                                                                                    1. 1

                                                                                                      Is it typically the job of a Makefile to evaluate dependencies (maybe) and install them (maybe not typically)?

                                                                                                  1. 11

                                                                                                    I can understand people assuming apt exists on the system because:

                                                                                                    • Most of the time they’ll be correct
                                                                                                    • People who don’t have apt will probably know how to find the equivalent packages in their equally bad Linux package manager of choice.

                                                                                                    I can understand, too, people using a SQLite implementation for Go that doesn’t depend on cgo, because cgo has its own issues.

                                                                                                    Everything is hot garbage, whether you’re on Ubuntu or not. Don’t expect a gazillion scripts that install all the dependencies in every package manager imaginable; none of them is good enough to deserve that much attention, and it won’t happen. At least apt is popular.

                                                                                                    That’s a reason Docker is so popular: it’s easier to work on top of an image with a specific package manager that will not change between machines. The distribution or the equally bad Linux package manager of choice doesn’t matter, as long as you are on Linux and have Docker. And Dockerfiles end up being a great resource for learning all the required quirks that allow the code to work.

                                                                                                    And finally:

                                                                                                    First, please stop putting your binaries in /app, please, pretty-please? We have /usr/local/bin/ for that.

                                                                                                    Never. Linux standard paths are a tragedy and I will actively avoid them as much as possible. It’s about choices and avoiding monoculture, right?

                                                                                                    1. 10

                                                                                                      Linux standard paths are a tragedy and will actively avoid them as much as possible. It’s about choices and avoiding monoculture, right?

                                                                                                      No, it’s about writing software that nicely integrates with the rest of the chosen deployment method/system, not sticking random files all over the user’s machine.

                                                                                                      1. 22

                                                                                                        In this example /app is being used inside the docker image. It is most definitely not sticking random files all over the users machine.

                                                                                                        1. 4

                                                                                                          This hits a slightly wider point with the article - half the things the author is complaining about aren’t actually to do with Docker, despite the title.

                                                                                                          The ones that are part of the Docker build… don’t necessarily matter? Because their effects are limited to the Docker image, which you can simply delete or rebuild without affecting any other part of your system.

                                                                                                          I understand the author’s frustration - I’ve been through trying to compile things on systems the software wasn’t tested against, it’s a pain and it would be nice if everything was portable and cross-compatible. But I think it’s always been a reality of software maintenance that this is a difficult problem.

                                                                                                          In fact, Docker could be seen as an attempt to solve the exact problem the author is complaining about, in a way which works for a reasonable number of people. It won’t work for everyone and there are reasons not to like it or use it, but I’d prefer to give people the benefit of the doubt instead of ranting.

                                                                                                          Speaking of ranting, this comment’s getting long - but despite not really liking the tone of the article, credit to the author for raising issues and doing the productive work as well. That’s always appreciated.

                                                                                                          1. 3

                                                                                                            OP here. Aww, thank you! Yes, as noted in the disclaimer at the top, I was very frustrated! Hopefully I ported it all; now I’m trying to clean up some code so I can make patches.

                                                                                                            According to commenters on the Yellow Site, it’s not wise to “just send a patch” or “rant”; they say it’s better to open an issue first. Which honestly I still don’t understand. Could someone explain that to me?

                                                                                                            As an open-source maintainer, I like only two types of issues: 1) here’s a bug, 2) here’s a feature request and how to implement it. But if someone opened an issue saying “Your code is not running on the latest version of QNX”, I would rather see them say “Here’s a patch that makes the code run on QNX”.

                                                                                                            Regardless, I tried an experiment and opened a “discussion issue” in one of the tools, hoping for the best.

                                                                                                            1. 3

                                                                                                              According to commenters on the Yellow Site, it’s not wise to “just send a patch” or “rant”; they say it’s better to open an issue first. Which honestly I still don’t understand. Could someone explain that to me?

                                                                                                              Receiving patches without prior discussion of their scope/goals can be frustrating, since basic communication easily avoids unnecessary extra work for both maintainers and contributors. Maybe a feature is already being worked on? Maybe they’ve had prior conversations on the topic that you couldn’t have seen? Maybe they simply don’t have the time to review things at the moment? Or maybe they won’t be able to maintain a certain contribution?

                                                                                                              Also for end users, patches without a linked issue can be a source of frustration. Usually the MR contains discussion of the code/implementation details, and the issue holds the conversation around the goals of the implementation.

                                                                                                              Of course it always depends; if you’re only contributing minor improvements/changes, a discussion is often not needed.

                                                                                                              Or in other words, as posted on ycombinator news:

                                                                                                              Sending patches directly is a non-collaborative approach to open source. Issues are made for discussion, and PRs for resolutions; as a matter of fact, some projects state this explicitly, in order not to waste maintainers’ time with unproductive PRs.

                                                                                                          2. 3

                                                                                                            Exactly

                                                                                                            1. 2

                                                                                                              This is only a mediocre example, because with Go there should only be one binary (or a handful of them). But yes, if you put your software in a container, I am very happy if everything is in /app, and if I want to have a look I don’t have to dissect /etc, /usr/local, and maybe /var. If the container’s sole purpose is packaging one app, I see no downside to ignoring the FHS and putting it all together. There’s a reason most people do that for custom-built software (as opposed to a container that just does “apt-get install postgresql-server”, where I would expect the files to be where the distro puts them).

                                                                                                        1. 2

                                                                                                          Isn’t this experience nearly the same with every SaaS that allows some degree of customization? In all those, customization is an afterthought, and truly painful to work with.

                                                                                                          1. 5

                                                                                                            How ironic that Xamarin, originally created to bring .NET applications to Linux, is now not for Linux anymore. I wonder what de Icaza would think of that.

                                                                                                            1. 6

                                                                                                              Considering how the Linux community treated Mono, who can blame them?

                                                                                                              1. 1

                                                                                                                I’m not sure who would have backed Mono on Linux. Permanently second class, living in the shadow of an uncooperative giant. De Icaza’s long-term vision was always unclear to me, but history shows what it was: assimilation, annihilation ;)

                                                                                                                1. 4

                                                                                                                  I have a different view of it, informed by rms’ fatwa against it, which became the closest the Linux community got to an angry mob (e.g. screaming about Mono-based applications on distro CDs, trying to cancel a Debian developer for packaging it, etc.).

                                                                                                                  Microsoft in the 2000s seemed fairly cool towards it (e.g. adding Unix to the OS enum), but they were too Windows-focused to promote such a thing. I understand why Mono had to wither away for MS’ own push for .NET on Linux (Windows users didn’t think it was viable, Linux users had prejudice), but I’m sad that a good project spent years in the weeds because of it.

                                                                                                                  1. 3

                                                                                                                    I understand why Mono had to wither away for MS’ own push for .NET on Linux (Windows users didn’t think it was viable, Linux users had prejudice), but I’m sad that a good project spent years in the weeds because of it.

                                                                                                                    I don’t think this is quite what’s happened. One of the goals of .NET 5 was to merge the Mono and .NET Core codebases. Various bits of Mono infrastructure are used in the Xamarin components.

                                                                                                                    This makes me somewhat sad because Mono was very portable but the .NET Core-derived bits are Linux/Windows/macOS and often lose *BSD/whatever support that Mono has had for ages.

                                                                                                              2. 1

                                                                                                                Too much money to care about that

                                                                                                              1. 16

                                                                                                                If you’re building a new app today, the kind of stuff you will need from day one for your service (excluding data persistency) is some load balancing and blue/green deployments. Shoving a full Kubernetes cluster just for that is really overkill.

                                                                                                                And you’re relying on the Cloud(tm) to have a working Kubernetes cluster within a few clicks. If you’re using AWS anyway, just use Elastic Beanstalk or any equivalent that gets a Docker container up and running, load balanced, and easily updated with a script.

                                                                                                                All this while remaining cloud-agnostic, too. It’s just a Docker container; literally every big cloud provider has some sort of “deploy a container as a service”, and as a last resort you can have a VPS with Docker installed in no time.

                                                                                                                1. 5

                                                                                                                  We migrated our ~10 services from Beanstalk to ECS in two weeks. Beanstalk deploys started to fail once we got to 1000 containers, for mysterious AWS-internals-related reasons that required asking AWS support to un-wedge things on their end. The migration was pretty painless since everything was already containerized.

                                                                                                                  If your service is stateless, runs in a container, and speaks HTTP, it’s pretty easy to move between the different orchestrators. Your deploy step needs to write out a slightly different deploy JSON/YAML/… and call a slightly different orchestrator CLI, and maybe you need to do some one-time load balancer reconfiguration in Terraform. Far easier than moving apps that were used to stateful tarball deploys on bare boxes into containers in the first place.
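For what it’s worth, the “write a slightly different deploy JSON and call a slightly different CLI” step can be tiny. Here’s a hedged sketch in Python; the family name, image, and port are illustrative placeholders, not anyone’s real setup:

```python
import json


def ecs_task_definition(family, image, cpu=256, memory=512):
    """Build a minimal ECS task definition document.

    Only the fields most stateless HTTP services need are set;
    everything else keeps the orchestrator's defaults.
    """
    return {
        "family": family,
        "networkMode": "awsvpc",
        "requiresCompatibilities": ["FARGATE"],
        "cpu": str(cpu),
        "memory": str(memory),
        "containerDefinitions": [
            {
                "name": family,
                "image": image,
                "portMappings": [{"containerPort": 8080, "protocol": "tcp"}],
                "essential": True,
            }
        ],
    }


if __name__ == "__main__":
    taskdef = ecs_task_definition("my-service", "registry.example.com/my-service:v42")
    # The deploy step then shells out to the orchestrator's CLI, e.g.:
    #   aws ecs register-task-definition --cli-input-json file://taskdef.json
    print(json.dumps(taskdef, indent=2))
```

Swapping orchestrators then mostly means emitting a slightly different document shape and calling a different CLI, which is why the move is easy once the app is containerized.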

                                                                                                                  1. 3

                                                                                                                    I’ll add my own anecdote: I migrated a stack of 5 services, with secrets and other related things, from EKS to ECS in ~3 days. The biggest single obstacle I ran into is that ECS’s support for secrets isn’t as featureful as k8s’s; specifically, ECS doesn’t have a built-in way to expose secrets in the container filesystem. But I found an AWS sample showing how to achieve this using a sidecar container; here’s my fork of that code.

                                                                                                                  2. 1

                                                                                                                    AWS is really the odd one out, though; I once had one Kubernetes description instantiated on four different clouds (Google GKE, DigitalOcean, MS Azure, IBM Bluemix) with only the cluster names and keys changed.

                                                                                                                  1. 1

                                                                                                                    Using a full programming language to configure k8s works for complex scenarios, or for scenarios where a “systems team” provides an API for “development teams” so they can configure their services on the organization’s infrastructure while staying abstracted away.

                                                                                                                    But I wouldn’t use Go for that; I think it’s overkill and not a good language for the task. You’ll want rich data-manipulation mechanisms, and Go is lacking those.

                                                                                                                    1. 13

                                                                                                                      Excellent write-up! I’ve given a talk on a number of occasions about why Nomad is better than Kubernetes, as I too can’t think of any situations (other than the operator one you mention) where Kubernetes is a better fit.

                                                                                                                      1. 2

                                                                                                                        Hey, yes I’ve definitely seen your talk :D Thanks for the feedback!

                                                                                                                        1. 2

                                                                                                                          Watched your talk and have some points of disagreement:

                                                                                                                          • YAML is criticized extensively in the talk (with reason: it’s painful) as being an inherent part of Kubernetes, when in reality it’s optional, as you can use JSON too. And, most importantly, since you can use JSON in k8s definitions, anything that outputs JSON can work as a configuration language. You’re not tied to YAML in k8s, and the results you can get with tools like Jsonnet are way superior to plain YAML.
                                                                                                                          • I don’t think comparing k8s to Nomad is entirely fair, as they are tools designed for different purposes. Kubernetes is oriented towards fixing all the quirks of having networked cooperative systems. Nomad is far more generic and only solves the workload-orchestration part of the equation. As you explained well in the talk, you have to provide the missing pieces yourself to make it work for your specific use case. In a similar (and intended) example, there are many minimalistic init systems for Linux that give you total freedom and recombination… but systemd has its specific use cases in which it makes sense and just works. The UNIX philosophy isn’t a silver bullet; sometimes having multiple functionalities tied together in a single package makes sense for solving specific, common problems efficiently.
                                                                                                                          • About the complexity of running a Kubernetes cluster: true, k8s as it stands is a PITA to administrate and it’s WAY better to externalize it to a cloud provider, but there are projects like the one mentioned in the article, k3s.io, that simplify management a lot.
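To make the first point concrete, here’s a minimal sketch (my own, not from the talk) of generating a k8s Deployment as JSON from plain Python, with no YAML involved; the names and image are placeholders:

```python
import json


def deployment(name, image, replicas=2):
    """Emit a Kubernetes apps/v1 Deployment manifest as a plain dict.

    kubectl accepts JSON manifests just as happily as YAML, so any
    language that can print JSON can serve as the config language.
    """
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }


if __name__ == "__main__":
    # Pipe this into `kubectl apply -f -` instead of writing YAML by hand.
    print(json.dumps(deployment("web", "example.com/web:1.0"), indent=2))
```

Jsonnet, or any templating on top of this, gives you loops, functions, and deduplication that raw YAML simply doesn’t have.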

                                                                                                                          One thing we can agree on 100% is that neither Kubernetes nor Nomad should be the default tool for solving every problem, and that we should prefer solutions that are simpler and easier to reason about.

                                                                                                                          1. 1

                                                                                                                            I think you accidentally the link to the talk.

                                                                                                                            1. 1

                                                                                                                              Fixed, thanks :)

                                                                                                                          1. 5

                                                                                                                            It could be a huge marketing move for Microsoft to look cool, and it wouldn’t hurt their profits much, as it’s all about Azure now.

                                                                                                                              1. 3

                                                                                                                                Updated: Apr 11, 2019

                                                                                                                                1. 3

                                                                                                                                  2020 doesn’t help that case. MS carried on selling Xbox, Windows, and Dynamics, and renting LinkedIn to recruiters. Intelligent Cloud went up, but Microsoft is far from “all about Azure now”.