Threads for duncan_bayne

  1. 1

    I’m currently looking at wire-free mowers; anyone have any other recommendations? We have a fairly small garden, but it’s pretty complex, with various narrow areas and not many right angles.

    So far my main contender is the Ecovacs GOAT, despite the need for navigation antennas.

    1. 3

      Perhaps an actual goat?

      1. 4

        I think a wireless/unbound goat would ignore the boundaries I set, and likely also come into conflict with our dog…

        1. 2

          I assume they want to keep the garden. :-)

          Edit: I was imagining a garden of non-mown plants in beds with mown grass between them, but, if it’s all just grass, I guess the goat could work.

          1. 2

            Goats are browsers: they prefer nibbling on shrubs and bushes, and apparently also tend to pull up grass by its roots. Great for clearing brush, but for mowing lawns sheep are better… though I had a friend who grew up on a sheep farm, and they are apparently perilously stupid animals. Maybe rabbits? Though that has its own problems. “Sorry for the state of my yard, my lawnmower got eaten by a hawk.”

            1.  
              1. 1

                Maybe rabbits? Though that has its own problems. “Sorry for the state of my yard, my lawnmower got eaten by a hawk.”

                One can put a large cage over the rabbits and move it around every few hours/days so they mow different areas while being protected. I think I read that in a book. You smart foxes might still get them, though. :-)

                1. 1

                  I dunno, that sounds dangerously approaching effort…

                2. 1

                  are apparently perilously stupid animals

                  https://americanliterature.com/author/banjo-paterson/short-story/the-merino-sheep

                  No idea why this is on an American literature site as Paterson was Australian; but it’s a great yarn (heh).

            1. 5

              Perl. I used to use it for trivial glue and CGI; I’m right now learning more of the language in order to write a non-trivial program with it.

              Why? Because I didn’t want to use Go or C, and I want to eventually support Plan 9.

              1. 7

                I’m a former “hacker” turned corporate coder. Perhaps it is a natural transition from youthful idealism to aging conservatism, but I don’t see young hackers very much, not in this city at least. Everyone has an internship now, no one goes to the hackerspace after school. It would be easy to say that it is the money. Yes, to a large extent that is true. I’ve been remodeling my house for the last few years, and that is awful expensive… But if I didn’t have the possibility to make a lot of money coding, I’m not sure I’d still be a hacker. Maybe I’d be a teacher instead, or a plumber, or I’d sell my house and go do permaculture somewhere. I don’t know. Maybe I’d even be happier without the option to have the money… But would I still be reading about Agda and pushing 100% free software? Not likely.

                The main reason for my transition is that I am socially driven, and the sad fact is: projects with corporate sponsorship get more stars on GitHub, they get more shares, more people talk about them, they get more pull requests, they get more atmosphere, more social air to breathe. If you want to build free software, you’re very unlikely to be successful in promoting your project without corporate backing. Yes, there are a few non-corporate projects that are still being created (and it’s not like the old ones have gone away). Nix, to some degree Rust, Mastodon? Maybe you could claim Vue… But the fact is that ever since corporate open source took the world by storm, technical aspects (the soil) and even the surface aspects are dwarfed by the power of a corporate logo. There were a lot of interesting cross-platform distribution and isolation tools being built before and around the time of Flatpak and Snap. But Flatpak and Snap won out, even when neither was out of the alpha stage of software development. The Red Hat and Canonical logos could build a community before the code had even been written. Even those projects that were successful without corporate branding were quickly bought up by the likes of Red Hat. They hired most of the people who had been working on Gnome in their free time…

                The hackers got hired, and then they stopped hacking, and the ground became infertile for anyone without a logo or a series A.

                My hope is that now that everyone is being laid off, we can go back to hacking… And now that we see the risk of an OpenAI LLM monopoly, we’ll learn that we should support those projects that DO NOT have corporate backing.

                1. 4

                  Maybe you’re looking in the wrong places?

                  Gemini and the Fediverse are taking off; folks are still hacking on Plan 9(!), and the PinePhone is almost (or entirely, depending on your requirements) fine as a daily driver these days.

                  Yes they’re dwarfed by corporate open source. But that’s part of their charm.

                  1. 2

                    There’s been a huge explosion in new kernels in the last 10 years. I’ve written some internal memos about this trend because I strongly suspect that one of them will have a huge impact, most will die completely, and a few will eventually find some interesting niches. None of them are being written with Hyper-V / Azure support because it’s much easier to run Xen or KVM locally, test, and then deploy to AWS. By the time that we know which one will be the next disruptive technology, it will be well integrated with our competitors’ stacks.

                1. 4

                  This gist’s name would have benefitted from a comma.

                  1. 1

                    Yeah my first thought was “that’s an interesting direction to take”.

                  1. 3

                    I have definitely had a very good experience with KDE over the past year. It definitely does not look as nice as GNOME, but I can do the stuff I need to do with it, and the Plasma desktop environment mostly does things that make sense (the screenshot tool in KDE is quite nice and functional, if pretty ugly; GNOME’s looks way nicer and improves with every release, of course).

                    We had a very unfortunate thing happen with desktop environments, with both major DEs going through very painful transitions at the same time. Gnome is doing a lot of good work improving things, but so is KDE.

                    1. 10

                      We had a very unfortunate thing happen with desktop environments,

                      It wasn’t just the DEs, either. “Desktop Linux” just generally seemed to get worse for a long time after 2010-ish. And by “worse” I’m not talking about nerdy philosophical holy wars (systemd, etc), but I just mean getting things to work quasi-correctly. Some of it was “growing pains”, like when I had to do arcane rituals to simultaneously appease PulseAudio apps and ALSA apps, but some of it was just too much stuff all breaking in a short timespan. We had NetworkManager always doing something funny, Xorg changes + Wayland stuff, libinput with weird acceleration, polkit, udev, PAM, AppArmor, etc, etc, all changing or being rewritten or replacing each other.

                      Desktop Linux was nuts for a while. And, honestly, with the whole Snap vs Flatpak vs whatever ecosystems, I feel like we’re still doomed. Mostly because all of the app-containerizations absolutely suck. Badly. They all take up way more space, eat up way more resources when running, take longer to launch, etc. I know and understand that maintaining a massive repository for all supported software for an OS is kind of crazy and seems unsustainable, but these technologies are just not the answer. When will software devs learn that “computers are fast” is actually a lie used to justify lazy programming? </pedestal>

                      1. 4

                        It wasn’t just the DEs, either. “Desktop Linux” just generally seemed to get worse for a long time after 2010-ish. And by “worse” I’m not talking about nerdy philosophical holy wars (systemd, etc), but I just mean getting things to work quasi-correctly.

                        Some two years ago, when I was still going through my inner “fuck it, I’m getting a Mac – holy fuck I’m not spending that kind of money on a computer!!!! – but I should really… – LOOK AT THE PRICE TAG MAN!!” debates, a friend of mine pointed out that, you know, we all look at the past through rose-tinted glasses, lots of things broke all the time way back, too.

                        A few days afterwards we were digging through my shovelware CDs and, just for shits and giggles, I produced a Slackware 10 CD, which we proceeded to install on an old x86 PC I keep for nostalgia reasons. Slackware 10 shipped with KDE 3.2.3, which was still pretty buggy and not quite the “golden” 3.5 standards yet.

                        Man, it’s not all rose-tinted glasses, that thing was pretty solid. Two years ago I could still break Plasma desktop just by staring at it menacingly – like, you could fiddle with the widgets on the panel for a bit and have it crash or resize them incorrectly, drag a network-mounted folder to the panel to iconify it and then get it to freeze at login by unplugging the network cable, or get System Settings and/or Kwin to grind to a halt or outright crash just by installing a handful of window decoration themes.

                        Then again, the tech stack underneath all that has grown tremendously since then. Plasma 5 has the same goals on paper but it takes a lot more work to achieve them than it took back in 2004 or whatever.

                        1. 1

                          I love this anecdote, and I’ve had similar experiences.

                          I’m a software dev these days, myself, and I’ve always been a free software fan/advocate, so I don’t want to shit on anyone’s hard work–especially when they are mostly doing it for free and releasing it to the world for free. But, I do wonder where things went wrong in the Desktop Linux world.

                          Is it that the “modern” underlying technologies (wayland, libinput, systemd, auth/security systems, etc) are harder to work with than the older stuff?

                          Is it that modern hardware is harder to work with (different sleep levels, proprietary driver APIs, etc)?

                          Is it just that there’s so much MORE of both of the above to support, and therefore the maintenance burden increases monotonically over time?

                          Or is it just the age-old software problem of trying to include the kitchen sink while never breaking backwards compatibility so that everyone is happy (which usually ends up with nobody happy)?

                          Again, I appreciate the work the KDE devs do, and I’m really glad that KDE and Plasma exist and that many people use their stuff and are happy with it… But I will state my uninformed speculation as a fellow software dev: I suspect that the vast majority of bugs in Plasma today are a direct result of trying to make the desktop too modular and too configurable.

                          The truth is that the desktop pieces generally need to know about each other, so that the desktop can avoid being configured into a bad state, or so that widgets can adapt themselves when something else changes, e.g., a containing panel resizes, the screen size changes, etc. Obviously Plasma does have mechanisms in place for these things, and I don’t know what those mechanisms are (other than that it probably uses DBus to publish event messages), so this is just speculation, but I imagine that the system for coordinating changes and alerting all of the different desktop parts is simultaneously more complex and more limited than it would be if the whole desktop were more tightly integrated.

                          I strongly suspect that Plasma architected itself with a kind of traditional, Alan Kay-ish, “object oriented” philosophy: everything is an independent actor that communicates via asynchronous messages, and can be added and removed dynamically at runtime. I’m sure that the idea was to maximize flexibility and extensibility, but I also think that the cost of that approach is more complexity, and that it’s basically impossible to figure out what will actually happen in response to a change. Not to mention that most of this stuff is (or was the last time I checked, at least) written in C++, which is not the easiest language to do dynamic stuff in.

                          1. 1

                            I suspect that the vast majority of bugs in Plasma today are a direct result of trying to make the desktop too modular and too configurable.

                            I hear this a lot but, looking back, I really don’t think it’s the case. KDE 3.x-era was surprisingly close to modern Plasma and KDE Applications releases in terms of features and configurability – not on the same level, but also not barebones at all – and was developed by fewer people over a much shorter period of time. A lot of it got rewritten from the ground up – there was a lot of architecture astronautics in the 4.x series, so a couple of Plasma components actually lost some features on the way. And this was all happening back when the whole KDE series was a big unhappy bunch of naked C++ – it happened way before QtQuick & co.

                            IMHO it’s just a symptom of too few eyes looking over code that uses technology developed primarily for other purposes. Back in the early ‘00s there was money to be made in the desktop space, so all the cool kids were writing window managers and whatnot, and there was substantial (by FOSS standards of the age) commercial backing for the development of commercially-viable solutions, paying customers and all. This is no longer the case. Most developers in the current generations are interested in other things, and even the big players in the desktop space are mostly looking elsewhere. Much of the modern Linux tech stack has been developed for things other than desktops, too, so there’s a lot of effort to be duplicated at the desktop end (eh, Wayland?), and modern hardware is itself a lot more complex, so it just takes a lot more effort to do the same things well.

                            Some of the loss in quality is just inherent to looking the wrong way for inspiration – people in FOSS love to sneer at closed platforms, but they seek to emulate them without much discrimination, including the bad parts (app stores, ineffective UX).

                            But I think most of it is just the result of too few smart people having to do too much work. FOSS platforms were deliberately written without any care for backwards compatibility, so we can’t even reap the benefits of 20+ years of maintenance and application development the way Windows (and, to some extent, macOS) can.

                            1. 1

                              I hear this a lot but, looking back, I really don’t think it’s the case. KDE 3.x-era was surprisingly close to modern Plasma and KDE Applications releases in terms of features and configurability

                              It was very configurable, yes. But, I was speaking less from the lens of the user of the product, and more from the software architecture (as I came to understand it from blog posts, etc). I don’t know what the KDE 3.x code was like, but my impression for KDE/Plasma 4+ was that the code architecture was totally reorganized for maximum modularity.

                              Here’s a small example of what I mean from some KDE 4 documentation page: https://techbase.kde.org/Development/Architecture/KDE4/KParts. This idea of writing the terminal, text editor, etc as modular components that could be embedded into other stuff is an example of that kind of thinking, IMO. It sounds awesome, but there’s always something that ends up either constraining the component’s functionality in order to stay embeddable, or causing the component to not work quite right when embedded into something the author didn’t expect to be embedded in.

                              Back in the early ‘00s there was money to be made in the desktop space, so all the cool kids were writing window managers and whatnot, and there was substantial (by FOSS standards of the age) commercial backing for the development of commercially-viable solutions, paying customers and all. This is no longer the case.

                              Is that correct? My understanding was that a good chunk of the GNOME leadership were employed by Red Hat. Is that no longer the case? I don’t know the history of KDE and its stewardship, but if Novell or SUSE were contributing financially to it and now no longer are, I could see how that would hurt the person-power of the project.

                              Some of the loss in quality is just inherent to looking the wrong way for inspiration – people in FOSS love to sneer at closed platforms, but they seek to emulate them without much discrimination, including the bad parts (app stores, ineffective UX).

                              I definitely agree with this. That’s actually one reason why I tune out the GNOME Shell haters. It’s not that I don’t have some of my own criticisms about the UI/UX of it, but I really appreciate that they tried something different. Aside: And as someone who has worked on Macs for 10 years, it blows my mind when people say that GNOME Shell is at all mac-like; the workflow and UX has almost nothing in common with macOS except for the app-oriented super-tab switcher.

                              1. 1

                                Here’s a small example of what I mean from some KDE 4 documentation page: https://techbase.kde.org/Development/Architecture/KDE4/KParts. This idea of writing the terminal, text editor, etc as modular components that could be embedded into other stuff is an example of that kind of thinking, IMO.

                                Uhh… it’s been a while so I don’t remember the details very well but KDE 3 was definitely very modular as well. In fact KParts dates from the 3.x series, not 4.x: https://techbase.kde.org/Development/Architecture/KDE3/KParts . KDE 4.x introduced a whole bunch of new things that, uh, didn’t work out well for a while, like Nepomuk, and changed the desktop shell model pretty radically (IIRC that’s when (what would eventually become) Plasma Shell came up). Some frameworks and applications probably went through some rewrites, some were abandoned, and things like DCOP were buried, but the overall approach to designing reusable frameworks definitely stayed.

                                Is that correct? My understanding was that a good chunk of the GNOME leadership were employed by Red Hat. Is that no longer the case? I don’t know the history of KDE and its stewardship, but if Novell or SUSE were contributing financially to it and now no longer are, I could see how that would hurt the person-power of the project.

                                I think Red Hat still employs some Gnome developers. But Canonical no longer has a desktop team IIRC, Ximian is pretty much gone, Nokia isn’t pouring money into desktop/mobile Linux technologies, etc. Pretty much all the big Linux players are mostly working on server-side technologies or embedded deployments.

                                I definitely agree with this. That’s actually one reason why I tune out the GNOME Shell haters.

                                I don’t really mind Gnome Shell, Linux always had all sorts of whacky “desktop shell” thingies. However, I really started to hate my Linux boxes starting with GTK3.

                                I dropped most of the GTK3 applications I was using and took a trip down memory lane compiling Emacs with the Lucid toolkit. But it wasn’t really avoidable on account of Firefox. That meant I had to deal with its asinine file-finding dialog, the touch-sized widgets on a non-touch screen, and that awful font rendering on a daily basis. Not having to deal with that more than justifies the money I spent on my Mac; hell, I’d pay twice that money just to never see those barely-readable Adwaita-themed windows again *sigh*.

                                1. 1

                                  Uhh… it’s been a while so I don’t remember the details very well but KDE 3 was definitely very modular as well.

                                  Fair enough. I definitely used KDE 3 a bit back in the day, but I don’t remember knowing anything about the development side of it. I could very well be mistaken about KDE 4 being a significant push toward modularity.

                          2. 1

                            Oof, I echo all of this so much.

                            I was a desktop linux user from about 2003 until 2010 or so, going through a variety of distros (Slackware, Gentoo, Arch) and sticking with Ubuntu since 2006ish.

                            At one point I got tired of how much worse things were getting, especially for laptop users, and switched to a Mac. I’ve used Macs for my main working computers since then and mostly only used Linux on servers/rPis, etc.

                            About 3 years back, before the new Arm-based Macs came out, I was a bit fed up with my work computer at the time, an Intel-based Mac, being so sluggish, so I decided to try out desktop Linux again (with whatever the current Ubuntu was at the time) on my Windows desktop PC, which is mostly just a gaming PC.

                            I could replicate my usual workflow, especially because I never depended too much on Mac-specific apps, but even on a desktop machine with two screens, the overall experience was just… not great.

                            The number one thing that irked me was dealing with my two screens, which have different resolutions and different DPIs. The desktop UI for it just literally didn’t work, and I had to deal with xrandr commands that ran on desktop start; apparently this is “normal” and everyone accepted it as being OK. And even then I could never get it exactly right, and sometimes it would just mess up to the point that the whole display server needed a restart.
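
                            To give a flavour of it, this is roughly the kind of xrandr incantation I mean – a sketch only, with made-up output names and modes (yours will be whatever xrandr -q reports):

                                # Give the 1080p output a 3840x2160 virtual size (--scale 2x2) so
                                # windows keep roughly the same apparent size on both screens
                                # (DP-1 and HDMI-1 are hypothetical; check xrandr -q for real names).
                                xrandr --output DP-1 --mode 3840x2160 --pos 0x0 --primary \
                                       --output HDMI-1 --mode 1920x1080 --scale 2x2 --pos 3840x0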

                            Other than that, many of these modern web-based desktop apps just have all sorts of issues with different DPIs and font rendering.

                            I thought, how were all of these things still such a massive issue? Especially the whole screen thing, with laptops being the norm over the last 15 years and people often using external screens that probably have a different DPI from their laptop anyway?

                            Last year I decided to acquire a personal laptop again (for many years I only had work laptops) and I thought I’d have a go at a Framework laptop, and this time I thought I’d start with Kubuntu and KDE, as I’d also briefly tried modern KDE on an Asahi Linux installation and loved it.

                            KDE seems to handle the whole multiple-display/DPI thing a lot better, but still not perfectly. The web-based desktop app and font-rendering issues were somehow still there, but not as bad (I did read about some Electron bugs that got fixed in the meantime).

                            And then I dove into the whole Snap/Flatpak thing, which I was kind of unfamiliar with, not having used desktop Linux for so many years. And what a mess! At multiple points I had multiple instances of Firefox running, and it took me a while to understand why. Some apps would open the system Firefox; others would go for the containerized one.

                            I get why these containerized app ecosystems exist, but in my limited experience the interoperability between these apps seems terrible, and it makes for a terrible user experience. It feels like a major step back from all the improvements and ease of use Desktop Linux made over the years.

                            I did also briefly try the latest Ubuntu with Gnome, and the whole dual-screen DPI situation was just as bad as before; I’m guessing it’s related to the whole fractional scaling thing. Running stuff at 100% was fine but too small; 200% was fine but too big; 150%? A blurry mess. KDE deals fine with all those in-between scales.

                          3. 2

                            My other spicy opinion on breakage is that Ubuntu not doing rolling releases holds everything back, because bug fixes take too long to get in front of users.

                            “I don’t want updates to break things.” OK, well, now every bug fix takes at least 6 months to get released. And is bundled with 10 other ones.

                            I understand the trade-offs being made, but imagine a world in which bug fixes show up “immediately”.

                            1. 1

                              I understand the trade-offs being made, but imagine a world in which bug fixes show up “immediately”.

                              … that was the world back in the day.

                              “Oh, $SHIT is broken. I see the patch landed last night. I’ll just grab the source and rebuild.”
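
                              Roughly, with a hypothetical project standing in for $SHIT (and assuming the usual autotools dance):

                                  # Grab the source…
                                  git clone https://git.example.org/shit.git && cd shit
                                  git log --oneline -1    # yep, last night’s fix landed
                                  # …and rebuild.
                                  ./configure && make && sudo make install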

                              1. 1

                                And it still is, depending on the distro (obviously, you can manually compile/manage packages with any distro, but some distros make that an officially supported approach).

                              2. 1

                                I agree that Ubuntu’s release philosophy isn’t great, but in its defense, bug fixes are not blocked from being pushed out as regular updates in between major releases.

                                What I do think is the big problem with Ubuntu’s releases is that there used to be no real distinction between “system” stuff and “user applications”. It’s one thing to say “Ubuntu version X.Y has bash version A.B, so write your scripts targeting that version.” It’s another to say “Ubuntu version X.Y has Firefox version ABC.” Why the hell wouldn’t “apps” always just be rolling release style? I do understand that the line between a “system thing” and a “user-space thing” is blurry and somewhat arbitrary, but that doesn’t mean that giving up is the right call.

                                To be fair, I guess that Ubuntu’s push toward “snap” packages for these things does kind of solve the issue, since I think snaps can update independently.

                              3. 1

                                It wasn’t just the DEs, either. “Desktop Linux” just generally seemed to get worse for a long time after 2010-ish.

                                That’s part of why I landed on StumpWM and stayed. It’s small, simple, works well for my use cases, and hasn’t experienced the sort of churn and CADT rewrites that have plagued others.

                                Moving to FreeBSD as my daily driver also helped there, because it allowed me to nope out of a lot of the general desktop Linux churn.

                                1. 1

                                  Why do people rewrite everything all the time?

                                2. 2

                                  I switched to Plasma from GNOME because I was tired of my customizations getting obliterated all the time. I also like the fact I can mess with the key combinations in many more apps, since my muscle memory uses Command, not Control. Combined with a couple add-ons and global menu, I’ve never looked back.

                                  1. 2

                                    We had a very unfortunate thing happen with desktop environments, with both major DEs going through very painful transitions at the same time.

                                    The reasons were entirely clear, but people’s memories are short, and there is a tonne of politics.

                                    Microsoft threatened to sue. Red Hat and Ubuntu (the 2 biggest GNOME backers) refused to cooperate (including with each other) and built new desktops.

                                    SUSE and Linspire (2 of the biggest KDE backers) cooperated.

                                    I detailed it all here.

                                    https://liam-on-linux.dreamwidth.org/85359.html

                                    Someone stuck it on HN and senior GNOME folk denied everything. I don’t believe them. Of course they deny it, but it was no accident.

                                    This is all a matter of historical record.

                                  1. 3

                                    Fastmail. I prefer it to Google (my previous provider) because:

                                    1. It’s not Google.
                                    2. It’s not “free”; I pay a monthly fee; that gets me services and support. I know I’m not the product.
                                    3. It allows me to have a family account that I can administer, letting my children have their own addresses on our family domain, but also, allowing me to monitor their usage.
                                    4. Excellent support. I’ve seen their CTO on GitHub issue threads in years gone by.
                                    5. High reliability. I’ve only had a couple of issues in the years I’ve been with them.
                                    6. Their work on JMAP.
                                    1. 3

                                      I’ve seen their CTO on GitHub issue threads in years gone by.

                                      This is one of the reasons I like DuckDuckGo for search. Many years ago, they changed something in their search field that broke the keyboard navigation shortcuts on OS X (Google doing this was one of the reasons I started using DDG), so I filed a bug. A few hours later, I got an email from Gabriel (the founder) with a link to a test site to ask me if it fixed the issue. It did and they rolled out the fix that day.

                                      1. 1

                                        This is the GitHub issue in question:

                                        https://github.com/skarra/ASynK/issues/72#issuecomment-112303307

                                        Compare and contrast with other “tech” companies where it takes forever to even talk to someone who understands the problem. And this isn’t (purely) a matter of scale: I’ve had that sort of problem with quite small companies as well (especially in the education tech market, as I’ve reported a number of quite serious bugs in that space).

                                    1. 3

                                      Given the blog is titled “Decentralised Thoughts” I thought it was funny that they ended the piece with:

                                      “Please answer/discuss/comment/ask on Twitter.”

                                      1. 2

                                          Wow, this is a great idea. I am revamping my professional website right now - it’s currently just a single-page resume - and I’ll be shamelessly stealing this approach.

                                        1. 6

                                          See how much I can sell it for.

                                          I’m no fun.

                                          1. 2

                                            Money can’t buy love; but it can buy a lot of fun.

                                          1. 6

                                            I hope progress is being made on federation for gitea (and others).

                                            1. 5

                                              Git has supported “federation” from its inception. Activity-pub’ing some git frontend webshit isn’t going to work any better than adding activitypub to anything else has.
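
                                                Concretely: every clone is already a full peer, and you can exchange changes with no hub in the middle (hosts here are hypothetical):

                                                    # Fetch a collaborator’s changes straight from their machine.
                                                    git remote add alice ssh://alice.example.org/~alice/project.git
                                                    git fetch alice
                                                    git merge alice/main

                                                    # Or generate a pull request the old-fashioned way, to paste into an email.
                                                    git request-pull origin/main https://bob.example.org/project.git main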

                                              1. 4

                                                  The federation isn’t simply the source-code repo, but stuff like issue trackers, documentation, etc.

                                                1. 5

                                                  Put issues and documentation into the source-code repository. Check in all relevant files. (GitHub makes this difficult; Issues are designed to be hard to move to another service.)

                                                  1. 3

                                                    One way to do this, given tooling support, may be an explicit folder for ‘broken tests’; then issue reports can take the form of “pull request/patch for a broken test.”
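
                                                      A sketch of how that could work with stock git, assuming a broken-tests/ convention (all names hypothetical):

                                                          # Reproduce the bug as a failing test, parked under broken-tests/.
                                                          git switch -c report/crash-on-empty-input
                                                          cp repro.sh broken-tests/crash-on-empty-input.sh
                                                          git add broken-tests/crash-on-empty-input.sh
                                                          git commit -m "broken test: parser crashes on empty input"
                                                          git format-patch -1    # the resulting patch *is* the issue report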

                                              2. 3

                                                    I don’t think gitea will focus on federation, but a fork (forgejo) will.

                                                  1. 2

                                                    Thanks for the correction

                                                1. 2

                                                  I’m a little nervous about federation as a strategy here, for the same reason that I don’t think the fediverse should be the end-goal of social media: federation leads to feudalism

                                                  I think there might need to be some creative thinking on what to do instead. taking a corporate platform and trying to mirror the identical features as if they make sense when decentralized is a bit silly. the thing where you submit changes by forking the repo was never really for the benefit of the community, it was to give github a network effect.

                                                  people submitted patches through mailing lists for decades before github came along. it worked fine. people made local copies of repos to work in, not every contributor needed to have a published repo. I do understand why, in a world where email is for old people, that model probably needs to be rethought, but I offer it as an existence proof that there are meaningfully different strategies that work.

                                                  (I note that the author of this article would probably still be upset about the central approver on a mailing list thing as “gatekeeping”, but I think it’s worth discussing regardless)

                                                  1. 1

                                                    federation leads to feudalism

                                                    Can you elaborate on that? AFAIK actual feudalism only persisted because people weren’t free to switch patrons / lords / what-have-you.

                                                    Relatedly: this is why I worry when political power is centralised in Federal Government rather than remaining with the States (here in Australia). Granted there are efficiency arguments, but one of the major attractions of a federation is the freedom to switch states if one goes off the rails.

                                                    1. 2

                                                      I’d love to elaborate on that, and since I do kind of have a habit of saying it a lot I need to flesh out the metaphor a bit more. the following thoughts are in the nature of a first draft, not well-thought-out, so please take them for what they’re worth.

                                                      I see two negative aspects of federated social networks which I consider to be feudalism. the first is the power dynamic that instance operators have with respect to their members. the second, and the one that really suggests “feudalism” as a word to me, is that instances wind up in conflict with each other, staking out territory and attacking each others’ reputations and generally trying to create impermeable boundaries between parts of the fediverse, where you can’t see your friends if you’re on the wrong side of the instance-block wall.

                                                      I agree in principle that ease of switching would give members more power, if you could actually switch without consequences, but there are consequences, both technical (you lose all your post history) and social (you lose access to some subset of your former mutuals, and you can’t easily guess who, unless you somehow have the entire map of the political terrain in your head).

                                                      the other thing to keep in mind about switching is that in many cases, it doesn’t actually solve the problem. if your instance has been widely defederated due to political dynamics such as coordinated efforts to characterize marginalized groups you belong to as inherently abusive or whatever, that’s going to happen on any instance you and your friends choose to move to. switching is high-cost and at best a short-term solution to that sort of thing. so the incentive is generally to pick a place and stay there.

                                                1. 7

                                                      Couldn’t agree more. When GitHub was bought by Microsoft, I migrated all my personal repos to GitLab, and since then I’ve never really had any reason to switch back. Stuff like CoPilot just seems unethical to me, and GitLab being open-source is awesome. I do have a small self-hosted Gogs server; the only reason I chose Gogs over GitLab is just how resource-hungry GitLab can be for small collaborative groups. It scales well, I’m sure, but if you’ve just got 4 or 5 users it makes sense to use a lightweight server. Hell, for a few years I even just ran a git “server” with no front-end at all. Just pushing directly to repositories. If I wanted to create a new one I’d ssh in and make a --bare repo, and then push to it.
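
                                                      From memory, that no-front-end setup is roughly this (host and path made up):

                                                          # On the server: create an empty repository to push into.
                                                          ssh git.example.org 'git init --bare repos/myproject.git'

                                                          # Locally: add it as a remote and push.
                                                          git remote add origin git.example.org:repos/myproject.git
                                                          git push -u origin master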

                                                      Addressing what others have said: I don’t get the complaints about Pull Requests either. It seems like a great way to manage contributions to public, open-source projects. Especially if you get many small contributions from a large number of people, with a small core group of developers, the amount of time it saves everyone really does add up. But, if you don’t want to support GitHub but want to contribute to projects on there, emailing git-formatted patches is a thing. I mean, that’s how the Linux team still does it, right? I don’t have much experience using git this way, but I have done some work manually emailing diff/patch stuff, and if git integrates well, it’s probably not too painful.
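
                                                      For the record, the patch flow is only a couple of commands once git send-email is configured for your SMTP server (the list address is hypothetical):

                                                          # Turn each local commit since origin/master into a mailable patch file.
                                                          git format-patch origin/master

                                                          # Mail the patches to the project’s list.
                                                          git send-email --to=devs@example.org *.patch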

                                                  I think trying to convince projects to move to a different platform is a great idea in principle, but when a stranger walks in and immediately starts complaining about the hosting service, the devs usually have bigger fish to fry. I think it’s better to encourage new projects to use alternative platforms. A huge number of people seem to have only heard of GitHub and maybe GitLab; but as I mentioned, there are Gogs and Gitea for self-hosting, SourceHut, LaunchPad, and BitBucket; plus there are lots of free tools like Gerrit that help manage git and code tracking.

                                                  Curious to know what platforms people use by default, especially when it comes to personal vs. professional projects.

                                                  1. 1

                                                    Stuff like CoPilot just seems unethical to me, and GitLab being open-source is awesome.

                                                    Weellllllll … partially. Part of it is open source and free to self-host. Whereas Sourcehut (where I migrated to from GitLab, after having left GitHub) is fully open source.

                                                  1. 3

                                                    Good news is this seems - despite being beloved of many of the culprits behind EME - to be entirely open. Firefox may have it in v120 or later.

                                                    1. 2

                                                      That’s a great-looking keyboard. Wonder how they feel.

                                                      1. 3

                                                        “PCjr” and “great-looking keyboard” in the same thought is so funny to me.

                                                        1. 1

                                                          Looks like they had two options for the JX? The one in the ad doesn’t look like a traditional PCjr keyboard to me.

                                                          But yeah - the original keyboard was an odd one.

                                                          1. 1

                                                            From the ad:

                                                            Choose the keyboard that suits you

                                                            The IBM JX offers you the choice of two precision-touch keyboards. They both use the proven IBM Selectric typewriter layout to help make typing quicker and more comfortable; both have an infra-red remote option.

                                                        2. 2

                                                          I recall them being “okay” - not in the same league as those (presumably Model M) fitted to the PS/2 model 30s.

                                                        1. 2

                                                          Oh wow! I went to a high school in New Zealand that partnered with IBM on a novel programme. It combined several topics - English, history, social studies? - into one subject called “Integrated Studies” and then made extensive use of computers to do the work.

                                                          Edited to add: at the time I started, 1992, the JX was already obsolete. We had two main “labs” - one stocked with IBM PS/2 30s (XT class), and another with clone 286s. A few years later we gained a lab of 486s running Windows 3.1. We also had a few individual machines knocking around - a genuine IBM AT that used to serve a token ring network connecting the JXs, and some admin machines for staff. The JXs themselves were spread around individual maths classrooms and almost never used.

                                                          1. 1

                                                            Emacs’ tendency - by virtue of being single-threaded, I believe - to lock up on long-running operations. In particular, unexpectedly long-running operations like opening a large PDF.

                                                            1. 27

                                                                I just do not like this person’s style of writing.

                                                              1. 13

                                                                I don’t think I’d want to be in the same room as this person.

                                                                1. 13

                                                                  I happen to agree with a few of his observations about GNOME: there’s an air of arrogance to some of the developers involved, and they really don’t seem to have correctly identified their customers (per the product adoption model, I’d bet the vast majority of GNOME users are innovators, not even early adopters).

                                                                  But, from another article of his:

                                                                  The mathematical fact is that USA’s actions caused Russia’s [invasion of Ukraine]

                                                                  … so I have to say I wouldn’t be keen to share a room with him either.

                                                                  1. 3

                                                                    He could be right about the arrogance of GNOME devs but he’s also the worst possible messenger, given his own arrogant and harassing behavior in the bug reports.

                                                                    1. 1

                                                                      Also, he blames the USA for Putin’s invasion of Ukraine. He’s either unhinged or a Putin supporter.

                                                                      1. 4

                                                                        He’s either unhinged or a Putin supporter

                                                                        It’s the former, check his reddit profile.

                                                                  2. 5

                                                                      Linus agrees! This guy has been trolling FOSS communities for years.

                                                                    1. 2

                                                                      Someone named Felipe Contreras being notoriously argumentative is some primo nominative determinism.

                                                                      1. 2

                                                                        Going waaaaay off topic, now, but have you seen this?

                                                                        Nominative determinism in hospital medicine, by Drs. Limb, Limb, Limb, and Limb.

                                                                        1. 2

                                                                          Except Spanish “contreras” seems to be a false friend to English “contrary”.

                                                                          https://en.wikipedia.org/wiki/Contreras

                                                                  1. 10

                                                                    I hope it’s obvious to everyone at this point that any time a corporation is comparatively better than others at respecting the people who rely on its stuff, this is purely a temporary state of affairs.

                                                                    Free software projects are never as polished or featureful as corporate ones, but it’s quite rare for them to suddenly grow surveillance tooling. To me, this is worth it. Everyone can make their own decisions of course…

                                                                    1. 2

                                                                      Free software projects are never as polished or featureful as corporate ones

                                                                      Honestly, I’ve found the opposite to be true. I think it is fair to say they’re rarely as easy for novices to learn, however.

                                                                      1. 4

                                                                        “Easy to learn” is a big “polish” item.

                                                                        1. 2

                                                                          Well that explains why HP calculators are hard to learn; they got the polish backwards.

                                                                    1. 3

                                                                      They’ve released the object files, not the source, under the Apache 2 license. This means that yes, they’re free-as-in-beer to use, but:

                                                                      1. You won’t be able to fix bugs yourself, or pay people to fix them for you (bounties, etc.).

                                                                      2. There is no guarantee of continuity. They could drop Apache 2 licensing for the object files tomorrow, and you’d be stuck on an unsupported binary with no security fixes, etc.

                                                                      1. 7

                                                                        I’m surprised he didn’t reference this issue:

                                                                        https://www.jwz.org/doc/cadt.html

                                                                        In February 2003, a bunch of the outstanding bugs I’d reported against various GNOME programs over the previous couple of years were all closed as follows:

                                                                        Lest anyone think this was a one-off, look at what happened in 2021:

                                                                        “Since this week there are finally no open tickets in GNOME Bugzilla left. All were either migrated to GNOME GitLab or mass-closed over the last weeks by Bart or me. When mass-closing, an explanatory comment was added (example) to allow contributors to understand why we closed their ticket.”

                                                                        1. 23

                                                                          I switched two projects of mine to sourcehut pages (from github pages) not too long ago:

                                                                          The second required I update something about the content security policy so I could use an iframe in some cursed way. It was really pleasant to just suggest the change, then add it myself. Contributing to sourcehut can be difficult the first time if you haven’t used an email/patch based flow, but otherwise, it works well for me so far!

                                                                          1. 3

                                                                            I’ve had a patch or two accepted too with very little issue. It’s really nice when your tools/platform are open to contributions and allow you to read the source to suggest fixes.

                                                                            1. 2

                                                                              That’s a large part of why I abandoned .NET development altogether for Ruby on Rails back in ~ 2012.

                                                                            2. 2

                                                                              I love how these sites look!