1. 20

    Apparently Firefox has always been using unstable features of Rust? https://news.ycombinator.com/item?id=27492520

    1. 8

      Yes, and some Rust team members are highly critical of this, precisely because it is a hallmark project and produces the impression laid out in the blog post.

      1. 4

        Except this time, the cause was something completely different, as noted by matklad elsewhere in this thread. It’s pretty uncool that the blog post author didn’t post the error message up front and gave room for speculation.

        1. 3

          I don’t normally like to criticize this way, but this is a bit of a pattern for the author. It might be coming to your attention for the first time here because of the specific context, but I’ve been frustrated more than once by their posts claiming backwards-compatibility issues in things I’m more involved in/attentive to, because they so often don’t include enough detail for someone to dig in and figure out what the cause was.

          1. 1

            You are correct and sorry for falling into that trap!

      1. 78

        It would help if Firefox would actually make a better product that’s not a crappy Chrome clone. The “you need to do something different because [abstract ethical reason X]” doesn’t work with veganism, it doesn’t work with chocolate sourced from dubious sources, it doesn’t work with sweatshop-based clothing, doesn’t work with Free Software, and it sure as hell isn’t going to work here. Okay, some people are going to do it, but not at scale.

        Sometimes I think that Mozilla has been infiltrated by Google people to sabotage it. I have no evidence for this, but observed events don’t contradict it either.

        1. 24

          It would help if Firefox would actually make a better product that’s not a crappy Chrome clone. The “you need to do something different because [abstract ethical reason X]” doesn’t work with veganism, it doesn’t work with chocolate sourced from dubious sources, it doesn’t work with sweatshop-based clothing, doesn’t work with Free Software, and it sure as hell isn’t going to work here. Okay, some people are going to do it, but not at scale.

          I agree, but the deck is stacked against Mozilla. They are a relatively small nonprofit largely funded by Google. Structurally, there is no way they can make a product that competes. The problem is simply that there is no institutional counterweight to big tech right now, and the only real solutions are political: antitrust, regulation, maybe creating a publicly-funded institution with a charter to steward the internet in the way Mozilla was supposed to. There’s no solution to the problem merely through better organizational decisions or product design.

          1. 49

            I don’t really agree; there’s a lot of stuff they could be doing better, like not pushing out updates that change the colour scheme in such a way that it becomes nigh-impossible to see which tab is active. I don’t really care about “how it looks”, but this is just objectively bad. Maybe if you have some 16k super-HD IPS screen with perfect colour reproduction at full brightness in good office conditions it’s fine, but I just have a shitty ThinkPad screen and the sun in my home half the time (you know, like a normal person). It’s darn near invisible for me, and I have near-perfect eyesight (which not everyone has). I spent some time downgrading Firefox to 88 yesterday just for this – which it also doesn’t easily allow, not if you want to keep your profile anyway – because I couldn’t be arsed to muck about with userChrome.css hacks. Why can’t I just change themes? Or why isn’t there just a setting to change the colour?

            There’s loads of other things; one small thing I like to do is not have an “x” on tabs to close them. I keep clicking it by accident because I have the motor skills of a 6-year-old and it’s rather annoying to keep accidentally closing tabs. It used to be a setting, then it was about:config, then it was a userChrome.css hack, now it’s a userChrome.css hack that you need to explicitly enable in about:config for it to take effect, and in the future I probably need to sacrifice a goat to our Mozilla overlords if I want to change it.
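
            For what it’s worth, the current incarnation of that hack (as far as I know – the selector names are Firefox-internal and have changed between releases, so treat this as a sketch) is a rule in `chrome/userChrome.css` inside your profile folder, which only takes effect after flipping `toolkit.legacyUserProfileCustomizations.stylesheets` to `true` in about:config:

            ```css
            /* userChrome.css - hide the close button on tabs.
               Only applied when toolkit.legacyUserProfileCustomizations.stylesheets
               is set to true in about:config (Firefox 69+). */
            .tabbrowser-tab .tab-close-button {
              display: none !important;
            }
            ```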

            I also keep accidentally bookmarking stuff. I press ^D to close terminal windows and sometimes Firefox is focused and oops, new bookmark for you! Want to configure keybinds for Firefox? Firefox say no; you’re not allowed, mere mortal end user; our keybinds are perfect and work for everyone, there must be something wrong with you if you don’t like it! It’s pretty darn hard to hack around this too – more time than I was willing to spend on it anyway – so I just accepted this annoyance as part of my life 🤷

            “But metrics show only 1% of people use this!” Yeah, maybe; but 1% here and 5% there and 2% somewhere else, and before you know it you’ve annoyed half (if not more) of your userbase with a bunch of stuff like that. It’s the difference between software that’s tolerable and software that’s a joy to use. Firefox is tolerable, but not a joy. I’m also fairly sure the metrics are biased, as power users especially tend to disable them, so while useful, blindly trusting them is probably not a good idea (I keep telemetry enabled for this reason, to give some “power user” feedback too).

            Hell, I’m not even a “power user” really; I have maybe 10 tabs open at the most, usually much less (3 right now) and most settings are just the defaults because I don’t really want to spend time mucking about with stuff. I just happen to be a programmer with an interest in UX who cares about a healthy web and knows none of this is hard, just a choice they made.

            These are all really simple things; not rocket science. As I mentioned a few days ago, Firefox seems to have fallen victim to a mistaken and fallacious mindset in their design.

            Currently Firefox sits in a weird limbo that satisfies no one: “power users” (not necessarily programmers and the like; plenty of people in other jobs are interested in computers and/or use computers many hours every day) are annoyed with Firefox because it keeps taking away capabilities, and “simple” users are annoyed because, quite frankly, Chrome gives a better experience in many ways (this, I do agree, is not an easy problem to solve, but it does work “good enough” for most). And hey, even “simple” users occasionally want to do “difficult” things like change something that doesn’t work well for them.

            So sure, while there are some difficult challenges Firefox faces in competing against Google, a lot of it is just simple every-day stuff where they just choose to make what I consider to be a very mediocre product with no real distinguishing features at best. Firefox has an opportunity to differentiate themselves from Chrome by saying “yeah, maybe it’s a bit slower – it’s hard and we’re working on that – but in the meanwhile here’s all this cool stuff you can do with Firefox that you can’t with Chrome!” I don’t think Firefox will ever truly “catch up” to Chrome, and that’s fine, but I do think they can capture and retain a healthy 15%-20% (if not more) with a vision that consists of more than “Chrome is popular, therefore, we need to copy Chrome” and “use us because we’re not Chrome!”

            1. 21

              Speaking of key bindings, Ctrl + Q is still “quit without any confirmation”. Someone filed a bug requesting that this be made configurable (not even that the default be changed); that bug is now 20 years old.

              It strikes me that this would be a great first issue for a new contributor, except the reason it’s been unfixed for so long is presumably that they don’t want it fixed.

              1. 9

                A shortcut to quit isn’t a problem; losing user data when you quit is a problem. Safari has this behaviour too, and I quite often hit command-Q and accidentally quit Safari instead of the thing I thought I was quitting (since someone on the OS X 10.8 team decided that the big visual cues differentiating the active window from the others were too ugly and removed them). It doesn’t bother me, because when I restart Safari I get back the same windows, in the same positions, with the same tabs, scrolled to the same position, with the same unsaved form data.

                I haven’t used Firefox for a while, so I don’t know what happens there, but if it doesn’t restore state the same way then that’s probably the big thing to fix, since it also impacts the experience across any other kind of browser restart (OS reboots, crashes, security updates). If accidentally quitting the browser loses you 5-10 seconds of time, it’s not a problem. If it loses you a load of data then it’s really annoying.

                1. 4

                  Firefox does this when closing tabs (restoring closed tabs usually restores form content etc.) but not when closing the window.

                  The weird thing is that it does actually have a setting to confirm when quitting, it’s just that it only triggers when you have multiple tabs or windows open and not when there’s just one tab 🤷

                  1. 1

                    The weird thing is that it does actually have a setting to confirm when quitting, it’s just that it only triggers when you have multiple tabs or windows open and not when there’s just one tab

                    Does changing browser.tabs.closeWindowWithLastTab in about:config fix that?

                    1. 1

                      I have it set to false already, I tested it to make sure and it doesn’t make a difference (^W won’t close the tab, as expected, but ^Q with one tab will still just quit).

                  2. 2

                    I quite often hit command-Q and accidentally quit Safari

                    One of the first things I do when setting up a new macOS user for myself is adding alt-command-Q in Preferences → Keyboard → Shortcuts → App Shortcuts for “Quit Safari” in Safari. Saves my sanity every day.
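
                    If you set up machines often, the same remap can be scripted: macOS stores these per-app overrides under the `NSUserKeyEquivalents` defaults key, where `@` means Command and `~` means Option. Something like this should be equivalent to the Preferences route:

                    ```shell
                    # Remap "Quit Safari" to Option-Command-Q.
                    # In the key string: @ = Command, ~ = Option, $ = Shift, ^ = Control.
                    defaults write com.apple.Safari NSUserKeyEquivalents -dict-add "Quit Safari" "@~q"
                    # Restart Safari so the menu bar picks up the new shortcut hint.
                    ```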

                    1. 1

                      Does this somehow remove the default ⌘Q binding?

                      1. 1

                        Yes, it changes the binding at the OS level, so the shortcut hint in the menu bar is updated to show the change.

                        1. 1

                          It overrides it - Safari’s menu shows ⌥⌘Q against “Quit Safari”.

                        2. 1

                          You can do this on Windows for Firefox (or any browser) too with an AutoHotkey script. You can set it up to catch and handle a keypress combination before it reaches any other application. This will be global of course and will disable any Ctrl+Q hotkey in all your applications, but if you want to get into detail and write a more complex script you can actually check which application has focus and only block the combination for the browser.
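
                          In case it’s useful, a minimal sketch of that kind of script (AutoHotkey v1 syntax; `firefox.exe` is my assumption for the process name) could look like:

                          ```autohotkey
                          ; Swallow Ctrl+Q, but only while Firefox has focus,
                          ; so the shortcut keeps working in every other application.
                          #IfWinActive ahk_exe firefox.exe
                          ^q::return
                          #IfWinActive
                          ```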

                        3. 2

                          This sounds like something Chrome gets right - if I hit CMD+Q I get a prompt saying “Hold CMD+Q to Quit”, which has prevented me from accidentally quitting lots of times. I assumed this was macOS behaviour, but I just tested Safari and it quit immediately.

                        4. 6

                          Disabling this shortcut with browser.quitShortcut.disabled works for me, but I agree that bug should be fixed.

                          1. 1

                            Speaking of key bindings, Ctrl + Q is still “quit without any confirmation”.

                            That was fixed a long time ago, at least on Linux. When I press it, a modal says “You are about to close 5 windows with 24 tabs. Tabs in non-private windows will be restored when you restart.” ESC cancels.

                            1. 1

                              That’s strange. I’m using the latest Firefox on Linux, and I don’t ever get a prompt. Another reply suggested a config tweak to try.

                              1. 1

                                I had that problem for a while but it went away. I have browser.quitShortcut.disabled as false in about:config. I’m not sure if it’s a default setting or not.

                                1. 1

                                  quitShortcut

                                  It seems that this defaults to false. The fact you have it false, but don’t experience the problem, is counter-intuitive to me. Anyway the other poster’s suggestion was to flip this, so I’ll try that. Thanks!

                                  1. 1

                                    That does seem backwards. Something else must be overriding it. I’m using Ubuntu 20.04, if that matters. I just found an online answer that mentions the setting.

                          2. 7

                            On one level, I disagree – I have zero problems with Firefox. My only complaint is that websites built to be Chrome-only sometimes don’t work, which isn’t really Firefox’s problem, but the ecosystem’s problem (see my comment above about antitrust, etc). But I will grant you that Firefox’s UX could be better, that there are ways the browser could be improved in general. However, I disagree here:

                            retain a healthy 15%-20% (if not more)

                            I don’t think this is possible given the amount of resources Firefox has. No matter how much they improve Firefox, there are two things that are beyond their control:

                            1. Most users use Google products (gmail, calendar, etc), and without an antitrust case, these features will be seamlessly integrated into Chrome, and not Firefox.
                            2. Increasingly, websites are simply not targeting Firefox for support, so normal users who want to, say, access online banking are SOL on Firefox. (This happens to me; I still have to use Chrome for some websites.)

                            Even the best product managers and engineers could not reverse Firefox’s decline. We need a political solution, unless we want the web to become Google Web (tm).

                            1. 3

                              Why can’t I just change themes?

                              You can. The switcher is at the bottom of the Customize Toolbar… view.

                              1. 2

                                Hm, last time I tried this it didn’t do much of anything other than change the colour of the toolbar to something else or a background picture; but maybe it’s improved now. I’ll have a look next time I try mucking about with 89 again; thanks!

                                1. 3

                                  You might try the Firefox Colors extension, too. It’s a pretty simple custom theme builder.

                                  1. 2

                                    https://color.firefox.com/ to save the trouble of searching.

                              2. 4

                                I agree with Firefox’s approach of choosing mainstream users over power users - that’s the only way they’ll ever have 10% or more of users. Firefox is doing things with theming that I wish other systems would do - they have full “fresco” themes (images?) in their chrome! It looks awesome! I dream about entire DEs and app suites built from the ground up with the same theme of frescoes (but with a different specific fresco for each specific app, perhaps tailored to that app). Super cool!

                                I don’t like the lack of contrast on the current tab, but “give users the choice to fix this very specific issue or not” tends to be extremely shortsighted - the way to fix it is to fix it. Making it optional means yet another maintenance point on an already underfunded system, and doesn’t necessarily even fix the problem for most users!

                                More importantly, making ultra-specific options like that is usually pushing decisions onto the user as a method of avoiding internal politicking/arguments, and not because pushing to the user is the optimal solution for that specific design aspect.

                                1. 2

                                  As for the close button, I am like you. You can set browser.tabs.tabClipWidth to 1000. Dunno if it is scheduled to be removed.

                                  As for most of the other gripes, adding options and features to cater for the needs of a small portion of users has a maintenance cost. Maybe adding the option is only one line, but then a new feature needs to work with the option both enabled and disabled. Removing options is just a way to keep the code lean.

                                  My favorite example in the distribution world is Debian. Debian tries to be the universal OS. We are drowning in having to support everything. For example, supporting many init systems is more work. People will get at you if there is a bug in the init system you don’t use. You spend time on this. In the end, people not liking systemd are still unhappy and switch to Devuan, which supports fewer init systems. I respect Mozilla for keeping a tight ship and maintaining only the features they can support.

                                  1. 7

                                    Nobody would say anything if their strategy worked. The core issue is that their strategy obviously doesn’t work.

                                    adding options and features to cater for the needs of a small portion of users

                                    It’s not even about that.

                                    It’s removing things that worked and users liked by pretending that their preferences are invalid. (And every user belongs to some minority that likes a feature others may be unaware of.)

                                    See the recent debacle of gradually blowing up UI sizes, while removing options to keep them as they were previously.

                                    Somehow the saved cost to support some feature doesn’t seem to free up enough resources to build other things that entice users to stay.

                                    All they do with their condescending arrogance on what their perfectly spherical idea of a standard Firefox user needs … is making people’s lives miserable.

                                    They fired most of the people that worked on things I was excited about, and it seems all that’s left are some PR managers and completely out-of-touch UX “experts”.

                                    1. 4

                                      As for most of the other gripes, adding options and features to cater for the needs of a small portion of users has a maintenance cost. Maybe adding the option is only one line, but then a new feature needs to work with the option both enabled and disabled. Removing options is just a way to keep the code lean.

                                      It seems to me that having useful features is more important than having “lean code”, especially if this “lean code” is frustrating your users and making them leave.

                                      I know it’s easy to shout stuff from the sidelines, and I’m also aware that there may be complexities I’m not privy to and that I’m mostly ignorant of the exact reasoning behind many decisions (most of us here are, really, although I’ve seen a few Mozilla people around), but what I do know is that 1) Firefox as a product has been moving in a certain direction for years, 2) Firefox has been losing users for years, 3) I know few people who truly find Firefox an amazing browser that is a joy to use, and, in light of that, 4) keeping on doing the same thing you’ve been doing for years is probably not a good idea, and 5) doing the same thing but harder is probably an even worse idea.

                                      I also don’t think that much of this stuff is all that much effort. I am not intimately familiar with the Firefox codebase, but how can a bunch of settings add an insurmountable maintenance burden? These are not “deep” things that reach in to the Gecko engine, just comparatively basic UI stuff. There are tons of projects with a much more complex UI and many more settings.

                                      Hell, I’d argue that even removing the RSS reader was a mistake – they should have improved it instead; especially after Google Reader’s demise there was a huge missed opportunity – and although as a maintenance-burden trade-off I can understand it better, it also demonstrates a lack of vision to just say “oh, it’s old crufty code, not used by many (not a surprise, it sucked), so let’s just remove it, people can just install an add-on if they really want it”. This also contradicts Firefox’s mantra of “most people use the defaults, and if it’s not used a lot we can just remove it”. Well, if that’s true then you can ship a browser with hardly any features at all, and since most people will use the defaults they will use a browser without any features.

                                      Browsers like Brave and Vivaldi manage to do much of this; Vivaldi has an entire full-blown email client. I’d wager that a significant portion of the people leaving Firefox are actually switching to those browsers, not Chrome as such (but they don’t show up well in stats as they identify as “Chrome”). Mozilla nets $430 million/year; it’s not a true “giant” like Google or Apple, but it’s not small either. Vivaldi has just 55 employees (2021, 35 in 2017); granted, they do less than Mozilla, but it doesn’t require a huge team to do all of this.

                                      And every company has limited resources; it’s not like the Chrome team is a bottomless pit of resources either. A number of people in this thread express the “big Google vs. small non-profit Mozilla” sentiment here, but it doesn’t seem that clear-cut. I can’t readily find a size for the Chrome team on the ‘net, but I checked out the Chromium source code and let some scripts loose on it: there are ~460 Google people with non-trivial commits in 2020, although quite a bit seems to be for ChromeOS and not the browser part strictly speaking, so my guesstimate is more like 300 people. A large team? Absolutely. But Mozilla’s $430 million/year can match this with ~$1.5m/year per developer. My last company had ~70 devs on much less revenue (~€10m/year). Basically they have the money to match the Chrome dev team person-for-person. Mozilla does more than just Firefox, but they can still afford to let a lot of devs loose on Gecko/Firefox (I didn’t count the number of devs for it, as I have some other stuff I want to do this evening as well).
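
                                      Something like this one-liner gets you in the right ballpark (author e-mail is only a crude proxy for employment, and it counts anyone with a commit, not just non-trivial ones):

                                      ```shell
                                      # Run inside a chromium checkout: count distinct @google.com
                                      # author addresses with commits dated in 2020.
                                      git log --since=2020-01-01 --until=2020-12-31 --format='%ae' \
                                        | grep '@google.com' \
                                        | sort -u \
                                        | wc -l
                                      ```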

                                      It’s all a matter of strategy; history is littered with large or even huge companies that went belly up just because they made products that didn’t fit people’s demands. I fear Firefox will be in the same category. Not today or tomorrow, but in five years? I’m not so sure Firefox will still be around to be honest. I hope I’m wrong.

                                      As for your Debian comparison: an init system is a fundamental part of the system; it would be analogous to Firefox supporting different rendering or JS engines. It’s not even close to the same as “a UI to configure key mappings”, or “a bunch of settings for stuff you can actually already kind-of do, but with hacks you need to explicitly search for and that most users don’t know exist”, or even “a built-in RSS reader that’s really good and a great replacement for Google Reader”.

                                      1. 2

                                        I agree with most of what you said. Notably the removal of RSS support. I don’t work for Mozilla and I am not a contributor, so I really can’t answer any of your questions.

                                        Another example of a feature with a maintenance cost would be ALSA support. It has been removed; this upsets some users, but to me this is understandable, as they don’t want to handle bug reports around it or have the code get in the way of other features or refactors. Of course, I use Pulseaudio, so I am quite biased.

                                        1. 4

                                          I think ALSA is a bad example; just use Pulseaudio. It’s long since been the standard, everyone uses it, and this really is an example of “147 people who insist on having an überminimal Linux on Reddit being angry”. It’s the kind of technical detail with no real user-visible changes that almost no one cares about. Lots of effort with basically zero or extremely minimal tangible benefits.

                                          And ALSA is not even a good or easy API to start with. I’m pretty sure the “ALSA purists” never actually tried to write any ALSA code, otherwise they wouldn’t be ALSA purists but ALSA haters, as I’m confident there is not a single person who has programmed with ALSA who is not an ALSA hater to some degree.

                                          Pulseaudio was pretty buggy for a while, and its developer’s attitude surrounding some of this didn’t really help, because clearly if tons of people are having issues then all those people are just “doing it wrong” and it’s certainly not a reason to fix anything, right? There was a time that I had a keybind to pkill pulseaudio && pulseaudio --start because the damn thing just stopped working so often. The Grand Pulseaudio Rollout was messy, buggy, broke a lot of stuff, and absolutely could have been handled better. But all of that was over a decade ago, and it does actually provide value. Most bugs have been fixed years ago, Poettering hasn’t been significantly involved since 2012, yet … people still hold an irrational hatred towards it 🤷

                                          1. 1

                                            ALSA sucks, but PulseAudio is so much worse. It still doesn’t even actually work outside the bare basics. Firefox forced me to put PA on and since then, my mic randomly spews noise and sound between programs running as different user ids is just awful. (I temporarily had that working better though some config changes, then a PA update - hoping to fix the mic bug - broke this… and didn’t fix the mic bug…)

                                            I don’t understand why any program would use the PA api instead of the alsa ones. All my alsa programs (including several I’ve made my own btw, I love it whenever some internet commentator insists I don’t exist) work equally as well as pulse programs on the PA system… but also work fine on systems where audio actually works well (aka alsa systems). Using the pulse api seems to be nothing but negatives.

                                    2. 1

                                      Not sure if this will help you but I absolutely cannot STAND the default Firefox theme so I use this: https://github.com/ideaweb/firefox-safari-style

                                      I stick with Firefox over Safari purely because its devtools are 100x better.

                                    3. 10

                                      There’s also the fact that web browsers are simply too big to reimplement at this point. The best Mozilla can do (barely) is try to keep up with the Google-controlled Web Platform specs, and try to collude with Apple to keep the worst of the worst from being formally standardized (though Chrome will implement them anyway). Their ability to do even that was severely impacted by their layoffs last year. At some point, Apple is going to fold and rebase Safari on Chromium, because maintaining their own browser engine is too unprofitable.

                                      At this point, we need to admit that the web belongs to Google, and use it only to render unto Google what is Google’s. Our own traffic should be on other protocols.

                                      1. 8

                                        For a scrappy nonprofit they don’t seem to have any issues paying their executives millions of dollars.

                                        1. 1

                                          I mean, I don’t disagree, but we’re still talking several orders of magnitude less compensation than Google’s execs.

                                          1. 5

                                            A shit sandwich is a shit sandwich, no matter how low the shit content is.

                                            (And no, no one is holding a gun to Mozilla’s head forcing them to hire in high-CoL/low-productivity places.)

                                        2. 1

                                          Product design can’t fix any of these problems because nobody is paying for the product. The more successful it is, the more it costs Mozilla. The only way to pay the rent with free-product-volume is adtech, which means spam and spying.

                                          1. 4

                                            Exactly why I think the problem requires a political solution.

                                        3. 8

                                          I don’t agree this is a vague ethical reason. Vague ethical reasons are concerns like deforestation (and the destruction of habitats for smaller animals) to ship almond milk across the globe, or sewing as an alternative to poverty and prostitution, etc.

                                          The browser privacy question is very quantifiable and concrete, the source is in the code, making it a concrete ethical-or-such choice.

                                          ISTR there even being a study or two where people were asked about willingness to being spied upon, people who had no idea their phones were doing what was asked about, and being disconcerted after the fact. That’s also a concrete way to raise awareness.

                                          At the end of the day none of this may matter if people sign away their rights willingly in favor of a “better” search-result filter bubble.

                                          1. 11

                                            I don’t think they’re vague (not the word I used) but rather abstract; maybe that’s not the best word either, but what I mean by it is that it’s a “far from my bed show”, as we would say in Dutch. Doing $something_better on these topics has zero or very few immediate tangible benefits, but rather more abstract long-term benefits. And in addition it’s really hard to feel that you’re making a difference as a single individual. I agree with you that these are important topics; it’s just that this type of argument is simply not all that effective at making a meaningful impact. Perhaps it should be, but it’s not, and exactly because it’s important we need to be pragmatic about the best strategy.

                                            And if you’re given the choice between “cheaper (or better) option X” vs. “more expensive (or inferior) option Y with abstract benefits but no immediate ones”, then I can’t really blame everyone for choosing X either. Life is short, lots of stuff that’s important, and can’t expect everyone to always go out of their way to “do the right thing”, if you can even figure out what the “right thing” is (which is not always easy or black/white).

                                            1. 1

                                              My brain somehow auto-conflated the two, sorry!

                                              I think we agree that the reasoning in these is suboptimal either way.

                                              Personally I wish these articles weren’t so academic, and maybe not in somewhat niche media, but instead mainstream publications would run “Studies show people do not like to be spied upon yet they are - see the shocking results” clickbaity stuff.

                                              At least it wouldn’t hurt for a change.

                                              1. 1

                                                It probably wasn’t super-clear what exactly was intended with that in the first place so easy enough of a mistake to make 😅

As for articles, I’ve seen a bunch of them in mainstream Dutch newspapers in the last two years or so; so there is some amount of attention being given to this. But as I expanded on in my other lengthier comment, I think the first step really ought to be making a better product. Not only is this by far the easiest to do and within our (the community’s) power to do, I strongly suspect it may actually be enough, or at least go a long way.

                                                It’s like investing in public transport is better than shaming people for having a car, or affordable meat alternatives is a better alternative than shaming people for eating meat, etc.

                                          2. 7

                                            I agree to an extent. Firefox would do well to focus on the user experience front.

                                            I switched to Firefox way back in the day, not because of vague concerns about the Microsoft hegemony, or even concerns about web standards and how well each browser implemented them. I switched because they introduced the absolutely groundbreaking feature that is tabbed browsing, which gave a strictly better user experience.

                                            I later switched to Chrome when it became obvious that it was beating Firefox in terms of performance, which is also a factor in user experience.

                                            What about these days? Firefox has mostly caught up to Chrome on the performance point. But you know what’s been the best user experience improvement I’ve seen lately? Chrome’s tab groups feature. It’s a really simple idea, but it’s significantly improved the way I manage my browser, given that I tend to have a huge number of tabs open.

                                            These are the kinds of improvements that I’d like to see Firefox creating, in order to lure people back. You can’t guilt me into trying a new browser, you have to tempt me.

                                            1. 10

                                              But you know what’s been the best user experience improvement I’ve seen lately? Chrome’s tab groups feature. It’s a really simple idea, but it’s significantly improved the way I manage my browser, given that I tend to have a huge number of tabs open.

                                              Opera had this over ten years ago (“tab stacking”, added in Opera 11 in 2010). Pretty useful indeed, even with just a limited number of tabs. It even worked better than Chrome groups IMO. Firefox almost-kind-of has this with container tabs, which are a nice feature actually (even though I don’t use it myself), and with a few UX enhancements on that you’ve got tab groups/stacking.

Opera also introduced tabbed browsing, by the way (in 2000 with Opera 4, about two years before Mozilla added it in Phoenix, which later became Firefox). Opera was consistently way ahead of the curve on a lot of things. A big reason it never took off was that for a long time you had to pay for it (until 2005), and after that it suffered from an “oh, I don’t want to pay for it” reputation for years. It also suffered from sites not working; this often (though not always) wasn’t even Opera’s fault, as frequently it was just a pointless browser “check” on the website’s part. Those checks were popular in those days for telling people not to use IE6, and many were written so poorly that they would either outright block Opera or display a scary message. And being a closed-source proprietary product also meant it never got the love from the FS/OSS crowd and the inertia that gives (not necessarily a huge inertia, but still).

                                              So Firefox took the world by storm in the IE6 days because it was free and clearly much better than IE6, and when Opera finally made it free years later it was too late to catch up. I suppose the lesson here is that “a good product” isn’t everything or a guarantee for success, otherwise we’d all be using Opera (Presto) now, but it certainly makes it a hell of a lot easier to achieve success.

                                              Opera had a lot of great stuff. I miss Opera 😢 Vivaldi is close (and built by former Opera devs) but for some reason it’s always pretty slow on my system.

                                              1. 1

                                                This is fair and I did remember Opera being ahead of the curve on some things. I don’t remember why I didn’t use it, but it being paid is probably why.

                                                1. 1

                                                  I agree, I loved the Presto-era Opera and I still use the Blink version as my main browser (and Opera Mobile on Android). It’s still much better than Chrome UX-wise.

                                                2. 4

                                                  I haven’t used tab groups, but it looks pretty similar to Firefox Containers which was introduced ~4 years ahead of that blog post. I’ll grant that the Chrome version is built-in and looks much more polished and general purpose than the container extension, so the example is still valid.

                                                  I just wanted to bring this up because I see many accusations of Firefox copying Chrome, but I never see the reverse being called out. I think that’s partly because Chrome has the resources to take Mozilla’s ideas and beat them to market on it.

                                                  Disclaimer: I’m a Mozilla employee

                                                3. 4

                                                  One challenge for people making this kind of argument is that predictions of online-privacy doom and danger often don’t match people’s lived experiences. I’ve been using Google’s sites and products for over 20 years and have yet to observe any real harm coming to me as a result of Google tracking me. I think my experience is typical: it is an occasional minor annoyance to see repetitive ads for something I just bought, and… that’s about the extent of it.

                                                  A lot of privacy advocacy seems to assume that readers/listeners believe it’s an inherently harmful thing for a company to have information about them in a database somewhere. I believe privacy advocates generally believe that, but if they want people to listen to arguments that use that assumption as a starting point, they need to do a much better job offering non-circular arguments about why it’s bad.

                                                  1. 4

                                                    I think it has been a mistake to focus on loss of privacy as the primary data collection harm. To me the bigger issue is that it gives data collectors power over the creators of the data and society as a whole, and drives destabilizing trends like political polarization and economic inequality. In some ways this is a harder sell because people are brainwashed to care only about issues that affect them personally and to respond with individualized acts.

                                                    1. 4

                                                      There is no brainwashing needed for people to act like people.

                                                      1. 1

                                                        do you disagree with something in my comment?

                                                        1. 3

                                                          In some ways this is a harder sell because people are brainwashed to care only about issues that affect them personally and to respond with individualized acts.

                                                          I’m not @halfmanhalfdonut but I don’t think that brainwashing is needed to get humans to behave like this. This is just how humans behave.

                                                          1. 2

                                                            Yep, this is what I was saying.

                                                            1. 1

                                                              things like individualism, solidarity, and collaboration exist on a spectrum, and everybody exhibits each to some degree. so saying humans just are individualistic is tautological, meaningless. everyone has some individualism in them regardless of their upbringing, and that doesn’t contradict anything in my original comment. that’s why I asked if there was some disagreement.

                                                              to really spell it out, modern mass media and culture condition people to be more individualistic than they otherwise would be. that makes it harder to make an appeal to solidarity and collaboration.

                                                              @GrayGnome

                                                              1. 1

                                                                I think you’re only seeing the negative side (to you) of modern mass media and culture. Our media and culture also promote unity, tolerance, respect, acceptance, etc. You’re ignoring that so that you can complain about Google influencing media, but the reality is that the way you are comes from those same systems of conditioning.

The fact that you even know anything about income inequality and political polarization is entirely FROM the media. People on the whole are not as politically divided as the media would have you believe.

                                                                1. 1

                                                                  sure, I only mentioned this particular negative aspect because it was relevant to the point I was making in my original comment

                                                                2. 1

                                                                  to really spell it out, modern mass media and culture condition people to be more individualistic than they otherwise would be. that makes it harder to make an appeal to solidarity and collaboration.

                                                                  I think we’re going to have to agree to disagree. I can make a complicated rebuttal here, but it’s off-topic for the site, so cheers!

                                                                  1. 1

                                                                    cheers

                                                    2. 3

                                                      I agree with everything you’ve written in this thread, especially when it comes to the abstractness of pro-Firefox arguments as of late. Judging from the votes it seems I am not alone. It is sad to see Mozilla lose the favor of what used to be its biggest proponents, the “power” users. I truly believe they are digging their own grave – faster and faster it seems, too. It’s unbelievable how little they seem to be able to just back down and admit they were wrong about an idea, if only for a single time.

                                                      1. 2

                                                        Firefox does have many features that Chrome doesn’t have: container tabs, tree style tabs, better privacy and ad-blocking capabilities, some useful dev tools that I don’t think Chrome has (multi-line JS and CSS editors, fonts), isolated profiles, better control over the home screen, reader mode, userChrome.css, etc.

                                                      1. 29

                                                        Disclaimer: This article covers various things that are NOT right up my alley, so I’ll comment only on some. I’m not going to use my Mozilla Security hat, because I mostly work on other things.

                                                        I know some of the claims are outdated. E.g., the JIT was rewritten and the analysis by Chris Rohlf doesn’t apply anymore. It is true that win32 lockdown and site isolation aren’t fully ready yet, unless you use Firefox Nightly.

                                                        1. 1

It seems from some of the issues linked from the Fission meta bug that Firefox is implementing OOPIF (out-of-process iframes). Is that the case now?

                                                          e.g. https://bugzilla.mozilla.org/show_bug.cgi?id=1698044

                                                          1. 3

                                                            Yes, enabling Fission means that different-site iframes are out-of-process.

                                                            Type “Fission” in the search field in preferences in Nightly to find the checkbox to enable.

                                                            1. 2

                                                              That is fantastic news! Thank you!

                                                        1. 14

                                                          The sooner we all move to UTF-8 the better. There’s simply no reason to use anything else (except maybe UTF-32 for internal representations), and I dare you to give me an argument for any of the other encodings.

The only argument given in favour of UTF-16 over UTF-8 that is not immediately invalidated is that texts written heavily in some Asian scripts are smaller in UTF-16 than in UTF-8. But this is actually not a good argument, because you would usually use some form of markup language (like HTML) that is mostly made up of ASCII, which gives UTF-8 the overall edge for a given document in such a language.

The aspect that really crushes UTF-16 et al. is the necessity for BOMs (byte-order marks), and many, many implementations omit them or handle them wrong. Don’t even get me started on surrogates (which ruined parts of the Unicode spec because it had to reserve areas for them!).
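To make the byte-order problem concrete, here is a small Python sketch (illustrative only) of why UTF-16 needs a BOM while UTF-8 does not:

```python
import codecs

# The same two characters, serialized with each byte order:
assert "hi".encode("utf-16-le") == b"h\x00i\x00"
assert "hi".encode("utf-16-be") == b"\x00h\x00i"

# Without the BOM prefix, a reader can only guess which order was used:
assert codecs.BOM_UTF16_LE == b"\xff\xfe"
assert codecs.BOM_UTF16_BE == b"\xfe\xff"

# UTF-8 has a single canonical byte sequence, so no BOM is needed:
assert "hi".encode("utf-8") == b"hi"
```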

                                                          Seriously, wherever you can, please use UTF-8 everywhere. And god bless Rob Pike and Ken Thompson for their stroke of genius while designing UTF-8.

                                                          1. 16

                                                            Even the argument for UTF-32 as an internal representation is very suspect in my opinion. It allows you to treat all code points as the same size, but I’m not sure of any use case where that’s actually an advantage.

You still can’t treat every code point as its own atomic unit of text. You can’t delete individual code points. You can’t reverse a string based on its code points. One glyph can be made out of multiple code points; if you reverse the string “Hello 👋🏿” (“Hello <waving hand with dark skin tone>”), you end up with the string “🏿👋 olleH” (“<dark brown><waving hand with default yellow skin> olleH”). So UTF-32 kind of just seems like an extremely space-inefficient variable-width text encoding.
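The waving-hand example can be reproduced in a few lines of Python (a sketch; Python strings happen to be sequences of code points, which is exactly the trap):

```python
# U+1F44B WAVING HAND + U+1F3FF DARK SKIN TONE form one glyph from two code points.
s = "Hello \U0001F44B\U0001F3FF"

# Code-point-wise reversal splits the modifier from its base character:
reversed_s = s[::-1]
assert reversed_s.startswith("\U0001F3FF")  # bare skin-tone modifier comes first
assert len(s) == 8  # 8 code points, even though a human sees 7 "characters"
```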

                                                            I 100% support UTF-8 everywhere - even as an in-memory string representation.

                                                            1. 4

                                                              I agree with you and this is why I made this the default in my grapheme cluster detection library (but you can also check for grapheme boundaries between two CPs “manually”).

                                                              In the end, if you really want to support grapheme clusters, you will have to deal with variable-length characters anyway. Many still consider codepoints and drawn characters to be equal.

                                                            2. 4

                                                              UTF-32 is pretty bad for internal representations, too.

                                                              The CJK argument (when the argument is made, “Asian” really means CJK) for UTF-16 is not a very convincing one even without markup. You want to pick one encoding globally instead of choosing contextually. When measuring bytes used for the same amount of human-perceivable information content, UTF-8 isn’t particularly unfair to CJK. See the table at the end of https://hsivonen.fi/string-length/
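A quick illustration of the per-glyph byte counts (toy strings; the linked article has real measurements):

```python
zh = "你好世界"  # 4 CJK glyphs, all in the BMP
en = "hello"     # 5 ASCII letters

assert len(zh.encode("utf-8")) == 12      # 3 bytes per CJK glyph
assert len(zh.encode("utf-16-le")) == 8   # 2 bytes per CJK glyph
assert len(en.encode("utf-8")) == 5       # 1 byte per ASCII letter
assert len(en.encode("utf-16-le")) == 10  # 2 bytes per ASCII letter
```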

                                                              1. 4

                                                                UTF-8 isn’t particularly unfair to CJK. See the table at the end of https://hsivonen.fi/string-length/

                                                                If I am reading that table correctly, for any of the Chinese variants, the UTF-16 encoding is around two thirds the size of the UTF-8 encoding (a third the number of code units). The article is arguing something different: that a quota in terms of UTF-8 characters isn’t unfair to CJK languages because they encode more information per unicode code point, which makes up for requiring more bytes per code point than other encodings.

I don’t entirely buy @FRIGN’s argument for in-memory strings (though I’m willing to accept it for interchange), because when I process rich text, I don’t process it as HTML or similar; I process it as a string with metadata that is not stored as inline control characters, and only serialise to HTML (or RTF, or whatever) at the edge of the program. When I’m doing any processing on the text that doesn’t care about the metadata, being able to fit larger strings in my L1 cache is a perf win (especially if I need to keep some tables for the extended grapheme cluster range calculation in the L1 as well). There’s also a big win from using ASCII as an in-memory representation for things that can be losslessly stored as ASCII, because knowing that up-front guarantees that one byte = one unicode code unit = one grapheme cluster, which makes a lot of processing simpler.

                                                                For network bandwidth and persistent storage, UTF-8 is fine for two reasons:

• Text is tiny in comparison to most other kinds of media. A picture is worth a thousand words. A video is worth a few million. Unless you’re storing huge amounts of text (e.g. all of Wikipedia), the size of the text encoding barely matters.
                                                                • None of the UTF-* variants is a compression algorithm. If you want to store a very large amount of text, use a compression algorithm. Even a fairly simple dictionary will give a huge win (English has around 20,000 words, with around 2,000 in common use. Without doing anything clever, you should be able to store most English sentences with 16 bits per word, which makes even UTF-8 for English look incredibly bloated).
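A toy sketch of the word-dictionary idea (the vocabulary here is a made-up stand-in; a real scheme would ship a fixed shared dictionary and handle out-of-vocabulary words):

```python
import struct

vocab = ["the", "quick", "brown", "fox", "jumps", "over", "lazy", "dog"]
index = {w: i for i, w in enumerate(vocab)}

def encode(sentence):
    # One little-endian 16-bit index per word.
    return b"".join(struct.pack("<H", index[w]) for w in sentence.split())

sentence = "the quick brown fox jumps over the lazy dog"
assert len(encode(sentence)) == 18          # 9 words x 2 bytes
assert len(sentence.encode("utf-8")) == 43  # more than twice the size
```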
                                                                1. 1

                                                                  I think you missed an important part:

                                                                  When measuring bytes used for the same amount of human-perceivable information content, UTF-8 isn’t particularly unfair to CJK

                                                                  UTF-8 is unfair to CJK glyph for glyph, while UTF-16 makes “western” glyphs the same size as CJK glyphs. However, CJK glyphs usually contain more information than “western” glyphs, so it’s “fair” to encode ASCII using one byte per glyph at the cost of using more bytes per glyph for CJK.

                                                                  Analyzing the exact information content becomes difficult, but as a rough approximation, we can say that the average word length in English documents is around 5 ASCII characters, while 2-glyph Chinese words are extremely common. Therefore, it’s “fair” for Chinese glyphs to be encoded using around 2.5x as many bytes on average as English glyphs, because each Chinese glyph contains 2.5x the amount of information.

                                                                  Again, this gets complicated and I don’t have expertise necessary to do a real analysis, but this should give an idea of why hsivonen claims UTF-8 isn’t particularly unfair to CJK.
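The rough per-word arithmetic above, made concrete with toy words (not corpus data):

```python
en_word = "hello"  # a typical ~5-letter English word
zh_word = "你好"    # a common 2-glyph Chinese word

assert len(en_word.encode("utf-8")) == 5  # 1 byte per glyph
assert len(zh_word.encode("utf-8")) == 6  # 3 bytes per glyph
# Per *word*, the UTF-8 byte costs come out roughly even.
```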

                                                                  1. 2

I didn’t miss that, it’s exactly my point. If you are processing a lot of Chinese text, UTF-16 as an in-memory representation will have better memory and cache performance by a fairly significant margin. If you’re imposing per-user (per-message, or whatever) memory quotas, then using a UTF-8 encoding won’t particularly penalise users who write in CJK languages relative to English. That’s an odd thing to focus on, because it matters for protocols with maximum-length messages, but doesn’t impact performance at all for most cases.

                                                              2. 3

                                                                I agree, and https://www.oilshell.org/ is UTF-8 only, except where it calls libc, for say glob() or regexec().

                                                                There it inherits libc locales, which are messy and incoherent. They are unfortunately part of C and POSIX so I don’t think they’re ever going away.

                                                                The whole idea of a global variable in a PROGRAM makes no sense, let alone a library in a program. The encoding is a property of the DATA, not of the program that’s processing it!

In a non-networked world, you could imagine that, say, all the manuals on an entire Unix system are in a single encoding. But we’re 30 years past that point, so obviously you can have one file that’s UTF-8 and one file that’s UTF-16, and a shell has to look at them both.

                                                                Unlike HTTP, a Unix file system has no place for metadata. The only coherent solution is to use UTF-8, because you can perform almost all useful operations on it by treating it as a blob of bytes – in particular substring searching, like grep does, or like a shell parser does for keywords and operators (for, |, etc.).
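A small sketch of that property (UTF-8 is self-synchronizing, so byte-level substring search agrees with text-level search):

```python
text = "Schrödinger's grep"

# Searching the encoded bytes gives the same answer as searching the text,
# for both a present and an absent multi-byte character:
assert ("ö" in text) == ("ö".encode("utf-8") in text.encode("utf-8"))
assert ("ø" in text) == ("ø".encode("utf-8") in text.encode("utf-8"))
```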

                                                                grep and sort are also slowed down by an order of magnitude due to the locale, which annoys me. Compare LC_ALL=C sort to sort on most Linux systems.
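The difference can be sketched in Python (code-point order, which matches LC_ALL=C byte order for ASCII, vs locale-aware collation; which locales are installed varies by system):

```python
import locale

words = ["b", "A", "a", "B"]

# Plain sorted() compares code points -- the same order LC_ALL=C sort gives:
assert sorted(words) == ["A", "B", "a", "b"]

# Locale-aware collation goes through strxfrm, the (much slower) machinery
# that sort picks up from the environment when LC_ALL isn't C:
locale.setlocale(locale.LC_COLLATE, "")
_ = sorted(words, key=locale.strxfrm)  # order depends on the active locale
```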

                                                                1. 1

                                                                  The Good Thing about ye olde ascii was if you have a vast steaming pile of files created by a rambling ever changing herd of cats, you could read it, tweak it, write it back…. and the only thing that changed was the bits you tweaked.

                                                                  Try that believing that the herd of cats have set all their editors to utf-8… Ha!

                                                                  And you are bound to get slapped with an invalid code point exception…

                                                                  Ok, so you then do some tedious magic to squash all invalid code points to a magic value, do your tweak…. and then you have unexpected deltas all over the place.

                                                                  Sigh.

                                                                  1. 1

                                                                    With Vim at least, you can set ‘binary’ and it’ll leave arbitrary weirdness alone.

                                                                    1. 1

                                                                      Conversely, that setting, no doubt, allows you to create non valid utf-8 weirdness.

                                                                      1. 1

                                                                        Yep! Which is what you want when editing arbitrary buffers of bytes.

                                                                        1. 1

                                                                          Sadly, the vast steaming pile needs to be linked into a cohesive product, so that answer didn’t work in the long run.

                                                                          The solution is uchardet and iconv, uchardet to guess what encoding the cat had his editor set to… iconv to convert it to utf8.

                                                                          Fix them all up.

                                                                          Then set build tools to die noisily on invalid code point…

                                                                          When cat gets unhappy build tools aren’t working… tell them about uchardet and iconv and remind them about the required encoding.

                                                                          Tedious, but works.

                                                                1. 7

One thing that goes unmentioned by the people who are complaining is the effect of RMS’s and the FSF’s licensing strategy. They prioritized preventing proprietary GCC front ends, with the effect that LLVM became the nicer option even for Free out-of-tree front ends. Now we’re at the point where it’s no longer enough to have GCC for a platform; in practice you need to have LLVM as well. The people who are upset about this don’t seem to blame the strategy that made GCC unfriendly to out-of-tree front ends.

                                                                  1. 6

                                                                    Spoiler: implementation used <ctype.h>, and was cursed by the C locale.

                                                                    1. 17

It’s more subtle than that. ctype.h includes an isascii, which returns true for things < 128. The FreeBSD libc version actually exposes this as a macro: (((c) & ~0x7F) == 0) (true if all of the bits above the low 7 are true). The regex library was using its own isascii equivalent implemented as isprint() || iscntrl(). These are locale-dependent, and isprint will return true for a lot of non-ASCII characters (most of Unicode, in a Unicode locale). This is not C locales’ fault; it is incorrect API usage.

                                                                      The fact that the C locales APIs are an abomination is a tangentially related fact.
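In Python terms (a sketch of the distinction, not the original C code):

```python
def is_ascii(ch):
    # Analogue of the FreeBSD isascii() macro: true when no bits
    # above the low 7 are set in the code point.
    return (ord(ch) & ~0x7F) == 0

assert is_ascii("A")
assert not is_ascii("é")
assert "é".isprintable()  # "printable" admits most of Unicode, not just ASCII
```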

                                                                      1. 2

                                                                        true if all of the bits above the low 7 are true

                                                                        Nit: it’s true if any bits above the low 7 are true.

                                                                        1. 2

                                                                          (true if all of the bits above the low 7 are true)

                                                                          false if any of the bits above the low 7 are true

                                                                        2. 3

                                                                          To be fair, this is a kind of bug you can run into without needing to hit C locales.

                                                                          For example, a few times I’ve had to re-teach people regexes in Python, because the behavior today isn’t what it was once upon a time.

                                                                          To take the most common example I personally see (because of the stuff I work on/with), Django used to only support regexes as the way to specify its URL routing. You write a regex that matches the URL you expect, tell Django to map it to a particular view, and any captured groups in the regex become arguments (keyword arguments for named captures, positional otherwise) used to call the view. Now there’s a simpler alternative syntax that covers a lot of common cases, but regexes are still supported when you need the kind of fine-grained/complex matching rules they provide.

Anyway, suppose you want to build a blog, and you want to have the year, month, and day in the URL. Like /weblog/2021/02/26/post-title. Easy enough to do with regex. Except… most of the examples and tutorials floating around from days of yore are from a Python 2 world, where you could match things like the four-digit year with \d{4}. That only worked because Python 2 was an ASCII world, and \d was equivalent to [0-9]. In Python 3, the world is Unicode, and \d matches anything that Unicode considers to be a digit, which is a larger set than the ten digits of ASCII. So every once in a while someone pops up with “why is my regex URL pattern matching this weird stuff” and gets to learn that in a Python 3/Unicode world, if all you really want is [0-9], then [0-9] is what you have to write.
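In miniature (the pattern is a made-up stand-in for the old tutorials’ year matcher):

```python
import re

year_re = re.compile(r"\d{4}")
assert year_re.fullmatch("2021")                      # what the author intended
assert year_re.fullmatch("\u0662\u0660\u0662\u0661")  # Arabic-Indic "2021" also matches!
assert not re.fullmatch(r"[0-9]{4}", "\u0662\u0660\u0662\u0661")  # [0-9] is ASCII-only
```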

                                                                          This seems to have been an instance of the same problem, where the decision was deferred to some other system that might have its own ideas about things like “printable”, instead of just writing it correctly from the start to only match what it intended to match.

                                                                          1. 2

                                                                            In Python 3, the world is Unicode, and \d matches anything that Unicode considers to be a digit, which is a larger set than just the nine numerals of ASCII

                                                                            TIL!

                                                                            1. 1

                                                                              Have the Python core devs ever articulated what kind of use cases motivate “\d matches anything that Unicode considers to be a digit”?

                                                                              1. 3

                                                                                UTS#18 gives a set of recommendations for how regex metacharacters should behave, and its “Standard”-level recommendation (“applications should use this definition wherever possible”) is that \d match anything with Unicode general category Nd. This is what Python 3 does by default.

                                                                                So I would presume there’s no need to articulate “use cases” for simply following the recommendation of the Unicode standards.

                                                                                If you dislike this and want or absolutely need to use \d as a synonym for [0-9], you can explicitly switch the behavior to UTS#18’s “Posix Compatible” fallback by passing the re.ASCII flag to your Python regex (just as in Python 2 you could opt in to Unicode-recommended behavior by passing re.UNICODE). You also can avoid it altogether by not using str instances; the regex behavior on bytes instances is the ASCII behavior.
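                                                                                A small sketch of those opt-outs side by side (the string is four Arabic-Indic digits, i.e. Unicode category Nd):

                                                                                ```python
                                                                                import re

                                                                                s = "\u0662\u0660\u0662\u0661"  # Arabic-Indic digits for 2021

                                                                                # Default str behavior follows the UTS#18 "Standard" recommendation:
                                                                                assert re.fullmatch(r"\d{4}", s) is not None

                                                                                # re.ASCII switches \d (and \w, \s, ...) to the POSIX-compatible fallback:
                                                                                assert re.fullmatch(r"\d{4}", s, re.ASCII) is None

                                                                                # Patterns over bytes use the ASCII behavior to begin with:
                                                                                assert re.fullmatch(rb"\d{4}", s.encode("utf-8")) is None
                                                                                ```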

                                                                          1. 1

                                                                            Hopefully, there won’t be a Text Encoding menu in 2022.

                                                                            Most documents served over HTTP are probably live enough to already have metadata properly declaring their encoding.

                                                                            But browsers are sometimes used to view local files – and such files might be very old or come from unusual platforms – offline documentation, harvested websites, notes, reports, etc. In such cases, metadata is often either lost or was never present, because years or decades ago authors relied on a platform’s default encoding or were not aware of encodings at all.

                                                                            1. 1

                                                                              When viewing such files locally, have you had the need to manually override Firefox’s guess after Firefox 78?

                                                                              1. 2

                                                                                Unfortunately, there are some old pages on the Japanese web that don’t declare their encoding properly. Maybe it worked on a Japanese OS. Not to mention some western ones with similar confusion between non-UTF-8 and UTF-8…

                                                                                It’s one of those features I wish I didn’t have to use, and could be tucked nice and out of the way, but it’s handy whenever you do run into that.

                                                                                1. 1

                                                                                  Has Firefox failed to guess the encoding of such pages for you after Firefox 78? (In cases where the encoding remains undeclared as opposed to a server update introducing a server-level UTF-8 declaration despite the content being old.)

                                                                                  Edit: Context for why 78: https://lobste.rs/s/dbwqu6/chardetng_more_compact_character

                                                                                  1. 2

                                                                                    I haven’t been browsing those kinds of pages in a while, so I’m not sure. I’ll let you know if I do though.

                                                                            1. 29

                                                                              It’s people like that, who say “Firefox is spyware”, who are being abusive. These “phoning home” features in Firefox are legitimate features implemented in good faith, in privacy-conscious ways, by a company which does not abuse or sell the data.

                                                                              The extremist view that the browser should not detect captive portals, should not catch phishing attempts, and should not have automatic updates sets up Firefox to fail. You can’t expect Firefox to have significant market share, and at the same time demand it to have defaults that make 99% of people think it’s a broken piece of shit that can’t even play Netflix.

                                                                              And remember you’re trashing Mozilla for having a special contract for extra privacy guarantees and data isolation in Google Analytics, while using a browser made by the maker of the damn Analytics, and also of the biggest ad network, one that actually makes billions gathering and selling access to personal data. And you’re probably even logged in to that browser with your real name and verified phone number, which it phones home to every site run by that browser vendor.

                                                                              1. 3

                                                                                And you’re probably even logged in to that browser with your real name and verified phone number, ..

                                                                                I think we’re long past that, with Chrome guessing and trying to log you in every time you use a Google service. On the other hand, I’m not 100% sure I’d trust them not to deduce enough from Chrome usage alone, without logging in, just based on IP and other fingerprinting, which would on the one hand be a little paranoid, but on the other hand may simply be true.

                                                                                Just an aside, not trying to take away from your main point.

                                                                                1. 4

                                                                                  These “phoning home” features in Firefox are legitimate features implemented in good faith in privacy-conscious ways by company which does not abuse or sell the data.

                                                                                  I don’t believe you or them, and I suggest other people do the same.

                                                                                  1. 13

                                                                                    On what basis? For example, what makes you believe that Mozilla does something nefarious with captive portal DNS requests?

                                                                                    1. 3

                                                                                      On the basis that they add binary blobs to Firefox, for example. On the basis that this server (or whatever) is not open-sourced and not audited by the community. That’s enough for me. I don’t trust them, and I don’t trust you, since you are trying to shield them.

                                                                                      1. 18

                                                                                        It makes sense to require reproducible builds and audited code, but there’s a difference between “I have no ability to verify this” vs “I actively distrust you and think you’re lying”.

                                                                                        1. 7

                                                                                          What blobs are you referring to?

                                                                                          Note that as distributed by Mozilla, Firefox doesn’t contain a DRM implementation. Firefox does automate its download, but you can tell Firefox not to do that.

                                                                                          (Disclosure: I work for Mozilla, but I am writing this comment on my own time and initiative.)

                                                                                          1. 1

                                                                                            Pocket for example.

                                                                                            1. 7

                                                                                              Where do you find a Pocket-related binary blob in Firefox? (See also https://searchfox.org/mozilla-central/source/browser/components/pocket )

                                                                                              1. 5

                                                                                                There is no pocket binary blob.

                                                                                    1. 38

                                                                                      Are people really still whining about this?!?

                                                                                      Python 2 is open source free software and you’re a software developer. Grab the code, build it yourself, and keep running Python 2 as long as you want. Nobody is stopping you.

                                                                                      This is even more silly because Python2 was ALREADY DEPRECATED in 2011 when the author started his project.

                                                                                      1. 5

                                                                                        /Are people really still using this argument?!?/ Just because software, packages and distributions don’t cost money doesn’t mean that people don’t use them and have expectations from them. In fact, that is exactly why they were provided in the first place. This “you should have known better” attitude is totally counterproductive because it implies that if you want any kind of stability or support with some QoS you should not use free/open-source software. I don’t think any of us want to suggest that. It would certainly not do most open source software justice.

                                                                                        1. 9

                                                                                          This “you should have known better” attitude is totally counterproductive because it implies that if you want any kind of stability or support with some QoS you should not use free/open-source software.

                                                                                          My comment doesn’t imply that, though. In fact, as I pointed out, the author can still download Python2 and use it if he wants to. Free to use does not imply free support, and I think it’s a good thing for people to keep in mind.

                                                                                          Furthermore, I don’t think a “you should have known better” attitude is out of line towards somebody who ignored 10 years of deprecation warnings. What did he think was going to happen? He had 10 years of warning - he really should have known better…

                                                                                          1. 1

                                                                                            If you argue based on the 10 years of warning, you’re missing the point.

                                                                                            The point is not that there was no time to change it. The point is that it shouldn’t need change at all.

                                                                                          2. 13

                                                                                            Just because software, packages and distributions don’t cost money doesn’t mean that people don’t use them and have expectations from them

                                                                                            Haven’t there been a few articles recently about people being burnt out from maintaining open source projects? This seems like exactly the kind of entitled attitude that I think many of the authors were complaining about. I’m sure there would be plenty of people to maintain it for you if you paid them, but these people are donating their time. Expecting some developer to maintain software deprecated in 2011 for you is absurd.

                                                                                            1. 1

                                                                                              Yeah, I’ve read a few of those articles, too. And don’t get me wrong I’m not trying to say that things should be this way. A lot of open source work deserves to be paid work!

                                                                                              But I also don’t think there is anything entitled about this point of view. It’s simply pragmatic: people make open source software, want others to use it, and that is why they support and maintain it. Then the users become dependent. Trouble ensues when visions diverge or no more time can be allocated for maintenance.

                                                                                              1. 9

                                                                                                At the same time, it’s not like a proprietary software vendor that you staked your entire business on. The source code to Python 2 isn’t going anywhere. Just because the PSF and your Linux distribution decided to stop maintaining and packaging an ancient version doesn’t mean you can’t continue to rely on some company (or yourself!) to maintain it for you. For instance, Red Hat will keep updating Python 2 for RHEL until June 2024.

                                                                                                And as crazy as it might seem to have to support software yourself, consider that the FreeBSD people kept a 2007 version of GCC in their build process until literally this week. That’s 13 years where they kept it working themselves. It’s not like it’s hard to build and package obsolete userspace software; nothing is going to change in the way Linux works that would prevent you from running Python 2 on it in five years (unlike most system software which might make more assumptions about the system it’s running on).

                                                                                                Some amount of gratuitous change is worth getting worked up about. For example, it’s a well-known issue in fast-moving ecosystems like JavaScript that you might not be able to get your old project to build with new dependency versions if you step away for a year. That’s a problem.

                                                                                                I, for one, am extremely glad that it’s now okay for library authors to stop maintaining Python 2 compatibility. The alternative would have been maintaining backwards compatibility using something like a strict mode (JavaScript, Perl) or heavily encouraging only using a modern subset of the language (C++). The clean break that Python made may have alienated some people with legacy software to keep running, but it moved the entire ecosystem forwards.

                                                                                                1. 1

                                                                                                  The source code to Python 2 isn’t going anywhere. Just because the PSF and your Linux distribution decided to stop maintaining and packaging an ancient version doesn’t mean you can’t continue to rely on some company (or yourself!) to maintain it for you.

                                                                                                  1. Some distros are eager to make python launch python3. That vanity-driven move is hostile to having Python 2 and 3 side by side (with 2 coming from a non-distro source).
                                                                                                  2. By not keeping Python 2 open to maintenance by willing parties in the obvious place (at the PSF), and by being naming-hostile to people doing it elsewhere in a way that not only maintains but adds features, the PSF is making it harder than it has to be to pool effort for Python 2’s continued maintenance.
                                                                                                  1. 2

                                                                                                    It’s arguably more irresponsible to keep implicitly pushing Python 2.x as the “default” Python by continuing to refer to it by the python name out of deference to “not breaking things”, when it is explicitly unmaintained.

                                                                                            2. 7

                                                                                              it implies that if you want any kind of stability or support with some QoS you should not use free/open-source software

                                                                                              If you want support with guarantees attached you shouldn’t expect to get that for free. If you are fine with community/developer-provided support with no guarantees attached, then free software is fine.

                                                                                              I think being deprecated for a decade before support being ended is pretty amazing for free community-provided support, to be honest.

                                                                                          1. 18

                                                                                            For folks wanting more context on how the “minimum supported Rust version” (MSRV) issue is treated in the ecosystem, this issue has a number of opinions (including my own) and some discussion: https://github.com/rust-lang/api-guidelines/issues/123

                                                                                            As far as I can tell, there is no strong consensus on what to do. In practice, I’ve observed generally the following states:

                                                                                            1. Some folks adopt an explicit MSRV policy but do not consider it a breaking change to increase it.
                                                                                            2. Some folks adopt an explicit MSRV policy and consider it a breaking change to increase it.
                                                                                            3. There is no MSRV policy, and the only guarantee you have is that it compiles on latest stable (or latest stable minus two releases).

                                                                                            In general, I’ve found that (1) and (2) are usually associated with more widely used crates and generally indicates an overall more conservative approach to increasing the MSRV. (3) is generally the default though, as far as I can tell.

                                                                                            There’s good reason for this. Maintaining support for older versions of Rust is a lot of thankless work, particularly if your library is still evolving or if your own crate has other dependencies with different MSRV policies. All it takes is one crate in your dependency graph to require a newer version of Rust. (Unless you’re willing to pin a dependency in a library, which is generally bad juju.) Rust’s release cycle reinforces this. It moves quickly and provides new things for folks to use all the time. Those new things are added specifically because folks have a use for them, so their use can propagate quickly in the ecosystem if a widely used crate starts using it. The general thinking here is that updating your Rust compiler should be easy. And generally speaking, it is.

                                                                                            “Maturity” is perhaps the right word, but only in the sense that, over time, widely used crates will slow their pace of evolution and, consequently, slow their MSRV increases. This isn’t necessarily equivalent to saying that “maturity” equals “slow evolution,” because it is generally possible for crates to make use of newer versions of Rust without increasing their MSRV via version sniffing and conditional compilation. (Not possible in every case, but the vast majority.) But doing this can lead to significant complexity and a greatly increased test matrix. It’s a lot of extra work, and maybe doing that extra work is what this author means by “maturity.” Chances are though, that’s a lot of unpaid extra work, and it’s not clear to me that that is reasonable expectation to have.
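                                                                                            (For readers who want the mechanics: since Cargo 1.56, which postdates much of this discussion, a crate can declare its MSRV in Cargo.toml via the rust-version field, and newer Cargo will then report a clear “this crate requires Rust X” error on too-old toolchains instead of an opaque compile failure. A minimal sketch, with a made-up crate name:)

                                                                                            ```toml
                                                                                            [package]
                                                                                            name = "example-crate"   # hypothetical crate
                                                                                            version = "0.1.0"
                                                                                            edition = "2018"
                                                                                            # Cargo 1.56+ checks this and refuses to build with an older
                                                                                            # toolchain, instead of failing partway through compilation:
                                                                                            rust-version = "1.34"
                                                                                            ```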

                                                                                            1. 4

                                                                                              Perhaps part of the solution could be to make LTS versions of rustc and cargo? That way distro maintainers could preferentially use those, and package maintainers preferentially target those. Make the common toolchain setup procedure apt install cargo instead of curl https://sh.rustup.rs | sh and there’s at least a prayer of people preferring that. Debian 10 currently ships with rustc 1.34 for example, which IMO is a pretty good place to put a breakpoint.

                                                                                              But for this to happen there needs to be agreement on what the LTS versions are. If Debian 10 ships rustc 1.34, Ubuntu 20.04 ships 1.37 and Fedora ships 1.12, then as a crate maintainer I’m not going to bother trying to target a useful minimal version, because it’s a lot of random work that will never be perfect. If everyone ships rustc 1.34, then it’s much easier to say to myself “well I’d like this shiny new feature in rustc 1.40 but I don’t really need it for now, it can just go in the next time I’m making a breaking release anyway”. This actually works in my favor, ‘cause then when a user tries to install my software on some ancient random system I can just say “sorry, you have to use rustc 1.34+ like everyone else, it’s not like that’s a big ask”. Then distro maintainers can backport rustc 1.34 to Debian 9 or 8 if they really need to, and only need to do it once as well for most people’s software to work.

                                                                                              This happens already, hence why Debian 10 has gcc-7 and gcc-8 packages. It’s fine. The special cases just need to be uncommon enough that it’s not a huge hassle.

                                                                                              1. 5

                                                                                                Yes, people generally want some kind of LTS story. There was an RFC that was generally positively received about 1.5 years ago: https://github.com/rust-lang/rfcs/pull/2483

                                                                                                It was closed due to lack of bandwidth to implement it, but it seems like something that will be revisited in the future. There’s just a ton of other stuff going on right now that is soaking up team bandwidth, mostly in the form of implementing already merged RFCs.

                                                                                                1. 4

                                                                                                  It would be really sad to let Debian hold back Rust version adoption in the ecosystem the way Debian gets to hold back C++ version adoption via frozen GCC.

                                                                                                  It seems to me it would be a major strategic blunder for Rust to do an LTS instead of the current situation.

                                                                                                  1. 2

                                                                                                    Is Debian a factor anymore? I mean it was always pretty backwards, but does anybody use it, care for it anymore? How independent is Ubuntu from them?

                                                                                                    I only use fedora/centos/rhel or windows for work. I have only seen Ubuntu in use by others in large real-world deployments, but Debian? Never.

                                                                                                    1. 3

                                                                                                      Is Debian a factor anymore? I mean it was always pretty backwards, but does anybody use it, care for it anymore? How independent is Ubuntu from them?

                                                                                                      People do care about Debian and use Debian. That’s fine. What’s not fine is acting entitled to having code from outside the Debian stable archive build with the compilers shipped by Debian stable.

                                                                                                      As Ubuntu LTS releases get older, they have similar ecosystem problems as Debian stable generally, but in the case of Rust in particular, Ubuntu updates Rust on the non-ESR Firefox cycle, so Rust is exempt from being frozen in Ubuntu. (Squandering this exemption by doing a Rust LTS would be a huge blunder for Rust in my opinion.)

                                                                                                      In my anecdotal experience entitlement to have out-of-archive code build with in-archive compilers is less of a problem with RHEL. People seem to have a better understanding that if you use RHEL, you are paying Red Hat to deal with being frozen in time instead of being frozen in time being a community endeavor beyond the distro itself. Edited to add: Furthermore, in the case of Rust specifically, Red Hat provides a rolling toolchain for RHEL. It doesn’t roll every six weeks. IIRC, it updates about every third Rust upstream release.

                                                                                                      1. 3

                                                                                                        The company I work at (an ISP & ISTP) uses Debian as the operating system on almost all virtual machines running core software which requires n nines of uptime.

                                                                                                        1. 3

                                                                                                          I’ve found Debian Stable to be perfectly fine for desktop and server use. It just works, and upgrades are generally pretty smooth. Clearly, you have different experiences, but that doesn’t make Debian “backwards”.

                                                                                                          1. 1

                                                                                                            One department at my university has been mostly Debian for 15+ years.

                                                                                                            1. 0

                                                                                                              I have seen Debian at a university department too, but not at places where actual money is made, or work is getting done. I had to use pkgsrc there to get fresh packages as a user to be able to get my stuff done.

                                                                                                              University departments can afford to be backwards, because they are wasting other people’s time and money with that.

                                                                                                              1. 3

                                                                                                                Every place that I have worked primarily uses Debian or a Debian derivative. (Google used Ubuntu on workstations; at [Shiny consumer products, inc] the server that I was deploying on was Debian, despite the fact that they have their own server OS and they even supported it at the time; and the rest have been smaller firms or I’m under NDA and can’t discuss them). Except for Google, it was always Debian stable. So no, not just universities.

                                                                                                                1. 1

                                                                                                                  BSD Unix was developed at a university.

                                                                                                                  Linus attended a university when starting to develop the Linux kernel.

                                                                                                                  The entire ethos and worldview of Free Software is inspired by RMS’ time at university.

                                                                                                                  The programming darling du jour, Haskell, is an offshoot of an academic project.

                                                                                                                  I’m really sad so much time and energy and other people’s money have been wasted on these useless things…

                                                                                                                  1. 2

                                                                                                                    Nice strawman!

                                                                                                                    And the infrastructure supporting these was just as backwards for its time as running Debian is today, wasting the time of students and tutors with outdated tools provided by the host institution…

                                                                                                                    1. 1

                                                                                                                      In the comment I replied to first , you write:

                                                                                                                      […] a university department too, but not at places where actual money is made, or work is getting done

                                                                                                                      University departments can afford to be backwards, because they are wasting other people’s time and money with that.

                                                                                                                      (my emphasis)

                                                                                                                      I find it hard to read these quotes in any other way than you believe that universities are a waste of time and money…

                                                                                                                      edit clarified source of quotes

                                                                                                                      1. 4

                                                                                                                        I can also mis-quote:

                                                                                                                        I find it hard to read […]

                                                                                                                        But I would actually rather read and parse your sentences in their entirety.

                                                                                                                        My claims were:

                                                                                                                        a) I have only seen Debian used at places where efficiency is not a requirement
                                                                                                                        b) Universities are such places

                                                                                                                        I didn’t claim they don’t produce any useful things:

                                                                                                                        […] University departments can afford to be backwards, because they are wasting other people’s time and money with that.

                                                                                                                        Which should be parsed as: university departments are wasting other people’s time and money by not using proper tools and infrastructure, for example by using outdated (free) software. They are being inefficient. They waste student and tutor time, and thus taxpayer money, when not using better available free tools, but it doesn’t matter to them, as it does not show up on their balance sheet. Tutors and students are already expected to do a lot of “off-work hours” tasks to get their rewards: grades or money.

                                                                                                                        And yes, they are being inefficient:

                                                                                                                        • I had to find floppy disks in 2009 to be able to get my mandatory measurement data off a DOS 5.0 machine at a lab. It was hard to buy them, and to find a place where I could read them… This one can be justified: expensive specialized measurement equipment was used, and only legacy tools supported it.
                                                                                                                        • I had to do my assignments with software available only at the lab, running some (then current) Debian version shipping only outdated packages. OpenOffice kept crashing, and outdated tools were a constant annoyance. As a student, my time was wasted. (Until I installed pkgsrc and rolled my own up-to-date tools.)
                                                                                                                        • At a different university, in 2015, I saw students working in DOSBox, writing 16-bit protected-mode assembly in edit.com and compiling with some ancient MS assembler, because the department thought the basics of assembly programming hadn’t changed since they introduced the curriculum, so they wouldn’t update the tools or the curriculum. They waste everyone’s money; the students won’t use it in real life anyway. They are not properly supervised, as they would be if they were living off the market.
                                                                                                                        1. 3

                                                                                                                          Thanks for clarifying.

                                                                                                                          I realize it might be hard to realize for you now, but I can assure you that “the real world, governed by the market”, can be just as wasteful and inefficient as a university.

                                                                                                                          1. 2

                                                                                                                            Unfortunately that is also true. I have seen the “bullshit jobs” (a nice book, btw.) kind of business from the inside (I was partly a box-ticker for a time), but the enormous waste I saw at universities makes me feel that the useful stuff coming out of them is the exception: the result of herculean efforts by a few working against all odds, with whole institutions working to strangle the people and projects that lead to meaningful results.

                                                                                                                            Wasting your own money is one thing; I don’t care that much about that. Wasting taxpayer money is not a good move, but to some extent I can tolerate it… Wasting talent and other people’s time is what really infuriates me.

                                                                                                                  2. 1

                                                                                                                    I had to use pkgsrc there to get fresh packages

                                                                                                                    Did you have root privileges as a student?

                                                                                                                    1. 2

                                                                                                                      pkgsrc supports unprivileged mode!

                                                                                                                      https://www.netbsd.org/docs/pkgsrc/platforms.html#bootstrapping-pkgsrc

                                                                                                                      It worked like a charm.

                                                                                                                      But I did actually have root privileges, as the guy responsible for the lab was overburdened, and sometimes some of us whom he trusted helped other students. Still, I didn’t use that access to alter the installed system, as that would have been outside my mandate.

                                                                                                            2. 1

                                                                                                              Debian 10 currently ships with rustc 1.34 for example, which IMO is a pretty good place to put a breakpoint.

                                                                                                              1.34 has neither futures nor async/await, features that seriously impact code design. Do I really have to wait for Debian 11 in 2021 to use them?

                                                                                                              1. 2

                                                                                                                No, if you need them then install a newer rustc and use them. But there’s plenty of code that also doesn’t need futures or async/await.

                                                                                                            3. 3

                                                                                                              Wow, I wasn’t aware that this issue has an acronym and even a place for discussion. Thanks for the pointer!

                                                                                                              widely used crates will slow their pace of evolution and, consequently, slow their MSRV increases.

                                                                                                              Exactly what I’m hoping for, and precisely the reason I’m not jumping off the ship :)

                                                                                                              maybe doing that extra work is what this author means by “maturity.”

                                                                                                              In part, yes, that’s what I meant. The other possibility is to hold off adopting new APIs (as you did with alloc in regex; thanks!). I understand both options are a PITA for library maintainers, and might not even make sense, economy-wise, for unpaid maintainers. Perhaps I should’ve used “self-restraint” instead of “maturity”, but that probably has some unwanted connotations as well.

                                                                                                              1. 2

                                                                                                                Here’s a cargo subcommand (cargo msrv-table) I hacked together (warning, just a hacky PoC) that displays the MSRV by crate version for any particular crate.

                                                                                                            1. 17

                                                                                                              I think this blog post is correct, but also judging Go on things Go has never focused on being excellent at.

                                                                                                              Go was intended to fix the issues that Google had with large C++ projects. It compiles fast. There are no cyclic dependencies. Concurrency is easy to use. Deployment is easy and straight-forward. The language is small (there are fewer keywords than in C).

                                                                                                              Especially deployment is often disregarded when people evaluate programming languages. Python, C, and C++ have their strengths, but deployment can often be problematic. Compiling a static ELF in Rust is possible, and some may say this is a detail and not important, but it’s nevertheless not as straightforward as in Go.

                                                                                                              Rust and Go have had completely different goals from day one, and that’s fine.

                                                                                                              1. 6

                                                                                                                Compiling a static ELF in Rust is possible, and some may say this is a detail and not important, but it’s nevertheless not as straightforward as in Go.

                                                                                                                What’s insufficiently static about Rust’s default behavior?

                                                                                                                That the glibc dependency is dynamically linked and you need to use the musl target to statically link the C library? That C-interfacing -sys crates leave the linkage with C system libs dynamic?

                                                                                                                By default all Rust code is statically linked.

                                                                                                                1. 2

                                                                                                                  This looks dynamically linked to me:

                                                                                                                  % rustc --version
                                                                                                                  rustc 1.43.0-nightly (5e7af4669 2020-02-16)
                                                                                                                  % rustc hello.rs
                                                                                                                  % file hello
                                                                                                                  hello: ELF 64-bit LSB pie executable, x86-64, version 1 (SYSV), dynamically linked, interpreter /lib64/ld-linux-x86-64.so.2, BuildID[sha1]=423cb18ef664b10711694265f2eb5f0215ccedde, for GNU/Linux 3.2.0, with debug_info, not stripped
                                                                                                                  
                                                                                                                  1. 1

                                                                                                                    It’s dynamically linked with glibc, but surely all the Rust code is statically linked into the executable by default.

                                                                                                              1. 5

                                                                                                                It’s disappointing that freezing the compiler is seen as maturity as opposed to being able to upgrade the compiler without breakage being seen as maturity.

                                                                                                                That said, a big part of why this topic is painful is Rust’s former resistance to making MSRV part of the Cargo dependency resolution, which means that increasing MSRV breaks the build for folks who, for whatever reason, run cargo update on an old toolchain. Making an MSRV bump a semver bump is bad as seen with the recent base64 semver bump that didn’t involve API breakage. Either all crates that transitively depend on base64 also undergo a semver bump (disruptive when the dependency doesn’t involve types exposed in the outward APIs of the dependent crates) or they don’t, which would make base64’s semver bump pointless.

                                                                                                                 Fortunately, the Cargo MSRV RFC got accepted, so better times are ahead once the RFC is implemented.
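For reference, what eventually shipped from that line of work is the `package.rust-version` field (stable since Cargo 1.56), which makes a too-old toolchain fail fast with a clear MSRV error rather than a confusing parse or resolution failure. A minimal Cargo.toml sketch (crate name and versions are hypothetical):

```toml
[package]
name = "my-crate"        # hypothetical crate name
version = "0.1.0"
edition = "2018"
rust-version = "1.39"    # e.g. the first release with async/await
```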

                                                                                                                1. 2

                                                                                                                  If cargo and rustup had been rolled into one tool much of this could have been avoided.

                                                                                                                  Dependency uses nightly? You get nightly.

                                                                                                                  New dependency increased the MSRV? You get the new one when you update.

                                                                                                                  1. 1

                                                                                                                    It’s disappointing that freezing the compiler is seen as maturity as opposed to being able to upgrade the compiler without breakage being seen as maturity.

                                                                                                                    I’d say that upgrading without breakage is just a stepping stone to maturity. Also, as I pointed out in the discussion on Mastodon, the issue is less about the compiler and more about the crates ecosystem.

                                                                                                                    1. 1

                                                                                                                      Hm, I don’t think there was any resistance to MSRV in Cargo.toml post 1.0? IIRC, it’s rather that no one was willing to do the design/implementation work.

                                                                                                                    1. 2

                                                                                                                      I will admit, the unicode one got me. I was expecting to be able to reverse the glyphs, but I suppose that probably isn’t that simple.

                                                                                                                      1. 6

                                                                                                                        Translate it to Arabic or Hebrew ;-)

                                                                                                                        1. 3

                                                                                                                          I don’t know any Arabic, but to give a Hebrew example, reversing the letters of חיים should give you מייח. The mem glyph changes (the final form ם becomes the non-final form מ).

                                                                                                                          1. 4

                                                                                                                            It was a joke answer to a trick question. I meant that if the string was in English (a left-to-right language) you could translate it into a right-to-left language to “reverse” it.

                                                                                                                            1. 5

                                                                                                                              Non-joke answer:

                                                                                                                              There’s a crate for extended grapheme cluster-aware string reversal: https://crates.io/crates/unicode-reverse

                                                                                                                              I have a use case for string reversal using this crate (though the use case doesn’t really need to be extended grapheme cluster-aware, since ISO-8859-8 doesn’t support vocalized Hebrew): synthesizing visual Hebrew from a corpus of logical Hebrew text for testing visual Hebrew detection.

                                                                                                                              1. 2

                                                                                                                                Yeah that completely wooshed me

                                                                                                                          2. 2

                                                                                                                            Consider: how does one reverse “ijk”? Next: how does one reverse “Dijkstra”?

                                                                                                                            1. 1

                                                                                                                              Fun example, but falls flat if the person answering the question doesn’t know a damn about the Dutch “ij”. I think it’s a regular diphthong, as far as diphthongs can be regular :P

                                                                                                                              Now you can take this as proving or disproving your point, but if you give me a word/name in a language I don’t speak I don’t have any qualms about reversing it and pretending to not know (or really don’t know) if there’s a diphthong in it.

                                                                                                                          1. 3

                                                                                                                             This mentions the two usual points of difficulty: the borrow checker and the distinction between String and &str. It would be interesting to see a meta-analysis of posts in this genre to see whether these bother people with a C++ background less than they bother people coming from garbage-collected languages.

                                                                                                                             After all, if you are already writing OK C++, you need to write in ways that would satisfy the (post-NLL) borrow checker. And C++ has an analogous distinction between std::string and std::string_view.

                                                                                                                            1. 1

                                                                                                                               This isn’t really specific to Bastion, but there seems to be a new trend where everyone dual-licenses their software. When I first got into free software, I seem to remember reading that this was bad and I shouldn’t do it. Has the wisdom changed and I just didn’t realize it? Or is everyone just no longer following the “rule”, consequences be damned?

                                                                                                                              1. 3

                                                                                                                                I think it’s mostly a trend in rust projects because rust is licensed that way: https://internals.rust-lang.org/t/rationale-of-apache-dual-licensing/8952/5

                                                                                                                                1. 3

                                                                                                                                   Dual licensing with what the language implementation project really wants, plus something that provides GPLv2 compatibility, with the library ecosystem adopting the same licensing as the language implementation, is not a new trend. It goes back all the way to Perl.

                                                                                                                                  1. 2

                                                                                                                                    There’s been a cultural shift.

                                                                                                                                    In part, it’s driven by the spate of companies selling OSS hosting - eg AWS taking the foss elasticsearch and selling it as a hosted service.

                                                                                                                                    Quite a few people started to feel like they were spending their nights and weekends working in order to make money for someone who wasn’t contributing a commensurate amount back.

                                                                                                                                    Another part is the growing realization that ‘pure-tech’ purpose-agnostic tools are being used in real life to eg put children in cages around the US. Lots of OSS authors have become particularly uncomfortable at the idea of their work reducing the cost of that.

                                                                                                                                    There’s probably more to it than that, but I’ve definitely seen both of those reasons given.

                                                                                                                                    1. 0

                                                                                                                                      How is this relevant to the question asked? The Rust/Bastion license clearly isn’t part of the shift you are referring to.

                                                                                                                                      1. 4

                                                                                                                                        The question opened with

                                                                                                                                        This isn’t really specific to Bastion, but there seems to be a new trend

                                                                                                                                        That didn’t strike me as an attempt to find out about rust or bastion specifically.

                                                                                                                                  1. 6

                                                                                                                                    The Discord ToS also limits the use to non-commercial use.

                                                                                                                                    1. 8

                                                                                                                                      The ToS also prohibits the use of third-party (read: non-spyware) clients; users have been banned in the past for this.

                                                                                                                                    1. 3

                                                                                                                                       I should note that C11 did introduce a few limited functions to work with UTF-8 specifically; <uchar.h> defines char16_t and char32_t as well as mbrtoc16 (multibyte char to char16_t), c16rtomb (char16_t to multibyte char), mbrtoc32 (multibyte char to char32_t) and c32rtomb (char32_t to multibyte char). For the purposes of this specific article, however, this is not very helpful: toupper/towupper are only defined for char and wchar_t, respectively, and you’re not guaranteed that wchar_t can actually represent a char32_t, so uppercasing remains very hard.

                                                                                                                                       Grapheme clusters make this seemingly simple task even harder, and this is all before realizing that these functions are also locale-dependent (necessarily so, which also makes them OS-dependent) because of regional variations.

                                                                                                                                      1. 4

                                                                                                                                        I should note that C11 did introduce a few limited functions to work with UTF-8 specifically

                                                                                                                                        How are any of the things you mention for UTF-8 specifically? C11 doesn’t guarantee UTF-8 interpretation for mb arguments. The interpretation depends on locale, AFAIK C11 doesn’t guarantee the availability of UTF-8 locales (and in practice, there exist Windows versions that don’t have UTF-8 locales), and changing the locale affects all threads, so you can’t change the locale from within a multithreaded program.

                                                                                                                                        C11 doesn’t guarantee UTF-16 semantics for char16_t or UTF-32 semantics for char32_t. This is even less sensible than not guaranteeing two’s complement math. At least when C got the possibility of non-two’s complement math, non-two’s complement hardware existed. When char16_t and char32_t were introduced, there were no plausible alternatives for UTF-16 and UTF-32 semantics. C++20 guarantees UTF-16 and UTF-32 semantics for char16_t and char32_t.

                                                                                                                                        1. 2

                                                                                                                                          C11, § 7.28(1):

                                                                                                                                          The header <uchar.h> declares types and functions for manipulating Unicode characters.

                                                                                                                                          (emphasis mine) J.3.4 on undefined behavior:

                                                                                                                                          The encoding of any of wchar_t, char16_t, and char32_t where the corresponding standard encoding macro (__STDC_ISO_10646__, __STDC_UTF_16__, or __STDC_UTF_32__) is not defined (6.10.8.2).

                                                                                                                                          I suppose you could read that as not necessarily meaning UTF-8 (or UTF-16 or UTF-32) if these specific macros are not defined, but at that point, an implementation is just so obviously insane that it’s no use trying to deal with it.

                                                                                                                                          Incidentally, you may find it interesting that a recent draft of C2x has dropped two’s complement support, see the mention of N2412 on p. ii.

                                                                                                                                          1. 4

                                                                                                                                            dropped two’s complement support

                                                                                                                                            You had me panicking for a moment, but looking at that PDF it looks like they’ve accepted N2412, which means they’ve guaranteed two’s complement support and dropped support for other sign representations.

                                                                                                                                            1. 2

                                                                                                                                              Right, I meant to say “dropped everything but two’s complement”, my bad. Not that I can edit it anymore.

                                                                                                                                            2. 2

                                                                                                                                              I suppose you could read that as not necessarily meaning UTF-8 (or UTF-16 or UTF-32) if these specific macros are not defined, but at that point, an implementation is just so obviously insane that it’s no use trying to deal with it.

                                                                                                                                              Those macros being defined mean that wchar_t has UTF-32 semantics, char16_t has UTF-16 semantics, and char32_t has UTF-32 semantics. None of those are about UTF-8.

                                                                                                                                              Incidentally, you may find it interesting that a recent draft of C2x has dropped all sign representations other than two’s complement, see the mention of N2412 on p. ii.

                                                                                                                                              Fortunately, reasonable things flow from the C++ committee to the C committee.

                                                                                                                                              1. 0

                                                                                                                                                C++ is a prescriptive standard. It describes what it wants the world to be, and then hopes the world catches up. You can do that when the entire language community is on board and the language is complex enough to only have a few implementations.

                                                                                                                                                C is a descriptive standard. C runs on a massive range of hardware. C cannot afford to exclude real existing hardware targets that work in weird ways. Ones complement and signed magnitude integers really do exist in hardware. It wasn’t unreasonable to support them.

                                                                                                                                                Now that C has threading and atomics and is ballooning in scope it’s probably not implementable on those computers anyway, and they can just stick with C89. Getting rid of that support today, now that those platforms are so insanely obscure, is probably fine. But it was not unreasonable to support them in 1989 or 1999.

                                                                                                                                                1. 4

                                                                                                                                                  C++ and C are quite a bit closer together in this regard. z/OS and AIX character encoding issues show up frustratingly often in C++ standardization contexts.

                                                                                                                                            3. 0

                                                                                                                                              I wouldn’t expect C to guarantee UTF-8 for argv, if I’m understanding your comment correctly. C is a descriptive standard, and the world it’s describing is not universally UTF-8, even on the systems where it is most popular. Pathnames in Linux, for example, are not UTF-8, they are arbitrary null-terminated strings of bytes.

                                                                                                                                              1. 2

                                                                                                                                                Short note: path names in Rust use the “OsStr/OsString” types, which a “String” can be converted into, but the conversion the other way is fallible.

                                                                                                                                                Both have mappings to “Path/PathBuf”.
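A minimal sketch of those conversions, using only the standard library: String into OsString always succeeds, OsString back into String returns a Result, and both string types map into Path/PathBuf.

```rust
use std::ffi::{OsStr, OsString};
use std::path::{Path, PathBuf};

fn main() {
    // String -> OsString is infallible: every Rust String is valid UTF-8,
    // which every platform string type can represent.
    let os: OsString = OsString::from(String::from("hello.txt"));

    // OsString -> String is fallible: the platform string may not be valid
    // Unicode, so into_string() returns Result<String, OsString>.
    let s: Result<String, OsString> = os.into_string();
    assert_eq!(s.unwrap(), "hello.txt");

    // Both &str and &OsStr map into the path types.
    assert_eq!(Path::new("hello.txt"), Path::new(OsStr::new("hello.txt")));
    let _p: PathBuf = PathBuf::from(OsString::from("hello.txt"));
}
```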

                                                                                                                                                1. 0

                                                                                                                                                    Yes, path names are of type OsStr, but command-line arguments as returned by std::env::args are Strings. That means your code works fine until it panics in production one day, because someone passed a non-UTF-8 path name and std::env::args tried to turn it into a String, even though you were just going to turn it back into an OsStr anyway.
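For reference, the panic-free alternative in the standard library is std::env::args_os, which yields OsString values directly, so non-UTF-8 arguments can be fed to path APIs without the lossy round trip:

```rust
use std::env;
use std::path::PathBuf;

fn main() {
    // env::args() panics if any argument is not valid Unicode;
    // env::args_os() yields OsString and accepts arbitrary platform strings.
    let paths: Vec<PathBuf> = env::args_os()
        .skip(1) // skip the program name
        .map(PathBuf::from) // no UTF-8 conversion, original bytes preserved
        .collect();

    for p in &paths {
        // Path::display() is only for printing; the PathBuf itself
        // still holds the original, possibly non-UTF-8, contents.
        println!("{}", p.display());
    }
}
```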

                                                                                                                                                2. 2

                                                                                                                                                  My point is that, AFAICT, contrary to your up-thread comment, C11 provides no facilities that you can actually trust to do UTF-8 processing in a cross-platform way. (If you know the program will run on macOS or OpenBSD, then you can rely on certain functions performing UTF-8 operations.)

                                                                                                                                                  1. 1

                                                                                                                                                    I have not made any comments suggesting that C11 provides UTF-8 support.

                                                                                                                                                    C2x should have char8_t, however.

                                                                                                                                                    1. 3

                                                                                                                                                      Sorry, I meant xorhash’s comment upthread.

                                                                                                                                            1. 4

                                                                                                                                              Hoping that accessibility support on Windows and Mac comes out of this eventually. The lack of it is a significant obstacle to GTK being able to serve as a cross-desktop-OS toolkit.

                                                                                                                                              1. 10

                                                                                                                                                Sadly, this feels like C++ will be bound by the part of the industry that just wants to live on legacy forever. So we’ll probably need to sacrifice C++ to them and use something new, if we want a modern C++.

                                                                                                                                                1. 8

                                                                                                                                                  There are only two kinds of languages: the ones people complain about and the ones nobody uses.

                                                                                                                                                  -Bjarne Stroustrup

                                                                                                                                                  The commitment to backwards compatibility is one of the major reasons why C++ has gained such high adoption in industry. C++ is certainly not my favorite language, but being able to compile and run code from two or three decades ago without modification is massively important for infrastructure development, and few languages have demonstrated the same compatibility guarantees seen in C++.

                                                                                                                                                  1. 4

                                                                                                                                                      That Bjarne quote is annoying. It is obviously true (almost tautological), but it completely sidesteps whether the complaints are valid or not.

                                                                                                                                                    Backwards compatibility with problems of the past was already a hard constraint for C++98, and now there’s 20 more years of C++ additions to be backwards-compatible with.

                                                                                                                                                  2. 6

                                                                                                                                                    D and Rust are both solid options for a modern C++ if you don’t need the backward compatibility. As others have noted, legacy support is one of the major selling points of C++; there is no reason for them to give that up.

                                                                                                                                                    1. 4

                                                                                                                                                      Please explain why you think that. I only see new tools to use, but my projects are my own.

                                                                                                                                                      1. 6

                                                                                                                                                        It’s about not breaking the API and instead merely deprecating std::regex etc. They’re not breaking it because apparently the industry would fork the language otherwise. So we’re stuck with some bad decisions more or less forever.

                                                                                                                                                        1. 2

                                                                                                                                                          because apparently the industry will fork it otherwise

                                                                                                                                                          Don’t they already fork it? Almost every C++ project seems to have some reimplementation of part of the standard library.

                                                                                                                                                          1. 1

                                                                                                                                                            There’s a really big difference between forking the standard (whatwg-style) and avoiding a few of the features of the standard library in favor of your own alternative implementations.

                                                                                                                                                            1. 2

                                                                                                                                                              I very much doubt you’d see an industry fork. The companies with sufficient interest in C++ and the massive resources required to develop compilers are probably the ones pushing C++ forward.

                                                                                                                                                              What you would be more likely to see is some companies that just stop upgrading and stick with the last release that doesn’t break compatibility.

                                                                                                                                                              If you did see any non-standard developments to said release, I expect they would be minor and not widely distributed. Those who are resistant to change are unlikely to be making big changes, and until relatively recently C++ has had very little in the way of a community that might coordinate a fork.

                                                                                                                                                      2. 4

                                                                                                                                                        Legacy code is code that’s worth keeping running. A significant part of C++’s value is in compatibility with that code.

                                                                                                                                                        Causing an incompatibility at the kind of distance from C++ that Python 3 was from Python 2 just isn’t worth it. If the compatibility constraints are loosened more than that, Rust already exists.

                                                                                                                                                        1. 3

                                                                                                                                                          Legacy code is code that’s worth keeping running.

                                                                                                                                                          Sure. The question is whether we have to keep punishing newly written code and new programmers (who weren’t even alive when C++’s poor decisions were made) with it.

                                                                                                                                                          1. 2

                                                                                                                                                            A language called “C++” but incompatible with the C++ that existing code bases are written in wouldn’t solve problems for new programmers working on those code bases. That is, you can’t change a language spec and thereby fundamentally alter design decisions that existing code was already built on.

                                                                                                                                                            Constraints on newly written code depend on how the new code intermingles with old code. New code interleaved tightly into existing C++ code is constrained by existing C++. Code that doesn’t interact with C++ at all, or interacts with it through a sufficiently identifiable interface, doesn’t have to be in a language called “C++”.

                                                                                                                                                        2. 3

                                                                                                                                                          Agreed. I believe that this kind of “modern C++” is Rust; there just has to be a way to keep C++ experts and their design mentality away from core development. Otherwise Rust will end up like C++.

                                                                                                                                                          1. 3

                                                                                                                                                            I’d disagree with the characterization of Rust as a modernized C++. I believe there are things in Rust that the people building C++ would love to have (epochs, for example, are a hot topic in that thread), but I don’t think it’s the language they would build if they could get out the chisel and break compatibility, not only in the ABI but maybe even in syntactic decisions and more. Despite the lack of commitment so far to even measly ABI breaks, with some of what’s seemingly in the pipeline really transforming a lot of the day-to-day of working with C++, maybe all you’d need to end up with a “modern C++” is sort of a, uh, “C+”. My personal choice for trimming to create a C+? Kill preprocessor directives with fire!

                                                                                                                                                            1. 1

                                                                                                                                                              I could have said more precisely:

                                                                                                                                                              Rust is what C++ developers need. It’s not necessarily what they want.

                                                                                                                                                            2. 2

                                                                                                                                                              …there just has to be a way to keep C++ experts and their design mentality away from core development. Otherwise Rust will end up like C++.

                                                                                                                                                              This comment makes me a bit sad. I understand the point that @soc is making, but I don’t think that the Rust community should ever be built on a foundation of keeping people out. The C++ community certainly struggles with a culture of complexity, but there is a lot that the C++ community can bring to the Rust community and vice versa.

                                                                                                                                                              1. 2

                                                                                                                                                                That and a lot of Rust’s core team are C++ experts already.

                                                                                                                                                                1. 1

                                                                                                                                                                  Every language is built around a set of values that result in a self-selection process for potential adopters:

                                                                                                                                                                  • The language has bad documentation? Keeps out people who think documentation is important. (See Scala.)
                                                                                                                                                                  • The language keeps adding features? Keeps out people who think that adding more features does not improve the language. (See C++, C#, JavaScript, Typescript, Swift, Rust, …)
                                                                                                                                                                  • Etc. etc.

                                                                                                                                                                  Regarding point number 2, for instance: I have decided that Rust 1.13 is roughly the maximum language size I’m willing to deal with when writing libraries.

                                                                                                                                                                  I have subsequently skipped all newer Rust versions and the features that were added in those versions. I can’t really “un-adopt” Rust, but I think I’m pretty far removed from usage that Rust devs would consider “mainstream”.

                                                                                                                                                            1. 32

                                                                                                                                                              If you were to start a new commercial project today, what language would you choose?

                                                                                                                                                              I’d choose Python. It’s not the best language, but I know it very well, I know its ecosystem very well, and I know I can “easily” find other (and affordable) engineers to help me if I need to. So, simply put, Python would be the best tool to make money, in my case.

                                                                                                                                                              1. 5

                                                                                                                                                                and affordable

                                                                                                                                                                Hey, who are you calling cheap!?

                                                                                                                                                                Joke aside, do you mean affordable in the sense that there are lots of people who know Python, so scarcity doesn’t drive salaries above average, or that Python developer salaries are below average?

                                                                                                                                                                1. 2

                                                                                                                                                                  It’s an established language and platform, so there are no surprises with cost. Python is now commonplace, and while it will go into decline at some point, that decline will be relatively gradual. People won’t be surprised. There are plenty of companies stuck with platforms that declined more quickly but still need to be maintained long-term. Some JS frameworks, like Angular, are already in that position in some markets.

                                                                                                                                                                  In terms of average salaries, there is an expectation that established platforms will not be able to sustain salaries at the level they were when the platform was up and coming. The next cool thing is somewhat unknown, so developers can justify higher salaries because they are supposed to be working on the higher risk frontier.

                                                                                                                                                                  When it comes to an individual you need to be doing more than Python to get a higher than average salary. Or, be paid to work remotely for a company that can sustain a higher salary (e.g. get a closer to SF salary, but live in the Midwest). If salary isn’t your primary goal, you now have more opportunities to practice Python because of greater market penetration. If salary is your goal, then the best plan is to be closer to the bleeding edge, and have a broader/deeper skillset.

                                                                                                                                                                  1. 2

                                                                                                                                                                    People won’t be surprised.

                                                                                                                                                                    Does no one get in trouble with clients or management due to the later costs of Python? Having to put effort into keeping the software working when Python makes backward-incompatible changes that other languages are less likely to make. Spending time after-the-fact on mypy when the software gets large. Dealing with some aspect of the program being too slow.

                                                                                                                                                                    1. 1

                                                                                                                                                                      You are taking that out of context, but I understand your perspective. My point is that there are no surprises from the ecosystem going away or declining rapidly, which would then have a knock-on impact on the market.

                                                                                                                                                                      Python applications can have many surprises in terms of costs later on as you point out. That said, I think a large part of that is to do with the inability of people to set proper expectations.

                                                                                                                                                                      Anyone with experience should realise that the convenience of Python comes with a cost. They should highlight these trade-offs in some manner with the other stakeholders. Whether it’s stretching Python, or someone creating an over-engineered monstrosity with some other framework for a company with no expertise in it, it really comes back to professionalism. There are real costs when we avoid discussing the issues, or don’t document them, but people seem to press on regardless.

                                                                                                                                                                      At other times we have to just accept that people with relatively little experience will make important decisions. We have to accept them and develop a plan for resolving the issues. If these decisions are happening continuously around us, then moving on or developing better influencing skills seem to be the only options.