1. 5

    I have a fairly infrequent one here: http://www.kmjn.org/notes/

    Some research and academia-related posts, some Unix-related posts, and miscellaneous other things.

    1. 13

      There’s a quote I like that I can’t remember from where:

      Thirty years ago “it reduces to 3SAT” meant the problem was impossible. Now it means the problem is trivial.

      1. 2

        I wrote something vaguely like that a few years ago, though I’m sure I wasn’t the first to observe it:

        SAT, the very first problem to be proven NP-complete, is now frequently used in AI as an almost canonical example of a fast problem

        1. 1

          Why is that? Because computers are much faster, or better algorithms?

          1. 3

            We have faster hardware and better algorithms now, yes. But the real reason is that early theoretical results, which emphasized worst-case performance, had scared the entire field off even trying. These results based on complexity classes are true, but misleading: as it turns out, the “average” SAT instance for many real-world problems probably is solvable. Only when this was recognized could we make progress on efficient SAT algorithms. Beware sound theories misapplied!
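
To make the flavor of this concrete, here is a toy DPLL-style solver. It is a hedged sketch, not how modern CDCL solvers (MiniSat, Glucose, etc.) actually work, but it illustrates why structured instances often avoid the worst case: unit propagation alone settles most of the formula, and search only branches on what is left.

```python
# Toy DPLL SAT solver sketch. Clauses are lists of nonzero ints in the
# DIMACS style: 3 means "variable 3 is true", -3 means "variable 3 is false".
# A formula is a list of clauses (conjunction of disjunctions).

def simplify(clauses, lit):
    """Assign lit true: drop satisfied clauses, strip falsified literals."""
    out = []
    for c in clauses:
        if lit in c:
            continue                      # clause satisfied, drop it
        out.append([l for l in c if l != -lit])
    return out

def dpll(clauses, assignment=None):
    if assignment is None:
        assignment = {}
    # Unit propagation: repeatedly satisfy single-literal clauses.
    changed = True
    while changed:
        changed = False
        units = [c[0] for c in clauses if len(c) == 1]
        for lit in units:
            clauses = simplify(clauses, lit)
            assignment[abs(lit)] = lit > 0
            changed = True
            if any(c == [] for c in clauses):
                return None               # empty clause: conflict
    if not clauses:
        return assignment                 # every clause satisfied
    if any(c == [] for c in clauses):
        return None
    lit = clauses[0][0]                   # branch on some remaining literal
    for choice in (lit, -lit):
        branch = dict(assignment)
        branch[abs(choice)] = choice > 0
        result = dpll(simplify(clauses, choice), branch)
        if result is not None:
            return result
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
model = dpll([[1, 2], [-1, 3], [-2, -3]])
print(model is not None)  # prints True
```

On small structured formulas like this one, propagation does almost all the work; the exponential branching only bites on the adversarial instances the complexity results are about.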

        1. 3

          I wonder whether the introduced bugs or any pattern they form are detectable? If they are, attackers would move on to other targets rather than get trapped in ‘flypaper.’ Making attackers believe that the bugs are exploitable would be the real win. It would be like the tactic of keeping the telemarketer on the line to keep them from calling others.

          1. 3

            Making attackers believe that the bugs are exploitable would be the real win.

            That’s a really common strategy, called a honeypot. Some even fake entire networks.

            1. 2

              I believe the initial assumption is that people treat large classes of bugs, like “the program crashes on invalid input”, as promising exploit candidates, in part because there is tooling to find those kinds of bugs (fuzzers and such). So you can maybe make that search harder if you inject a bunch of non-exploitable bugs for each of those common categories, so that fuzzers turn up far too many false positives. But yeah, then you have the usual arms race: can people just narrow their heuristics to exclude your fake bugs? There’s a small discussion of that from one of the authors on Twitter.

            1. 2

              Why do they even still have backups from 2007 in this post-GDPR world? They have no authority to retain that data, surely.

              1. 3

                I was thinking exactly this when I read about the breach. Backups from 2017, maybe, but almost-ten-year-old backups are useless, right?

                1. 4

                  Maybe it was a seed for a staging/testing system? It’s not uncommon for places to pass around a data seed for developers - usually it would be anonymized, but that’s not always the case.

                  1. 3

                    It’s not too surprising to me. When changing over to a new system, it’s fairly common to dump the old pile of spaghetti into an archive labeled Someone Sort This Mess Out Later, if you aren’t 100% sure that it doesn’t still have something important in it that needs to be ported over to the new system. Naturally, nobody ever gets around to sorting through it.

                1. 17

                  An interesting aspect of this: their employees’ credentials were compromised by intercepting two-factor authentication that used SMS. Security folks have been complaining about SMS-based 2FA for a while, but it’s still a common configuration on big cloud providers.

                  1. 11

                    What’s especially bugging me is platforms like twitter that do provide alternatives to SMS for 2FA, but still require SMS to be enabled even if you want to use safer means. The moment you remove your phone number from twitter, all of 2FA is disabled.

                    The problem is that if SMS is an option, that’s going to be what an attacker uses. It doesn’t matter that I myself always use a Yubikey.

                    But the worst are services that also use that 2FA phone number they got for password recovery. Forgot your password? No problem. Just type the code we just sent you via SMS.

                    This effectively reduces the strength of your overall account security to the ability of your phone company to resist social engineering. Your phone company, which has trained its call center agents to handle “customer” requests as quickly and efficiently as possible.

                    update: I just noticed that twitter has fixed this and you can now disable SMS while keeping TOTP and U2F enabled.

                    1. 2

                      But the worst are services that also use that 2FA phone number they got for password recovery. Forgot your password? No problem. Just type the code we just sent you via SMS.

                      I get why they do this from a convenience perspective, but it bugs me to call the result 2FA. If you can change the password through the SMS recovery method, password and SMS aren’t two separate authentication factors, it’s just 1FA!

                      1. 1

                        Have sites been keeping SMS given the cost of supporting locked-out users? Lost phones are a frequent occurrence. I wonder if sites have thought about implementing really slow, but automated recovery processes to avoid this issue. Going through support with Google after losing your phone is painful, but smaller sites don’t have a support staff at all, so they are likely to keep allowing SMS since your mobile phone number is pretty recoverable.

                        1. 1

                          For many accounts that are now de facto protected by nothing but a single easily hackable SMS, I’d much rather lose access than risk somebody else getting access.

                          If there were a way to tell these services and my phone company that I absolutely never want to recover my account, I would do it in a heartbeat.

                        2. 1

                          This effectively reduces the strength of your overall account security to the ability of your phone company to resist social engineering. Your phone company, which has trained its call center agents to handle “customer” requests as quickly and efficiently as possible.

                          True. Also, if you have the target’s phone number, you can skip the social engineering, and go directly for SS7 hacks.

                        3. 1

                          I don’t remember the details, but there is a specific carrier (T-Mobile, I think?) that is extremely susceptible to SMS interception, and it’s people on their network that have been getting targeted for attacks like this.

                          1. 4

                            Your mobile phone number can relatively easily be stolen (more specifically: ported out to another network by an attacker). This happened to me on T-Mobile, but I believe it is possible on other networks too. In my case my phone number was used to set up Zelle and transfer money out of my bank account.

                            This article actually provides more detail on the method attackers have used to port your number: https://motherboard.vice.com/en_us/article/vbqax3/hackers-sim-swapping-steal-phone-numbers-instagram-bitcoin

                            1. 1

                              T-Mobile sent a text message blast to all customers many months ago urging users to set up a security code on their account to prevent this. Did you do it?

                              Feb 1, 2018: “T-Mobile Alert: We have identified an industry-wide phone number port out scam and encourage you to add account security. Learn more: t-mo.co/secure”

                              1. 1

                                Yeah, I did, after recovering my number. Sadly, this action was taken in response to me and others having already been attacked :)

                        1. 10

                          Well, that doesn’t sound good:

                          Spectre attacks require some form of local code execution on the target system. Hence, systems where an attacker cannot run any code at all were, until now, thought to be safe. In this paper, we present NetSpectre, a generic remote Spectre variant 1 attack. For this purpose, we demonstrate the first access-driven remote Evict+Reload cache attack over network, leaking 15 bits per hour. Beyond retrofitting existing attacks to a network scenario, we also demonstrate the first Spectre attack which does not use a cache covert channel. Instead, we present a novel high-performance AVX-based covert channel that we use in our cache-free Spectre attack. We show that in particular remote Spectre attacks perform significantly better with the AVX-based covert channel, leaking 60 bits per hour from the target system. We verified that our NetSpectre attacks work in local-area networks as well as between virtual machines in the Google cloud.

                          1. 14

                            I disagree, because that will only lead to a morass of incompatible software. You refuse for your software to be run by law enforcement, he refuses for his software to be run by drug dealers, I refuse for my software to be run by Yankees — where does it all end?

                            It’s a profoundly illiberal attitude, and the end result will be that everyone would have to build his own software stack from scratch.

                            1. 5

                              Previous discussions on reddit (8 years ago) and HN (one year ago).

                              1. 4

                                “It’s a great way to make sure proprietary software is always well funded and had congress/parliament in their corner.” (TaylorSpokeApe)

                              2. 1

                                I don’t buy the slippery slope argument. There are published codes of ethics for professional software people, e.g. from the BCS or ACM, that could serve as templates for what constitutes ethical use of software.

                                But by all means, if you want to give stuff to the drug dealing Yankee cop when someone else refuses to, please do so.

                                1. 9

                                  Using one of those codes would be one angle to go for ethical consensus, but precisely because they’re attempts at ethical consensus in fairly broad populations, they mostly don’t do what many of the people wanting restrictions on types of usage would want. One of the more common desires for field-of-usage restriction is, basically, “ban the US/UK military from using my stuff”. But the ACM/BCS ethics codes, and perhaps even more their bodies’ enforcement practices, are pretty much designed so that US/UK military / DARPA / CDE activity doesn’t violate them, since it would be impossible to get broad enough consensus to pass an ACM code of ethics that banned DARPA activity (which funds many ACM members’ work).

                                  It seems even worse if you want an international software license. Even given the ACM or BCS text as written, you would get completely different answers about what violates it or doesn’t, if you went to five different countries with different cultures and legal traditions. The ACM code, at least, has a specific enforcement mechanism defined, which includes mainly US-based people. Is that a viable basis for a worldwide license, Americans deciding on ethics for everyone else? Or do you take the text excluding the enforcement mechanism, and let each country decide what things violate the text as written or not? Then you get very different answers in different places. Do we need some kind of international ethics court under UN auspices instead, to come up with a global verdict?

                                  1. -10

                                    I had a thought to write software so stupid no government would use it but then I remembered linux exists

                                  2. 4

                                    It’s not a slippery slope. The example in the OP link would make the software incompatible with just about everything other than stuff of the same license or proprietary software. An MIT project would be unable to use any of the code from a project with such a rule.

                                1. 8

                                  It’s not common in software, but violating it has become pretty normalized in the licensing of other types of material, through the Creative Commons set of licenses. Without arguing which is better or worse, it seems to be mostly for historical reasons that one set of norms (the FSF’s) prevailed in software, while another (CC’s) prevailed in other kinds of material.

                                  In software, the big debate in the “free” world seems to be between GPL-style copyleft licenses and MIT/BSD-style permissive licenses. Plus a side debate over what you might call “ultra-copyleft” licenses (AGPL-style). But everyone agrees that usage restrictions violating Freedom Zero put you outside the Free World.

                                  In music/images/etc., though, you have quite common usage of four Creative Commons licenses, all of which seem to have cultural acceptance in that scene as “free”, but two of which violate Freedom Zero. The four licenses are: CC-BY, CC-BY-SA, CC-BY-NC, and CC-BY-NC-SA. The basic one, CC-BY, is roughly an MIT/BSD-style permissive license. Then you can mix and match either or both of the two common restrictions: “sharealike” (SA), which is roughly GPL-style copyleft, and “noncommercial” (NC), which means you can’t make money off the content, and obviously violates Freedom Zero.

                                  Once you go in that direction, though, one might wonder, is the commercial/noncommercial split the only or best way of abrogating Freedom Zero? There was an essay published in 2007 that I found interesting, Copyfarleft and Copyjustright, which argues that CC’s commercial/noncommercial split is trying to pull off a “copyjustright” approach that mainly aims to please certain types of commercial content creators by balancing amateur remix freedom with exclusive right of commercial exploitation. It argues that this balance might not be the right one if your idea of freedom is more far-left and less about promoting entrepreneurship. (I think this critique of C/NC as the right line to draw is basically right, although I’m less sure whether the copyfarleft proposal itself is workable.)

                                  1. 2

                                    boats’s personal barricade

                                    Was the extra “s” on purpose?

                                    1. 11

                                      It’s one of the styles of English possessive for singular words that end in an ‘s’. When making a plural word that ends in ‘s’ into a possessive, all authorities agree that you just add an apostrophe (“the employees’ salaries”). But when it’s a singular word that happens to end in an ‘s’, some styles prefer that you treat it the same way as any other singular word and add apostrophe-s (“Alger Hiss’s trial”), while others prefer that you treat it the same way as plural words ending in ‘s’, and add just the apostrophe (“Alger Hiss’ trial”). Both styles have been pretty common for a few centuries now, I think. I tend to use the apostrophe-s style because it’s how I would speak (I’d say “hiss-es trial”, or in this case, “boats-es personal barricade”, to indicate the possessive). I guess this one is extra-weird because the person’s handle, boats, is a plural English word, but adopted as a handle for a single individual.

                                      1. 5

                                        Nice reply! Short of citing sources for your work, that’s about as good as it gets.

                                        1. 4

                                          I’ll add a citation in honor of @mjn’s fine reply. Wikipedia (Wikisource) has the rule from the original Strunk & White text - Strunk and White is one of the better (and readable) style guides that most people should use for the English language.

                                          1. 3

                                            Strunk and White is one of the better (and readable) style guides that most people should use for the English language.

                                            It really depends who you ask. See, for example, the paper linked in https://www.washingtonpost.com/news/volokh-conspiracy/wp/2015/04/21/against-strunk-whites-the-elements-of-style/.

                                            1. 3

                                              Agreed. If you are at the point where you disagree based on an actual reason, like in the linked rebuttal, or are even aware of other style guides, then weigh the pros and cons appropriately. If your discipline/profession/place of work doesn’t have one and you aren’t being supervised by a professor, this is a pretty good default.

                                              I actually hesitated at wording it as a rule and would have preferred guideline, but my link had it titled as a rule, so take things with a grain of salt.

                                            2. 3

                                              In practice, I would guess most authors do something simpler than S&W and just stick to either the apostrophe-only or the apostrophe-s form, though I have no data on that. Seems a bit fiddly to recommend apostrophe-s almost always, but then carve out an exception for “ancient proper names ending in -es and -is”, a second exception specifically for Jesus, and a third one for traditional punctuation of phrases like “for righteousness’ sake”. I could imagine that working as a publication’s house style that their copyeditors enforce, but I would be surprised to find it much in the wild.

                                      1. 15

                                        Interesting to see so many people with unusual keyboards. I’ll be the dissenting voice: I use the keyboard on my 2013 MacBook Pro and I’m happy with it. Before that, I used a cheap Logitech keyboard with my Linux machine, and was happy with it too. As long as I don’t type in weird positions (like sitting on a couch with a laptop) and keep my wrists more or less straight, I don’t have issues with RSI or anything.

                                        1. 5

                                          I, too, use my MacBook keyboard, and it’s been my primary way of interacting with computers for about 7 years now (prior to that, I had a desktop with a Microsoft Natural). The main thing I’ve gotten used to that I’ve found I now have trouble adjusting away from is the way the MacBook combines the keyboard with a trackpad just below it. It feels much easier on my right arm/wrist/elbow to move between kb and trackpad in this setup, versus in my previous desktop setup, where the mouse was on my desk off to the right side of the keyboard, and I had to move my arm back and forth between the two.

                                          I’m sometimes tempted to go full-on keyboard-only, but as long as I’m using a mouse regularly, the trackpad-below-kb combo feels more comfortable to me than having separate devices. Wanting to stick with that is also the reason I don’t use a docking station for my laptop even when in the office.

                                          1. 2

                                            I agree, the trackpad is so convenient that I haven’t bothered learning a lot of keyboard substitutes when I moved to a Mac. That’s why, despite my seeming indifference to input devices, I’m reluctant to move to the new MBP - the current combination works really well, and I’m not sure that’s carried over to the new model.

                                            1. 1

                                              I haven’t found any touchpad issues moving from the 2013 Macbook Air to the 2016 Macbook Pro. I don’t think there’s any particular advantage to the bigger touchpad, but it hasn’t got in my way at all. I don’t think I’ve ever had a problem of accidentally triggering mouse input with my palm, or anything like that.

                                              The one thing that would concern me with the new laptops is the stuck/broken keys issue, which seems widespread (I’ve had keys get stuck, and unstick themselves), but now that the replacement program is in place I’m much less concerned about it.

                                              1. 1

                                                That’s good to know. I’m also concerned about the new keyboards - even on the 2013 MBP I’m on my second keyboard. I don’t think the replacement program covers the 2018 MBP, does it?

                                                1. 1

                                                  I assume the 2018 MBP isn’t covered by the replacement program, since the machine’s only just been launched. It would be a little weird to launch a new product and a replacement program for an anticipated failure at the same time :)

                                                  Anyway, I’m not a close follower of Apple, but I think they tend to take their time before instituting replacement programs. People were complaining about the keyboards jamming up on the MBP for quite a while before Apple acknowledged it as a real issue.

                                                  And, though Apple’s denying it, it seems really unlikely to me that the changes to the 2018 design weren’t at least partly in response to the issues that have come up.

                                        1. 2

                                          I recently ran across this article, which was written back when Hipstamatic and Instagram were new. I think it might even be more interesting to read now. In 2011 it wouldn’t have been clear to me whether this was over-interpreting some minor app fad, but in 2018 it’s clear this style of app-produced vintage photography filters has real staying power.

                                          1. 4

                                            Back from three weeks on vacation in Scotland, so I guess probably spending the early part of the week figuring out what’s going on and picking up some threads of work again. The vacation was nice.

                                            1. 1

                                              I’m always looking for a cross-compiling system for building macOS executables from Linux, either as a single static executable, or as a self-contained relocatable bundle of (interpreter + libraries + user code entrypoint), because getting legal Mac build workers is such a pain.

                                              The best toolkit I’ve found, by far, is Go, where you just run GOOS=darwin go build .... There are a variety of more-or-less hacky solutions in the JavaScript ecosystem, and a few projects for Python, but for Ruby this area is sorely lacking.

                                              I mention this because while XAR looks like an awesome way to distribute software bundles, I still need to figure out a way to do nice cross-compiles if I’m going to use it to realistically target both macOS and Linux.

                                              1. 3

                                                Tell me about it. I’ve tried cross compiling Rust from Linux to OSX and it was just a saga of hurt from start to finish.

                                                For Go, did you need to jump through the hoops of downloading an out-of-date Xcode image, extracting the appropriate files and compiling a cross-linker? Or is that mysteriously handled for you by the Go distribution itself?

                                                1. 2

                                                  You literally just run GOOS=<your target os> GOARCH=<your target architecture> go build. No setup needed. Here’s the vars go build inspects.

                                                  It’s frustrating trying to do the same in other compiled languages, and interpreted languages with native modules are even worse.

                                                  1. 1

                                                    Go basically DIYs the whole toolchain and directly produces binaries. That has pros and cons, but means it can cross-compile without needing any third-party stuff like the Xcode images. For example it does its own linking, so it doesn’t need the Xcode / LLVM linker to be installed for cross-compilation to Mac.

                                                  2. 1

                                                    AFAICT, XAR still doesn’t include the Python interpreter, so it’s not completely independent?

                                                    1. 1

                                                      No reason you can’t put a whole virtualenv, python interpreter and all, into your XAR. XAR can pack anything.

                                                      You still need a tool to prepare that virtualenv so that you can pack it, and that’s the sort of tool I struggle to find - cross-compiling a venv, or equivalent in other languages.

                                                      1. 1

                                                        I think most OSS work uses the Mac builders on Travis CI for building mac binaries.

                                                        1. 1

                                                          Yes, exactly. I am less interested in different formats and more in a tool to create them. The ease of doing that with Go is the target.

                                                          1. 1

                                                            The ease of doing that with Go is the target.

                                                            By this you mean, you’re looking for a solution for Python packaging that makes it as easy as Go to distribute universally?

                                                            I used this once before to take some code I wrote for Linux (simple cli with some libraries - click, fabric, etc.) and release it for Windows: http://www.py2exe.org/index.cgi/Tutorial

                                                            The Windows users on my team used the .exe file and it actually worked. It was a while back but I remember that it was straightforward.

                                                    1. 1

                                                      I especially prefer OSM for pedestrian route-finding. I use it on Android via Maps.me, but there are various other apps too. Google Maps seems much more oriented towards road maps and often pedestrian-only paths will be missing, at least in the UK.

                                                      1. 1

                                                        What I haven’t found is a decent app for replacing the functionality of my old Garmin: showing maps and doing logging to a GPX file at the same time. Maps.me can show a track of recent movements but can’t save it.

                                                        1. 4

                                                          https://osmand.net/ might be something for you.

                                                          1.  

                                                            As an update to this, it turns out that on F-Droid there is a fork of Maps.me, called just Maps, that adds exactly this functionality. The problem with OsmAnd is the limited downloads.

                                                        1. 2

                                                          I thought this was going to be about PCC, but it appears to be something else? Parts of this PDF suffer from a very poor scan and are difficult to read. PCC is still under active development, although it is moving very slowly.

                                                          1. 3

                                                            This compiler by Alan Snyder is a different one, yeah, which predates the PCC by a few years, but didn’t really live on. Snyder’s compiler does seem to have influenced PCC, though. The 1978 report announcing PCC says:

                                                            A number of earlier attempts to make portable compilers are worth noting. While on CO-OP assignment to Bell Labs in 1973, Alan Snyder wrote a portable C compiler which was the basis of his Master’s Thesis at M.I.T. This compiler was very slow and complicated, and contained a number of rather serious implementation difficulties; nevertheless, a number of Snyder’s ideas appear in this work.

                                                          1. 4

                                                            Main work/life news for me is that I signed a contract to start next year (January 2019) as Assistant Professor of Computer Science at American University in Washington, DC. So I’ll be moving back to the US. Should be interesting and will definitely be a change of pace, though I’ll miss living in this beautiful seaside town in Cornwall.

                                                            In work-adjacent noodling, added a feed to my paper-reading log per a request. The structure of Atom is simple enough that it was pretty easy to DIY it using a Mustache template.
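
For the curious, a DIY Atom feed along those lines really can be quite small. The sketch below uses a hand-rolled {{key}} substitution standing in for a real Mustache library, and every field name, title, and ID in it is made up for illustration:

```python
# A minimal Atom feed rendered from a Mustache-style template.
# The {{key}} substitution is a toy regex replacement, not real Mustache;
# all field names and IDs below are hypothetical examples.
import re
from xml.etree import ElementTree

TEMPLATE = """<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>{{feed_title}}</title>
  <id>{{feed_id}}</id>
  <updated>{{updated}}</updated>
  <entry>
    <title>{{entry_title}}</title>
    <id>{{entry_id}}</id>
    <updated>{{updated}}</updated>
    <content type="text">{{content}}</content>
  </entry>
</feed>
"""

def render(template, context):
    """Replace each {{key}} with the corresponding context value."""
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: context[m.group(1)], template)

feed = render(TEMPLATE, {
    "feed_title": "Paper-reading log",
    "feed_id": "urn:example:paperlog",
    "updated": "2018-08-06T00:00:00Z",
    "entry_title": "Notes on a paper",
    "entry_id": "urn:example:paperlog:1",
    "content": "Short notes, ~250 words.",
})

# Sanity check: the rendered feed is well-formed XML.
ElementTree.fromstring(feed)
print(feed.splitlines()[1])  # prints: <feed xmlns="http://www.w3.org/2005/Atom">
```

A conforming Atom feed also wants author and link elements per RFC 4287, and text content should be XML-escaped; this sketch keeps only the minimum needed to show the templating idea.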

                                                            Besides that, working on something for the General Video Game AI competition. My goal isn’t necessarily to build an agent that performs well, but to better understand the structure of the space. Characterize the different types of challenges encountered by agents in these kinds of arcade games, understand how algorithm/compute/etc. choices relate to performance, and so on.

                                                            1. 2

                                                              Congratulations on the new job!

                                                            1. 1

                                                              Have there been any updates? This paper is 5 years old.

                                                              1. 3

                                                                Looks like still not really cleared up. Here’s a news article from 2017.

                                                              1. 2

                                                                Paper-reading log:

                                                                I’ve been more or less keeping up with my experiment in keeping a paper-reading log. Reasonably happy with it so far. The original ambition was a more public-facing “these papers are neat and here’s why I think so” blog that explicates important papers in an accessible style. But as will probably not surprise you, that turns out to be a significant undertaking for even one paper, let alone a regular series.

                                                                This version limits me informally to ~250 words per paper and a “just notes on this paper to myself” style, meaning the notes should be comprehensible to future me given the context of the title/abstract, but not necessarily to a general audience. That reduces the usefulness to others, of course, but there seems to be no real reason not to put it online, so I did so anyway. The win is that the constraint to be short and just-notes-to-me means it’s been easier to slip it in as a routine part of my paper-reading without it becoming a big writing project in itself. Though I still don’t write an entry for every paper I read; that’d be nice, but to avoid making it too tedious I limit it to papers I’ve read, liked, and want to be able to find again in the future.

                                                                Blogging:

                                                                In public-facing explanation mode, on the other hand, I did write a blog post giving an informal overview of some recent work we’ve been doing where I work on “rapid game jams”, which are 1-2 hour game jams using parametric game-design apps.

                                                                Exhibition setup:

Besides that, I’m helping to set up an art/games exhibition that we’re hosting. I am not very good at this kind of thing. Neither my comp. sci. bachelor’s nor my PhD gave me any useful training in how to drill into different kinds of walls and mount pieces on them, nor how to pack and unpack fragile things. It’s useful to learn about, though. Today I learned that it is very easy to ruin drill bits by drilling into cinder-block walls.

                                                                1. 11

Really not happy about this. I mean, I recognize that Mac gaming is a tiny sliver of the market anyway, but this will essentially kill the desktop Mac games market, although it will make ports from mobile much easier.

                                                                  But really, is that what we want? Thank god for Bootcamp I guess?

                                                                  1. 5

                                                                    OpenGL already seems to be an afterthought for most game developers. It’s basically only a Linux/Mac target in practice, no? For Windows, game devs usually target DirectX, and for PS4 and Xbox, there isn’t even OpenGL support. Games today sometimes even run better on Mac using DirectX under Wine compared to using the native macOS OpenGL, although that admittedly makes them less accessible to the average user.

                                                                    1. 2

But Wine’s DirectX implementation uses OpenGL as its backend.

                                                                    2. 1

I assume the major game engines like Unity, Unreal, and CryEngine will just emit a Metal target, like they do for DirectX and OpenGL currently? Also, isn’t Vulkan supposed to take over? It seems like OpenGL is just going to die off.

                                                                      1. 2

That’s a big assumption. Said engine makers would need to feel confident there’s enough ROI to justify spending man-hours and dollars that could more profitably go toward supporting the next next next gen Nvidia card or the PlayStation 20 :)

                                                                        1. 5

                                                                          Well, MoltenVK is a thing apparently, so I guess if Vulkan does take over, maybe it won’t be /too/ bad as a macos target?

EDIT: it also looks like (based on very quick searching) Unity and Unreal both support Metal as a target already. Since iOS also uses Metal, I assume they have a vested interest in supporting it on macOS too.

                                                                          1. 2

                                                                            You’re clearly way more knowledgeable in this space than I and yeah MoltenVK looks like a thing. Maybe it’s all for the good, I dunno :)

                                                                          2. 3

                                                                            They already do for iOS - Mac OS is trivial after that. It’s no problem for Unity or Epic. It does hurt the little guy with their own engine, however.

                                                                        2. 1

It’s not going to stop working; they’re just marking it as no longer a priority, something that may stop working in a future update. I can’t imagine anybody is going to be forced to update to whatever future version of macOS does not include OpenGL by default. If it’s that important to the industry, people other than Apple will pick up the implementation work. Most professional tools already support Metal and Vulkan, and it seems pretty clear to me that on all platforms, the trend away from OpenGL is going to continue. Vendors of various rendering and scenegraph libraries can work with their customers to determine what backends they need to support.

                                                                          1. 2

                                                                            Truthfully I’m kind of out of step with that end of things. I was just thinking in terms of all the open source I’ve seen through the years that wanted OpenGL on OSX.

Maybe all of it’s been ported to Vulkan or Metal? I dunno.

                                                                        1. 11

Here’s the origin of “premium mediocre” for anyone puzzled. As far as I can tell, it means “lower/middle class with pretensions to upper class”; a modern take on nouveau riche from the perspective of the petite bourgeoisie.

                                                                          1. 5

Said in the right tone of voice, I’ve noticed “middle class” itself can mean that in British English (in contrast to American English, where it has an almost exclusively positive, regular-everyday-folks connotation). It often connotes someone or something that is trying to seem upper-class/posh but isn’t really.