1. 4

    Honestly, it’s better than it looks.

    The dmenu concept with volume keys actually makes huge sense for me, as it resembles the side “jogwheel” on Sony CMD feature phones and some Nokia phones too. It was very easy to navigate, and extending that menu system (maybe rofi could be used) is what I’m looking for.
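    That jogwheel-style flow is easy to prototype, since a dmenu-style picker just reads entries on stdin and prints the selection on stdout; the whole menu is a pipe plus a case statement. A minimal sketch (the entries and actions here are made up for illustration; rofi works the same way via `rofi -dmenu`):

    ```shell
    #!/bin/sh
    # Pipe menu entries into a dmenu-style picker and dispatch on the choice.
    # Any filter that reads options on stdin and prints the selection works,
    # e.g. MENU="rofi -dmenu" or plain dmenu (the default below).
    phone_menu() {
        choice=$(printf '%s\n' Call Messages Contacts Settings | ${MENU:-dmenu})
        case "$choice" in
            Call)     echo "launching dialer" ;;
            Messages) echo "launching sms app" ;;
            Contacts) echo "launching contacts" ;;
            Settings) echo "launching settings" ;;
            *)        echo "cancelled" ;;
        esac
    }
    ```

    Adding a new item is just one more line in the printf and one more case branch, which is exactly the kind of extensibility those jogwheel menus had.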

    Other aspects are still great too, but there might be some UX improvements here and there, like not popping up mpv in verbose mode on each YouTube playback. And these gestures look like they’ve been pulled straight out of a Naruto anime.

    On top of that, getting text/call support working on the modem is impressive!

    1. 0


      How to make your system significantly slower one more time, without any visible gain for the user apart from a virtual “security” concept, as if Electron apps and the web-ization of the desktop experience weren’t enough for that.

      1. 9

        more like “The shoe has dropped on x86’s architectural compromises and mistakes; the free ride for performance at the expense of security has ended”

        1. 2

          At this particular point in the history of mankind, regular personal computer users don’t care about their own machine’s security at all. Instead, they get angry at yet another patch which “slows down” their machine. And while that slowdown was mythical and hard to prove in the past, it’s real now, and it quite “validates” their standpoint of not updating, or even rolling back anything they use to an older version because it runs faster.

          So we get the opposite effect with each exposed vulnerability.

          1. 4

            At this particular point in the history of mankind, regular personal computer users don’t care about their own machine’s security at all.

            I think we live in different realities. I still remember the Blaster/CodeRED worms and how they took off. And the public panic. That’s why the computing scene has changed to become increasingly locked down and less open over my lifetime. Things my father would tell me about early computing seem unthinkable now, yet they were commonplace once upon a time.

            Instead, they get angry at yet another patch which “slows down” their machine.

            They already do that, even to systems that have not had a change. Perceptions are shaped by emotions and when emotions run hot, people shape their perceptions accordingly.

            The only people who are really going to “feel” this additional security “fix” (more like papering over) are kernel-land people and those who depend on context-switch overhead not being immense, like web services (where I work). If you can ensure you’re on dedicated hardware, you don’t need these fixes, as you probably have a relatively locked-down environment. However, multitenant hardware, like how AWS does business, means that if it is possible to escape Xen and touch data that isn’t yours, it’s catastrophic. An enterprise company can buy more nodes. They can’t buy back leaked secrets.

            The fundamental vulnerabilities come from the speed race of the late ’90s and the processor wars favoring any and every dirty trick to beat the competition. Turns out that sacrificed more than integrity; it sacrificed security, in a world where security has become ever more paramount. And now we’re addicted to both fast speeds and security, a difficult and expensive combination.

      1. 1

        Unless it manages to do puppeting on both sides of the protocol, it’s quite useless.

        1. 14

          In case you dislike reading twitter threads (as I do), here’s a slightly better version that was linked to at the end.

          1. 10

            It should be done automatically for every Twitter link posted on Lobsters

          1. 2

            If you’re a developer in 2020, you could learn a lot from this course. You can learn a few principles (that I made up myself) that seem to be prevalent in the world of COBOL:

            I think the author is saying that these principles should be followed in all/most projects, though perhaps they are merely saying that these are principles you could learn. I think that most projects will actually not benefit from these principles.

            • Preserve your resources - memory, disk space, and CPU cycles are not free. Conserve them as much as possible with what you build.

            Why? Spending time on performance is by definition unnecessary unless there’s a problem with performance. And most of the time there isn’t. You don’t even need to get the algorithmic complexity right most of the time. Performance optimisation is fun, and in some cases it’s crucial, but why should it be mandatory for all or most projects?

            • Be explicit in everything you do - Take the time to figure out what you need and then declare it. Think arrays and structs instead of lists and generics. Performance comes from knowing exactly what you need and when.

            I think being explicit is great, but I don’t think lists or generics particularly hurt that.

            • Write code as if it will live for decades - For COBOL programmers, it’s true. Think ahead and act as if your code will live on for years. How should you write it so it can be maintained further down the line?

            Most code won’t live for decades and (in the only study I’ve seen on this, which looked at the git histories of some large projects; I can’t find the citation) will only be edited by a fairly small number of people for a short amount of time, even within long-lasting projects.

            I like writing elegant, maintainable code, but it’s definitely unclear that it’s actually objectively a worthwhile endeavour. I wouldn’t prescribe to others that they must or should write nicer code. In fact, many of my scientific peers write little scripts that are essentially write-once, read-never because it’s the outputs that are interesting. That’s totally fine when you’re exploring an idea (though of course the reproducibility code for the final write-up should be made understandable to others).

            • Avoid breaking changes - Modern developers love reinventing the wheel. We also like introducing breaking changes and telling people to upgrade if they don’t like it. COBOL takes the approach to only break things when there is no other option. We need a little more of that in 2020.

            This is another unjustified value judgement. Taking backwards compatibility seriously can be a huge amount of work that, empirically, you just don’t need to do to make useful software. I don’t think it’s true modern programmers love reinventing wheels more than the programmers of yesteryear, either. If anything, I imagine it was more common to reinvent stuff when package management was harder and open source software hadn’t yet taken off.

            1. 1

              Not caring about performance is one of those decisions that is locally optimal for a single developer or a small team but globally sub-optimal for the rest of society. In a very real way it’s a selfish attitude.

              1. 2

                That’s just not true and you should think about it more carefully.

                There’s loads of code that’s just never on the critical path or that is not run often enough to justify the expense of performance optimisation.

                The value judgement is also unsupportable. Why should I produce efficient code when that does not have value for me? That may cost me time and happiness and make my code uglier. And, going to a utilitarian perspective, reducing the number of watts my code uses is far from the most impactful political action I can make.

                If someone depends on my code and needs different performance properties they can improve it or ask me nicely. It’s entitlement to assume that others should sacrifice for your benefit.

                There are obviously ways that we can move so that we can have generally higher performance code without incurring costs, and I support those (and use Julia, a high performance language), but that’s different from your statement.

                1. 2

                  One word: multitasking environments.

                  If you have – for example – a desktop machine with some amount of resources, you don’t want that shiny Electron app to hog all of them just to write text to a file or process some data which doesn’t need that much memory and those cycles, right?

                  As a developer, you should understand that you are not alone on the target machine, even in server environments, so you should always try to limit your footprint as much as possible to let other processes use the remaining resources, resulting in increased capacity of the machine.

                  That’s a bit like savoir-vivre, or just good manners, in software development, right?

                  1. 1

                    The existence of situations where writing higher performance code is useful does not imply that it is appropriate to always “conserve [resources] as much as possible with what you build”. This should be obvious.

                    1. 2

                      I am struggling to come up with a situation where code is not running in a multitasking environment. Well, I can, but they all have even more constraints on CPU/memory/IO than code running in a multitasking environment. The chance that code you or I write won’t have a societally good reason to conserve resources is vanishingly small.

                      1. 1

                        If costs are equal, then it’s obviously better to use fewer resources. But often it is expensive to make code more efficient. In any kind of rational assessment you must assess both the benefits and costs of an action.

                        1. 1

                          I think we are actually perhaps in violent agreement here. I was making the case for caring about performance at the whole-program level. After re-reading your responses, it sounds like you are focusing more on the function or line-of-code level. I do think that if your code is not on the critical path and is nearly never run, then it is by definition not a performance concern.

                          I agree that performance should be considered holistically in context. The point I was trying to make was that the holistic context is the entire machine the code is running on. If your program hogs resources because the developer decided that the performance of the app was “good enough” that is when I think the decision was selfish.

                          1. 1

                            My point is that software is used in a huge variety of situations and often performance is genuinely pretty unimportant. So my comments aren’t just about individual functions but about whole libraries and systems.

                            Lots of programming contexts do not care about run time or memory use particularly. E.g. lots of scientific programming, batch business logic programs, etc. Yes, this code is sometimes or often run on hardware with UIs that it slows down, but that rarely matters much.

                            It would be nice if this code ran faster, but it’s not very important. There will usually be more valuable things to do (and often the authors don’t have the specialised skills to make it perform better, anyway).

                  2. 2

                    Yesterday I was working on my top-of-the-line laptop and various different applications were being slow. There was no good reason for them to be slow or for so much of my CPU to be consumed, but the combined number of my applications all contributed to the death by a thousand cuts of my machine’s performance. I wasn’t compiling anything, I wasn’t watching a video, I was just browsing the internet. I don’t even browse a lot of heavy sites on the internet. I’m working on the equivalent of a supercomputer and I still can’t get a high-performing experience.

                    I’m not even talking just about the number of watts your code might cause my computer to consume. I’m talking purely about how any code you produce for others to use has an impact on the entire ecosystem of code I am running.

                    It’s very much in the category of globally sub-optimal for society even if it’s locally optimal for you. I’m not even making a value judgement. It very much is in your interest sometimes to sacrifice cpu/memory/io in your application in the name of shipping sooner or solving the right problem at the right time. But it doesn’t change the fact that I have a million cuts to performance sitting on my laptop right now and sometimes my computer suffers as a result. The blame doesn’t lie with any single app. It lies with all of them and the various ways they all made similarly locally optimal decisions. In a way it’s a collective action problem. It’s only worth it to an individual or company if all the individuals or companies do the same thing since changing your ways when everyone else won’t has no effect and in fact your code may still be perceived to be slow simply because it’s in the same ecosystem as everyone else.

                    1. 1

                      None of that addresses this:

                      There’s loads of code that’s just never on the critical path or that is not run often enough to justify the expense of performance optimisation.

                      Yes, lots of modern UIs are high-latency memory-hogs, but that doesn’t invalidate my argument at all.

              1. 2

                I’m gonna repeat that again and again – COBOL does not deserve the criticism it gets all the time.

                I’ve looked into its specs/docs and some random code… If you hadn’t told me it was made in the ’60s, I might have guessed the ’90s or ’00s as well. Only the line markers expose its nature, and I’m quite sure they’re not relevant in modern COBOL variants (yes, the language seems to be constantly upgraded).

                It’s weird, yes. It’s like a completely alternative universe of IT, for sure. But it seems… done right? I mean, someone sat down around the table and designed the language from top to bottom to do one job – Big Enterprise Processing at Scale. And it was already working well in the 1960s!

                It’s readable; you can let your accountant sit down at a 3270 terminal (or its emulator, or even just a text editor) and there’s a large chance he/she’ll understand what’s happening there with very little background in IT theory. And in some cases, they might understand the process better than the average JavaScript programmer sitting at Starbucks with his shiny new MacBook.

                So, it’s not a thing you’d run on your first-line web backend or do a mobile app in. I’m pretty sure both can be done in COBOL, though. At least that web framework thingy was done (and its author is now the leader of an extremist left-wing political party in Poland, believing in communism; how weird). But you’re not gonna do anything like that seriously… okay?

                On the other hand, if I had a huge list of customers (in the 100,000s or even millions) who want to buy something from me or even just store/access their info, I would actually look into the stack of COBOL+DB/2+CICS instead of NodeJS+NoSQL+Redis+K8S and millions of tiny containers, even if I know only a little bit of it now. At least I would be sure my business is “safe” for the next 20–30 years with its code.

                1. 2

                  During Y2K, I heard some Fortran programmers came out of retirement. The author is saying COBOL, this time around. Which programmers will be called out of retirement a few decades from now?

                  1. 13

                    Perl; there is a lot of it out there, but the pool of people willing to work on it is shrinking.

                    1. 3

                      Having written quite a bit of Perl and knowing some of the systems that are written in it, I think that it will be deprecated or replaced before I reach retirement because of changes in the business environment, upgrades to the underlying products, changes in process, or just after doing a risk analysis. That some of that stuff is already from the late-90’s and early 2000’s and still going is impressive (or disappointing, depending on your POV).

                      1. 2

                        Yeah, this. There was a lot of Perl written for an eight year or so stretch, and a lot of that is in places we wouldn’t expect.

                        1. 2

                          There will always be money for people who can suspend their sense of smell!

                          1. 9

                            I’ll take clean Perl code over dirty Java or C# any day of the week.

                            1. 2

                              Sure, but how many companies trying to lure Perl programmers out of retirement are going to be in possession of a clean codebase?

                              1. 1

                                I think if you’re luring programmers out of retirement to work on something, the odds of it being clean are fairly minimal, regardless of language.

                                1. 2

                                  Absolutely. That’s what I meant. I didn’t mean to imply anything specific about Perl.

                        2. 6

                          Ruby, I think

                          1. 1

                            Really? I would think that there’s a lot more legacy business-logic Java code in production than Ruby. Of course, I don’t have any hard data to back up my intuition.

                            1. 4

                              Yeah, but I see Java sticking around, like C or C++ have. Ruby is more niche than Java, but still gets used by a lot of companies. There is a possible future where it’s a legacy skill in 30 years.

                              Though, thinking about this more, I suspect VBA, FOXPRO or MUMPS are probably better candidates.

                              (And yes, I see the irony of posting this on a forum powered by Rails)

                              1. 2

                                TBH, the best thing to “revive” Ruby (which IMO has already been done in Crystal, but never mind) is to actually kill Rails.

                                It’s the complete opposite of what Ruby is; it denies its values and throws many of its nice aspects out of the window. And then most people who don’t know Ruby at all look at it through the RoR lens, thinking it’s a huge mess.

                                And, in the meantime, Sinatra sits there like nothing ever happened.

                                1. 2

                                  Sinatra at least did influence Go and Node.js/Express a bit.

                                  Doesn’t Crystal have exponentially growing compile times due to all of the type inference? At least, I had heard that can be a problem.

                                  I don’t see Ruby truly dying any time soon, just becoming progressively more niche over time, kinda like Perl has.

                              2. 3

                                I think the difference is that Java will be alive and well in 20 years … but Ruby? Not sure.

                                • Programming for non-programmers is now largely done in Python (scientists) or Go (infrastructure).
                                • Other languages closed the gap in terms of productivity in the web space, also NodeJS exists.
                                • Learning to program now often happens in JavaScript, thanks to browsers shipping it.
                            2. 3

                              Perl maybe?

                              1. 1

                                I see Perl getting there in the next 10 years, not the next 20-40, if it is going to get there at all.

                            1. 1

                              I dunno why, but I expected it to have a whole Simpsons episode stored as a CSS animation.

                              1. 7

                                It should be “Stop Making Students Use Java” first.

                                Not only due to my personal criticism against Java, but mostly because it limits you to the JVM instead of teaching students about the actual systems and platforms.

                                1. 6

                                  I agree, but for somewhat different reasons – and I’ll rephrase that as “Stop making students use java first” :)

                                  From the first section of the article:

                                  A student who has not written an if statement doesn’t need to understand the philosophy behind putting each public class in its own file, or what public or “class” even means

                                  I completely agree. In that case, perhaps a language that forces you to start everything with “class” and follow it with “public static void main” is not the best choice?

                                  If you want to teach someone to write if statements and loops, provide a tool that allows them to do just that – eliminate the distractions rather than hiding them. Just like the article mentions later on:

                                  Use a language that teaches the fundamentals of the paradigm you’re interested in, like Scheme or Python. (Please, please not Java.)

                                  IDEs are not the main problem here. Java is. But an overly helpful IDE can also bring more harm than good.

                                  When I teach my students Python, I suggest simple IDEs like Thonny, because I don’t want to see them spending 60% of their time juggling windows around in order to see the results of what they wrote – only to realize that they forgot to save the file. At the same time, I discourage them from using something like PyCharm, at least for the first few days, since the constant “help” of autocompletion makes them almost brainless: if autocomplete suggests something, then it’s probably right! After all, I’m just a beginner! Sometimes I encounter self-taught beginner programmers who did start with PyCharm, and now they don’t really know Python – they just know how to write Python in that IDE. That’s not very helpful either, even if it seemingly makes them effective quickly.

                                  1. 3

                                    it limits you to the JVM instead of teaching students about the actual systems and platforms

                                    Conversely, it enables one to concentrate on the programming language without having to delve into the minutiae of each hosting OS.

                                    1. 1

                                      If someone provided a 100% pure Java-powered OS to the world, that would be true.

                                    2. 3

                                      but mostly because it limits you to the JVM instead of teaching students about the actual systems and platforms.

                                      Most programming activity does not require you to know about the actual systems and platforms. And no matter how much you know about it there’s still a layer underneath so you gotta stop somewhere.

                                      Java is still a terrible first lang tho.

                                      1. 1

                                        Most programming activity does not require you to know about the actual systems and platforms.

                                        It sure doesn’t! Don’t get me wrong, I’m not pushing people to making their own memory management and doing syscalls at their first days.

                                        But right after you get out of academic code (like, algorithms only, with strictly defined input and output), you might want to know how to do basic “operational” tasks. In most cases, reading/writing files comes first, and you probably should know where to write, why you can’t write there, what you need to do first, and why it doesn’t work the same way on a Mac (because you put some assumptions into the code). This is where “knowing the platform” comes in.

                                        That might sound silly to people around here, but there are quite a lot of (mostly young) people who want “to code” but barely know what even a file is (mobile platforms are quite good at hiding the file abstraction from the user; I’m afraid of that trend moving onto bigger machines) or what the difference between a process and a thread is (which isn’t obvious on mobile either).

                                    1. 4

                                      As someone already mentioned, Grafx2 is the tool for any sort of hand-clicked graphics, though I did some pixel art in GIMP too.

                                      But I’m quite sad about the way GIMP has been chasing Photoshop recently, with things like hiding 2/3 of the tools in unintuitive tool groups.

                                      1. 2

                                        All you need to do is create a small native wrapper that opens a new tab in the user’s existing browser and package it in a platform-specific installer.

                                        Has anyone created something like this? I’d love to be able to ship a JavaScript frontend that launches in a new instance of Firefox or Chrome (without most browser decorations).

                                        1. 2



                                          A very small library to build modern HTML5 desktop apps in Go. It uses Chrome browser as a UI layer. Unlike Electron it doesn’t bundle Chrome into the app package, but rather reuses the one that is already installed. Lorca establishes a connection to the browser window and allows calling Go code from the UI and manipulating UI from Go in a seamless manner.
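                                          The flag-level trick behind this (reuse an installed browser rather than bundle one) can be sketched in a few lines of shell: find a Chrome/Chromium binary and start it in app mode. This is a rough sketch, not Lorca itself; the candidate binary names and the `BROWSERS` override are assumptions for illustration:

                                          ```shell
                                          #!/bin/sh
                                          # Build a command line that opens a URL as a decoration-free window.
                                          # --app hides the tabs and omnibox; a private --user-data-dir makes the
                                          # browser start a fresh instance instead of a tab in the running one.
                                          # BROWSERS is an override knob; binary names vary per distro/OS.
                                          app_mode_cmd() {
                                              url=$1
                                              profile=$2
                                              for c in ${BROWSERS:-chromium chromium-browser google-chrome chrome}; do
                                                  if command -v "$c" >/dev/null 2>&1; then
                                                      echo "$c --app=$url --user-data-dir=$profile"
                                                      return 0
                                                  fi
                                              done
                                              echo "no Chrome/Chromium found on PATH" >&2
                                              return 1
                                          }
                                          ```

                                          Lorca layers the Go-to-JS bindings on top of this by speaking the DevTools protocol to that window; the wrapper alone already gets you the “native-looking shell without shipping Chromium” part.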

                                          1. 1

                                            I think Electron is pretty much what you’re describing, right?

                                            1. 3

                                              I’d like to not package in an entire Chromium installation!

                                              1. 1

                                                Ah, I misinterpreted what you meant by “a new instance” of the browser.

                                                I’m not aware of a way to get what you’re describing—it seems like you either need to package an entire browser yourself, or at least as much of it as you need, or else you need to rely on the user’s existing browser and don’t have a way to make it look as if your app is its own separate thing.

                                          1. 4

                                            offline web apps

                                            They could be web apps or offline apps. Not both at once.

                                            And I approve every measure to kill cancerous “webdev” in mobile apps. Just treat your customers right and deliver native applications, okay?

                                            1. 7

                                              It can be both at the same time. You get the best of both worlds, the amazing distribution capabilities of the web and the offline features of a native app. The fact that you don’t like them shouldn’t preclude people from shipping them and those who enjoy them, from using them.

                                              For example, my main machine is a Surface Pro X and I don’t get many native apps for my system but I can use PWAs on the desktop and they work great.

                                              Also you mention:

                                              I approve every measure to kill cancerous “webdev” in mobile apps

                                              Which means you’re unaware that this affects desktop as well; I thought you should know.

                                              I find your phrase:

                                              Just treat your customers right and deliver native applications, okay?

                                              Disingenuous, as there are many reasons to ship a web app; among them is the fact that to ship an iOS application you need at least a Macintosh computer and a paid membership in Apple’s developer program, while to ship a web app you need nothing: you can do it from a cybercafe using nothing but online services. The Web is empowering, democratic in that most people can post, and interoperable. These qualities are not usually present in native apps. Saying that web developers are cancerous is quite bad. The Web is the only mass-media communication channel available to mankind. It is how many of us are finding comfort and solace in this time of lockdowns and pandemic. It is how we’re learning new skills, finding new friends, recording our memories. Do not disqualify something that is larger than what your petty prejudices can reason about.

                                              1. 7

                                                Just treat your customers right and deliver native applications, okay?

                                                This may be preferable for your existing mainstream customers, but it also makes sure that you become one more reason why any iPhone/Google Android alternative is bound to fail.

                                                I hate web technologies as much as anybody else, but if it wasn’t for them I wouldn’t have Uber or Duolingo on my phone. If webapps is what it takes to have any chance at some mobile platform diversity, I’d happily waste some CPU cycles.

                                                1. 3

                                                  Are you talking on the desktop or on mobile?

                                                  On mobile, as a consumer, I really don’t want apps I need to install and that spy on me. I trust the web browser sandbox plus plugins so much more than I would ever trust app reviews. And app stores are closed ecosystems, and there are so few of them, and they are OS-specific – argh. Disgusting.

                                                  On the desktop, app sandboxes are premature. And often suffer the same problems as the app stores.

                                                1. 1

                                                  It’s very nice, but instead of directly executing the snippet, I would just print out the command with an explanation of the parameters as comments, to not let people forget they’re still in a CLI and that they shouldn’t rely on all this fancy stuff.

                                                  1. 3

                                                    With tools like this, what I’m looking for is exactly the “fancy stuff”. Or at least “stuff that lets me type less”. My hands already hurt bad enough. The command line is not a hair shirt. It’s ok to make it nicer!

                                                    At any rate I’ll probably give this a go. FZF for ctrl-r and jumping around directory history has already done wonders for my general experience of life in the shell.

                                                  1. 1

                                                    Not me.

                                                    The company didn’t lift its strict policy of only a single weekday for WFH. Understandable, as I couldn’t bring $30,000 hardware home (and it’s kinda bulky), but it sucks at times when I could do stuff remotely or using patched QEMU.

                                                    1. 1

                                                      Why? DI.FM lets you just download a PLS/M3U playlist which you can play as a set of streams in a regular media player, like cmus.

                                                      How’s that different?
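                                                      (For context: a PLS playlist really is trivial to consume programmatically. Here’s a minimal Python sketch of parsing one – the station names and URLs below are made up for illustration, not real DI.FM streams.)

                                                      ```python
                                                      # A PLS playlist is a simple INI-style file: a [playlist] section
                                                      # with FileN= (stream URL), TitleN= (name) and NumberOfEntries keys.
                                                      def parse_pls(text):
                                                          """Return a list of (title, url) pairs from PLS playlist text."""
                                                          files, titles = {}, {}
                                                          for line in text.splitlines():
                                                              line = line.strip()
                                                              if "=" not in line:
                                                                  continue  # skip [playlist], blank lines, comments
                                                              key, _, value = line.partition("=")
                                                              key = key.strip().lower()
                                                              if key.startswith("file"):
                                                                  files[key[4:]] = value.strip()
                                                              elif key.startswith("title"):
                                                                  titles[key[5:]] = value.strip()
                                                          return [(titles.get(n, ""), url)
                                                                  for n, url in sorted(files.items(), key=lambda kv: int(kv[0]))]

                                                      sample = """[playlist]
                                                      NumberOfEntries=2
                                                      File1=http://example.com/vocaltrance
                                                      Title1=Vocal Trance
                                                      File2=http://example.com/chillout
                                                      Title2=Chillout
                                                      """

                                                      for title, url in parse_pls(sample):
                                                          print(title, url)
                                                      ```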

                                                      1. 7

                                                        I spend a lot of time in the terminal and prefer not to leave it. I’m aware there are command line utilities capable of streaming di.fm streams from the command line, but they don’t have favorites and all stations as first-class concepts. They also require you to keep pls/m3u files around on your local machine if you want to try replicating those concepts with a directory structure. I prefer not to do that.

                                                        But most importantly, because I had some hours to spare and I wanted it :)

                                                        1. 2

                                                          I think a shell tool that logs into the DI API and writes the playlist – from your favorites or other criteria – to stdout or a file would be a better idea. Then you could still interact with DI directly while playing the streams in proven, well-tested media players instead of reinventing the wheel.

                                                      1. 4

                                                        I hope we’ll get some router/“edgebox”-style device based on it, replacing my APU4 with even better power usage and thermals.

                                                        1. 2

                                                          I didn’t know I needed this in my life.

                                                          1. 1

                                                            Thanks skrzyp :)

                                                          1. 2

                                                            If someone makes the same kind of thing for SimCity 2000/3000 and implements the industry logic from OpenTTD and the routing/economy of Cities: Skylines (I know there would be some simplification due to the isometric layout), I’ll throw a serious regular donation their way.

                                                            OpenSC doesn’t count - it was a “web application” and got DMCAed down for hardcoding original Maxis assets into it. I expect the game to be a standalone native executable using a free dedicated graphics set; I think the authors of OpenGFX - the base set for OpenTTD - wouldn’t complain at all about fair use in such an open SimCity game.

                                                            LinCity-NG was… quite nice. It hasn’t aged well though - the project seems unmaintained now, and the “we made a 3D model in Blender and dumped a bitmap out of it” 2D graphics were very cool back in the day but look like a student project now. On the feature side it was playable, but it never even reached feature parity with the original SC3000 - it was more like the classic 2D SC in the isometric view of 2K/3K, without the additions. But I spent a considerable amount of time in it regardless :)

                                                            TheoTown is what I’d point at currently, but it’s only for Android folks and AFAIK non-free. Still, the spirit is preserved: the graphics have their own style that won’t age much, and the author keeps adding improvements and additions over the standard SC formula without trashing it.

                                                            1. 1

                                                              Oh, that seems like pretty good work, albeit early stage. Shame.

                                                              If someone’s gonna make the same with SimCity 2000/3000 and implement the industry logic from OpenTTD and routing/economy of Cities:Skylines

                                                              I’ve been yearning to do that for years too :)

                                                              1. 1

                                                                but it’s only for Android guys

                                                                Apparently it’s available for Android, iOS, Windows, macOS and Linux (the last three via Steam). It’s not free, though, in any sense of the word.

                                                              1. 3

                                                                Not the author, but I thought this was fantastic. Not even sure how it’s possible.

                                                                1. 2

                                                                  He’s just doing very elaborate string parsing to get the ASCII art.

                                                                  1. 3

                                                                    Sure looks like it’s doing some sort of simple ray tracing, doesn’t it?

                                                                    1. 2

                                                                      Yeah, it looks like iters section is stepping along the rays (up to 15 times) and seeing if they intersect with the shape function, and lastIters is finding where each ray hits the shape. Everything before iters is there to set up the rays for each pixel, and the final steps are selecting a color. (I think.)
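                                                                      The technique described – stepping a ray toward a shape function until it hits or gives up – is basically ray marching. A toy Python sketch of the same idea (not the actual SQL; the sphere and all the constants here are my own illustration):

                                                                      ```python
                                                                      import math

                                                                      WIDTH, HEIGHT, MAX_ITERS = 40, 20, 15

                                                                      def shape(x, y, z):
                                                                          # Signed distance to a unit sphere at the origin:
                                                                          # negative inside, positive outside.
                                                                          return math.sqrt(x*x + y*y + z*z) - 1.0

                                                                      def trace(px, py):
                                                                          # Set up a ray for this pixel, pointing straight into the scene.
                                                                          x = (px / WIDTH) * 2 - 1
                                                                          y = (py / HEIGHT) * 2 - 1
                                                                          z = -3.0
                                                                          for i in range(MAX_ITERS):      # the "iters" loop: step along the ray
                                                                              d = shape(x, y, z)
                                                                              if d < 0.01:                # close enough: the ray hit the shape
                                                                                  return i                # like "lastIters": where the hit happened
                                                                              z += d                      # safe step forward by the distance
                                                                          return None                     # ray escaped: background

                                                                      def render():
                                                                          rows = []
                                                                          for py in range(HEIGHT):
                                                                              row = ""
                                                                              for px in range(WIDTH):
                                                                                  hit = trace(px, py)
                                                                                  # Select a "color" (character) by how quickly the ray hit.
                                                                                  row += " " if hit is None else ".:-=+*#%@"[min(hit, 8)]
                                                                              rows.append(row)
                                                                          return "\n".join(rows)

                                                                      print(render())
                                                                      ```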

                                                                  2. 1

                                                                    SQL is Turing complete, so it was obvious that it’s possible.

                                                                    1. 4

                                                                      Obvious to you, maybe. I’m still seriously impressed with this.

                                                                  1. 4

                                                                    Super nice video, but I expected he would talk about (or debunk) the legend of writing Crash in some custom Lisp generating highly efficient machine code and allocating sectors on the CD by hand to optimize loading.

                                                                    1. 2

                                                                      I think that was Jak and Daxter, not Crash Bandicoot.

                                                                        1. 3

                                                                          That post series is one of my favourite dev diaries/postmortems I’ve read. I always link it to people interested in old consoles or games.