Having owned a Framework since April of 2022, I cannot recommend them to people who need even basic durability in their devices. Since then, I have done two mainboard replacements, two top cover replacements, a hinge replacement, a battery replacement, several glue jobs after the captive screw hubs sheared from the plastic backing…
It’s just such an absurdly fragile device with incredibly poor thermals. They sacrificed a ton of desirable features to make the laptop repairable, but ultimately have released a set of devices that, when used in real-world settings, end with you repairing the device more often than not. And these repairs are often non-trivial.
I will personally be migrating to another machine. The Framework 12’s focus on durability may be trending in the right direction, but to regain trust, I’d need to see things like drop and wear tests. A laptop that can be repaired, but needs constant upkeep/incredibly delicate handling, is ultimately not an actual consumer device, but a hobbyist device.
Maybe they’ll get better in a few years. Maybe the Framework 12 will be better. Their new focus on AI, the soldered RAM in the desktop offering, and the failure to address the flimsy plastic chassis innards, among other things, mean that they have a long way to go.
It’s definitely a “be part of the community that helps solve our product problems” sort of feeling.
I have an AMD FW13, and was trying to figure out why it loses 50+% of its battery charge overnight when I close the lid, because I don’t use this computer every single day and don’t want to have to remember to charge yet another device.
So I check the basics: I’m running their officially supported Linux distro, BIOS is current, etc. And half an hour into reading forum threads about diagnosing sleep power draw, I realize that this is not how I want to spend my time on this planet. I love that they’re trying to build repairable/upgradeable devices, but that goal doesn’t matter so much if people end up ditching your products for another option because they’re just tired of trying to fix them.
I’ll chime in with the opposite experience - I’ve owned an AMD Framework 13 since it came out, and had no durability issues with it whatsoever, and it’s been one of my 2 favorite computers I’ve ever owned. I’ve done one main board replacement that saved my butt after a bottle of gin fell over on top of it in transport.
Development and light gaming (on Linux, I very much appreciate their Linux support) have been great, and the repairability gives me peace of mind and an upgrade path, and has already saved me quite a bit of money.
I’ve owned a Framework since Batch 1. Durability has not been a problem for me. My original screen has a small chip in it from when I put it in a bag with something that had sharp edges and pressed against the screen for a whole flight. The chip is slowly growing. Otherwise, it’s been solid.
Same. I have a Batch 1. There are quirks, which I expected, knowing I was supporting a startup with little experience. I have since upgraded and put my old board into a Cooler Master case. This is so amazing, and exactly what I cared about. I am still super happy with having bought the Framework, and particularly for tinkerers and people who have a use for their old mainboards it’s amazing.
I get harbouring resentment for a company you felt sold you a bad product. But at the same time, you bought a laptop from a very inexperienced company which was brand new at making laptops, a pretty difficult product category to get right when you’re not just re-branding someone else’s white-label hardware.
3 years have passed since then. If I were in the market for a category Framework competes in these days, I would be inclined to look at more recent reviews and customer testimonials. I don’t think flaws in that 3-year-old hardware are that relevant anymore. Not because 3 years is a particularly long time in the computer hardware business, but because it’s a really long time relative to the short life of this particular company.
I would agree that 3 years is enough time for a company to use their production lessons to improve their product. But nothing has changed in the Framework 13.
I don’t resent Framework. I think that’s putting words in my mouth. I just cannot, in good faith, recommend their products to people who need even a semi-durable machine. That’s just fact.
a very inexperienced company which was brand new at making laptops
Founded by people who had experience designing laptops already, and manufactured by a company that manufactures many laptops. Poor explanations for the problems, IMO.
I’ve had a 12th gen Intel since Sept 2022 (running NixOS, btw) and I have not had any issues, though I will admit it sits in one place 99% of the time. I might order the replacement hinge since mine is a bit floppy, but it’s not too big a deal.
As for the event, I was hoping for a mini PC using the 395 and I got my wish. It’s a bit pricey and not small enough for where I want to put it, and I have no plans for AI work, so it’s probably not the right machine for me.
I was originally interested in the HP machine coming with the same CPU (which should be small enough to fit), but I’ve been pricing an AMD 9950 and it comes out cheaper. I was also disappointed there wasn’t a SKU with the 385 Max w/64GB of RAM, which I might have ordered to keep the cost down.
For reference, the new machine is intended to replace a 10-year-old Devil’s Canyon system.
I’ve also had my Framework 13 since the beginning of 2022. I’ve had to do a hinge replacement, input cover replacement, and mainboard replacement. But I sort of expected that, since it’s a young company and hardware is hard. And through all of it, support was very responsive and helpful.
I would expect that nowadays the laptops are probably more solidly built than those early batches!
Support was definitely helpful. I just don’t have time or money to replace parts on my machine anymore.
From what I understand, the laptops aren’t any stronger. Even the Framework 16 just got some aftermarket/post-launch foam pads to put below the keyboard to alleviate the strain on it; the entire keyboard deck would flex.
The fact that these products have these flaws makes me wonder how Framework organizes its engineering priorities.
When compared to other similar laptops from brands like HP or Lenovo, how does the deck flex compare? I’m definitely sympathetic to it not being better than or on par with Apple, given the heaps of money Apple has for economies of scale plus lots of mechanical engineers, but it would be a bit rough if mid-tier laptops in that category were far superior.
The deck flex is on par with or worse than an HP EliteBook circa 2019. The problem is that it’s incredibly easy to bend the entire frame of the machine, to the point where it interferes with the touchpad’s ability to click.
It’s really bad, bordering on inexcusable. The fact that there’s no concrete reinforcement says that they sacrificed build quality for repairability, which is equivalent to making a leaky boat with a very fast bilge pump.
I’m not sure what you’re doing to your laptop; how are you bending the entire frame of the machine?
It’s a new company that is largely doing right by open source, and especially open hardware. The quality isn’t incredible but it is worth its value, and I find these claims you’re making dubious.
It’s a fairly common flex point for the chassis, and a common support problem. The base of the touchpad, towards the front of the laptop where there’s a depression in the case, is where the majority of the flex is.
My laptop has seen nothing but daily, regular use. You can find the claims dubious, but others are reporting the same problems.
This has been my experience with the Framework. It’s not Apple hardware, which is best in class all around, but it is on par with my Dell XPS.
I’ll chime in too: I’ve had the Framework 13 AMD since it came out (mid 2023) and it has been great.
I upgraded the display after the new 2.8K panel came out, it took 2 minutes. Couple months later it developed some dead pixels, so they sent me a replacement. In the process of swapping it out, I accidentally tore the display cable. It took me a while to notice/debug it, but in the end it was just a $15 cable replacement that I’m fairly sure would have otherwise resulted in a full mainboard replacement for any other laptop. (When I had Macbooks, I lost count how many times Apple replaced the mainboard for the smallest thing.)
I haven’t been too precious with it, I toss it around like I did my Thinkpad before this. There’s some scuffs but it has been fine, perhaps the newer models are more sturdy? It’s comforting to know that if anything breaks, I’ll be able to fix it.
I also run NixOS on it, it does everything I need it to do, the battery life is great (8-10 hours of moderate use) and I’ll happily swap out the battery in a few more years once it starts losing capacity.
I spend so much of my life at the computer that feeling a sense of ownership over the components makes a lot of sense to me. I don’t want to feel like I’m living in a hotel.
It is, in fact, how I want to spend my time on this planet.
To add to the chorus, I bought a 12th gen Intel Framework 13 on release and it’s been flawless so far. NixOS worked out of the box. I love the 3:2 screen. I can totally believe that a small/young manufacturing company has quality control issues and some people are getting lemons, but the design itself seems solid to me.
On my old dell laptop I snapped all the usb ports on one side (by lifting up the other side while keyboard/mouse were still connected). Since they’re connected directly to the motherboard they weren’t repairable without buying a new cpu. If I did the same on the framework it would only break the $12 expansion cards and I wouldn’t even have to turn it off to replace them.
Later I dropped that same dell about 20cm onto a couch with the screen open. The impact swung the screen open all the way and snapped the hinges. They wanted me to send it back for repairs, but I couldn’t handle the downtime, so for a year I just had the hinges duct-taped together. I’ve dropped my framework the same way, but because the screen opens the full 180 degrees it doesn’t leverage the hinges at all. And if it did break, I’d be able to ship the part and replace it myself.
Not that I support the desktop offering as anything but waste, but the soldered RAM is apparently all about throughput:
We spent months working with AMD to explore ways around this but ultimately determined that it wasn’t technically feasible to land modular memory at high throughput with the 256-bit memory bus. (source)
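Back-of-the-envelope, the throughput argument checks out (assuming the LPDDR5X-8000 that machines in this class reportedly use; my numbers, not Framework’s):

```python
# Peak bandwidth = bus width in bytes * transfer rate.
# These part numbers are my assumption, not an official spec sheet.
def peak_bandwidth_gb_s(bus_bits: int, mega_transfers_per_s: int) -> float:
    return (bus_bits / 8) * mega_transfers_per_s * 1e6 / 1e9

print(peak_bandwidth_gb_s(256, 8000))  # soldered 256-bit LPDDR5X-8000: 256.0 GB/s
print(peak_bandwidth_gb_s(128, 5600))  # dual-channel DDR5-5600 SO-DIMMs: 89.6 GB/s
```

So going modular would leave you at roughly a third of the bandwidth, which squares with their explanation.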
With the focus of the desktop being “AI applications” that prioritize high throughput, I’d say they could’ve gone with an entirely different chip.
I get the engineering constraint, but the reason for the constraint is something I disagree with.
Who else is making something competitive?
I wish I could name something in good faith that was comparable to a hyper-repairable x86-64 laptop. Lenovo is pivoting towards repairability with the T14 Gen 5, but I can’t recommend that either yet.
Star Labs, System76, some old Thinkpad models… there are “competitive” things, but few that pitch the things Framework does.
While I agree with some of that, I must stress that I’ve had hardware that was fine until just one thing suddenly broke and the whole machine was unusable. I’ll try an analogy: with repairability, even if every component is only 99% reliable, a failure costs you one part. Without it, even if every component is 99.9% reliable, a machine with 10 components is still only about 99% reliable overall, and any single failure writes off the whole thing, so you’re not in a better situation.
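To put rough numbers on it (a minimal sketch, assuming independent component failures):

```python
# Probability that a machine with n components, each with reliability r,
# has everything working at once.
def whole_machine(r: float, n: int) -> float:
    return r ** n

print(whole_machine(0.999, 10))  # ~0.990: even 99.9% parts give ~99% overall
print(whole_machine(0.990, 10))  # ~0.904: lower, but each failure is a part swap,
                                 # not a dead machine
```

The overall odds aren’t that different; what repairability changes is the cost of any single failure.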
And I say that while I still need to finish going through support for a mainboard replacement due to fried USB ports on a first-gen machine (although not an initial batch). Funnily enough, I’m wondering if there’s an interaction with my YubiKey. I also wish the chassis was a bit sturdier, but that’s more of a nice-to-have.
As for thermals, while I think they could probably be better, the 11th gen Intel CPU that you have (just like I do) isn’t great at all: 13th gen ones are much better AFAIK.
I’ve experienced a full main board failure which led to me upgrading to a 12th gen on my own dime.
The thermal problems are still there, and their fans have some surprising QA problems that are exacerbated by thermal issues.
I wish I could excuse the fact that my machine feels like it’s going to explode even with power management. The fans grind after three replacements now, and I lack the energy and motivation to do a fourth.
I think 12th gen is pretty similar to 11th gen. I contemplated the upgrade for similar reasons but held off because I didn’t need to and the gains seemed low. IIRC it’s really with 13th gen that Intel improved the CPUs. But I agree the thermals/power seem sub-par; I feel like it could definitely be better.
BTW, I just “remembered” that I use mine mostly on my desk, raised so it’s not sitting directly on the surface, which greatly improves its cooling (I can’t give hard numbers, but temps under load are better and the max CPU frequency can be maintained).
Sorry to hear about the trouble with your Framework 13. To offer another data point: I have a 12th gen Framework 13 and haven’t needed to repair a thing, I’m still super happy with it. The frame-bending I’ve also not seen, it’s a super sturdy device for me.
I can second that. I’ve had a 12th gen Intel system since late 2022 and no issues of the sort. Even dropping it once did nothing to it.
“You can’t put Linux on a NES.”
“The sign can’t stop me because I can’t read!”
I think it’s closer to a
“See, Marge, I told you they could deep fry my shirt!”
“I didn’t say they couldn’t, I said you shouldn’t!”
Or the old “you were so preoccupied with whether or not you could, you didn’t stop to think if you should”.
In all seriousness though, this is a delightful and absurd project.
I feel compelled to point out that ELKS isn’t really Linux; it’s a “Linux-like” 16-bit OS. At this point, I’m not even sure you couldn’t call it closer to xenix86 or PC/IX than to Linux.
Nevertheless, numerous other projects have shown that you can boot actual Linux on very primitive hardware through the same hardware emulation approach, although most of them literally take days or weeks to boot. This one, albeit still slow, seems more “usable”, by virtue of emulating a much more modest environment.
Needless to say, I love all of them.
I’m going to learn how to survive in harsh conditions, much as I have this past year.
I yell a lot about the software crisis, but this is a symptom of it.
The trigger on this update was presumably pulled without testing. With the effects being as widespread as they are, I doubt this was a “works on my machine” situation.
With the pervasive devaluation of programming as a trade by managerial forces, and the progressive migration to fungible workers, I doubt this will be the last time this happens on this scale.
If you’ve ever done any driver programming, especially Windows drivers, you will know that it is difficult to test the full matrix of every possible version / configuration / combination of other drivers on the machine.
I would expect driver developers working on something as widespread as CrowdStrike to be well versed in the nuances… but when I worked at Microsoft, even the full-time kernel team folks would make mistakes that would cause major issues internally for people self-hosting Windows. I maintained a relatively small boot driver that ran on Windows 8 through Windows 11 (client and server), and the matrix of possible interactions your code could have was overwhelming to test at times (https://learn.microsoft.com/en-us/windows-hardware/drivers/kernel/managing-hardware-priorities).
I’m not making excuses for them, but I imagine this is not something you can chalk up to just pure negligence.
Oh, I guarantee you the things that get shipped are difficult to test. That’s a given.
This isn’t on the engineers. This is on management. CrowdStrike has been undergoing rolling layoffs and I guarantee you some of the folks that were gutted were QA-related.
It’s unfortunately a common tale. Cut staff, cut morale, micromanage, construct a meat grinder, and things like this will happen.
Bugs happen. Rolling them out to your entire deployment this fast is negligence.
I agree, but consider that it is pretty hard to test antivirus updates since they are extremely time-sensitive. They typically try to respond to the latest threats within a few days. There is no real possibility of weeks of testing, or staged rollouts, or other practices companies normally use.
I mean, you can absolutely stage rollouts; even if it takes 24 hours, you’d still probably have quite a bit less damage.
Fully agreed. My bet is on management/staff cuts rather than engineering incompetence.
Working on ironing out performance kinks in my prototype of Nova. Adding graphics and input handling proved that it was faster than I thought, but could still use work, mostly in the bytecode generation.
The lack of discussion of dataflow languages is pretty weird. Yes, we can have visual interfaces showing the function of each line of code (e.g. Blockly), and no, we seldom need them.
But being able to wrap functions in an interface that maps function calls, making it easy to go from writing wrapper functions that deal in type FOO to getting them to handle List and then Stream, is pretty convenient.
And if the interface generates actor-model-style code for doing these things, then you have a way to write concurrent code that’s useful even to newbies.
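A minimal sketch of the kind of wrapping I mean, with made-up helper names (the real interface would generate this for you):

```python
from queue import Queue
from threading import Thread
from typing import Callable, Iterable, Iterator

# Lift a function written for one FOO so it also runs over a List or a Stream.
def lifted(f: Callable) -> Callable[[Iterable], Iterator]:
    return lambda xs: (f(x) for x in xs)   # lazy, so it serves both cases

# The same function wrapped as a tiny actor: consume an inbox, feed an outbox.
def as_actor(f: Callable, inbox: Queue, outbox: Queue) -> Thread:
    def loop():
        while (msg := inbox.get()) is not None:   # None acts as the stop signal
            outbox.put(f(msg))
    t = Thread(target=loop, daemon=True)
    t.start()
    return t
```

The programmer writes the scalar function once; the wrapper decides whether it runs over a value, a list, a stream, or a mailbox.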
The typical (searchable) term for this, that I’ve found, is Flow-Based Programming. Dataflow languages encompass a lot of working literature, but Flow-Based Programming is the narrowest term for this (black boxes communicating with each other via a dataflow graph).
I’ve really loved my time with Node-RED.
This article resonates with me. A good chunk of visual programming doesn’t remove the user from playing computer in their head, and at its worst ends up as messes of wires and interconnections.
I can see, and read, code. I don’t need a visual representation of it. It may be convenient to have a visual representation, but what I want is:
A set of binoculars into the state of my program, or the arrangement and state of a system.
A way to turn diagrammatic data into data that I can use, and possibly back again. (This is why Excel is popular, among other reasons.)
A way to visualize relationships between the components of my program.
A way to test hypothetical situations.
All of these revolve around providing visibility where it doesn’t arise naturally, which visual programming systems haven’t really delivered. (If there are ones that deliver on the above, please let me know!)
Essentially, creating software which behaves in a desired way requires “playing computer in your head”*. Whether it is source code or visual programming, they are just tools to reach the goal, and that goal needs complex analytical thinking, abstraction, seeing everything as a system, etc.
*Why do you need to “play computer”? Because a computer is a deterministic machine that you can’t talk to in ideas, only pragmatically and exactly. You need to think of a ton of edge cases and be fully aware of what’s required as an output.
“Regular users” can’t do that, only someone who can think in computer.
I fully disagree. “Playing computer in your head” is largely historical precedent. There’s no fundamental reason why we have to program with blindfolds on.
Yes, machines have operational models, but we don’t need to keep the state of the machine invisible or pretend we’re programming for punch card machines. We have the ability to peer into the state of the machine, explore hypotheticals, and encode ideas.
I’m even working on something that allows you to sketch programs whose functionality is close to what you meant to write, coupled with tools to explore hypotheticals.
I’m not sure I get the “blindfold” metaphor properly.
I don’t have statistics from the last 60 years of no-code attempts, but it’s a safe bet that these tools were most effectively used by people capable of software development. We can change the “language” (or the form of expression) to something visual, something closer to natural speech, but to phrase those words you need to be aware of the ideas of “operational models”, “state of the machine”, abstraction, completeness, and some basic programming structures (like iteration, conditionals, etc.). That’s what I call “playing computer in your head”; otherwise, where else are you using these skills?
To some degree you’re certainly correct, we cannot just go around guessing instructions, but I don’t think it’s fair to say that it is “essential”. You can now create programs using ChatGPT and run them and maybe some of them do what you want some of the time! This is not so different from how most programs are developed, which involves a bunch of guessing, making mistakes, testing and fixing until you have something that maybe does what you want enough of the time so you stop iterating. Also, logic programming and constraint solving and SQL-like languages don’t require that you play computer at all. You just have to specify what you want.
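For instance, with SQL (sqlite3 from Python’s standard library; a toy schema of my own):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer TEXT, total REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [("ada", 30.0), ("ada", 12.5), ("grace", 99.9)])

# No loops, no accumulators, no stepping through state in your head:
# you declare the result you want and the engine picks the plan.
for row in db.execute("SELECT customer, SUM(total) FROM orders GROUP BY customer"):
    print(row)   # ('ada', 42.5) then ('grace', 99.9)
```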
ChatGPT is a great example of why you need to “think in computer”. I hear a lot of PMs/POs who cannot phrase the requirements properly. Then comes the dev team, and they ask questions back to clarify what needs to be done exactly. ChatGPT doesn’t ask questions back; it assumes something and generates something. Then of course you’ll adjust your prompt to get better results, which requires understanding how ChatGPT “works” (reacts) in the first place, which needs pattern-matching skill, so we end up at: you need analytical thinking to use it well. That’s actually “thinking in computer”.
They say the best specification is software that works. That was made by developers. I don’t see why anyone else would be better at writing prompts for an LLM than developers are.
Okay, to me “playing computer” is a more specific thing than analytical thinking. It is evaluating an imperative instruction and its impact on program state in your head.
I think it’s kind of a continuum, you can do it very strictly, like when debugging a segfault in a C program, or very high level, like when interacting with something like chatgpt, or in the middle, if you’re working with something like SQL, or Python, or Java.
Logic programming and constraint solving absolutely involve playing computer in your head, if you want your program to complete in any reasonable timeframe.
The concept of visual programming was appealing to me, and having noticed the same failures as the author, I too have a theory about them.
The main analogy I’d have is poetry. Great poems (at least some great poems, say Richard Brautigan) give the reader the impression “hey poetry is easy” because they say what needs to be said, imply what needs to be implied and don’t use any excessive or awkward constructions.
Essentially, if one looks at great UIs overall, they give the feeling that they are giving you all the information you need. But that feeling is actually the product of careful crafting of the symbol-system, taking into account the task at hand, the culture of the users, and a whole variety of unsaid things.
It’s like spreadsheets. They are appealing in that they bridge data entry and programming. But a visual tool that gave you everything (data values, array subscripts, and programmatic operations) would just be overwhelming and not useful.
I’d say that’s why “visual” tools succeed haphazardly rather than universally.
I think a good stress-test of this argument is a field which seems like it’d pay in blood for every abstraction: gamedev. It’s also a field where 1) many projects are made by one person, despite the “proverbial death of the individual developer”, and 2) they work on towers of abstractions “as a tool to avoid hard thoughts.”
And yet these solo devs keep putting out absolute masterpieces: Minecraft, Stardew Valley, Baba is You, Tunic, just to name a few. If anything, the pace has accelerated in recent years, and I regularly play a game and find out later it was made by 1-2 people. So any conjecture that abstractions make it harder for individuals to make software should explain why that doesn’t seem true in the field where it’d matter most.
Oddly enough, I started my journey in game development! And I agree with you on some points, but not in others.
On the one hand, I am incredibly enthralled that indie developers have platforms to stand on. My argument isn’t “no abstractions”, it’s “careful use of abstractions”. Game development is where performance does matter to an extent, so the environment mirrors early computing where you actually pay the cost of the abstractions you introduce.
On the other, I think that we’re seeing the results of a filter, rather than the actual results. A lot of people burn out in game development, even technical individuals, because they exhaust themselves scaling mountains. Want to ship a game? You’ll realistically be dealing with entire engines and shifting your mental model to work within them, or you’ll be tolerant and technical enough to build your own.
I think it’s a question of tolerance of the norm rather than the norm being sustainable. But, as always, data, data, data.
I think if I’m in a position to advocate for better conditions for individuals, and I’m capable of bringing awareness to that, I should. I really don’t want to abstract these people away if possible.
Is it going to actually make things better, or just change the status quo?
You said “On the other, I think that we’re seeing the results of a filter, rather than the actual results. A lot of people burn out in game development, even technical individuals, because they exhaust themselves scaling mountains.”
We have infinitely better tools than 30 years ago, yet many more people get burnt out today. Why? Because the availability of these better tools actually increases competition and thus stress. The better the tools, the higher the mountain gets for everyone as our expectations increase: we always want something that is going to be better than the “average” video game / pizza / movie.
That’s a pretty good point! And I think it’s in line with what I’ve said: the disconnect between release cycles and “time to mastery” grew as time went on, and those curves diverged. If I give you more power over a system, phrased in terms of abstractions, your demands very well might scale with it, leading to the requirement for more power, leading to more abstractions, etc. etc.
It’s partly why I don’t think the solution is purely a technical “throw new languages/software/hardware at it” one, but something that sits firmly between how individuals interact with computers that are embedded in their daily lives and the cumulative technical knowledge we’ve acquired in the past 40+ years.
Minecraft and SdV are not masterpieces because of the engineering. MC will drop to 4 fps if you add 4K textures to it; Starfield, on the other hand, can provide a smooth 100 fps on the same machine.
SdV looks like a game from before the millennium. Gaming experience is not always related to the engineering effort put into those games.
I think a good stress-test of this argument is a field which seems like it’d pay in blood for every abstraction: gamedev.
So, gamedev, as a field has definitely always pushed the bounds of what we know how to do with a machine. But the games you listed aren’t exactly AAA 3D graphics-porn. Not to say the devs aren’t good. You can definitely screw up bad enough to make those games run like trash if you’re incompetent. But you can definitely take on a couple abstractions to make things easier that a Crysis-level game can’t.
Users don’t adore these things; businesses that pay top dollar trying to reduce time-to-market to the minimum do. It’s not because of “users loving JavaScript” that NPM is one of the biggest package repositories in existence today. Same reason Java got so big: it caters to businesses, not to software crafters, and they don’t give a damn about building better software. Just look at how far Go went without even supporting generics, basically throwing away innovations in programming language design since the ’80s.
Make no mistake, software as it is today caters to the capitalist business model. If you want a technology to succeed, you’d better have a good marketing team and a good preacher/salesman behind it.
Not to keep bashing on Go, but think for a second: if it were not by Google, and did not have Rob Pike or Ken Thompson behind it, would it have succeeded? The answer is NO, let’s be honest.
Microwave interfaces have whole interface subsystems designed for one-poke start, rather than entering a specific time and power. Also, again: iPhones, VCRs, Windows 3.1.
Your model is “businesses don’t care about shipping shoddy”.
I suggest that users are fine with that quality, if they don’t have to think - https://en.wikipedia.org/wiki/Don't_Make_Me_Think . That is, lack of needing to attend to details is a significant advantage for many users across many domains, regardless of the underlying whatevers.
I write Rust, use emacs heavily, and spent a decade running Gentoo, so - I’m not that person. But I recognize the popularity and weight behind that world.
“The solution to the software crisis will not be a reversion to more constrained platforms.”
The stereotype of reverting to non-automated, non-abstracted software as the solution is one I really need to argue against. We can build things that users enjoy without having to revert to constrained platforms, while at the same time restricting the amount of abstraction we apply.
I’m saying that you need to address the well observed preference (revealed preference) for KISS designs, even if they strictly constrain the user - and many users do prefer to be constrained.
I’m not going to say I am one of them. 🙃 Just, most people (devs) I have worked with want the ez button to slap, even though it isn’t optimal in many ways.
For sure! I also want that easy button, just not phrased in terms of existing towers of abstractions. If you consider all current software “prototypical”, in that we’re building scaffolding on top of scaffolding, rather than core building materials, then it’s time for us to step back and see where we can start solidifying things.
That’ll come with the removal of a lot, but like a good diff, judgement shouldn’t fall upon the lines removed, but on the functionality present after the patch. We need to do some compression and consolidation.
Aren’t the towers of abstraction an enormous success? When I was learning to program around 1990, people were still writing think pieces about the lack of software reuse. Now that is a solved problem. If anything, people write think pieces about how there’s too much software reuse!
My response to the mountain metaphor is that a rising tide lifts all boats: our situation is more like Hawaii than the Himalayas. True, there’s a risk of drowning in abstraction and sometimes our mountains of software explode spectacularly. But it’s easier to draw something on a <canvas> now than it was to draw in a window 35 years ago. And new mountains with better abstractions are being built: look to CHERI, Rust, io_uring. Maybe Oxide’s approach to firmware will succeed? I’m optimistic.
When users of all kinds complain about the lack of interoperability of software, data silos, and pervasive mono-cultures that can’t be upended by some random individual working in their garage, I don’t think of success.
When the majority of software re-use is now delegated to shipping containerized binaries because we can’t actually build portable, composable software, I don’t think of success.
When I’m subject to the few, if any, outlets of configuration that a piece of software will give me, apart from what the authors allow, I don’t think of success.
When I think that we’re still in the same general spot as we were 35 years ago, just with the ability to move faster due to the demands of capital and product, I don’t think of success.
When the majority of software re-use is now delegated to shipping containerized binaries because we can’t actually build portable, composable software, I don’t think of success.
I feel like … we know how to build good software and good abstractions. We see this happen a lot in open source projects where people have the freedom to do the right thing without pressures from management and executives breathing down their necks. Tremendous successes abound.
But we don’t know how to incentivize producers of commercial software to build quality products. Sometimes it happens by accident.
Software should be detached from profit and the market economy. There are several fields in which this just works better, like healthcare. Any serious attempt at bringing software under public control, assuming there will ever be enough concentration of political capital to do that before the end of the information age, would be met with incredibly violent resistance by the oligarchs that profit from private software.
If anything, the current trend is going the opposite way: regulations on software are being attacked left and right by the oligarchs and planes started falling.
I think the danger with that approach is that it’s difficult to ensure that the correct software gets created. Markets are a very good way of ensuring that resources get allocated relatively efficiently without needing a central planning system, and without having lots of waste. (Waste in this context is having everyone learn how to write COBOL when app developers are necessary, or vice versa.) Markets have a lot of issues and require a lot of careful regulation and interventions, but they are really good at decentralised decision-making, and we should use them for that purpose.
In fairness, I can understand why people might not associate the current software market with efficiency, but we’re talking about a different kind of efficiency here! The goal of the market is to match people with desires and people who can solve those desires. Right now, few people desire fast, efficient software, as hardware is mostly cheap, so it doesn’t get created as often. It might seem counterintuitive, but this is good: it generally takes longer and more resources to write a shorter, faster, more efficient program (in the vein of “I would have written a shorter letter but I didn’t have the time”), and that time and those resources would be wasted if people didn’t actually need the efficiency.
Where problems arise is where the markets cannot capture some aspect of the “true price” of something. For example, in the discussion on software efficiency, there are environmental issues which don’t get factored into the price of hardware, and there are many groups of people who have needs, but don’t have enough buying power for those needs to be properly met. In these cases, we need regulation to “fix” the markets - pricing in environmental impacts to hardware and running costs, and ensuring minimum standards are met for all software that allow people with various disadvantages to still engage with software. However, just because the markets require adjustment, doesn’t mean that we should throw them away entirely. Software needs to remain attached to profit and markets to ensure that software gets written that actually serves people’s needs.
I realise we’re in danger of getting off-topic here and I don’t want to derail this discussion too much. But I wanted to provide a short leftist defence of markets in software, and point out ways of solving current issues that don’t involve rejecting markets entirely.
The goal of the market is to match people with desires and people who can solve those desires.
The idea that I could spend time working on software that does things that people actually want is why I write free software outside of a market. It appeals to me specifically because the opportunity to do that is so rare in the industry.
In theory, yes, a company that could do this would do well in the market, but in practice any company that achieves this ability ends up self-sabotaging it away in short order.
Aren’t the towers of abstraction an enormous success? When I was learning to program around 1990, people were still writing think pieces about the lack of software reuse. Now that is a solved problem. If anything, people write think pieces about how there’s too much software reuse!
I think the part that bothers me the most is that a lot of the “modern” abstractions are designed more for plug & play than for extension. “Frameworks” instead of “libraries”, as I’ve seen the distinction put. If what you’re doing fits what the authors were expecting, things work really well. And if you try to step anywhere off of that pre-ordained path, things start getting really hairy quickly. I wish I could remember what the project was that I was working on a few months ago… it was UI stuff, and the framework provided a fabulous set of components, but adding a field validator to a text field involved climbing 3 or 4 layers up the abstraction tower, making your own variant of some superclass, and then bringing back a bunch of extra functionality from the subclasses you couldn’t use.
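A caricature of the difference, with entirely made-up names (not the actual framework, which I still can’t remember):

```python
from typing import Callable

# Library style: behavior is a value you pass in.
def text_field(validate: Callable[[str], bool]) -> dict:
    return {"kind": "text", "validate": validate}

email_field = text_field(lambda s: "@" in s)

# Framework style: to add one behavior, you subclass your way up the tower.
class BaseWidget:
    def render(self) -> str:
        return "<widget>"

class BaseField(BaseWidget):
    def on_change(self, value: str) -> None:
        self.value = value

class TextField(BaseField):
    pass

class ValidatingTextField(TextField):   # your own variant of some superclass
    def on_change(self, value: str) -> None:
        if "@" not in value:            # the one line you actually wanted
            raise ValueError("invalid")
        super().on_change(value)        # plus re-plumbing everything else
```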
When the majority of software re-use is now delegated to shipping containerized binaries because we can’t actually build portable, composable software, I don’t think of success.
I 100% agree. I mean… thinking back to the late 90s and early 2000s, I do somewhat appreciate that many of those containerized binaries are going to be talking JSON over HTTP and/or WebSockets, and the languages I use on a regular basis all have really good libraries for those protocols. On the other hand, it’d be really great if a lot of that was a matter of linking a .so and potentially using an FFI binding instead. I’m absolutely exhausted from looking at code that JPEG-encodes an image buffer, base64-encodes the JPEG, stuffs it in a JSON dict, only to have the whole encoding process reversed on the other side.
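Roughly this dance, sketched with Pillow and the standard library:

```python
import base64, io, json
from PIL import Image

# Sender: pixels -> JPEG -> base64 -> JSON string.
img = Image.new("RGB", (64, 64), "red")
buf = io.BytesIO()
img.save(buf, format="JPEG")
payload = json.dumps({"frame": base64.b64encode(buf.getvalue()).decode()})

# Receiver: JSON -> base64 -> JPEG -> pixels, undoing every step.
frame = json.loads(payload)["frame"]
img2 = Image.open(io.BytesIO(base64.b64decode(frame)))
```

With a .so and an FFI binding, the same buffer could be handed over without any of the intermediate copies.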
I draw a distinction between abstraction and composition, which is also in the article. It’s not a hard distinction, but I’d say:
Composition means putting parts together to form a working system. Does the result work? Is it correct? Is it fast and secure? (Composition does feel more “horizontal”)
Abstraction means hiding details. Abstracting over Windows and Unix is something that I think is often accidental complexity, or at least a big tradeoff. It saves time for the developer, but it can be a loss to the end user. (Abstraction does feel more “vertical” – and fragile when you get too high)
This person, commenting on the same article, pointed out “shallow and composable” as properties of Unix, and I agree:
So I think shell composes, but it’s actually not very abstract. And this is a major reason I’ve been working on https://www.oilshell.org/
IME, shell gets a lot of work done effectively, without much weight, and is adaptable to new requirements. One person can write a shell script to solve a problem – you don’t have to assemble a big team, and justify its existence.
(Of course something that’s challenging is for that shell script to not become a mess over the long term, and I believe we’re doing something about that)
From the article:
Programming models, user interfaces, and foundational hardware can, and must, be shallow and composable. We must, as a profession, give agency to the users of the tools we produce. Relying on towering, monolithic structures sprayed with endless coats of paint cannot last.
This is generally my preference, but I would say “must” is not true … One thing I learned the hard way is that interoperability is basically anti-incentivized.
Long story, but I think the prevalence of YAML in the cloud is a “factoring” problem, though there’s actually a deeper economic issue at play.
That is, the people on one side of the YAML write code and algorithms, and the people on the other “configure” those lego blocks that don’t actually fit together.
YAML arguably abstracts (it hides details behind an interface).
But it doesn’t compose (when you put things together, they don’t have the properties you want)…
abstracting over OS always feels weird to me, when one of the main purposes of an OS is to abstract over hardware
abstracting over hardware makes sense, because we keep getting better at making hardware, we have different tradeoffs, etc.
but with OSs, it mostly seems like a coordination problem. sometimes an intentional one, because the organizations involved were trying to build a moat
The OS already abstracts over hardware, and then we are piling more abstractions on top of OSes.
Ones that leak – in terms of performance, security, or just making the application behave poorly
Electron is basically that – it lets you ship faster, but that’s about it
The “tower” or “stack” is often not a good way of building software.
And the funny thing is that OSes are converging, with Windows gaining a Linux compatibility layer in ~2016 (WSL), a real Linux kernel later with WSL 2, and then a Unix-style terminal some time after that!
I guess to argue the other side, Unix was never good at GUIs … so it’s not like Macs or Windows were superfluous or anything. But it’s just that the most basic layer is still in flux, and it is converging on “Unix”, even in 2016 and 2024 …
(running Docker containers seems to require some sort of Linux x86-64 syscall ABI too)
As a thought experiment, I’d say if we knew how to perfectly abstract, we’d be able to write multi-platform GUIs that work perfectly on all targeted platforms.
But I think anyone who works in that area (I don’t) will tell you that it’s a big compromise. You can write something better if you start OS X only, or Windows only.
I think Flutter is something that abstracts over Android-iPhone, and there are many others.
And of course there were many attempts at Windows / OS X abstraction (Qt etc.), but what seems to have happened is that desktop GUIs just got uniformly WORSE since those attempts were made.
Is an Electron app better than a Qt app?
Rust is famously “not GUI yet”, and you can argue that if it had some yet-unknown great powers of abstraction, then it would be.
So you could say it’s an unsolved problem to have “zero-cost abstraction” in that respect (!)
(And yes this is a pun – the cost I’m talking about is in the behavior of the app, not the performance)
To summarize, I think many things are better than they were 20-30 years ago, but many things are worse. Latency is one of them - https://danluu.com/input-lag/
Composing software from parts while maintaining low latency is another unsolved problem.
If curious developers can no longer build software without scaling mountains
Is this true? To echo the article “against innovation tokens”, plenty of shiny new software exists to reduce the operational complexity of managing the kind of online computer systems that back SaaS.
There is some context (e.g. reactions and inspirations) that is absent from README.md, so my idea was to share an accessible text. I’m not affiliated with thenewstack, but I know Oleksandr and have discussed the language with him, so I may be a bit biased here.
My current project stems from a significant disdain for most programming languages.
I’ve been working on an open-ended space game named Nebula, starting and re-starting work several times since 2011. In 2016, after being fed up with existing programming languages and ecosystems, I aspired to build my own. That was 8 years and a minimum of 18 different languages ago.
The upside is that this nearly decade-long yak shave may finally be coming to an end. Maybe I’ll get back to the stars before 2030. I wouldn’t trade this journey for anything, though. I wouldn’t be the programmer I am without going through it.
What a shitshow.
The Gen 3 devices also appear to be broken by the expiry. Wonder if this just never made it into the ticket queue.
If it ends up not being fixed, I won’t be surprised. If nobody expected it to happen, hugops to all involved in patching it.
Having owned a Framework since April of 2022, I cannot recommend them to people who need even basic durability in their devices. Since then, I have done two mainboard replacements, two top cover replacements, a hinge replacement, a battery replacement, several glue jobs after the captive screw hubs sheared from the plastic backing…
It’s just such an absurdly fragile device with incredibly poor thermals. They sacrificed a ton of desirable features to make the laptop repairable, but ultimately have released a set of devices that, when used in real-world settings, end with you repairing the device more often than not. And these repairs are often non-trivial.
I will personally be migrating to another machine. The Framework 12’s focus on durability may be trending in the right direction, but to regain trust, I’d need to see things like drop and wear tests. A laptop that can be repaired, but needs constant upkeep/incredibly delicate handling, is ultimately not an actual consumer device, but a hobbyist device.
Maybe they’ll get better in a few years. Maybe the Framework 12 will be better. Their new focus on AI, the soldered RAM in the desktop offering, and the failure to address the flimsy plastic chassis innards, among other things, mean that they have a long way to go.
It’s definitely a “be part of the community that helps solve our product problems” sort of feeling.
I have an AMD FW13, and was trying to figure out why it loses 50+% of its battery charge overnight when I close the lid, because I don’t use this computer every single day and don’t want to have remember to charge yet another device.
So I check the basics-I’m running their officially supported Linux distro, BIOS is current, etc. And half an hour into reading forum threads about diagnosing sleep power draw, I realize that this is not how I want to spend my time on this planet. I love that they’re trying to build repairable/upgradeable devices, but that goal doesn’t matter so much if people end up ditching your products for another option because they’re just tired of trying to fix it.
I’ll chime in with the opposite experience - I’ve owned an AMD Framework 13 since it came out, and had no durability issues with it whatsoever, and it’s been one of my 2 favorite computers I’ve ever owned. I’ve done one main board replacement that saved my butt after a bottle of gin fell over on top of it in transport.
Development and light gaming (on Linux, I very much appreciate their Linux support) have been great, and the reparability both gives me peace of mind, an upgrade path, and has already saved me quite a bit of money.
I’ve owned a framework since Batch 1. Durability has not been a problem for me. My original screen has a small chip in it from when I put it in a bag with something that had sharp edges and pressured the screen for a whole flight. Slowly growing. Otherwise, it’s been solid.
Same. I have a batch 1. There are quirks, which I expected and knew I am supporting a startup with little experience. I since have upgraded and put my old board into a cooler master case. This is so amazing, and what I cared about. I am still super happy with having bought the Framework and particular for tinkerers and people who will have a use for their old mainboards it’s amazing.
I get harbouring resentment for a company you felt sold then a bad product. But at the same time, you bought a laptop from a very inexperienced company which was brand new at making laptops, a pretty difficult product category to get right when you’re not just re-branding someone else’s white-label hardware.
3 years have passed since then, if I were in the market for a category which Framework competes in these days I would be inclined to look at more recent reviews and customer testimonials. I don’t think flaws in that 3 year old hardware is that relevant anymore. Not because 3 years is a particularly long time in the computer hardware business, but because it’s a really long time relative to the short life of this particular company.
I would agree that 3 years is enough time for a company to use their production lessons to improve their product. But nothing has changed in the Framework 13.
I don’t resent Framework. I think that’s putting words in my mouth. I just cannot, in good faith, recommend their products to people who need even a semi-durable machine. That’s just fact.
Founded by people who had experience designing laptops already, and manufactured by a company that manufactures many laptops. Poor explanations for the problems, IMO.
I’ve had a 12th gen Intel since Sept 2022 (running NixOS btw) and I have not had any issues, I will admit it sits in one place 99% of the time. I might order the replacement hinge since mine is a bit floppy but not too big a deal.
As for the event, I was hoping for a minipc using the 395 and I got my wish. Bit pricey and not small enough for where I want to put it and I have no plans for AI work so it’s probably not the right machine for me.
I was originally interested in the HP machine coming with the same CPU (which should be small enough to fit) but I’ve been pricing an AMD 9950 and it comes out cheaper. I was also disappointed there wasn’t a sku with 385 Max w/64GB of RAM , which I might have have ordered to keep the cost down.
For reference a new machine is intended to replace a 10 year old Devils Canyon system.
I’ve also had my Framework 13 since beginning of 2022. I’ve had to do a hinge replacement, input cover replacement, and mainboard replacement. But I sort of expected that since it’s a young company and hardware is hard. And through all of it support was very responsive and helpful.
I would expect that nowadays the laptops are probably more solidly built than those early batches!
Support was definitely helpful. I just don’t have time or money to replace parts on my machine anymore.
From what I understand, the laptops aren’t any stronger. Even the Framework 16 just got some aftermarket/post-launch foam pads to put below the keyboard to alleviate the strain on the keyboard. The entire keyboard deck would flex.
The fact that these products have these flaws makes me wonder how Framework organizes its engineering priorities.
When compared to other similar laptops from brands like HP or Lenovo, how does the deck flex compare? I definitely feel sympathetic to not being better or on par with Apple - given the heaps of money Apple has for economies of scale + lots of mechanical engineers, but it would be a bit rough if mid-tier laptops in that category were far superior.
The deck flex is on par with or worse than an HP EliteBook circa 2019. The problem is that it’s incredibly easy to bend the entire frame of the machine, to the point where it interferes with the touchpad’s ability to click.
It’s really bad, bordering on unexcusable. The fact that there’s no concrete reinforcment says that they sacrificed build quality for repairability, which is equivalent to making a leaky boat with a very fast bilge pump.
I’m not sure what you’re doing to your laptop; how are you bending the entire frame of the machine?
It’s a new company that is largely doing right by open source, and especially open hardware. The quality isn’t incredible but it is worth its value, and I find these claims you’re making dubious.
It’s a fairly common flex point for the chassis, and a common support problem. The base of the mousepad, towards the front of the laptop where there’s a depression in the case, is where the majority of the flex is.
My laptop has seen nothing but daily, regular use. You can find the claims dubious, but others are having them too.
This has been my experience with the Framework. It’s not Apple hardware, which is best in class all around, but it is on-par with my Dell XPS.
I’ll chime in too: I’ve had the Framework 13 AMD since it came out (mid 2023) and it has been great.
I upgraded the display after the new 2.8K panel came out, it took 2 minutes. Couple months later it developed some dead pixels, so they sent me a replacement. In the process of swapping it out, I accidentally tore the display cable. It took me a while to notice/debug it, but in the end it was just a $15 cable replacement that I’m fairly sure would have otherwise resulted in a full mainboard replacement for any other laptop. (When I had Macbooks, I lost count how many times Apple replaced the mainboard for the smallest thing.)
I haven’t been too precious with it, I toss it around like I did my Thinkpad before this. There’s some scuffs but it has been fine, perhaps the newer models are more sturdy? It’s comforting to know that if anything breaks, I’ll be able to fix it.
I also run NixOS on it, it does everything I need it to do, the battery life is great (8-10 hours of moderate use) and I’ll happily swap out the battery in a few more years once it starts losing capacity.
I spend so much of my life at the computer that feeling a sense of ownership over the components makes a lot of sense to me. I don’t want to feel like I’m living in a hotel.
It is, in fact, how I want to spend my time on this planet.
To add to the chorus, I bought a 12th gen intel framework 13 on release and it’s been flawless so far. Nixos worked out of the box. I love the 3:2 screen. I can totally believe that a small/young manufacturing company has quality control issues and some people are getting lemons, but the design itself seems solid to me.
On my old dell laptop I snapped all the usb ports on one side (by lifting up the other side while keyboard/mouse were still connected). Since they’re connected directly to the motherboard they weren’t repairable without buying a new cpu. If I did the same on the framework it would only break the $12 expansion cards and I wouldn’t even have to turn it off to replace them.
Later I dropped that same dell about 20cm on to a couch with the screen open. The impact swung the screen open all the way and snapped the hinges. They wanted me to send it back for repairs but I couldn’t handle the downtime, so for a year I just had the hinges duck-taped together. I’ve dropped my framework the same way, but because the screen opens the full 180 degrees it doesn’t leverage the hinges at all. And if it did break I’d be able to ship the part and replace it myself.
Not that I support the desktop offering as anything but waste, but the soldered RAM is apparently all about throughput:
With the focus of the desktop being “AI applications” that prioritize high throughpout, I’d say they could’ve gone with an entirely different chip.
I get the engineering constraint, but the reason for the constraint is something I disagree with.
Who else is making something competitive?
I wish I could name something in good faith that was comparable to a hyper-repairable x86-64 laptop. Lenovo is pivoting towards repairability with the T14 Gen 5, but I can’t recommend that either yet.
Star Labs, System76, some old Thinkpad models.. there are “competitive” things, but few things that pitch the things Framework does.
While I agree on some of that, I must stress that I’ve had hardware that was fine until just one thing suddenly broke and everything was unusable. I’ll try an analogy: with repairability, if all your components are 99% reliable and working, the whole machine is at 99% but without it, even if all of them are at 99.9% instead, when you have 10 components, you’re not in a better situation overall.
And I say that while I need to finish going through support for a mainboard replacement due to fried USB ports on a first-gen machine (although not an initial batch). BTW, funnily I’m wondering if there’s an interaction with my yubikey. I also wish the chassis was a bit sturdier but that’s more of a wish.
As for thermals, while I think they could probably be better, the 11th gen Intel CPU that you have (just like I do) isn’t great at all: 13th gen ones are much better AFAIK.
I’ve experienced a full main board failure which led to me upgrading to a 12th gen on my own dime.
The thermal problems are still there, and their fans have some surprising QA problems that are exacerbated by thermal issues.
I wish I could excuse the fact that my machine feels like it’s going to explode even with power management. The fans grind after three replacements now, and I lack the energy and motivation to do a fourth.
I think 12th gen is pretty similar to 11th gen. I contemplated the upgrade for similar reasons but held off because I didn’t need to know and the gains seemed low. IIRC it’s really with 13th gen that Intel improved the CPUs. But I agree the thermals/power seems sub-par; I feel like it could definitely be better.
BTW, I just “remembered” that I use mine mostly on my desk and it’s not directly sitting on it which greatly improves its cooling (I can’t give hard numbers but I see the temps under load are better and max CPU frequency can be maintained).
Sorry to hear about the trouble with your Framework 13. To offer another data point: I have a 12th gen Framework 13 and haven’t needed to repair a thing, I’m still super happy with it. The frame-bending I’ve also not seen, it’s a super sturdy device for me.
I can second that. I’ve had a 12th gen Intel system since late 2022 and no issues of the sort. Even dropping it once did nothing to it
“You can’t put Linux on a NES.”
“The sign can’t stop me because I can’t read!”
I think it’s closer to a
“See marge, I told you they could deep fry my shirt!”
“I didn’t say they couldn’t I said you shouldn’t!
Or the old “you were so preoccupied with whether or not you could you didn’t stop to think if you should”
In all seriousness though this is a delightful and absurd project
I feel compelled to point out that ELKS isn’t really Linux, it’s a “Linux-like” 16 bit OS. At this point, I’m not even sure if you couldn’t call it being closer to xenix86 or PC/IX, than Linux.
Nevertheless, numerous other projects have shown that you can boot actual Linux on very primitive hardware through the same hardware emulation approach, although most of them literally take days or weeks to boot. This one, albeit still slow, seems more “usable”, by virtue of emulating a much more modest environment.
Needless to say, I love all of them.
I’m going to learn how to survive in harsh conditions, much as I have this past year.
I yell a lot about the software crisis, but this is a symptom of it.
The trigger on this update was presumably pulled without testing. With effects as widespread as they are, I doubt this was a “works on my machine” situation.
With the pervasive devaluation of programming as a trade by managerial forces, and the progressive migration to fungible workers, I doubt this will be the last time this happens on this scale.
If you’ve ever done any driver programming, especially Windows drivers, you will know that it is difficult to test the full matrix of every possible version / configuration / combination of other drivers on the machine.
I would expect driver developers working on something as widespread as CrowdStrike to be well versed in the nuances… but when I worked at Microsoft, even the full-time kernel team folks would make mistakes that caused major issues internally for people self-hosting Windows. I maintained a relatively small boot driver that ran on Windows 8 through Windows 11 (client and server), and the matrix of possible interactions your code could have was overwhelming to test at times (https://learn.microsoft.com/en-us/windows-hardware/drivers/kernel/managing-hardware-priorities).
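To put a number on that combinatorial explosion, a toy sketch (the dimensions are invented for illustration, not an actual test matrix):

```python
from itertools import product

# Invented dimensions for illustration; a real boot driver's matrix is worse.
windows_versions = ["8", "8.1", "10", "11"]
skus = ["client", "server"]
arches = ["x86", "x64", "arm64"]
other_boot_drivers = ["none", "av_vendor_a", "av_vendor_b", "disk_filter"]

matrix = list(product(windows_versions, skus, arches, other_boot_drivers))
print(len(matrix))  # 96 configurations before you even vary the hardware
```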
I’m not making excuses for them, but I imagine this is not something you can chalk up to just pure negligence.
Oh, I guarantee you the things that get shipped are difficult to test. That’s a given.
This isn’t on the engineers. This is on management. CrowdStrike has been undergoing rolling layoffs and I guarantee you some of the folks that were gutted were QA-related.
It’s unfortunately a common tale. Cut staff, cut morale, micromanage, construct a meat grinder, and things like this will happen.
Bugs happen. Rolling them out to your entire deployment this fast is negligence.
I agree, but consider that it is pretty hard to test antivirus updates since they are extremely time-sensitive. They typically try to respond to the latest threats within a few days. There is no real possibility of weeks of testing, or staged rollouts, or other practices companies normally use.
I mean, you can absolutely stage rollouts; even if it adds 24 hours, you’d probably see quite a bit less damage.
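As a sketch of what even a minimal staged rollout might look like (everything here is hypothetical, not any vendor’s actual pipeline):

```python
import time

# Hypothetical staged rollout: push to progressively larger slices of the
# fleet, pausing between stages to watch health telemetry. Names and
# percentages are made up for illustration.
STAGES = [0.01, 0.05, 0.25, 1.00]  # cumulative fraction of hosts
SOAK_SECONDS = 4 * 60 * 60         # soak time between stages

def healthy() -> bool:
    """Placeholder: compare crash/telemetry rates of updated hosts to baseline."""
    return True

def rollout(push_update):
    deployed = 0.0
    for target in STAGES:
        push_update(target - deployed)  # update the next slice of hosts
        deployed = target
        if not healthy():
            raise RuntimeError("halting rollout: health check failed")
        if deployed < 1.0:
            time.sleep(SOAK_SECONDS)
```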
Fully agreed. My bet is on management/staff cuts rather than engineering incompetence.
Working on ironing out performance kinks in my prototype of Nova. Adding graphics and input handling proved that it was faster than I thought, but could still use work, mostly in the bytecode generation.
The lack of discussion for dataflow languages is pretty weird. Yes, we can have visual interfaces showing the function of each line of code (e.g. Blockly), and no, we seldom need it.
But being able to wrap functions in an interface that maps function calls, and makes it easy to go from wrapper functions that deal in type FOO to ones that handle List and then Stream, is pretty convenient.
And if the interface generates actor-model type code for doing these things, then you have a way to write concurrent code that’s useful, even to newbies.
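A minimal sketch of that FOO → List → Stream lifting, in Python for concreteness (the helper names are invented; a dataflow interface would generate this wrapping for you):

```python
from typing import Callable, Iterable, Iterator, List, TypeVar

A = TypeVar("A")
B = TypeVar("B")

def lift_to_list(f: Callable[[A], B]) -> Callable[[List[A]], List[B]]:
    """Wrap a per-item function so it operates on whole lists."""
    return lambda xs: [f(x) for x in xs]

def lift_to_stream(f: Callable[[A], B]) -> Callable[[Iterable[A]], Iterator[B]]:
    """Wrap the same function so it operates lazily over a stream."""
    return lambda xs: (f(x) for x in xs)

double = lambda n: n * 2
print(lift_to_list(double)([1, 2, 3]))         # [2, 4, 6]
print(list(lift_to_stream(double)(range(3))))  # [0, 2, 4]
```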
The typical (searchable) term for this, that I’ve found, is Flow-Based Programming. Dataflow languages encompass a lot of working literature, but Flow-Based Programming is the narrowest term for this (black boxes communicating with each other via a dataflow graph).
I’ve really loved my time with Node-RED.
This article resonates with me. A good chunk of visual programming doesn’t remove the user from playing computer in their head, and at its worst ends up as messes of wires and interconnections.
I can see, and read, code. I don’t need a visual representation of it. It may be convenient to have a visual representation, but what I want is…
All of these revolve around providing visibility where it doesn’t arise naturally, which visual programming systems haven’t really delivered. (If there are ones that deliver on the above, please let me know!)
Essentially, creating software that behaves in a desired way requires “playing computer in your head”*. No matter whether it’s source code or visual programming, these are just tools to reach the goal, and that goal needs complex analytical thinking, abstraction, seeing everything as a system, etc.
*Why “play computer”? Because a computer is a deterministic machine that you can’t talk to in ideas, only pragmatically and exactly. You need to think of a ton of edge cases and be fully aware of what’s required as output.
“Regular users” can’t do that, only someone who can think in computer.
Btw, for the 3rd bullet point: https://github.com/deejayy/ts-depgraph
I fully disagree. “Playing computer in your head” is largely historical precedent. There’s no fundamental reason why we have to program with blindfolds on.
Yes, machines have operational models, but we don’t need to keep the state of the machine invisible or pretend we’re programming for punch card machines. We have the ability to peer into the state of the machine, explore hypotheticals, and encode ideas.
I’m even working on something that allows you to sketch programs whose functionality is close to what you meant to write, coupled with tools to explore hypotheticals.
We can just take the blindfold off.
I’m not sure I get the “blindfold” metaphor properly.
I don’t have statistics from the last 60 years of no-code attempts, but it’s a safe bet that these tools were most effectively used by people already capable of software development. We can change the “language” (or the form of expression) to something visual, something closer to natural speech, but to phrase those words you need to be aware of the ideas of “operational models”, the “state of the machine”, abstraction, completeness, and some basic programming structures (like iteration, conditions, etc.). That’s what I call “playing computer in your head”; otherwise, where else are you using these skills?
“Playing computer in your head” usually means “simulating what the computer will do as you write code”.
In the past, we needed to keep track of registers, the stack, memory locations, etc. It hasn’t gotten much better, just more abstract.
Role-playing as a computer, simulating its operations as you specify what it should do, isn’t how things have to be.
To some degree you’re certainly correct; we cannot just go around guessing instructions, but I don’t think it’s fair to say that it is “essential”. You can now create programs using ChatGPT, run them, and maybe some of them do what you want some of the time! This is not so different from how most programs are developed, which involves a bunch of guessing, making mistakes, testing, and fixing until you have something that does what you want enough of the time that you stop iterating. Also, logic programming, constraint solving, and SQL-like languages don’t require that you play computer at all. You just have to specify what you want.
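To make the contrast concrete, a toy example in Python (mine, not the parent’s): the imperative version has you simulating state step by step, while the declarative one just states the condition.

```python
people = [("alice", 42), ("bob", 17), ("carol", 99)]

# Imperative: you play computer, tracking `result`, the loop position,
# and the mutation order in your head as you read.
result = []
for name, score in people:
    if score > 40:
        result.append(name)

# Declarative (SQL-like): you state the condition and let the machine
# figure out the rest.
result2 = [name for name, score in people if score > 40]

assert result == result2 == ["alice", "carol"]
```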
ChatGPT is a great example of why you need to “think in computer”. I hear a lot of PMs/POs who cannot phrase requirements properly. Then comes the dev team, and they ask questions to clarify what needs to be done exactly. ChatGPT doesn’t ask questions back; it assumes something and generates something. Then of course you’ll adjust your prompt to get better results, which requires you to understand how ChatGPT “works” (reacts) in the first place, which needs pattern-matching skill, so we end up at: you need analytical thinking to use it well. That’s actually “thinking in computer”.
They say the best specification is software that works, and that was made by developers. I don’t see why anyone else would be better at writing prompts for an LLM than developers.
Okay, to me “playing computer” is a more specific thing than analytical thinking. It is evaluating an imperative instruction and its impact on program state in your head.
I think it’s kind of a continuum: you can do it very strictly, like when debugging a segfault in a C program, or at a very high level, like when interacting with something like ChatGPT, or in the middle, if you’re working with something like SQL, or Python, or Java.
Logic programming and constraint solving absolutely involve playing computer in your head, if you want your program to complete in any reasonable timeframe.
The concept of visual programming was appealing to me, and having noticed the same failures the author describes, I too have a theory about it.
The main analogy I’d have is poetry. Great poems (at least some great poems, say Richard Brautigan) give the reader the impression “hey poetry is easy” because they say what needs to be said, imply what needs to be implied and don’t use any excessive or awkward constructions.
Essentially, if one looks at great UIs overall, they give the feeling that they are giving you all the information you need. But that feeling is actually the product of careful crafting of the symbol system, taking into account the task at hand, the culture of the users, and a whole variety of unsaid things.
It’s like spreadsheets. They are appealing in that they bridge data entry and programming. But a visual tool that gave you everything (data values, array subscripts, and programmatic operations) would just be overwhelming and not useful.
I’d say that’s why “visual” tools succeed haphazardly rather than universally.
I think a good stress-test of this argument is a field which seems like it’d pay in blood for every abstraction: gamedev. It’s also a field where 1) many projects are made by one person, despite the “proverbial death of the individual developer”, and 2) they work on towers of abstractions “as a tool to avoid hard thoughts.”
And yet these solo devs keep putting out absolute masterpieces: Minecraft, Stardew Valley, Baba Is You, Tunic, just to name a few. If anything, the pace has accelerated in recent years, and I regularly play a game and find out later it was made by 1-2 people. So any conjecture that abstractions make it harder for individuals to make software should explain why that doesn’t seem true in the field where it would matter most.
Oddly enough, I started my journey in game development! And I agree with you on some points, but not on others.
On the one hand, I am thrilled that indie developers have platforms to stand on. My argument isn’t “no abstractions”, it’s “careful use of abstractions”. Game development is an area where performance does matter to an extent, so the environment mirrors early computing, where you actually pay the cost of the abstractions you introduce.
On the other, I think that we’re seeing the results of a filter, rather than the actual results. A lot of people burn out in game development, even technical individuals, because they exhaust themselves scaling mountains. Want to ship a game? You’ll realistically be dealing with entire engines and shifting your mental model to work within them, or you’ll be tolerant and technical enough to build your own.
I think it’s a question of tolerance of the norm rather than the norm being sustainable. But, as always, data, data, data.
Isn’t that true for every business, though? For every success there are a thousand people burnt out or living in terrible conditions.
I think if I’m in a position to advocate for better conditions for individuals, and I’m capable of bringing awareness to that, I should. I really don’t want to abstract these people away if possible.
Is it going to actually make things better, or just change the status quo?
You said “On the other, I think that we’re seeing the results of a filter, rather than the actual results. A lot of people burn out in game development, even technical individuals, because they exhaust themselves scaling mountains.”
We have infinitely better tools than 30 years ago, yet many more people get burnt out today. Why? Because the availability of these better tools actually increases competition, and thus stress. The better the tools, the higher the mountain gets for everyone as our expectations increase: we always want something better than the “average” video game / pizza / movie.
That’s a pretty good point! And I think it’s in line with what I’ve said: the disconnect between release cycles and “time to mastery” grew as time went on, and those curves diverged. If I give you more power over a system, phrased in terms of abstractions, your demands very well might scale with it, leading to the requirement for more power, leading to more abstractions, etc. etc.
It’s partly why I don’t think the solution is purely a technical “throw new languages/software/hardware at it” one, but something that sits firmly between how individuals interact with computers that are embedded in their daily lives and the cumulative technical knowledge we’ve acquired in the past 40+ years.
Minecraft and SdV are not masterpieces because of their engineering. MC will drop to 4 fps if you add 4K textures to it; Starfield, on the other hand, can provide a smooth 100 fps on the same machine. SdV looks like a game from before the millennium. Gaming experience is not always related to the engineering effort put into a game.
So, gamedev as a field has definitely always pushed the bounds of what we know how to do with a machine. But the games you listed aren’t exactly AAA 3D graphics-porn. Not to say the devs aren’t good; you can definitely screw up badly enough to make those games run like trash if you’re incompetent. But you can also take on a couple of abstractions to make things easier that a Crysis-level game can’t.
Users found programming a VCR unpleasantly technical. Users flocked to the iPhone. Users loved Windows 3.1 instead of DOS.
Users adore not being in fine control and having a simple on off button.
“pip install anything” was earthshaking. (Although of course CPAN was earlier.)
Any model of software that doesn’t incorporate the low control high black-box approach in some fashion isn’t going to work well.
I would argue that the essay Engelbart’s Violin hits the problem better.
Users don’t adore these things; businesses that pay top dollar to cut time-to-market do. It’s not “users loving JavaScript” that made NPM one of the biggest package repositories in existence today. The same goes for why Java got so big: it caters to businesses, not to software crafters, and businesses don’t give a damn about building better software. Just look at how far Go went without supporting generics, basically throwing away innovations in programming language design since the 80s.
Make no mistake, software as it is today caters to the capitalist business model; if you want a technology to succeed, you had better have a good marketing team and a good preacher/salesman behind it.
Not to keep bashing on Go, but think for a second: if it hadn’t come from Google, with Rob Pike and Ken Thompson behind it, would it have succeeded? Let’s be honest: the answer is no.
Microwaves have whole interface subsystems designed for one-poke start, rather than entering a specific time and power. Also, again: iPhones, VCRs, Windows 3.1.
Your model is “businesses don’t care about shipping shoddy software”.
I suggest that users are fine with that quality, if they don’t have to think - https://en.wikipedia.org/wiki/Don't_Make_Me_Think . That is, lack of needing to attend to details is a significant advantage for many users across many domains, regardless of the underlying whatevers.
I write Rust, use emacs heavily, and spent a decade running Gentoo, so - I’m not that person. But I recognize the popularity and weight behind that world.
“The solution to the software crisis will not be a reversion to more constrained platforms.”
The stereotype that the solution is a reversion to non-automated, non-abstracted software is one I really need to argue against. We can build things that users enjoy without reverting to constrained platforms, while at the same time restricting the amount of abstraction we apply.
I’m saying that you need to address the well observed preference (revealed preference) for KISS designs, even if they strictly constrain the user - and many users do prefer to be constrained.
I’m not going to say I am one of them. 🙃 Just, most people (devs) I have worked with want the ez button to slap, even though it isn’t optimal in many ways.
For sure! I also want that easy button, just not phrased in terms of existing towers of abstractions. If you consider all current software “prototypical”, in that we’re building scaffolding on top of scaffolding, rather than core building materials, then it’s time for us to step back and see where we can start solidifying things.
That’ll come with the removal of a lot, but like a good diff, judgement shouldn’t fall upon the lines removed, but on the functionality present after the patch. We need to do some compression and consolidation.
Aren’t the towers of abstraction an enormous success? When I was learning to program around 1990, people were still writing think pieces about the lack of software reuse. Now that is a solved problem. If anything, people write think pieces about how there’s too much software reuse!
My response to the mountain metaphor is that a rising tide lifts all boats: our situation is more like Hawaii than the Himalayas. True, there’s a risk of drowning in abstraction and sometimes our mountains of software explode spectacularly. But it’s easier to draw something on a <canvas> now than it was to draw in a window 35 years ago. And new mountains with better abstractions are being built: look to CHERI, Rust, io_uring. Maybe Oxide’s approach to firmware will succeed? I’m optimistic.
When users of all kinds complain about the lack of interoperability of software, data silos, and pervasive monocultures that can’t be upended by some random individual working in their garage, I don’t think of success.
When the majority of software re-use is now delegated to shipping containerized binaries because we can’t actually build portable, composable software, I don’t think of success.
When I’m subject to the few, if any, outlets of configuration that a piece of software will give me, apart from what the authors allow, I don’t think of success.
When I think that we’re still in the same general spot as we were 35 years ago, just with the ability to move faster due to the demands of capital and product, I don’t think of success.
I feel like … we know how to build good software and good abstractions. We see this happen a lot in open source projects where people have the freedom to do the right thing without pressures from management and executives breathing down their necks. Tremendous successes abound.
But we don’t know how to incentivize producers of commercial software to build quality products. Sometimes it happens by accident.
Software should be detached from profit and the market economy. There are several fields where this just works better, like healthcare. Any serious attempt at bringing software under public control, assuming there will ever be enough concentration of political capital to do that before the end of the information age, would be met with incredibly violent resistance by the oligarchs that profit from private software.
If anything, the current trend is going the opposite way: regulations on software are being attacked left and right by the oligarchs and planes started falling.
I think the danger with that approach is that it’s difficult to ensure that the correct software gets created. Markets are a very good way of ensuring that resources get allocated relatively efficiently without needing a central planning system, and without having lots of waste. (Waste in this context is having everyone learn how to write COBOL when app developers are necessary, or vice versa.) Markets have a lot of issues and require a lot of careful regulation and interventions, but they are really good at decentralised decision-making, and we should use them for that purpose.
In fairness, I can understand why people might not associate the current software market with efficiency, but we’re talking about a different kind of efficiency here! The goal of the market is to match people with desires and people who can solve those desires. Right now, few people desire fast, efficient software, as hardware is mostly cheap, so it doesn’t get created as often. It might seem counterintuitive, but this is good: it generally takes longer and more resources to write a shorter, faster, more efficient program (in the vein of “I would have written a shorter letter but I didn’t have the time”), and that time and those resources would be wasted if people didn’t actually need the efficiency.
Where problems arise is where the markets cannot capture some aspect of the “true price” of something. For example, in the discussion on software efficiency, there are environmental issues which don’t get factored into the price of hardware, and there are many groups of people who have needs, but don’t have enough buying power for those needs to be properly met. In these cases, we need regulation to “fix” the markets - pricing in environmental impacts to hardware and running costs, and ensuring minimum standards are met for all software that allow people with various disadvantages to still engage with software. However, just because the markets require adjustment, doesn’t mean that we should throw them away entirely. Software needs to remain attached to profit and markets to ensure that software gets written that actually serves people’s needs.
I realise we’re in danger of getting off-topic here and I don’t want to derail this discussion too much. But I wanted to provide a short leftist defence of markets in software, and point out ways of solving current issues that don’t involve rejecting markets entirely.
The idea that I could spend time working on software that does things that people actually want is why I write free software outside of a market. It appeals to me specifically because the opportunity to do that is so rare in the industry.
In theory, yes, a company that could do this would do well in the market, but in practice any company that briefly achieves this ability ends up self-sabotaging it away in short order.
I’m with you.
From the GP:
I think the part that bothers me the most is that a lot of the “modern” abstractions are designed more for plug & play and not for extension. “Frameworks” instead of “libraries”, as I’ve seen the distinction before. If what you’re doing fits well into what the authors were expecting you to do things work really well. And if you try to step anywhere off of that pre-ordained path things start getting really hairy quickly. I wish I could remember what the project was that I was working on a few months ago… it was UI stuff and the framework provided a fabulous set of components, but adding a field validator to a text field involved climbing 3 or 4 layers up the abstraction tower and making your own variant of some superclass and then bringing back a bunch of extra functionality from the subclasses you couldn’t use.
I 100% agree. I mean… thinking back to the late 90s and early 2000s, I do somewhat appreciate that many of those containerized binaries are going to be talking JSON over HTTP and/or WebSockets, and the languages I use regularly all have really good libraries for those protocols. On the other hand, it’d be really great if a lot of that were a matter of linking a .so and potentially using an FFI binding instead. I’m absolutely exhausted from looking at code that JPEG-encodes an image buffer, takes the JPEG, base64-encodes it, and stuffs it into a JSON dict, only to have the whole encoding process reversed on the other side.
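For a condensed picture of that round trip, here’s a sketch in Python (Pillow stands in for whatever JPEG encoder the real code used):

```python
import base64, io, json
from PIL import Image  # Pillow, standing in for whatever encoder is in use

img = Image.new("RGB", (64, 64))

# Sender: raw pixels -> JPEG -> base64 -> JSON string
buf = io.BytesIO()
img.save(buf, format="JPEG")
payload = json.dumps({"frame": base64.b64encode(buf.getvalue()).decode()})

# Receiver: JSON -> base64 -> JPEG -> raw pixels, undoing every step above
frame = base64.b64decode(json.loads(payload)["frame"])
img2 = Image.open(io.BytesIO(frame))

# Every hop copies and re-encodes the same bytes; with a linked .so and an
# FFI binding, the buffer (or a pointer to it) could be passed directly.
```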
I draw a distinction between abstraction and composition, which is also in the article. It’s not a hard distinction, but I’d say:
Composition means putting parts together to form a working system. Does the result work? Is it correct? Is it fast and secure? (Composition does feel more “horizontal”)
Abstraction means hiding details. Abstracting over Windows and Unix is something that I think is often accidental complexity, or at least a big tradeoff. It saves time for the developer, but it can be a loss to the end user. (Abstraction does feel more “vertical” – and fragile when you get too high)
This person, commenting on the same article, pointed out “shallow and composable” as properties of Unix, and I agree:
https://news.ycombinator.com/item?id=40885635
So I think shell composes, but it’s actually not very abstract. And this is a major reason I’ve been working on https://www.oilshell.org/
IME, shell gets a lot of work done effectively, without much weight, and is adaptable to new requirements. One person can write a shell script to solve a problem – you don’t have to assemble a big team, and justify its existence.
(Of course something that’s challenging is for that shell script to not become a mess over the long term, and I believe we’re doing something about that)
From the article:
This is generally my preference, but I would say “must” is not true … One thing I learned the hard way is that interoperability is basically anti-incentivized.
Long story, but I think the prevalence of YAML in the cloud is a “factoring” problem, though there’s actually a deeper economic issue at play.
That is, the people on one side of the YAML write code and algorithms, and the people on the other “configure” those lego blocks that don’t actually fit together.
YAML arguably abstracts (it hides details behind an interface)
But it doesn’t compose (when you put things together, they don’t have the properties you want) …
Similar to this comment - https://lobste.rs/s/saqp6t/comments_on_scripting_cgi_fastcgi#c_28yzy4
abstracting over the OS always feels weird to me, when one of the main purposes of an OS is to abstract over hardware
abstracting over hardware makes sense, because we keep getting better at making hardware, we have different tradeoffs, etc.
but with OSs, it mostly seems like a coordination problem. sometimes an intentional one, because the organizations involved were trying to build a moat
Yes, exactly!!
The OS already abstracts over hardware, and then we are piling more abstractions on top of OSes.
Ones that leak – in terms of performance, security, or just making the application behave poorly.
Electron is basically that – it lets you ship faster, but that’s about it
The “tower” or “stack” is often not a good way of building software.
And the funny thing is that OSes are converging, with Windows gaining a Linux compatibility layer in ~2016 (WSL), an actual Linux kernel with WSL 2, and then a Unix-style terminal some time later!
I guess to argue the other side, Unix was never good at GUIs… so it’s not like Macs or Windows were superfluous or anything. But the most basic layer is still in flux, and it is converging on “Unix”, even in 2016 and 2024…
(running Docker containers seems to require some sort of Linux x86-64 syscall ABI too)
As a thought experiment, I’d say if we knew how to perfectly abstract, we’d be able to write multi-platform GUIs that work perfectly on all targeted platforms.
But I think anyone who works in that area (I don’t) will tell you that it’s a big compromise. You can write something better if you start OS X only, or Windows only.
I think Flutter is something that abstracts over Android-iPhone, and there are many others.
And of course there were many attempts at Windows / OS X abstraction (Qt, etc.), but what seems to have happened is that desktop GUIs just got uniformly WORSE since those attempts were made.
Is an Electron app better than a Qt app?
Rust is famously “not GUI yet”, and you can argue that if it had some yet-unknown great powers of abstraction, then it would be.
So you could say it’s an unsolved problem to have “zero-cost abstraction” in that respect (!)
(And yes this is a pun – the cost I’m talking about is in the behavior of the app, not the performance)
To summarize, I think many things are better than they were 20-30 years ago, but many things are worse. Latency is another one - https://danluu.com/input-lag/
Composing software from parts and maintaining latency is another unsolved problem.
Is this true? To echo the article “against innovation tokens”, plenty of shiny new software exists to reduce the operational complexity of managing the kinds of online computer systems that back SaaS.
Adding ladders to a mountain doesn’t change its height.
This also applies to all kinds of software written, not just software as a service.
Here’s to the next twelve years! Very happy to be a part of this community.
This is a spam article. Does not belong on this site.
I’m intentionally working on improving my mental health because things are a little emotionally painful for me.
I hope you get better, and if you feel like talking about it you’re welcome to send me a message.
Thanks a lot. It really is nice to have such kindness in our community.
Sending hugs your way. I’m rooting for you, and my DMs are open on all platforms.
It’d help if there were some screenshots of the sample’s output on a C64, emulator or otherwise.
I don’t know why this links to a news site instead of the repository it’s talking about.
There is some context (e.g. reactions and inspirations) that is absent from the README.md, so my idea was to share an accessible text. I’m not affiliated with thenewstack, but I know Oleksandr and have discussed the language with him, so I may be a bit biased here.
Experimenting with and fostering interest in Modal.
Continuing work on Feather.
My current project stems from a significant disdain for most programming languages.
I’ve been working on an open-ended space game named Nebula, starting and re-starting work several times since 2011. In 2016, fed up with existing programming languages and ecosystems, I aspired to build my own. It’s now been 8 years and a minimum of 18 different languages later.
The upside is that this nearly decade-long yak shave may finally be coming to an end. Maybe I’ll get back to the stars before 2030. I wouldn’t trade this journey for anything, though. I wouldn’t be the programmer I am without going through it.
I too have a space game started in 2011 that I never completed, but which taught me so much that regular work projects never would.