It’s not always the fault of the programmer, and I do wish ops would just think sometimes.
“Hey, there’s something wrong! Your code has a bug!”
“Hey, my code has been running in production for four months without incident and without any changes. Something else must be at fault.”
“No, it’s your code.”
Four days later. “No, it’s not. Here’s the proof.”
“Oh. Yeah. We did make a small change the other day … “
I hit one the other day. It wasn’t technically a bug, but it was something that violated the Principle of Least Surprise so badly, I’d argue that it should be a bug.
(I don’t want to name names, but basically, there was a data structure called a “byte” that, by design, could only store numbers greater than 127 and less than 256, to enforce that you use the data structure called “char” for ASCII codepoints…It was documented, but I mean…c’mon.)
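(The commenter doesn’t name the library, so here’s a purely hypothetical sketch in Go of what such a cursed type might look like: a “byte” whose constructor rejects anything that isn’t strictly between 127 and 256. The names `HighByte` and `NewHighByte` are my inventions, not the actual API.)

```go
package main

import (
	"errors"
	"fmt"
)

// HighByte is a hypothetical reconstruction of the type described above:
// by design it only stores values greater than 127 and less than 256,
// forcing you toward a separate "char" type for ASCII codepoints.
type HighByte uint8

var ErrOutOfRange = errors.New("value must be in 128..255; use char for ASCII codepoints")

// NewHighByte enforces the surprising range restriction at construction time.
func NewHighByte(v int) (HighByte, error) {
	if v < 128 || v > 255 {
		return 0, ErrOutOfRange
	}
	return HighByte(v), nil
}

func main() {
	// 'A' (65) is an ASCII codepoint, so the constructor refuses it.
	if _, err := NewHighByte(65); err != nil {
		fmt.Println("rejected:", err)
	}
	b, _ := NewHighByte(200)
	fmt.Println("accepted:", b)
}
```

Documented or not, a numeric type named “byte” that can’t hold 65 is exactly the kind of thing the Principle of Least Surprise exists to forbid.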
I’m not even sure what to call that cursed type.
I’d call it a ßÿ┼ë because at least you could represent that entirely in characters above 127 in the old IBM extended ASCII set (code page 437) ;)
The appearance is sufficiently cursed to give people a warning at least :)
Edited: … although the only DOS compiler I have to hand, Borland Turbo C, really does not like identifiers beginning with upper-ASCII characters. I was a C programmer more than two decades ago and couldn’t remember whether any of this was specified in the standard; it wasn’t as of C89: character sets were implementation-defined.
This is nice advice for a novice programmer. However, as a seasoned programmer, you start finding these bugs all over the place. Bugs in libraries, in the compiler, in distro binaries, in the operating system, even in the CPU. Of course, stay honest and humble. Debug your own code first. Extraordinary claims require extraordinary evidence. But never assume that the foundation upon which you build is perfect.
This year, I found a bug in the ps binary distributed by BlackBerry QNX. That took a while to find and understand. You can read my write-up here, hopefully you are entertained: https://mental-reverb.com/blog.php?id=29
Except when it isn’t.
I wasted half a day because I knew my colleague wouldn’t check in a broken unit test… and even if he did, the CI build would fail…
Except he had broken that unit test, and the CI build had recently been changed not to build that product variant…
“Oh, bother!” said Pooh.
My corollary to this would be: keep your tech stack small and well curated to let you prove or disprove the rule faster.
I like writing little web services in Go. I am not convinced Go is the best language for everything, but having a robust and well understood stdlib means I don’t have to go find dozens of external packages to vet, understand, and glue together. When I hit a bug I typically don’t have to go looking very far.
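(To illustrate the point: a little web service in Go genuinely needs nothing beyond the stdlib. A minimal sketch, with the `/hello` route and `greeting` helper being my own invented names:)

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

// greeting builds the response body; kept as a plain function so it's
// trivially testable without spinning up a server.
func greeting(name string) string {
	return fmt.Sprintf("hello, %s", name)
}

// helloHandler serves GET /hello?name=... using only net/http.
func helloHandler(w http.ResponseWriter, r *http.Request) {
	fmt.Fprint(w, greeting(r.URL.Query().Get("name")))
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/hello", helloHandler)
	// ListenAndServe blocks until the server stops or fails.
	log.Fatal(http.ListenAndServe(":8080", mux))
}
```

No router package, no framework, nothing external to vet: when something misbehaves, the whole surface area is code you (or the Go team) wrote.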
I’ve run into and reported quite a few compiler bugs in my career. Fewer and fewer over time, though, which seems like a good trend.
Framework bugs are not uncommon either. Especially in GUI frameworks, but I can recall finding a bunch of edge cases and missing features in HTTP libraries.
But I don’t get to blame them until I’ve ruled out my own mistakes. It’s my fault until I prove otherwise.
If you go poking around the dark corners of Linux syscalls, you’ll find that a lot of surprising behavior isn’t officially documented at all, and you’ll have to trawl through LKML threads and read source code to figure out WTF is going on. The closer you program to the OS (or the hardware), the more likely it is that it’s not your fault.
I remember being told at university that it was highly unlikely that the bug was in the compiler. I have probably now spent more time fixing bugs in various compilers than I spent in total attending lectures as an undergrad. The problem with the ‘it’s always your fault’ idea is not that it’s false, it’s that it remains true when you are working on a compiler, standard library, or kernel.
There are no ‘faults’, there are just unintended consequences.
Yes, it can go all the way to sinking a ship.
But you know what the ‘real’ first rule of programming is?
Your application is only as great as the User.
Yes, capital U, User.
It doesn’t matter what your software does, or how it does it, if the User is doing something with it: You are a successful programmer.
This shouldn’t be used to excuse crap code. But .. just sometimes .. or actually, more often than one might think .. the best thing for the User was for you to put away the genius, and just make it work: for them ..
They’ll tell you, then you’ll know it again and again, if you’re doing things right: the first rule of programming is that it is a social service. Where’s the User?