I wouldn’t want to work like this again, but I’m glad I have worked like this.
I started a few years later (on mid-1980s PCs) so I happily and/or sadly missed out on program sheets, but I did get the privilege of working in the DOS-style “open editor against source code, make changes, save, exit editor, run compiler, run the executable [if you were lucky and it compiled], repeat” workflow, where every one of those steps took at least a few seconds.
Like the author, I’m glad I did; but I’ve also found that it has permanently infected my brain with some bad assumptions. Code changes and refactorings that are trivial still sometimes seem like they’ll be more of a chore than they actually are, so I probably undertake them less often than I should. And of course the sheer speed of today’s machines is always a shock: while doing my first 3D graphics programming recently, I’ve been constantly nagged by the thought “that’s going to take several million multiplications, there’s no way we’ll be able to do those, much less 60 times per second!”, when of course it takes that many operations just to put nearly anything on the screen these days, no matter how trivial.
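To put that “several million multiplications” intuition in perspective, here’s a back-of-the-envelope sketch. The scene size and frame rate below are illustrative assumptions, not numbers from the comment; the per-vertex cost is just the 16 multiplications in one 4×4 matrix times 4-vector transform:

```python
# Rough count of multiplications needed just to transform vertices each
# frame in a simple 3D pipeline. All workload numbers are assumptions.
vertices = 1_000_000       # a modest modern scene (assumed)
mults_per_vertex = 16      # one 4x4 matrix * 4-vector = 16 multiplications
fps = 60

per_frame = vertices * mults_per_vertex   # multiplications per frame
per_second = per_frame * fps              # multiplications per second
print(f"{per_frame:,} per frame, {per_second:,} per second")
```

That comes to 16 million multiplications per frame, nearly a billion per second, for vertex transforms alone; a GPU does this without breaking a sweat, which is exactly the shock the comment describes.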
… and getting started in programmable logic … what a time!
More nostalgia: https://archive.org/details/80-microcomputing-magazine-1983-03
It’s my birthday, so I can be forgiven: http://brannockdevice.blogspot.com/2010/02/dear-young-geek-i-am-you-in-30-years.html