Do programmers really not care about time complexity anymore? Obviously there are lots of instances where it doesn’t matter, but it still seems like a useful concept to understand.
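As a concrete illustration of why the concept still earns its keep (a toy sketch of my own, not something from the thread): two functions that compute the same answer, where one scales quadratically and the other linearly, purely because of the data structure chosen.

```python
def count_common_slow(a, b):
    # O(len(a) * len(b)): `x in b` scans the list b once per element of a
    return sum(1 for x in a if x in b)

def count_common_fast(a, b):
    # O(len(a) + len(b)): build a hash set once, then average O(1) lookups
    b_set = set(b)
    return sum(1 for x in a if x in b_set)

a = list(range(0, 1000, 2))  # even numbers
b = list(range(0, 1000, 3))  # multiples of 3

# Identical results; wildly different scaling as the inputs grow.
print(count_common_slow(a, b) == count_common_fast(a, b))
```

On small inputs the difference is invisible, which is exactly why it bites later: the slow version works fine in testing and then falls over in production when the lists grow.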
That’s usually the point where I see the become-a-developer-in-two-weeks folks fall down. And why I worry very little about becoming unemployable. :)
I think it depends greatly on your domain.
In process engineering, for example, we no longer have to care about it. The embedded processors we have now are so powerful they can take even the most poorly coded set of calculations and whip through them without lagging, whereas back in the 80s and 90s this was a very real concern. And the overall complexity of our applications is decreasing due to “smart” instrumentation that handles a lot of the annoying calculations for us.
I have no idea what sort of emphasis this gets in modern CS courses, as I’ve been out of college for several decades now.
As I’ve mentioned before, my last (hobby) project was on a 6502 processor, so time was not just on my mind but, to partially quote a Star Trek movie, “the fire in which I burned.”
This is great, I’ve looked for similar introduction-level information about databases before, and always come up short.