1. 55

  2. 8

    I wonder what some of this 50+ year old COBOL code actually looks like?

    Does anyone on Lobsters have experience with (oldish) production COBOL? Would love to hear some war stories.

    1. 2

      Very very long form. Mostly “clean”.

    2. 6

      This is a thoughtful article, nice counterpoint to the general bellyaching that erupted when the call for COBOL support went out a few weeks ago.

      1. 6

        As someone who programmed in RM/COBOL on UNIX, I don’t agree that “to truly develop COBOL, you need a mainframe.” But if you do want a mainframe, download Hercules.

        Also, Grace Hopper did not create COBOL… but anyway.

        1. 3

          What makes you say she didn’t create COBOL? I would love to hear your viewpoint on this.

          1. 10

            She was a pioneer in language development, and thanks to her work on FLOW-MATIC we have COBOL (and high-level languages generally), but we have to realise COBOL was a multi-team, multi-company effort. COBOL, by the way, is a good example of a successful language designed by a committee.

            Also, she was not part of the CODASYL executive committee, but an advisor to it. Nor was she one of the six main designers, two of whom were women: Jean Sammet and Gertrude Tierney.

            So, was she influential? Yes, incredibly so… not only on COBOL but on the development of high-level programming languages. But did she create COBOL? Saying so is not only inaccurate, but a disservice to the multiple teams that worked to create it.

            There’s a wonderful paper on this matter by Jean Sammet, “The Early History of COBOL”, that I highly recommend.

            1. 2

              https://en.wikipedia.org/wiki/COBOL

              I guess it’s because COBOL is based on a language designed by Grace Hopper?

          2. 2

            During Y2K, I heard some Fortran programmers came out of retirement. The author is saying it’s COBOL this time around. Which programmers will be called out of retirement a few decades from now?

            1. 13

              Perl: there is a lot of it out there, but the pool of people willing to work on it is shrinking.

              1. 3

                Having written quite a bit of Perl and knowing some of the systems written in it, I think it will be deprecated or replaced before I reach retirement, whether because of changes in the business environment, upgrades to the underlying products, changes in process, or simply a risk analysis. That some of that stuff dates from the late ’90s and early 2000s and is still going is impressive (or disappointing, depending on your POV).

                1. 2

                  Yeah, this. There was a lot of Perl written over an eight-year or so stretch, and a lot of it is in places we wouldn’t expect.

                  1. 2

                    There will always be money for people who can suspend their sense of smell!

                    1. 9

                      I’ll take clean Perl code over dirty Java or C# any day of the week.

                      1. 2

                        Sure, but how many companies trying to lure Perl programmers out of retirement are going to be in possession of a clean codebase?

                        1. 1

                          I think if you’re luring programmers out of retirement to work on something, the odds of it being clean are fairly minimal, regardless of language.

                          1. 2

                            Absolutely. That’s what I meant. I didn’t mean to imply anything specific about Perl.

                  2. 7

                    Ruby, I think

                    1. 1

                      Really? I would think that there’s a lot more legacy business-logic Java code in production than Ruby. Of course, I don’t have any hard data to back up my intuition.

                      1. 5

                        Yeah, but I see Java sticking around, like C or C++ have. Ruby is more niche than Java, but still gets used by a lot of companies. There is a possible future where it’s a legacy skill in 30 years.

                        Though, thinking about this more, I suspect VBA, FOXPRO or MUMPS are probably better candidates.

                        (And yes, I see the irony of posting this on a forum powered by Rails)

                        1. 2

                          TBH, the best way to revive Ruby (which IMO has already been done in Crystal, but never mind) would be to actually kill Rails.

                          It’s the complete opposite of what Ruby is: it denies Ruby’s values and throws many of its nice aspects out the window. And meanwhile, most people who don’t know Ruby at all look at it through the RoR lens and think it’s a huge mess.

                          And, in the meantime, Sinatra sits there like nothing ever happened.

                          1. 3

                            Sinatra at least influenced Go and Node.js/Express a bit.

                            Doesn’t Crystal have exponentially growing compile times due to all the type inference? At least, I’ve heard that can be a problem.

                            I don’t see Ruby truly dying any time soon, just becoming progressively more niche over time, kinda like Perl has.

                        2. 4

                          I think the difference is that Java will be alive and well in 20 years … but Ruby? Not sure.

                          • Programming by non-programmers is now largely done in Python (scientists) or Go (infrastructure).
                          • Other languages have closed the productivity gap in the web space; NodeJS also exists.
                          • Learning to program now often happens in JavaScript, thanks to browsers shipping it.
                      2. 3

                        Perl maybe?

                        1. 1

                          I see Perl getting there in the next 10 years, not the next 20-40, if it is going to get there at all.

                      3. 2

                        I’m gonna repeat this again and again – COBOL does not deserve the criticism it gets all the time.

                        I’ve looked into its specs/docs and some random code… If you hadn’t told me it was made in the ’60s, I might have guessed the ’90s or ’00s. Only the line markers give away its age, and I’m quite sure they’re not required in modern COBOL variants (yes, the language does seem to be continually updated).

                        It’s weird, yes. It’s a completely alternative universe of IT, for sure. But it seems… done right? I mean, someone sat down at the table and designed the language from top to bottom to do one job: Big Enterprise Processing at Scale. And it was already working well in the 1960s!

                        It’s readable: you can sit your accountant down at a 3270 terminal (or its emulator, or even just a text editor) and there’s a good chance they’ll understand what’s happening there with very little background in IT theory. In some cases, they might understand the process better than the average JavaScript programmer sitting in a Starbucks with a shiny new MacBook.

                        So, no, it’s not something you’d run as your first-line web backend or write a mobile app in. I’m pretty sure both can be done in COBOL, though. At least that web framework thingy was done (and its author is now the leader of a far-left political party in Poland – how weird). But you’re not gonna do anything like that seriously… okay?

                        On the other hand, if I had a huge list of customers (in the hundreds of thousands, or even millions) who wanted to buy something from me, or even just store and access their info, I would actually look into a COBOL+DB2+CICS stack instead of NodeJS+NoSQL+Redis+K8s and millions of tiny containers, even though I only know a little of it now. At least I’d be sure my business and its code are “safe” for the next 20-30 years.

                        1. 2

                          If you’re a developer in 2020, you could learn a lot from this course. You can learn a few principles (that I made up myself) that seem to be prevalent in the world of COBOL:

                          I think the author is saying these principles should be followed in all or most projects, though perhaps they merely mean these are principles you could learn. I think most projects will actually not benefit from them.

                          • Preserve your resources - memory, disk space, and CPU cycles are not free. Conserve them as much as possible with what you build.

                          Why? Spending time on performance is by definition unnecessary unless there’s a problem with performance, and most of the time there isn’t. You don’t even need to get the algorithmic complexity right most of the time. Performance optimisation is fun, and in some cases it’s crucial, but why should it be mandatory for all or most projects?
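                          To make that concrete, here’s a throwaway sketch of mine (not from the article): measure first, with Python’s `timeit`, before deciding an “asymptotically wrong” lookup is worth fixing.

                          ```python
                          import timeit

                          # A linear scan of a tiny list vs. a set lookup: O(n) vs. O(1),
                          # yet at n = 10 both finish so fast it rarely matters in practice.
                          small_list = list(range(10))
                          small_set = set(small_list)

                          t_list = timeit.timeit(lambda: 5 in small_list, number=100_000)
                          t_set = timeit.timeit(lambda: 5 in small_set, number=100_000)

                          # Both timings are tiny fractions of a second; only a profile showing
                          # this lookup on a hot path would justify rewriting anything.
                          print(f"list: {t_list:.4f}s  set: {t_set:.4f}s")
                          ```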

                          • Be explicit in everything you do - Take the time to figure out what you need and then declare it. Think arrays and structs instead of lists and generics. Performance comes from knowing exactly what you need and when.

                          I think being explicit is great, but I don’t think lists or generics hurt that particularly.
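                          For what it’s worth, the “arrays and structs” idea can be sketched in Python too (a hypothetical example of mine, not from the article): declare a fixed record layout up front, roughly the way a COBOL data division would, instead of reaching for an ad-hoc dict.

                          ```python
                          import struct

                          # Explicit fixed-layout record: a 20-byte name field (like PIC X(20))
                          # plus a 32-bit balance in cents. The record size is known before
                          # any data exists.
                          CUSTOMER = struct.Struct("<20sI")

                          def pack_customer(name: str, balance_cents: int) -> bytes:
                              # Pad the name to its declared width, as fixed-format records require.
                              return CUSTOMER.pack(name.encode("ascii").ljust(20), balance_cents)

                          def unpack_customer(raw: bytes):
                              name, balance = CUSTOMER.unpack(raw)
                              return name.rstrip(b" ").decode("ascii"), balance

                          rec = pack_customer("ALICE", 12345)
                          print(CUSTOMER.size, unpack_customer(rec))  # every record is exactly 24 bytes
                          ```

                          Whether that buys you anything over a list of dicts depends entirely on the workload, which I think is the point being made above.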

                          • Write code as if it will live for decades - For COBOL programmers, it’s true. Think ahead and act as if your code will live on for years. How should you write it so it can be maintained further down the line?

                          Most code won’t live for decades and (in the only study I’ve seen on this – I can’t find the citation, but it looked at the git histories of some large projects) will only be edited by a fairly small number of people for a short amount of time, even within long-lived projects.

                          I like writing elegant, maintainable code, but it’s definitely unclear that it’s actually objectively a worthwhile endeavour. I wouldn’t prescribe to others that they must or should write nicer code. In fact, many of my scientific peers write little scripts that are essentially write-once, read-never because it’s the outputs that are interesting. That’s totally fine when you’re exploring an idea (though of course the reproducibility code for the final write-up should be made understandable to others).

                          • Avoid breaking changes - Modern developers love reinventing the wheel. We also like introducing breaking changes and telling people to upgrade if they don’t like it. COBOL takes the approach to only break things when there is no other option. We need a little more of that in 2020.

                          This is another unjustified value judgement. Taking backwards compatibility seriously can be a huge amount of work that, empirically, you just don’t need to do to make useful software. I don’t think it’s true that modern programmers love reinventing wheels more than the programmers of yesteryear, either. If anything, I imagine reinventing things was more common when package management was harder and open-source software hadn’t yet taken off.

                          1. 2

                            Not caring about performance is one of those decisions that is locally optimal for a single developer or a small team but globally sub-optimal for the rest of society. In a very real way, it’s a selfish attitude.

                            1. 2

                              That’s just not true and you should think about it more carefully.

                              There’s loads of code that’s just never on the critical path or that is not run often enough to justify the expense of performance optimisation.

                              The value judgement is also unsupportable. Why should I produce efficient code when doing so has no value for me? It may cost me time and happiness and make my code uglier. And, from a utilitarian perspective, reducing the number of watts my code uses is far from the most impactful political action I can take.

                              If someone depends on my code and needs different performance properties they can improve it or ask me nicely. It’s entitlement to assume that others should sacrifice for your benefit.

                              There are obviously ways that we can move so that we can have generally higher performance code without incurring costs, and I support those (and use Julia, a high performance language), but that’s different from your statement.

                              1. 3

                                One word: multitasking environments.

                                If you have – for example – a desktop machine with some amount of resources, you don’t want that shiny Electron app to hog all of them just to write text to a file or process some data that doesn’t need that much memory and that many cycles, right?

                                As a developer, you should understand that you are not alone on the target machine, even in server environments, so you should always try to limit your footprint as much as possible and let other processes use the remaining resources, increasing the capacity of the machine.

                                That’s a bit like savoir-vivre, or just good manners, in software development, right?

                                1. 1

                                  The existence of situations where writing higher performance code is useful does not imply that it is appropriate to always “conserve [resources] as much as possible with what you build”. This should be obvious.

                                  1. 3

                                    I am struggling to come up with a situation where code is not running in a multitasking environment. Well, I can, but they all have even more constraints on CPU/memory/IO than code running in a multitasking environment. The chance that code you or I write has no societally good reason to conserve resources is vanishingly small.

                                    1. 1

                                      If costs are equal, then it’s obviously better to use fewer resources. But often it is expensive to make code more efficient. In any kind of rational assessment you must assess both the benefits and costs of an action.

                                      1. 1

                                        I think we are actually perhaps in violent agreement here. I was making the case for caring about performance at the whole-program level. After re-reading your responses, it sounds like you are focusing more on the function or line-of-code level. I do think that if your code is not on the critical path and is almost never run, then by definition it is not a performance concern.

                                        I agree that performance should be considered holistically in context. The point I was trying to make was that the holistic context is the entire machine the code is running on. If your program hogs resources because the developer decided that the performance of the app was “good enough” that is when I think the decision was selfish.

                                        1. 1

                                          My point is that software is used in a huge variety of situations and often performance is genuinely pretty unimportant. So my comments aren’t just about individual functions but about whole libraries and systems.

                                          Lots of programming contexts do not care about run time or memory use particularly. E.g. lots of scientific programming, batch business logic programs, etc. Yes, this code is sometimes or often run on hardware with UIs that it slows down, but that rarely matters much.

                                          It would be nice if this code ran faster, but it’s not very important. There will usually be more valuable things to do (and often the authors don’t have the specialised skills to make it perform better, anyway).

                                2. 2

                                  Yesterday I was working on my top-of-the-line laptop and various applications were being slow. There was no good reason for them to be slow, or for so much of my CPU to be consumed, but the sheer number of my applications contributed to the death by a thousand cuts of my machine’s performance. I wasn’t compiling anything, I wasn’t watching a video; I was just browsing the internet, and I don’t even browse a lot of heavy sites. I’m working on the equivalent of a supercomputer and I still can’t get a high-performing experience.

                                  I’m not even talking just about the number of watts your code might cause my computer to consume. I’m talking purely about how any code you produce for others to use has an impact on the entire ecosystem of code I am running.

                                  It’s very much in the category of globally sub-optimal for society even if it’s locally optimal for you. I’m not even making a value judgement: it very much is in your interest sometimes to sacrifice CPU/memory/IO in your application in the name of shipping sooner or solving the right problem at the right time.

                                  But it doesn’t change the fact that I have a million cuts to performance sitting on my laptop right now, and sometimes my computer suffers as a result. The blame doesn’t lie with any single app; it lies with all of them and the various ways they all made similarly locally optimal decisions. In a way it’s a collective action problem: improving things is only worth it to an individual or company if all the others do the same, since changing your ways when everyone else won’t has no effect – your code may still be perceived as slow simply because it’s in the same ecosystem as everyone else’s.

                                  1. 1

                                    None of that addresses this:

                                    There’s loads of code that’s just never on the critical path or that is not run often enough to justify the expense of performance optimisation.

                                    Yes, lots of modern UIs are high-latency memory-hogs, but that doesn’t invalidate my argument at all.

                            2. 2

                              There isn’t much “loosey goosey” programming happening in COBOL

                              What does this mean? Can you provide some kind of comparison to other languages? How does it relate to explicitness?