1. 39
  2. 24

    I was a volunteer librarian in my compsci department’s library, and I fondly remember a day we did inventory and realized an entire shelf was dedicated to UML textbooks, all outdated/never checked out. I marked all of them as obsolete and moved them to the basement section.

    Has UML or even software ontology in general provided more value than the tons of paper waste it has generated? I wondered.

    1. 17

      It’s nice to be able to sketch a DB schema on a whiteboard with the rest of the team, using the standard arrow types.
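
      To make the arrows concrete, here’s a minimal sketch (my own toy example, using Python’s stdlib sqlite3): the “one author, many books” arrow on the whiteboard is just a foreign key.

      ```python
      # Minimal sketch (toy example): the whiteboard arrow between author
      # and book corresponds to a foreign key from book to author.
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE author (id INTEGER PRIMARY KEY, name TEXT);
      CREATE TABLE book (
          id INTEGER PRIMARY KEY,
          title TEXT,
          author_id INTEGER REFERENCES author(id)  -- the arrow, as a constraint
      );
      """)
      conn.close()
      ```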

      Apart from that, not so much.

      1. 2

        I’ve found sequence diagrams useful for visualizing race conditions as well. But I probably don’t use the exact “official” notation.
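
        To make that concrete, here’s a minimal sketch (my own example, no particular notation) of the kind of interleaving a sequence diagram makes visible:

        ```python
        # Minimal sketch: two threads do an unlocked read-modify-write on a
        # shared counter. Drawing the two threads as lifelines with their
        # read/write messages interleaved shows exactly where updates get lost.
        import threading

        counter = 0

        def increment(n: int) -> None:
            global counter
            for _ in range(n):
                tmp = counter       # read
                counter = tmp + 1   # write: the other thread may have written in between

        threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(2)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()

        print(counter)  # frequently less than 200000: lost updates from the race
        ```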

        1. 1

          I prefer state tables. Hopefully I’ll eventually learn to use some proper proof assistant.
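
          For what it’s worth, here’s a minimal sketch of what I mean by a state table (my own toy example): the transitions are plain data, so they’re easy to review and to test exhaustively.

          ```python
          # Minimal sketch (toy example): a state table as plain data.
          # (state, event) -> next state; anything unlisted is an error.
          TRANSITIONS = {
              ("idle", "acquire"): "held",
              ("held", "release"): "idle",
              ("held", "acquire"): "error",  # double acquire is made explicit
          }

          def step(state: str, event: str) -> str:
              return TRANSITIONS.get((state, event), "error")

          assert step("idle", "acquire") == "held"
          assert step("held", "acquire") == "error"
          ```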

      2. 1

        Can you share the location of your CS department and the timeframe? I saw the country on your GitHub profile, but wanted to confirm. Extrapolating from that data point: I do know that significant numbers of students were required to learn/use/research UML plus adjacent topics across UK universities from the mid-90s until the late 2000s. This included courses from the huge contingent of Greek academics in the UK. It was basically a requirement from 2000-2004 at some point in many degrees because of the demands of the large employers combined with the response times for universities adjusting to industry requirements.

        This is orthogonal to the usefulness of UML (or any other tech). There are a lot of useless things that academia and industry pursue. When UML was hot, anyone close to real-world development was starting to see the value of agile as originally envisioned, and of tools like Rails for web development, which was a growing area. I realise there are many other domains, but the point is to illustrate that some people follow these boondoggles while others, with different motivations, are more pragmatic. It’s harder to see this if you haven’t lived through the time before, during, and after the rise of a thing.

        1. 2

          > It was basically a requirement from 2000-2004 at some point in many degrees because of the demands of the large employers combined with the response times for universities adjusting to industry requirements.

          Not quite. It was a requirement for BCS accreditation. The BCS is a weird organisation. Industry doesn’t have much input into its requirements and neither does academia, so it ends up requiring things that neither industry nor universities want. Cambridge decided to drop BCS accreditation a few years ago after polling the companies that hired graduates and hearing that none of them cared about it (I’m not sure if the department is still accredited but the policy was that they would not change any course material to conform to the BCS’ weird view of what a computer science degree should contain).

          As I recall, the biggest problem was that there were great tools for taking code and generating UML diagrams from it, and far-less-good tools for going the other way (the fastest way of creating a load of the diagrams was to write some stub code and then tell the tooling to extract UML). When the specification describes, rather than defines, the implementation, it ceases to be a useful design tool.
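
          As a small illustration of why code -> diagram is the easy direction (a sketch using Python’s stdlib ast module, not any of the tools from back then): the inheritance edges of a class diagram are sitting right there in the parse tree.

          ```python
          # Minimal sketch: pull class-diagram edges out of source with the
          # stdlib ast module (Python 3.9+ for ast.unparse). This is the
          # "describing" direction: the code stays authoritative and the
          # diagram is derived from it.
          import ast
          import textwrap

          SOURCE = textwrap.dedent("""
              class Animal: ...
              class Dog(Animal): ...
              class Cat(Animal): ...
          """)

          tree = ast.parse(SOURCE)
          for node in ast.walk(tree):
              if isinstance(node, ast.ClassDef):
                  bases = [ast.unparse(base) for base in node.bases]
                  print(node.name, "->", bases)  # e.g. Dog -> ['Animal']
          ```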

          1. 1

            You are right about the BCS being weird and accreditation being a driver. Cambridge, Oxford, Imperial, and a few others are relative outliers with the autonomy to operate in the way you describe. The bulk of students are attending relatively average universities which are forced to seek accreditation for various reasons, so this pattern just repeats.

            I hadn’t thought about UML in a long time before this post. I recall some SystemC researchers and graduate students having the most rounded-out workflows I’ve seen, but everywhere else it looks like what you described.

            1. 1

              > You are right about the BCS being weird and accreditation being a driver. Cambridge, Oxford, Imperial, and a few others are relative outliers with the autonomy to operate in the way you describe.

              I suspect that most others are too. My undergrad and PhD were both from Swansea. The undergrad degree was BCS accredited and had all of this pointless nonsense in it. Since I left, no employer has ever asked if it was BCS accredited, most don’t even know who the BCS is, and none has considered any of the stuff that I learned in the BCS-required modules to be valuable (at least one has considered it to be actively harmful). I strongly suspect that if a second-tier university dropped BCS accreditation then there would be no consequences. I was hoping that Cambridge’s decision to drop it would make it easier for others to do so.

      3. 18

        Sequence diagrams are technically part of UML and are still used all over the place.

        There’s probably a great essay to be written about why rigorous diagrams are so successful in electrical engineering, and so nearly useless in software.

        1. 10

          IMO it’s because in electrical (and mechanical) engineering, things have to fit together physically in 3 dimensional space, so a lot of your diagrams are a simplification of that layout. In software, anything can touch anything else.

          1. 8

            Also, a lot of people in electrical engineering use Verilog, because 2D diagrams don’t scale all that well.

            1. 1

              Good point. They used to hand-draw in the 60s with wire wrap, when the designs were simpler. In IC design, I think Verilog eventually becomes a mask (or multiple masks)? It’s the authoritative design, not a perspective or thinking tool? I don’t know if software has an equivalent yet.

              1. 1

                That’s true in ASIC design. But in controls engineering most of it is still done with diagrams (LabView, Realtime Workshop, PLC ladder logic, etc) and those can get horrifyingly complicated. (I’m not saying this is a good thing.)

          2. 12

            During my time as a software architect at an insurance company, circa 2017, I remember creating a whole lot of UML diagrams which were embedded in a big “architecture document”.

            The intended audience weren’t developers, who barely glanced at them (if at all), but rather the whole lineup of people whose job it was to “approve things”: security analysts, system architects, infrastructure specialists, conformity directors, etc. The situation was rather bad: the system architects were dead set on going to the cloud, but the infrastructure guys were scared of it. And of course, my project was the “test” for the cloud.

            I would spend half my day fussing over every comma in the text and the correct usage of such-and-such UML concept (as you really don’t want to derail an already political meeting because of a dumb mistake), and the other half presenting said document to the various stakeholders, where they would in turn fuss over every little detail.

            My deduction from this experience is that UML is alive and well in places where a heavy bureaucracy surrounds software development: government, banks, large corporations, etc. These places have the resources to pay “UML artists” who create diagrams for meeting notes and thick project documents so that stuff gets approved and the project can move along.

            So, the true UML use case? Procedure and politics.

            1. 3

              The vision of diagram → code is fundamentally unworkable, but people keep making the same mistakes with API specifications. It’s a pity they didn’t work harder on code → diagram. There are some dragons there with layout, but it’s fundamentally a better direction to go in.

              1. 7

                I don’t think it’s fundamentally unworkable! Unreal has the Blueprint visual scripting language and DaVinci Resolve has the Fusion node language. AFAICT both seem pretty useful to people. The key is that both are diagrammatic DSLs for specific domains, as opposed to general-purpose programming languages.

                1. 4

                  It’s a Systems Engineering concept that the diagrams are a “view” of the system. You create the view to illustrate visually some aspect of the system. You have to omit a lot of unnecessary detail to make a useful diagram. This is part of the reason code -> diagram isn’t as widely used. It’s easy to generate a “complete” view this way, but the resulting view isn’t particularly useful.

                  1. 1

                    Funny you mention dragons: there is DRAKON [1], which normalizes layout to triviality.

                    [1] https://en.wikipedia.org/wiki/DRAKON

                  2. 3

                    One of the best uses for UML was using it the other way around: generating it from code to get an overview.

                    1. 1

                      Yeah, that makes sense, but I think no one I ever talked to over a UML diagram actually remembered the exact notation of the arrows. It’s a good idea in theory, but the relationships are much more important than whether the arrow is -> or ->>.

                      I had to learn all of it for my exams when finishing university, but I’d say that besides generating them from code I’ve only used them maybe a dozen times since then, and that was about 12 years ago as well :P

                      1. 1

                        We used it all the time, including those arrows 💁🏻‍♀️

                        That’s not to say it didn’t have weird priorities, showing stuff that was less important than what it omitted.

                    2. 2

                      You know, I was starting to think I imagined the push to build systems entirely by generating code based on UML diagrams. I remember going to an embedded systems conference in the early 2000s and UML was just everywhere and I could not for the life of me understand the hype. Even though I was so young and green while it was happening, it was so obvious to me that even if it could somehow be made to work, there couldn’t possibly be enough benefit to justify the effort. But no! It was real! They really believed it would work!