
  2. 16

    Reads like a collection of own goals underlining the need for open source.

    1. 17

      Every exposure I’ve ever had to FPGA design and synthesis tools and IP sets has left me deliriously happy not to be working in that space. A truly, deeply proprietary space, with tools that make Lotus Notes seem fun to deal with.

      1. 2

        I guess it’s because of how capital intensive developing hardware can be. Anyone with a computer can start writing and distributing code with negligible marginal cost, but to develop hardware you need money.

        One of the big incentives for Open Source software is interoperability/adoption (the open source solution should theoretically spread the fastest), but it’s more profitable to keep it closed source if people will adopt it anyway (no good alternatives).

        1. 2

          We’re already using GHDL, and as soon as any FPGA vendor opens up their bitstream format… we will push to jump to that one.

        2. 2

          And ASIC EDA tools aren’t any better. At $dayjob we’ve recently started working with one of the major vendors in the space (/^S.+s$/), and the general quality of the software & support thereof has been… profoundly underwhelming, especially considering the convoy of trucks loaded with cash we had to send them in exchange for the privilege of using it.

          Open source is by no means a silver bullet (there are plenty of examples of terrible pieces of it), but man… I guess I’m spoiled by the sorts of things I usually deal with – when I suddenly have to use some proprietary something for some reason, probably 80+% of the time the drop in general quality is pretty astonishing.

          1. 2

            I think one of the main points about open source is the ability to scratch itches and ease pain points.

            But if you have a vast monolith of proprietary crap… you’re stuck in a land of pain and digital eczema.

            1. 2

              > I think one of the main points about open source is the ability to scratch itches and ease pain points.

              Oh, certainly. I have on occasion though had situations where I’ve noticed such an itch, pulled back the curtain to scratch it, and been too horrified by what was going on beneath the UI surface to follow through and do so. (Like “I don’t want my name publicly associated with this codebase”.)

              But such instances have been rare, and having at least the ability to do so is a massive advantage.

              1. 1

                In quite a few instances in the Open Source world I have done no more than file a good bug report with a well-defined, small test case, and had the issue resolved the next day.

                In other instances I had to walk about in the debugger until I reached a “Huh? Wtf!” moment and then posted a query to the appropriate mailing list… and received a prompt “Yeah, that looks a bit odd… I think it should be…” reply which fixed it.

                In the closed source world my experience has been universally: ahhh, buy the next version, it might be fixed in that.

      2. 3

        I looked for an approachable article describing DRC (design rule check) violations in a way non-hardware people could follow, and found this with a quick Google. There were a lot of these rules even for older processes; I’ve seen slides saying 2500 or something like that for 28nm. (There’s a toy sketch of what a single rule check does at the end of this comment.)

        I don’t know how the analog people do it at that level. People think Ada’s restrictions are a straitjacket vs C or Forth. This goes from straitjacket to iron maiden.
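
        For the software folks here, a minimal sketch of what a single such rule check amounts to (the rule value, shapes, and units are all invented for illustration; a real deck has thousands of rules, most far subtler than spacing):

        ```python
        # Toy "minimum spacing" design rule check. Hypothetical rule and shapes;
        # a real DRC run applies thousands of geometric rules to a full layout.
        from itertools import combinations

        MIN_SPACING = 3  # made-up rule, in layout grid units

        def spacing(a, b):
            # Edge-to-edge distance between axis-aligned rectangles (x1, y1, x2, y2).
            dx = max(b[0] - a[2], a[0] - b[2], 0)
            dy = max(b[1] - a[3], a[1] - b[3], 0)
            return (dx * dx + dy * dy) ** 0.5

        shapes = [(0, 0, 10, 2), (12, 0, 20, 2), (11, 5, 14, 8)]  # invented metal shapes

        for a, b in combinations(shapes, 2):
            if spacing(a, b) < MIN_SPACING:
                print(f"DRC violation: {a} and {b} are closer than {MIN_SPACING} units")
        ```

        Now imagine a couple thousand of those, interacting, over billions of shapes.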

        1. 3

          In my experience in academia, the poor quality of EDA tools is just a fact of life that people live with. Everyone knows it’s a problem, but there’s not enough collective will to build better solutions, given the scale and domain specificity involved. Mainstream open source tools haven’t taken off because the existing tools are all industry-facing and no one wants to give up anything that could yield market share. Hopefully a burgeoning open-source hardware ecosystem and slowing die shrinks will help push for better tooling.

          1. 5

            No, I think it’s done on purpose by both industry and CompSci, due to each side’s incentives. The Big Three (Synopsys, Cadence, and Mentor) hold tons of patents and pre-existing tech. CompSci folks constantly devise new ways to handle problems in EDA; I saved a paper or two in each category just in case anyone wants to develop open-source tools in the future. Each one was solving an NP-hard problem, often optimizing multiple variables simultaneously (there’s a toy sketch at the end of this comment). They have to work within design rules best checked with proprietary tooling like Mentor’s. Testing whether the designs work requires fab runs, whose cost still adds up even with shuttle runs. You need all this stuff together for competitive designs.

            Let’s say people are willing to put in the work. They keep getting tens of millions in funding for shuttle runs to prove their designs. Then they start charging for the open tooling somehow. There will likely be aspects of their designs covered by the Big Three’s patents, and then the incumbents begin draining them with lawsuits. Due to for-profit incentives and legal risks, most people who develop improvements to the tech make them compatible with the Big Three’s workflows before eventually getting acquired by them.

            I think Magma was the last one I could find with enough tooling to be considered a fourth vendor. They also charged a lot less for some tooling. Synopsys acquired them in 2012. Then, on the analog side, there was Tanner, which had less-expensive tooling focused on that stuff; it was acquired by Mentor in recent years. You’re looking at absolute feature and patent dominance by three companies over a long period, with an enormous barrier to entry. Most new companies in EDA just focus on niches, solving one type of problem better than the Big Three, and possibly get acquired later, too.
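
            As a rough illustration of the kind of problem those papers tackle (a toy example of my own, not anything from the literature): even a stripped-down placement task, assigning cells to slots to minimize total wire length, is already a combinatorial search, and the production tools optimize timing, power, and congestion on top of it.

            ```python
            # Toy 1-D placement: brute-force the assignment of cells to slots
            # that minimizes total wire length. Names and nets are invented.
            from itertools import permutations

            cells = ["A", "B", "C", "D"]
            nets = [("A", "B"), ("B", "C"), ("A", "D")]  # made-up connectivity
            slots = [0, 1, 2, 3]  # 1-D positions, for simplicity

            def wirelength(placement):
                pos = dict(zip(cells, placement))
                return sum(abs(pos[u] - pos[v]) for u, v in nets)

            best = min(permutations(slots), key=wirelength)  # O(n!) search
            print(dict(zip(cells, best)), "-> total length:", wirelength(best))
            ```

            Brute force works for four cells; production placers handle millions, which is why the heuristics, and the patents around them, matter so much.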

          2. 2

            This sounds like a description my dad gave me of using Fortran on a mainframe in the 70s.