1.

    Admittedly I’m exactly the uninformed software engineer he complains about, but my trouble is this: he focuses on how you *should* do it, and never really explains what’s actually wrong with the approach the misguided students take. Presumably it has undesirable qualities (perhaps it doesn’t work?), but I can’t tell from the article.

    1.

      My guess (from six months of an EE degree) is that real circuits have a small but nonzero propagation delay through each gate, while idealized simulators have none.

      This can make a circuit misbehave momentarily when its inputs change: signals taking different paths to the same gate settle at different times, so the output can glitch before it stabilizes.

      Clocking the circuit with a period longer than the total of those delays mitigates the effect, because nothing samples an output until it has settled (toy sketch below).
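      Here’s a minimal sketch of that glitch in Python rather than an HDL, with an assumed two-step inverter delay: f = A AND (NOT A) is logically always 0, but while the inverter lags behind a rising A, the AND gate briefly sees both inputs high.

      ```python
      # Toy model of a static hazard (delay value assumed, not from the article).
      INVERTER_DELAY = 2  # gate delay in arbitrary time steps

      def simulate(a_waveform):
          """Evaluate f = A AND (NOT A) where the inverter lags its input."""
          glitches = []
          for t, a in enumerate(a_waveform):
              # The inverter output still reflects the input from DELAY steps ago.
              a_stale = a_waveform[max(t - INVERTER_DELAY, 0)]
              f = a and not a_stale  # ideally always 0
              if f:
                  glitches.append(t)
          return glitches

      # A rises from 0 to 1 at t = 5 and stays high.
      print(simulate([0] * 5 + [1] * 10))  # -> [5, 6]
      ```

      A register clocked with a period longer than the inverter delay only samples f after it has settled back to 0, so the glitch never makes it into the next stage.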

    2.

      Author is totally correct that software people often fundamentally misunderstand time, expecting it to be somehow discrete and universally ‘synchronous’ all on its own. This shows up not only in hardware design, but also in big distributed systems. Physics teaches us that there’s no such thing as simultaneity, but we typically work in carefully engineered artificial worlds where we can safely pretend there is. So we fall down hard when those assumptions no longer hold.
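      To make the distributed-systems point concrete (my addition, not something the author shows): Lamport’s logical clocks are the classic way to order events without a shared ‘now’. Each process counts its own events and, on receiving a message, jumps its counter past the sender’s.

      ```python
      # Minimal sketch of Lamport logical clocks (illustrative names, my own code).
      class Process:
          def __init__(self, name):
              self.name = name
              self.clock = 0  # logical time: counts events, not seconds

          def local_event(self, label):
              self.clock += 1
              print(f"{self.name} t={self.clock}: {label}")

          def send(self):
              self.clock += 1
              return self.clock  # the timestamp travels with the message

          def receive(self, msg_time, label):
              # Merge rule: advance past everything the sender had seen.
              self.clock = max(self.clock, msg_time) + 1
              print(f"{self.name} t={self.clock}: {label}")

      a, b = Process("A"), Process("B")
      a.local_event("write x")   # A t=1
      t = a.send()               # A t=2
      b.receive(t, "got x")      # B t=3: causality preserved without wall clocks
      ```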

      On the other hand, the author seems to imply that you can’t do non-trivial computation without a system-wide clock. This is demonstrably false: my favorite example is the GreenArrays chips. No clocks, just data flow!
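      As a loose software analogy (my sketch, not how GreenArrays’ Forth cores actually work): a pipeline of Python generators computes with no global tick at all. Each stage runs exactly when data arrives from upstream and idles otherwise.

      ```python
      # Clockless dataflow, generator style: stages fire on data arrival.
      def source():
          for x in range(5):
              yield x

      def double(upstream):
          for x in upstream:  # resumes only when a value flows in
              yield 2 * x

      def offset(upstream, k):
          for x in upstream:
              yield x + k

      print(list(offset(double(source()), 1)))  # -> [1, 3, 5, 7, 9]
      ```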