1.

    It is wild how widespread this bug is.

    I recently started using ML, and specifically PyTorch, for work, and was disappointed by how stateful everything is and how “action at a distance” seems to be the norm (a concrete sketch of this is below the thread).

    1.

      I made this same sort of mistake in my research days, spawning so many Monte Carlo simulations so quickly that some of them ended up with the same seed. I ended up doing something similar: I added a seed CLI arg and passed unique seeds in from my orchestration layer (sketched below). Fun bug to figure out.
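
As a concrete illustration of the “action at a distance” complaint in the first comment above, here is a minimal, hypothetical sketch (the `augment` function is made up for illustration): PyTorch's default generator is process-global mutable state, so an unrelated draw anywhere else in the program changes what a downstream function returns, even though none of that function's own code changed.

```python
import torch

def augment(x):
    # Draws from torch's *global* default generator, so the result depends on
    # everything else in the program that has touched that generator.
    noise = torch.randn_like(x)
    return x + 0.1 * noise

torch.manual_seed(0)
x = torch.zeros(3)
print(augment(x))   # one set of values

torch.manual_seed(0)
_ = torch.rand(1)   # an unrelated draw somewhere far away...
print(augment(x))   # ...silently changes augment()'s output
```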
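
And a minimal sketch of the fix described in the reply, assuming NumPy for the simulation and argparse for the CLI (the script name `sim.py` and the `--seed` flag are illustrative): each run gets its own explicit seed from the orchestration layer instead of a time-based default that can collide when many runs start in the same instant.

```python
import argparse
import numpy as np

def main():
    # Hypothetical simulation entry point: the orchestration layer passes a
    # distinct --seed to every run rather than letting each one seed itself
    # from the wall clock.
    parser = argparse.ArgumentParser()
    parser.add_argument("--seed", type=int, required=True)
    args = parser.parse_args()

    rng = np.random.default_rng(args.seed)  # per-run generator, no shared global state
    samples = rng.normal(size=1_000_000)    # stand-in for the Monte Carlo draws
    print(f"seed={args.seed} mean={samples.mean():+.6f}")

if __name__ == "__main__":
    main()
```

The orchestrator would then launch something like `python sim.py --seed 0`, `python sim.py --seed 1`, and so on, so no two runs can end up with the same generator state.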