1. 3

    See my comment on part 1 of this series: Non-cryptographically-secure random number generators are all functions of the form:

    {state, number} = random(state)
    

    The function is a fairly simple permutation that updates the state and returns a new number. It is no more surprising that a neural network can learn this than that it can learn a trigonometric function. The network needs to learn both the function and the initial state from observations of the numbers produced by a load of these calls chained together, but the sequence of numbers has a (fairly simple) arithmetic relationship. This kind of PRNG is designed to be fast, which means the relationship must be simple; the only real design goal is that the output is fairly uniformly distributed.
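    To make that shape concrete, here is a toy sketch in Python (my own illustration, not from the article): a linear congruential generator, one of the simplest non-cryptographic PRNGs, written in exactly the {state, number} = random(state) form.

        # Toy linear congruential generator (LCG) in the
        # {state, number} = random(state) shape described above.
        # The constants are the well-known Numerical Recipes parameters.
        def lcg_random(state):
            state = (1664525 * state + 1013904223) % 2**32
            return state, state  # the new state doubles as the output number

        state = 42  # seeding is just choosing the initial state
        for _ in range(3):
            state, number = lcg_random(state)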

    A cryptographically secure random number generator is normally (warning: this is a massive oversimplification, do not build a random number generator like this!) something like this:

    {state, cryptographic_hash(number)} = random(state)
    

    Assuming the same underlying sequence, the network now needs to learn to invert a cryptographic hash function before it can even recover the numbers whose relationship it is trying to learn. If a neural network could do that, it would be very surprising and would have huge implications for the hash function used.
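    Continuing the toy sketch from above (still a massive oversimplification, and again my own illustration): the same LCG, but the caller only ever sees a hash of the output.

        import hashlib

        # Toy version of the {state, cryptographic_hash(number)} = random(state)
        # shape: update the state as before, but only reveal a digest of the number.
        def hashed_random(state):
            state = (1664525 * state + 1013904223) % 2**32
            return state, hashlib.sha256(state.to_bytes(4, "big")).digest()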

    Even starting from the Mersenne Twister and feeding the neural network the MD5 sum of each result, rather than the raw numbers, would be unlikely to work. That construction has some of the properties of a cryptographically secure PRNG, yet it is far weaker than a real one.
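    For what it's worth, generating the data for that experiment is straightforward (a sketch assuming CPython, whose random module is a Mersenne Twister):

        import hashlib
        import random

        rng = random.Random(12345)  # CPython's random module is a Mersenne Twister
        digests = []
        for _ in range(10000):
            n = rng.getrandbits(32)
            # The network would only ever see these digests, never the raw outputs.
            digests.append(hashlib.md5(n.to_bytes(4, "big")).digest())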

    1. 1

      Yeah, the editor’s note makes it pretty clear that this is the authors proving that they can make the technique work on the easy stuff before moving on to the hard stuff. So yes, you would expect a NN to be able to predict these non-crypto PRNGs. The first step of any complex predictive model is being able to predict what you already know to be true; ideally, doing so teaches you things you can then use to make the non-trivial predictions better. I am interested in seeing where they go with this.

      1. 3

        My point is that I don’t think they’re building up useful things if they want to be able to predict the output of cryptographic random number generators. Starting with simple cryptographic hash functions and seeing whether the network could learn to reverse, say, MD4 would be a better starting point, because that might lead to techniques for reversing the SHA family. Alternatively, they could try to learn AES plaintext from ciphertext in a stream-cipher mode; a sketch of that follows. Either would give them the building blocks for attacking a cryptographic cipher.
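        As a sketch of what the AES training data might look like (my own illustration, assuming the third-party cryptography package; CTR is a stream-cipher mode):

            import os

            from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

            key = os.urandom(16)  # one fixed key the network would have to model
            pairs = []
            for _ in range(10000):
                nonce = os.urandom(16)
                plaintext = os.urandom(32)
                encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
                ciphertext = encryptor.update(plaintext) + encryptor.finalize()
                pairs.append((ciphertext, plaintext))  # train: ciphertext -> plaintext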

        Their current approach is like saying that they want to learn the tan function and starting with Fibonacci. There are enough similarities that it looks superficially like they’re creating important building blocks, but the important thing about tan is that it is discontinuous over the real numbers, whereas Fibonacci is defined only over the positive integers.