1. 13
  1.  

  2. 24

    First of all, nothing is truly random when we consider the modern computers that we use on daily basis.

    I wish people would stop repeating that. We like to think of computers as digital circuits, but in practice they’re built on top of analogue electronics that are susceptible to quantum effects. Hardware random number generators take advantage of this.

    Most systems use hardware sources as one of several inputs of entropy and feed them into something like Fortuna, which is built on cryptographic hash primitives and designed to combine entropy sources of varying quality in such a way that feeding in a poor source does not decrease the quality of the output. This means they can take various other inputs of varying quality (contents of the root directory, the cycle count at the last query time, the cycle count at which devices appeared on a bus, and so on) and still provide cryptographically secure random numbers even if the hardware entropy source is biased.
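    To sketch that mixing idea in dependency-free Rust — with std's `DefaultHasher` standing in for the cryptographic hash (real Fortuna uses SHA-256), and both "sources" invented purely for illustration:

    ```rust
    use std::collections::hash_map::DefaultHasher;
    use std::hash::Hasher;

    // Toy sketch of entropy-pool mixing: absorb every source into one
    // hash state. A weak or attacker-controlled source cannot undo the
    // contributions already absorbed from the other sources.
    fn mix_pool(sources: &[&[u8]]) -> u64 {
        let mut pool = DefaultHasher::new();
        for src in sources {
            pool.write(src);
        }
        pool.finish()
    }

    fn main() {
        let hw_rng: &[u8] = &[0x01, 0x01, 0x01, 0x01]; // possibly-biased hardware source
        let timings: &[u8] = &[0x9c, 0x3f, 0x2a, 0x77]; // e.g. interrupt timings
        println!("mixed pool value: {:016x}", mix_pool(&[hw_rng, timings]));
    }
    ```

    Unlike real Fortuna, `DefaultHasher` is not cryptographic, so this only illustrates the shape of the construction, not its security.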

    1. 5

      That’s pretty much compatible with what the author was saying, for what it’s worth.

      1. 1

        Most systems use hardware sources as one of the sources of entropy and feed them into something like Fortuna,

        An aside, I wouldn’t like to entirely rely on these (fortunately contemporary OSes mix in other entropy sources too). Not because I think my CPU vendor is likely to put a backdoor in them, but because I’ve heard horror stories of hardware RNGs silently failing and outputting trivial sequences like “0101010101010101…” forever, with no errors being raised.

        1. 1

          But that’s the thing about Fortuna: you use several sources of entropy and mix them in a clever way such that, even if an attacker adversarially controls some, but not all, of them, the output is still as random as the best uncompromised source.

          1. 1

            Yes. I didn’t say it doesn’t. The above is couched as a hypothetical scenario. I expressed gratitude that it is not happening. :)

            1. 2

              Ah, sorry, poor reading comprehension on my part this time!

      2. 20

        The difference between /dev/random and /dev/urandom is that /dev/random only returns random bytes within the estimated number of bits of noise in the entropy pool whereas /dev/urandom does not block waiting for more entropy. Thus /dev/urandom is theoretically vulnerable to a cryptographic attack on the algorithms used by the driver and less random than /dev/random. /dev/random is more suitable for very high-quality randomness applications such as one-time pad or key generation.

        It’s probably worth pointing out that since ~March of this year there is effectively no difference between /dev/random and /dev/urandom on Linux.

        https://www.theregister.com/2022/03/21/new_linux_kernel_has_improved/

        Donenfeld also gave a fairly interesting talk about this at Linux Plumbers.

        https://www.youtube.com/watch?v=-_yzaSp2xtY
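        Given that convergence, a minimal dependency-free read from the device (sketch; on kernels with the 2022 RNG rework either path draws from the same ChaCha20-based pool once it’s initialized):

        ```rust
        use std::fs::File;
        use std::io::Read;

        fn main() -> std::io::Result<()> {
            // Fill a small buffer straight from the kernel's RNG device.
            let mut buf = [0u8; 16];
            File::open("/dev/urandom")?.read_exact(&mut buf)?;
            println!("{:02x?}", buf);
            Ok(())
        }
        ```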

        1. 2

          Thanks, noted!

        2. 9

          Still my favorite hack for this:

          use std::collections::hash_map::RandomState;
          use std::hash::{BuildHasher, Hasher};

          fn main() {
              // RandomState draws fresh SipHash keys, so finish() on an
              // empty hasher yields an unpredictable value.
              let random_value = RandomState::new().build_hasher().finish() as usize;
              println!("Random: {}", random_value);
          }
          
          1. 7

            The right way to generate pseudorandom numbers without dependencies is to implement a PRNG. Most of the ones I’ve seen are one-liners, combinations of multiplications/additions/shifts. Just be sure to look it up from a reputable source, don’t try to make one up or it might have terrible problems like a short period.

            I’m not sure what the point of zero-dependencies is, anyway. PRNGs are a solved problem and I’d rather import a library that’s already been tested and proven itself, rather than take a chance writing my own code in a domain where a typo can make the output useless.
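            For a concrete instance of such a one-liner, here is Marsaglia’s xorshift64 — the shift triple (13, 7, 17) comes from his 2003 paper, i.e. a reputable source rather than made-up constants; the seed below is arbitrary:

            ```rust
            // Marsaglia's xorshift64: three shift/xor steps, full period
            // 2^64 - 1 over nonzero states. Picking your own shift triple
            // is exactly how you end up with a short period.
            struct XorShift64(u64);

            impl XorShift64 {
                fn next(&mut self) -> u64 {
                    debug_assert!(self.0 != 0, "seed must be nonzero");
                    self.0 ^= self.0 << 13;
                    self.0 ^= self.0 >> 7;
                    self.0 ^= self.0 << 17;
                    self.0
                }
            }

            fn main() {
                let mut rng = XorShift64(0x9e3779b97f4a7c15); // arbitrary nonzero seed
                for _ in 0..3 {
                    println!("{}", rng.next());
                }
            }
            ```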

            1. 4

              First of all, nothing is truly random when we consider the modern computers that we use on daily basis. This is because they are designed to be deterministic.

              Well, that’s just outright wrong. Physical properties and timings have been used as entropy sources for multiple decades; they may not be perfectly random (there might be bias) but they sure as heck aren’t deterministic.

              But then for most of the last two decades pretty much every desktop/laptop processor has had a “true” RNG built in, and every non-bottom-of-the-line Android smartphone this last decade has had one as well. I would assume that nowadays even the cheapest of phones have a true RNG.

              Repeating nonsense/falsehoods about randomness and RNGs really isn’t a great way to start an article about them.

              1. 3

                I’m surprised there was no attempt to do the more common approach of doing a load of bitshifts and operations on the number to get the next random-looking number.

                1. 1

                  I’ve been going down a rabbit hole trying to understand Tausworthe generators (aka linear-feedback shift registers). I had copied a 3-line function from Stack Overflow and wanted to understand where its magic numbers come from. So I started reading Tausworthe’s 1965 paper [pdf], but am now bogged down in finite field theory and primitive polynomials.
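                  For what it’s worth, the mechanics (as opposed to the theory) fit in a few lines. A sketch of a Galois LFSR using the well-known 16-bit tap mask 0xB400 — it encodes the primitive polynomial x^16 + x^14 + x^13 + x^11 + 1, and primitivity is exactly the finite-field condition that guarantees the maximal period:

                  ```rust
                  // One step of a Galois LFSR over GF(2): shift right, and if a 1
                  // fell out, XOR in the feedback taps from the primitive polynomial.
                  fn lfsr_step(state: u16) -> u16 {
                      let lsb = state & 1;
                      let mut next = state >> 1;
                      if lsb == 1 {
                          next ^= 0xB400;
                      }
                      next
                  }

                  fn main() {
                      // Count steps until the state returns to the seed: the period.
                      let seed: u16 = 0xACE1; // any nonzero seed works
                      let mut state = seed;
                      let mut period: u32 = 0;
                      loop {
                          state = lfsr_step(state);
                          period += 1;
                          if state == seed {
                              break;
                          }
                      }
                      println!("period = {}", period); // 65535, the maximum for 16 bits
                  }
                  ```

                  Swap the mask for a non-primitive polynomial and the loop terminates much sooner — that’s the short-period failure mode mentioned upthread.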

                2. 2

                  ASLR implementations tend to have weaknesses that manifest in ways that might not be obvious just from looking at the scrambled output. For instance, running the program

                  fn main () { dbg![main as *const u8]; }
                  

                  results in

                  0x000055a7a8b55410
                  0x000055a4d561b410
                  0x000055c683ec3410
                  0x0000560f7b369410
                  

                  or, more succinctly:

                  fn main () { dbg![main as *const u8 as usize % 4096]; }
                  

                  results in

                  1088
                  1088
                  1088
                  

                  which would break anything using this entropy source for, say, indexing into some slice…

                  While the stack address in the post’s example gets a different distribution, owing to its reliance on a stack frame’s placement in the address space, every system’s ASLR implementation still has sharp edges that, frustratingly, only exploit writers seem to pay any attention to, despite their significant impacts on performance.

                  1. 2

                    despite their significant impacts on performance.

                    What are you referring to here?