• Chip Unicorn@im-in.space · 23 hours ago

    @octade Hi Raze –

    Have you tried generating millions of your random numbers, then subjecting them to randomness tests? Which tests did you use?

    • OCTADE@soc.octade.netOP · 20 hours ago

      If you are referring to MEGARAND, no. There is no need for that, since all of that testing has already been done over the years on the underlying primitives:

      /dev/urandom … b2sum … shuf … chacha20 …

      These primitives have been run through the gauntlet for years and are known to produce or consume very good entropy. ChaCha20 is especially prized here: taking already-random data and running it through the ChaCha20 cipher with random keys and/or nonces is a nice hedge against patterns and biases. MEGARAND stretches the output of these primitives to build a much larger pool, for wherever you want a big initial pool of pads, tokens, seeds, whatever.
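That kind of hedge can be sketched in a few lines of shell (a sketch only, not MEGARAND's actual pipeline; the 1 MiB size and the `pool.bin` filename are illustrative):

```shell
# Sketch: re-encrypt kernel entropy through ChaCha20 with a throwaway
# random key and IV, as a hedge against bias in any single source.
KEY=$(openssl rand -hex 32)   # 256-bit key, hex-encoded
IV=$(openssl rand -hex 16)    # 128-bit IV (counter || nonce), hex-encoded
head -c 1048576 /dev/urandom \
  | openssl enc -chacha20 -K "$KEY" -iv "$IV" \
  > pool.bin                  # 1 MiB of hedged output
```

Since ChaCha20 is a stream cipher, the output is the same size as the input, and encrypting uniform random bytes under an independent random key cannot make them less random.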

      If you’re paranoid you can run dieharder tests on the output, but at this point it would just be a placebo.
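For reference, such a run might look like this (assuming the `dieharder` package is installed; `sample.bin` is a hypothetical file of generator output, and `-g 201` selects dieharder's raw-file input generator). Note that the full `-a` battery consumes far more data than a small file holds, so dieharder will rewind the file, which weakens the results:

```shell
# Collect a sample of output to test (stand-in here: /dev/urandom).
head -c 16777216 /dev/urandom > sample.bin   # 16 MiB sample
# Run the full dieharder battery against the file, if dieharder exists.
command -v dieharder >/dev/null && dieharder -a -g 201 -f sample.bin
```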