1.

    See also: https://en.wikipedia.org/wiki/Minifloat
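
    For a concrete feel, here's a minimal sketch (mine, not from the article) decoding one plausible 8-bit minifloat layout: 1 sign bit, 4 exponent bits, 3 mantissa bits, with an IEEE-754-style bias of 7. The Wikipedia page surveys several variations on this theme.

    ```python
    def decode_minifloat_143(byte: int) -> float:
        """Decode an 8-bit minifloat: 1 sign, 4 exponent, 3 mantissa bits, bias 7."""
        sign = -1.0 if (byte >> 7) & 1 else 1.0
        exp = (byte >> 3) & 0xF              # 4-bit exponent field
        frac = byte & 0x7                    # 3-bit mantissa field
        bias = 7
        if exp == 0:                         # subnormal: no implicit leading 1
            return sign * (frac / 8) * 2.0 ** (1 - bias)
        if exp == 0xF:                       # all-ones exponent: inf and NaN
            return sign * float("inf") if frac == 0 else float("nan")
        return sign * (1 + frac / 8) * 2.0 ** (exp - bias)

    # 0b0_0111_000: exponent field 7 (unbiased 0), zero mantissa -> 1.0
    assert decode_minifloat_143(0b0_0111_000) == 1.0
    ```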

    Related: the hoops one jumps through when training with hardware-accelerated fp16: https://docs.nvidia.com/deeplearning/sdk/mixed-precision-training/index.html#training
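
    The main hoop is loss scaling: fp16 gradients underflow, so you scale the loss up before the backward pass and unscale the gradients before the optimizer step. A minimal sketch using PyTorch's AMP utilities (assuming PyTorch with CUDA; `loader` is a placeholder for your data):

    ```python
    import torch

    model = torch.nn.Linear(512, 10).cuda()
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
    scaler = torch.cuda.amp.GradScaler()      # dynamic loss scaling

    for x, y in loader:                       # `loader` assumed: yields CUDA tensors
        optimizer.zero_grad()
        with torch.cuda.amp.autocast():       # forward pass in fp16 where safe
            loss = torch.nn.functional.cross_entropy(model(x), y)
        scaler.scale(loss).backward()         # scale so fp16 grads don't underflow
        scaler.step(optimizer)                # unscales grads; skips step on inf/NaN
        scaler.update()                       # grow/shrink the scale factor
    ```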

    1.

      Ha, I remember trying to do 8-bit 4.4 fixed point in Z80 assembler. I was trying to make a boids flocking demo on a Game Boy Color, and I didn’t understand CORDIC functions. I still remember the frustration!
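
      For anyone curious, 4.4 fixed point means 8-bit values with 4 integer and 4 fractional bits, so 1.0 is stored as 0x10 and a multiply needs a 4-bit shift to renormalize. A rough sketch in Python (mine, obviously not the original Z80 code):

      ```python
      def to_fix44(x: float) -> int:
          return int(round(x * 16)) & 0xFF     # scale by 2**4, wrap to 8 bits

      def _signed8(v: int) -> int:
          return v - 0x100 if v & 0x80 else v  # sign-extend an 8-bit value

      def from_fix44(v: int) -> float:
          return _signed8(v) / 16

      def mul_fix44(a: int, b: int) -> int:
          prod = _signed8(a) * _signed8(b)     # 16-bit product has 8 fractional bits
          return (prod >> 4) & 0xFF            # shift 4 back out to return to 4.4

      # 1.5 * 2.0 == 3.0 exactly, within the format's 1/16 resolution
      assert from_fix44(mul_fix44(to_fix44(1.5), to_fix44(2.0))) == 3.0
      ```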

      1.

        Maybe math here, since this is about how to represent a number system.