1.
  1.

    > Varying probability amplitudes, and this is critical, imply that an infinite amount of information can be encoded in a QBit, a major departure from classical computation.

    I’m currently taking Aaronson’s quantum CS class, and he regularly cautions against pop-sci explanations of the quantum advantage like this one. The reality is that most of the advantages provided by qubits are quite mundane, and you certainly can’t use them to “encode an infinite amount of information”. Not in a useful sense, anyway.

    There are several specific problems where you can use a qubit to replace log(n) classical bits, but only for a predetermined n. For example, if you have a counter that will be equal to one of two known values at the end of a procedure, you can replace that counter with a single qubit and get a usually correct answer.
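
    Here’s a rough numpy sketch of that trick (a toy illustration of my own; the specific numbers are made up): each increment rotates the qubit by a small angle, and because the two candidate final counts are known in advance, the angle can be chosen so the two possible final states end up exactly orthogonal.

    ```python
    import numpy as np

    def rotate(theta):
        """Rotation in the real plane spanned by |0> and |1>."""
        return np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])

    # Suppose the counter is known to end at one of these two values.
    n1, n2 = 17, 42

    # Pick the per-increment angle so the two possible final states
    # differ by pi/2 in the plane, i.e. are exactly orthogonal.
    theta = (np.pi / 2) / (n2 - n1)

    def run_counter(n_increments):
        state = np.array([1.0, 0.0])       # start in |0>
        for _ in range(n_increments):      # each count is one small rotation
            state = rotate(theta) @ state
        # Undo the rotation corresponding to n1: the "n1" state lands on
        # |0> and the "n2" state lands on |1>, so one measurement decides.
        state = rotate(-n1 * theta) @ state
        p_one = state[1] ** 2              # probability of reading "n2"
        return n2 if np.random.random() < p_one else n1

    print(run_counter(17))  # -> 17
    print(run_counter(42))  # -> 42
    ```

    With the angle tuned like this the readout is actually certain; the “usually correct” behavior shows up when you don’t know the candidate values exactly and can’t tune the rotation to them.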

    You also can’t transmit an arbitrary amount of information with a qubit; you can transmit one bit (at most) just by sending a qubit down a wire, and two bits (at most) using superdense coding, which additionally consumes a pre-shared entangled pair.
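
    Superdense coding is easy to check numerically, by the way. A minimal sketch of the standard protocol (my own code, nothing from the article): Alice and Bob pre-share a Bell pair, Alice encodes two classical bits by applying one of I, X, Z, or ZX to her half and sends that single qubit to Bob, and a Bell-basis measurement on Bob’s side recovers both bits.

    ```python
    import numpy as np

    I = np.eye(2)
    X = np.array([[0, 1], [1, 0]])
    Z = np.array([[1, 0], [0, -1]])

    # Pre-shared Bell pair (|00> + |11>) / sqrt(2); Alice holds qubit 0.
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

    # Alice's encoding: two classical bits -> one Pauli on her qubit.
    encode = {(0, 0): I, (0, 1): X, (1, 0): Z, (1, 1): Z @ X}

    # Bob's decoder: the four Bell states, one per two-bit message.
    bell_basis = {
        (0, 0): np.array([1, 0, 0, 1]) / np.sqrt(2),
        (0, 1): np.array([0, 1, 1, 0]) / np.sqrt(2),
        (1, 0): np.array([1, 0, 0, -1]) / np.sqrt(2),
        (1, 1): np.array([0, 1, -1, 0]) / np.sqrt(2),
    }

    for bits in encode:
        state = np.kron(encode[bits], I) @ bell   # Alice acts on her qubit only
        # Bob holds both qubits; a Bell measurement identifies the state.
        decoded = max(bell_basis, key=lambda b: abs(bell_basis[b] @ state) ** 2)
        print(bits, "->", decoded)
    ```

    The two bits only come out because the entangled partner qubit was distributed ahead of time; the channel itself still carries just one qubit.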

    That’s not to detract from the article; the content is quite interesting. It’s just that after a few years of studying QM, I have an instinctively pedantic reaction to explanations that will only make casual readers more confused about it.

      1.

        Not a physics expert here, but to my mind the cool thing is not the double-slit experiment with ordinary light or water, but the double-slit experiment with electrons and single photons, and the wizardry that ensues when we find that simply trying to answer the question “which slit did the photon/electron go through?” turns the wave into a particle.
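
        The “turns the wave into a particle” part has a core you can see in a few lines (a toy calculation with made-up numbers, not a real single-photon simulation): with no which-path record you sum the two slits’ amplitudes and get fringes; once the path is recorded anywhere, you sum probabilities instead and the fringes vanish.

        ```python
        import numpy as np

        wavelength  = 500e-9   # 500 nm light (illustrative numbers throughout)
        slit_sep    = 50e-6    # slit separation: 50 microns
        screen_dist = 1.0      # distance to the screen: 1 m
        k = 2 * np.pi / wavelength

        x = np.linspace(-5e-3, 5e-3, 1001)   # positions across the screen

        # Path length from each slit to each screen point.
        r1 = np.hypot(screen_dist, x - slit_sep / 2)
        r2 = np.hypot(screen_dist, x + slit_sep / 2)
        a1 = np.exp(1j * k * r1)             # amplitude via slit 1
        a2 = np.exp(1j * k * r2)             # amplitude via slit 2

        coherent   = np.abs(a1 + a2) ** 2               # paths indistinguishable
        which_path = np.abs(a1) ** 2 + np.abs(a2) ** 2  # path recorded somewhere

        def contrast(intensity):
            return (intensity.max() - intensity.min()) / (intensity.max() + intensity.min())

        print(contrast(coherent))    # ~1: strong interference fringes
        print(contrast(which_path))  # 0.0: flat, particle-like pattern
        ```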

        Also, I think the homemade double-slit experiment with single photons is something one should try: http://physics.wm.edu/Seniorthesis/SeniorThesis2005/TarSeniorThesis.pdf