1.

discussion on the previous story

  1.

    This is actually a good thing, because it offers plausible deniability for future Fappening-style leaks.

    1.

      Wait until you see what happens next election when another liberal woman is running for the Democrats. It won’t be pretty.

      1.

        > Wait until you see what happens next election when another liberal woman is running for the Democrats. It won’t be pretty.

        The whole point is that if fakes are indistinguishable from real footage, video no longer matters. If you’re bothered by people masturbating to falsified videos, you have bigger problems than reasoning can fix.

        1.

          I agree, this cuts both ways. Any person “caught” in an actual documented embarrassing position can plausibly claim the footage was generated by a malicious party.

          In the end, this will probably create a market for cryptographically secured cameras, like some still cameras used for forensics.

          But there will be a lot of turmoil before this all shakes out.

          1.

            > this will probably create a market for cryptographically secured cameras, like some still cameras used for forensics

            I think this is going to have to happen over time for all video and still cameras, and all the content they produce. Otherwise the whole notion of video or photographic “evidence” is going to go out the window, along with all the law and precedent built up on it over decades, casting us completely adrift in a sea of post-truth.

    2.

      I suspect this will motivate legislation requiring all digital cameras to embed a unique cryptographic signature in their images (a rough sketch of what capture-time signing could look like follows at the end of this thread).

      1.

        It’s going to be interesting to see whether governments actually catch up with this, or whether it’s left to industry to respond to “market” demand for “video and photos that we can trust”. It seems like a pretty good case for regulation: once these techniques get good enough that we can’t tell whether a recording is genuine (without getting into an infinite ML-turtle regression), pretty much all the legal infrastructure that relies on visual recordings as evidence is up for grabs until there’s a way of proving the evidence is valid.
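Several of the comments above converge on the same idea: cameras that cryptographically sign what they capture, so the footage can be verified later. Below is a minimal sketch of that sign-at-capture / verify-later flow, assuming the third-party Python `cryptography` package and a hypothetical per-device Ed25519 key; it only illustrates the concept discussed in the thread, not any real camera vendor’s scheme.

```python
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def sign_frame(device_key: Ed25519PrivateKey, frame: bytes, metadata: bytes) -> bytes:
    """Camera side: sign a digest of the raw frame plus capture metadata
    (timestamp, device serial, GPS, ...) so neither can be altered unnoticed.
    A real scheme would need an unambiguous encoding of frame + metadata."""
    digest = hashlib.sha256(frame + metadata).digest()
    return device_key.sign(digest)


def verify_frame(device_pub: Ed25519PublicKey, frame: bytes, metadata: bytes,
                 signature: bytes) -> bool:
    """Verifier side: recompute the digest and check the signature against
    the device's published public key."""
    digest = hashlib.sha256(frame + metadata).digest()
    try:
        device_pub.verify(signature, digest)
        return True
    except InvalidSignature:
        return False


if __name__ == "__main__":
    # Demo only: on a real device the private key would never leave
    # tamper-resistant storage.
    device_key = Ed25519PrivateKey.generate()
    frame = b"\x00" * 1024                      # stand-in for raw sensor data
    meta = b"ts=2017-12-12T10:00:00Z;serial=CAM-001"

    sig = sign_frame(device_key, frame, meta)
    print(verify_frame(device_key.public_key(), frame, meta, sig))         # True
    print(verify_frame(device_key.public_key(), frame + b"x", meta, sig))  # False (tampered)
```

Signing a digest of the frame together with its capture metadata means tampering with either invalidates the signature; the genuinely hard parts, which this sketch glosses over, are keeping the per-device private key in tamper-resistant hardware and distributing the public keys in a way courts and platforms would actually trust.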