  2. 26

    The principle is the same with electronic voting machines. When software becomes law, proprietary code is like having secret laws.

    1. 2

      Interesting observation, and spot on I think.

    2. 8

      Regardless of guilt, I don’t envy anybody whose task it is to convey the severity of discovered bugs. I wonder how this general line of reasoning will drive additional certification requirements over time.

      1. 20

        I wonder if this will slow down the pace of technological adoption in law enforcement. That could be a good thing. Instead of rushing a product to market, perhaps companies will consider the potential flaws in their systems more carefully to avoid future embarrassment. That will cost more, so perhaps law enforcement organizations will then be less eager to roll out new technologies until their efficacy has been unambiguously demonstrated.

        Probably nothing will change, but I can dream!

        1. 8

          Considering previous flaws in forensic science (e.g. bite mark analysis turning out to be pseudoscience), caution is definitely a good thing here.

      2. 4

        This is plainly the right decision. I bet it is going to significantly reduce reliance on software, though - the defense is going to ask the “expert” to crawl all over the code and get them to say, repeatedly, that they don’t know why the code is the way it is.

        Experts relying on simpler models they coded themselves are going to look a lot better than those using third-party software.
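
        To make “simpler models they coded themselves” concrete, here is a toy sketch of the kind of calculation an expert could walk a courtroom through line by line: a random match probability from the product rule, assuming Hardy-Weinberg equilibrium and independent loci. Every locus name and allele frequency below is made up for illustration.

        ```python
        # Toy illustration only: a transparent random-match-probability calculation
        # using the product rule. All locus names and allele frequencies are
        # hypothetical; real forensic models are considerably more involved.

        # Hypothetical population frequencies for the two alleles observed at each locus.
        allele_freqs = {
            "locus_A": (0.12, 0.08),  # heterozygous: alleles with frequencies p and q
            "locus_B": (0.21, 0.21),  # homozygous: the same allele twice
            "locus_C": (0.05, 0.30),
        }

        def genotype_frequency(p: float, q: float) -> float:
            """Expected genotype frequency under Hardy-Weinberg equilibrium."""
            if p == q:
                return p * p      # homozygote: p^2
            return 2 * p * q      # heterozygote: 2pq

        def random_match_probability(freqs: dict) -> float:
            """Multiply per-locus genotype frequencies, assuming the loci are independent."""
            rmp = 1.0
            for locus, (p, q) in freqs.items():
                rmp *= genotype_frequency(p, q)
            return rmp

        rmp = random_match_probability(allele_freqs)
        print(f"Random match probability: {rmp:.3e}")
        print(f"Implied likelihood ratio (1 / RMP): {1 / rmp:,.0f}")
        ```

        Nothing in a model like that is hidden behind a vendor’s license: every assumption (Hardy-Weinberg, locus independence, the frequency tables) is visible and can be challenged directly, which is exactly the scrutiny a proprietary black box prevents.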