1. 14

Previously submitted as Post Office/Horizon IT scandal: falsehoods legal practitioners believed about software. Deleted and resubmitted as link post + comments to save the mods a judgement call over whether this ought to be a text post or a link post.

Parts of this speech are on topic for this forum, parts are not; I’m excerpting the parts that are on-topic in a comment below, and my commentary in a second comment.

  1.  

  2. 7

    Parts of this speech are on topic for this forum, parts are not

    Without wanting to invite the discussion (one I try, and occasionally fail, to stay out of for my own health) - I think almost all of this is on topic for this forum!

    The speech itself hits hard, as well it should given the incredible perversion of justice that has taken place. I think it’s important reading.

    The technical content is fairly limited, which is to be expected given the original audience for the speech. But the intersection of “the system contained bugs” and “nobody was allowed to prove that the system contained bugs” is an important one. It shows the importance of making systems observable and auditable, but also highlights depressingly well the need for transparency of process. Even if Horizon had contained all those features (it might yet?) the process here was essentially rigged so that defendants arguably would never have been granted access to the information that could have proved their innocence.

    All in all, a very depressing story and I’m glad some justice has finally been found. Never trust a computer, or opposing counsel, without verification.

    1. 4

      I agree with everything you wrote, including that perhaps all of the speech could be considered on-topic for this forum. No matter. The excerpts are certainly on topic, and I hope people will click through to read the speech.

      1. 2

        Sadly, I'm willing to bet a case of beer that if the defendants had been poor and/or black, the matter would never have been righted.

        Reading this, it's clear the problem is only in a very small part technical, and in a very large part in the attitudes and ethics within the legal and corporate system.

        1. 2

          The defendants are poor.

          That’s kind of the point. They were totally bankrupted, or never had wealth.

          1. 2

            Clearly middle class, not poor. Afterwards, yes, poor, but with upper middle class education and connections.

      2. 4

        In June 2021, Paul Marshall addressed a few hundred students at the University of Law in London on the Post Office Horizon IT scandal¹. Below I’ve quoted the excerpts that describe how people (victims, lawyers, judges, prosecutors) related to the software, because I believe they are relevant to the practice of programming (and thus on topic for this site). You can read the full speech at www.postofficetrial.com.

        All emphasis mine. My own thoughts on implications in a comment.

        The main learning point for programmers that I took from it: People thought, and probably often think, that when a computer program fucks up, it will be obvious that it did.

        ¹ Summary of the scandal: the Horizon IT system for accounting had bugs that caused accounting discrepancies; the UK’s Post Office Ltd. (POL) was aware of the bugs but nonetheless prosecuted subpostmasters for fraud when such discrepancies occurred; furthermore POL withheld the exonerating evidence; this went on for 13 years, in which time over 700 subpostmasters were prosecuted and convicted (~52 per year; pre-Horizon this was ~5 per year); recently 45 convictions have been overturned, and these are unlikely to be the last.


        Excerpts from Paul Marshall’s speech of 2021-06-30 to students at the University of Law, London

        […]

        [I]n 1997 Lord Hoffmann, universally regarded as a clever judge, loftily declared that no one needs a degree in electronics to know whether a computer is working or not.

        […]

        The law treats computers like machines. But computers are not machines – or at least they are not only machines. Part of the present problem is that technology advances so rapidly that our means of dealing with it cannot keep pace. There is more regulation covering the design of a toaster than there is of someone who writes and sells computer software.

        […]

        [I]n 2010 at Mrs Seema Misra’s trial, prosecuting counsel opened and closed the case for the Crown by telling the jury that, were there to have been a problem with the Horizon computer system, any such problem would have been manifest and obvious to a Horizon computer terminal operator. That’s, in effect, Lord Hoffmann’s point. It’s wrong.

        […]

        The Law Commission expressed a similar view in two reports to Parliament in 1993 and 1997. The Commission recommended that safeguards for evidence derived from computers in legal proceedings be removed. Until 2000, a person relying on computer evidence at a criminal trial was required to prove that the computer was working properly. The Post Office Horizon scandal tracks exactly the period since the removal of protections[.]

        […]

        The transcript of [Mrs Misra’s] trial shows that she was close to taunted by the prosecution for her being unable to point to identifiable problems: ‘Mrs Misra says that there must be a fault with Horizon, but she can’t point to any problem she actually had’.

        The jury was invited to infer that the only cause of the discrepancy must be theft. That should never have happened. Had her trial been conducted properly, the Post Office should have been required to prove that the Horizon system was working at the time she experienced shortfalls. As we now know […] the Post Office could not have done so.

        […]

        1. 7

          Important: although below I only discuss technical interventions, that is a self-imposed limitation because this is a programming forum. The most important failures in this scandal were failures of ethics, and making repetition less likely will require prosecutions and other legal, political, or societal changes. Nonetheless, I believe there are some small things that can be done at the programming level.

          As I wrote above, the main learning point for programmers that I took from the paragraphs is: People think that when a computer program fucks up, it will be obvious that it did.

          So how can we make arbitrary future errors more obvious, if we don’t know their nature yet? (Known errors can be fixed or disclosed.) There are many errors the programmers may miss that a user can detect — if they have the required info. For the user to detect a program error, they need to be able to observe and check what the software is doing. So it’s up to us to make that information accessible.

          So when you write a program, empower your users. Create buttons to download spreadsheets and logs. Ship the changelog with the program, and let users open it from the UI. Give users a way to check your program’s work, so that when your program is fucking up they can figure out what’s happening.
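          To make that concrete, here is a minimal sketch (all names are hypothetical; this is not from Horizon or the speech) of a toy ledger that keeps an append-only record of every entry and can export it, the kind of data a ‘download my records’ button would hand the user:

          ```python
          import csv
          import io
          from datetime import datetime, timezone

          class AuditedLedger:
              """Toy ledger with an append-only audit trail of every entry."""

              def __init__(self):
                  self._entries = []  # append-only; never mutated after write

              def record(self, description, amount_pence):
                  self._entries.append({
                      "timestamp": datetime.now(timezone.utc).isoformat(),
                      "description": description,
                      "amount_pence": amount_pence,
                  })

              def balance_pence(self):
                  return sum(e["amount_pence"] for e in self._entries)

              def export_csv(self):
                  # Every entry the balance was computed from, so a user
                  # can re-add the column themselves and spot a discrepancy.
                  buf = io.StringIO()
                  writer = csv.DictWriter(
                      buf, fieldnames=["timestamp", "description", "amount_pence"])
                  writer.writeheader()
                  writer.writerows(self._entries)
                  return buf.getvalue()

          ledger = AuditedLedger()
          ledger.record("stamp sales", 1250)
          ledger.record("cash withdrawal", -500)
          print(ledger.balance_pence())   # 750
          print(ledger.export_csv())
          ```

          The point is not the bookkeeping, it’s that the balance is never the only thing the user can see: they can always pull out the raw entries and redo the sum by hand.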

          You may be thinking: ‘no, no, this is a technical solution to a social problem. It’s naïve prosecutors and judges who are the problem here. Program fuckups are inevitable; judges should never assume that a computer program is free of bugs.’ And I agree that the law should not assume a program is fully correct! But also, by making it easier to observe what our program is doing, we may be able to move the fault-finding ‘to the left’, to earlier stages, before a trial ever happens. Consider how our CI pipelines are aimed at detecting errors as early as possible:

          • in the IDE,
          • at compilation time,
          • when unit testing,
          • when integration testing,
          • in production.

          The earlier the problem is detected, the less harm it does, the easier it is to detect, and the easier it is to fix.

          Now extend this chain beyond the production stage. Once deployed, real problems (caused by errors in your software) can be detected:

          • by the user;
          • by their colleagues;
          • by their client or employer;
          • at a legal trial;
          • or by an investigatory panel re-examining old trials…

          Again, the earlier the software’s error is detected, the less harm it does and the easier it is to ‘fix’. The consequences of late detection are worse here than in the software/CI part of the pipeline. And the more information your program makes available, even if it’s behind a button, the better the user’s chances of detecting an error early, because they can check what your program actually did.

          So again: empower your users to supervise your software. Create buttons to export spreadsheets and logs. Integrate the changelog into the UI. Give users a way to check your program’s work.

        2. 2

          This is very interesting, and is something I hadn’t heard of until today. I’m reminded of https://how.complexsystems.fail/