One person's word against the machine's: how do we prove the truth?

Lee Castleton remembers the shortfall that appeared in his post office accounts on New Year’s Eve 2003: £1,103.68. A week later another loss emerged, this time of £4,230. Then another, and another. By March, the branch postmaster was £25,000 short. “I kind of knew from the second loss that it was not my fault,” said Castleton.
With no way of interrogating the Post Office’s computer systems himself, he called and emailed the IT helpline and its managers 91 times. All he received were instructions to re-check his figures, which he did dozens of times, and a few perfunctory reassurances before officials stopped responding altogether.
Castleton, an engineering technician and then briefly a stockbroker, had bought a post office in the port town of Bridlington, in the north of England, hoping to build a livelihood for his young family. Instead, a High Court ruling bankrupted him, ordering him to pay the £321,000 the Post Office had spent suing him over a debt that did not exist. Bankruptcy barred him from returning to stockbroking. He got by working as an engineer, at times sleeping in his car, while keeping up the mortgage payments on the family’s apartment above the now-closed post office.
The flawed IT system that led the state-owned British Post Office to prosecute more than 700 branch postmasters for thefts they did not commit, and to bankrupt many others, is now the subject of a public inquiry.
The scandal adds to a growing global reckoning with the human damage that automated systems can cause. In the US, the White House’s scientific advisers are calling for an AI “Bill of Rights” to guard against injustices caused by artificial intelligence.
Much of that effort focuses on how AI-powered algorithms can entrench social bias: female job applicants screened out in male-dominated fields, for example, or Black defendants flagged by AI tools as likely reoffenders and given harsher sentences by judges. But digital injustice is not confined to artificial intelligence, nor is it new. In 2011, the British government apologised to the families of two Royal Air Force pilots it had blamed for the 1994 Chinook crash in which they died; campaigners said the crash may have been caused by faulty software.
All of which raises the question of how to establish the truth in disputes that pit a person’s word against the presumed reliability of a computer.
“When patients are harmed, staff are often blamed,” writes Harold Thimbleby, emeritus professor of computer science at Swansea University, whose new book, Fix IT, describes such cases. What people overlook is the other suspect: flawed technology, or some hidden human-machine interaction. In one case, a hospital’s routine audit found a mismatch between measurements uploaded automatically to a database from clinical devices and the nurses’ paper notes. On the assumption that computers do not lie, the hospital accused the nurses of falsifying patient records, and some were prosecuted. Three weeks into the trial, however, the case collapsed when an IT support engineer from the device supplier revealed under cross-examination that he had “improved” the poorly maintained database by deleting records.
The Post Office scandal combined flawed software, inadequate disclosure and lies, abetted by the legal presumption that computers operate reliably, says Paul Marshall, a barrister who represented three convicted post office workers pro bono in the Court of Appeal. For more than a decade, judges and juries trusted Post Office witnesses’ assurances that its Horizon accounting system, supplied by the IT company Fujitsu, was reliable, and concluded that branch postmasters must have stolen the money it recorded as missing. But in 2019, the disclosure of long-known error logs affecting Horizon, which had existed all along, led a more searching judge to conclude that the system was “not remotely robust”.
The presumption of computer reliability places the onus on anyone challenging digital evidence to prove that the computer is unreliable. That can be “extremely difficult” when defendants lack IT knowledge and have no access to the systems, says Martyn Thomas, emeritus professor of IT at Gresham College. Computers can also misbehave while appearing to work perfectly. That was the bind the post office workers were caught in, says Marshall: “They had no basis for questioning what the computer was saying, because they did not know how it worked or whether it could fail, and the Post Office did not tell them.”
Asking the right questions also matters when email evidence is in dispute. In 2018, Peter Duffy, a consultant urologist, won an unfair dismissal case against the NHS’s University Hospitals of Morecambe Bay Trust (UHMBT). He then published a book alleging failings connected to a patient’s death, prompting the trust and the NHS regulator, NHS Improvement, to commission an external investigation.
The investigation, conducted in 2020-21, surfaced two emails purportedly sent by Duffy in 2014, as the patient’s condition deteriorated. Duffy says the emails were fabricated. Their admission into the record, however, implicated him in the patient’s poor care.
In a statement, UHMBT chief executive Aaron Cummins said “two separate and independent external reviews” of the investigation had “found no evidence that the emails in question were tampered with” and no evidence that they were not sent from Duffy’s NHS hospital email account.
Yet during Duffy’s 2018 employment tribunal hearing, a judge had ordered the trust to search for all correspondence relating to the patient’s death, and none of the trust’s digital searches turned up the disputed emails. Nor did the emails appear in material gathered by two internal NHS investigations into the death, or in responses to freedom of information requests made by the deceased’s family and by Duffy himself.
“How can an organisation’s cyber security assessment today validate emails that were supposedly sent six years ago, that were never acknowledged or acted on by their recipients, and that conflict both with contemporaneous clinical observations and with what the family of the deceased remembers?” Duffy asked.
Without commenting on Duffy’s case specifically, Thimbleby says that when digital searches are carried out and a court is told no more emails can be found, “you cannot assume that this is true”. There should be robust evidence for the emails’ existence, “like backups”, he says.
From banking apps to the algorithms that shape job prospects, computer-controlled systems have worked their way into daily life in countless small ways since the first Post Office trials. But while technology’s reach has grown, the law’s ability to deal with its failings has not kept pace. “You can qualify as a lawyer without knowing anything about electronic evidence, even though it forms part of almost every case,” says Stephen Mason, co-editor of the legal text Electronic Evidence and Electronic Signatures. “It really does matter,” adds Marshall, pointing to the jailing of sub-postmistress Seema Misra for the alleged theft of funds that the Post Office’s Horizon system showed as missing. “On four separate occasions before three different judges,” Marshall said, Misra requested disclosure of Horizon’s error logs and was refused. A decade later, those error records led to her conviction being overturned.
In a 2020 paper submitted to the Ministry of Justice, Marshall and his co-authors recommended that the legal presumption of computer reliability be reconsidered. Starting from the premise that all computer programs contain errors, the authors wrestle with how to prevent injustice without clogging courtrooms with speculative challenges, such as motorists demanding investigations into speed-camera software.
The paper recommends that, as standard procedure, organisations relying on computer-derived evidence be required to disclose their systems’ error logs and known vulnerabilities. For well-run operations this should be straightforward, says Thomas of Gresham College; otherwise “the onus should fall on the institutions to prove that it was not the computer that got things wrong”.
While companies routinely hire IT consultants to give expert evidence in court cases, individuals can rarely afford such experts. To reduce that inequality, Swansea University’s Thimbleby proposes that independent IT panels be set up to advise courts on when digital evidence can reasonably be relied upon. “I think in that world people will be able to say, ‘of course this is an IT problem and we have the right to go to the IT committee’, and then the committee will take an informed view,” he said.
If such a system had been in place when the Post Office brought its cases, the Castleton family could have led a very different life. Now a factory engineer working night shifts rather than a businessman, Castleton says the strain has been relentless. “I felt like I was drowning and no one did anything to save me,” he said. “I was just unimportant.”
