Government reliance on manual entry of COVID-19 data into basic software has caused data errors and public confusion around the world. The UK used a Microsoft Excel spreadsheet to track COVID-19 cases – until the growing caseload exceeded what the file format could hold. The maxed-out file stopped loading cases into the government’s system and left more than 15,000 cases out of the national count. As a result, exposed people were not contact-traced and quarantined, causing an additional 125,000 infections and an estimated 1,500 deaths. In the US state of Ohio, one employee’s misstep kept 4,000 COVID-19 deaths out of the state’s reported totals. The newest entry on the COVID-19 count glitch list comes from Spain.
On March 10, The Lancet, a respected peer-reviewed medical journal, published Spain’s child COVID-19 mortality rate as roughly two to four times that of the US, UK, Italy, Germany, France and South Korea. The paper said that 54 children (defined as below 19) had died of COVID-19 in the country, putting Spain’s reported death rate for kids aged 10-19 at a staggering 4.9% – at least 2.92 percentage points higher than in the other countries in the report.
Right after the Lancet paper was published, Pere Soler, a paediatrician at a hospital in Catalonia, started getting calls. Concerned reporters were trying to reach him for comment. “The first question that I received was, ‘Have you been lying to us?'” Soler says. He and other prominent paediatricians around the country had been in close contact with a circle of reporters throughout the pandemic, keeping them updated on child COVID-19 research and school reopenings. Such a high child mortality rate did not square with the numbers doctors had been seeing and feeding to the media.
As a re-examination of the data would soon reveal, only seven children had actually died of COVID-19. (The Lancet data has since been corrected.)
“Even though I didn’t know what the problem was, I knew it wasn’t the right data,” says Soler, who realised as much once he got his hands on the Lancet paper. “Our data is not worse than other countries. I would say it is even better,” he says. Paediatricians across the nation contacted Spain’s main research institutes, as well as hospitals and regional governments. Eventually, they discovered that the national government had somehow misreported the data.
It’s hard to pinpoint exactly what went wrong, but Soler says the main issue is that the deaths of patients over 100 were recorded as children’s deaths. He believes the system couldn’t store three-digit ages and instead registered them with a single digit – a 102-year-old, for example, became a two-year-old in the system. Not all centenarian deaths were misreported this way, Soler notes, but at least 47 were. That was enough to inflate the child mortality rate dramatically, he explains, because the number of children who had actually died was so small: when the underlying count is tiny, any small mistake produces a huge change in the data.
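The truncation Soler hypothesises can be sketched in a few lines. Everything here is illustrative – the field width, the function name and the modulo behaviour are assumptions, not details of Spain’s actual reporting system – but it shows how a too-narrow age field could turn a 102-year-old into a two-year-old, and how just 47 misfiled records were enough to turn seven real paediatric deaths into the 54 the Lancet paper reported.

```python
# Hypothetical sketch of the truncation bug: assume the age field can
# hold only a limited number of digits, so overflowing ages keep just
# their trailing digits (102 -> 2). This illustrates Soler's
# hypothesis, not the actual Spanish system.
def stored_age(age: int, max_digits: int = 2) -> int:
    """Return the age as a too-narrow field might record it."""
    limit = 10 ** max_digits
    return age if age < limit else age % limit  # 102 % 100 == 2

print(stored_age(45))   # 45 (fits, stored correctly)
print(stored_age(102))  # 2  (a centenarian filed as a toddler)

# Why so few bad records wrecked the statistic: the true paediatric
# death count was tiny, so 47 misfiled centenarians dominate it.
real_child_deaths = 7
misfiled_centenarians = 47
print(real_child_deaths + misfiled_centenarians)  # 54, the published figure
```

Because the paediatric numbers were so small to begin with, 47 stray records didn’t nudge the statistic – they multiplied it nearly eightfold.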
The Spanish national media’s report on the Lancet paper – a week after its initial publication – framed the data as erroneous and carried the correct numbers for Spain, preventing public uproar and tempering any potential storm of terrified parents. But even though numbers can be corrected with relative ease, government data needs to be accurate in the first place, as John Farmer, the chief technology officer for New York City, explained to Slate.
“In the 21st century, good tech and good data are necessary for good policy. That’s the core of it,” he said. “So I think that’s why it’s so important that all of us who do work in government really take a fine-tooth comb to the systems we’re using, and to the data that we’re producing, to make sure that it really does make sense.”
Computers were built by humans, he notes, so they sometimes make human errors too. When computers make mistakes, it’s up to the people behind the machine to question and problem-solve. “When our technology systems give us answers that don’t make sense or seem out of line, it’s on us to ask why and not just accept the data. Too often, people shrug their shoulders and say, ‘Well, that’s what the computer says.'” The swift action doctors and media took to sniff out the suspicious numbers in Spain and work to correct them is a model of how data should be interrogated.
Spain’s software error is also an example of what can happen when technology is built without the future in mind. “What’s key is when we build a technology system, we need to build it for the long term,” Farmer says. He drew a comparison between the digit-based software error in Spain and the Y2K bug, when, as 1999 rolled into 2000, computer programmers feared their systems would register the new millennium’s dates as 1900. “That just emphasises how important it is to think ahead. You can’t assume that what you’re building is only going to be used for five years or 10 years,” Farmer said.
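The parallel Farmer draws is concrete: many pre-2000 systems saved storage by keeping only the last two digits of the year and hard-coding the century – the same kind of too-narrow field Soler believes bit Spain’s age records. A minimal sketch (the function name and the 1900 pivot are illustrative assumptions, not any particular system):

```python
# Minimal illustration of the Y2K bug: a system that stores years in
# two digits and hard-codes the 20th century cannot tell 2000 from 1900.
def expand_year(two_digit_year: int) -> int:
    """Naive century assumption baked into many pre-2000 systems."""
    return 1900 + two_digit_year

print(expand_year(99))  # 1999 -- fine throughout the 20th century
print(expand_year(0))   # 1900 -- but "00" was meant to mean 2000
```

In both cases the data itself was fine; the field it was squeezed into was not.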
Even if we humans lack foresight, we need our software to be prepared for whatever next comes our way – pandemics and all.
This piece was originally published on Future Tense, a partnership between Slate magazine, Arizona State University and New America.