What COVID Challenges Reveal About Data Handling

COVID-19 has presented an unprecedented challenge to global society. The pandemic has transformed the way we live and work, inflicting untold damage on individual health and national economies, and radically altering the way we interact with the world.

The data of this pandemic is staggering. The identified global caseload is currently approaching 51 million, spread across more than 200 countries, with confirmed deaths passing the one million mark in September. Expert consensus is that both figures are drastic underestimates. The ability to capture and process data simply hasn’t kept pace with the pandemic itself. There has never been a more critical example of the fundamental need for appropriate data handling.

Data drives informed response

Data has been fundamental in the fight against COVID-19. It powers infection analysis, underpins efforts to track and trace contacts, and provides the foundation for critical decisions about healthcare capacity and the measures needed to tackle the virus.

Digital epidemiological surveillance has been a key feature of the global response to COVID-19. Population-surveillance systems have combined traditional health data with billions of additional data points such as mobile phone location, Bluetooth proximity, immigration data, and other sources. Symptom-tracking apps have been released in multiple countries, with hundreds of millions of users tracking and updating their symptom status. India’s COVID app Aarogya Setu alone has been downloaded more than 150 million times.

This massive and sprawling ecosystem of data points has placed a remarkable resource burden on nations around the world. The rapid spread of the virus, and the parallel growth in data volumes from COVID-19 testing and tracking, have created unique challenges in adopting a robust, effective, and integrated data platform solution.

These challenges were highlighted in a recent news story from the UK, where the use of a simple Microsoft Excel spreadsheet as a format for data sharing reportedly caused the loss of almost 16,000 positive COVID-19 test results. Data experts were quick to point out the big data challenges inherent in such a platform choice. We know, because we had those same conversations at Neural Technologies.

This example illustrates the ever-present risk of rapidly adapting and retrofitting legacy systems to meet emerging data needs. Importing data into Excel from a standard CSV file was identified as the source of the problem: the incoming data volume ultimately exceeded what the chosen Excel format could hold.

CSV is a standard data format and can represent data of any volume. An Excel worksheet, however, has a hard row limit that depends on the file format: the legacy .xls format holds 65,536 rows, while the modern .xlsx format holds 1,048,576. Rows beyond that limit are cut off when data is forced into the format. In this case, not only was data lost, but the absence of data tracking and robust oversight processes meant there was no automated system to identify the failure.
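As a rough illustration of the missing safeguard, the Python sketch below counts the rows in a CSV export and warns before an Excel import would silently truncate it. The file name is a hypothetical stand-in; the row limits are Excel’s documented worksheet constraints.

```python
import csv

# Documented Excel worksheet row limits.
XLS_MAX_ROWS = 65_536        # legacy .xls format
XLSX_MAX_ROWS = 1_048_576    # modern .xlsx format

def csv_fits_excel(path: str, target_format: str = "xls") -> bool:
    """Count the rows in a CSV export and warn if importing it into
    the given Excel format would silently drop data."""
    limit = XLS_MAX_ROWS if target_format == "xls" else XLSX_MAX_ROWS
    with open(path, newline="") as f:
        row_count = sum(1 for _ in csv.reader(f))
    if row_count > limit:
        print(f"WARNING: {path} contains {row_count} rows; a .{target_format} "
              f"worksheet holds at most {limit}, so {row_count - limit} rows "
              f"would be lost on import.")
        return False
    return True

# Hypothetical file name, for illustration only.
csv_fits_excel("daily_positive_tests.csv", target_format="xls")
```

A dozen lines of validation at the hand-off point is all it would take to turn a silent loss into a loud alarm.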

It is not the only example of such Excel-driven errors. Financial institution JP Morgan lost more than USD 6 billion in 2012 after a risk model built on error-prone Excel spreadsheets understated its exposure. In 2003, a cut-and-paste error that misaligned rows in a spreadsheet cost power company TransAlta USD 24 million. These cases are frighteningly common.

Excel is an accessible, basic tool for simple data needs. That is precisely why it is unsuited to complex, high-volume, and critical tasks such as COVID-19 test and trace or high-value financial records. It does not provide a clear audit trail, and it cannot meet the automation requirements of the modern high-volume data landscape.
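What such automation might look like is straightforward to sketch. The append-only ingestion log below is a minimal illustration; the log path, field names, and figures are assumptions rather than any production schema. It records received and stored row counts per batch, so a shortfall is flagged rather than silently absorbed.

```python
import hashlib
import json
from datetime import datetime, timezone

AUDIT_LOG = "ingestion_audit.jsonl"  # hypothetical log path

def log_batch(batch_id: str, source: str,
              rows_received: int, rows_stored: int) -> None:
    """Append one tamper-evident line per ingested batch, so any count
    mismatch is recorded and alertable instead of vanishing silently."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "batch_id": batch_id,
        "source": source,
        "rows_received": rows_received,
        "rows_stored": rows_stored,
        "complete": rows_received == rows_stored,
    }
    # A checksum over the canonical JSON makes later edits detectable.
    entry["checksum"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    with open(AUDIT_LOG, "a") as f:
        f.write(json.dumps(entry) + "\n")

# Illustrative batch in which thousands of rows failed to land.
log_batch("batch-001", "lab_csv_feed", rows_received=66_000, rows_stored=50_000)
```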

Simplicity can solve complexity

The specific failure of the UK COVID-19 testing and recording system is just one case, but there are many similar examples of critical challenges caused by inappropriate data processes. Ensuring trusted data veracity across all data sources is one of the most critical problems associated with big data.

The question of data veracity was at the heart of another major COVID-19 data failure, surrounding the controversial treatment hydroxychloroquine. A multinational registry analysis published by The Lancet was ultimately retracted after researchers, beginning with a group in Australia who found the figures did not match their national records, revealed that the origin of the data was unverified and potentially incorrect. The study had initially led the World Health Organization to pause global trials of the drug, showing the dangers of decisions built on unreliable data.
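Basic veracity checks are not hard to automate. The sketch below uses a hypothetical record shape (the field names are illustrations, not any real registry schema) and flags records that lack verifiable provenance instead of silently accepting them.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class TestResult:
    # Hypothetical record shape; real schemas vary by country and system.
    patient_id: str
    test_date: date
    result: str          # expected: "positive" or "negative"
    source_system: str   # upstream feed that supplied the record

def veracity_problems(record: TestResult, today: date) -> list:
    """Return the checks this record fails; an empty list means it passes."""
    problems = []
    if not record.patient_id:
        problems.append("missing patient_id")
    if record.test_date > today:
        problems.append("test_date is in the future")
    if record.result not in ("positive", "negative"):
        problems.append(f"unexpected result value: {record.result!r}")
    if not record.source_system:
        problems.append("no traceable source system")
    return problems

record = TestResult("P-001", date(2020, 11, 1), "positive", source_system="")
print(veracity_problems(record, today=date(2020, 11, 9)))
# -> ['no traceable source system']
```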

A professional, integrated, and fit-for-purpose solution is fundamental to any vital decision-making process, and trusted, complete data must be its foundation. That is true whether the task is financial analysis or a fundamental health concern.

The right system is designed to make complex data simple. That is why Neural Technologies’ orchestration solution utilizes open APIs to provide simple integration with both new and legacy data sources. That approach allows integration across all data sources, creating a rapidly scalable system that ensures decisions are made with all the relevant information available.
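To illustrate the general pattern, rather than any specific product implementation, the sketch below pulls two hypothetical sources, one modern and one legacy (the URLs are placeholders), into a single stream, tagging every record with its origin so decisions remain traceable to a data source.

```python
import json
from urllib.request import urlopen

# Hypothetical endpoints standing in for new and legacy sources; real
# integrations would use whatever open APIs each system actually exposes.
SOURCES = {
    "lab_feed": "https://api.example.org/v1/results",
    "legacy_export": "https://legacy.example.org/export.json",
}

def fetch_all(sources: dict) -> list:
    """Pull every source into one combined stream, tagging each record
    with its origin so downstream decisions stay auditable."""
    combined = []
    for name, url in sources.items():
        with urlopen(url) as resp:
            for record in json.load(resp):
                record["_source"] = name  # keep provenance with the data
                combined.append(record)
    return combined
```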

This question of appropriate data use underscores one further fundamental challenge that the COVID-19 pandemic poses for business: models are only as current as the data behind them, and no period has shifted that foundation more seismically than 2020. Modeling built on 2019 data may well be obsolete, creating a fundamental need to integrate contemporaneous, timely data into decision making.

While this is undoubtedly a challenging time for business and society, it also offers a remarkable opportunity to reassess and deliver on critical data priorities.

With the COVID-19 pandemic catapulting digital adoption forward by several years, businesses are likely to face rapidly growing data handling needs in the future. What is evident from the data challenges emerging during this crisis is that it is better to embed a robust system with trusted data sources from the start than to struggle to adapt as data volumes grow. If you want to explore that opportunity in partnership, Neural Technologies is available to assist you.
