Reconnaissance Should Happen Long Before Body Count

Screenshot of PDC’s interactive map services

For anyone who has listened to me bloviate about baseline Covid-19 data gathering, and the need to employ cell phones in the effort, it appears to be happening at last:

Thank Jonathan Yen for bringing it to my attention.  You’ll all enjoy my silence on the matter now.

The problem, in the simplest possible terms, is that for three months, national health agencies have not been able to give meaningful answers on the most relevant questions: transmission rates, infection rates, speed of spread, amplitude and duration of wave, peak health system demand, or, embarrassingly, even death count.  All of these depend on ratios, and without a basis for estimating what the denominator value should be (based on broad population symptom sampling, relative to baselines set in non-COVID years), the best numerators we’ve got (hospitalizations and deaths) don’t tell us anything.

In the absence of a solid denominator in any of these ratios, due to inadequate testing, public health planners couldn’t do any predictive mapping or modelling, which is why their estimates are so broad in range and changeable day to day. 

In the plainest possible language, they have traded methodological purity for thousands of avoidable deaths.  Their daily confirmed infection numbers are determined by test availability, not disease spread, so for empirical certainty about the tiny sample in their grasp, they have been underreporting current infection numbers by orders of magnitude.  This can be demonstrated by easy arithmetic: work back from confirmed COVID deaths to estimate the number of infections when they actually occurred, at least a month prior, and then double that number every 3-4 days forward to the present day to capture the exponential rate of transmission.  By this method, it’s obvious that daily reports of infection spread in February and March, relying on scanty lab-confirmed cases, were as useless as the interim strategies for tracking and containment derived from that data. 
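The arithmetic above can be sketched in a few lines.  The fatality rate, reporting lag, and doubling time below are illustrative assumptions chosen to match the figures discussed here, not measured epidemiological values:

```python
# Back-of-envelope estimate of current infections from today's death count.
# All three parameters are illustrative assumptions, not measured constants.

def estimate_current_infections(
    deaths_today: float,
    ifr: float = 0.01,           # assumed infection fatality rate (~1%)
    lag_days: int = 28,          # assumed delay from infection to reported death
    doubling_days: float = 3.5,  # assumed doubling time of transmission
) -> float:
    # Deaths reported today reflect infections roughly lag_days ago.
    infections_then = deaths_today / ifr
    # Project forward to today at the assumed exponential doubling rate.
    doublings = lag_days / doubling_days
    return infections_then * 2 ** doublings

# e.g. 100 deaths today implies ~10,000 infections a month ago, which,
# doubled every 3.5 days for 28 days (8 doublings), suggests on the
# order of 2.5 million current infections.
print(round(estimate_current_infections(100)))
```

The point is not the precise output, which swings wildly with the assumed parameters, but that even crude estimates of this kind dwarf the lab-confirmed counts being reported daily.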

Obviously the response would have been swifter and more focused if the daily death count wasn’t the most reliable data point for guessing what happened a month previous.  Small wonder officials could never guess what was going to happen a month hence.  The value of planning is registered whole and entire in the future (apologies to Kant), but with no useful baseline or trend data, all we could do was count bodies and speculate about what must have happened, what might be happening, and what might happen next.  After three months of this nincompoopery, there are still breathless expressions of surprise at the pace, direction, and ferocity of this virus. 

If they had admitted that their lab tests would be too slow and sparse for emergency planning, and had employed mass symptom reporting by phone app, there would have been a coherent response sooner, and it would have been better targeted.  The degree of inaccuracy from these methods would have been far less than that produced by the tiny lab test sample.  Both were needed: certainty in individual cases, and prediction of the pace and patterns of COVID spread.  A symptom survey app can be launched overnight and update constantly for real-time results.  We all know how long it took to get lab testing started, and how long each test takes from swabbing by a medical professional to a final report from the lab.  One source of data is fast and cheap, the other slow and expensive.  As usual, the public health bureaucracies chose to rely on the latter.  Now that it’s too late, we’re coming to our senses. 

It's easy to blame politicians for being slow to admit the scale of this pandemic, and we all want to lionize public health officials for their apolitical leadership.  However, their preferred 19th century data gathering methods effectively assured that the response would be too late, spastically uncoordinated, and ultimately reliant on citizens being shut in their homes.  As a public health strategy, this only superficially differs from the methods employed against the bubonic plague in 1665.  The similarities are scary given that we’ve had three and a half centuries of advancements in microbiology and data analytics since then, yet are still using a playbook developed by Charles II and the Lord Mayor of London.   

Anyway, despite being three months too late, I’m relieved to see the big data capabilities of the digital age brought to bear on this problem.  Although public health agencies have stuck doggedly to their antiquated, narrow, slow methods, the use of 21st century data capture and analysis, using proven visualization and modelling tools, will at least help guide governments and citizens through a staged recovery process.  With a four-week lapse between infections and reported deaths, we would otherwise remain in the dark until all the mortuaries have emptied and the whitecoats wave an “all clear” to us.
