Another blow dealt to public faith in scientific models
By Amanda Devine
The random antibody testing of 3,000 people across the state of New York has delivered yet another blow to the faith we placed in the computer models that Governor Cuomo and President Trump used to shut down the economy and place all of America under virtual home detention.
The tests suggest 2.7 million people in New York state have developed antibodies through exposure, meaning that, with roughly 16,000 COVID-19 deaths, the state's mortality rate works out to a little under 0.6 percent. That is nowhere near as lethal as the dire 3.4 percent death rate the World Health Organization was billing early last month, and these figures will keep changing as more data comes to hand.
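The arithmetic behind that figure is simple division of the reported deaths by the estimated number of infections implied by the antibody survey. A back-of-envelope sketch, using only the numbers quoted above:

```python
# Back-of-envelope check of the fatality rate implied by the survey figures.
estimated_infections = 2_700_000  # New Yorkers with antibodies, per the state's sampling
reported_deaths = 16_000          # COVID-19 deaths in New York at the time

implied_fatality_rate = reported_deaths / estimated_infections
print(f"Implied fatality rate: {implied_fatality_rate:.2%}")  # roughly 0.59%
```

Note this is a crude estimate: it assumes the 3,000-person sample scales cleanly to the whole state and that deaths are fully counted, both of which are open questions.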
And it wasn’t all because we are perfect practitioners of self-isolation and hand washing.
The President’s coronavirus task force took into account those mitigation measures when it used an amalgam of models to predict that between 100,000 and 240,000 Americans likely would die.
A model from the University of Washington has since revised the projected death toll to 60,000 down from an initial 162,000.
As of Friday, 51,000 Americans had lost their lives, and now the updated models are edging closer to grim reality.
Of course, every death is one too many. But what we have seen over the past two months is that computer models are unreliable when it comes to predicting the future, and the premise we agreed on to vaporize 25 million jobs exaggerated the risks.
For people who understand how models work, their imperfections are no surprise.
Coronavirus task force tsar Dr. Anthony Fauci admitted last month that exaggeration is built into every computer model of disease he has dealt with: "They always overshoot," he told CNN. Computer models are not crystal balls, only a useful tool. Fauci calls them a "hypothesis." They allow you to test scenarios and provide an approximation of alternative realities. But they are no substitute for common sense and prudent judgment.
So, since the models were used as the rationale for shutting down our $23 trillion economy, we should at least understand their methodology. What were the assumptions fed into the models that led to such an overestimation of the risk? And did they include a scenario which allowed for a less drastic intervention than a total shutdown of the economy?
We know now that 64 percent of those who have died in New York were aged over 70. Of patients hospitalized with the disease, 94 percent had underlying conditions such as obesity or diabetes. Did the models include a scenario in which we focused efforts on protecting the elderly and infirm while allowing the young and healthy to keep the economy ticking?
We know now that less populated rural states have suffered less than New York, California and Michigan, so did the models consider targeted shutdowns and travel restrictions in hard-hit or dense areas, while allowing the economy to breathe in the rest of the country?
The consequences of overreach are dire. The International Monetary Fund this week warned the coronavirus has plunged the world into the worst economic crisis since the Great Depression. Famine, war and human misery are sure to follow.
As we emerge blinking in the sunlight in coming weeks to survey the smoldering remains of our economy, it’s not unreasonable to ask the question: did alarmist models persuade us to err too far on the side of caution?
The next challenge is to reopen the economy, and Cuomo reportedly wants to spend money we don't have on consultants, McKinsey & Co, to create — you guessed it — a computer model to tell him the best way to move forward. This time, let's demand to know what assumptions and political calculations go into these new models before they are used to determine our future.
Or better yet, forget McKinsey, and trust the innate common sense of the American people.