Computer “Death Panels” Are Killing Health Care Workers; Unintentional and Hidden Biases Determine Who Gets COVID Vaccine


Computerized Death Panels Now Deciding Who Will Live Or Die

WASHINGTON, D.C. (December 24, 2020) - Although "death panels" under Obamacare were discredited, there are at least two different kinds of computerized death panels now actively at work deciding who will live and who will die from COVID-19, suggests public interest law professor John Banzhaf.

Both involve algorithms - a set of instructions for computers, something like a formula or recipe - which decide how the vaccine is to be distributed both nationally and locally at places like individual medical facilities, says Banzhaf, a computer expert who obtained the first copyrights on computer programs and created the computer-based Banzhaf Index.

GIGO (garbage in, garbage out) may now have been replaced by BIDO (bias in, death out), says Banzhaf, since any delay in getting the vaccine to the health care workers most in need of this life-saving protection is certain to result in deaths in this most at-risk population.

Vaccine Distribution

For example, the Stanford Health Care center refused to provide the vaccine first to those with the greatest exposure to the coronavirus because the algorithm used a scoring system which prioritized distribution based upon age rather than risk.

Under that system, a 65-year-old doctor working from home got 1.15 additional points solely because of age, while someone working in a very high-risk area where 50% of the employees had tested positive got only 0.5 points.

Apparently as a result, only 7 of the hospital’s resident physicians, those most at risk and arguably most essential, were among the first 5,000 to receive the vaccine.
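The perverse outcome can be illustrated with a short sketch. The actual Stanford formula has not been published; the point values below (1.15 for age, 0.5 for working in a high-positivity unit) are the figures reported above, and everything else - the function name, the thresholds, the remaining inputs - is hypothetical.

```python
# Hypothetical sketch of an age-weighted vaccine priority score.
# Only the two point values (1.15 and 0.5) come from the reported
# figures; the structure of the formula is illustrative.

def priority_score(age: int, unit_positivity: float) -> float:
    score = 0.0
    if age >= 65:
        score += 1.15          # reported age bonus
    if unit_positivity >= 0.5: # half the unit's staff tested positive
        score += 0.5           # reported exposure bonus
    return score

# A 65-year-old working from home (no unit exposure) outranks a
# 30-year-old resident on a ward where 50% of staff are positive.
remote_senior = priority_score(age=65, unit_positivity=0.0)
frontline_resident = priority_score(age=30, unit_positivity=0.5)
print(remote_senior > frontline_resident)  # True: 1.15 > 0.5
```

The design flaw is that age and exposure are scored on the same additive scale, so a large enough age bonus always swamps the exposure term, no matter how dangerous the worker's actual assignment is.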

No wonder high-risk workers staged a demonstration at which they chanted “F--- the algorithm” and “Algorithms suck.”

An Algorithm With A Racial Bias

At the health-care provider Optum, its algorithm reportedly disadvantaged Black patients by ranking them lower in terms of needed care than White patients. The company's response: “predictive algorithms that power these tools should be continually reviewed and refined.”

Even Tiberius, the master program that determines how the vaccine is to be distributed nationally, may be discriminating on the basis of race and causing the unnecessary deaths of Black health care workers, says Banzhaf.

The program apparently relies in part on census data in determining how much vaccine should be shipped to different areas of the country.

But studies show that the census undercount for African American men is much larger than for the total male population, and is especially acute for those ages 30-49 - prime ages for Black health care workers.

That undercount could result in the unnecessary deaths of many Black doctors and nurses, suggests Banzhaf, because African Americans - for a variety of reasons - are infected with COVID-19 at almost 3 times the rate of comparable Whites, and they are more than twice as likely to die from the virus.
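The mechanism described above - doses allocated in proportion to census population counts - can be sketched in a few lines. All of the numbers here are hypothetical; only the proportional-allocation idea comes from the description of how the program relies on census data.

```python
# Illustrative sketch: doses shipped to a region in proportion to its
# counted (census) population. Every figure below is hypothetical.

def allocate(doses_total: int, counted_pop: int, national_counted: int) -> int:
    """Region's dose share under strict per-capita allocation."""
    return doses_total * counted_pop // national_counted

true_pop = 100_000               # hypothetical true regional population
undercount_rate = 0.05           # hypothetical 5% census undercount
counted = int(true_pop * (1 - undercount_rate))

national = 10_000_000            # hypothetical national counted population
doses = 1_000_000                # hypothetical national dose supply

print(allocate(doses, counted, national))   # 9500 doses with the undercount
print(allocate(doses, true_pop, national))  # 10000 doses if fully counted
```

Under these assumptions, a 5% undercount translates directly into 5% fewer doses for the region - and the shortfall falls hardest on exactly the undercounted groups.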

While the media regularly report every time a Black person is killed by police in a questionable shooting, they are much less likely to report each time a computer death panel, operating under a flawed and/or biased algorithm, kills a Black health worker, says Banzhaf.