UnitedHealth Faces Class Action Lawsuit Over Alleged Misuse of Algorithm Denying Rehab Care

A class-action lawsuit has been filed against UnitedHealth Group and its subsidiary NaviHealth, accusing them of illegally using an algorithm to deny rehabilitation care to seriously ill patients. The suit, filed in the U.S. District Court for the District of Minnesota by the California-based Clarkson Law Firm, alleges that the companies are aware of the algorithm's high error rate but continue to use it, leading to wrongful claim denials for Medicare beneficiaries.

The lawsuit follows a recent STAT investigation that revealed UnitedHealth pressured medical employees to issue payment denials to individuals with Medicare Advantage plans based on an algorithm known as nH Predict, which forecasts a patient's length of stay. Internal documents disclosed that the company set a goal for clinical employees to keep patients' rehab stays within 1% of the days projected by the algorithm.


The complaint accuses UnitedHealth and NaviHealth of "systematically denying claims" for Medicare beneficiaries struggling to recover from debilitating illnesses in nursing homes. It claims that the companies knowingly used an algorithm with an alleged 90% error rate, denying patients' claims while counting on the fact that only a tiny percentage of patients, roughly 0.2%, file appeals to overturn the decisions.

In response to the allegations, a UnitedHealth spokesperson stated that the nH Predict tool is not used to make coverage determinations but rather serves as a guide to inform providers and caregivers about a patient's needs. The company asserts that coverage decisions are based on CMS coverage criteria and the terms of the member's plan, and it maintains that the lawsuit has no merit.

The lawsuit, brought on behalf of the families of two deceased Wisconsin residents who had Medicare Advantage coverage through UnitedHealth, details instances in which patients were allegedly denied necessary care. One case involves Gene Lokken, 91, who fractured his leg and ankle and, after a stay in a nursing home, was approved for only 19 days of physical therapy before being deemed safe to go home. The other involves Dale Tetzloff, 74, who suffered a stroke and had his care cut off after 20 days, despite his doctor's recommendations for long-term care.

The lawsuit alleges breach of contract, breach of the duty of good faith and fair dealing, unjust enrichment, and insurance law violations in multiple states. It contends that elderly patients are prematurely removed from care facilities or forced to deplete family savings because an AI model's projections override real-life doctors' recommendations.

As the legal battle unfolds, the case sheds light on the potential consequences of relying on algorithms in healthcare decision-making and raises questions about the balance between technological tools and human judgment in ensuring proper patient care.