
Five Steps to Develop Active Learning in Value Assessment

With inflamed rhetoric and distrust surrounding health care costs and the evaluation of cost-effectiveness, it’s sometimes difficult to believe that lasting progress on value assessment can ever be achieved.

After recently spending a week at the annual ISPOR meeting, the pre-eminent gathering of health economics and outcomes researchers, I came away with one clear impression: there is broad consensus that value assessment is important, but vast disagreement about how to measure value.

Much of the controversy revolves around the quality-adjusted life year, or QALY. Does it work? Is it discriminatory? Should it be scrapped, or can it be improved? These questions and more were posed at a panel I moderated at ISPOR. And while the debate around the value of the QALY is important, I believe it distracts us from taking action to improve methods and models that can be both credible and relevant to the real-world decisions that patients, their clinicians and purchasers face.

Step back and think about it: To determine whether a health care intervention is cost-effective, we currently rely solely on one static, decades-old algorithm that cannot reflect the experience of complex and comorbid conditions. That inflexible commitment, made in the absence of alternatives, is both short-sighted and in conflict with the needs of decisionmakers in the U.S. health care system.

What’s an alternative approach? Instead of focusing solely on QALY-based methods in cost-effectiveness analysis, I propose adopting an active learning approach in which all stakeholders collaborate to define and improve the multiple methods and models that provide insight into value assessment.

Moving from theory to practice requires action. Here are five concrete steps to move value assessment away from a static snapshot in time and toward a real-time, active learning approach:

Take the transparency challenge: Providing complete public access to models and their underlying data, and building them in open-source programming languages, would allow for model comparison, evaluation and validation. Committing to open source also sends an important signal about the importance of collaboration and finding common ground.

Embrace flexibility: U.S. health care has multiple decisionmakers with multiple considerations. To think we can use one model to arrive at one answer about what an intervention is worth for a parent, payer and innovator is shortsighted. Adopting an active learning system for value assessment, in which tailored approaches are built, will more accurately account for differences in patient populations and preferences in decisionmaking.

Allow for iteration: Employing an active learning system, rather than a rigid model that provides one answer at a time, accounts for ongoing innovation and discovery in science and health economics. If we rely on a single way to judge the cost-effectiveness of a health care intervention, we deprioritize innovation because we depend on old methods to account for newly discovered value.

Improve incorporation of non-clinical data: Another argument for a learning system is the ability to incorporate non-clinical data, including real-world data and patient perspectives, into cost-effectiveness analysis. We are making progress by designing some clinical trials to be more patient-centered, which will eventually improve clinical evidence, but we need to go further and faster by employing rich data sets that do not yet plug into the clinical evidence base but are important nonetheless. While we wait for the decades-long maturation of clinical evidence, we are ignoring real-world evidence, registries and other rich data sets that offer important and actionable cost-effectiveness information.

Embrace the theory of “and”: Choosing between the QALY (or a new and improved QALY) and multi-criteria decision analysis (MCDA) is a false choice. Advanced models must look at cost-effectiveness from multiple vantage points to give a clearer picture of what drives value. Models, like those IVI constructs, should be able to showcase and compare analyses using both cost-effectiveness analysis and MCDA just as easily as a web browser displays different apps to users.

Let’s recognize that the science and practice of value assessment are young in the United States. To paraphrase a paper in Value in Health: with no agreement on the proper way to determine and define value, we should explore different methods and systems and make them more accessible to different decisionmakers in real situations with real consequences.

Instead of risking dependence on a single approach because the system is comfortable with its simplicity, we need commitment from all actors to embrace an active and transparent learning environment in which we create, test, refine and utilize multiple approaches to assessing value. Doing so can ultimately build trust that our collective understanding and calculation of value is both scientifically credible and relevant to real life.


Jennifer Bright is the executive director of IVI and president of the consulting group Momentum Health Strategies.

