After a (longer than expected) hiatus, we are back with the aim of keeping our website up to date. As you will see, our work has not stopped during the past three years. In the coming posts, we will keep you informed about the impact AdViSHE has made on the field of validation of health economic models. On the occasion of AdViSHE’s (almost) fifth birthday, we are also launching a new Twitter account (@AdViSHE). Regards from the AdViSHE team.
The validation assessment tool AdViSHE supports structured reporting on all relevant aspects of validation (conceptual model, input data, implemented software program, and model outcomes), but it does not prescribe any particular methodology. In a recent Value in Health article, Isaac Corro Ramos and colleagues provide further details on one of these aspects: the validation of health economic (HE) model outcomes against empirical data. In particular, they present a new Bayesian method to determine how well HE model outcomes compare to empirical data. Validity is judged against a pre-established accuracy interval within which the model outcomes should fall. The method uses the outcomes of a probabilistic sensitivity analysis and yields a posterior distribution for the probability that the HE model outcomes can be regarded as valid.
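The paper contains the exact statistical formulation; as a rough sketch of the underlying idea only, one could count the probabilistic sensitivity analysis (PSA) draws that fall inside the accuracy interval and place a Beta posterior on the probability of a valid outcome. Everything below (the `psa_outcomes` data, the observed value, the ±25% interval, the Beta(1, 1) prior) is an illustrative assumption, not the authors’ implementation:

```python
import random

random.seed(42)

# Hypothetical PSA outcomes; in practice these come from the HE model itself.
psa_outcomes = [random.gauss(100.0, 15.0) for _ in range(1000)]

observed = 100.0   # empirical outcome the model is validated against
deviation = 0.25   # pre-established accuracy interval of +/- 25%
lower, upper = observed * (1 - deviation), observed * (1 + deviation)

# Count the PSA draws that fall inside the accuracy interval.
hits = sum(lower <= y <= upper for y in psa_outcomes)
n = len(psa_outcomes)

# With a uniform Beta(1, 1) prior on the probability of a valid outcome,
# the posterior is Beta(1 + hits, 1 + n - hits); report its mean.
posterior_mean = (1 + hits) / (2 + n)
print(f"{hits}/{n} draws inside the interval; "
      f"posterior mean validity: {posterior_mean:.2f}")
```

The full method in the paper goes further than this toy count, jointly accounting for model bias and parameter uncertainty rather than reducing validity to a single hit rate.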
The applicability of their method was demonstrated in a case study. A published diabetes model (MICADO) was used to validate the outcome “number of patients who are on dialysis or with end-stage renal disease”. The results indicate that a high probability of a valid outcome is associated with relatively wide accuracy intervals: a 25% deviation from the observed outcome corresponded to approximately 60% expected validity.
The paper concludes that current practice in HE model validation can be improved by an alternative method that assesses whether the model outcomes fit the empirical data at a predefined level of accuracy. This method has the advantage of capturing both model bias and parameter uncertainty, and it yields a quantitative measure of the degree of validity that penalises models which predict the mean of an outcome correctly but with overly wide credible intervals.
Isaac Corro Ramos, George A.K. van Voorn, Pepijn Vemer, Talitha L. Feenstra and Maiwenn J. Al. “A New Statistical Method to Determine the Degree of Validity of Health Economic Model Outcomes against Empirical Data”. Value in Health. May 2017. DOI: http://dx.doi.org/10.1016/j.jval.2017.04.016
The validation assessment tool AdViSHE has been used to validate a cost-effectiveness model for frail older persons. In this paper, Jonathan Karnon and colleagues present a cost-utility analysis of a physiotherapy-based intervention for frail Australian individuals. Validation of the cost-effectiveness model was attempted for each of the validation categories described in AdViSHE.
Face validity of the model structure, inputs, and outputs was established through a detailed face-to-face presentation to clinicians, researchers, and consumers.
Verification of the implemented models was performed using independent programming of different sections and extreme-value testing. Cross-validation was not possible, owing to the absence of comparable models. Several outcomes were compared to previously published values.
Karnon J, Afzali HH, Putro GV, Thant PW, Dompok A, Cox I, Chikhwaza OH, Wang X, Mwangangi MM, Farransahat M, Cameron I. “A Cost-Effectiveness Model for Frail Older Persons: Development and Application to a Physiotherapy-Based Intervention.” Appl Health Econ Health Policy. 2017 Mar 27. doi: 10.1007/s40258-017-0324-z. [Epub ahead of print]
Based on AdViSHE, Pieter de Boer and colleagues investigated the reporting habits of recently published health economic modelling studies on seasonal influenza and early breast cancer. After all, transparent reporting of validation efforts gives stakeholders better insight into the credibility of model outcomes.
They conclude that although validation is deemed important by many researchers, this is not reflected in the reporting habits of health economic modelling studies. Systematic reporting of validation efforts would be desirable to further enhance decision makers’ confidence in health economic models and their outcomes.
The paper (open access) can be downloaded here. The full reference is:
- P.T. de Boer, G.W.J. Frederix, T.L. Feenstra, P. Vemer “Unremarked or Unperformed? Systematic Review on Reporting of Validation Efforts of Health Economic Decision Models in Seasonal Influenza and Early Breast Cancer” PharmacoEconomics 2016 Sep;34(9):833-45. doi: 10.1007/s40273-016-0410-3
This website contains all the relevant information on the project that spawned AdViSHE. The paper that presented AdViSHE was published in PharmacoEconomics in early 2016. The full text can be found on their website, or downloaded here. The full reference is:
P. Vemer, I. Corro Ramos, G.A.K. van Voorn, M.J. Al, T.L. Feenstra “AdViSHE: A Validation-Assessment Tool of Health-Economic Models for Decision Makers and Model Users” PharmacoEconomics (2016) 34:349–361 DOI 10.1007/s40273-015-0327-2 (Open Access)
Other publications that were part of this project were:
- Comments on the validation guidelines of the ISPOR-SMDM Modeling Good Research Practices Task Force, published in Value in Health (2013). They can be found here.
- An argument for involving patients in the modelling process, published in Applied Health Economics and Health Policy (2016). It can be found here.
- A new Bayesian framework to statistically measure the accuracy of health economic decision models, using stepwise validation as new data become available. The paper is currently under review.