Article: Variable importance analysis with interpretable machine learning for fair risk prediction

Title: Variable importance analysis with interpretable machine learning for fair risk prediction
Authors: Ning, Yilin; Li, Siqi; Ng, Yih Yng; Chia, Michael Yih Chong; Gan, Han Nee; Tiah, Ling; Mao, Desmond Renhao; Ng, Wei Ming; Leong, Benjamin Sieu Hon; Doctor, Nausheen; Ong, Marcus Eng Hock; Liu, Nan
Issue Date: 2024
Citation: PLOS Digital Health, 2024, v. 3, n. 7, article no. e0000542
Abstract: Machine learning (ML) methods are increasingly used to assess variable importance, but such black-box models lack stability when sample sizes are limited and do not formally indicate non-important factors. The Shapley variable importance cloud (ShapleyVIC) addresses these limitations by assessing variable importance from an ensemble of regression models, which enhances robustness while maintaining interpretability, and by estimating the uncertainty of overall importance to formally test its significance. In a clinical study, ShapleyVIC reasonably identified important variables where random forest and XGBoost failed to, and generally reproduced its findings in smaller subsamples (n = 2500 and 500) where the statistical power of logistic regression became attenuated. Moreover, ShapleyVIC estimated a non-significant importance of race, justifying its exclusion from the final prediction model, in contrast to the race-dependent model produced by conventional stepwise model building. Hence, ShapleyVIC is robust and interpretable for variable importance assessment, with potential contribution to fairer clinical risk prediction.
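The abstract describes the ShapleyVIC workflow only at a high level: fit an ensemble of nearly optimal regression models, score variable importance in each, then use the spread of scores across the ensemble to test whether a variable's overall importance is significant. The sketch below illustrates that idea; it is not the authors' implementation, and both the rejection sampler and the standardized-coefficient importance proxy (the paper uses Shapley-based values) are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss


def nearly_optimal_ensemble(X, y, n_models=50, tolerance=1.05, seed=0):
    """Sample coefficient vectors whose training loss stays within
    `tolerance` times the optimal loss -- a crude stand-in for
    ShapleyVIC's nearly-optimal model sampler."""
    rng = np.random.default_rng(seed)
    base = LogisticRegression(max_iter=1000).fit(X, y)
    w_opt = np.concatenate([base.intercept_, base.coef_.ravel()])
    Xb = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    loss_opt = log_loss(y, 1.0 / (1.0 + np.exp(-Xb @ w_opt)))
    kept = []
    while len(kept) < n_models:
        w = w_opt + rng.normal(scale=0.1, size=w_opt.shape)
        loss = log_loss(y, 1.0 / (1.0 + np.exp(-Xb @ w)))
        if loss <= tolerance * loss_opt:  # keep only near-optimal models
            kept.append(w)
    return np.array(kept)


def importance_cloud(X, ensemble):
    """Per-model importance scores: |coefficient| * sd(feature), a
    standardized-coefficient proxy for the paper's Shapley-based score.
    Expects X as a 2D NumPy array."""
    return np.abs(ensemble[:, 1:]) * X.std(axis=0)  # drop intercept


# Aggregating the cloud: the ensemble mean ranks variables, and the
# spread supplies the uncertainty used to flag non-important ones.
# cloud = importance_cloud(X, nearly_optimal_ensemble(X, y))
# mean_imp, sd_imp = cloud.mean(axis=0), cloud.std(axis=0)
```

Under this framing, a variable such as race whose importance interval across the ensemble covers zero would be flagged as non-significant, matching the abstract's rationale for excluding it from the final model.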
Persistent Identifier: http://hdl.handle.net/10722/351370
ISI Accession Number ID: WOS:001416929300001

 

DC Field: Value
dc.contributor.author: Ning, Yilin
dc.contributor.author: Li, Siqi
dc.contributor.author: Ng, Yih Yng
dc.contributor.author: Chia, Michael Yih Chong
dc.contributor.author: Gan, Han Nee
dc.contributor.author: Tiah, Ling
dc.contributor.author: Mao, Desmond Renhao
dc.contributor.author: Ng, Wei Ming
dc.contributor.author: Leong, Benjamin Sieu Hon
dc.contributor.author: Doctor, Nausheen
dc.contributor.author: Ong, Marcus Eng Hock
dc.contributor.author: Liu, Nan
dc.date.accessioned: 2024-11-20T03:55:53Z
dc.date.available: 2024-11-20T03:55:53Z
dc.date.issued: 2024
dc.identifier.citation: PLOS Digital Health, 2024, v. 3, n. 7, article no. e0000542
dc.identifier.uri: http://hdl.handle.net/10722/351370
dc.description.abstract: Machine learning (ML) methods are increasingly used to assess variable importance, but such black-box models lack stability when sample sizes are limited and do not formally indicate non-important factors. The Shapley variable importance cloud (ShapleyVIC) addresses these limitations by assessing variable importance from an ensemble of regression models, which enhances robustness while maintaining interpretability, and by estimating the uncertainty of overall importance to formally test its significance. In a clinical study, ShapleyVIC reasonably identified important variables where random forest and XGBoost failed to, and generally reproduced its findings in smaller subsamples (n = 2500 and 500) where the statistical power of logistic regression became attenuated. Moreover, ShapleyVIC estimated a non-significant importance of race, justifying its exclusion from the final prediction model, in contrast to the race-dependent model produced by conventional stepwise model building. Hence, ShapleyVIC is robust and interpretable for variable importance assessment, with potential contribution to fairer clinical risk prediction.
dc.language: eng
dc.relation.ispartof: PLOS Digital Health
dc.title: Variable importance analysis with interpretable machine learning for fair risk prediction
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1371/journal.pdig.0000542
dc.identifier.scopus: eid_2-s2.0-85201515446
dc.identifier.volume: 3
dc.identifier.issue: 7
dc.identifier.spage: article no. e0000542
dc.identifier.epage: article no. e0000542
dc.identifier.eissn: 2767-3170
dc.identifier.isi: WOS:001416929300001
