Article: Variable importance analysis with interpretable machine learning for fair risk prediction
| Title | Variable importance analysis with interpretable machine learning for fair risk prediction |
|---|---|
| Authors | Ning, Yilin; Li, Siqi; Ng, Yih Yng; Chia, Michael Yih Chong; Gan, Han Nee; Tiah, Ling; Mao, Desmond Renhao; Ng, Wei Ming; Leong, Benjamin Sieu Hon; Doctor, Nausheen; Ong, Marcus Eng Hock; Liu, Nan |
| Issue Date | 2024 |
| Citation | PLOS Digital Health, 2024, v. 3, n. 7, article no. e0000542 |
| Abstract | Machine learning (ML) methods are increasingly used to assess variable importance, but such black-box models lack stability when sample sizes are limited and do not formally indicate non-important factors. The Shapley variable importance cloud (ShapleyVIC) addresses these limitations by assessing variable importance from an ensemble of regression models, which enhances robustness while maintaining interpretability, and by estimating the uncertainty of overall importance to formally test its significance. In a clinical study, ShapleyVIC reasonably identified important variables where random forest and XGBoost failed to, and largely reproduced these findings in smaller subsamples (n = 2500 and 500), where the statistical power of logistic regression was attenuated. Moreover, ShapleyVIC estimated a non-significant importance for race, justifying its exclusion from the final prediction model, in contrast to the race-dependent model produced by conventional stepwise model building. Hence, ShapleyVIC offers a robust and interpretable approach to variable importance assessment, with the potential to contribute to fairer clinical risk prediction. |
| Persistent Identifier | http://hdl.handle.net/10722/351370 |
| ISI Accession Number ID | WOS:001416929300001 |
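For readers unfamiliar with the method, the following is a minimal, self-contained sketch of the ShapleyVIC idea as summarized in the abstract above, not the authors' implementation or the ShapleyVIC package API. The synthetic data, the coefficient-perturbation scheme, the 5% loss tolerance, and the use of permutation importance in place of Shapley-based values are all simplifying assumptions of this sketch.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

rng = np.random.default_rng(0)

# Synthetic stand-in for the clinical data used in the paper.
X, y = make_classification(n_samples=2500, n_features=6, n_informative=4,
                           random_state=0)

# Optimal logistic regression and its loss: the reference point for the
# ensemble of "nearly optimal" models.
base = LogisticRegression(max_iter=1000).fit(X, y)
b0 = base.intercept_[0]  # intercept is held fixed in this sketch

def loss_of(coef, Xmat):
    """Log loss of a logistic model with the given coefficient vector."""
    z = np.clip(Xmat @ coef + b0, -500, 500)
    return log_loss(y, 1.0 / (1.0 + np.exp(-z)))

base_loss = loss_of(base.coef_.ravel(), X)

# 1. Ensemble of near-optimal models: perturb the fitted coefficients and
#    keep draws whose loss stays within 5% of the optimum (perturbation
#    scale and tolerance are arbitrary choices for this illustration).
draws = base.coef_.ravel() + rng.normal(scale=0.1, size=(2000, X.shape[1]))
ensemble = [c for c in draws if loss_of(c, X) <= 1.05 * base_loss]

# 2. Per-model variable importance. ShapleyVIC uses Shapley-based values;
#    this sketch substitutes a simpler permutation importance (increase in
#    log loss when one feature column is shuffled).
def importance(coef):
    ref = loss_of(coef, X)
    out = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        out[j] = loss_of(coef, Xp) - ref
    return out

imp = np.array([importance(c) for c in ensemble])

# 3. Overall importance with uncertainty across the ensemble: a variable
#    whose 95% interval covers zero is flagged as non-significant.
mean, sd = imp.mean(axis=0), imp.std(axis=0)
for j in range(X.shape[1]):
    lo, hi = mean[j] - 1.96 * sd[j], mean[j] + 1.96 * sd[j]
    print(f"x{j}: {mean[j]:+.4f} [{lo:+.4f}, {hi:+.4f}]"
          + ("  non-significant" if lo <= 0 else ""))
```

The significance flag in step 3 mirrors the rationale described in the abstract for excluding race: a variable whose importance interval covers zero across the ensemble of good models carries no formal evidence of importance.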
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Ning, Yilin | - |
| dc.contributor.author | Li, Siqi | - |
| dc.contributor.author | Ng, Yih Yng | - |
| dc.contributor.author | Chia, Michael Yih Chong | - |
| dc.contributor.author | Gan, Han Nee | - |
| dc.contributor.author | Tiah, Ling | - |
| dc.contributor.author | Mao, Desmond Renhao | - |
| dc.contributor.author | Ng, Wei Ming | - |
| dc.contributor.author | Leong, Benjamin Sieu Hon | - |
| dc.contributor.author | Doctor, Nausheen | - |
| dc.contributor.author | Ong, Marcus Eng Hock | - |
| dc.contributor.author | Liu, Nan | - |
| dc.date.accessioned | 2024-11-20T03:55:53Z | - |
| dc.date.available | 2024-11-20T03:55:53Z | - |
| dc.date.issued | 2024 | - |
| dc.identifier.citation | PLOS Digital Health, 2024, v. 3, n. 7, article no. e0000542 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/351370 | - |
| dc.description.abstract | Machine learning (ML) methods are increasingly used to assess variable importance, but such black-box models lack stability when sample sizes are limited and do not formally indicate non-important factors. The Shapley variable importance cloud (ShapleyVIC) addresses these limitations by assessing variable importance from an ensemble of regression models, which enhances robustness while maintaining interpretability, and by estimating the uncertainty of overall importance to formally test its significance. In a clinical study, ShapleyVIC reasonably identified important variables where random forest and XGBoost failed to, and largely reproduced these findings in smaller subsamples (n = 2500 and 500), where the statistical power of logistic regression was attenuated. Moreover, ShapleyVIC estimated a non-significant importance for race, justifying its exclusion from the final prediction model, in contrast to the race-dependent model produced by conventional stepwise model building. Hence, ShapleyVIC offers a robust and interpretable approach to variable importance assessment, with the potential to contribute to fairer clinical risk prediction. | - |
| dc.language | eng | - |
| dc.relation.ispartof | PLOS Digital Health | - |
| dc.title | Variable importance analysis with interpretable machine learning for fair risk prediction | - |
| dc.type | Article | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1371/journal.pdig.0000542 | - |
| dc.identifier.scopus | eid_2-s2.0-85201515446 | - |
| dc.identifier.volume | 3 | - |
| dc.identifier.issue | 7 | - |
| dc.identifier.spage | article no. e0000542 | - |
| dc.identifier.epage | article no. e0000542 | - |
| dc.identifier.eissn | 2767-3170 | - |
| dc.identifier.isi | WOS:001416929300001 | - |
