File Download
Links for full text (may require subscription):
- Publisher website (DOI): 10.1016/j.jai.2025.03.004
- Scopus: eid_2-s2.0-105002757287
Citations:
- Scopus: 0
Article: DPCIPI: A pre-trained deep learning model for predicting cross-immunity between drifted strains of Influenza A/H3N2
| Title | DPCIPI: A pre-trained deep learning model for predicting cross-immunity between drifted strains of Influenza A/H3N2 |
|---|---|
| Authors | Du, Yiming; Li, Zhuotian; He, Qian; Tulu, Thomas Wetere; Chan, Kei Hang Katie; Wang, Lin; Pei, Sen; Du, Zhanwei; Wang, Zhen; Xu, Xiao Ke; Liu, Xiao Fan |
| Keywords | Cross-immunity prediction; Deep learning; Hemagglutination inhibition; Influenza strains; Pre-trained model |
| Issue Date | 30-Jun-2025 |
| Publisher | Elsevier |
| Citation | Journal of Automation and Intelligence, 2025, v. 4, n. 2, p. 115-124 |
| Abstract | Predicting cross-immunity between viral strains is vital for public health surveillance and vaccine development. Traditional neural network methods, such as BiLSTM, could be ineffective due to the lack of lab data for model training and the overshadowing of crucial features within sequence concatenation. The current work proposes a less data-consuming model incorporating a pre-trained gene sequence model and a mutual information inference operator. Our methodology utilizes gene alignment and deduplication algorithms to preprocess gene sequences, enhancing the model's capacity to discern and focus on distinctions among input gene pairs. The model, i.e., the DNA Pretrained Cross-Immunity Protection Inference model (DPCIPI), outperforms state-of-the-art (SOTA) models in predicting hemagglutination inhibition titer from influenza viral gene sequences alone. The improvement in binary cross-immunity prediction is 1.58% in F1, 2.34% in precision, 1.57% in recall, and 1.57% in accuracy. For multilevel cross-immunity prediction, the improvement is 2.12% in F1, 3.50% in precision, 2.19% in recall, and 2.19% in accuracy. Our study showcases the potential of pre-trained gene models to improve predictions of antigenic variation and cross-immunity. With expanding gene data and advancements in pre-trained models, this approach promises significant impacts on vaccine development and public health. |
| Persistent Identifier | http://hdl.handle.net/10722/364151 |
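The abstract outlines a pipeline: align and deduplicate gene sequences, embed each strain with a pre-trained DNA model, and infer cross-immunity from the pair of embeddings so that shared positions cancel and mutations stand out. The sketch below illustrates that general idea only; the k-mer tokenizer, the hash-based encoder (a deterministic stand-in for the actual pre-trained model), and the difference-based pairwise features are illustrative assumptions, not DPCIPI's published implementation.

```python
# Illustrative sketch of pairwise cross-immunity feature extraction.
# All names here are hypothetical; the encoder is a stand-in for a
# pre-trained DNA sequence model such as the one DPCIPI builds on.
import zlib
import numpy as np

def kmer_tokenize(seq: str, k: int = 3) -> list:
    """Split an aligned gene sequence into overlapping k-mers."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def encode(tokens: list, dim: int = 16) -> np.ndarray:
    """Stand-in encoder: hash each k-mer to a fixed random vector and
    mean-pool. A real pre-trained model would produce learned,
    contextual embeddings instead."""
    vecs = [np.random.default_rng(zlib.crc32(t.encode())).normal(size=dim)
            for t in tokens]
    return np.mean(vecs, axis=0)

def pairwise_features(seq_a: str, seq_b: str) -> np.ndarray:
    """Difference-based features for two aligned strains: identical
    regions cancel in the |a - b| term, so a downstream classifier
    can focus on the drifted positions."""
    ea, eb = encode(kmer_tokenize(seq_a)), encode(kmer_tokenize(seq_b))
    return np.concatenate([np.abs(ea - eb), ea * eb])

# Toy aligned fragments (not real H3N2 HA sequences).
a = "ATGAAGACTATCATTGCTTTGAGC"
b = "ATGAAGACTATCATTGCTTTGAGC"   # identical to a
c = "ATGAAGACCATTATTGCTCTGAGC"   # drifted variant

f_same = pairwise_features(a, b)  # difference half is exactly zero
f_diff = pairwise_features(a, c)  # difference half is non-zero
```

The point of the design is that the inference operator sees a comparison of embeddings rather than a raw concatenation of two long sequences, which is how the paper's deduplication intuition avoids crucial mutation signals being overshadowed.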
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Du, Yiming | - |
| dc.contributor.author | Li, Zhuotian | - |
| dc.contributor.author | He, Qian | - |
| dc.contributor.author | Tulu, Thomas Wetere | - |
| dc.contributor.author | Chan, Kei Hang Katie | - |
| dc.contributor.author | Wang, Lin | - |
| dc.contributor.author | Pei, Sen | - |
| dc.contributor.author | Du, Zhanwei | - |
| dc.contributor.author | Wang, Zhen | - |
| dc.contributor.author | Xu, Xiao Ke | - |
| dc.contributor.author | Liu, Xiao Fan | - |
| dc.date.accessioned | 2025-10-23T00:35:17Z | - |
| dc.date.available | 2025-10-23T00:35:17Z | - |
| dc.date.issued | 2025-06-30 | - |
| dc.identifier.citation | Journal of Automation and Intelligence, 2025, v. 4, n. 2, p. 115-124 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/364151 | - |
| dc.description.abstract | <p>Predicting cross-immunity between viral strains is vital for public health surveillance and vaccine development. Traditional neural network methods, such as BiLSTM, could be ineffective due to the lack of lab data for model training and the overshadowing of crucial features within sequence concatenation. The current work proposes a less data-consuming model incorporating a pre-trained gene sequence model and a mutual information inference operator. Our methodology utilizes gene alignment and deduplication algorithms to preprocess gene sequences, enhancing the model's capacity to discern and focus on distinctions among input gene pairs. The model, i.e., the DNA Pretrained Cross-Immunity Protection Inference model (DPCIPI), outperforms state-of-the-art (SOTA) models in predicting hemagglutination inhibition titer from influenza viral gene sequences alone. The improvement in binary cross-immunity prediction is 1.58% in F1, 2.34% in precision, 1.57% in recall, and 1.57% in accuracy. For multilevel cross-immunity prediction, the improvement is 2.12% in F1, 3.50% in precision, 2.19% in recall, and 2.19% in accuracy. Our study showcases the potential of pre-trained gene models to improve predictions of antigenic variation and cross-immunity. With expanding gene data and advancements in pre-trained models, this approach promises significant impacts on vaccine development and public health.</p> | - |
| dc.language | eng | - |
| dc.publisher | Elsevier | - |
| dc.relation.ispartof | Journal of Automation and Intelligence | - |
| dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
| dc.subject | Cross-immunity prediction | - |
| dc.subject | Deep learning | - |
| dc.subject | Hemagglutination inhibition | - |
| dc.subject | Influenza strains | - |
| dc.subject | Pre-trained model | - |
| dc.title | DPCIPI: A pre-trained deep learning model for predicting cross-immunity between drifted strains of Influenza A/H3N2 | - |
| dc.type | Article | - |
| dc.description.nature | published_or_final_version | - |
| dc.identifier.doi | 10.1016/j.jai.2025.03.004 | - |
| dc.identifier.scopus | eid_2-s2.0-105002757287 | - |
| dc.identifier.volume | 4 | - |
| dc.identifier.issue | 2 | - |
| dc.identifier.spage | 115 | - |
| dc.identifier.epage | 124 | - |
| dc.identifier.eissn | 2949-8554 | - |
