Article: Incorporating deep features in the analysis of tissue microarray images

Title: Incorporating deep features in the analysis of tissue microarray images
Authors: Yan, Donghui; Randolph, Timothy; Zou, Jian; Gong, Peng
Keywords: Recursive space partitioning; Hierarchical clustering; Tissue microarray images; Deep representation learning; Automatic scoring
Issue Date: 2019
Citation: Statistics and its Interface, 2019, v. 12, n. 2, p. 283-293
Abstract: © 2019, International Press of Boston, Inc. Tissue microarray (TMA) images have been used increasingly often in cancer studies and the validation of biomarkers. TACOMA, a cutting-edge automatic scoring algorithm for TMA images, is comparable to pathologists in terms of accuracy and repeatability. Here we consider how this algorithm may be further improved. Inspired by the recent success of deep learning, we propose to incorporate representations learnable through computation. We explore representations of a group nature through unsupervised learning, e.g., hierarchical clustering and recursive space partitioning. Information carried by clustering or spatial partitioning may be more concrete than the labels when the data are heterogeneous, or could help when the labels are noisy. The use of such information could be viewed as regularization in model fitting. It is motivated by major challenges in TMA image scoring, namely heterogeneity and label noise, and by the cluster assumption in semi-supervised learning. Using this information on TMA images of breast cancer, we have reduced the error rate of TACOMA by about 6%. Further simulations on synthetic data provide insights on when such representations would likely help. Although we focus on TMAs, learnable representations of this type are expected to be applicable in other settings.
Persistent Identifier: http://hdl.handle.net/10722/296874
ISSN: 1938-7989
2023 Impact Factor: 0.3
2023 SCImago Journal Rankings: 0.273
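
The abstract describes augmenting a supervised TMA scorer with group-level representations obtained by unsupervised learning, such as hierarchical clustering or recursive space partitioning. The Python sketch below illustrates that general idea under stated assumptions: a placeholder feature matrix and labels, SciPy hierarchical clustering to derive cluster memberships, and a generic random-forest classifier. It is a minimal illustration of the concept, not the authors' TACOMA pipeline.

```python
# Minimal sketch (assumption): augment image features with cluster-membership
# features obtained from hierarchical clustering, then fit a standard
# classifier. This illustrates the idea of using unsupervised "group"
# representations as extra inputs; the data and classifier are placeholders.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))       # placeholder image features
y = rng.integers(0, 4, size=200)     # placeholder TMA scores (e.g., 0-3)

# Hierarchical clustering on the features; cut the dendrogram at a fixed
# number of clusters and treat the assignments as a learned group feature.
Z = linkage(X, method="ward")
cluster_ids = fcluster(Z, t=10, criterion="maxclust")

# Append one-hot encoded cluster memberships to the original features.
cluster_onehot = np.eye(int(cluster_ids.max()))[cluster_ids - 1]
X_aug = np.hstack([X, cluster_onehot])

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_aug, y)
```

In this framing, the cluster-derived columns act as a form of regularization: they encode structure shared by groups of similar images, which can stabilize the fit when labels are noisy or the data are heterogeneous.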

 

DC Field | Value | Language
dc.contributor.author | Yan, Donghui | -
dc.contributor.author | Randolph, Timothy | -
dc.contributor.author | Zou, Jian | -
dc.contributor.author | Gong, Peng | -
dc.date.accessioned | 2021-02-25T15:16:52Z | -
dc.date.available | 2021-02-25T15:16:52Z | -
dc.date.issued | 2019 | -
dc.identifier.citation | Statistics and its Interface, 2019, v. 12, n. 2, p. 283-293 | -
dc.identifier.issn | 1938-7989 | -
dc.identifier.uri | http://hdl.handle.net/10722/296874 | -
dc.description.abstract | © 2019, International Press of Boston, Inc. Tissue microarray (TMA) images have been used increasingly often in cancer studies and the validation of biomarkers. TACOMA, a cutting-edge automatic scoring algorithm for TMA images, is comparable to pathologists in terms of accuracy and repeatability. Here we consider how this algorithm may be further improved. Inspired by the recent success of deep learning, we propose to incorporate representations learnable through computation. We explore representations of a group nature through unsupervised learning, e.g., hierarchical clustering and recursive space partitioning. Information carried by clustering or spatial partitioning may be more concrete than the labels when the data are heterogeneous, or could help when the labels are noisy. The use of such information could be viewed as regularization in model fitting. It is motivated by major challenges in TMA image scoring, namely heterogeneity and label noise, and by the cluster assumption in semi-supervised learning. Using this information on TMA images of breast cancer, we have reduced the error rate of TACOMA by about 6%. Further simulations on synthetic data provide insights on when such representations would likely help. Although we focus on TMAs, learnable representations of this type are expected to be applicable in other settings. | -
dc.language | eng | -
dc.relation.ispartof | Statistics and its Interface | -
dc.subject | Recursive space partitioning | -
dc.subject | Hierarchical clustering | -
dc.subject | Tissue microarray images | -
dc.subject | Deep representation learning | -
dc.subject | Automatic scoring | -
dc.title | Incorporating deep features in the analysis of tissue microarray images | -
dc.type | Article | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.4310/SII.2019.v12.n2.a9 | -
dc.identifier.scopus | eid_2-s2.0-85063733110 | -
dc.identifier.volume | 12 | -
dc.identifier.issue | 2 | -
dc.identifier.spage | 283 | -
dc.identifier.epage | 293 | -
dc.identifier.eissn | 1938-7997 | -
dc.identifier.issnl | 1938-7989 | -
