File Download
There are no files associated with this item.
Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1007/978-1-4614-4457-2_8
- Scopus: eid_2-s2.0-84940693788
Citations:
- Scopus: 0
Appears in Collections:
Book Chapter: A flexible and effective linearization method for subspace learning
Title | A flexible and effective linearization method for subspace learning |
---|---|
Authors | Nie, Feiping; Xu, Dong; Tsang, Ivor W.; Zhang, Changshui |
Issue Date | 2013 |
Publisher | Springer |
Citation | A Flexible and Effective Linearization Method for Subspace Learning. In Fu, Y & Ma, Y (Eds.), Graph Embedding for Pattern Analysis, p. 177-203. New York, NY: Springer, 2013 |
Abstract | In the past decades, a large number of subspace learning or dimension reduction methods [2,16,20,32,34,37,44] have been proposed. Principal component analysis (PCA) [32] pursues the directions of maximum variance for optimal reconstruction. Linear discriminant analysis (LDA) [2], as a supervised algorithm, aims to maximize the inter-class scatter and at the same time minimize the intra-class scatter. Due to the utilization of label information, LDA is experimentally reported to outperform PCA for face recognition when sufficient labeled face images are provided [2]. |
Persistent Identifier | http://hdl.handle.net/10722/321642 |
ISBN | 9781461444565 |
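The two baselines named in the abstract can be sketched briefly: PCA extracts the top eigenvectors of the data covariance (directions of maximum variance), while LDA solves a generalized eigenproblem that trades inter-class scatter against intra-class scatter. The NumPy sketch below illustrates those two textbook constructions only; it is not the chapter's linearization method, and all function and variable names are assumptions for illustration.

```python
import numpy as np

def pca_directions(X, k):
    """Top-k maximum-variance directions (principal components)."""
    Xc = X - X.mean(axis=0)               # center the data
    cov = Xc.T @ Xc / (len(X) - 1)        # sample covariance matrix
    evals, evecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    return evecs[:, ::-1][:, :k]          # keep the top-k eigenvectors

def lda_directions(X, y, k):
    """Directions maximizing inter-class over intra-class scatter."""
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))                 # intra-class (within) scatter
    Sb = np.zeros((d, d))                 # inter-class (between) scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean)[:, None]
        Sb += len(Xc) * diff @ diff.T
    # Generalized eigenproblem Sb w = lambda * Sw w, via pinv(Sw) @ Sb
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(evals.real)[::-1]
    return evecs[:, order[:k]].real

# Toy two-class data in 3 dimensions
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(3, 1, (50, 3))])
y = np.array([0] * 50 + [1] * 50)
W_pca = pca_directions(X, 2)        # shape (3, 2)
W_lda = lda_directions(X, y, 1)     # shape (3, 1)
```

As the abstract notes, LDA uses the labels `y` (via the scatter matrices), which is why it can outperform unsupervised PCA when enough labeled samples are available.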
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Nie, Feiping | - |
dc.contributor.author | Xu, Dong | - |
dc.contributor.author | Tsang, Ivor W. | - |
dc.contributor.author | Zhang, Changshui | - |
dc.date.accessioned | 2022-11-03T02:20:26Z | - |
dc.date.available | 2022-11-03T02:20:26Z | - |
dc.date.issued | 2013 | - |
dc.identifier.citation | A Flexible and Effective Linearization Method for Subspace Learning. In Fu, Y & Ma, Y (Eds.), Graph Embedding for Pattern Analysis, p. 177-203. New York, NY: Springer, 2013 | - |
dc.identifier.isbn | 9781461444565 | - |
dc.identifier.uri | http://hdl.handle.net/10722/321642 | - |
dc.description.abstract | In the past decades, a large number of subspace learning or dimension reduction methods [2,16,20,32,34,37,44] have been proposed. Principal component analysis (PCA) [32] pursues the directions of maximum variance for optimal reconstruction. Linear discriminant analysis (LDA) [2], as a supervised algorithm, aims to maximize the inter-class scatter and at the same time minimize the intra-class scatter. Due to the utilization of label information, LDA is experimentally reported to outperform PCA for face recognition when sufficient labeled face images are provided [2]. | - |
dc.language | eng | - |
dc.publisher | Springer | - |
dc.relation.ispartof | Graph Embedding for Pattern Analysis | - |
dc.title | A flexible and effective linearization method for subspace learning | - |
dc.type | Book_Chapter | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1007/978-1-4614-4457-2_8 | - |
dc.identifier.scopus | eid_2-s2.0-84940693788 | - |
dc.identifier.spage | 177 | - |
dc.identifier.epage | 203 | - |
dc.publisher.place | New York | - |