Links for fulltext (may require subscription):
- Publisher (DOI): 10.1631/FITEE.2200297
- Scopus: eid_2-s2.0-85135851367
- Web of Science: WOS:000840009700002
Article: On the principles of Parsimony and Self-consistency for the emergence of intelligence
Title | On the principles of Parsimony and Self-consistency for the emergence of intelligence |
---|---|
Authors | Ma, Yi; Tsao, Doris; Shum, Heung Yeung |
Keywords | Closed-loop transcription; Deep networks; Intelligence; Parsimony; Rate reduction; Self-consistency; TP18 |
Issue Date | 2022 |
Citation | Frontiers of Information Technology and Electronic Engineering, 2022, v. 23, n. 9, p. 1298-1323 |
Abstract | Ten years into the revival of deep networks and artificial intelligence, we propose a theoretical framework that sheds light on understanding deep networks within a bigger picture of intelligence in general. We introduce two fundamental principles, Parsimony and Self-consistency, which address two fundamental questions regarding intelligence: what to learn and how to learn, respectively. We believe the two principles serve as the cornerstone for the emergence of intelligence, artificial or natural. While they have rich classical roots, we argue that they can be stated anew in entirely measurable and computable ways. More specifically, the two principles lead to an effective and efficient computational framework, compressive closed-loop transcription, which unifies and explains the evolution of modern deep networks and most practices of artificial intelligence. While we use mainly visual data modeling as an example, we believe the two principles will unify understanding of broad families of autonomous intelligent systems and provide a framework for understanding the brain. |
Persistent Identifier | http://hdl.handle.net/10722/327786 |
ISSN | 2095-9184 (2023 Impact Factor: 2.7; 2023 SCImago Journal Rank: 0.700) |
ISI Accession Number ID | WOS:000840009700002 |
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Ma, Yi | - |
dc.contributor.author | Tsao, Doris | - |
dc.contributor.author | Shum, Heung Yeung | - |
dc.date.accessioned | 2023-05-08T02:26:48Z | - |
dc.date.available | 2023-05-08T02:26:48Z | - |
dc.date.issued | 2022 | - |
dc.identifier.citation | Frontiers of Information Technology and Electronic Engineering, 2022, v. 23, n. 9, p. 1298-1323 | - |
dc.identifier.issn | 2095-9184 | - |
dc.identifier.uri | http://hdl.handle.net/10722/327786 | - |
dc.description.abstract | Ten years into the revival of deep networks and artificial intelligence, we propose a theoretical framework that sheds light on understanding deep networks within a bigger picture of intelligence in general. We introduce two fundamental principles, Parsimony and Self-consistency, which address two fundamental questions regarding intelligence: what to learn and how to learn, respectively. We believe the two principles serve as the cornerstone for the emergence of intelligence, artificial or natural. While they have rich classical roots, we argue that they can be stated anew in entirely measurable and computable ways. More specifically, the two principles lead to an effective and efficient computational framework, compressive closed-loop transcription, which unifies and explains the evolution of modern deep networks and most practices of artificial intelligence. While we use mainly visual data modeling as an example, we believe the two principles will unify understanding of broad families of autonomous intelligent systems and provide a framework for understanding the brain. | - |
dc.language | eng | - |
dc.relation.ispartof | Frontiers of Information Technology and Electronic Engineering | - |
dc.subject | Closed-loop transcription | - |
dc.subject | Deep networks | - |
dc.subject | Intelligence | - |
dc.subject | Parsimony | - |
dc.subject | Rate reduction | - |
dc.subject | Self-consistency | - |
dc.subject | TP18 | - |
dc.title | On the principles of Parsimony and Self-consistency for the emergence of intelligence | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1631/FITEE.2200297 | - |
dc.identifier.scopus | eid_2-s2.0-85135851367 | - |
dc.identifier.volume | 23 | - |
dc.identifier.issue | 9 | - |
dc.identifier.spage | 1298 | - |
dc.identifier.epage | 1323 | - |
dc.identifier.eissn | 2095-9230 | - |
dc.identifier.isi | WOS:000840009700002 | - |
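The Dublin Core table above uses a repeated-field convention: multi-valued elements such as `dc.contributor.author` and `dc.subject` appear once per value rather than as a delimited list. As a minimal sketch (the pipe-delimited row strings are copied from the record; the `parse_dc` helper is illustrative and not part of any repository API), such rows can be folded into a dictionary of value lists:

```python
# Parse pipe-delimited Dublin Core rows into a dict, collecting repeated
# fields (dc.contributor.author, dc.subject, ...) into lists.
from collections import defaultdict

DC_ROWS = """\
dc.contributor.author | Ma, Yi | -
dc.contributor.author | Tsao, Doris | -
dc.contributor.author | Shum, Heung Yeung | -
dc.identifier.doi | 10.1631/FITEE.2200297 | -
dc.identifier.issn | 2095-9184 | -
dc.subject | Parsimony | -
dc.subject | Self-consistency | -
"""

def parse_dc(rows: str) -> dict:
    """Fold 'field | value | language' rows into {field: [values]}."""
    record = defaultdict(list)
    for line in rows.splitlines():
        parts = [p.strip() for p in line.split("|")]
        if len(parts) >= 2 and parts[0].startswith("dc."):
            record[parts[0]].append(parts[1])
    return dict(record)

record = parse_dc(DC_ROWS)
print(record["dc.contributor.author"])  # ['Ma, Yi', 'Tsao, Doris', 'Shum, Heung Yeung']
print(record["dc.identifier.doi"][0])   # 10.1631/FITEE.2200297
```

Grouping by field name this way preserves author order, which matters for citation formatting.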