Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1142/S0218213005002168
- Scopus: eid_2-s2.0-33746225646
- WOS: WOS:000233469000002
Article: Subsampled model aggregation
Title | Subsampled model aggregation |
---|---|
Authors | |
Keywords | Ensemble Learning; Machine Learning; Subsampling |
Issue Date | 2005 |
Publisher | World Scientific Publishing Co Pte Ltd. The Journal's web site is located at http://www.worldscinet.com/ijait/ijait.shtml |
Citation | International Journal On Artificial Intelligence Tools, 2005, v. 14 n. 3, p. 385-397 |
Abstract | There has been a recent push for a new framework of learning, due in part to the availability of storage, networking and the abundance of very large datasets. This framework argues for parallelized learning algorithms that can operate on a distributed platform and have the ability to terminate early in the likely event that data size is too inundating. Methods described herein propose a subsampled model aggregation technique based on the bagging algorithm. It is capable of significant run-time reduction with no loss in modeling performance. These claims were validated with a variety of base-learning algorithms on large web and newswire datasets. © World Scientific Publishing Company. |
Persistent Identifier | http://hdl.handle.net/10722/152339 |
ISSN | 0218-2130 (2023 Impact Factor: 1.0; 2023 SCImago Journal Rankings: 0.288) |
ISI Accession Number ID | WOS:000233469000002 |
References |
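The abstract describes a bagging-style ensemble in which each base learner is trained on only a subsample of the data, so the models can be fit in parallel and training can stop early. A minimal sketch of that idea in Python, assuming without-replacement subsampling, majority-vote aggregation, and a hypothetical decision-stump base learner (the paper's actual learners and sampling scheme may differ):

```python
import random
from collections import Counter

def train_subsampled_ensemble(data, labels, n_models, frac, fit):
    """Fit n_models base learners, each on a random subsample
    (drawn without replacement) of frac * len(data) points."""
    models = []
    k = max(1, int(frac * len(data)))
    for _ in range(n_models):
        idx = random.sample(range(len(data)), k)
        models.append(fit([data[i] for i in idx], [labels[i] for i in idx]))
    return models

def predict(models, x):
    """Aggregate the ensemble by majority vote, as in bagging."""
    votes = Counter(m(x) for m in models)
    return votes.most_common(1)[0][0]

def fit_stump(xs, ys):
    """Toy base learner: threshold at the midpoint of the class means."""
    pos = [x for x, y in zip(xs, ys) if y == 1]
    neg = [x for x, y in zip(xs, ys) if y == 0]
    if not pos or not neg:  # degenerate subsample: predict the majority class
        majority = 1 if sum(ys) * 2 >= len(ys) else 0
        return lambda x: majority
    t = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
    return lambda x: 1 if x >= t else 0

random.seed(0)
data = [0.0, 0.1, 0.2, 0.3, 0.7, 0.8, 0.9, 1.0]
labels = [0, 0, 0, 0, 1, 1, 1, 1]
models = train_subsampled_ensemble(data, labels, n_models=25, frac=0.5,
                                   fit=fit_stump)
print(predict(models, 0.05), predict(models, 0.95))
```

Because each learner sees only a fraction of the data, total training work drops roughly in proportion to `frac`, which is the source of the run-time reduction the abstract claims.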
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kao, B | en_US |
dc.contributor.author | Katriel, R | en_US |
dc.date.accessioned | 2012-06-26T06:37:18Z | - |
dc.date.available | 2012-06-26T06:37:18Z | - |
dc.date.issued | 2005 | en_US |
dc.identifier.citation | International Journal On Artificial Intelligence Tools, 2005, v. 14 n. 3, p. 385-397 | en_US |
dc.identifier.issn | 0218-2130 | en_US |
dc.identifier.uri | http://hdl.handle.net/10722/152339 | - |
dc.description.abstract | There has been a recent push for a new framework of learning, due in part to the availability of storage, networking and the abundance of very large datasets. This framework argues for parallelized learning algorithms that can operate on a distributed platform and have the ability to terminate early in the likely event that data size is too inundating. Methods described herein propose a subsampled model aggregation technique based on the bagging algorithm. It is capable of significant run-time reduction with no loss in modeling performance. These claims were validated with a variety of base-learning algorithms on large web and newswire datasets. © World Scientific Publishing Company. | en_US |
dc.language | eng | en_US |
dc.publisher | World Scientific Publishing Co Pte Ltd. The Journal's web site is located at http://www.worldscinet.com/ijait/ijait.shtml | en_US |
dc.relation.ispartof | International Journal on Artificial Intelligence Tools | en_US |
dc.subject | Ensemble Learning | en_US |
dc.subject | Machine Learning | en_US |
dc.subject | Subsampling | en_US |
dc.title | Subsampled model aggregation | en_US |
dc.type | Article | en_US |
dc.identifier.email | Kao, B:kao@cs.hku.hk | en_US |
dc.identifier.authority | Kao, B=rp00123 | en_US |
dc.description.nature | link_to_subscribed_fulltext | en_US |
dc.identifier.doi | 10.1142/S0218213005002168 | en_US |
dc.identifier.scopus | eid_2-s2.0-33746225646 | en_US |
dc.relation.references | http://www.scopus.com/mlt/select.url?eid=2-s2.0-33746225646&selection=ref&src=s&origin=recordpage | en_US |
dc.identifier.volume | 14 | en_US |
dc.identifier.issue | 3 | en_US |
dc.identifier.spage | 385 | en_US |
dc.identifier.epage | 397 | en_US |
dc.identifier.isi | WOS:000233469000002 | - |
dc.publisher.place | Singapore | en_US |
dc.identifier.scopusauthorid | Kao, B=35221592600 | en_US |
dc.identifier.scopusauthorid | Katriel, R=14045433000 | en_US |
dc.identifier.issnl | 0218-2130 | - |