Article: Subsampled model aggregation

Title: Subsampled model aggregation
Authors: Kao, B; Katriel, R
Keywords: Ensemble Learning; Machine Learning; Subsampling
Issue Date: 2005
Publisher: World Scientific Publishing Co Pte Ltd. The Journal's web site is located at http://www.worldscinet.com/ijait/ijait.shtml
Citation: International Journal on Artificial Intelligence Tools, 2005, v. 14, n. 3, p. 385-397
Abstract: There has been a recent push for a new framework of learning, due in part to the availability of storage and networking and to the abundance of very large datasets. This framework argues for parallelized learning algorithms that can operate on a distributed platform and can terminate early in the likely event that the volume of data is overwhelming. The methods described herein propose a subsampled model aggregation technique based on the bagging algorithm, which achieves a significant reduction in run time with no loss in modeling performance. These claims were validated with a variety of base-learning algorithms on large web and newswire datasets. © World Scientific Publishing Company.
Persistent Identifier: http://hdl.handle.net/10722/152339
ISSN: 0218-2130
  2023 Impact Factor: 1.0
  2023 SCImago Journal Rankings: 0.288
ISI Accession Number ID: WOS:000233469000002
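As a rough illustration of the technique described in the abstract above, the sketch below trains an ensemble of base learners, each on a small random subsample of the training set, and aggregates their predictions by majority vote, in the spirit of bagging. This is not the paper's code: the scikit-learn base learner, the ensemble size, and the 10% subsample fraction are illustrative assumptions.

```python
# Illustrative sketch only (not the paper's implementation): subsampled model
# aggregation in the spirit of bagging. Each base learner is trained on a small
# random subsample of the data rather than a full-size bootstrap replicate, and
# predictions are combined by majority vote. The base learner, ensemble size,
# and subsample fraction below are assumed values for demonstration.
from collections import Counter

import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier


def fit_subsampled_ensemble(X, y, base_learner=None, n_estimators=10,
                            subsample_fraction=0.1, random_state=0):
    """Fit n_estimators copies of base_learner, each on a small random subsample."""
    rng = np.random.default_rng(random_state)
    base_learner = base_learner if base_learner is not None else DecisionTreeClassifier()
    n = len(X)
    m = max(1, int(subsample_fraction * n))          # subsample size per model
    ensemble = []
    for _ in range(n_estimators):
        idx = rng.choice(n, size=m, replace=False)   # draw a subsample without replacement
        ensemble.append(clone(base_learner).fit(X[idx], y[idx]))
    return ensemble


def predict_by_vote(ensemble, X):
    """Aggregate the ensemble's predictions by majority vote."""
    votes = np.stack([model.predict(X) for model in ensemble])   # shape: (n_estimators, n_samples)
    return np.array([Counter(col).most_common(1)[0][0] for col in votes.T])
```

Because each model sees only a small fraction of the data, the ensemble members can be trained independently and in parallel, which is where the run-time reduction claimed in the abstract would come from.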

 

DC Field: Value (Language)
dc.contributor.author: Kao, B (en_US)
dc.contributor.author: Katriel, R (en_US)
dc.date.accessioned: 2012-06-26T06:37:18Z
dc.date.available: 2012-06-26T06:37:18Z
dc.date.issued: 2005 (en_US)
dc.identifier.citation: International Journal On Artificial Intelligence Tools, 2005, v. 14 n. 3, p. 385-397 (en_US)
dc.identifier.issn: 0218-2130 (en_US)
dc.identifier.uri: http://hdl.handle.net/10722/152339
dc.description.abstract: There has been a recent push for a new framework of learning, due in part to the availability of storage and networking and to the abundance of very large datasets. This framework argues for parallelized learning algorithms that can operate on a distributed platform and can terminate early in the likely event that the volume of data is overwhelming. The methods described herein propose a subsampled model aggregation technique based on the bagging algorithm, which achieves a significant reduction in run time with no loss in modeling performance. These claims were validated with a variety of base-learning algorithms on large web and newswire datasets. © World Scientific Publishing Company. (en_US)
dc.language: eng (en_US)
dc.publisher: World Scientific Publishing Co Pte Ltd. The Journal's web site is located at http://www.worldscinet.com/ijait/ijait.shtml (en_US)
dc.relation.ispartof: International Journal on Artificial Intelligence Tools (en_US)
dc.subject: Ensemble Learning (en_US)
dc.subject: Machine Learning (en_US)
dc.subject: Subsampling (en_US)
dc.title: Subsampled model aggregation (en_US)
dc.type: Article (en_US)
dc.identifier.email: Kao, B:kao@cs.hku.hk (en_US)
dc.identifier.authority: Kao, B=rp00123 (en_US)
dc.description.nature: link_to_subscribed_fulltext (en_US)
dc.identifier.doi: 10.1142/S0218213005002168 (en_US)
dc.identifier.scopus: eid_2-s2.0-33746225646 (en_US)
dc.relation.references: http://www.scopus.com/mlt/select.url?eid=2-s2.0-33746225646&selection=ref&src=s&origin=recordpage (en_US)
dc.identifier.volume: 14 (en_US)
dc.identifier.issue: 3 (en_US)
dc.identifier.spage: 385 (en_US)
dc.identifier.epage: 397 (en_US)
dc.identifier.isi: WOS:000233469000002
dc.publisher.place: Singapore (en_US)
dc.identifier.scopusauthorid: Kao, B=35221592600 (en_US)
dc.identifier.scopusauthorid: Katriel, R=14045433000 (en_US)
dc.identifier.issnl: 0218-2130
