Postgraduate thesis: Improving discrete AdaBoost for classification by randomization methods

Title: Improving discrete AdaBoost for classification by randomization methods
Authors: Dong, Fengjiao (董凤娇)
Issue Date: 2016
Publisher: The University of Hong Kong (Pokfulam, Hong Kong)
Citation: Dong, F. [董凤娇]. (2016). Improving discrete AdaBoost for classification by randomization methods. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR. Retrieved from http://dx.doi.org/10.5353/th_b5760964
Abstract: AdaBoost, a typical boosting method, performs well in a wide range of classification problems. Many researchers have applied different randomization techniques to AdaBoost to further improve its classification performance. However, these randomization methods seldom target the chance mechanism underlying the training data itself, especially at the response level. We propose a modified AdaBoost procedure that takes this chance mechanism into account. Three methods are investigated for estimating the conditional probabilities of class labels given feature covariates, based on which the class labels are randomized within the training dataset. The first method, which we term quantile calibration, uses a reweighting scheme to find a reliable interval containing the conditional class probability. The second method applies bootstrap aggregating to obtain an equal-weight ensemble vote for each class label. The third method exploits a well-known connection between the score function of AdaBoost and class probabilities under an additive logistic regression setup. Empirical results show that the new procedure successfully alleviates the overfitting problem and, in many cases, also improves the classification performance of AdaBoost.
Degree: Master of Philosophy
Subject: Boosting (Algorithms)
Dept/Program: Statistics and Actuarial Science
Persistent Identifier: http://hdl.handle.net/10722/226781
HKU Library Item ID: b5760964
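The third method in the abstract rests on a well-known result of Friedman, Hastie and Tibshirani (2000): discrete AdaBoost can be read as stagewise fitting of an additive logistic regression model, so the ensemble score F(x) estimates half the log-odds of the positive class and P(y = +1 | x) = 1 / (1 + exp(-2 F(x))). The sketch below illustrates that score-to-probability conversion together with a generic response-level label randomization; it is a minimal reading of the abstract, not the thesis's actual procedure, and the helper names (fit_discrete_adaboost, class_probability, randomize_labels) are assumptions for illustration only.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_discrete_adaboost(X, y, n_rounds=100):
    """Discrete AdaBoost with decision stumps; y is coded as -1/+1.
    Returns a list of (alpha_m, stump_m) pairs defining F(x) = sum_m alpha_m h_m(x)."""
    X, y = np.asarray(X), np.asarray(y)
    n = len(y)
    w = np.full(n, 1.0 / n)                      # observation weights
    ensemble = []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        h = stump.predict(X)                     # weak-learner predictions in {-1, +1}
        err = np.clip(np.sum(w * (h != y)) / np.sum(w), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)  # weight on the half-log-odds scale
        ensemble.append((alpha, stump))
        w *= np.exp(-alpha * y * h)              # upweight misclassified observations
        w /= w.sum()
    return ensemble

def adaboost_score(ensemble, X):
    """Ensemble score F(x) = sum_m alpha_m h_m(x)."""
    return sum(alpha * stump.predict(X) for alpha, stump in ensemble)

def class_probability(ensemble, X):
    """P(y = +1 | x) = 1 / (1 + exp(-2 F(x))) under the additive logistic
    regression view of AdaBoost (Friedman, Hastie & Tibshirani, 2000)."""
    return 1.0 / (1.0 + np.exp(-2.0 * adaboost_score(ensemble, X)))

def randomize_labels(p_plus, rng=None):
    """Illustrative (hypothetical) response-level randomization: redraw each
    training label from its estimated conditional class distribution."""
    rng = np.random.default_rng(0) if rng is None else rng
    return np.where(rng.random(len(p_plus)) < p_plus, 1, -1)

In this reading, labels redrawn by randomize_labels from the class_probability estimates would replace the original responses before refitting, which is roughly where the abstract's response-level randomization enters; the quantile-calibration and bagging variants described in the abstract would substitute different estimates of the same conditional probability.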


DC Field: Value
dc.contributor.author: Dong, Fengjiao
dc.contributor.author: 董凤娇
dc.date.accessioned: 2016-06-30T04:24:09Z
dc.date.available: 2016-06-30T04:24:09Z
dc.date.issued: 2016
dc.identifier.citation: Dong, F. [董凤娇]. (2016). Improving discrete AdaBoost for classification by randomization methods. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR. Retrieved from http://dx.doi.org/10.5353/th_b5760964
dc.identifier.uri: http://hdl.handle.net/10722/226781
dc.description.abstract: AdaBoost, a typical boosting method, performs well in a wide range of classification problems. Many researchers have applied different randomization techniques to AdaBoost to further improve its classification performance. However, these randomization methods seldom target the chance mechanism underlying the training data itself, especially at the response level. We propose a modified AdaBoost procedure that takes this chance mechanism into account. Three methods are investigated for estimating the conditional probabilities of class labels given feature covariates, based on which the class labels are randomized within the training dataset. The first method, which we term quantile calibration, uses a reweighting scheme to find a reliable interval containing the conditional class probability. The second method applies bootstrap aggregating to obtain an equal-weight ensemble vote for each class label. The third method exploits a well-known connection between the score function of AdaBoost and class probabilities under an additive logistic regression setup. Empirical results show that the new procedure successfully alleviates the overfitting problem and, in many cases, also improves the classification performance of AdaBoost.
dc.language: eng
dc.publisher: The University of Hong Kong (Pokfulam, Hong Kong)
dc.relation.ispartof: HKU Theses Online (HKUTO)
dc.rights: The author retains all proprietary rights (such as patent rights) and the right to use in future works.
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject.lcsh: Boosting (Algorithms)
dc.title: Improving discrete AdaBoost for classification by randomization methods
dc.type: PG_Thesis
dc.identifier.hkul: b5760964
dc.description.thesisname: Master of Philosophy
dc.description.thesislevel: Master
dc.description.thesisdiscipline: Statistics and Actuarial Science
dc.description.nature: published_or_final_version
dc.identifier.doi: 10.5353/th_b5760964
dc.identifier.mmsid: 991019898389703414
