Conference Paper: Robust Educational Dialogue Act Classifiers with Low-Resource and Imbalanced Datasets

Title: Robust Educational Dialogue Act Classifiers with Low-Resource and Imbalanced Datasets
Authors: Lin, Jionghao; Tan, Wei; Nguyen, Ngoc Dang; Lang, David; Du, Lan; Buntine, Wray; Beare, Richard; Chen, Guanliang; Gašević, Dragan
Keywords: Educational Dialogue Act Classification; Imbalanced Data; Large Language Models; Low-Resource Data; Model Robustness
Issue Date: 2023
Citation: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2023, v. 13916 LNAI, p. 114-125
Abstract: Dialogue acts (DAs) represent the conversational actions of tutors or students during tutoring dialogues. Automating the identification of DAs in tutoring dialogues is important for the design of dialogue-based intelligent tutoring systems. Many prior studies employ machine learning models to classify DAs in tutoring dialogues and invest much effort in optimizing classification accuracy with limited amounts of training data (i.e., the low-resource data scenario). Beyond classification accuracy, however, the robustness of the classifier also matters: it reflects the classifier's ability to learn patterns from different class distributions. We note that many prior studies on classifying educational DAs use cross-entropy (CE) loss to optimize DA classifiers on low-resource data with imbalanced DA distributions. The DA classifiers in these studies tend to prioritize accuracy on the majority class at the expense of the minority class, and thus might not be robust to data with imbalanced ratios of DA classes. To improve the robustness of classifiers on imbalanced class distributions, we propose optimizing the DA classifier by maximizing the area under the ROC curve (AUC), i.e., AUC maximization. Through extensive experiments, our study provides evidence that (i) maximizing AUC during training yields significant performance improvements over the CE approach under low-resource data, and (ii) AUC maximization approaches improve the robustness of the DA classifier under different class imbalance ratios.
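
The abstract's central idea, replacing cross-entropy with an AUC-maximizing training objective on imbalanced dialogue-act data, can be illustrated with a pairwise surrogate loss. The sketch below is illustrative only and is not the authors' implementation: the function name pairwise_auc_loss, the margin value, and the toy logits/labels are hypothetical, and it assumes a binary setting (minority DA class vs. the rest) with PyTorch.

# Minimal sketch (assumptions noted above), not the paper's actual code.
import torch

def pairwise_auc_loss(scores, labels, margin=1.0):
    # Squared-hinge surrogate for 1 - AUC: penalise every positive/negative
    # pair whose score gap falls below the margin.
    pos = scores[labels == 1]   # minority-class (positive) scores
    neg = scores[labels == 0]   # majority-class (negative) scores
    if pos.numel() == 0 or neg.numel() == 0:
        return scores.sum() * 0.0          # a batch may miss one class entirely
    diff = pos.unsqueeze(1) - neg.unsqueeze(0)   # all positive-negative score gaps
    return torch.clamp(margin - diff, min=0).pow(2).mean()

# Hypothetical usage: swap torch.nn.BCEWithLogitsLoss() for this surrogate
# when the dialogue-act distribution is heavily imbalanced.
scores = torch.randn(8, requires_grad=True)              # classifier logits
labels = torch.tensor([1., 0., 0., 0., 0., 0., 0., 1.])  # 2 minority, 6 majority
loss = pairwise_auc_loss(scores, labels)
loss.backward()

The pairwise term rewards ranking every minority-class utterance above every majority-class one, which is what AUC measures, whereas CE weights each example independently and is therefore dominated by the majority class.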
Persistent Identifier: http://hdl.handle.net/10722/354283
ISSN: 0302-9743
2023 SCImago Journal Rankings: 0.606
ISI Accession Number ID: WOS:001321530100010

 

DC Field: Value
dc.contributor.author: Lin, Jionghao
dc.contributor.author: Tan, Wei
dc.contributor.author: Nguyen, Ngoc Dang
dc.contributor.author: Lang, David
dc.contributor.author: Du, Lan
dc.contributor.author: Buntine, Wray
dc.contributor.author: Beare, Richard
dc.contributor.author: Chen, Guanliang
dc.contributor.author: Gašević, Dragan
dc.date.accessioned: 2025-02-07T08:47:39Z
dc.date.available: 2025-02-07T08:47:39Z
dc.date.issued: 2023
dc.identifier.citation: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2023, v. 13916 LNAI, p. 114-125
dc.identifier.issn: 0302-9743
dc.identifier.uri: http://hdl.handle.net/10722/354283
dc.description.abstract: Dialogue acts (DAs) represent the conversational actions of tutors or students during tutoring dialogues. Automating the identification of DAs in tutoring dialogues is important for the design of dialogue-based intelligent tutoring systems. Many prior studies employ machine learning models to classify DAs in tutoring dialogues and invest much effort in optimizing classification accuracy with limited amounts of training data (i.e., the low-resource data scenario). Beyond classification accuracy, however, the robustness of the classifier also matters: it reflects the classifier's ability to learn patterns from different class distributions. We note that many prior studies on classifying educational DAs use cross-entropy (CE) loss to optimize DA classifiers on low-resource data with imbalanced DA distributions. The DA classifiers in these studies tend to prioritize accuracy on the majority class at the expense of the minority class, and thus might not be robust to data with imbalanced ratios of DA classes. To improve the robustness of classifiers on imbalanced class distributions, we propose optimizing the DA classifier by maximizing the area under the ROC curve (AUC), i.e., AUC maximization. Through extensive experiments, our study provides evidence that (i) maximizing AUC during training yields significant performance improvements over the CE approach under low-resource data, and (ii) AUC maximization approaches improve the robustness of the DA classifier under different class imbalance ratios.
dc.language: eng
dc.relation.ispartof: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
dc.subject: Educational Dialogue Act Classification
dc.subject: Imbalanced Data
dc.subject: Large Language Models
dc.subject: Low-Resource Data
dc.subject: Model Robustness
dc.title: Robust Educational Dialogue Act Classifiers with Low-Resource and Imbalanced Datasets
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1007/978-3-031-36272-9_10
dc.identifier.scopus: eid_2-s2.0-85164911268
dc.identifier.volume: 13916 LNAI
dc.identifier.spage: 114
dc.identifier.epage: 125
dc.identifier.eissn: 1611-3349
dc.identifier.isi: WOS:001321530100010
