Appears in Collections: Conference Paper
Title | A Note on the Equivalence of the Unidimensional Item Response Theory and Cognitive Diagnosis Models |
---|---|
Authors | de la Torre, J |
Issue Date | 2018 |
Publisher | Columbia University. |
Citation | Conference on Statistical Methods for Innovative Testing and Learning, Department of Statistics, Columbia University, New York, USA, 7-8 July 2018 |
Abstract | At present, most existing educational assessments are developed and analyzed using unidimensional item response theory (IRT) models. To obtain information that can be used for diagnostic purposes, the same assessments have also been retrofitted with cognitive diagnosis models (CDMs). However, it remains unclear to what extent two disparate psychometric frameworks can be used simultaneously to analyze the same assessment. To address this issue, we propose a framework for relating the two classes of psychometric models, as well as boundaries on when this can be done. Specifically, we impose certain conditions on the higher-order generalized deterministic inputs, noisy and gate (HO-GDINA) model, and reformulate its success probability as a function of the higher-order ability. It can be shown that, with appropriate constraints, the HO-GDINA model reduces to the four unidimensional IRT models when only a single attribute is required. When two or more attributes are required, the item response function of the HO-GDINA model can be well approximated by unidimensional IRT models. We investigate a number of factors (e.g., the slope and intercept of the higher-order structure, the guessing and slip item parameters, and sample size) to determine their impact on the quality of item parameter approximation, as well as on ability estimation. Preliminary results show that a wider range of intercept values leads to lower bias in the IRT parameter estimates and a slightly higher correlation between the true and estimated abilities. Additionally, more discriminating items and larger sample sizes lead to a higher correlation between the true and estimated abilities. |
Description | Invited Presentation |
Persistent Identifier | http://hdl.handle.net/10722/270294 |
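The single-attribute reduction claimed in the abstract can be sketched as follows for a DINA-type item under a higher-order latent trait. This is an illustrative derivation, not taken from the presentation itself; the symbols λ₀, λ₁ (higher-order intercept and slope), g_j (guessing), and s_j (slip) are assumed notation.

```latex
% Higher-order structure: probability of mastering attribute \alpha,
% given the higher-order ability \theta, via a logistic (2PL-type) link:
P(\alpha = 1 \mid \theta)
  = \frac{\exp(\lambda_0 + \lambda_1 \theta)}{1 + \exp(\lambda_0 + \lambda_1 \theta)}

% Item response function for item j with guessing g_j and slip s_j,
% when only the single attribute \alpha is required:
P(X_j = 1 \mid \theta)
  = g_j \, P(\alpha = 0 \mid \theta) + (1 - s_j) \, P(\alpha = 1 \mid \theta)
  = g_j + (1 - s_j - g_j) \, P(\alpha = 1 \mid \theta)
```

The final expression has the form of a four-parameter logistic IRT model, with lower asymptote g_j, upper asymptote 1 − s_j, slope λ₁, and location −λ₀/λ₁; constraining g_j, s_j, or λ₀ recovers the simpler logistic models, consistent with the abstract's claim that the HO-GDINA model reduces to unidimensional IRT models in the single-attribute case.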
DC Field | Value | Language |
---|---|---|
dc.contributor.author | de la Torre, J | - |
dc.date.accessioned | 2019-05-24T09:09:04Z | - |
dc.date.available | 2019-05-24T09:09:04Z | - |
dc.date.issued | 2018 | - |
dc.identifier.citation | Conference on Statistical Methods for Innovative Testing and Learning, Department of Statistics, Columbia University, New York, USA, 7-8 July 2018 | - |
dc.identifier.uri | http://hdl.handle.net/10722/270294 | - |
dc.description | Invited Presentation | - |
dc.description.abstract | At present, most existing educational assessments are developed and analyzed using unidimensional item response theory (IRT) models. To obtain information that can be used for diagnostic purposes, the same assessments have also been retrofitted with cognitive diagnosis models (CDMs). However, it remains unclear to what extent two disparate psychometric frameworks can be used simultaneously to analyze the same assessment. To address this issue, we propose a framework for relating the two classes of psychometric models, as well as boundaries on when this can be done. Specifically, we impose certain conditions on the higher-order generalized deterministic inputs, noisy and gate (HO-GDINA) model, and reformulate its success probability as a function of the higher-order ability. It can be shown that, with appropriate constraints, the HO-GDINA model reduces to the four unidimensional IRT models when only a single attribute is required. When two or more attributes are required, the item response function of the HO-GDINA model can be well approximated by unidimensional IRT models. We investigate a number of factors (e.g., the slope and intercept of the higher-order structure, the guessing and slip item parameters, and sample size) to determine their impact on the quality of item parameter approximation, as well as on ability estimation. Preliminary results show that a wider range of intercept values leads to lower bias in the IRT parameter estimates and a slightly higher correlation between the true and estimated abilities. Additionally, more discriminating items and larger sample sizes lead to a higher correlation between the true and estimated abilities. | - |
dc.language | eng | - |
dc.publisher | Columbia University. | - |
dc.relation.ispartof | Conference on Statistical Methods for Innovative Testing and Learning, Department of Statistics, Columbia University, New York | - |
dc.title | A Note on the Equivalence of the Unidimensional Item Response Theory and Cognitive Diagnosis Models | - |
dc.type | Conference_Paper | - |
dc.identifier.email | de la Torre, J: jdltorre@hku.hk | - |
dc.identifier.authority | de la Torre, J=rp02159 | - |
dc.identifier.hkuros | 288959 | - |
dc.publisher.place | New York, NY | - |