Tutorial

The full scientific program can be found here.

Pre-conference tutorial

Cognitive Diagnosis Modeling: A General Framework Approach and Its Implementation in R

Jimmy de la Torre, The University of Hong Kong
Miguel A. Sorrel, Universidad Autónoma de Madrid


Abstract
This tutorial aims to provide participants with the practical experience necessary to use cognitive diagnosis models (CDMs) in applied settings. It will also highlight the theoretical underpinnings needed for the proper use of CDMs.

In this tutorial, participants will be introduced to a proportional reasoning (PR) assessment that was developed from scratch using a CDM paradigm. Participants will get opportunities to work with PR assessment-based data. Moreover, they will learn how to use GDINA, an R package developed by Ma and de la Torre (in press), for a series of CDM analyses (e.g., model calibration, model fit evaluation at the item and test levels, Q-matrix validation, differential item functioning analysis). To ensure the proper use of CDMs, the theoretical bases for these analyses will be discussed.

The intended audience of the tutorial includes anyone interested in CDMs who has some familiarity with item response theory and the R programming language. No previous knowledge of CDM is required. By the end of the session, participants are expected to have a basic understanding of the theoretical underpinnings of CDM, as well as the ability to conduct various CDM analyses using the GDINA package. Participants are requested to bring their laptops for the hands-on exercises with the GDINA package.


Summary

Goals


The tutorial aims to (1) provide an overview of CDMs and some recent developments therein, (2) give participants a hands-on experience conducting various CDM analyses using PR assessment-based data, and (3) introduce participants to the GDINA R package as a tool for carrying out comprehensive CDM analyses.

Importance of the Topic


Unlike traditional item response models, CDMs aim to provide information that is finer-grained and more relevant to classroom instruction and learning. As a state-of-the-art methodology, CDM is not typically offered as a regular course in most measurement programs, and may be novel to many practitioners. Therefore, this tutorial will be useful to faculty and students specializing in educational measurement, as well as professionals working in government or testing organizations.
Furthermore, at present, very few assessments used to provide diagnostic information are developed using a CDM framework. An exception is the PR assessment developed by the lead instructor of this tutorial, Dr. de la Torre, based on a National Science Foundation grant. He will share his experience in developing the assessment, which could be useful for participants interested in developing their own diagnostic assessments.

Lastly, very few computer programs that can be used for CDM analyses are currently available, and many of them suffer from various limitations. In this tutorial, the GDINA package will be introduced. This package overcomes several drawbacks in existing software packages, and offers a set of functions for CDM analyses, such as calibration of various diagnostic models and validation of the Q-matrix. After this tutorial, participants are expected to be able to conduct various CDM analyses using the GDINA package.

Presenters


Dr. Jimmy de la Torre is a Professor in the Faculty of Education at The University of Hong Kong. His research interests include latent variable models for educational and psychological measurement, and how assessment can be used to improve classroom instruction and learning. His work in the area of CDM includes the development of various cognitive diagnosis models, implementation of estimation codes for cognitive diagnosis models, and development of a general framework for model estimation, test comparison, and Q-matrix validation. He is an ardent advocate of CDM and, to date, has conducted more than two dozen national and international CDM workshops. Jimmy was the recipient of the 2008 Presidential Early Career Award for Scientists and Engineers given by the White House, as well as the 2009 Jason Millman Promising Measurement Scholar Award and the 2017 Bradley Hanson Award for Contributions to Educational Measurement, both given by the National Council on Measurement in Education.

Dr. Miguel A. Sorrel is an Interim Associate Professor at the Universidad Autónoma de Madrid. His research focuses on CDM, item response theory, and computerized adaptive testing. He has published papers on CDM in Applied Psychological Measurement, Educational and Psychological Measurement, and Organizational Research Methods. He is also a contributor to the GDINA package and the main developer of the cdcatR package.


Schedule
The full-day tutorial consists of seven 50-minute sessions, two coffee breaks, and a lunch break. The schedule is given below, followed by a brief description of each session.
Time Topic
09:00-09:10 Welcome
09:10-10:00 (1) Introduction to CDM and Development of a Diagnostic Assessment
10:00-10:50 (2) The G-DINA Model Framework
10:50-11:20 Break
11:20-12:10 (3) Introduction to the GDINA R Package and Model Calibration
12:10-13:00 Lunch Break
13:00-13:50 (4) Model Fit Evaluation
13:50-14:40 (5) Model Comparison
14:40-15:10 Break
15:10-16:00 (6) Q-Matrix Validation
16:00-16:50 (7) Differential Item Functioning and Attribute Classification Accuracy
16:50-17:00 Wrap-Up

Session 1: This session will introduce the diagnostic modeling framework in the context of educational assessment. Various aspects (e.g., definition, specification) and methods (e.g., expert opinion, think-aloud protocols, empirical analysis) of Q-matrix development and validation will then be discussed, followed by an introduction to the PR test, an assessment explicitly developed using the CDM paradigm.

Session 2: This session will introduce the G-DINA model and several widely used CDMs it subsumes (e.g., DINA, DINO, R-RUM). Joint attribute distributions, estimation methods, and monotonicity constraints will also be discussed.

Session 3: This session will briefly introduce R and then focus on model calibration using the GDINA package. Participants will calibrate the PR data using different CDMs, and interpret the results.
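To give a flavor of the hands-on work in this session, model calibration with the GDINA package follows the pattern sketched below. Since the PR data are distributed during the tutorial, the package's built-in simulated dataset (sim10GDINA) stands in here; the function names are those documented in the package.

```r
# install.packages("GDINA")  # if not yet installed
library(GDINA)

# Stand-in data: simulated responses and Q-matrix shipped with the package
dat <- sim10GDINA$simdat
Q   <- sim10GDINA$simQ

# Calibrate the saturated G-DINA model (the default), then a reduced DINA model
fit.gdina <- GDINA(dat = dat, Q = Q, model = "GDINA")
fit.dina  <- GDINA(dat = dat, Q = Q, model = "DINA")

# Inspect the results: item parameters and examinees' attribute profiles
summary(fit.gdina)
coef(fit.gdina, what = "gs")  # guessing and slip parameters per item
personparm(fit.gdina)         # estimated attribute profiles (EAP by default)
```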

Session 4: This session will discuss item-level (i.e., proportion correct, correlation, log-odds ratio) and test-level (i.e., deviance, AIC, BIC) fit statistics provided by the GDINA package, based on which participants will evaluate the fit of different CDMs to the PR data using several alternative Q-matrices.
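A minimal sketch of the fit evaluation covered in this session, again using the package's built-in simulated data in place of the PR data:

```r
library(GDINA)
dat <- sim10GDINA$simdat
Q   <- sim10GDINA$simQ
fit <- GDINA(dat = dat, Q = Q, model = "GDINA")

# Item-level fit: proportion correct, transformed correlation,
# and log-odds ratio statistics
ifit <- itemfit(fit)
summary(ifit)

# Test-level fit: deviance and information criteria,
# plus absolute fit statistics via modelfit()
deviance(fit); AIC(fit); BIC(fit)
modelfit(fit)
```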

Session 5: This session will discuss test-level (likelihood ratio) and item-level (Wald) tests for comparing saturated and reduced models under the G-DINA model framework. Using the GDINA package, participants will identify an appropriate CDM for each item of the PR data based on these tests.
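As a sketch of this session's exercise: after calibrating the saturated model, the item-level Wald test asks, item by item, whether a reduced model would fit as well. The built-in simulated data stand in for the PR data.

```r
library(GDINA)
dat <- sim10GDINA$simdat
Q   <- sim10GDINA$simQ

# Item-level Wald tests comparing the saturated G-DINA model against
# reduced models (DINA, DINO, ACDM, ...) for multi-attribute items
fit  <- GDINA(dat = dat, Q = Q, model = "GDINA")
wald <- modelcomp(fit)
wald  # a non-significant statistic suggests the reduced model suffices

# Test-level comparison of two nested calibrations via likelihood ratio
fit.dina <- GDINA(dat = dat, Q = Q, model = "DINA")
anova(fit.dina, fit)
```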

Session 6: This session will introduce a general method for empirically validating the Q-matrix based on the G-DINA model, which can be used in conjunction with all the reduced models it subsumes. Participants will validate different Q-matrices for the PR data using this method in the GDINA package.
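The empirical Q-matrix validation exercise follows the pattern below; the PVAF-based method implemented in the package flags q-entries that the data contradict. Built-in simulated data stand in for the PR data.

```r
library(GDINA)
dat <- sim10GDINA$simdat
Q   <- sim10GDINA$simQ

# Fit the saturated model, then screen the Q-matrix empirically
fit <- GDINA(dat = dat, Q = Q, model = "GDINA")
qv  <- Qval(fit, method = "PVAF")

qv$sug.Q            # the suggested (possibly revised) Q-matrix
plot(qv, item = 1)  # mesa plot for inspecting a single item's q-vector
```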

Session 7: This session will address the issue of differential item functioning (DIF) by introducing detection methods based on the Wald and likelihood ratio tests, which can be used with the G-DINA model or any model it subsumes. Moreover, it will introduce and discuss attribute classification accuracy as a way of evaluating the quality of a diagnostic assessment given a particular set of item parameters and Q-matrix.
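A sketch of both analyses in this session. The grouping variable below is purely hypothetical (an arbitrary split of the simulated examinees, which stand in for the PR data) and exists only to illustrate the call.

```r
library(GDINA)
dat <- sim10GDINA$simdat
Q   <- sim10GDINA$simQ

# Hypothetical reference/focal grouping for illustration only
group <- rep(c("ref", "focal"), length.out = nrow(dat))

# DIF detection under the G-DINA model; method = "wald" or "LR"
dif.out <- dif(dat = dat, Q = Q, group = group, method = "wald")
dif.out

# Classification accuracy at the test, attribute, and profile levels
fit <- GDINA(dat = dat, Q = Q, model = "GDINA")
CA(fit)
```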