Academic Journals Database
Disseminating quality controlled scientific knowledge

An analysis of the discrete-option multiple-choice item type

Author(s): Neal M. Kingston | Gail C. Tiemann | Harold L. Miller Jr. | David Foster

Journal: Psychological Test and Assessment Modeling
ISSN 2190-0493

Volume: 54 | Issue: 1 | Start page: 3 | Date: 2012

Keywords: computer-based testing | innovative item types | discrete-option multiple-choice | multiple-choice | testwiseness

The discrete-option multiple-choice (DOMC) item type was developed to curtail cheating and reduce the impact of testwiseness, but to date there has been only one published study of its statistical characteristics, and that study was based on a relatively small sample. The present study investigated the psychometric properties of the DOMC item type and systematically compared it with the traditional multiple-choice (MC) item type. Test forms written to measure high school-level mathematics were administered to 802 students from two large universities. Results showed that across all forms, MC items were consistently easier than DOMC items. Item discriminations between DOMC and MC items varied randomly, with neither performing consistently better than the other. Results of a confirmatory factor analysis were consistent with a single factor across the two item types.
