Monday, March 02, 2015, 4:00 PM – 5:00 PM (PST)
If at First You Don't Succeed: An Analysis and Interpretation of Candidate Retake Response Patterns
Practice Area Division(s): Certification/Licensure, Workforce Skills Credentialing
Topic: Testing, Measurement, and Psychometrics
The testing industry generally supports offering examinees at least one opportunity to retake an exam. Credentialing programs often define and enforce specific retake policies that include requirements such as waiting periods, retake fees, or education/retraining. The intent of these policies is to limit candidates' exposure to exam content for security purposes and to encourage candidates to make a serious effort on earlier attempts. Although programs typically provide failing candidates with guidance on preparing for their second attempt, candidates may use a number of strategies to achieve a higher score.
Research in this area supports the idea that examinees perform better on a second attempt at an exam, but there is little empirical research providing insight into how examinees' response patterns change, or remain the same, between attempts. This information can help program stakeholders better understand how candidates prepare for a second attempt and whether prior exposure to some content provides an advantage. From a program maintenance perspective, analysis of retake performance and response patterns can help identify compromised items or detect cheating, as in the sketch below.
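The abstract does not describe the presenters' actual procedure; purely as an illustrative sketch of one way such a screen could work, the snippet below standardizes each item's attempt-1 to attempt-2 gain among repeat candidates and flags large positive outliers as candidates for compromise review. The data layout and column names (candidate_id, item_id, attempt, correct) are hypothetical.

```python
# Illustrative sketch only, not the presenters' method.
# Assumes a long-format table of responses from repeat candidates, with
# hypothetical columns: candidate_id, item_id, attempt (1 or 2), correct (0/1).
import pandas as pd

def flag_anomalous_items(responses: pd.DataFrame, z_cutoff: float = 2.0) -> pd.DataFrame:
    """Flag items whose attempt-1 -> attempt-2 gain is far above the typical gain."""
    # Proportion correct per item on each attempt.
    p = (responses
         .pivot_table(index="item_id", columns="attempt",
                      values="correct", aggfunc="mean")
         .rename(columns={1: "p_attempt1", 2: "p_attempt2"}))
    p["gain"] = p["p_attempt2"] - p["p_attempt1"]
    # Standardize gains across items; large positive outliers are flag candidates.
    p["gain_z"] = (p["gain"] - p["gain"].mean()) / p["gain"].std(ddof=1)
    return p[p["gain_z"] > z_cutoff].sort_values("gain_z", ascending=False)
```

Flagged items would of course warrant human review rather than automatic removal; a large gain can also reflect legitimate retraining on a heavily weighted content area.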
In this session, representatives from two credentialing programs will provide an overview of their exam retake policies, including the rationale for those policies. Presenters will explain how response patterns were analyzed for repeat test takers in both programs, including in-depth investigations of response patterns by item response time, item difficulty, item type, and exam length; a sketch of that kind of facet-level breakdown follows.
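Again as an illustrative sketch rather than the programs' actual analysis, a facet-level breakdown might cross-tabulate correctness and response latency by attempt across item type and difficulty. The item metadata columns (item_type, p_value_bank) and the response-time column (rt_seconds) are assumed for illustration.

```python
# Illustrative sketch under the same hypothetical data layout as above,
# with responses additionally carrying rt_seconds (item response time) and an
# item table carrying item_type and p_value_bank (bank-level proportion correct).
import pandas as pd

def summarize_retake_changes(responses: pd.DataFrame, items: pd.DataFrame) -> pd.DataFrame:
    """Mean correctness and response time per attempt, split by item facets."""
    merged = responses.merge(items[["item_id", "item_type", "p_value_bank"]],
                             on="item_id")
    # Bin bank difficulty into easy/medium/hard for a readable cross-tab;
    # higher p-values indicate easier items.
    merged["difficulty"] = pd.cut(merged["p_value_bank"],
                                  bins=[0, 0.5, 0.8, 1.0],
                                  labels=["hard", "medium", "easy"])
    return (merged
            .groupby(["item_type", "difficulty", "attempt"], observed=True)
            .agg(mean_correct=("correct", "mean"),
                 mean_rt_sec=("rt_seconds", "mean"),
                 n=("correct", "size")))
```

A table like this makes it easy to see, for example, whether repeaters' gains concentrate in hard items answered unusually quickly on the second attempt, a pattern with both psychometric and security implications.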
The main focus of this presentation will be a review of the key findings that emerged from this investigation, along with the resulting impact on these programs. From a psychometric perspective, specific findings that raise validity and security concerns will be detailed. From a policy perspective, key findings will be interpreted with regard to program impact and future policy discussions. Program leaders will receive guidance on why such investigations should be conducted, how to analyze response patterns, and how to interpret results from psychometric and policy perspectives.
PRESENTERS:
Susan Davis-Becker, Alpine Testing Solutions
Amanda Wolkowitz, Alpine Testing Solutions
Jared Zurn, National Council of Architectural Registration Boards
Jack Terry, National Board of Examiners in Optometry