Advances in High-Stakes Noncognitive Testing: IRT Methods to Improve Accuracy and Efficiency - Stephen Stark, PhD (University of South Florida)
Applications of Forced-Choice Assessments in Noncognitive Testing - Christopher Nye, PhD (Michigan State University)
Over the last 20 years, interest in noncognitive testing for development and selection in workplace and educational contexts has increased tremendously. Longstanding concerns about response biases, such as faking good and rating scale errors, have been mitigated by the introduction of item response theory (IRT) forced-choice methods that yield scores having normative properties. Moreover, when combined with computerized adaptive item selection and methods for detecting aberrant responding in real time, the efficiency and security of noncognitive testing have substantially improved. This set of presentations will review IRT methods developed to enable multidimensional forced-choice testing in high-stakes scenarios and then discuss applications of this methodology in the assessment of both personality and vocational interests. We will first describe an IRT model developed for test construction and scoring, an approach to computerized adaptive testing (CAT), and methods for detecting aberrant responding. Then we will present evidence that personality and vocational interest measures developed using this methodology can help to reduce faking while also providing validity for predicting a number of workplace outcomes.
Dr. Stephen Stark received his PhD in IO psychology from the University of Illinois at Urbana-Champaign and is currently a Professor and Director of the IO Psychology doctoral program at the University of South Florida. His research and teaching focus on measurement and selection with emphasis on item response theory forced-choice models, computerized adaptive testing, differential item functioning, and aberrant responding detection. His research has provided the foundations for civilian and military assessments, most notably the Army’s Tailored Adaptive Personality Assessment System (TAPAS). Dr. Stark is a Fellow of SIOP, APA, and the U.S. Army Research Institute. He is editor of the International Journal of Testing and has served on the editorial boards of several peer-reviewed journals.
Dr. Christopher Nye received his PhD in IO psychology from the University of Illinois at Urbana-Champaign and is currently an Associate Professor of Organizational Psychology at Michigan State University. His research primarily involves organizational research methods, personnel selection and assessment, and the influence of individual differences in the workplace. He has published numerous scholarly articles and chapters on these topics and has received awards from the Academy of Management, SIOP, and the International Personnel Assessment Council for this work. He has also been a Consortium Research Fellow for the Defense Manpower Data Center and a Senior Consortium Research Fellow for the U.S. Army Research Institute.
Free for Students
Students will need a registration code to sign up for free. Please email the Secretary (firstname.lastname@example.org) using your student email address to receive the code.
We understand that you may be financially affected by COVID-19. Therefore, we are offering the option to attend this event for free. Please email the Secretary (email@example.com) to receive a registration code.
© 2021 PTCMW