On March 13, 2017, RESNA launched the updated Assistive Technology Professional (ATP) examination.
In early 2017, RESNA's Professional Standards Board (PSB) completed its update of the Assistive Technology Professional (ATP) certification examination. Through the exam update process described below, RESNA has crafted a quality exam that tests candidates on the knowledge currently used in the field. The update also marks the elimination of US-centric funding content from the exam.
To update the exam, the PSB enlisted the assistance of 53 subject matter experts (SMEs). At each step of the development process, a diverse group of SMEs was carefully selected to represent the wide range of specialty areas, practice settings, and academic backgrounds of assistive technology service providers. RESNA contracted consulting psychometricians to facilitate each step of the exam update in compliance with the standards of the National Commission for Certifying Agencies. The update process, outlined below, was designed to produce a quality exam that is valid, fair, and reliable. AT subject matter experts made all decisions about exam content and scoring, based on data provided by practitioners in the field.
Job Analysis Study
Development of the current ATP exam began in 2014 with a job analysis study facilitated by the psychometric consulting firm Knapp and Associates. A panel of SMEs identified the job tasks typically performed by assistive technology professionals with basic competence, and the resulting job task outline was then validated through a survey of practitioners. The survey data yielded the test blueprint, which specifies how many exam questions are dedicated to each job task. (The job task listing is published as the “Exam Outline” on the RESNA website.) Next, in a session facilitated by the Ohio State University’s (OSU) Center for Education and Training on Employment (CETE), a panel of SMEs analyzed the job tasks listed in the Exam Outline to identify the steps involved in performing each task, the knowledge and skills each task requires, and the potential consequences of performing a task incorrectly. The resulting matrix served as a tool for the SMEs who later wrote the new exam questions, guiding them in crafting questions that test relevant topics.
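A blueprint of this kind typically allocates exam questions to job tasks in proportion to the importance weights derived from the validation survey. As a minimal illustrative sketch (the task names, weights, and item count below are invented, not RESNA's actual data):

```python
# Hypothetical sketch: distributing exam items across job tasks in
# proportion to survey-derived importance weights (all values invented).

def build_blueprint(task_weights, total_items):
    """Distribute total_items across tasks proportionally to their weights,
    using largest-remainder rounding so the counts sum exactly to total_items."""
    total_weight = sum(task_weights.values())
    raw = {t: w / total_weight * total_items for t, w in task_weights.items()}
    counts = {t: int(r) for t, r in raw.items()}  # round down first
    leftover = total_items - sum(counts.values())
    # Award the remaining items to tasks with the largest fractional parts.
    for t in sorted(raw, key=lambda t: raw[t] - counts[t], reverse=True)[:leftover]:
        counts[t] += 1
    return counts

weights = {"Assessment": 5.0, "Implementation": 3.0, "Follow-up": 2.0}
print(build_blueprint(weights, 100))
# → {'Assessment': 50, 'Implementation': 30, 'Follow-up': 20}
```

The largest-remainder step simply guarantees that the per-task counts add up to the full exam length even when the proportional shares are not whole numbers.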
Item (Test Question) Writing
Two separate panels of SMEs were assembled to write and vet a bank of test questions representing the job tasks listed in the Exam Outline. During each of two workshops facilitated by OSU CETE, the SMEs worked in small groups to draft questions, which were then reviewed and revised by the full panel.
Standard Setting (Setting the Passing Cut-off Score)
After the test was assembled with the new questions in accordance with the blueprint, candidates piloted the new exam. Psychometricians from OSU CETE analyzed the statistical performance of each exam question and then facilitated a cut-score study to determine the passing score. During the study, a panel of SMEs rated the difficulty of each test question with respect to basic competency, and the panelists' ratings were aggregated and averaged to produce a recommended passing cut-off score. After reviewing the results of the study, RESNA's Professional Standards Board finalized the passing cut-off score.
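Cut-score studies of this general type (such as the widely used modified-Angoff method) often have each panelist estimate, for every item, the probability that a minimally competent candidate would answer it correctly; the recommended cut score is then the mean of those ratings scaled to the length of the exam. The sketch below illustrates that averaging step with invented ratings; it is not RESNA's actual data or necessarily its exact procedure:

```python
# Hypothetical sketch of an Angoff-style cut-score calculation.
# All ratings are invented for illustration.

def recommended_cut_score(ratings, num_items):
    """ratings: one list per panelist, each containing that panelist's
    per-item probability estimates for a minimally competent candidate.
    Returns the recommended raw passing score for a num_items exam."""
    per_panelist = [sum(r) / len(r) for r in ratings]    # mean per panelist
    overall = sum(per_panelist) / len(per_panelist)      # mean across the panel
    return overall * num_items                           # expected raw score

ratings = [
    [0.8, 0.6, 0.7, 0.9],   # panelist 1's item ratings
    [0.7, 0.5, 0.8, 0.8],   # panelist 2
    [0.9, 0.6, 0.6, 0.7],   # panelist 3
]
print(round(recommended_cut_score(ratings, 100), 1))  # ≈ 71.7 out of 100
```

In practice the panel's recommendation is only an input: as the text notes, the certifying board reviews the study results before finalizing the cut-off.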