ASC Test Development Process

APA's two new Advanced Specialty Certification examinations — AICP CTP and AICP CEP — were developed in conjunction with Prometric in compliance with industry-standard psychometric practices and in accordance with the Standards for Educational and Psychological Testing published by the American Psychological Association, the American Educational Research Association, and the National Council on Measurement in Education. Below is an overview of Prometric's proven test development process that guided the creation of these exams.

Job Analysis

A Job Analysis survey identifies the specific key responsibilities and competencies required for effective performance in a specialized planning field. Subject matter experts (SMEs) are assembled to participate in a focus group facilitated by Prometric. The focus group evaluates the knowledge, skills, and abilities necessary to measure the desired level of competency in a given planning specialization, including establishing the tasks and subtasks required to earn the designation.

Test Specification (Blueprint) Development

The development of a test specification (blueprint) document follows the Job Analysis and ensures that each section and objective is appropriately represented on the test. Ratings solicited from the SMEs during the Job Analysis ensure that the weighting of each objective is appropriate.

Item Writing Workshop

A panel of SMEs is recruited and trained to write items (test questions) during an Item Writing Workshop. During the workshop, a Prometric facilitator conducts item-writing training and guides the item-writing process. SMEs then author questions on their own time and submit them for editing and review. Each item is designed to be an accurate measure of knowledge relevant to the goals of the testing program, as well as psychometrically sound and legally defensible.

Item Review

A separate panel of SMEs — under the direction of a qualified Prometric psychometrician — reviews each newly written test item for accuracy, clarity, valid linkage to the test specifications, and correctness of the identified key. SMEs then revise items as needed to meet these requirements.

Psychometric and Language Item Editing

Each item undergoes a psychometric, sensitivity, and language review. Prometric's psychometric review verifies that each item conforms to recognized psychometric item-writing guidelines and does not favor any particular nationality, race, religion, or gender. During the editing process, Prometric editors check all items for grammar, usage, readability, clarity, and consistency. Once the items are selected for inclusion, a Prometric test developer constructs the test in accordance with the test specifications.

Test Administration

The test is administered to qualified candidates in a live testing environment.

Item Analysis and Review of Statistically Flagged Items

Prometric combines detailed statistical study with subject matter expert judgment to measure the relationship between item types and objectives; to assess the fairness, reliability, and validity of each item; and to examine performance discrepancies. SMEs review poorly performing items and help determine how best to score them. Prometric's proven methods include the modified Angoff and Borderline Group methods.
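To make the statistical side of item analysis concrete, the sketch below computes two classical item statistics from a matrix of scored responses: difficulty (the proportion of candidates answering correctly) and point-biserial discrimination (the correlation between the item score and the total score). The flagging thresholds are illustrative assumptions, not Prometric's actual criteria; items falling outside them would go to SMEs for review.

```python
# Illustrative classical item analysis (thresholds are assumed, not
# Prometric's actual criteria). `responses` is a list of candidate rows,
# each a list of 0/1 item scores.

def item_analysis(responses, min_difficulty=0.25, max_difficulty=0.95,
                  min_discrimination=0.15):
    n_candidates = len(responses)
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]
    mean_t = sum(totals) / n_candidates
    sd_t = (sum((t - mean_t) ** 2 for t in totals) / n_candidates) ** 0.5

    stats, flagged = [], []
    for j in range(n_items):
        scores = [row[j] for row in responses]
        p = sum(scores) / n_candidates                      # difficulty
        # Point-biserial: correlation of item score with total score.
        cov = sum((s - p) * (t - mean_t)
                  for s, t in zip(scores, totals)) / n_candidates
        sd_j = (p * (1 - p)) ** 0.5
        rpb = cov / (sd_j * sd_t) if sd_j > 0 and sd_t > 0 else 0.0
        stats.append((p, rpb))
        if not (min_difficulty <= p <= max_difficulty) or rpb < min_discrimination:
            flagged.append(j)                               # send to SME review
    return stats, flagged
```

An item everyone answers correctly (or that high scorers miss as often as low scorers) would be flagged here, mirroring the "performance discrepancies" the SMEs examine.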

Standard Setting / Cut Score Analysis

A standard-setting panel of SMEs is convened to determine the pass-fail standard and the associated cut score for the test. A cut score is the minimum score that a candidate must obtain in order to pass the exam. SMEs examine each test question and make a preliminary assessment of what the minimally qualified candidate should and should not answer correctly. The panel then takes the exam itself and reconvenes to adjust its definition of the minimally qualified candidate based on its own performance. Prometric compares the results of the Item Analysis and Standard Setting studies with candidate performance to recommend a range of scoring options. The selected cut score is intended to ensure that only qualified candidates pass the exam and are awarded the credential. Because of factors such as a small candidate pool, the selected cut score will not always produce the desired pass rate at first; as more candidates take the test, however, pass rates should stabilize over time to reflect the intended pass-fail standard.
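The arithmetic behind a modified Angoff study is simple enough to sketch. Each SME estimates, item by item, the probability that a minimally qualified candidate would answer correctly; the raw cut score is the sum of the per-item mean ratings. The panel data below is invented for illustration.

```python
# Hedged sketch of a modified Angoff computation; the three-judge,
# four-item panel is hypothetical.

def angoff_cut_score(ratings):
    """ratings: one row per SME, one probability per item."""
    n_judges = len(ratings)
    n_items = len(ratings[0])
    item_means = [sum(row[j] for row in ratings) / n_judges
                  for j in range(n_items)]
    return sum(item_means)       # recommended raw cut score

# Three hypothetical SMEs rating a four-item exam:
panel = [
    [0.9, 0.6, 0.8, 0.5],
    [0.8, 0.5, 0.7, 0.6],
    [0.7, 0.7, 0.9, 0.4],
]
# Per-item means are 0.8, 0.6, 0.8, 0.5, so the cut score is
# 2.7 of 4 raw points: a candidate must answer roughly 3 of 4
# items correctly to meet the standard.
```

After the panel takes the exam itself and the ratings are revisited, the same computation is simply rerun on the adjusted ratings.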

Ongoing Test Maintenance

The test is periodically re-evaluated to ensure that it remains a relevant and accurate measure of candidates' expertise in the field. As in the early stages of the test development process, a panel of SMEs reexamines performance discrepancies and reassesses the fairness, reliability, and validity of test items to determine whether adjustments to the test or cut score are necessary.

Prometric's Road Map to Successful Certification Programs

1. JOB ANALYSIS

  • Identifies job tasks and knowledge, skills and abilities (KSAs)
  • Represents critical first step in test development
  • Provides information to guide professional development initiatives
  • Important investment
  • Many options for conducting job analysis

2. TEST SPECIFICATIONS

  • Test blueprint; content outline
  • Derives from job analysis findings
  • Links test items to job analysis
  • Directs item writing and test assembly

3. TEST DEVELOPMENT

  • Item writing and review
  • Test form assembly
  • Sensitivity review
  • Conduct test development activities:
      • In-person workshops
      • At-home assignments
      • Internet conferencing


4. PROGRAM MARKETING
Develop communications campaign for constituents:

  • Press releases
  • Mission statement
  • Presentation materials:
      • Demo disks
      • Internet page
      • Direct mail
      • Journal article
      • Candidate bulletins

5. TEST ADMINISTRATION
Delivery modes:

  • Paper-and-pencil
  • Computer-based
  • Internet-based (proctored)

6. SETTING A PASSING SCORE
Experts meet to determine the standard of competence


7. TEST SCORING
Types of scores:

  • Percentiles
  • Scaled scores
  • Subscores
  • Equating

Psychometric models:

  • Item response theory
  • Classical test theory
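The score types listed above can be tied together with a small sketch of a raw-to-scaled conversion. Scaled scores let the reported passing score stay constant across equated test forms even when each form's raw cut score differs; the 200-800 scale and passing score of 500 below are assumptions for illustration, not the actual ASC reporting scale.

```python
# Illustrative raw-to-scaled conversion (scale anchors are assumed).
# Maps [0, raw_cut] onto [scaled_min, scaled_cut] and
# [raw_cut, raw_max] onto [scaled_cut, scaled_max], so the cut score
# always reports as the same scaled value regardless of the form.

def scale_score(raw, raw_cut, raw_max,
                scaled_cut=500, scaled_min=200, scaled_max=800):
    if raw >= raw_cut:
        return scaled_cut + (raw - raw_cut) * (scaled_max - scaled_cut) / (raw_max - raw_cut)
    return scaled_min + raw * (scaled_cut - scaled_min) / raw_cut

# On a 100-item form with a raw cut of 70, a candidate exactly at the
# standard reports as 500; a perfect paper reports as 800.
```

A harder form with a raw cut of 65 would use the same function with `raw_cut=65`, and a candidate at its standard would still report as 500 — which is the practical point of scaled scores and equating.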

8. TEST SCORE REPORTING

  • Pass/Fail
  • Numeric scores
  • Feedback to candidates

9. ONGOING TEST MAINTENANCE

  • Continually reevaluate
  • Make continuous investments to keep process current
  • Benchmark against industry standards

© 2011 Prometric Inc.