an item response and an underlying dimension and
is thus preferred in some cases. Also called a one-parameter model.
Two-parameter model
In item response theory, a model that specifies two
parameters affecting an individual’s response to a
particular test item: (a) the difficulty level of the item;
and (b) the discriminating power of the item.
Three-parameter model
In item response theory, a model that specifies three
parameters affecting an individual’s response to a
particular test item: (a) the difficulty level of the item;
(b) the discriminating power of the item; and (c) in
multiple-choice items, the effect of guessing. The
probability of a correct response to the item is held
to be a mathematical function of these parameters.
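That mathematical function is conventionally a logistic curve. Below is a minimal sketch in Python of the three-parameter logistic form, which reduces to the two- and one-parameter models when c = 0 and a = 1; the parameter names a (discrimination), b (difficulty), and c (guessing) follow standard IRT notation, and the example values are illustrative only, not drawn from this text.

import math

def irt_probability(theta, b, a=1.0, c=0.0):
    # theta: examinee ability; b: item difficulty (1PL uses only this);
    # a: item discrimination (2PL adds this);
    # c: guessing lower asymptote (3PL adds this).
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# An examinee of ability 1.0 on a moderately hard (b = 0.5),
# discriminating (a = 1.5) multiple-choice item with guessing (c = 0.25):
print(round(irt_probability(1.0, b=0.5, a=1.5, c=0.25), 3))  # 0.759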
Anchor test
A set of test items used as a reference point in
comparing alternate forms of a test. One alternate
form is administered to one group of participants,
another is administered to a different group, and the
items comprising the anchor test are administered
to both groups. Scores on each alternate form are
then compared with scores on the anchor test.
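In the simplest linear case, that comparison chains two links: each form is mapped onto the anchor scale by matching means and standard deviations, and the links are composed. A minimal sketch in Python, assuming plain mean-sigma linking; the technique and all scores below are illustrative assumptions, not a procedure stated in this text.

from statistics import mean, stdev

def linear_link(scores_from, scores_to):
    # Map the first score scale onto the second by matching mean and SD.
    m_f, s_f = mean(scores_from), stdev(scores_from)
    m_t, s_t = mean(scores_to), stdev(scores_to)
    return lambda x: m_t + (s_t / s_f) * (x - m_f)

# Group 1 took Form X plus the anchor; Group 2 took Form Y plus the anchor.
form_x, anchor_1 = [31, 35, 40, 44, 50], [12, 14, 16, 18, 20]
form_y, anchor_2 = [28, 33, 37, 41, 46], [11, 13, 15, 17, 19]

# Chain: Form X -> anchor scale (via Group 1) -> Form Y scale (via Group 2).
x_to_anchor = linear_link(form_x, anchor_1)
anchor_to_y = linear_link(anchor_2, form_y)
print(round(anchor_to_y(x_to_anchor(42)), 1))  # a Form X 42 maps to about 41.1 on Form Y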
Annex 4
Scheme Validation Process Flow Chart
5.1 Scheme analysis
The Scheme Technical Committee (STC), with the support of specialized experts/consultants, conducts a competence analysis. The scheme competences are documented.
5.2 Evaluate academic/training requirements
STC members evaluate any applicable academic/
training requirements of the certification scheme
according to all applicable (market/legal/statutory/
normative) scheme requirements.
5.3 Evaluate experience requirements
STC members evaluate any applicable experience
requirements of the certification scheme according
to all applicable (market/legal/statutory/normative)
scheme requirements.
5.4 Evaluate certification maintenance/
recertification requirements
STC members evaluate any applicable certification maintenance/recertification requirements of the certification scheme according to all applicable (market/legal/statutory/normative) scheme requirements.
5.5 Select and develop tests
Specialized experts/consultants evaluate the scheme analysis information (competence requirements) and determine the knowledge, skills, and abilities and the methods for their measurement.
5.6 Set cutting scores and review final test
STC experts review the test item by item. They select the correct answer for each item, are then told the keyed answer, and are asked what percentage of qualified candidates would answer each item correctly. The STC experts also judge which, if any, of the knowledge, skills, or abilities are measured by the test. This is their final review of the complete test before it is printed. The detailed scheme validation procedure (modified-Angoff model) is provided at the end of this document.
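The arithmetic behind these Angoff-style judgments is a simple average: each expert's per-item percentages sum to a judge-level cut score, and those are averaged across judges. A minimal sketch in Python (the ratings are invented for illustration; the operative procedure remains the modified-Angoff model cited above):

def angoff_cut_score(ratings):
    # ratings[j][i]: judge j's estimate (0-1) that a qualified
    # candidate answers item i correctly.
    per_judge = [sum(judge) for judge in ratings]  # each judge's implied cut score
    return sum(per_judge) / len(ratings)           # average across judges

# Three judges rating a five-item test (illustrative values).
ratings = [
    [0.90, 0.75, 0.60, 0.80, 0.70],
    [0.85, 0.70, 0.65, 0.75, 0.80],
    [0.95, 0.80, 0.55, 0.85, 0.65],
]
print(round(angoff_cut_score(ratings), 2))  # 3.77 out of 5 items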
5.7 Edit, compose, and print tests
The examination department edits, composes, and, if required, prints the tests.
5.8 Write content validation report
The quality assurance manager writes a content-related validation report. After the STC reviews a draft, the final report detailing the activities undertaken is provided to the certification manager for approval and then returned to the quality assurance manager for inclusion in the management review agenda.
REFERENCES
1. AERA (American Educational Research Association), APA (American Psychological Association), and NCME (National Council on Measurement in Education). 2014. The Standards for Educational and Psychological Testing. American Educational Research Association.

2. Anastasopoulos, G. 2017. "Setting the Passing Score of an Exam." Presentation at the IPC Annual General Meeting, Montreal, Canada, October 2017.

3. Association of Boards of Certification. 2013. "Validating Your Certification Exam." http://www.abccert.org/pdf_docs/ValidatingYourCertificationExam.pdf.

4. Biddle, R. 1993. "How to Set Cutoff Scores for Knowledge Tests Used in Promotion, Training, Certification, and Licensing." Public Personnel Management 22: 63-79.
5. Burcu, H., Yüksel, T., and Zafer, K. 2020. "Distractor Analysis Based on Item Difficulty Index and Item Discrimination Index." Journal of Gumushane University Institute of Science and Technology.

6. Glen, S. n.d. "Classical Test Theory: Definition." https://www.statisticshowto.com/classical-test-theory/.

7. Glen, S. n.d. "Item Response Theory: Simple Definition." https://www.statisticshowto.com/item-response-theory/.
8. Henrysson, S. 1971. "Gathering, Analyzing, and Using Data on Test Items." In Educational Measurement, 2nd ed., edited by R.L. Thorndike, 130-159. Washington, D.C.: American Council on Education.
9. ISO (International Organization for Standardization). 2012. Conformity assessment — General requirements for bodies operating certification of persons. ISO/IEC 17024:2012. https://www.iso.org/standard/52993.html.
10. ISO (International Organization for Standardization). 2014. Conformity assessment — Vocabulary related to competence of persons used for certification of persons. ISO/IEC TS 17027:2014. https://www.iso.org/standard/62024.html.
11. Liu, F. 2008. "Comparison of Several Popular Discrimination Indices Based on Different Criteria and Their Application in Item Analysis." Master's thesis, University of Georgia. https://getd.libs.uga.edu/pdfs/liu_fu_200808_ma.pdf.

12. Professional Testing Inc. n.d. "How Do You Determine if a Test Has Validity, Reliability, Fairness, and Legal Defensibility?" https://www.proftesting.com/test_topics/pdfs/test_quality.pdf.

13. Statistics How To. n.d. https://www.statisticshowto.com.

14. U.S. Department of Labor Employment & Training Administration. 1999. "Testing and Assessment: An Employer's Guide to Good Practices." https://wdr.doleta.gov/opr/fulltext/document.cfm?docn=6032.