We recently had a chance to visit with Trish Young, Kryterion’s Director of Psychometric Services, to discuss two topics.
The first was test maintenance: specifically, how to ensure an exam’s long-term performance by intention rather than by reaction.
The second was best practices for selecting and managing subject matter experts (SMEs), who are essential not just for test development but for the ongoing maintenance of the exam as well.
Q: What should programs do to maintain their certification exams?
The scope of exam maintenance is really related to the level of risk for exam security. There are common maintenance measures for all exams, but the frequency of implementing them will vary based on testing volume, how high the stakes are, the exam’s exposure to the market and the degree to which test candidates, such as friends or co-workers, talk about the exam among themselves.
So, if you have a low-risk program with just a few hundred test candidates per year, you may only need to conduct periodic health checks or run statistics on item performance. You might also look at new item development and beta testing for those items.
If you have a higher-risk program, you’ll want to revise your test forms regularly or create new ones outright. For high-volume exams, new test forms may be needed annually to maintain exam security, and an evaluation of passing/cut scores will be required for each new form.
Q: What are common mistakes or misconceptions that programs have related to exam maintenance?
The most common misconception that sponsors have is assuming that exam development is a “one and done” process, which often overlooks the need for a budget for continued exam maintenance. Even when it’s understood that the exam will require ongoing maintenance, it may not become a priority until something unexpected (and unwelcome) happens, like a spike in the pass rate due to exam overexposure.
Q: How do you maintain exams for IT certifications where the software is constantly changing?
The key is ongoing item development and including unscored items on exams so that you have replacement items available when older items become outdated. Because the unscored items will have data behind them, you’ll know their level of difficulty and whether replacement will require an adjustment to the exam cut score.
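To make that last point concrete, here is a deliberately simplified sketch of one way such an adjustment could work under classical test theory: if the pretested replacement item is harder (lower difficulty index) than the retired item, the expected raw score of a borderline candidate drops by the difference, so the raw cut score can be shifted accordingly. The numbers and the function are hypothetical; real programs would typically use formal equating rather than this back-of-the-envelope shift.

```python
def adjusted_cut(raw_cut, retired_p, replacement_p):
    """Shift a raw-score cut by the change in expected score contributed
    by the swapped item (a simplified classical-test-theory adjustment;
    p-values are difficulty indices on a 0-1 scale)."""
    return raw_cut + (replacement_p - retired_p)

# Old form: cut score of 70 raw points. The retired item had a difficulty
# of 0.82; the pretested replacement has a difficulty of 0.64 (harder),
# so the cut drops slightly to compensate.
print(f"{adjusted_cut(70, 0.82, 0.64):.2f}")
```

The point of having pretest data is precisely that this kind of comparison is possible at all: without it, you would be swapping in an item of unknown difficulty and hoping the form stayed comparable.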
Q: How should programs ensure that exam content does not become outdated?
Industry best practices advise conducting a new job task analysis every five to seven years to ensure that the exam is measuring things that are currently important for competent performance in a job role.
Test sponsors should also look at the test questions themselves. Even if the current test blueprint isn’t outdated, individual questions may be, particularly if they’re affected by changes in technology, laws or regulations. Generally, test questions should be reviewed once a year by subject matter experts to identify any that have become outdated.
Selecting and Managing SMEs
Q: What are best practices when selecting SMEs?
The most critical thing to do—especially if you ever pursue accreditation or need to defend your program in any way—is to ensure that your subject matter group is representative of the credential target audience.
Your SMEs should be representative in terms of skills, competencies, geographical areas where the job role is practiced, different work settings and educational backgrounds…basically anything that could impact how people do the certified job role.
SMEs must have a thorough understanding of the tasks, knowledge and skills that the certified person uses on the job so that test items are written at the appropriate level of difficulty.
The SMEs should all hold the target certification and be in good standing.
They shouldn’t have any conflicts of interest, especially in the area of exam-prep activities.
Q: How many SMEs are needed for exam development activities?
There’s no set number, but experience suggests that you typically need eight to 12 SMEs to achieve a group representative of the credential target audience.
No fewer than eight SMEs should be used for a passing score study. Ten to 12 SMEs would be even better in order to balance the occasional extreme view that could otherwise distort a passing score.
For activities like item review, fewer SMEs, say five or six, may suffice, because items will ultimately be reviewed by other SMEs at multiple points in the test-development process.
Q: Should you use different groups of SMEs for different development or maintenance activities?
There are pros and cons for using the same SMEs throughout test development/maintenance and for using different SMEs during the process.
The pros of using the same SMEs include saving staff time by not having to recruit and coordinate subject matter experts for each test-development activity. Another pro is the continuity that results from using the same group throughout test development. Yet another benefit is having the SMEs who wrote the test items present during item review to answer questions and explain each item’s purpose.
Reasons for using different SMEs for test-development activities include avoiding situations where SMEs who wrote the questions become defensive during item review. Using different SMEs also averts the problem where SMEs become too familiar with test questions and—especially in the latter stages of test development—start believing that they’re now too easy.
Some continuity can be helpful in terms of passing information and context on to new SMEs.
If SMEs will be used for extended periods of time, they should have term limits stipulating when they must step down. Fresh faces with fresh eyes should be rotated into the process at designated points.
Q: What are the common mistakes/omissions that test sponsors make related to SMEs?
Probably the biggest omission is not gathering enough background info on SMEs and documenting their participation. Background information ensures that SMEs indeed hold the certification, for example, and that they are representative of the credential target audience. Documentation is critical for following test-development best practices and may contribute to greater legal defensibility for the program.
Another omission is not having signed non-disclosure agreements (NDAs) from SMEs or not periodically reminding them about NDA terms, especially conflicts of interest like exam-prep activities.
Sometimes less-experienced SMEs are excluded from the standard-setting process, which ignores the contribution they could make to ensuring that questions are at the appropriate level of difficulty.
Another mistake is selecting SMEs based on convenience rather than being representative of the credential target audience.
Q: Is it ever appropriate to pay SMEs for their time and effort? Or is that a conflict of interest?
It’s perfectly acceptable to pay SMEs. Programs that do often compensate their SMEs with continuing education credits, which saves money for both sponsors and SMEs. Many SMEs report that they benefit just from being part of the exam-development and maintenance process, deriving value from learning new things and from expanding their network of professional contacts.