PEBC is aware that some candidates are concerned about standardization across exam centres, claiming that certain centres are preferred because some sites supposedly have higher pass rates than others. This claim has surfaced repeatedly over the years, and PEBC has repeatedly dispelled it.
PEBC utilizes rigorous processes and measures to ensure the quality of its exams for its candidates. The following sections describe the steps that PEBC takes to ensure standardization of its performance exams within and across the OSCE sites.
Interactive Station Development and Assessment Criteria
Each station is developed in considerable detail, outlining:
- the problem to be resolved within the case;
- an overview of the subject of the case (patient / caregiver / healthcare provider), portrayed by a trained actor, called a standardized patient (SP), who matches the description in the case;
- the dialogue that the SP will follow, including verbal cues to provide to candidates and instructions on how to respond to questions from the candidates;
- how the station will be scored, with detailed criteria.
Sample stations from PEBC’s website can be found here.
SP and Assessor Recruitment and Training
The exam centres that PEBC utilizes follow a comprehensive approach to recruiting the personnel who administer the exam, including the SPs and assessors. The specific criteria used to recruit assessors can be found on PEBC’s website. Assessor eligibility is rechecked annually by confirming licensure status and good standing with the provincial regulatory authorities. Assessors are required to follow a strict code of conduct and to confirm before each exam that they have no potential or actual conflicts of interest.
In advance of the OSCE, the teams of exam personnel (SPs and assessors) are led through multiple training sessions to ensure that they fulfill their roles consistently and accurately. Assessors review examples of different levels of performance and complete rounds of assessment on sample stations to ensure that their ratings are consistent with one another and calibrated to the level deemed appropriate for entry to practice.
Processes on Exam Day
On exam day, the training described above is reinforced to ensure understanding of content and processes. Throughout the exam, senior exam personnel at each site review the assessors’ ratings to ensure consistency and accuracy. As part of the quality assurance process, second experienced assessors attend different stations to confirm consistency of scoring.
Although a national standard is used to develop and assess content and performance (i.e., the NAPRA standards), PEBC recognizes that scope of practice varies from province to province. Assessors are trained to document unique responses so that they can be reviewed following the exam. This allows PEBC to ensure that candidates receive credit for any acceptable response that falls outside the scoring guidelines.
In addition, any incidents that occur at the exam site are documented by exam personnel for review following the exam.
Post-Exam Quality Assurance
PEBC collates all data from the exam centres and conducts analyses to review the ratings of all assessors and all stations, both within sites (i.e., multiple tracks in a single location) and across sites. Clinical and measurement experts work together to check for any scoring anomalies. Further analyses are periodically conducted to compare the pass rates of the different candidate populations (e.g., Canadian and international first-time test takers and repeaters) at each of the OSCE sites.
A Quality Assurance panel meets over several days following each exam to review scoring reports, unique responses, exam incidents, the performance of borderline candidates, and the results of those taking the exam for their fourth and final attempt, to confirm that performance has been scored appropriately.
All of this work, from development to the release of results, is done to ensure that the OSCE is a valid, reliable, and fair assessment of competence and that the exam results are accurate.
Claims of Easier & Harder OSCE Sites
In the past, PEBC has heard claims from candidates that certain sites across the country are easier; which sites these are believed to be changes over time. PEBC has also noticed during the application process that some candidates preferentially select sites far from their place of residence, presumably because they perceive the outcome at those sites to be more favourable.
In recent weeks, a few candidates have emailed PEBC with concerns that those placed in Ottawa are more likely to be successful than candidates taking the exam elsewhere, and have claimed that candidates are being urged by others to withdraw from the exam if they have been assigned to certain other sites.
Prompted by these candidate concerns, and in addition to the routine systematic analyses that PEBC completes, PEBC has undertaken a more in-depth retrospective review of pass rates and scoring within the global domains across candidate populations at the different sites from 2018 to 2021. These candidate populations include the Canadian reference group, international first-time test takers, and Canadian and international repeat test takers. The findings of this review confirmed that, within each candidate population, no site had consistently higher scores or overall pass rates than the others across administrations.
It is unfortunate that, in some instances, unsuccessful candidates who believe themselves to be well prepared and competent try to explain their past exam results by attributing them to external factors. It is important that candidates focus their efforts on understanding the gaps in their competence and work towards bridging those gaps to improve their performance on a future exam attempt. PEBC encourages candidates who are registered for the May OSCE to use the next few weeks to centre their attention on exam preparation rather than on these unsubstantiated distractions, which will have no impact on the outcome of the OSCE.
It is important that conclusions be drawn from the facts. With these facts in hand, candidates can be reassured that PEBC’s stringent examination and assessment practices are grounded in the principles of transparency, integrity, and fairness.