TrueLearn’s Step 2 CK SmartBank Outperforms UWorld
An emerging Step 2 CK question bank, TrueLearn’s USMLE Step 2 CK SmartBank is quickly becoming one of the most comprehensive exam preparation tools available. As the 1,300+ question bank continues to grow, we strive to provide the best possible testing experience. As part of maintaining our SmartBank, we ran a practical test to understand which questions students preferred based on content alone. To remove brand reputation as a factor, we pitted our questions against those of another industry leader in a blind, unbiased TrueLearn vs UWorld comparison.
Ten recent Step 2 examinees were sent links to four sets of two Step 2 questions (for eight questions total). Each set of questions tested the same learning objective and covered the same subject: one of them was from TrueLearn’s USMLE Step 2 SmartBank, the other from UWorld’s USMLE Step 2 QBank. Thus, the questions in each set were unique but were designed to test and teach the same concept.
Examinees were not told that any of the questions came from UWorld. The questions were also randomized so that “version A” and “version B” did not always correspond to either UWorld or TrueLearn. Examinees were simply told to review each question in a set, answer survey questions related to the questions, and select which version they preferred overall.
TrueLearn vs UWorld Examinees’ Comparison
Examinees voted three of the four TrueLearn questions superior to their UWorld equivalents. Each set had a clear winner, but some results were closer than others; there were no total “blow-outs,” and no question from either TrueLearn or UWorld was rated negatively.
The tallies below (votes for the winning question, votes for the other question, and no preference) are based on examinees’ answers when asked to compare the two questions in each set. Examinees were also asked to rate aspects of each question numerically (specifically, the stem, the rationale, and how closely the stem matched the exam). For the most part, the question rated highest numerically lined up with the qualitative results, though the UWorld pediatrics question scored slightly higher numerically even though more examinees selected the TrueLearn question as the superior version.
The winner here was TrueLearn (6-3-1). Overall, the majority of examinees found the TrueLearn question to be more appropriately difficult and reflective of the exam (eg, “Incorrect answer choices are more relevant making it more difficult to arrive at the correct answer”). They also thought TrueLearn’s rationale was well-organized and more educational (eg, “The second version has an explanation that does better with connecting concepts”). However, the granular numerical scores for both questions (quality of item, quality of rationale, how well the item matched exam style) were very similar, with the UWorld question actually coming up slightly ahead here.
Some examinees thought that additional details could have been added to the UWorld question to increase its difficulty and mirror the exam more (eg, “It may be slightly short, so adding other details would be good”), and that the explanation could have done more teaching (eg, “It could be improved by explaining how increased septal wall thickness is diagnostic, and therefore relating it back to the question stem”).
The winner here was UWorld (5-2-3). Overall, the majority of examinees thought the UWorld question resembled the exam more (eg, “Better in terms of length and number of lab values”) and that its explanation was appropriately thorough but also concise (eg, “Concept covered in totality”).
By comparison, a few examinees found TrueLearn’s explanation thorough but felt it contained an excess of lab values. Some also said the explanation was too detailed, forcing the learner to sift through concepts they felt they had already grasped.
The winner here was TrueLearn (5-2-3), with the TrueLearn rationale, in particular, coming out ahead (eg, “The explanation was more relevant to the question and provided more detail”). Overall, the majority of examinees found the TrueLearn question to be the more difficult of the two while still being fair (eg, “The question appropriately challenges the student to think broadly about potential ethical implications”), and the explanation to be clear and concise (eg, “Great rationale with relevant table and concise straightforward explanation”).
However, some examinees said that even though they preferred the TrueLearn version, they also liked the UWorld version and that taking both would be helpful (eg, “Both are great. They feel very much like the actual exam”).
The clear winner here was TrueLearn (7-2-1). Overall, the majority of examinees found the TrueLearn question to be more difficult and reflective of the exam (eg, “Loved the distracting information in the stem”), whereas the UWorld question was thought to be easier due to lack of details (eg, “Be specific with pertinent negatives rather than broad [bowel or bladder symptoms]; it could be longer to make it more difficult”).
Examinees also felt that the UWorld explanation left out some pertinent information (eg, “Slightly more detail could be added to the rationale, comparing and contrasting the classic textbook presentation of ruptured AAA with the atypical presentation seen here with an explanation of how the student should still arrive at the correct diagnosis”), whereas the TrueLearn explanation provided more relevant details (eg, “It is short, and only covers necessary information which I think is missing in a lot of questions”).
UWorld is undoubtedly the most well-known and well-respected question bank for the USMLE. However, TrueLearn’s brand recognition and trust among medical students as a worthy USMLE preparation tool are increasing. This practical test shows that recently written TrueLearn questions are comparable in quality to UWorld questions, with TrueLearn edging out UWorld in most categories.
Writing a high-quality USMLE-style question based on the National Board of Medical Examiners’ (NBME) content outline, along with an exam-relevant explanation, is no easy feat. Students want just the right level of detail (not too long, not too short) and sometimes disagree with one another about what exactly constitutes that happy medium. As the medical exam prep landscape evolves, it becomes increasingly important for learners to evaluate which products fit their individual learning needs through the many milestones of their academic journey. By focusing on question content rather than brand recognition, learners can make a more informed decision about the curriculum support and content they need to become efficient, effective test-takers who perform well on exam day.