Distractor Analysis – Keeping ARRT Exams Effective – Part 2

Sep 19, 2025

As mentioned in a previous article, ARRT psychometricians analyze the responses to each exam item, including the wrong ones. These incorrect options, or distractors, tell a story that helps psychometricians understand whether an item works. Using Minnesota trivia items as examples, we consider three common response patterns.

We start with our first item. The percentage of candidates responding to each option is in parentheses:

"What is a nickname for the state of Minnesota?"

A.) The Sunshine State (0%)

B.) The Desert State (0%)

C.) The Ocean State (1%)

D.) The Land of 10,000 Lakes (99%)

In this case, the percentages reveal that the distractors are not distracting. Minnesota is known for its lakes, and the key is D. If ARRT psychometric staff were to observe these percentages in a radiography pilot item, they might ask a committee of volunteer subject matter experts (SMEs) to revise options A, B, and C to make them more plausible to candidates. Items without plausible distractors can be too easy, rendering an item ineffective at measuring competency.
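This kind of check is easy to automate. Below is a minimal sketch (not ARRT's actual tooling) that tallies the proportion of candidates choosing each option and flags distractors that almost nobody selects; the 5% cutoff is an illustrative threshold, not an ARRT rule.

```python
from collections import Counter

def distractor_report(responses, key, options="ABCD"):
    """Proportion of candidates choosing each option, plus a flag for
    distractors that are not doing their job.

    responses -- list of option letters, one per candidate (e.g. ["D", "C", ...])
    key       -- the correct option letter
    """
    counts = Counter(responses)
    n = len(responses)
    report = {}
    for opt in options:
        p = counts[opt] / n
        # A distractor drawing under ~5% of responses is not distracting
        # anyone (5% is an illustrative cutoff, not an ARRT standard).
        implausible = (opt != key) and (p < 0.05)
        report[opt] = (round(p, 2), implausible)
    return report

# The first Minnesota item: 99% chose D, 1% chose C, nobody chose A or B.
responses = ["D"] * 99 + ["C"]
print(distractor_report(responses, key="D"))
```

Run on the first item's data, the report flags options A, B, and C as implausible, which is exactly the signal that would send the item back to a committee of SMEs for revision.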

Consider a second example:

"Which of the following townships is located at the Minnesota zip code of 56672?"

A.) Boy River (23%)

B.) Berner (26%)

C.) Radium (27%)

D.) Fox (24%)

In this case, the percentages show an even spread of responses, indicating that candidates are likely guessing. Only a handful of people would know that the key is A because this item is so geographically specific. If this percentage pattern were observed for an ARRT item, a psychometrician might ask a committee of SMEs to modify the stem to make the item easier. Items that are too hard reflect guessing and random chance rather than measuring knowledge and skills.
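The "even spread" pattern can also be detected programmatically. Here is a hedged sketch: with four options, pure guessing puts each option near 25%, so the function flags an item when every option falls within a tolerance band around chance. The 10-point tolerance is an assumption chosen for illustration.

```python
def looks_like_guessing(responses, options="ABCD", tol=0.10):
    """Return True when every option draws roughly chance-level responding,
    suggesting candidates are guessing rather than applying knowledge.

    responses -- list of option letters, one per candidate
    tol       -- how far from chance (1/k) an option may stray
                 (illustrative value, not an ARRT standard)
    """
    n = len(responses)
    chance = 1 / len(options)  # 0.25 for a four-option item
    proportions = (responses.count(opt) / n for opt in options)
    return all(abs(p - chance) <= tol for p in proportions)

# The zip-code item: a 23/26/27/24 split across A-D.
responses = ["A"] * 23 + ["B"] * 26 + ["C"] * 27 + ["D"] * 24
print(looks_like_guessing(responses))
```

The same function returns False for the easy "10,000 Lakes" item, where 99% of responses pile onto one option, so it separates the two failure modes described above.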

Finally, consider a third example. For those selecting a given option, the average overall exam score is shown in brackets next to the response percentage. For example, option A below has an [85], meaning that candidates who selected option A averaged 85 on the overall exam:

"What is the true northernmost point in Minnesota?"

A.) The Northwest Angle (75%) [85]

B.) Angle Inlet (10%) [74]

C.) Penasse Island (10%) [75]

D.) Elm Point (5%) [73]

The percentages indicate that most candidates selected option A. However, the key is C, Penasse Island! The reason: the wording is ambiguous. Penasse Island is, in fact, the northernmost point, but the Northwest Angle is a region that contains Penasse Island, which makes option A confusing. Critically, high scorers tended to select option A (their average overall exam score was 85) but got the item wrong; the wording misled even strong candidates. If these statistics were observed for an ARRT item, a psychometrician might ask SMEs to replace option A with another option. ARRT exam items are not meant to be tricky.
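This third pattern hinges on comparing each option's average overall score, in the spirit of a point-biserial check. The sketch below (a simplified illustration, not ARRT's actual procedure) computes the mean exam score for candidates choosing each option and flags the item when a distractor attracts higher scorers than the key does.

```python
def option_score_means(picks, scores, key, options="ABCD"):
    """Mean overall exam score for candidates choosing each option, plus a
    flag when any distractor outdraws the key among high scorers.

    picks  -- option letter each candidate chose
    scores -- that candidate's overall exam score, same order as picks
    key    -- the correct option letter
    """
    means = {}
    for opt in options:
        chosen = [score for pick, score in zip(picks, scores) if pick == opt]
        means[opt] = sum(chosen) / len(chosen) if chosen else None
    # A distractor whose takers outscore the key's takers suggests the item
    # is miskeyed or ambiguously worded.
    suspect = any(
        means[opt] is not None and means[opt] > means[key]
        for opt in options
        if opt != key and means[key] is not None
    )
    return means, suspect

# The northernmost-point item: 75% chose A [85], 10% B [74],
# 10% C [75] (the key), 5% D [73].
picks = ["A"] * 75 + ["B"] * 10 + ["C"] * 10 + ["D"] * 5
scores = [85] * 75 + [74] * 10 + [75] * 10 + [73] * 5
print(option_score_means(picks, scores, key="C"))
```

On this item the flag fires because option A's takers average 85 while the key's takers average 75, which is precisely the signal that option A should be replaced.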

As shown, distractors tell a story. By analyzing distractors, ARRT psychometric staff help ensure that each ARRT item effectively measures the knowledge and skills needed for entry-level practice. The result is a precise and effective examination.