Rules For Writing Multiple Choice Questions

This is a back-to-basics article about an undervalued and often overlooked tool: the multiple-choice question. It may not be as exciting as immersive learning or AI-generated podcasts, but it might be more important.

Well-written assessment items reduce errors from poorly constructed questions. Reliable and valid test items provide more accurate data about what learners know and can do. This data helps you diagnose learning gaps, refine instruction, and improve performance.

What is a reliable assessment item?

Reliability refers to the degree to which measurement is consistent. A reliable test produces similar results when the same knowledge or skill is measured across different test administrations. See Are Your Online Tests Reliable? for a deeper dive.

What is a valid assessment item?

A valid assessment measures the intended knowledge or skill. Validity is the degree to which an assessment accurately measures what it is intended to measure. See Are Your Online Tests Valid? to learn more.

How to Write Effective Multiple-Choice Test Items

Following the rules below will help you write more effective assessments. This guidance ensures questions are interpreted as intended and answer options are clear and free of hints. If you use AI to write test questions, these rules will help you discern good from poor items.

1. Writing the Stem (The Question)

Prior to writing test items, ensure the performance objectives are clear. Then derive items from clearly defined objectives.

One way to plan an exam is to create a test specification table, or blueprint, to ensure the content of the test items maps to the intended objectives. This helps you avoid duplicating content or focusing on unimportant material.
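If you plan assessments in a spreadsheet or script, a blueprint can be as simple as a mapping from objectives to planned item counts. Here is a minimal Python sketch; the objective names, cognitive levels, and counts are illustrative, not taken from any actual exam:

```python
# Hypothetical blueprint: each performance objective maps to the number
# of items planned at each cognitive level.
blueprint = {
    "Identify reliable assessment items": {"recall": 2, "application": 1},
    "Write plausible distractors":        {"recall": 1, "application": 3},
    "Order answer choices logically":     {"recall": 1, "application": 1},
}

def total_items(blueprint):
    """Total number of planned test items across all objectives."""
    return sum(sum(levels.values()) for levels in blueprint.values())

def coverage_gaps(blueprint):
    """Objectives with no items planned -- content the test would miss."""
    return [obj for obj, levels in blueprint.items()
            if sum(levels.values()) == 0]

print(total_items(blueprint))   # 9
print(coverage_gaps(blueprint)) # []
```

Even a simple check like `coverage_gaps` makes duplicated or missing content visible before any items are written.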

  • Include the main idea in the stem. Placing most of the words in the question stem helps the learner understand the problem without having to read the response options. This way, the answer choices can be short, making them less confusing and more legible.
  • Use a single, clearly defined problem. Ensure the item targets one specific concept so the answer is either completely right or completely wrong.
  • Use the direct question format. Experts recommend using a direct question rather than an incomplete statement.
  • Avoid unintentional cues. Ensure question stems do not hint at the correct option.
  • Use a simple structure that is easy to understand. 
  • Clarify the meaning by being precise. Keep in mind that words can have many meanings depending on colloquial usage and context.
  • Avoid double negatives. Example: Instead of ‘Which of the following comments would NOT be unwelcome in a work situation?’, write it in the positive form: ‘Which of the following comments is acceptable in a work situation?’
  • Test comprehension and critical thinking, not just recall. Multiple choice questions are criticized for testing the superficial recall of knowledge. Go beyond recall by asking learners to interpret facts, evaluate situations, explain cause and effect, make inferences, and predict results. See Writing Multiple-Choice Questions for Higher-Order Thinking.

2. Writing the Distractors (Incorrect Options)

Appropriate distractors are a part of the item’s instructional value. Analyzing why learners choose specific distractors can provide diagnostic information about their misconceptions, guiding remedial instruction.

  • Write plausible distractors. Ensure all incorrect options make sense and are feasible.
  • Use common errors. The most effective distractors reflect actual misconceptions or mistakes typical learners make.
  • Maintain uniformity in answer choices. All options should be similar in content and structure.
  • Keep options grammatically consistent and semantically close. Ensure all distractors agree grammatically with the question stem and with the correct answer. All options should also be semantically close to one another, for example: major depression, chronic depression, melancholic depression.
  • Keep answer choices similar in length. Avoid making the correct answer noticeably longer or shorter than the distractors. Homogeneous options of similar length and complexity avoid cueing the key.
  • Use true statements that don’t answer the question. A distractor can be a factually true statement that simply does not satisfy the specific requirement of the stem.
  • Ensure only one correct answer. The item should have only one clearly best option. Mutually exclusive options prevent confusion where two answers might be technically correct.
  • Avoid complex combination options. Combinations such as Both A and B may evaluate logical skills rather than content knowledge.
  • Use ‘All of the Above’ and ‘None of the Above’ with caution. All of the Above can be an obvious give-away answer when it’s not used consistently. It may also encourage guessing if the learner thinks one or two answers are correct. A None of the Above response makes it difficult to know if the learner really knew the correct answer.
  • Distractors for high-stakes exams. One study demonstrated that distractors drawn from learners’ incorrect responses to free-response versions of an item are the most valid and reliable (Ali, Karr, & Ruit, 2016).

3. Writing the Keys (Correct Answers)

  • Align the key with the learning objective and cognitive level.
  • Each question should present only one clearly correct answer.
  • Keys should be unequivocally correct. The key should always be defensible and based on evidence, while distractors are objectively inferior.
  • Consider adding brief justifications to multiple-choice questions. This improves performance and can reveal when correct keys are chosen for wrong reasons, reducing misinterpretation of scores.

4. Managing Questions, Answer Choices, and Feedback

  • Three options are usually optimal. Using one correct answer and two distractors per item is most efficient because it allows for more items on a test, improving content coverage and reliability without compromising test quality. “Using more options does little to improve item and test score statistics and typically results in implausible distractors” (Rodriguez, 2005). Listen to my conversation with Dr. Rodriguez or download the transcript.
  • Keep the number of options consistent. Consistency reduces cognitive load so learners can focus on the question and answers.
  • Place options in logical order. When options are numeric, chronological, or alphabetical, order them logically to reduce irrelevant difficulty.
  • Randomize the order of answer choices. Distribute keys roughly evenly and avoid human “middle‑option” bias. Randomization or balanced schemes improve fairness.
  • When possible, make the test a learning experience. Write high-quality explanations when a test is part of a learning experience to provide an additional way to support comprehension. The explanations should include why an answer is correct and why the distractors are incorrect.
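If your questions are delivered through a quiz engine or built with an authoring script, randomizing option order is straightforward to implement. A minimal Python sketch, where the function name and item content are hypothetical:

```python
import random

def shuffle_options(correct, distractors, rng=random):
    """Shuffle the key together with its distractors.

    Returns the shuffled option list and the index of the key,
    so the grader always knows which position is correct.
    """
    options = [correct] + list(distractors)
    rng.shuffle(options)
    return options, options.index(correct)

# Example item (content is illustrative, not from the article)
options, key_index = shuffle_options(
    correct="Ask a clarifying question",
    distractors=["Ignore the request", "Escalate to a manager immediately"],
)
print(options[key_index])  # always prints the key, wherever it landed
```

Shuffling independently per item (or per learner) naturally distributes keys across positions over a whole test, which counters the human tendency to favor middle options when placing answers by hand.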

References:

  1. Ali, S., Karr, P., & Ruit, K. (2016). Journal of the Scholarship of Teaching and Learning, 16(1), 1–14. doi:10.14434/josotl.v16i1.19106
  2. Rodriguez, M. C. (2005). Three options are optimal for multiple-choice items: A meta-analysis of 80 years of research. Educational Measurement: Issues and Practice, 24(2), 3–13.
