Abstract
Multiple choice questions (MCQs) are frequently used to assess students across educational streams because of their objectivity and the breadth of content they can cover in limited time. However, the MCQs used must be of good quality, which depends on their difficulty index (p-value) and discrimination index (DI).
Objective. The objectives of this study were to incorporate the revised Bloom’s taxonomy into multiple choice questions in an organic chemistry quiz and to assess its effectiveness in detecting areas of improvement in question design.
Problem Statement. MCQs are typically associated with the lower-order learning outcomes of the revised Bloom's Taxonomy, in particular the 'Knowledge' category. In the researcher’s opinion, it is feasible to design MCQs spanning all the categories of the revised Bloom's Taxonomy.
Literature Review. Carneson, Delpierre, and Masters (2016) demonstrated that MCQs can be set at different cognitive levels. Avidon (2010) also agreed that MCQs are versatile: “MCQs need not be limited merely to testing knowledge; tests can include more challenging items … and these might well allow more of the intended learning outcomes for a unit to be assessed efficiently …” (p. 3).
Methodology. The study was conducted in a sixth form centre. An online quiz in organic chemistry was administered to 26 students. It comprised 10 single-response MCQs with a total of 10 marks. Each item had a single stem with four options: one correct response and three incorrect but plausible distractors. Each correct answer was awarded 1 mark and each wrong answer 0, giving a score range of 0-10.
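A minimal sketch of the scoring scheme described above, assuming a hypothetical answer key and hypothetical student responses (the actual quiz items and key are not reproduced here):

```python
# Sketch of the scoring scheme in the Methodology: 1 mark per correct answer,
# 0 otherwise, over 10 items. The answer key and responses below are hypothetical.

answer_key = ["B", "D", "A", "C", "B", "A", "D", "C", "B", "A"]  # hypothetical key

def score_student(responses, key=answer_key):
    """Award 1 mark per correct answer and 0 otherwise (score range 0-10)."""
    return sum(1 for given, correct in zip(responses, key) if given == correct)

# Example: one hypothetical student's responses to the 10 MCQs
student_responses = ["B", "D", "C", "C", "A", "A", "D", "B", "B", "A"]
print(score_student(student_responses))  # -> 7
```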
Findings. As shown in Table 1, the items analysed in this research were neither too easy nor excessively difficult. The mean p-value was 0.48 ± 0.20, well within the acceptable range (0.41-0.60) identified in the literature. The mean DI was 0.50 ± 0.25, above the tolerable limit of 0.15.
Table 1. Assessment of 10 items based on various indices amongst 26 students
Parameter                       Mean    SD
Difficulty Index^a (p-value)    0.48    0.20
Discrimination Index^b (DI)     0.50    0.25
^a The difficulty index represents the percentage of students who answered the item correctly; for this quiz, on average 48% of the students answered an item correctly.
^b The discrimination index measures an item's ability to differentiate between students of higher and lower abilities and ranges from -1 to +1; the higher the DI, the better the item discriminates between high- and low-ability students.
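A sketch of one common way to compute the two indices defined above from a 0/1 record of item responses. The study does not state its exact grouping method; the upper/lower 27% split used here is a conventional choice, and the data are hypothetical:

```python
# p-value: proportion of students answering the item correctly.
# DI: difference in correct answers between upper and lower scorer groups,
# divided by the group size (conventional 27% split assumed).

def difficulty_index(item_scores):
    """Proportion of students who answered the item correctly (0-1)."""
    return sum(item_scores) / len(item_scores)

def discrimination_index(item_scores, total_scores, fraction=0.27):
    """(correct in upper group - correct in lower group) / group size."""
    n = len(total_scores)
    k = max(1, round(n * fraction))
    order = sorted(range(n), key=lambda i: total_scores[i], reverse=True)
    upper, lower = order[:k], order[-k:]
    return (sum(item_scores[i] for i in upper) - sum(item_scores[i] for i in lower)) / k

# Hypothetical: 0/1 responses of 6 students to one item, and their total quiz scores
item = [1, 1, 1, 0, 0, 0]
totals = [9, 8, 7, 4, 3, 2]
print(difficulty_index(item))              # 0.5
print(discrimination_index(item, totals))  # 1.0 (group size k = 2)
```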
Table 2 shows that the difficulty index for the remembering item was markedly higher than for the rest and, fittingly, the discrimination indices for the remembering and understanding items were lower than for the other taxonomy levels.
Table 2. The Difficulty Index and Discrimination Index of items based on the 6 types of revised Bloom’s Taxonomy
Taxonomy level    No of items    Difficulty Index (p-value)    Discrimination Index (DI)
                                 Mean      SD                  Mean      SD
Remembering       1              0.85      -                   0.29      -
Understanding     1              0.19      -                   0.29      -
Applying          2              0.40      0.08                0.64      0.10
Analysing         2              0.54      0.11                0.71      0.40
Evaluating        2              0.46      0.05                0.43      0.00
Creating          2              0.46      0.33                0.43      0.40
On a different note, the extremely low p-value for the understanding item indicated that it should be checked for potentially confusing language, ambiguity, or even an incorrect key (Hingorjo & Jaleel, n.d.).
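A short sketch of how such items could be flagged for review, using the acceptable p-value range (0.41-0.60) and minimum tolerable DI (0.15) cited in the Findings; the cut-off for an "extremely low" p-value (0.20) is an assumption for illustration:

```python
# Flag items for review against the thresholds cited in the Findings.
# The 0.20 cut-off for an "extremely low" p-value is an assumed value.

def flag_item(p_value, di, p_range=(0.41, 0.60), di_min=0.15, p_very_low=0.20):
    flags = []
    if p_value < p_very_low:
        flags.append("very low p-value: check wording, ambiguity, or key")
    elif not (p_range[0] <= p_value <= p_range[1]):
        flags.append("p-value outside acceptable range")
    if di < di_min:
        flags.append("DI below tolerable limit")
    return flags

# Example: the understanding item from Table 2 (p = 0.19, DI = 0.29)
print(flag_item(0.19, 0.29))  # ['very low p-value: check wording, ambiguity, or key']
```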
Conclusion. Item analysis is a straightforward yet important measure carried out after a test, producing data on the reliability and validity of MCQs through the p-value and DI. Based on the computed p-value and DI, the items in this study were appropriately demanding and efficient at discriminating between high- and low-performing learners. Hence, well-designed multiple-choice questions incorporating the different learning domains of the revised Bloom’s taxonomy may be a viable and recommended method for evaluating critical thinking skills in large classes.