Item analysis of examinations in the Faculty of Medicine of Tunis
Abstract
Introduction: Item analysis is the process of collecting, summarizing and using information from students' responses to assess the quality of test items. This study used this approach to evaluate the quality of the items and examinations given at the Faculty of Medicine of Tunis (FMT).
Methods: This study covered the examinations of the 2012-2013 academic year (principal session). It analyzed 3138 items from 66 examinations, of which 46 were multidisciplinary (187 disciplines). A total of 2515 students took the examinations. The "AnItem.xls" file was used for the analysis, which focused on difficulty, discrimination and internal consistency.
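The abstract does not publish the internal formulas of the "AnItem.xls" workbook; the following Python sketch shows the standard definitions such tools typically implement: the difficulty index as the proportion of correct responses per item, and the discrimination coefficient as the corrected point-biserial correlation between an item score and the total score on the remaining items. The binary response matrix and all names are illustrative assumptions, not the authors' code.

```python
import numpy as np

def item_statistics(responses: np.ndarray):
    """Per-item difficulty and discrimination for a 0/1 response matrix
    of shape (n_students, n_items).

    Assumed conventions (not specified in the paper):
    - difficulty index = proportion of students answering correctly
    - discrimination  = corrected point-biserial correlation: Pearson
      correlation between the item score and the total score on the
      *remaining* items, to avoid inflating the coefficient.
    """
    n_students, n_items = responses.shape
    difficulty = responses.mean(axis=0)        # p-value of each item
    totals = responses.sum(axis=1)
    discrimination = np.empty(n_items)
    for j in range(n_items):
        rest = totals - responses[:, j]        # total score without item j
        discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
    return difficulty, discrimination

# Illustrative use on a small random matrix of 10 students x 4 items
rng = np.random.default_rng(0)
resp = (rng.random((10, 4)) > 0.4).astype(int)
p, rpb = item_statistics(resp)
print(p, rpb)
```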
Results: Mean difficulty across all examinations was optimal (mean difficulty index: 0.59). The majority of items (89.17%) were either easy or of acceptable difficulty. Mean discrimination across all examinations was moderate (mean item discrimination coefficient: 0.28), with poor discrimination in 23.62% of items. Discrimination was maximal for disciplines with a difficulty index between 0.4 and 0.6. "Ideal" items represented 27.02%. Mean internal consistency across all examinations was acceptable (Cronbach's alpha: 0.79). Disciplines with unacceptable internal consistency (68.45%) each contained at most 33 items, and their alpha correlated positively with their number of questions. Score distributions were mostly platykurtic (72.73%) and negatively asymmetric (89.39%). The first year of studies had the best parameters.
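Cronbach's alpha and the distribution-shape descriptors reported above follow standard definitions; a minimal sketch, assuming the same 0/1 response matrix as before (variable names and the example values are illustrative):

```python
import numpy as np
from scipy.stats import skew, kurtosis

def cronbach_alpha(responses: np.ndarray) -> float:
    """Cronbach's alpha for a (n_students, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    k = responses.shape[1]
    item_var = responses.var(axis=0, ddof=1).sum()
    total_var = responses.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def distribution_shape(total_scores: np.ndarray) -> str:
    """Classify a score distribution as in the abstract:
    negative skewness  -> scores piled toward high marks (test tends easy);
    negative excess kurtosis -> platykurtic (flatter than normal)."""
    s = skew(total_scores)
    k = kurtosis(total_scores)          # Fisher definition: normal -> 0
    asym = "negatively asymmetric" if s < 0 else "positively asymmetric"
    peak = "platykurtic" if k < 0 else "leptokurtic"
    return f"{asym}, {peak}"

# Example: total scores of 8 students on one examination (made-up values)
scores = np.array([12, 15, 15, 16, 17, 17, 18, 19])
print(distribution_shape(scores))
```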
Conclusion: Our examinations had acceptable internal consistency and a good level of difficulty and discrimination. They tended toward easiness and discriminated mainly among students of medium ability. Item analysis is useful as a guide for item writers to improve the overall quality of questions in the future.
Keywords:
Difficulty index, discrimination index, internal consistency, score distribution.