Analysis of Grade Eight Examination Subjects in the Banadir Region of Somalia

  • Ahmed Mohamud Warsame, Ministry of Education
Keywords: Weight of Cognitive Skills, Difficulty Level, Instructional Objectives, Design of Question Paper, Item Analysis

Abstract

Item analysis is essential to developing assessment tools and to computing standardised measures of student performance. This study analysed the questions of the grade eight examination subjects in the Banadir region of Somalia with respect to their instructional objectives, their difficulty level, and their impact on students’ achievement. The population of the study was the grade 8 students of the Banadir region of Somalia who sat the national examinations in 2022. A purposive sampling technique was followed for the selection of subjects and subject experts, and a random sampling technique was used to select 10% of the question papers in every subject examined in Banadir. The views of 16 subject experts were also taken into consideration regarding the design of the question papers, the instructional objectives, alignment with the syllabus, and the difficulty level. The findings revealed a clear effect of difficulty level and instructional objectives on students’ achievement. However, the study also found a severe mismatch between the distribution of items in the original table of specifications and the analysed items. The study recommended post-examination analysis to improve the quality of the questions and separate analyses for each subject in the syllabus.
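For readers unfamiliar with the item-analysis statistics the abstract refers to, a minimal sketch of two standard classical test theory measures — the difficulty index (proportion of examinees answering an item correctly) and the discrimination index (difference in that proportion between high- and low-scoring groups) — might look as follows. This is an illustration of the general technique, not code or data from the study itself; the function names and the conventional 27% group split are assumptions.

```python
def item_difficulty(item_scores):
    """Difficulty index p: the proportion of examinees who answered
    the item correctly (scores coded 1 = correct, 0 = incorrect)."""
    return sum(item_scores) / len(item_scores)

def discrimination_index(item_scores, total_scores, frac=0.27):
    """Discrimination index D = p_upper - p_lower, comparing the top
    and bottom `frac` of examinees ranked by total test score
    (27% is the conventional split in classical item analysis)."""
    n = len(total_scores)
    k = max(1, round(frac * n))
    # Rank examinees from highest to lowest total score.
    order = sorted(range(n), key=lambda i: total_scores[i], reverse=True)
    upper = [item_scores[i] for i in order[:k]]   # high scorers
    lower = [item_scores[i] for i in order[-k:]]  # low scorers
    return sum(upper) / k - sum(lower) / k

# Hypothetical responses to one item from ten examinees:
item = [1, 1, 1, 0, 0, 1, 0, 0, 1, 0]
totals = [90, 85, 80, 75, 70, 65, 60, 55, 50, 45]
print(item_difficulty(item))                 # 0.5: a moderately difficult item
print(discrimination_index(item, totals))    # positive D: item favours high scorers
```

Items with difficulty near 0.5 and a clearly positive discrimination index are generally considered well-functioning; values near 0 or 1 difficulty, or negative discrimination, flag items for review.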



Published
13 January 2023
How to Cite
Warsame, A. (2023). Analysis of Grade Eight Examination Subjects in the Banadir Region of Somalia. East African Journal of Education Studies, 6(1), 11-21. https://doi.org/10.37284/eajes.6.1.1046