IJIET 2024 Vol.14(10): 1414-1420
doi: 10.18178/ijiet.2024.14.10.2172

Enhancing Educational Assessments: Score Regeneration through Post-Item Analysis with Artificial Intelligence

Najoua Hrich1,2,*, Mohamed Azekri3, Charafeddin Elhaddouchi1, and Mohamed Khaldi2
1. Regional Center for Education and Training Professions, Institutions for Higher Executive Training, Tangier, Morocco
2. Computer Science and University Pedagogical Engineering Research Team, Normal Higher School, Abdelmalek Essaadi University, Morocco
3. Regional Academy of Education and Training, Ministry of National Education, Preschool and Sports, Morocco
Email: hrnajouaofficiel@gmail.com (N.H.); medazekri@gmail.com (M.A.); charaff4@yahoo.fr (C.E.); medkhaldi@yahoo.fr (M.K.)
*Corresponding author

Manuscript received March 3, 2024; revised May 30, 2024; accepted August 2, 2024; published October 15, 2024

Abstract—Over the years, the significant increase in the number of candidates taking examinations and competitive exams has prompted the relevant bodies and institutions to reconsider their assessment approaches. This evolution largely stems from the growing prevalence of Artificial Intelligence (AI) and continuous technological advances. Multiple-Choice Tests (MCTs) have become the predominant format, offering a practical and efficient way to assess a wide range of candidates rigorously. The success of this method, however, hinges on the quality of its items: ensuring the validity of such exams relies heavily on statistical analysis to select relevant items that are balanced in terms of difficulty and discriminatory power. The primary aim of this research is to evaluate the effectiveness of MCTs enhanced by AI through comprehensive item analysis followed by score regeneration focused on the most discriminating items, thereby strengthening assessment accuracy. Given the challenges of performing this analysis during test development, a new approach is proposed in which an AI tool regenerates scores based on a posterior statistical analysis of candidate performance, adjusting scores by eliminating the least discriminating items. The research sample was intentionally selected and consisted of computer science trainee teachers spread over the last three academic years. To validate this approach, a comparative study was conducted using Student's t-test and the Spearman correlation coefficient on the grades obtained each year in algorithmics and programming training modules. The results demonstrate that incorporating this score regeneration phase considerably improves the credibility of MCT-based assessments, providing a solid foundation for educational decision-making. The findings affirm the research objective by showing that AI-enhanced MCTs offer a reliable and valid method for large-scale candidate assessment.
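The post-item analysis described in the abstract can be illustrated with a minimal sketch. This is not the authors' tool: the discrimination measure (point-biserial correlation of each item with the total score), the cutoff of 0.2, and the 20-point rescaling are assumptions chosen for illustration.

```python
# Sketch of post-item analysis and score regeneration:
# 1) compute each item's difficulty (proportion correct) and
#    discrimination (correlation with the total score),
# 2) drop the least discriminating items,
# 3) rescore each candidate on the remaining items.

from statistics import mean, pstdev

def item_stats(responses):
    """responses: list of per-candidate lists of 0/1 item scores.
    Returns a (difficulty, discrimination) pair for each item."""
    n_items = len(responses[0])
    totals = [sum(r) for r in responses]
    stats = []
    for j in range(n_items):
        col = [r[j] for r in responses]
        difficulty = mean(col)  # proportion of candidates answering correctly
        sd_col, sd_tot = pstdev(col), pstdev(totals)
        if sd_col == 0 or sd_tot == 0:
            disc = 0.0  # no variance: the item cannot discriminate
        else:
            cov = mean(c * t for c, t in zip(col, totals)) - difficulty * mean(totals)
            disc = cov / (sd_col * sd_tot)  # point-biserial correlation
        stats.append((difficulty, disc))
    return stats

def regenerate_scores(responses, min_disc=0.2):
    """Drop items whose discrimination falls below min_disc (an assumed
    cutoff) and rescore each candidate on the kept items, out of 20."""
    keep = [j for j, (_, d) in enumerate(item_stats(responses)) if d >= min_disc]
    return [20 * sum(r[j] for j in keep) / len(keep) for r in responses]

# Toy example: item 2 is answered identically by everyone, so it has
# zero discrimination and is removed before rescoring.
responses = [[1, 1, 1], [1, 0, 1], [0, 0, 1]]
print(regenerate_scores(responses))  # scores over the two kept items
```

The regenerated scores could then be compared with the original ones using Student's t-test and the Spearman correlation coefficient, as the study does (e.g. with `scipy.stats.ttest_rel` and `scipy.stats.spearmanr`).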

Keywords—assessments, item analysis, Artificial Intelligence (AI), competitive exams, Multiple-Choice Tests (MCTs)


Cite: Najoua Hrich, Mohamed Azekri, Charafeddin Elhaddouchi, and Mohamed Khaldi, "Enhancing Educational Assessments: Score Regeneration through Post-Item Analysis with Artificial Intelligence," International Journal of Information and Education Technology, vol. 14, no. 10, pp. 1414-1420, 2024.


Copyright © 2024 by the authors. This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited (CC BY 4.0).

General Information

  • ISSN: 2010-3689 (Online)
  • Abbreviated Title: Int. J. Inf. Educ. Technol.
  • Frequency: Monthly
  • DOI: 10.18178/IJIET
  • Editor-in-Chief: Prof. Jon-Chao Hong
  • Managing Editor: Ms. Nancy Y. Liu
  • E-mail: editor@ijiet.org
  • Abstracting/Indexing: Scopus (CiteScore 2023: 2.8), INSPEC (IET), UGC-CARE List (India), CNKI, EBSCO, Google Scholar
  • Article Processing Charge: 800 USD
