Explainable AI for assessment design optimization in undergraduate education

Authors

DOI:

https://doi.org/10.61286/e-rms.v4i.381

Keywords:

explainable artificial intelligence, machine learning, education, assessment design, undergraduate education, learning analytics, curriculum optimization, educational data mining.

Abstract

Assessment design plays a pivotal role in shaping undergraduate learning outcomes, yet the choice of assessment structure and weighting is often driven by pedagogical intuition rather than empirical evidence. Although machine learning is widely used in higher education to forecast student performance, little work has examined how the elements of assessment themselves affect academic performance. This paper proposes an assessment-centric explainable artificial intelligence (XAI) framework for analyzing and optimizing assessment design in undergraduate education. Using real course assessment data, namely assessment weights, frequency, and diversity, machine learning models are trained to predict course outcomes, and SHapley Additive exPlanations (SHAP) is used to quantify the contribution of individual assessment components. Unlike existing student-centric explainability approaches, the proposed framework focuses on assessment structures, enabling transparent analysis of how design choices affect success and failure risk. Experimental results show that high-stakes final exams are strongly associated with a higher probability of failure, whereas diversified continuous-assessment strategies reduce it. These findings give curriculum developers and academic decision-makers actionable evidence for redesigning assessments.
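The abstract's pipeline (train a predictive model on assessment-design features, then attribute failure risk to each feature with Shapley values) can be sketched in miniature. The sketch below is illustrative only: the feature names, the synthetic data, and the logistic model are all assumptions, not the paper's dataset or method, and the Shapley values are computed exactly by coalition enumeration (feasible here because there are only three features) rather than with the SHAP library.

```python
from itertools import combinations
from math import factorial

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 400
# Hypothetical assessment-design features: final-exam weight,
# number of continuous assessments, assessment-type diversity.
X = np.column_stack([
    rng.uniform(0.2, 0.8, n),   # final-exam weight
    rng.integers(1, 10, n),     # continuous-assessment count
    rng.integers(1, 5, n),      # diversity index
])
# Simulated outcome: failure risk rises with exam weight and
# falls with continuous assessment and diversity (toy assumption).
logits = 6 * X[:, 0] - 0.3 * X[:, 1] - 0.5 * X[:, 2] - 1.0
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(X, y)

def shapley_values(model, X, x):
    """Exact Shapley attribution of P(fail) for one course design x.
    Features outside a coalition are marginalised over the data X."""
    d = len(x)

    def value(S):
        Xs = X.copy()
        for j in S:  # fix coalition features at their values in x
            Xs[:, j] = x[j]
        return model.predict_proba(Xs)[:, 1].mean()

    phi = np.zeros(d)
    for j in range(d):
        for k in range(d):
            for S in combinations([i for i in range(d) if i != j], k):
                w = factorial(k) * factorial(d - k - 1) / factorial(d)
                phi[j] += w * (value(S + (j,)) - value(S))
    return phi

phi = shapley_values(model, X, X[0])
```

By the Shapley efficiency property, `phi.sum()` equals the model's predicted failure probability for this course design minus the average prediction over all courses, so each component's contribution is directly interpretable as a shift in failure risk.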


References

Akçapınar, G., Altun, A., & Aşkar, P. (2019). Using learning analytics to develop early-warning system for at-risk students. International Journal of Educational Technology in Higher Education, 16(1), 1–20. https://doi.org/10.1186/s41239-019-0172-z

Altukhi, Z. M., & Pradhan, S. (2024). Systematic literature review: Explainable AI definitions and challenges in education. ICIS 2024 Proceedings.

Baker, R. S. J. D., & Yacef, K. (2009). The state of educational data mining in 2009: A review and future visions. Journal of Educational Data Mining, 1(1), 3–17. https://doi.org/10.5281/zenodo.3554657

Doshi-Velez, F., & Kim, B. (2017). Towards a rigorous science of interpretable machine learning. arXiv. https://doi.org/10.48550/arXiv.1702.08608

Efendi, Y. (2025). Machine learning-based classification of student adaptability in online learning with feature engineering. TIERS Information Technology Journal, 6(1), 129–143. https://doi.org/10.38043/tiers.v6i1.6806

Fiok, K., Farahani, F. V., Karwowski, W., & Ahram, T. (2022). Explainable artificial intelligence for education and training. The Journal of Defense Modeling and Simulation, 19(2), 133–144. https://doi.org/10.1177/15485129211028651

Guan, Y., Wang, F., & Song, S. (2025). Interpretable machine learning for academic performance prediction: A SHAP-based analysis of key influencing factors. Innovations in Education and Teaching International, 1–20. https://doi.org/10.1080/14703297.2025.2532050

Holstein, K., McLaren, B. M., & Aleven, V. (2019). Designing for complementarity: Teacher and student needs for orchestration support in AI-enhanced classrooms. International Conference on Artificial Intelligence in Education, 157–171. https://doi.org/10.1007/978-3-030-23204-7_14

Hooda, M., Rana, C., Dahiya, O., Rizwan, A., & Hossain, M. S. (2022). Artificial intelligence for assessment and feedback to enhance student success in higher education. Mathematical Problems in Engineering, 2022(1), Article 5215722. https://doi.org/10.1155/2022/5215722

Islam, M. M., Sojib, F. H., Mihad, M. F. H., Hasan, M., & Rahman, M. (2025). The integration of explainable AI in educational data mining for student academic performance prediction and support system. Telematics and Informatics Reports. https://doi.org/10.1016/j.teler.2025.100203

Johora, F. T., Hasan, M. N., Rajbongshi, A., Ashrafuzzaman, M., & Akter, F. (2025). An explainable AI-based approach for predicting undergraduate students' academic performance. Array, 26, Article 100384. https://doi.org/10.1016/j.array.2025.100384

Kalasampath, K., Spoorthi, K. N., Sajeev, S., Kuppa, S. S., Ajay, K., & Angulakshmi, M. (2025). A literature review on applications of explainable artificial intelligence (XAI). IEEE Access. https://doi.org/10.1109/ACCESS.2025.3546681

Kaliisa, R., Misiejuk, K., López-Pernas, S., Khalil, M., & Saqr, M. (2024). Have learning analytics dashboards lived up to the hype? A systematic review of impact on students’ achievement, motivation, participation and attitude. Proceedings of the 14th Learning Analytics and Knowledge Conference, 295–304. https://doi.org/10.1145/3636555.363688

Kesgin, K., Kiraz, S., Kosunalp, S., & Stoycheva, B. (2025). Beyond performance: Explaining and ensuring fairness in student academic performance prediction with machine learning. Applied Sciences, 15(15), 8409. https://doi.org/10.3390/app15158409

Khosravi, H., Shum, S. B., Chen, G., Conati, C., Tsai, Y.-S., Kay, J., Knight, S., Martinez-Maldonado, R., Sadiq, S., & Gašević, D. (2022). Explainable artificial intelligence in education. Computers and Education: Artificial Intelligence, 3, Article 100074. https://doi.org/10.1016/j.caeai.2022.100074

Kizilcec, R. F. (2016). How much information? Effects of transparency on trust in an algorithmic interface. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 2390–2395. https://doi.org/10.1145/2858036.2858402

Latif, G., Abdelhamid, S. E., Fawagreh, K. S., Brahim, G. B., & Alghazo, R. (2023). Machine learning in higher education: Students’ performance assessment considering online activity logs. IEEE Access, 11, 69586–69600. https://doi.org/10.1109/ACCESS.2023.3287972

Leichtmann, B., Humer, C., Hinterreiter, A., Streit, M., & Mara, M. (2023). Effects of explainable artificial intelligence on trust and human behavior in a high-risk decision task. Computers in Human Behavior, 139, Article 107539. https://doi.org/10.1016/j.chb.2022.107539

Lundberg, S. M., & Lee, S.-I. (2017). A unified approach to interpreting model predictions. Advances in Neural Information Processing Systems, 30.

Lünich, M., & Keller, B. (2024). Explainable artificial intelligence for academic performance prediction: An experimental study on the impact of accuracy and simplicity of decision trees on causability and fairness perceptions. Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency, 1031–1042. https://doi.org/10.1145/3630106.3658953

Martin, F., Kim, S., Bolliger, D. U., & DeLarm, J. (2025). Assessment types, strategies, and feedback in online higher education courses in the age of artificial intelligence: Perspectives of instructional designers. TechTrends, 1–17. https://doi.org/10.1007/s11528-025-01115-8

Nicol, D. J., & Macfarlane‐Dick, D. (2006). Formative assessment and self‐regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199–218. https://doi.org/10.1080/03075070600572090

Ofori, E., & Dake, D. K. (2025). Explainable artificial intelligence in LSTM transformer models for student performance analysis. Discover Computing, 28(1), Article 313. https://doi.org/10.1007/s10791-025-09814-9

Osunbunmi, I., Feyijimi, T., Cutler, S., Brijmohan, Y., Arinze, L., Dansu, V., Bamidele, B., Wu, J., & Rabb, R. (2025). Artificial intelligence in engineering education research: Using machine learning models to predict undergraduate engineering students' persistence to graduation. Journal of Engineering Education, 114(4), Article e70034. https://doi.org/10.1002/jee.70034

Pachouly, S., & Bormane, D. S. (2025). Explainable artificial intelligence in education: Transforming teaching and learning, a review. TPM–Testing, Psychometrics, Methodology in Applied Psychology, 32(S8), 1571–1584. https://doi.org/10.5281/zenodo.17866084

Pawlowsky-Glahn, V., Egozcue, J. J., & Tolosana-Delgado, R. (2015). Modeling and analysis of compositional data. John Wiley & Sons.

Romero, C., & Ventura, S. (2010). Educational data mining: A review of the state of the art. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 40(6), 601–618. https://doi.org/10.1109/TSMCC.2010.2053532

Sanfo, J.-B. M. B. (2025). Application of explainable artificial intelligence approach to predict student learning outcomes. Journal of Computational Social Science, 8(1), Article 9. https://doi.org/10.1007/s42001-024-00344-w

Tariq, R., Orozco-del-Castillo, M. G., Zamir, M. T., Ramírez-Montoya, M. S., & Wilberforce, T. (2025). Explainable artificial intelligence for predictive modeling of student stress in higher education. Scientific Reports, 15(1), Article 38375. https://doi.org/10.1038/s41598-025-22171-3

Yorke, M. (2003). Formative assessment in higher education: Moves towards theory and the enhancement of pedagogic practice. Higher Education, 45(4), 477–501. https://doi.org/10.1023/A:1023967026413


Published

2026-04-17 — Updated on 2026-04-17

How to Cite

Adnan Muhisn, S. (2026). Explainable AI for assessment design optimization in undergraduate education. E-Revista Multidisciplinaria Del Saber, 4, e-RMS05042026. https://doi.org/10.61286/e-rms.v4i.381