Fluctuations of Ability Estimates in Testing in Item Response Theory

  • Hideo Hirose, Kurume University
Keywords: item response theory, purely probabilistic fluctuation, basic formula representing fluctuations, bootstrap method, Fisher information matrix, learning analytics, matrix decomposition

Abstract

By analyzing the fluctuations of ability estimates in testing, we first derive the purely probabilistic fluctuations of ability estimates in a single testing, under the condition that the students' abilities can be estimated using item response theory. Next, taking these probabilistic fluctuations into account, we identify students whose observed abilities show discrepancies between two separate testings. When such discrepancies are observed, the test results are considered to be affected by factors such as the examinees' physical condition, the teacher's teaching skill, and the students' development of study skills. To describe this phenomenon, we propose a basic formula. The accuracies are obtained under the assumption that the observed data follow the item response theory. To investigate whether this assumption holds, we introduce a matrix decomposition perspective and confirm that the item response theory is applied properly. Using an example taken from a university mathematics testing, we show how the purely probabilistic fluctuations are extracted and the fluctuations due to other factors are segregated.
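As a minimal sketch of the purely probabilistic part of the fluctuation, the asymptotic standard error of a maximum-likelihood ability estimate under IRT can be obtained from the Fisher (test) information. The snippet below assumes the two-parameter logistic (2PL) model with hypothetical item parameters; it is an illustration of the standard result, not the paper's specific formula.

```python
import math

def p_2pl(theta, a, b):
    # Probability of a correct response under the 2PL IRT model
    # with discrimination a and difficulty b.
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def fisher_information(theta, items):
    # Test information: sum over items of a_i^2 * P_i * (1 - P_i).
    return sum(a * a * p_2pl(theta, a, b) * (1.0 - p_2pl(theta, a, b))
               for a, b in items)

def ability_se(theta, items):
    # Asymptotic standard error of the MLE of ability:
    # SE(theta) = 1 / sqrt(I(theta)).
    return 1.0 / math.sqrt(fisher_information(theta, items))

# Hypothetical (a, b) item parameters for illustration only.
items = [(1.0, -1.0), (1.2, 0.0), (0.8, 0.5), (1.5, 1.0)]
se = ability_se(0.0, items)
```

With only a few items the information is small and the standard error is large, which is why a purely probabilistic fluctuation band must be accounted for before attributing a score change between two testings to external factors.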

References

R. de Ayala, The Theory and Practice of Item Response Theory. Guilford Press, 2009.

F.B. Baker and S-H. Kim, Item Response Theory: Parameter Estimation Techniques, 2nd edn., Marcel Dekker, 2004.

A.P. Dempster, N.M. Laird, and D.B. Rubin, Maximum Likelihood from Incomplete Data via the EM Algorithm, Journal of the Royal Statistical Society, Series B, 39, 1977, pp.1-38.

M.C. Edwards, C.R. Houts, and L. Cai, A diagnostic procedure to detect departures from local independence in item response theory models, Psychol Methods, 23, 2018, pp.138-149.

R. Fletcher, Practical Methods of Optimization, Wiley, 2000.

G.H. Golub, C.F. Van Loan, Matrix Computations, Johns Hopkins Univ. Press, 2012.

R. Hambleton, H. Swaminathan, and H. J. Rogers, Fundamentals of Item Response Theory. Sage Publications, 1991.

H. Hirose and T. Sakumura, Test evaluation system via the web using the item response theory, in Computer and Advanced Technology in Education, 2010, pp.152-158.

H. Hirose, T. Sakumura, Item Response Prediction for Incomplete Response Matrix Using the EM-type Item Response Theory with Application to Adaptive Online Ability Evaluation System, IEEE International Conference on Teaching, Assessment, and Learning for Engineering, 2012, pp.8-12.

H. Hirose, Y. Aizawa, Automatically Growing Dually Adaptive Online IRT Testing System, IEEE International Conference on Teaching, Assessment, and Learning for Engineering, 2014, pp.528-533.

H. Hirose, Y. Tokusada, K. Noguchi, Dually Adaptive Online IRT Testing System with Application to High-School Mathematics Testing Case, IEEE International Conference on Teaching, Assessment, and Learning for Engineering, 2014, pp.447-452.

H. Hirose, Y. Tokusada, A Simulation Study to the Dually Adaptive Online IRT Testing System, IEEE International Conference on Teaching, Assessment, and Learning for Engineering, 2014, pp.97-102.

H. Hirose, Meticulous Learning Follow-up Systems for Undergraduate Students Using the Online Item Response Theory, 5th International Conference on Learning Technologies and Learning Environments, 2016, pp.427-432.

H. Hirose, M. Takatou, Y. Yamauchi, T. Taniguchi, T. Honda, F. Kubo, M. Imaoka, T. Koyama, Questions and Answers Database Construction for Adaptive Online IRT Testing Systems: Analysis Course and Linear Algebra Course, 5th International Conference on Learning Technologies and Learning Environments, 2016, pp.433-438.

H. Hirose, Learning Analytics to Adaptive Online IRT Testing Systems “Ai Arutte” Harmonized with University Textbooks, 5th International Conference on Learning Technologies and Learning Environments, 2016, pp.439-444.

H. Hirose, M. Takatou, Y. Yamauchi, T. Taniguchi, F. Kubo, M. Imaoka, T. Koyama, Rediscovery of Initial Habituation Importance Learned from Analytics of Learning Check Testing in Mathematics for Undergraduate Students, 6th International Conference on Learning Technologies and Learning Environments, 2017, pp.482-486.

H. Hirose, Success/Failure Prediction for Final Examination Using the Trend of Weekly Online Testing, 7th International Conference on Learning Technologies and Learning Environments, 2018, pp.139-145.

H. Hirose, Attendance to Lectures is Crucial in Order Not to Drop Out, 7th International Conference on Learning Technologies and Learning Environments, 2018, pp.194-198.

H. Hirose, Time Duration Statistics Spent for Tackling Online Testing, 7th International Conference on Learning Technologies and Learning Environments, 2018, pp.221-225.

H. Hirose, Prediction of Success or Failure for Examination using Nearest Neighbor Method to the Trend of Weekly Online Testing, International Journal of Learning Technologies and Learning Environments, 2, 2019, pp.19-34.

H. Hirose, Relationship Between Testing Time and Score in CBT, International Journal of Learning Technologies and Learning Environments, 2, 2019, pp.35-52.

H. Hirose, Current Failure Prediction for Final Examination using Past Trends of Weekly Online Testing, 9th International Conference on Learning Technologies and Learning Environments, 2020, pp.142-148.

H. Hirose, More Accurate Evaluation of Student's Ability Based on A Newly Proposed Ability Equation, 9th International Conference on Learning Technologies and Learning Environments, 2020, pp.176-182.

H. Hirose, Analysis of Fluctuations of Ability Estimates in Testing, 10th International Conference on Learning Technologies and Learning Environments, 2021, pp.148-153.

H. Hirose, Difference Between Successful and Failed Students Learned from Analytics of Weekly Learning Check Testing, Information Engineering Express, 4, 2018, pp.11-21.

H. Hirose, Key Factor Not to Drop Out is to Attend Lectures, Information Engineering Express, 5, 2019, pp.59-72.

H. Hirose, Dually Adaptive Online IRT Testing System, Bulletin of Informatics and Cybernetics, Research Association of Statistical Sciences, 48, 2016, pp.1-17.

W.J. van der Linden and R.K. Hambleton, Handbook of Modern Item Response Theory, Springer, 1996.

Y. Koren, R.M. Bell and C. Volinsky, Matrix Factorization Techniques for Recommender Systems, Computer, 42, 2009, pp.30-37.

E. Polak, Optimization: Algorithms and Consistent Approximations, Springer, 1997.

T. Sakumura, T. Kuwahata and H. Hirose, An Adaptive Online Ability Evaluation System Using the Item Response Theory, Education & e-Learning, 2011, pp.51-54.

T. Sakumura and H. Hirose, Making up the Complete Matrix from the Incomplete Matrix Using the EM-type IRT and Its Application, Transactions on Information Processing Society of Japan (TOM), 72, 2014, pp.17-26.

T. Sakumura, H. Hirose, Bias Reduction of Abilities for Adaptive Online IRT Testing Systems, International Journal of Smart Computing and Artificial Intelligence (IJSCAI), 1, 2017, pp.57-70.

Y. Tokusada, H. Hirose, Evaluation of Abilities by Grouping for Small IRT Testing Systems, 5th International Conference on Learning Technologies and Learning Environments, 2016, pp.445-449.

Published
2023-04-30
Section
Technical Papers