Automated feedback on the structure of hypothesis tests

S.G. Tacoma, B.J. Heeren, J.T. Jeuring, P.H.M. Drijvers

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

Understanding the structure of the hypothesis testing procedure is challenging for many first-year university students. In this paper we investigate how providing automated feedback in an Intelligent Tutoring System can help students in an introductory university statistics course. Students in an experimental group (N=154) received elaborate feedback on errors in the structure of hypothesis tests in six homework tasks, while students in a control group (N=145) received verification feedback only. Effects of feedback type on student behavior were measured by comparing the number of tasks tried, the number of tasks solved, and the number of errors in the structure of hypothesis tests between the groups. Results show that elaborate feedback stimulated students to solve more tasks and make fewer errors than verification feedback alone. This suggests that elaborate feedback contributes to students' understanding of the structure of the hypothesis testing procedure.
Original language: English
Title of host publication: Proceedings of the Eleventh Congress of the European Society for Research in Mathematics Education
Editors: Uffe Thomas Jankvist, Marja van den Heuvel-Panhuizen, Michiel Veldhuis
Place of publication: Utrecht, the Netherlands
Publisher: Freudenthal Group & Freudenthal Institute, Utrecht University and ERME
Pages: 2969-2976
ISBN (Print): 978-90-73346-75-8
Publication status: Published - 2019

Keywords

  • intelligent tutoring systems
  • domain reasoner
  • hypothesis testing
  • statistics education
