Defense Date

3-8-2019

Graduation Date

Spring 5-10-2019

Availability

Immediate Access

Submission Type

Dissertation

Degree Name

EdD

Department

Instructional Technology (EdDIT)

School

School of Education

Committee Chair

Rachel Ayieko

Committee Member

Misook Heo

Committee Member

Gibbs Kanyongo

Keywords

Instructional technology, student response system, formative assessment, question sequencing, statistics education

Abstract

Formative assessment has long been used to gauge students’ understanding of course material before an exam. Only recently, with the advent of more advanced technology, have instructors been able to combine formative assessment with a student response system that lets students answer questions in real time during class. Previous studies report mixed findings on the relationship between the use of student response systems and student learning. For example, some studies found that students who used a student response system performed better on exams or in the course than those who did not, while others found no significant difference. In addition, the influence of testing students formatively multiple times before a summative assessment has received little attention.

This study used a quasi-experimental design to test students on 112 introductory statistics concepts at three time points: during class using a student response system, on an online quiz about one week later, and on an exam at the end of the unit. Each concept was associated with one of four course units and was assigned a level of cognitive demand. The primary goal of the study was to determine whether the sequence of correct and incorrect responses that students gave on the two formative assessments influenced their ability to answer a corresponding summative assessment question correctly; this relationship was tested with a logistic regression model and a Monte Carlo simulation. In addition, a series of loglinear models was used to determine whether the sequence of responses, the course unit, and the level of cognitive demand were independent.
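
For illustration only, the sketch below shows the general shape of such an analysis in Python. It is not the dissertation's actual model or data: the column names (srs_correct, quiz_correct, exam_correct), the simulated responses, and the model specification with an interaction term are all hypothetical stand-ins for the approach described above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Hypothetical stand-in data: 1 = correct, 0 = incorrect on each assessment.
df = pd.DataFrame({
    "srs_correct": rng.integers(0, 2, n),   # in-class student response system item
    "quiz_correct": rng.integers(0, 2, n),  # online quiz item about a week later
})

# Simulated exam outcome, loosely tied to the two formative responses.
logit_true = -0.5 + 0.2 * df["srs_correct"] + 1.0 * df["quiz_correct"]
df["exam_correct"] = rng.binomial(1, 1 / (1 + np.exp(-logit_true)))

# Logistic regression: does the formative response sequence (the interaction
# term covers all four correct/incorrect patterns) predict answering the
# corresponding exam question correctly?
model = smf.logit("exam_correct ~ srs_correct * quiz_correct", data=df).fit()
print(model.summary())

# Monte Carlo step (sketch): simulate exam outcomes from the fitted
# probabilities and compare simulated to observed counts of correct answers.
p_hat = model.predict(df).to_numpy()
sims = rng.binomial(1, np.tile(p_hat, (1000, 1)))
print("simulated mean correct:", sims.sum(axis=1).mean())
print("observed correct:", df["exam_correct"].sum())
```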

The results of this study indicate that students who answered both formative assessments correctly performed best on the exam, followed by those who answered only the quiz question correctly. However, students who answered only the student response system question correctly fared no better on the exam than those who missed both formative assessments. Students who completed more sequences were more likely to overachieve relative to their predicted exam results. Moreover, the results showed that students’ sequences of responses, the course unit, and the level of cognitive demand were not independent. Students tended to overachieve on less cognitively demanding sequences involving descriptive statistics and on strategic thinking exam questions requiring inference, but underachieved in the probability unit and on more challenging descriptive statistics questions.

This study provided insight into the influence of repeated practice on exam performance, suggesting that working independently after learning a concept benefits student learning more than using a student response system in class. It also demonstrated how statistics education can use formative assessment effectively in the classroom and test higher-order thinking with multiple-choice questions. University instructors may find the results useful in reevaluating the use of active learning in their classrooms.

Language

English
