Instructional Technology (EdDIT)
School of Education
David D. Carbonara
Joseph C. Kush
Sandra D. Embler
LMS, Educational Data Mining, LMS Tools, Frequency of Interaction, Student Learning, Student Achievement, Online Secondary Courses
The pedagogy of teaching and learning has been changing since computers were first integrated into the classroom. As technology evolves, evaluating the effectiveness of instructional tools will remain an area of research need. The effectiveness of an instructional tool can be measured by student learning and achievement. Student learning and achievement were found to be greatest when active learning/engagement, frequent interaction, and feedback were present; these conditions are created by the instructor. Chickering and Gamson (1987) developed the Seven Principles for Good Practice in Undergraduate Education (SPGP) to improve teaching and learning.
The population for this study comprised students enrolled in asynchronous online secondary school courses. In an online environment, the classroom is provided through a Learning Management System (LMS), and the instructor uses the tools the LMS provides to interact with students. This study drew on the SPGP principles that support the active learning/engagement, frequent interaction, and feedback characteristics of effective student learning; the LMS tools of updates, assignments, tests, and discussion boards support SPGP principles 1, 2, 3, and 4. Pretest, posttest, and semester final grade scores were identified for each course. The pretest served as a control variable, while posttest and semester final grade served as dependent variables in each hierarchical multiple regression. The independent variables, the LMS tools, were measured by each instructor's frequency of use per semester. Courses were identified by curricular subject area and analyzed to determine whether subject area affected the predictive power of the tools for both semester final grade and posttest scores.
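The hierarchical regression described above enters the pretest as a first block and the LMS tool frequencies as a second block, then reads off the gain in explained variance. A minimal sketch of that two-step procedure with synthetic data (the variable names, coefficients, and tool frequencies below are illustrative assumptions, not values from the study):

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an ordinary-least-squares fit of y on X (with intercept)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) ** 2).sum()

rng = np.random.default_rng(0)
n = 200
pretest = rng.normal(70, 10, n)
# Hypothetical per-course use frequencies for the four LMS tools:
# updates, assignments, tests, discussion boards (illustrative rates).
tools = rng.poisson([5, 12, 3, 4], (n, 4))
# Simulated posttest driven by pretest plus assignment frequency.
posttest = 0.6 * pretest + 0.8 * tools[:, 1] + rng.normal(0, 5, n)

# Step 1: pretest only (the control block).
r2_step1 = r_squared(pretest.reshape(-1, 1), posttest)
# Step 2: pretest plus the LMS tool frequencies.
r2_step2 = r_squared(np.column_stack([pretest, tools]), posttest)
# Increment attributable to the LMS tools beyond the pretest.
delta_r2 = r2_step2 - r2_step1
```

The quantity of interest is `delta_r2`, the share of posttest variance the tool frequencies explain beyond what the pretest already accounts for.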
This study employed a data mining procedure to determine whether LMS tool use could predict semester final grades (achievement) and posttest scores (learning). The findings suggest that the LMS tools can predict posttest scores but not semester final grades. Additionally, the study examined whether curricular subject area affected the predictive power of the LMS tools. The findings suggest that curricular subject area can predict variance in both semester final grades and posttest scores, and that variance was unequal across curricular subject areas for the dependent variables. Categorizing the courses by curricular subject area positively affected the predictive power of the LMS tools: for posttest scores, the tools showed large effect sizes in science and social studies.
The LMS tools (updates, assignments, tests, and discussion boards) also varied in predictive strength and in their relationship to the dependent variables. The findings indicated that the LMS assignment and discussion board tools were significant predictors with small positive effects on posttest scores, while the LMS test tool was a significant predictor with a small negative relationship to posttest scores. This negative relationship contradicts the literature on the frequency of testing in traditional classroom environments. The LMS test tool was primarily a learner-content interaction, whereas assignments were primarily a learner-instructor interaction and discussion boards primarily a learner-learner interaction. The LMS update tool was a significant predictor of posttest scores, but with a small positive relationship for semester-long courses and a negative relationship for year-long courses. The frequency of LMS tool use also varied by curricular subject area; the assignment tool had the highest mean frequency across all subject areas.
The LMS tools, when added to pretest scores, contributed an additional 3% (SY1516 YL), 4% (SY1415 SL), and 8% (SY1516 YL) of the variance in posttest scores, a small effect; for SY1415 YL they predicted an additional 14% of the variance, a medium effect. Specifically, the findings supported a positive linear relationship between the assignment and discussion board tools and posttest scores. When courses were categorized by school year, the LMS tools were not significant predictors of semester final grades; when categorized by curricular subject area, they were significant predictors of both semester final grades and posttest scores, with small effects for semester final grades. The largest overall effects of the LMS tools were on posttest scores categorized by curricular subject area. Career and technical education SL showed a small effect, with 6% of variance predicted. Medium effects predicted 20% of the variance for English YL, 17% for fine arts YL, 15% for math SL, and 16% for world languages YL. Finally, with large effects, the LMS tools added 29% variance prediction for science YL and 39% for social studies YL. Therefore, curricular subject area does affect the predictive power of LMS tools. This study provides a further example of educational data mining and of the results that can be achieved with a strong pedagogical framework.
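The small/medium/large labels above are consistent with Cohen's f^2 benchmarks for an R^2 increment in hierarchical regression (0.02, 0.15, and 0.35), though whether the study used exactly this statistic is an assumption. A minimal sketch with hypothetical R^2 values, not the study's actual model fits:

```python
def cohens_f2(r2_full, r2_reduced):
    """Cohen's f^2 for the increment of the full model over the reduced model."""
    return (r2_full - r2_reduced) / (1 - r2_full)

def effect_label(f2):
    # Conventional benchmarks: 0.02 small, 0.15 medium, 0.35 large.
    if f2 >= 0.35:
        return "large"
    if f2 >= 0.15:
        return "medium"
    if f2 >= 0.02:
        return "small"
    return "negligible"

# Hypothetical: pretest-only R^2 = 0.40, full model (pretest + tools) R^2 = 0.46,
# i.e. a 6% increment comparable to the career and technical education SL result.
f2 = cohens_f2(0.46, 0.40)
print(effect_label(f2))  # prints "small"
```

Note that the same ΔR² yields a larger f^2 when the full model already explains more variance, which is one reason an increment and its effect-size label must be reported together.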
Barkand, J. (2017). Using Educational Data Mining Techniques to Analyze the Effect of Instructors’ LMS Tool Use Frequency on Student Learning and Achievement in Online Secondary Courses (Doctoral dissertation, Duquesne University). Retrieved from https://dsc.duq.edu/etd/198