Friday, May 11, 2012

Is technology-enhanced learning effective?

Image source: http://telscenter.org/projects/tels
Is Technology-Enhanced Learning Effective? What the Research Says about TEL.


The posting below is a reprint of Tomorrow's Professor newsletter, vol. 64, issue 4 (http://cgi.stanford.edu/~dept-ctl/cgi-bin/tomprof/postings.php). This short article looks at what recent research says about the effectiveness of technology-enhanced learning. It was prepared by the Research and Evaluation Team, Office of Information Technology, University of Minnesota - Twin Cities (http://z.umn.edu/research). In an effort to make research in the educational technology field more accessible, OIT's Research and Evaluation team produces frequent brief synopses of important recent studies. These synopses may be freely shared and used for non-profit academic purposes (http://z.umn.edu/briefs). For further information contact Dr. J.D. Walker (jdwalker@umn.edu).

Instructors interested in technology-enhanced learning (TEL) frequently want to know whether digital technology is educationally effective. Their question is not whether students like digital technology, or whether students are engaged by it, but instead whether it enhances student learning outcomes.

Despite a growing body of research into TEL, it is hard to give a simple answer to this question, in part because TEL studies are frequently deeply embedded in a particular context, which makes it difficult to know how well the studies generalize outside of that context.

A recent thorough and methodologically sound meta-analysis[1] by Barbara Means and colleagues for the U.S. Department of Education helps to address this problem by providing an overview of conclusions that are supported overall by the research on TEL. Means' primary concern was to compare the effectiveness of courses with an online component[2] to fully face-to-face courses.

Means used a stringent procedure in selecting studies for the meta-analysis, limiting the pool to studies that used a comparative research design, measured learning outcomes objectively, controlled statistically for possible differences between control and treatment samples, and reported effect sizes for student learning outcomes. This procedure yielded 50 contrasts from studies conducted between 1996 and 2008.

* The DOE Meta-analysis: Findings

Online versus face-to-face: Means found an average effect size of +0.20 (p < .001) standard deviations favoring the courses with an online component. This means that, on average, students in courses with an online component outperformed students in face-to-face courses by a small but statistically significant amount, after controlling for other factors.
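The effect sizes reported here are standardized mean differences: the gap between the two groups' mean scores, expressed in units of the pooled standard deviation. A minimal sketch of this calculation (Cohen's d) follows; the exam scores are invented purely for illustration and are not data from the study.

```python
import math

def cohens_d(treatment, control):
    """Standardized mean difference (Cohen's d) between two groups."""
    n1, n2 = len(treatment), len(control)
    m1 = sum(treatment) / n1
    m2 = sum(control) / n2
    # Unbiased sample variances, then the pooled standard deviation
    v1 = sum((x - m1) ** 2 for x in treatment) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in control) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical exam scores: online-component group vs. face-to-face group
online = [78, 85, 82, 90, 74, 88, 81, 79]
f2f    = [75, 80, 79, 85, 72, 83, 78, 76]
print(round(cohens_d(online, f2f), 2))  # 0.75 for these made-up scores
```

An effect size of +0.20, as in the main finding, would mean the average online-component student scored a fifth of a standard deviation above the average face-to-face student.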

Means is careful to say, however, that this finding almost certainly does not represent a pure effect of technology, or of the delivery method used in the different courses. Instead, online courses were associated with other instructional conditions, such as increased learning time, different materials, and enhanced opportunities for collaboration, which are the likely mechanisms through which they achieved superior results.

Factors that had no effect: The meta-analysis analyzed the influence of a large number of potential moderator variables and found that the main effect holds independent of the vast majority of these variables, including:

* learner type (K-12, undergraduate, graduate/professional);
* subject matter (medical/health care, others);
* type of knowledge tested (declarative, procedural, strategic); and
* type of computer-mediated communication with peers and with instructor (asynchronous only versus asynchronous plus synchronous).

Factors that had an effect: Several other moderator variables, however, were statistically significant or nearly so, including:

* Blended learning: The authors separated purely online education from "blended" or "hybrid" conditions, that is, courses in which face-to-face instruction is enhanced or supplemented by online materials and/or activities. They then compared each of these separately to fully face-to-face conditions. Completely online instruction had an advantage of +0.05 (p = .46, not significant) standard deviations over purely face-to-face instruction, while the advantage of blended instruction over face-to-face was +0.35 (p < .001).

* Curriculum and instructional methods: When students in the online condition were exposed to a different curriculum and/or instructional methods from students in the face-to-face condition, the advantage of the online condition was +0.40 (p < .001); when these factors were equivalent across conditions, it was +0.13 (p < .05). This finding suggests that the positive effects of using online technology in education are enhanced when an instructor adapts curriculum and instructional approach to the use of technology.

* Type of online learning experience: Instructor-directed, expository learning had an effect size of +0.39 (p < .01); collaborative, interactive instruction, +0.25 (p < .001); independent, active online learning, +0.05 (not significant).

* Time on task: When students in the online condition spent more time on task than students in the face-to-face condition, the advantage of the online condition was +0.45; otherwise it was +0.18. This difference approached the threshold of statistical significance (p = .06).
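The subgroup averages above come from pooling effect sizes across individual studies. A common way meta-analyses do this is an inverse-variance weighted mean, which gives more precise studies more weight. A minimal sketch follows; the effect sizes and variances below are invented for illustration, not the study's actual data.

```python
def pooled_effect(effects, variances):
    """Fixed-effect (inverse-variance weighted) mean effect size."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    return sum(w * g for w, g in zip(weights, effects)) / total

# Hypothetical study-level effect sizes and their sampling variances
effects   = [0.35, 0.05, 0.40, 0.13]
variances = [0.02, 0.05, 0.03, 0.01]
print(round(pooled_effect(effects, variances), 2))  # 0.22 for these made-up values
```

The pooled estimate sits between the individual effects, pulled toward those measured most precisely (smallest variance).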

Variants of online learning: Means' study also examined different ways of implementing TEL. In other words, in addition to comparing online and face-to-face learning, the meta-analysis addressed studies that compared different learning conditions, all of which involved online technology.

This part of the analysis found that some frequently recommended online learning practices did not result in improved learning outcomes. These included providing multimedia in online learning materials (e.g., enhancing text with static graphics and embedded video) and incorporating quizzes into the online environment.

A group of studies that examined the effects of giving learners control over online resources produced mixed results, with some studies favoring learner control and others yielding null results.

However, it does appear that learning in the online environment can be improved by individualizing instruction (providing feedback and guidance customized to each learner's performance) and by promoting learner reflection (through prompts designed to foster student self-assessment and metacognition).

* Discussion and Best Practices

Blending works: This study adds to a growing consensus around the conclusion that the most effective type of instruction combines the online and face-to-face environments. Other meta-analyses that reach this conclusion are Bernard et al. (2004) and Zhao et al. (2005).

Adapting instruction works: Means' work supports the thoughtful adaptation of instructional methods and materials to the online environment, and forthcoming research by OIT's Research and Evaluation Team reaches a similar conclusion.

Self-directed online learning is not the best: Collaborative or instructor-directed online learning achieved results superior to those attained through independent, self-directed online learning, which may provide a partial explanation for why online learning has not proven to be a money-saver for cash-strapped educational institutions.


References:

1 Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning. Center for Technology in Learning, U.S. Department of Education. Retrieved May 10, 2010 from http://www.ed.gov/about/offices/list/opepd/ppss/reports.html.

2 "Courses with an online component" included courses which merely supplemented an unchanged face-to-face course with online materials, as well as courses delivered entirely online with no face-to-face interaction.



