Making Connections Through the Written Word
Two news briefs caught my attention recently. Both described the report Evaluating Digital Learning for Adult Basic Literacy and Numeracy from SRI International, but they did so in quite different ways.
The Digital Promise brief, “Digital Tools Support Adult Learning,” asks “Do edtech tools support learning?” and says that “The answer, according to a new report released by SRI International last week, points to yes. … researchers found that edtech products can help build math and reading skills, as well as confidence in using online technologies.” It is accompanied by an infographic that presents the report’s positive findings.
By contrast, the EdSurge brief, “Immature, Still Needs to Grow Up,” notes that “results were mixed” and quotes the report itself: “The study produced no conclusive evidence that the use of the products was more effective in raising students’ math or reading skills than the participating ABE program sites’ current curricula and approaches” (p. ES-12).
What accounts for the difference?
Partly it’s a matter of roles and relationships. In conducting the study that underlies the report, which was funded by the Joyce Foundation, SRI worked with a number of partners, including Digital Promise but not EdSurge.
But the truth is that both news briefs are correct — as far as they go. Each reports on one aspect of the information contained in the report, but neither tells the full story, and both miss the most important points.
The study was intended “to investigate the role and efficacy of online learning technology products in improving the basic reading and math outcomes of low-skilled adults in ABE programs” (p. 3). It looked at five products designed to develop basic literacy and numeracy skills through web-based instruction.
The research took place in 14 adult education sites, including community colleges, nonprofit adult education centers, and K-12 district adult education programs. Each site implemented one of the five products; each product was used at three sites, except for ALEKS, which was used at two.
The report’s authors discuss how variations among the five products and among the implementation sites limit their ability to generalize the findings. They describe differences in the duration of product use, the degree to which product use was integrated into the established curriculum, and whether product use was mandated. “The quasi-experimental designs used to estimate effects do not disentangle the effects of product use from other aspects of instruction, including direct instruction by the instructor,” they write. “The practices highlighted in particular sites that may appear promising cannot be separated from the characteristics of the ABE program and instructors in that site and the products used” (pp. 5-6).

In addition, the authors note that each of the technology products was being implemented for the first time at each of the study sites: “Thus, the findings reported here are for ABE program sites, instructors, and students in the early adoption stage and may not reflect the outcomes of product use in more mature implementations” (p. 5).
So, as EdSurge notes, the study did not establish a clear link between use of the edtech products and gains on standardized tests. It did suggest an overall trend, however: “Overall the results were inconclusive but tended to be more positive than negative: Greater use of the products was associated with better gains in student test scores” (p. 43). In addition, the report provides several other important insights about edtech use.
As the report’s authors note, “the significance of these findings should not be underestimated. Many of the students enrolled in ABE programs have had little prior success developing their basic skills in formal education environments. … Given the size of the population in need of the kinds of services ABE programs provide, these findings indicate that learning technologies … can be part of the solution, helping ABE instructors do what they do better and providing many adults with the confidence that they can use online resources on their own time and at their own pace” (p. 45).
Adult education is a complex undertaking, and every adult learner brings different strengths, goals, and needs. A careful reading that evaluates the how and why of a product or method’s effectiveness is far more useful — and rewarding — than a simple “works / doesn’t work” verdict.