Improving How We Review Edtech: 3 Case Studies
Recently I evaluated three nonprofit sites that review online games for skill and fact practice.
All three are on the right track, but only one offered the kind of educator-directed professional evaluation needed to guide teachers in choosing the best of these resources. The other two had limitations: inadequate documentation that reviewers have expertise in child development or education, no clear rubrics for evaluating websites, and murky details about which applications suit individual students and topics.
A look at each site should shed some light on how we might improve the way we review edtech products in terms of expertise, consistency, and transparency.
1. EdSurge Product Reviews
EdSurge is a work in progress that currently includes staff reviews of only 30 of the 100-plus products listed. According to the staff writers' bios, only two of these individuals have backgrounds in connecting learning tools with developmental or cognitive science. The site states that staff writers "get their information from dozens of interviews with teachers and administrators," which limits the consistency of professional, standardized analysis.

EdSurge also uses "contributing writers" with a range of backgrounds, including classroom teachers, without noting the qualifications of all contributors. The reviews would be more valuable if every reviewer applied the same evaluation criteria or rubrics.
The site provides search categories including game-based learning, subject matter, standards, and cost (some products are free, but most come at a price).
I found a few inconsistencies, including a link without a review and missing analyses. This resource will be more useful when each review includes the name and relevant qualifications of the individual reviewer, along with in-depth analysis of the characteristics that make games suitable for the variety of learners in an age group.
For busy teachers without experience evaluating sites, simple links to websites offer limited value if finding the most effective classroom tools still depends on trial and error.
2. ClassroomWindow
ClassroomWindow's mission is "to create transparency so that our teachers are equipped with the tools they need -- and deserve -- to be successful . . . to share feedback on the quality of instructional materials."
Its evaluations came not from professional consultants but from teachers who responded to surveys or shared opinions based on personal experience with online programs. There were no rubrics or objective comparisons to validate these opinions, and no data about the types of students who would benefit from the recommended online games.
This site recently migrated to a Facebook page, perhaps an easier platform to support its community-driven content.
3. Common Sense Media
An independent nonprofit media evaluation site, Common Sense Media casts a wide net with its reviews. The website indicates that it relies on developmental criteria from authorities to determine content appropriateness and learning ratings. The areas rated are extensive, the characteristics of each rating are described, and the Education Ratings & Review Program Advisors have backgrounds suited to the evaluations, including a Sesame Street strategy director and university professors of digital media and other developmental, cognitive, or psychological fields.
It would be helpful to know which experts evaluated which product, whether there is more than one evaluator for each game, and whether evaluators consistently apply uniform ranking criteria.
The site does include cost in its descriptions, but I'd like to see users have the option of going straight to the free resources instead of having to read each description for that information. The criteria for selecting "editors' picks" and "award winners" also need fleshing out to support the validity of those rankings.
High points of this website are the extensive ways to browse by subject, skills (thinking and reasoning, tech skills, self-direction), genres (blogging, educational, creating), topics (dinosaurs, sports, cars and trucks), and popularity with kids.
One of the categories of high value is age appropriateness. As they explain, "For each title, we indicate the age for which a title is either appropriate or most relevant." The age-appropriate rating is supported by a link to "What's going on at each age group."
Here you'll find helpful developmental background, including age-specific descriptions of representative characteristics in the areas of cognitive, social-emotional, and physical development, as well as technological/digital savviness. These are accompanied by examples that offer guidance on what to expect and what is needed at each developmental stage.
Further, Common Sense lists the specific content categories behind its age ratings, such as positive messages, ease of play, violence, and privacy and safety, with information about why it rates something for an age group.
Apps, video games and websites also have a separate area where their learning potential is rated by engagement, approach to learning, feedback qualities, and availability of support or extensions. These descriptions would be even more valuable as part of a comprehensive rubric, especially with the breakdown of their five-star system translated into rubric format.
Making Continued Progress
A review source's recommendations for online learning games and websites are only as valuable as the characteristics evaluated, the reviewers themselves, and the criteria by which success on these parameters is measured.
In addition to my recommendations specific to the Common Sense Media website, I'd propose categories that evaluate and give descriptive rankings for grade or mastery levels, standards alignment, and the quality of corrective and progress feedback provided.
It would be important to know whether summative feedback is available and whether overall progress is stored for each desired learning goal, so that students and teachers could go back to see changes over time. Ideally, this record keeping would include effort-to-progress reporting (e.g., mastery per hour) to show students the relationship between their practice time and the mastery they achieve.
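To make that concrete with a hypothetical example: a student who masters 12 new multiplication facts over three hours of practice would see an effort-to-progress rate of 12 ÷ 3 = 4 facts per hour, turning the payoff of continued practice into something visible and motivating rather than abstract.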
This pattern of goal-progress feedback would go a long way toward building students' growth mindsets, especially when frequent failure has produced the negative expectations of a fixed mindset.
What more would you like to see in a great online learning resource clearinghouse? What would the assessments include? What have you seen elsewhere in reviews of online skill-building sites that could make this valuable resource even more powerful?