How We Rate

The SpotOn™ Rubric

The reviews and ratings SpotOn uses to evaluate digital content are based on a rubric developed initially through a modified Delphi survey of national academic content and technology experts, as well as through focus group interviews with central Ohio teachers, principals, and national publishers. This initial rubric covered only Core resources, but it formed the foundation for the SpotOn Master Rubric, which can be applied to eight product types, including Core, non-Core, Units, Lessons, Games, Apps, and Adaptive resources.

The development and validation of the SpotOn Rubric were facilitated through The Ohio State University by experts from the College of Education and Human Ecology. The Rubric was designed to ensure that each resource evaluated receives an objective and consistent review, addressing criteria that are specific to each resource type. The SpotOn Rubric applies criteria across four dimensions:

  • Content Quality
  • Pedagogy
  • Technology
  • Standards

If you are interested in learning more about how we rate, you may view our public SpotOn Rubric. If you have additional questions, please contact us.


Reviewers and Methodology

SpotOn's Digital Review Specialists are highly qualified, credentialed educators, licensed in the areas in which they review. Many of our reviewers hold advanced degrees in education, and all have undergone extensive training in using the SpotOn Rubric to evaluate digital content. Reviewers have access to the standard instructional documentation provided with a resource, but they do not receive any additional support or training on it. This ensures that our reviewers approach a resource with the same information available to any other educator.

Each resource is evaluated by multiple Digital Review Specialists to ensure objectivity and to gauge consistency across reviews. For each SpotOn Dimension in the rubric, reviewers assign the resource a score on a 4-point scale (1 to 4) and provide written commentary to support their scoring.

Upon completion of the initial reviews by the Digital Review Specialists, the SpotOn review passes through a multi-stage process to ensure consistency. When reviewers' scores differ by two or more points, a score resolution process is applied. Our subsequent editorial process harmonizes the qualitative and quantitative aspects of the reviews prior to publication on the SpotOn website. Throughout this process, our team strives to ensure that our reviews adhere to the rubric, are accurate and factual, and provide evidence to support our ratings.
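To illustrate the consistency check described above, here is a minimal sketch in Python. The function name, data layout, and example scores are hypothetical, not SpotOn's actual tooling; it simply flags any Dimension where two reviewers' 4-point scores differ by two or more points, which would trigger the score resolution process.

```python
# Hypothetical sketch: flag Dimensions where two reviewers' scores
# differ by 2 or more points on the 4-point scale.

DIMENSIONS = ["Content Quality", "Pedagogy", "Technology", "Standards"]

def needs_resolution(review_a, review_b, threshold=2):
    """Return the Dimensions whose scores differ by `threshold` or more points."""
    return [
        dim for dim in DIMENSIONS
        if abs(review_a[dim] - review_b[dim]) >= threshold
    ]

# Example: two reviewers disagree by 2 points on Pedagogy.
reviewer_1 = {"Content Quality": 4, "Pedagogy": 4, "Technology": 3, "Standards": 4}
reviewer_2 = {"Content Quality": 3, "Pedagogy": 2, "Technology": 3, "Standards": 4}
print(needs_resolution(reviewer_1, reviewer_2))  # ['Pedagogy']
```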

Weightings and Ratings

In the course of a review, reviewers assign scores to each of the criteria outlined in the SpotOn Rubric. These criterion scores are weighted and averaged to produce the SpotOn Dimension scores, and the Dimension scores are in turn combined through weighted averaging to produce an overall SpotOn Score. The weightings for the SpotOn Dimensions and the overall SpotOn Score are based on input gathered through feedback sessions with central Ohio administrators and teachers. The College of Education and Human Ecology at The Ohio State University has initiated a national research study to validate these weightings, with results expected in 2016.
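For readers who want to see how the arithmetic works, the sketch below shows the two-stage weighted averaging described above: criterion scores roll up into a Dimension score, and Dimension scores roll up into the overall SpotOn Score. The criteria names and all weights shown are illustrative placeholders, not SpotOn's actual weightings.

```python
# Hypothetical sketch of the two-stage weighted averaging described above.
# Criteria and weights are illustrative placeholders, not SpotOn's actual values.

def weighted_average(scores, weights):
    """Combine 1-4 scores using the given weights (weights are normalized)."""
    total_weight = sum(weights.values())
    return sum(scores[key] * weights[key] for key in scores) / total_weight

# Stage 1: criterion scores are weighted and averaged into a Dimension score.
pedagogy_criteria = {"differentiation": 3, "engagement": 4, "feedback": 3}
pedagogy_weights = {"differentiation": 2, "engagement": 1, "feedback": 1}
pedagogy_score = weighted_average(pedagogy_criteria, pedagogy_weights)

# Stage 2: Dimension scores are weighted and averaged into the overall SpotOn Score.
dimension_scores = {
    "Content Quality": 3.5,
    "Pedagogy": pedagogy_score,
    "Technology": 3.0,
    "Standards": 4.0,
}
dimension_weights = {
    "Content Quality": 3,
    "Pedagogy": 3,
    "Technology": 2,
    "Standards": 2,
}
overall_score = weighted_average(dimension_scores, dimension_weights)
print(round(overall_score, 2))
```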

Our Reviews

In addition to the written comments for each Dimension, all SpotOn reviews include an overall editorial summary and a list of the product's strengths and weaknesses. The comprehensive review also includes product images and information supplied by the content provider, so that educators have the information they need to evaluate a classroom resource.