Rather than present a visual deliverable on this page, I’d like to share a plan developed by Pooja Gupta, MAJ Michael Whitted, and me that outlines how a summative evaluation could be conducted for training materials delivered to remote learners at the US Department of Defense’s Defense Information School (DINFOS). As mentioned on the IDE 641 page, DINFOS decided a few years before the COVID-19 pandemic to offer this training in a remote format to enlisted US Air Force personnel. It’s too bad the USAF didn’t see the Bates model before doing so! It appears they fell into the very trap Dr. Bates cautions against: one cannot simply post in-person materials on a website and hope they will convey instruction to remote learners just as effectively.
Michael wrote the background material explaining the military environment to readers. Pooja contributed many of the evaluation questions and a structure for the evaluation, and created a presentation (not shown here) for peer review. I contributed background information to both reports, added the evaluation matrix to the presentation, wrote several evaluation questions in both reports, and created an observation rubric for the summative evaluation. We received great feedback from the instructor along with a few ideas on how to proceed in the future.
Creating the observation rubric, which you can review on pages 20 and 21 of the report below, is a task I am rather proud of. It required me to pull together several principles of designing good evaluation questions, including choosing the scale of measurement, measuring only one variable at a time, and establishing observable, measurable criteria by which to evaluate the learning outcomes. I look forward to using my newfound skills when executing the E in ADDIE: evaluation.
Special thanks to fellow M.S. student Erin Smith, USAF Reserve, for conceiving of the project and providing us with all the training materials we needed to execute both of our projects.
