Previously, I’ve discussed the importance of evaluating training in the context of Kirkpatrick’s Evaluation Model. Many people neglect the evaluation process because they are either on a very strict schedule (analysis is usually neglected, too) or they just don’t see how administering tests fits into their training.
The latter is a major misconception. Assessment is not about giving graded tests.
Tools: Assessment can be accomplished in several different ways. Some examples include:
- Poster Presentations
- Oral Presentations
- Case Studies
- Written Reports
- Fill-in-the-Blank Exercises
- State Examinations or Certifications
- Publication
- Observing Student Reactions
Simple Process: If creating assessment methods is daunting, I recommend starting simple.
- Select a training objective.
- Pair the objective with an activity.
- Create a rubric for evaluation.
- Review each student’s completed activity compared to the rubric to see if it aligns with the objective.
- Create a snapshot of your overall results.
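The steps above can be sketched as a short script. The objective, rubric criteria, point scale, and pass threshold here are all illustrative assumptions, not part of any particular assessment standard:

```python
# Illustrative sketch of the simple assessment process described above.
# The objective, rubric, scores, and 70% cutoff are made-up examples.

objective = "Learner can draft a clear written report"

# Rubric: criterion -> maximum points (assumed 3-point scale per criterion)
rubric = {"organization": 3, "accuracy": 3, "completeness": 3}

# Scores from reviewing each student's completed activity against the rubric
scores = {
    "student_a": {"organization": 3, "accuracy": 2, "completeness": 3},
    "student_b": {"organization": 1, "accuracy": 2, "completeness": 2},
}

max_total = sum(rubric.values())
PASS_THRESHOLD = 0.7  # assumed cutoff for "meets the objective"

results = {}
for student, marks in scores.items():
    pct = sum(marks.values()) / max_total
    results[student] = pct >= PASS_THRESHOLD

# Snapshot of overall results
met = sum(results.values())
print(f"{met}/{len(results)} learners met the objective: {objective}")
```

Even a snapshot this simple shows, at a glance, whether the activity and rubric are surfacing the learning the objective calls for.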
Grading: Assessing your learners does not mean you need to change your grading structure or even give a grade at all. However, if learners believe that they are receiving something in return, they may put more effort into completing the assessment. Case in point: if you’ve ever been asked to complete a survey, you may have given up halfway through if you didn’t see a personal benefit in completing it.
Cost: Evaluating learning does not have to be a costly endeavor. Although there are many different software and hardware options to aid in assessing learning outcomes (such as PollEverywhere, TurningPoint, ExamSoft, and PearsonVue), assessment can be accomplished without purchasing third-party products. If assessment is a new goal for your organization, I recommend working through some of the simpler assessment strategies and tools before deciding on an external assessment product.
Previously, we discussed the ADDIE model and the importance of each step when designing instruction. The last step, evaluation, is just as important as the actual design and development of instruction.
Evaluation helps instructional designers determine the success of the instruction or determine the gaps in learning that must be overcome to improve future designs of the instruction.
Donald Kirkpatrick created a ‘four level’ model for training course evaluation in 1959, and it gained popularity in the 1970s.
Kirkpatrick’s Four Levels of Evaluation are designed to evaluate training programs in a sequenced order. The model is typically displayed as a pyramid, in which the later levels are more difficult and time-consuming to assess.
The four levels of Kirkpatrick’s Evaluation Model are:
- Level One: Reaction
- Level Two: Learning
- Level Three: Behavior
- Level Four: Results
Level One- Reaction is the basic level of evaluation in which the participants’ opinions and feelings about the training are measured. I typically hand out evaluation surveys at the end of each training to poll participants on how they liked the overall presentation and whether or not they are interested in using the technology in question.
Level Two- Learning is an increase in knowledge and/or skills as a result of the training. This learning can be measured during the training itself, for example with a test. I have my colleagues walk through the instructional quick guides and complete the task at hand to demonstrate basic knowledge.
Level Three- Behavior is the transfer of knowledge and/or skills from the training to the job. This level is best evaluated 3-6 months after training and is observed while the trainee performs the task. I have observed my colleagues as they attempt the tasks we discussed during training. Behavior skills are not yet achieved, as my trainees are not comfortable performing the tasks without me standing by.
Level Four- Results, the last level of evaluation, occurs when results can be measured as a byproduct of the training program; for example, when attendance and participation have a monetary or performance-based impact. Performance has been positively affected by trainings conducted on collaboration tools and best practices in email. Colleagues have stated that, since being trained on these tools, they spend less time emailing documents and more time working in shared resources.
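As a quick reference, the four levels and the example measures described above can be collected into a simple lookup table. The instruments listed are the ones from this post, not prescriptions from Kirkpatrick's model itself:

```python
# Kirkpatrick's four levels, paired with the example measures from this post.
KIRKPATRICK_LEVELS = [
    ("Reaction", "End-of-training evaluation survey"),
    ("Learning", "In-training test or guided walkthrough of quick guides"),
    ("Behavior", "On-the-job observation, 3-6 months after training"),
    ("Results", "Monetary or performance impact, e.g. less time spent emailing"),
]

# Print an evaluation plan in sequence, since the levels build on one another
for number, (name, measure) in enumerate(KIRKPATRICK_LEVELS, start=1):
    print(f"Level {number} - {name}: {measure}")
```

Keeping the levels in an ordered list mirrors the sequenced nature of the model: each level is assessed before moving on to the next.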
It’s amazing what digging into older computer files reveals. I recently located the Poster I developed for the Sloan-C International Conference on Online Learning 2010. I’ve decided to share this nugget of information with you. A PDF version (better quality image) can be found at WebCampus Poster.
The topic of the Presentation was the Quality Audit (QA) Process used by Stevens Institute of Technology’s WebCampus Department to evaluate the use of technology and interaction in its online courses. By conducting the QA process on individual courses, WebCampus hopes to A) provide feedback to assist the instructor in adjusting their teaching pedagogy to create a more effective online learning environment and B) collect data regarding Best Practices in Online Learning.
The Quality Audit Process is something close to my heart, mainly because it was one of my first major tasks at Stevens Institute of Technology. I was asked to conduct a Quality Audit of an existing and currently running online course. I followed the process using the already established Quality Audit Form (seen center of the Poster).
When I provided the feedback to the Instructor, I was blown away by his response email. To say the least, it was very negative. As a new employee, I handled the situation the best I could. I assessed the situation and addressed the instructor’s concerns as diplomatically as possible. But more importantly, I realized the Instructor was also “blown away” by the Quality Audit. He was not informed that the Audit would take place or given a reason for why it was conducted.
Once I realized this, I adjusted the Quality Audit Process to include a “Pre-Audit” phase. This involves emailing the selected Instructors, informing them of the benefits of the Quality Audit, and assuring them that the information gathered would in no way be used during their employee review. Since changing the process (and since this poster was created), roughly 200 course sections have been Audited, and I have not received any negative responses to the QA taking place. Consider the lesson learned: open communication is necessary for acceptance of constructive feedback.