INSOURCES BLOG

By Jack Phillips, PhD, and Patti Phillips, PhD

Before training evaluation begins, program objectives must be developed. Program objectives are linked to the needs assessment. Let's look at how objectives can be developed within the context of the five-level ROI framework. Setting clear objectives at each level is important because success with training objectives helps answer basic questions regarding program effectiveness.

Level 1 Objectives (Reaction, Satisfaction, and Planned Action)
These describe issues that are important to the success of the program, including facilitation, relevance and importance of content, logistics, and intended use of knowledge and skills. The best Level 1 objectives should:

  • Identify issues that are important and measurable
  • Be attitude-based, clearly worded, and specific
  • Specify how participants have changed their thinking or perceptions as a result of the program

For example, participants' perception of program relevance is a strong predictor of how well they will apply learned knowledge and skills. So a good Level 1 objective is:

  • At the end of the course, participants will perceive program content as relevant to their jobs.

Reaction data can be collected with feedback questions such as, "Was this program useful, necessary, motivational, and important to your success?" Measures of success with Level 1 Objectives include:

  • 80 percent of participants view the knowledge and skills as relevant to their daily work activity, as indicated by rating this measure at least 4.5 out of 5 on a Likert scale.
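
To make this criterion concrete, here is a minimal sketch in Python, using hypothetical ratings, of how the 80-percent check might be run:

    # Hypothetical 5-point Likert ratings for "content is relevant to my job"
    ratings = [5.0, 4.5, 4.0, 5.0, 4.5, 3.5, 5.0, 4.5, 4.5, 5.0]

    threshold = 4.5   # minimum rating that counts toward the objective
    target = 0.80     # success criterion: 80 percent of participants

    share = sum(1 for r in ratings if r >= threshold) / len(ratings)
    print(f"{share:.0%} rated relevance at {threshold} or higher (target {target:.0%})")
    print("Level 1 objective met" if share >= target else "Level 1 objective not met")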

Level 2 Objectives (Learning)
Level 2 objectives communicate expected outcomes from instruction. The best learning objectives:

  • Describe outcome-based behaviors that are observable and measurable
  • Describe competent performance that should occur as the result of learning
  • Spell out what the participant must be able to do as a result of learning

As with Level 1 objectives, Level 2 objectives should be clearly worded and specific. A typical learning objective may be:

  • At the end of the program, participants will be able to use Microsoft Word.

Sounds reasonable. But how will you know you have achieved success? You need a measure, such as:

  • Within a 10-minute time period, participants will be able to demonstrate to the facilitator the following applications of Microsoft Word with zero errors:
    • File, Save as, Save as Web Page.
    • Format, including font, paragraph, background, and themes.

Now, you can evaluate the success of learning.
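
As a minimal sketch, with hypothetical task names and results, a facilitator's pass/fail record for this timed measure could look like this:

    # Hypothetical facilitator checklist for the timed Word demonstration
    errors_per_task = {
        "File > Save As": 0,
        "File > Save as Web Page": 0,
        "Format > Font": 0,
        "Format > Paragraph": 1,   # one observed error
        "Format > Background": 0,
        "Format > Themes": 0,
    }
    minutes_taken = 9

    # Success = every task demonstrated with zero errors within 10 minutes
    passed = minutes_taken <= 10 and all(e == 0 for e in errors_per_task.values())
    print("Learning objective met" if passed else "Learning objective not met")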

Level 3 Objectives (Application and Implementation)
Where learning objectives and their success measures tell you what participants can do, Level 3 objectives tell you what participants are expected to do when they leave the learning environment. The best Level 3 objectives are observable and measurable, outcome-based, clearly worded, specific, and:

  • Emphasize applying what was learned
  • Describe the expected outputs of the training program
  • Provide the basis for evaluating on-the-job performance changes

A typical application objective might be:

  • Participants will use effective meeting behaviors

Again, you need specifics in order to evaluate success. What are effective meeting behaviors, and to what degree should participants use those skills? With Level 3 evaluation, you can also follow up to assess success with learning transfer. Here you look for barriers to application as well as enablers that support learning. Gathering data around these issues allows you to take corrective action when evidence of a problem exists.
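
As a small illustration, with hypothetical survey categories, tallying barriers and enablers from follow-up data can be as simple as:

    from collections import Counter

    # Hypothetical categorized answers from a 90-day follow-up survey
    barrier_reports = ["no time to apply", "manager did not support",
                       "no time to apply", "conflicting systems"]
    enabler_reports = ["manager coaching", "job aid available",
                       "manager coaching"]

    print("Barriers:", Counter(barrier_reports).most_common())
    print("Enablers:", Counter(enabler_reports).most_common())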

Level 4 Objectives (Business Impact)
Level 4 objectives provide the basis for the questions that you ask during the overall evaluation process. They measure the consequences of participants' applied skills and knowledge and place emphasis on achieving bottom-line results. The best Level 4 objectives:

  • Are results-based, clearly worded, and specific
  • Spell out what the participant has accomplished in the business unit as a result of the program.

A sample Level 4 Objective might be:

  • Increase market share of young professionals by 10% within nine months of new ad launch.

Success with Level 4 objectives is critical when you want to achieve a positive ROI.
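
Checking an objective like the one above means comparing the measured change with the target. Here is a minimal sketch with hypothetical market-share figures, treating the "10%" as a relative increase over the baseline:

    # Hypothetical market-share figures for the young-professionals segment
    baseline_share = 0.20       # before the ad launch
    share_at_9_months = 0.23    # measured nine months after launch
    target_increase = 0.10      # objective: a 10% increase

    actual_increase = (share_at_9_months - baseline_share) / baseline_share
    print(f"Actual increase: {actual_increase:.0%} (target {target_increase:.0%})")
    print("Level 4 objective met" if actual_increase >= target_increase
          else "Level 4 objective not met")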

Level 5 Objectives (ROI)
Level 5 objectives target the specific economic return anticipated when an investment is made in a program. Some organizations are satisfied with a 0% ROI (break-even), which says that the organization got its investment back. Remember, not all programs are suitable for ROI evaluation; typically only 5 to 10 percent of all learning programs are evaluated at this level.
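
For reference, the ROI Methodology expresses this comparison with two standard formulas, where benefits are the monetary value of the Level 4 improvement and costs are fully loaded:

    Benefit-Cost Ratio (BCR) = Program Benefits / Program Costs
    ROI (%) = (Program Benefits - Program Costs) / Program Costs x 100

A 0% ROI therefore means the monetary benefits exactly equal the fully loaded costs.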

Summary
The value of training occurs when participants react positively to an event; acquire new knowledge, information or skills; apply those on the job after a program; and, as a result of these applied actions, positively influence targeted business measures. Defining specific success measures across multiple levels of evaluation will help you move beyond simple tracking of reaction data and make it easy for your audience to understand the results reported. Each level of evaluation provides important, standalone data. Reported together, the five-level ROI framework gives data that tells the complete story of program success or failure.

Measuring ROI in Training and Development - Workshop

Connect training objectives to industry results. Use evaluation data for continuous improvement.

Measuring the effectiveness of training to ensure it meets workforce performance needs and linking training program objectives to business results are essential parts of the training cycle. They are also critical compliance requirements linked to continuous improvement and industry engagement.

This two-day workshop emphasises the Phillips ROI Methodology; participants will develop ways to apply return-on-investment techniques to learning and performance solutions.

At the end of the program, participants will be able to:

  • Link training program learning objectives to industry needs
  • Apply ROI techniques to training evaluation
  • Develop an evaluation plan that meets organisational and compliance requirements
  • Use evaluation data for continuous improvement purposes

More information about this workshop

Trainers and vocational education practitioners recognize that additional measurement and evaluation are needed. However, regardless of their motivation to pursue evaluation, they struggle with how to address it. They often ask, "Does it really provide enough benefit to make it a routine, useful tool?" "Is it feasible within our resources?" "Do we have the capability to implement a comprehensive evaluation process?" The answers to these questions often lead to debate and controversy. The controversy stems from misunderstandings about what additional evaluation can and cannot do and how it can or should be implemented in organizations. The following is a list of myths, each with the appropriate clarification:

MEASUREMENT AND EVALUATION, INCLUDING ROI, IS TOO EXPENSIVE. When considering additional measurement and evaluation, cost is usually the first issue to surface. Many practitioners think that evaluation adds cost to an already lean budget that is regularly scrutinized. In reality, a comprehensive measurement and evaluation system can be implemented for less than 5% of the total direct learning and development or performance improvement budget.

EVALUATION TAKES TOO MUCH TIME. Parallel with the concern about cost is the time involved in evaluation: time to design instruments, collect data, process the data, and communicate results to the groups that need them. Dozens of shortcuts are available to help reduce the total time required for evaluation.

SENIOR MANAGEMENT DOES NOT REQUIRE IT. Some learning and development staff think that if management does not ask for additional evaluation and measurement, the staff does not need to pursue it. Sometimes, senior executives fail to ask for results because they think that the data are not available. They may assume that results cannot be produced. Paradigms are shifting, not only within the learning and performance improvement field, but within senior management groups as well. Senior managers are beginning to request higher-level data that shows application, impact, and even ROI.

MEASUREMENT AND EVALUATION IS A PASSING FAD. While some practitioners regard the move to more evaluation, including ROI, as a passing fad, accountability is a present concern. Many organizations are being asked to show the value of their programs, and studies show this trend will continue.

EVALUATION ONLY GENERATES ONE OR TWO TYPES OF DATA. Although some evaluation processes generate a single type of data (reaction data, for example), many evaluation models and processes generate a variety of data, offering a balanced approach based on both qualitative and quantitative measures. The process described in this book collects as many as seven different types of qualitative and quantitative data, within different timeframes and from different sources.

EVALUATION CANNOT BE EASILY REPLICATED. With so many evaluation processes available, this is an understandable concern. In theory, any process worth implementing should be one that can be replicated from one study to another. Fortunately, many evaluation models offer a systematic process with guiding principles or operating standards that increase the likelihood that two different evaluators will obtain the same results.

EVALUATION IS TOO SUBJECTIVE. Subjectivity of evaluation has become a concern, in part because of studies based on estimates and perceptions that have been published and presented at conferences. The fact is that many studies are precise and are not based on estimates. Where estimates are used, they usually represent the worst-case scenario or approach.

IMPACT EVALUATION IS NOT POSSIBLE FOR SOFT-SKILL PROGRAMS. This concern is often based on the assumption that only technical or hard skills can be evaluated, not soft skills. For example, practitioners might find it difficult to measure the success of leadership, team-building, and communication programs. What they often misunderstand is that soft-skills learning and development programs can, and should, drive hard-data items, such as output, quality, cost, and time.

EVALUATION IS MORE APPROPRIATE FOR CERTAIN TYPES OF ORGANIZATIONS. Although evaluation is easier for certain types of programs, it can generally be used in any setting. Comprehensive measurement systems are successfully implemented in health care, nonprofit, government, and educational organizations, in addition to traditional service and manufacturing organizations. Another concern expressed by some is that only large organizations need measurement and evaluation. Although this may appear to be the case (because large organizations have large budgets), evaluation can work in the smallest organizations; it simply must be scaled to fit the situation.

IT IS NOT ALWAYS POSSIBLE TO ISOLATE THE EFFECTS OF LEARNING. Several methods are available to isolate the effects of learning on impact data. The challenge is to select an appropriate isolation technique for the resources available and the accuracy needed in the particular situation.
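
One commonly cited isolation technique is the participant estimate adjusted for the participant's confidence in that estimate. Here is a minimal sketch with hypothetical figures:

    # Hypothetical participant estimate: a $10,000 monthly improvement,
    # with 60% of it credited to the program at 80% confidence
    monthly_improvement = 10_000
    attribution = 0.60
    confidence = 0.80

    # Conservative (adjusted) value credited to the program
    isolated_value = monthly_improvement * attribution * confidence
    print(f"Value credited to the program: ${isolated_value:,.0f}")  # $4,800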

A PROCESS FOR MEASURING ON-THE-JOB IMPROVEMENT SHOULD NOT BE USED. This myth is believed because the learning and development staff usually has no control over participants after they leave the program. Belief in it is fading, though, as organizations realize the importance of measuring the results of workplace learning solutions. Expectations can be created so that participants anticipate a follow-up and provide data.

A PARTICIPANT IS RARELY RESPONSIBLE FOR THE FAILURE OF PROGRAMS. Too often, participants are allowed to escape accountability for their learning experience. It is too easy for a participant to claim that the program was not supported by their manager, that it did not fit the culture of the work group, or that the systems and processes were in conflict with the skills presented in the program. Today, participants are held more accountable for the success of learning in the workplace.

EVALUATION IS ONLY THE EVALUATOR'S RESPONSIBILITY. Some organizations assign an individual or group the primary responsibility for evaluation. When that is the case, other stakeholders assume that they have no responsibility for evaluation. In today's climate, evaluation must be a shared responsibility. All stakeholders are involved in some aspect of analyzing, designing, developing, delivering, implementing, coordinating, or organizing a program.

SUCCESSFUL EVALUATION IMPLEMENTATION REQUIRES A DEGREE IN STATISTICS OR EVALUATION. Having a degree or possessing some special skill or knowledge is not a requirement. An eagerness to learn, a willingness to analyze data, and a desire to make an improvement in the organization are the primary requirements. After meeting these requirements, most individuals can learn how to properly implement evaluation.

NEGATIVE DATA ARE ALWAYS BAD NEWS. Negative data provide a rich source of information for improvement. An effective evaluation system can pinpoint what went wrong so that changes can be made. Barriers to success, as well as enablers of success, can be identified. Such data generate conclusions that show what must be changed to make the process more effective.

Reference: Improving Human Performance, ASTD Press, 2012.

The complete story of a training program's success or failure

Patricia and Jack Phillips, of the ROI Institute, added return on investment (ROI) as a fifth level to the traditional Kirkpatrick evaluation levels. Under this ROI methodology for training and education practitioners, the evaluation framework includes five levels of data:

  • Level 1 Reaction, Satisfaction, and Planned Action—Data representing participants' reactions to the program and their planned actions is collected and analyzed. Reactions may include participants' views of the course content, facilitation, and learning environment. This category of data also includes data often used to predict application of acquired knowledge and skills, including relevance, importance, amount of new information, and participants' willingness to recommend the program to others.
  • Level 2 Learning—Data representing the extent to which participants acquired new knowledge and skills is collected and analyzed. This category of data also includes the level of confidence participants have in their ability to apply what they have learned.
  • Level 3 Application and Implementation—Data is collected and analyzed to determine the extent to which participants effectively apply their newly acquired knowledge and skills. This category also includes data that describes the barriers that prevent application and any supporting elements (enablers) in the knowledge transfer process.
  • Level 4 Business Impact—Data is collected and analyzed to determine the extent to which participants' applications of acquired knowledge and skills positively influenced key measures that were intended to improve as a result of the program. When reporting data at Level 4, a step to isolate the program's effects on these measures from other influences is always taken.
  • Level 5 Return on Investment—Impact measures are converted to monetary values and compared with the fully loaded program costs. You can have an improvement in productivity, for example, but to calculate ROI you must determine the monetary value of that improvement and what it cost you to achieve. If the monetary value of the productivity improvement exceeds the cost, the calculation results in a positive ROI (see the sketch after this list).
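
As a worked illustration of the Level 5 calculation, with hypothetical figures, converting an isolated productivity improvement to money and comparing it with fully loaded program costs might look like this:

    # Hypothetical annual monetary value of the productivity improvement,
    # after isolating the program's effect on the measure
    annual_benefits = 250_000
    fully_loaded_costs = 100_000   # design, delivery, participant time, evaluation

    bcr = annual_benefits / fully_loaded_costs
    roi_pct = (annual_benefits - fully_loaded_costs) / fully_loaded_costs * 100
    print(f"BCR: {bcr:.1f}:1   ROI: {roi_pct:.0f}%")   # BCR: 2.5:1   ROI: 150%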

Each level of evaluation answers basic questions regarding program success.

Level 1: Reaction, Satisfaction, and Planned Action

  • Was the program relevant to participants’ jobs and mission?
  • Was the program important to participants’ jobs and mission success?
  • Did the program provide new information?
  • Do participants intend to use what they learned?
  • Would participants recommend the program to others?
  • Is there room for improvement with facilitation, materials, and the learning environment?

Level 2: Learning

  • Did participants acquire the knowledge and skills presented in the program?
  • Do participants know how to apply what they learned?
  • Are participants confident to apply what they learned?

Level 3: Application and Implementation

  • How effective are participants at applying what they learned?
  • How frequently are participants applying what they learned?
  • If participants are applying what they learned, what is supporting them?
  • If participants are not applying what they learned, why not?

Level 4: Business Impact

  • So what if the application is successful?
  • To what extent did application of learning improve the measures the program was intended to improve?
  • How did the program affect output, quality, cost, time, customer satisfaction, employee satisfaction, and other measures?
  • How do you know it was the program that improved these measures?

Level 5: ROI

  • Do the monetary benefits of the improvement in business impact measures outweigh the cost of the program?

The reason for referring to evaluation data as levels is that it facilitates managing and reporting the data. More important, however, these five levels present data in a way that makes it easy for the audience to understand the results reported. Each level of evaluation provides important, stand-alone data. Reported together, the five-level ROI framework represents data that tells the complete story of program success or failure.

Reference: Return on Investment (ROI) Basics, Patricia Pulliam Phillips and Jack J. Phillips, ASTD Press, 2005.
