
Level 2 Learning Objectives
There is increased interest in evaluating the acquisition of knowledge and skills, driven by growth in the number of learning organizations, the emphasis placed on intellectual capital, and the increased use of certifications as a discriminator in the selection process. Given these drivers, Level 2 objectives should be well defined.

Level 2 objectives communicate expected outcomes from instruction; they describe competent performance that should be the result of learning. The best learning
objectives describe behaviors that are observable and measurable. As with Level 1 objectives, Level 2 objectives are outcome based. Clearly worded and specific, they
spell out what the participant must be able to do as a result of learning.

There are three types of learning objectives:

  1. Awareness—participants are familiar with terms, concepts, and processes.
  2. Knowledge—participants have a general understanding of concepts and processes.
  3. Performance—participants are able to demonstrate the knowledge and skills acquired.

A typical learning objective may be: "At the end of the program participants will be able to implement Microsoft Word."

Sounds reasonable. But what does successful implementation look like? How will you know you have achieved success? You need a performance measure. That performance measure is described in the Performance Criteria of Units of Competency (Training Packages) and will require an interpretation (unpacking) process to provide the level of detail needed for an effective assessment process (Level 2 evaluation).

Compare the broad objective with implementation measures (after interpretation of the Unit of Competency):

Objective: At the end of the course, participants will be able to implement Microsoft Word

Measure: Within a 10-minute time period, participants will be able to demonstrate to the trainer the following applications of Microsoft Word with zero errors:

  • File, Save as, Save as Web Page
  • Format, including font, paragraph, background, and themes
  • Insert tables, add columns and rows, and delete columns and rows

Level 3 Application and Implementation Objectives
Where learning objectives and their specific measures of success tell you what participants can do, Level 3 objectives tell you what participants are expected to do
when they leave the training environment. Application objectives describe the expected outputs of the training program. They describe competent performance that
should be the result of training and provide the basis for evaluating on-the-job performance changes. The emphasis is placed on applying what was learned.

The best Level 3 objectives identify behaviors that are observable and measurable. They are outcome based, clearly worded, and specific, and they spell out what the participant will do differently as a result of the learning.

A typical application objective might read something like this: Participants will use effective meeting behaviors.

Again, you need specifics in order to evaluate success. What are effective meeting behaviors and to what degree should participants use those skills?

Compare the application objective with measurable behaviors:

Objective: Participants will use effective meeting behaviors.

Measures:

  • Participants will develop a detailed agenda outlining the specific topics to be covered for 100% of meetings.
  • Participants will establish meeting ground rules at the beginning of 100% of meetings.
  • Participants will follow up on meeting action items within three days following 100% of meetings.

With defined measures, you now know what success looks like.

An important element of Level 3 evaluation is that this is where you can assess the success of learning transfer. Is the system supporting learning? Here you look for barriers to application as well as supporting elements (enablers). It is critical to gather data around these issues so that corrective action can be taken when evidence of a problem exists. You may ask how you can influence issues outside your control, say, when participants indicate that it is the supervisor who prevents them from applying newly acquired knowledge.

Through the evaluation process, data is developed that arms you to engage in dialogue with supervisors. Bring the supervisors into the fold; ask them for help. Tell them there is evidence that some supervisors do not support learning opportunities and that you need their advice on how to remedy the situation.

A comprehensive assessment at Level 3 provides you with tools to begin the dialogue with all stakeholders. Through this dialogue you may find that many managers, supervisors, and colleagues do not understand the role of Vocational Education and Training, nor do they have a clear understanding of the adult learning process. This is an opportunity to teach them and thereby increase their support.


Measuring Results

Program objectives are the fundamental basis for evaluation.

Program objectives drive the design and development of the program and define how to measure success. Program objectives define what the program is intended to do and how to measure participant achievement and system support of the learning transfer process. All too often, however, minimal emphasis is placed on developing objectives and their defined measures.

Defining Program Objectives
Before the evaluation begins, the program objectives must be developed. Program objectives are linked to the needs assessment. When a problem is identified, the needs assessment process begins. Assessments are conducted to determine exactly what the problem is; how on-the-job performance change can resolve the problem; what knowledge or skills need to be acquired to change on-the-job performance; and how best to present the solution so that those involved, the consumers, can acquire the knowledge and skills to change performance to solve the business problem. From here, program objectives are developed to help guide program designers and developers, provide guidance to facilitators, provide goals for participants, and provide a framework for evaluators.
Program objectives reflect the same five-level framework used in categorizing evaluation data. The key in writing program objectives is to be specific in identifying measures of success. All too often, very broad program objectives are written.
While this is acceptable in the initial phases of program design, it is the specific measures of success that drive results and serve as the basis for the evaluation.

Level 1 Reaction, Satisfaction, and Planned Action Objectives
Level 1 objectives are critical in that they describe expected immediate and long-term satisfaction with a program. They describe issues that are important to the success of the program, including facilitation, relevance and importance of content, logistics, and intended use of knowledge and skills. But Level 1 evaluation has drawn criticism, centered on the use of overall satisfaction as the measure of success. Overuse of the overall satisfaction measure has led many organizations to make funding decisions based on whether participants liked a program, only to realize later that the data was misleading.
Level 1 objectives should identify issues that are important and measurable rather than esoteric indicators that provide limited useful information. They should be attitude based, clearly worded, and specific. Level 1 objectives specify that the participant has changed in thinking or perception as a result of the program and underscore the linkage between attitude and the success of the program. While Level 1 objectives represent a satisfaction index from the consumer perspective, these objectives should also have the capability to predict program success. Given these criteria, it is important that
Level 1 objectives are represented by specific measures of success.
A good predictor of the application of knowledge and skills is participants' perceived relevance of program content. So, a Level 1 objective may be: "At the end of the course, participants will perceive program content as relevant to their jobs."
A question remains, however: "How will you know you are successful with this objective?" This is where a good measure comes in. The comparison below pairs the broad objective with the more specific measure.

Objective: At the end of the course, participants will perceive program content as relevant to their jobs.

Measure: 80% of participants rate program relevance a 4.5 out of 5 on a Likert scale.

Now, for those of you who are more research-driven, you might want to take this a step further by defining (literally) what you mean by "relevance." Relevance may be defined as:

  • knowledge and skills that participants can immediately apply in their work
  • knowledge and skills reflective of participants' day-to-day work activity.

If this is the case, the measures of success are even more detailed.


Objective: At the end of the course, participants will perceive program content as relevant to their jobs.

Measures:

  • 80% of participants indicate that they can immediately apply the knowledge and skills in their work, as indicated by rating this measure a 4.5 out of 5 on a Likert scale.
  • 80% of participants view the knowledge and skills as reflective of their day-to-day work activity, as indicated by rating this measure a 4.5 out of 5 on a Likert scale.

Success with these two measures can be reported individually, or you can combine the results of the two measures to create a "relevance index."
Breaking down objectives to specific measures provides a clearer picture of success.
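As a sketch of how such measures might be checked against survey data, the snippet below computes each measure's success rate and a simple combined "relevance index" (here, the average of the two rates). The ratings, the 80% and 4.5 thresholds, and the function name are illustrative assumptions, not a prescribed tool.

```python
# Check two Level 1 relevance measures against their targets and build a
# simple relevance index. Ratings are illustrative 1-5 Likert responses.

def pct_meeting_target(ratings, threshold=4.5):
    """Share of participants rating the item at or above the threshold."""
    return sum(r >= threshold for r in ratings) / len(ratings)

immediate_apply = [5, 4.5, 4, 5, 4.5, 5, 3.5, 5, 4.5, 5]  # measure 1 ratings
day_to_day      = [4.5, 5, 4, 4.5, 5, 4.5, 4, 5, 4.5, 5]  # measure 2 ratings

m1 = pct_meeting_target(immediate_apply)
m2 = pct_meeting_target(day_to_day)

# Report each measure against the 80% target, then combine the two
# success rates into a single index by averaging them.
relevance_index = (m1 + m2) / 2
print(f"Measure 1 met: {m1 >= 0.8} ({m1:.0%})")
print(f"Measure 2 met: {m2 >= 0.8} ({m2:.0%})")
print(f"Relevance index: {relevance_index:.0%}")
```

Whether you average the rates or report them side by side, the point is the same: the index is only as meaningful as the individual measures behind it.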

Continue in Program Objectives Part 2

Reference: ROI Basics, Patricia Phillips and Jack Phillips, ASTD Press, 2005


In recent years, we have witnessed changes in organisational accountability, especially toward investment in training, programs, and projects.

Employers and government are concerned about the value of their investment in vocational education and training. Today, this concern translates into a focus on financial impact: the actual monetary contribution of a training program.

Although monetary value is a critical concern, it is the comparison of this value with the project costs that captures stakeholders' attention and translates into ROI. "Show me the money" is the familiar response from individuals asked to invest (or continue to invest) in major projects, including training programs. At times, this response is appropriate.

At other times, it is misguided; measures not subject to monetary conversion are also important, if not critical, to most projects. A balanced profile of success is needed, one that includes qualitative and quantitative data as well as financial and non-financial outcomes. Excluding the monetary component from a success profile, however, is unacceptable to today's "show me" generation; the monetary value is sometimes required before training is approved.
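The comparison of monetary benefits with project costs can be sketched numerically. The snippet below uses the standard benefit-cost ratio and ROI percentage formulas; all figures and variable names are hypothetical illustrations, not results from any actual program.

```python
# Illustrative ROI calculation in the style the text describes: compare
# the monetary benefits of a program with its fully loaded costs.
# All figures are hypothetical.

program_benefits = 240_000  # monetary value of the program's impact
program_costs    = 150_000  # fully loaded program costs

bcr = program_benefits / program_costs                          # benefit-cost ratio
roi = (program_benefits - program_costs) / program_costs * 100  # ROI, in percent

print(f"BCR: {bcr:.2f}")   # dollars returned per dollar invested
print(f"ROI: {roi:.0f}%")  # net return after recovering costs
```

Note that the ROI percentage uses net benefits (benefits minus costs) in the numerator, so a BCR of 1.6 corresponds to an ROI of 60%.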

This issue is compounded by concern that most projects (including training programs) today fail to live up to expectations. A systematic process is needed that can identify barriers to, and enablers of, success and can drive organisational improvements.

The challenge lies in doing it: developing the measures of value, including monetary value, when they are needed and presenting them in a way that stakeholders can use them:

  • Before the project is implemented.
  • During implementation, so that maximum value can be attained.
  • During post-analysis, to assess the delivered value against the anticipated value.

The ROI Methodology™ is a process that addresses all three scenarios.

Why the ROI Certification program works

  1. Focused. The content is rich with examples, tools, techniques, case studies, and templates that make it easy to collect and analyze powerful data. Participants often leave indicating that this is the most important workshop of their professional careers.
  2. Proven. The ROI Methodology is built on application and process improvement. Beginning with the first studies in the 1970s, the process has been refined, enhancements have been added, process models have been developed, and an impressive list of applications has grown. It meets the needs of executives, professional evaluators, and users alike. Over 5,000 organisations now use this methodology to conduct ROI studies on all types of projects and programs.
  3. Practical. This workshop is based not on theory but on practical processes. The mathematics is basic. There are no confusing theories, no time-wasting trivia, and certainly no touchy-feely content in this workshop. Participants are taught how to use the methodology in their own world, designed around their own projects. They learn how to do an ROI study, and they prove it after the workshop.
  4. Grounded in reality. When it comes to analytics and ROI, it is sometimes difficult to stay realistic or relevant. This workshop is based on a proven methodology with standards that are conservative, consistent, and credible. These standards have evolved, and new ones have been added over time, all approved by the users. The methodology has been designed, shaped, modified, and enhanced by its users. All of the examples, applications, and case studies are real situations. It is nothing but reality.
  5. Cost effective. When considering the books, workshops, job aids, skills acquired, five days of valuable facilitation, online access, the right to use materials, and the designation of Certified ROI Professional, this is a bargain. Compared to other certifications, it is the most cost-effective. This is not just a one-time workshop; it is an ongoing learning opportunity.
  6. Endorsed by executives and organisations. This methodology has been approved and endorsed by top executives and Chief Financial Officers (CFOs) in many organisations; sometimes the CFO is involved in implementing the process. Over half of the Fortune 500 companies have endorsed this methodology, as have over 20 professional associations, such as the Association for Talent Development (ATD), the Society for Human Resource Management (SHRM), the International Public Manager Association (IPMA), and the Australian Society of Training and Development (ASTDI), to name a few. Many non-governmental organisations, such as the United Nations, have also endorsed it, along with over 25 national governments, including the USA, Mexico, Canada, the UK, Singapore, Australia, Chile, Italy, and Egypt. These endorsements were not sought; they came from those organisations after they saw the power of the methodology.
  7. Sought-after designation. The Certified ROI Professional (CRP) is now a sought-after designation in many professional fields, particularly in the Human Capital area. Since the first five-day certification was conducted in 1995, over 10,000 managers and professionals have participated in the ROI Certification, with 4,000 actual CRPs. This is a work-product certification, so employers and clients know that participants can conduct an ROI study. Certified ROI Professionals report that they have been able to translate the designation into new job assignments, new responsibilities, promotions, and salary increases. Some indicated that the certification has been a factor in keeping their jobs in the face of layoffs.
  8. Designed and delivered by thought leaders. This workshop was designed by the founders of ROI Institute, Jack and Patti Phillips, and is delivered by senior executives of ROI Institute. The workshop is constantly updated. Jack and/or Patti are usually involved in each of the certification workshops, along with other team members. Each facilitator has years of experience using ROI in top organisations, extensive publications, and consulting experience with a variety of audiences.
  9. Immediately applicable. The tools, processes, and skills learned in this workshop can be applied immediately. Some participants make adjustments during the workshop, modifying the processes, policies, and practices of their respective organisations. The ROI Methodology can be used to evaluate existing or new programs. Ideally, the evaluation process starts at the beginning of a program.
  10. Valuable takeaways. Participants leave with many takeaways, including 4-5 books tailored to the participant's industry or application, a detailed workbook with places for notes and actions, models and application guides, 15-20 case studies in their area of interest, at least a dozen articles, archived webinars, templates, and downloadable tools. Research generated by ROI Institute and Insources is available to participants at no cost, and membership in the exclusive ROI Institute and Insources members-only websites is provided at no charge.

For more information about this ROI Professional Certification click here




© 2019 - 2020 by Insources Group Pty Ltd. All rights reserved.