Over the last 10 years, the vocational education and training sector has been presented with a set of so-called “reforms”. Instead of improving the system, they have weakened the delivery of education and training. A long list of government agencies and bureaucrats has consistently missed the point when applying new regulatory frameworks.
Our sector has been demoralised by the consequences of such incompetence and the future reputation of VET has been jeopardised.
What went wrong?
Many factors have contributed to the situation we now face. In this post, I would like to discuss the ones I consider critical:
- Lack of strategic vision
- No strategy to build up capabilities to support the system
- Lack of effective consultation
- Ineffective regulation, and
- Lack of monitoring and evaluation.
Lack of strategic vision
No one will argue against the importance of pursuing a national vocational education and training system, or against the benefits of incorporating a variety of flexible and competitive private providers into it, but I question whether there was ever a strategic plan to make it happen.
In the era before training packages, our training system delivered effective training solutions for the traditional trades, based on a curriculum approach closely linked to occupational standards.
Training packages can help us expand VET into non-traditional occupations and meet the demand of current and future workplaces for competent people in those occupations.
One problem is that training packages were enforced without considering whether we were ready, or even able, to use them. Some VET professionals still question the need for training package-based qualifications for a trade certificate, or for competency-based training in a course that leads to a licence. RTOs find it challenging to unpack a unit of competency that leads to a licence when the industry regulator will accept no interpretation other than its own criteria for granting the licence (as it should).
Moving from a curriculum-based to a training package-based system
There is no doubt that training packages provide the basis for a world-class vocational education system, but implementing that system has looked more like a third-world project (with all due respect to third-world countries).
What is the plan to build the capabilities required to interpret, write and use training packages? How are we monitoring and measuring the results of that plan?
To use the VET system, regulators, employers, students and other stakeholders must understand it, and the system has been built on two pillars: training packages and the VET Quality Framework.
This required step in the implementation plan has been neglected, with awful consequences. The people interpreting, writing, using and regulating training packages do not understand them, including staff within RTOs, “qualified trainers”, people working for Skills Service Organisations and IRCs, regulators, employers, and students.
When I work with managers of RTOs, qualified trainers and the VET regulator’s auditors, I come across a significant number of professionals who do not understand how to unpack training packages, apply the principles of assessment, collect assessment evidence in line with the rules of evidence, evaluate training, or assess the effectiveness of a quality management system.
I also come across a significant number of employers who do not understand how qualifications and units of competency relate to specific workplace outcomes, and students who cannot identify a relevant pathway within the VET sector to meet their own needs (especially for non-trade qualifications).
A VET regulatory approach with no VET sense
The regulation of VET must be based on the quality of training and assessment practices and outcomes; in other words, on the evaluation of training practices and results.
The VET Quality Framework is our version of ISO 9001 applied to the VET sector. It is designed to regulate the operation of RTOs and their core functions, which can be organised into three main domains:
- Training and assessment practices
- Dealing with students, and
- Administration and governance.
Training and assessment practices cover the design, delivery and assessment of training products, and the results of that training. Effective regulation requires expertise in competency-based education, instructional design and training evaluation principles. The objective of regulating training and assessment practices is to ensure training products are relevant to industry and students, and produce measurable outcomes.
In reality, the current regulatory framework is based on paperwork, not on results, and the regulator’s auditors are not trained in competency-based education, instructional design, or training evaluation.
The regulation of interactions between RTOs and students is based on universal consumer rights and contractual obligations, and the objective is to protect students’ rights. In reality, regulators have not demonstrated competence in protecting these rights: illegal activities have gone undetected by VET regulators, only to be uncovered by other government agencies after the issues escalated.
The regulation of an RTO’s governance and administration is based on general principles of quality management systems, and the objective is to ensure the RTO can control its own operation within the established legal requirements. Even though this is the area where regulators have performed at their best, it is inexplicable how the number of providers has grown at unsustainable rates while flying under the regulator’s radar.
More regulation, more ineffective regulation
The government has been trying to improve the sector’s performance by introducing more regulation, without properly analysing the situation or understanding the real performance issue.
The government seems more concerned with punishing non-compliant providers than with building the capabilities required to deliver quality training and education. More regulation will not build capabilities; more regulation will not improve quality.
We need more collaboration, participation and evaluation.
In the VET system everybody works in isolation, and we need a more collaborative environment. RTOs need more platforms and initiatives that promote collaboration, where they can work on common issues such as keeping current with contemporary training and education techniques, technology applied to education and learning, leadership, talent development, quality management and international markets. Perhaps our peak bodies should spend less than 100% of their time lobbying the government and devote some of it to setting standards through case studies, doing research, and promoting collaboration.
Lack of collaboration is an issue not only for RTOs but also among Industry Skills Councils. This became clear when examining the inconsistent quality of different training packages, and it is not clear how the situation will improve in the new environment of SSOs.
We need more and better industry participation. Without it, the results of training cannot be measured in terms of how the skills and knowledge taught are applied, or their potential effect on industry. There are no avenues for industry to participate in the evaluation of training beyond the industry engagement requirement in the Standards, which has failed to properly align training objectives with industry needs. Industry should participate in the evaluation and regulation of RTOs, and policy makers should consider this.
The diversity of VET’s applications makes its regulation a complex task. Relevant independent bodies should support the VET regulators in measuring training and assessment results. These independent industry bodies should include industry regulators, RTOs, employers, employees’ associations, unions, and relevant community organisations.
Finally, VET in Australia should adopt a consistent, credible and comparable framework to evaluate training results, in line with current international standards: an evaluation framework that produces the data that instructional designers, trainers, students, industry and government need and understand. Here we should look at the Kirkpatrick and Phillips models of evaluation, but that is a topic for another post.
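For readers unfamiliar with these models, a rough sketch only (my own summary, not a sector standard): Kirkpatrick evaluates training at four levels (Reaction, Learning, Behaviour and Results), and Phillips adds a fifth, return on investment, typically calculated as:

ROI (%) = (net program benefits ÷ program costs) × 100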