Validation and Moderation of Assessment

Purpose

GlobalNet is committed to undertaking effective and proactive validation and moderation of its assessment tools and practices.

Scope

This policy applies to any accredited training that GlobalNet delivers, but does NOT apply to any non-accredited training that GlobalNet provides.

Definitions

GlobalNet: GlobalNet Pty Ltd or GlobalNet ICT Pty Ltd trading as GlobalNet Academy.

Assessment: The process of collecting evidence and making judgements on whether competency has been achieved, to confirm that an individual can perform to the standard expected in the workplace, as expressed by the relevant endorsed industry/enterprise competency standards of a Training Package or by the learning outcomes of an accredited course.

Assessment tool: a means of collecting the evidence that assessors use to make judgements about whether students have achieved competency. An assessment tool includes the following components:

  • The context and conditions for the assessment
  • The tasks to be completed by the student
  • An outline of the evidence to be gathered from the student
  • The assessment criteria used to judge the quality of performance
  • Administration, recording and reporting requirements

External Parties: All people other than those employed by GlobalNet to deliver and assess the units of competency from the qualifications in the Training Package or curriculum being moderated/validated are considered to be external parties.

Moderation of assessment: the process of bringing assessment judgements and standards into alignment. It is a process that ensures the same standards are applied to all assessment results within the same units. It is an active process in the sense that adjustments to assessor judgements are made to overcome differences in the difficulty of the tool and/or the severity of judgements.

Validation of Assessment: a quality review process. It involves checking that the assessment tool produced valid, reliable, sufficient, current and authentic evidence to enable reasonable judgements to be made as to whether the requirements of the relevant aspects of the Training Package or accredited course have been met. It includes reviewing and making recommendations for future improvements to the assessment tool, process and/or outcomes.

Why we undertake validation and moderation

Our validation process is designed to ensure that we are (and remain) compliant with the requirements of the training packages we train and assess against. Training packages change from time to time, and validation is one of the processes that helps us ensure these changes are incorporated.

Transparency

Everyone associated with our organisation should know why and how we validate: the purpose, process and implications of validation and/or moderation should be transparent to all relevant stakeholders.

Representative

We take a representative sample of assessments. This allows us to identify any systemic issues with assessment practices and decisions.

Education

Our validation process will educate trainers and assessors about what is required by the units and by the organisation, thereby supporting continual learning.

Improvement

Our validation and moderation processes (where moderation is required) form an integral part of our assessment process. They help us gather constructive feedback from those involved, which leads to continuous improvement across the organisation.

Consistency

Our validation process should reduce our margin of error to acceptable levels that do not affect the overall quality of the training or the competence of the student.

Quality

Wherever possible, we will involve outside individuals and organisations familiar with the industry and the training packages so that we gain an appreciation of how they approach the training and assessment of the training packages being validated.

Validation

There are a few trigger points that can be used for validation. GlobalNet may use one or more of these to validate its training and assessment practices.

Before a course

Where we validate before training and assessment takes place, we will focus on:

  • The interpretation of the unit(s) of competency
  • The development of a common understanding of the standard to be achieved and what a competent person would look like
  • The assessment strategy and the overall design of the assessment process, including the types of assessment to be used, their location and the number of assessments
  • Our assessment tools and the evidence guides
  • The benchmarks against which learner performance is to be assessed

During a course

Where we validate during a training and assessment phase (such as mid-way through the year), the process will focus on:

  • The actual performance being undertaken by a student
  • The assessment process and the role of the assessor

After a course

Where we validate after an assessment period (such as after the end of a year), we will focus on:

  • The effectiveness of the assessment tool and the assessment process
  • The standard of performance achieved
  • The validity of the evidence collected
  • Reporting and record keeping
  • The accuracy and consistency of the assessment
  • The judgements/decisions that have been made (moderation)

Moderation

Moderation is one of our quality control processes; it helps us ensure consistency of assessment between internal GlobalNet trainers and across RTO providers. When we moderate, we will generally do so after the assessment takes place, focusing on the judgements made by assessors to ensure that the same standards are applied to all assessment results within the same units, regardless of delivery mode or location.

Confidentiality

If we access or share confidential information, we will always keep this information confidential to the validation process. This way, we do not unfairly identify assessors (i.e. those who developed the assessment tools and/or made the judgements) or students (i.e. those whose evidence is submitted in the process). This allows us to focus on the quality of the assessment tools and the assessment judgements rather than on the individuals involved.