Program and Course Review Procedure

Section 1 - Context

(1) This procedure details the rules and governance pathways for program and course reviews.

Section 2 - Authority

(2) The authority for this document is established by the Program and Course Policy.

Section 3 - Scope

(3) In scope are all Higher Education (HE) and Vocational Education (VE) programs and courses offered by the RMIT Group, including programs delivered by controlled entities and offshore partners, and non-award courses (i.e. micro-credentials and Future Skills) that are embedded in award courses.

(4) Not included in scope, unless specified otherwise, are programs or courses that are administrative in nature, i.e. non-award (NONA career), cross-institutional enrolment, exchange, Study Abroad and inactive programs.

Section 4 - Procedure

(5) RMIT’s Centre for Education Innovation and Quality (CEIQ) provides leadership, support and guidance to staff completing reviews of programs and courses in Higher Education (HE). The College of Vocational Education (CoVE) Quality and Compliance team leads the review process for Vocational Education (VE). The School of Graduate Research is responsible for quality assurance for all Higher Degree by Research programs and courses.

(6) RMIT’s program and course review processes are designed to meet the requirements of the Higher Education Standards Framework (Threshold Standards) 2021 (HESF), the Standards for Registered Training Organisations (RTOs) 2015 and other regulatory standards as applicable.

(7) The RMIT HE program and course review process comprises three components:

  1. ongoing course monitoring
  2. Annual Program Reviews
  3. Comprehensive Program Reviews.  

(8) Vocational Education performs an Annual Review of the Training and Assessment Strategy for each educational training product and a five-year cyclical assessment validation for all nationally accredited training products on the RMIT Scope of Registration.

Ongoing Course Monitoring – HE (coursework) 

Ongoing Course Monitoring

(9) Each course or unit of learning is reviewed through a cyclical monitoring process.

  1. CEIQ monitors courses or units of learning using student feedback data, and profiles courses and units of learning in accordance with agreed internal standards and metrics.
  2. The college Associate Deputy Vice-Chancellor Learning and Teaching (or equivalent) uses the course profiles to prioritise courses for the development of a Course Action Plan.
  3. College Quality Units report on ongoing course enhancements to gauge their impact.
  4. CEIQ evaluates Higher Education outcomes and reports to the Higher Education Committee.

Arrangements for Ongoing Course Monitoring

(10) Some courses or units of learning may be reviewed more frequently than others depending on factors including, but not limited to:

  1. the agreed internal standards, metrics and triggers for monitoring
  2. RMIT or college strategies
  3. student feedback
  4. feedback from within the sector or related industry.

Annual Program Review – HE (coursework)

Annual Program Review

(11) Each program within scope participates in an Annual Program Review, which ensures continuous program monitoring.

  1. CEIQ maintains the Annual Program Review schedule in consultation with colleges and schools.
  2. Program Managers (or equivalent) direct their Program Review Team in responding to the Terms of Reference.
  3. College Quality Units must obtain college-level endorsement from the Dean/Head of School (or delegate) and the college Associate Deputy Vice-Chancellor Learning and Teaching (or delegate) before submission to CEIQ.
  4. CEIQ monitors and evaluates submissions and reports outcomes to the Programs Committee and Academic Board.
  5. Outcomes of this report will reflect the following:
    1. Complete/met standards: The submission responded fully to the Terms of Reference and provided evidence on most or all key academic indicators.
    2. Monitor: The submission is missing evidence and/or reflection as specified by the Terms of Reference (e.g., missing evidence of Australian Qualifications Framework (AQF) mapping, external validation, performance enhancement plan or update, etc.)
      1. CEIQ and the College Quality Unit agree on the timeline for updates and close out. This information is included in the submission to sub-committees and the Academic Board.
    3. Escalation: The submission is missing evidence and reflection on key HESF standards embedded in the Terms of Reference (e.g. missing evidence of AQF mapping, external validation, performance enhancement plan update, etc.) and has not addressed recommendations made by CEIQ within the allocated timeframe. The submission will be reviewed in the next program review cycle.
      1. CEIQ and the College Quality Unit agree on timelines for updates and close out. This information is included in the submission to the Programs Committee and Academic Board.

Alternative Arrangements for Annual Program Reviews

(12) The Deputy Vice-Chancellor Education (or delegate) may require an out-of-cycle Annual Program Review based on:

  1. student feedback
  2. complaints
  3. changes in data (e.g. student feedback, program dashboard 2.0).

Annual Program Review of Training and Assessment Strategy – VE

(13) Program Managers are responsible for undertaking a review of the Training and Assessment Strategy (TAS) that includes:

  1. alignment to training package or accredited course requirements
  2. sufficient trainers and assessors
  3. evaluation and monitoring strategies.

(14) The CoVE Quality and Compliance team ensures a current version of the TAS is stored centrally.

Alignment to Training Package or Accredited Course

(15) Each TAS for a vocational education training product will be updated at least annually, or as required prior to delivery, to ensure the training and assessment continues to meet the training package, accredited course or unit requirements and the needs of the anticipated cohort.

Sufficient Trainers and Assessors

(16) Each TAS for a vocational education training product will be updated at least annually, or as required, to include sufficient trainers and assessors with the relevant vocational competencies and current industry skills and knowledge for the teaching period.

Evaluation and Monitoring Strategies

(17) Annual evaluation of VE programs is managed by the Program Managers and Cluster Directors with specific input based on data, consultation and feedback mechanisms.

(18) Each TAS for a vocational education training product will be updated at least annually, or as required, to include evidence of continuous improvement drawn from the following:

  1. industry engagement
  2. student feedback
  3. trainer and assessor feedback
  4. program and cohort data.

Program and Cohort Data

(19) Program Managers are responsible for examining program and cohort data and, where relevant, using this data to inform the TAS. The data can include but is not limited to:

  1. student demographic data
  2. Language, Literacy and Numeracy level
  3. completion data
  4. retention data including Early Warning Signs (EWS).

(20) Cluster Directors are responsible for assuring that complaints and appeals are managed using the Student and Student-Related Complaints Policy suite and Assessment, Academic Progress and Appeals Regulations as appropriate. 

Vocational Education Cyclical Review of Assessments

(21) The Associate Deputy Vice-Chancellor Learning and Teaching VE (or delegate) is responsible for ensuring that the validation schedule is managed in accordance with the ASQA Standards.

(22) Cluster Directors are responsible for ensuring assessment validations are conducted in line with the requirements of the ASQA Standards and that the identified outcomes from validations are completed in a timely manner.

(23) Program Managers are responsible for planning and conducting validation of assessment practices and judgements, which includes selecting units to be validated, arranging a panel and scheduling meetings.

Evidence Thresholds for Annual Program Reviews – HE (coursework)

(24) The Annual Program Review Terms of Reference are designed to meet a minimum threshold of evidence.

(25) Evidence and reflection must be included in the program review submissions where required by the Annual Program Review Terms of Reference.

Evidence for Program Enhancement Plan Update

(26) Program Managers are responsible for showing evidence of having actioned their program enhancement plans, if applicable, by reporting a status update and commentary for each goal. Status updates include:

  1. not yet started
  2. in progress
  3. complete
  4. no longer applicable.

(27) CEIQ monitors the status updates and reports them to the Programs Committee and Academic Board. Program Managers (or equivalent), in collaboration with their College Quality Unit, identify when a program will not complete an Annual Program Review and submit a notification to CEIQ, endorsed by the College Associate Deputy Vice-Chancellor Education - Learning, Teaching & Quality (or equivalent), where:

  1. the program is approved for discontinuation, and
  2. the program has fewer than ten (10) students.

Evidence for Reflective Data Check

(28) Program Managers (or equivalent) are responsible for showing evidence of and reflecting on year-on-year data sets from the Program Review Dashboard as required by the Terms of Reference.

(29) Data inputs include, but are not limited to, student success, student outcomes, enrolment trends, program experience, student demand and strategic alignment, and external benchmarking where available.

(30) Program Managers (or equivalent), if required by the Terms of Reference, must provide a written submission that uses data and reflection to consider program strengths and potential actions for improvement.

Comprehensive Program Review – HE (coursework)

Comprehensive Program Review

(31) Each program within scope participates in the Comprehensive Program Review at least once within a seven-year cycle to meet HESF requirements.

  1. CEIQ maintains the Comprehensive Program Review schedule in consultation with the program owner/college.
  2. Program Managers (or equivalent) lead their Program Review Team in responding to the Terms of Reference, which includes providing evidence, a reflection on pedagogy and curriculum, and program dashboard data. The response will include the development of a program enhancement plan.
  3. Program Managers submit formatted responses guided by the Comprehensive Program Review Terms of Reference and Coversheet (e.g., evidence, equivalence and comparability, forum guidance).
  4. College Quality Units ensure that college-level endorsement of each review is obtained from the college Dean/Head of School and Associate Deputy Vice-Chancellor Education - Learning, Teaching & Quality (or equivalent) and submitted to CEIQ.
  5. CEIQ monitors and evaluates submissions and reports outcomes and recommendations for re-accreditation to the Programs Committee and Academic Board.
  6. Where critical issues have been identified, the college will be given up to 12 months to resolve them and resubmit a complete Comprehensive Program Review before re-accreditation is sought.

(32) Outcomes include the recommended re-accreditation status of a program, per the categories below, as recommended by the Deputy Vice-Chancellor Education for endorsement by the Programs Committee:

  1. re-accreditation of all programs with a completed Comprehensive Program Review for a period of seven years. Assurance documentation will include the Comprehensive Program Review report for the program.
  2. commencement of the discontinuation process of a program when a Comprehensive Program Review has not been completed in the required timeframe of seven years. Assurance documentation will include all Comprehensive Program Review documents for the program, accompanied by the reasons for the proposed discontinuation. Any required teach-out plans must be submitted to the Programs Committee by the relevant College Associate Deputy Vice-Chancellor Learning and Teaching/Dean/College Deputy Vice-Chancellor.

(33) The Programs Committee endorses the programs to be re-accredited. 

(34) The Academic Board approves the programs to be re-accredited.

Alternative Arrangements for Comprehensive Program Reviews

(35) The Deputy Vice-Chancellor Education (or delegate) may require an out-of-cycle Comprehensive Program Review based on:

  1. student feedback
  2. complaints
  3. changes in data (e.g. student feedback, program dashboard).

(36) CEIQ advises the Programs Committee and Academic Board of any changes to the program review schedule.

(37) The overview of the Program Review process is located on the Educator Resource Hub.

Evidence Thresholds for Comprehensive Program Reviews – HE (coursework)

(38) CEIQ maintains the Comprehensive Program Review Terms of Reference to meet a minimum threshold of evidence.

(39) The Terms of Reference holistically require Program Managers to provide evidence of, and academic reflection on, areas including but not limited to:

  1. curriculum and pedagogy
  2. reflective data check
  3. student feedback
  4. industry engagement
  5. external referencing and benchmarking, inclusive of peer review (refer to the Program and Course External Referencing and Benchmarking Procedure)
  6. equivalence and comparability of academic standards in multiple locations and modes (see the Educator Resource Hub)
  7. staff qualifications and scholarship.

(40) Evidence and reflection must be included in the program review submissions where required by the Comprehensive Program Review Terms of Reference.

(41) Evidence for Comprehensive Program Reviews is collected on an ongoing basis throughout the life-cycle of the program.

Evidence for Reviewing Curriculum and Pedagogy

(42) Program Managers (or equivalent) are responsible for showing evidence that the program’s curriculum and pedagogy are in line with the appropriate AQF level.

(43) Program Managers (or equivalent) must submit evidence and reflection on the mapping of:

  1. Program Learning Outcomes
  2. Course Learning Outcomes
  3. Assessment.

(44) Program Managers (or equivalent) must submit evidence and reflection of having operationalised:

  1. current RMIT program and course design principles or frameworks (refer to the Program and Course Design Procedure - Higher Education Coursework)
  2. current RMIT teaching and learning principles or frameworks (refer to the Centre for Education Innovation and Quality (CEIQ)).

Evidence for Reviewing Student Feedback – HE and VE

(45) Program Managers (or equivalent) are responsible for showing evidence of using student feedback.

(46) Student feedback is any judgement or opinion formed by students regarding their experience of RMIT, expressed through formal mechanisms such as program and course level surveys and student-staff consultative committees, or informal mechanisms such as focus groups, other methods of local data collection, and monitoring of social media.

(47) The systematic collection, use and reporting of student feedback is performed to monitor and improve the quality of the student experience. Student feedback will be used to:

  1. improve the quality of programs and courses through the development of program enhancement plans
  2. enhance program and course design and the connection between courses in a program
  3. support the scholarship of learning and teaching, including professional development programs
  4. improve the provision of learning resources, facilities, equipment and services through the development of improvement plans.

(48) Vocational Education Program Managers are responsible for incorporating student feedback into the Training and Assessment Strategy.

Surveys – HE and VE

(49) The Dean/Head of School/Cluster Director and relevant service area directors/managers are responsible for ensuring that relevant student feedback is systematically shared and actioned in their area, so that Course Coordinators and Program Managers can take timely, informed actions for improvement.

(50) The following surveys are the standard instruments for capturing student feedback about courses, programs and the broader RMIT experience:

  1. The Course Experience Survey (CES) is designed to capture feedback about students’ learning experiences within a particular course.
  2. The Student Experience Survey (SES) (HE) and the Learner Questionnaire (LQ) (VE) are designed to capture feedback from undergraduate, postgraduate coursework and vocational education students regarding their program experience and broader RMIT experience, including services and facilities.
  3. The Graduate Outcomes Survey (GOS) (HE) and the Student Outcomes Survey (SOS) (VE) capture information about graduate outcomes to track employability and further study pathways. A longitudinal Graduate Outcomes Survey is administered three years after graduation. The SOS is administered three months after the end of the program.

(51) Staff will seek student feedback in all locations to ensure an accurate representation of the RMIT cohort. The feedback must be in a form that can be captured, analysed and reported every time a course is delivered, where appropriate. The RMIT Student Surveys Team may facilitate, with Data and Analytics and relevant college/school representatives, the selection of courses or units for survey purposes. The standard RMIT survey instruments will be used.

(52) Where a non-standard survey instrument is used to collect feedback for a cohort (e.g., for more than one course), approval must be sought from the Student Surveys Unit before it can be used.

(53) Where the standard survey instrument is inappropriate for specific delivery modes (e.g., non-classroom based) or the needs of specific student cohorts, alternative student feedback mechanisms may be deployed. Where a non-standard survey instrument is used to collect feedback, approval must be sought from the Student Surveys Unit before it can be used.

(54) Where appropriate, the Student Feedback Team can be consulted on ethical, compliance and privacy issues that may arise through the use of non-standard or bespoke survey instruments. Stakeholders for consultation may include the Research Integrity Office and Education Regulation Compliance and Audit (ERCA).

Student Staff Consultative Committees – HE and VE

(55) Deans/Heads of School/Cluster Directors (or equivalent) are responsible for ensuring that all coursework programs or training products that are at least one semester in length have representation on Student Staff Consultative Committees (SSCCs). 

(56) The Program Manager (or equivalent) is responsible for organising and operationalising the SSCC and for:

  1. keeping a record of SSCC feedback from each teaching period of program delivery
  2. showing evidence of using feedback from the SSCC for program/course review and enhancement.

(57) The Program Manager/Course Coordinator (or equivalent) invites students in coursework programs to volunteer to be student representatives at the start of the teaching year and at the start of each teaching period where mid-year/flexible intake applies. This invitation is given in the first session of a core course in each year level of the program.

  1. The Program Manager and Course Coordinator (or equivalent) explain the purpose, benefits and attendance requirements of SSCCs.
  2. The Program Manager (or equivalent) ensures, if possible, that volunteers comprise a representative sample, for example a mix of genders and of domestic and international students.
  3. Enough students should be encouraged to volunteer to ensure that student numbers at least match those of staff at the student feedback meetings.
  4. The Program Manager (or equivalent) is responsible for overseeing the setting of meeting agendas, recording minutes of the meeting, using the SSCCs minutes template and/or the issues/actions log, and making this information available to all students.
  5. SSCC meetings concentrate on suggesting improvements to the student experience of learning and teaching in programs, including feedback on facilities, information technology and administration.

(58) Where invitations to students in a program to join the SSCC have been met with insufficient response, a digital channel, such as Canvas and/or email, must be provided to all students to give feedback that aligns with the SSCC timelines and feedback requirements.

(59) Deans/Heads of School/Cluster Directors (or equivalent) are responsible for ensuring that the coursework program has sufficient student feedback where student attendance at SSCCs could not be met.

(60) The feedback obtained via a digital channel must be:

  1. a representative sample from the student cohort
  2. collated and summarised into relevant themes
  3. addressed by the Program Manager (or equivalent)
  4. circulated back to students with the actions taken
  5. provided as evidence of using student feedback as part of the program/course review and enhancement.

Please see the Educator Resource Hub for resources that support SSCCs.

Evidence for Reviewing Industry Partnered Engagement – HE and VE

(61) Each HE coursework program, or discipline, has an Industry Advisory Committee. A single Industry Advisory Committee may be convened to support all programs in the same discipline. Where an Industry Advisory Committee is not fit for purpose, evidence of industry engagement should be provided in accordance with clauses (62) and (63) below.

(62) Program Managers (or equivalent) are responsible for showing evidence of industry partnered engagement. Industry partnered engagement can include:

  1. partnering with sector employers, RTOs, relevant industry bodies or businesses
  2. exchanging staff, resources and knowledge with industry bodies or networks
  3. employer surveys and feedback.

(63) The Dean/Head of School/Cluster Director (or equivalent) responsible for the program oversees the operationalising of the Industry Advisory Committee. This includes setting meetings and agendas and recording meeting minutes, including issues and actions. Formal reports are forwarded to the Programs Committee or other relevant committees of the college or of RMIT.

(64) Higher Education Industry Advisory Committees advise on matters associated with program design, delivery and review, in particular:

  1. recommendations on proposed program developments
  2. student demand and the community need for the program
  3. prospective employment opportunities for graduates of the program
  4. the extent to which the program offered meets its stated aims and objectives
  5. key relationships among RMIT, employers and the profession
  6. the resources required for program delivery
  7. the accreditation and re-accreditation of RMIT programs by external bodies
  8. research and development activities and relevant consultation with external bodies
  9. matters associated with development, delivery and assessment.

(65) Industry engagement in VE ensures alignment with industry trends and informs the development and ongoing review of VE qualifications, including:

  1. helping design training and assessment strategies
  2. selecting suitable resources
  3. seeking feedback about how training and assessment will be provided, and
  4. confirming trainers and assessors have current industry skills.

(66) For VE programs, a range of industry engagement strategies must be adopted to provide evidence that multiple stakeholders have been consulted over a period of time.

Evidence for Reviewing External Validation and Benchmarking – HE

(67) Program Managers (or equivalent) are responsible for showing evidence in response to the Program and Course External Referencing and Benchmarking Procedure.

(68) CEIQ interprets, designs and maintains the Program and Course External Referencing and Benchmarking Procedure, including:

  1. working with the Data and Analytics team and other relevant units to ensure robust benchmarking across the sector
  2. supporting external peer review with College Quality Teams and Program Managers.

(69) Program Managers (or equivalent) must show evidence of the degree to which they look beyond their own courses and programs. By responding to the RMIT Program and Course External Referencing and Benchmarking Procedure in the Comprehensive Program Review Terms of Reference, the following will be addressed:

  1. course enhancements
  2. internal and external benchmarking in accordance with the Program and Course External Referencing and Benchmarking Procedure.

Evidence for Equivalence and Comparability of Academic Standards in Multiple Locations and Modes – HE and VE

(70) Program Managers (or equivalent) are responsible for showing evidence of equivalence and comparability of academic standards and quality. Evidence and reflection on this must be included in the HE program review submissions where required by the Comprehensive Program Review Terms of Reference or Annual Program Review Terms of Reference. 

(71) The equivalence and comparability framework comprises factors that define equivalence, comparability and customisation across different offerings of programs. For resources and instructions on how to respond to the RMIT Equivalence and Comparability Framework, see the Educator Resource Hub.

  1. Equivalence factors are those that ensure academic standards are evidenced through compliance with relevant RMIT strategic directions, policies, procedures and guidelines and external quality assurance frameworks, including the Higher Education Standards Framework (Threshold Standards) 2021 and the Standards for Registered Training Organisations (RTOs) 2015.
  2. Comparability allows for contextualisation and customisation to take account of local factors and to meet the needs of specific student cohorts.
  3. Customisation aligns the learning design of a course offering and the media used for the presentation of materials with the students’ profile to promote effective learning.

(72) Program Managers (or equivalent) are responsible for including program data for all locations in their program review submissions, including but not limited to:

  1. attrition
  2. completion
  3. grades/pass rate/competency
  4. moderation reports
  5. retention
  6. student feedback.   

Student Feedback for Programs Delivered in Conjunction with Partners – HE and VE

(73) Appropriate feedback processes and calendars are endorsed by the college Associate Deputy Vice-Chancellor Learning and Teaching (or equivalent) and appropriate personnel from partner organisations.

(74) Partner contract negotiations will include standard survey instruments offered by RMIT, with results provided to course, school/industry cluster and college leaders, to ensure that:

  1. contractual obligations in relation to student feedback are considered and met
  2. student feedback procedures already in place at partner institutions are considered
  3. the type and conduct of student feedback is appropriate given the cultural context of delivery
  4. the use of student feedback results is aligned with RMIT’s and partner institutions’ student feedback policies
  5. outcomes from student feedback are communicated to students and partners
  6. regular monitoring of improvements occurs. 

(75) The International and Engagement Portfolio and the school/industry cluster partner manager, in conjunction with the Associate Dean Learning and Teaching, Associate Deputy Vice-Chancellor Learning and Teaching (or equivalent) and CEIQ, are responsible for the consultation, development and implementation of partner feedback processes. See the Partnered Delivery of Coursework Awards Guideline.

Higher Degrees by Research

(76) The School of Graduate Research coordinates the annual and comprehensive reviews for Higher Degree by Research programs under the governance of the Graduate Research Committee and the Research Committee.

(77) The Higher Degree by Research Program Review Cycle determines the cadence and focus areas for annual and comprehensive reviews in alignment with the Higher Education Standards Framework (Threshold Standards) 2021.

(78) Comprehensive reviews for Higher Degree by Research programs are held every five to seven years with a review panel comprising internal and external members.

(79) Higher Degree by Research programs are reviewed in clusters relating to relevant schools and/or colleges.

(80) Data sources for program reviews may include, but are not limited to:

  1. student performance data
  2. external referencing and benchmarking
  3. student feedback and stakeholder consultation
  4. student complaints
  5. program and course guides
  6. professional accreditation standards
  7. government and industry reports.

(81) Higher Degree by Research Delegated Authorities are responsible for showing evidence of using student feedback in program review documentation.

(82) Student feedback is any judgement or opinion formed by students regarding their experience of RMIT, expressed through formal mechanisms such as program and course level surveys and student-staff consultative committees, or informal mechanisms such as focus groups and social media.

(83) Higher Degree by Research candidate feedback collated through formal RMIT surveys will comply with the relevant clauses of this procedure.

(84) Schools are required to develop and report on program enhancement plans addressing recommendations made by review panels and/or the School of Graduate Research during program reviews.

(85) The Associate Deputy Vice-Chancellor Research Training and Development (or nominee) may require an out-of-cycle program review based on:

  1. candidate feedback
  2. formal complaints
  3. changes in data.

(86) School of Graduate Research monitors and evaluates program review submissions and report outcomes, including recommendations to re-accredit higher degree by research programs to the Graduate Research Committee, Research Committee and Academic Board.