Communicating Quality Part 9
Since the launch of the Educational Quality Framework (EQF) and its associated policies and procedures, each article in this series has highlighted an aspect of the EQF in more detail. In this final article for 2019 we revisit the idea of ongoing quality monitoring and improvement, and the value of data.
Who’s afraid of data?
To a cynic, the idea of keeping a regular close eye on data – key accountability measures (KAMs), operational performance measures (OPMs), enrolment summary reports, topic demographics, pivot tables, load models and so on – may seem like 21st century bunkum. What could numbers on a spreadsheet tell us that we don’t already know? And besides, if it ain’t broke, why fix it? On the contrary, the data held within our University systems – collated, crunched and customised in the Business Analytics portal – tell an intimate story of course and topic performance, and within the figures are the students: their demographics, enrolment trends, success rates and experience. Just as the data can expose areas of deficiency or need, they can also uncover areas of excellence and predict potential sources of opportunity or challenge. Marrying up data with observation can lead to vital insights, as the hypothetical case below demonstrates.
The suite of courses in the discipline of X was scheduled for internal course accreditation. These courses represented a flagship offering by the University and were extensively resourced, publicised, and professionally regarded. This had not always been the case: the first two years of offering were extremely problematic, with significant enrolment and timetabling errors, negative student experiences, confusing topic and assessment design, and scathing SET feedback reflecting high attrition rates and poor student outcomes. Through close monitoring and evaluation of data, the course advisory group was able to make annual recommendations which led to improvements in both curriculum design and delivery across the suite of courses. The ongoing monitoring and improvement resulted in a reversal of attrition, improved student learning outcomes and the successful professional accreditation of the entire suite of courses. This information was included in the internal course accreditation submission; the suite of courses was reaccredited without issue, and the teaching team was commended on its response to and use of data to improve the courses.
Close reading and analysis of data doesn’t have to be just at the topic or course level. Data monitoring can also be useful on an individual level as suggested in the following scenarios.
Dr M has been coordinating the same topic for the past 9 years. Over the last 5 years he noticed a shift in SET feedback – where once he received largely positive comments, responses became increasingly critical of his teaching style. This occurred against a backdrop of growing student enrolment, changing demographics and rising attrition. By drilling down into his topic demographic data and undertaking professional development in teaching large and diverse cohorts, Dr M was able to adapt his topic design, assessment and pedagogy to better suit the needs of his changing student cohort. As a result of the changes, his SET feedback has become noticeably more positive, attrition rates have decreased, and student results have improved.
Professor S had been teaching for decades and nearly always received positive SET feedback. Loved by students and colleagues alike, she was well regarded as an outstanding teacher who regularly participated in academic development activities, contributed to pedagogical innovations, and was often called upon to mentor others and share insights. Professor S was also extremely humble, treating much of what she did as ‘just part of the job’. One year a colleague encouraged her to apply for a teaching award. Initially hesitant, and doubtful that evidence existed to support claims of excellence and impact, the Professor nevertheless obtained and analysed various points of data spanning her career, including SET results, student success rates, retention and attrition figures compared against College and University averages, attendance data and enrolment numbers. The strength of the data emboldened the Professor to submit a teaching award application, and she went on to win the University Teacher of the Year Award.
Data is your friend
Ongoing monitoring of data is not performance micromanaging or surveillance. Sure, the term ‘monitoring’ may conjure up images of inspection, scrutiny or shadowy reconnaissance, but this is not the intent. Here, the goal is to maintain a rapport with your data. It is not there to punish and expose but rather to support and vindicate. It is your data, and you are entitled to access it and use it to stimulate change and validate direction. Furthermore, the staff in Planning and Analytical Services are some of the friendliest folk you’ll ever meet and will work collegially with you to source and personalise relevant facts and figures to suit your needs, whether that be at the course, topic or individual level.
Analytics has the power to help higher education tackle some of its biggest challenges. Colleges and universities have access to vast stores of data from the numerous systems that run virtually every aspect of the institution, but putting the need for informed decision-making together with the available data in a way that results in useful analytics can be harder than it seems. However, it’s work that can have an enormous impact on the health and future of our institutions. (Reinitz, 2019)
Reinitz, B. (2019). Keys to an analytics future: Governance, collaboration, and communication. EDUCAUSE Review. Retrieved from https://er.educause.edu/blogs/2019/9/keys-to-an-analytics-future-governance-collaboration-and-communication
Written by Anna Smith
Project Officer, Learning and Teaching – CILT