Principles – if you don’t like ’em, we have others

Just in case people find it useful, this is a worked example of responding to that “principles” section that tends to crop up in policy consultations. Any policy idea, even the most nakedly arbitrary ideological nonsense, will have “principles” – because where would we be if we didn’t have principles?

(who said “Top Shop”?)

While it may seem to be a fairly innocuous set of “motherhood-and-apple-pie” stuff that pretty much everyone would nod through, these principles serve both to frame and to constrain the debate in and around the paper that follows. The majority of consultation responses will concentrate on later questions, as these directly impinge on institutional or organisational activity or commitment – and most people who wade through stuff like this are paid to do so by said organisation or institution.

But as I’m responding on my own account I’m more concerned with the assumptions that underpin the consultation, and less concerned with any interim projections of likely effects. This means that I’m hyper-vigilant (almost comically so) to the nuances in phrasing and meaning within these short and apparently uncontroversial statements.

There is huge value in making an independent and personal response to a consultation – and I would encourage all wonks and wonks-in-training to have a crack at a couple (HEFCE QA would be a good one; also have a look at the BIS student loan repayment threshold freeze if you fancy getting stuck in to a bit of finance). It’s a great personal learning exercise, and it can sometimes have a positive effect on national policy-making.

[for the avoidance of any doubt, what follows is an excerpt from a personal response to the QA consultation, and explicitly does not reflect the views of any organisation, grouping, political party or secret society. It is presented in the public domain (CC0), so you may reuse it without citation if you wish]

Question 1: Do you agree with our proposed principles to underpin the future approach to quality assessment in established providers?

I have responded to each principle in turn.

1. Be based on the autonomy of higher education providers with degree awarding powers to set and maintain academic standards, and on the responsibility of all providers to determine and deliver the most appropriate academic experience for their students wherever and however they study.

This principle attempts to address the new complexity of the institutional landscape in this area. Broadly “providers of HE” may or may not be approved “HE providers” – with or without institutional undergraduate and/or research degree awarding powers – and may or may not hold the title “university” (and may or may not have tier 4 sponsor status).

For the purposes of academic quality assurance it is not clear why a distinction is drawn here between “HE providers with degree awarding powers”, and “all providers”. For the latter, the designation process already requires that a particular course meets “quality” criteria via the QAA Higher Education Review and consequent annual monitoring. This process explicitly examines the ability of any provider to manage quality and academic standards.[1] The principle should surely be (as was the case until very recently) that all HE should be delivered to the same academic standards and assured to the same academic standards wherever it is delivered.

The use of “autonomy” in one case and “responsibility” in the other also exacerbates this artificial divide. The current system of QA requires that all HE delivery is supported by an institutional system that manages and ensures academic quality and academic standards and this principle should be defended and maintained.

2. Use peer review and appropriate external scrutiny as a core component of quality assessment and assurance approaches.

A purely internal system of scrutiny would not be fit for purpose in ensuring the continued high standard of English HE provision. Though internal institutional monitoring (both data-led and qualitative) will support the maintenance of standards, the “gold standard” is comparability with peers and adherence to relevant national and global requirements. The existing QAA Higher Education Review process (which is common to existing providers and new entrants) directly ensures that peers from across the sector are involved in making a judgement on institutional quality assurance and quality assessment processes.

3. Expect students to be meaningfully integrated as partners in the design, monitoring and reviewing of processes to improve the academic quality of their education.

The key here is a “meaningful” integration, beyond mere committee membership. Academic staff at all levels should also have a role in designing, monitoring and reviewing processes – this would be a key factor in developing processes that are genuinely useful in ensuring a quality academic experience for students without an unreasonable institutional burden.

As James Wilsdon noted in “The Metric Tide”[2], “The demands of formal evaluation according to broadly standardised criteria are likely to focus the attention system of organisations on satisfying them, and give rise to local lock-in mechanisms. But the extent to which mechanisms like evaluation actually control and steer loosely coupled systems of academic knowledge is still poorly understood.” (p87)

It is therefore essential that both internal and external systems of quality assurance take into account the well-documented negative effects of a metrics-driven compliance-based culture, and it would appear that a meaningful integration of students, academic staff and support staff into the design as well as the delivery of these processes would be an appropriate means to do this.

4. Provide accountability, value for money, and assurance to students, and to employers, government and the public, in the areas that matter to those stakeholders, both in relation to individual providers and across the sector as a whole.

This principle should be balanced very carefully against principle (1), above. Assessment of “value for money”, in particular, should be approached with care and with greater emphasis on longer-term and less direct benefits than are currently fashionable. The risk of short-term accountability limiting the ability of academia to provide genuinely transformational and meaningful interventions in the lives of students and society as a whole is implicit within the current model of institutional funding, and a well-designed system of QA should balance rather than amplify this market pressure.

5. Be transparent and easily understood by students and other stakeholders.

It is difficult to argue against this principle, though simplicity must be balanced with a commitment to both academic and statistical rigour. HEFCE will doubtless remember the issues with over-simplified NSS and KIS data leading to a misleading and confusing information offer to prospective students, as documented in some of the HEDIIP work around classification systems[3] – and should also note the findings of their own 2014 report into the use of information about HE provision by prospective students.[4]

6. Work well for increasingly diverse and different missions, and ensure that providers are not prevented from experimentation and innovation in strategic direction or in approaches to learning and teaching.

It is important here to draw a distinction between experimentation and innovation in learning and teaching practice, which is a central strength of UK HE as evidenced by a substantial body of literature and practice, and experimentation and innovation in institutional business models.

The former should be encouraged and supported, with specific funding offered to individual academics and small teams with the ability to innovate in order to meet existing or emerging learner or societal needs. Funding and opportunity for research into Higher Education pedagogy and policy are severely limited, and in order that experimentation can be based on sound research further investment is needed. Organisations such as the ESRC, SRHE, Higher Education Academy, BERA, SEDA, Jisc and ALT should be supported in addressing this clear need.

The latter should also be encouraged and supported, but the risk to students and the exchequer is far greater here and should be mitigated and managed carefully. Recent activity in this area has demonstrated risks around the needs of learners being insufficiently met, risks around accountability for public funds, risks around investment being diverted from core business, and risks around reputational damage for the sector as a whole. In this area experimentation should be evidence-based, and the exposure of learners and the exchequer to the negative consequences of experimentation should be limited.

7. Not repeatedly retest an established provider against the baseline requirements for an acceptable level of provision necessary for entry to the publicly funded higher education system, unless there is evidence that suggests that this is necessary.

Recent research conducted for HEFCE by KPMG concluded that the majority of the “costs” associated with quality assurance in HE come from poorly-designed and burdensome processes at an institutional level, and multiple PSRB engagements. As such, it is difficult to make an argument to limit national engagements as the data and materials will most likely be collected and prepared regardless.

Interim engagement could focus on targeted support to reduce the internal cost of QA activity via expert advice on designing and implementing systems of assurance, and optimising institutional management information systems (MISs). The QAA and Jisc would be best placed to support this – and engagements of this nature would provide much greater savings than simply limiting the number of external inputs into institutional processes.

Of course, QAA support for PSRBs in designing and implementing robust yet light-touch reviews would be a further opportunity for significant savings.

8. Adopt a risk- and evidence-based approach to co-regulation to ensure that regulatory scrutiny focuses on the areas where risk to standards and/or to the academic experience of students or the system is greatest.

Again, it is difficult to argue against this – though a definition of co-regulation would be beneficial (I assume it refers to the totality of sector QA, encompassing national, institutional and subject-specific processes). Risk monitoring should primarily focus on responsiveness in order to encompass unpredictable need, especially as relates to business model innovation.

9. Ensure that the overall cost and burden of the quality assessment and wider assurance system is proportionate.

This principle should explicitly refer to the overall cost and burden of QA and assurance as a whole, rather than just national processes. The KPMG report was clear that the majority of costs are linked to institutional data collection and PSRB-related activity, and it is here that the attention of HEFCE should be primarily directed.

10. Protect the reputation of the UK higher education system in a global context.

HEFCE and the QAA should continue to work with ENQA, EQAR and INQAAHE, to ensure that the global QA context is paramount in English and UK assurance activity.

11. Intervene early and rapidly but proportionately when things go wrong.

This should continue as is currently the case, with HEFCE (as core and financial regulator), QAA (as academic quality assurance specialists), UCU (as staff advocate) and both OIA and NUS (as student advocates) working together to identify and resolve issues.

13. Work towards creating a consistent approach to quality assessment for all providers of higher education.

Consistency of approach is less important than consistency of academic standards, and as such this principle appears to work in opposition to principle (5). QA approaches at an institutional level should be adaptable to identified needs amongst a diversity of providers and activity.

[1] https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/450090/BIS-15-440-guidance-for-alternative-higher-education-providers.pdf

[2] http://www.hefce.ac.uk/media/HEFCE,2014/Content/Pubs/Independentresearch/2015/The,Metric,Tide/2015_metric_tide.pdf

[3] http://www.hediip.ac.uk/subject_coding/

[4] http://www.hefce.ac.uk/pubs/rereports/Year/2014/infoadvisory/Title,92167,en.html

(if anyone is interested in my responses to the remaining questions, I’d be happy to share. Do leave a comment or send a twitter DM)
