Whose university, and why? pt1.

Ahead of the Browne Review, and the associated storm of nonsense in the national press, I’ve been getting very interested in the nature of the university, and how this has changed over time. With the fees issue, the influx of private institutions into the UK and cuts to research funding, we are going to be living through some great upheavals. I think it is important to show that the virtual stasis within the system between 1992 and 2004, and again from 2004 onwards, has been an anomaly, and that change is embedded in the DNA of the institutions we work for. This is part 1 of a short series of blog posts: this post focuses on the history, and the next will focus on present-day pressures.

These are my views and not those of my employer, or of projects and programmes I am responsible for. This post is available under a creative commons CC-BY license.
If you asked an average, informed observer (an informed and observant Vice Chancellor, say) “What is a university?”, I imagine you’d get something like the following:

Universities (and colleges) are supported by public funds to do research. They teach students, at undergraduate and postgraduate level, with a combination of state funding and student contributions. They work (at least partially) to meet the needs of local and national employers, and of professional bodies. And they administer themselves, via academic managers with professional managerial support. (this isn’t a real quote, but it sounds about right)
This has all only really been the case since 1919, with the establishment of two bodies – the Department of Scientific and Industrial Research, which provided state research funding for what we now call STEM subjects, and the University Grants Committee, propping up an ailing higher education infrastructure after the First World War. Keen ironists will be delighted to note that both of these bodies and their underlying state-interventionist principles were established by a Conservative/Liberal coalition government. One Sir William McCormick was the first chair of both the DSIR and the UGC.

Prior to this, university funding by the state was piecemeal and arbitrary, with the primary policy actors being local authorities (in the establishment of civic universities such as Liverpool, Birmingham and Manchester) and central government in establishing the Willettsian degree-awarding colossus that is the University of London (essentially a self-supporting 1836 fudge by the Privy Council so they didn’t have to grant powers to multiple provincial universities that they didn’t feel would be sustainable). Despite this, institutions continued much as they had in the middle ages, with the idea of the university famously described by the newly-Blesséd John Henry Newman in 1850:

“The general principles of any study you may learn by books at home; but the detail, the colour, the tone, the air, the life which makes it live in us, you must catch all these from those in whom it lives already. You must imitate the student in French or German, who is not content with his grammar, but goes to Paris or Dresden: you must take example from the young artist, who aspires to visit the great Masters in Florence and in Rome. Till we have discovered some intellectual daguerreotype, which takes off the course of thought, and the form, lineaments, and features of truth, as completely and minutely as the optical instrument reproduces the sensible object, we must come to the teachers of wisdom to learn wisdom, we must repair to the fountain, and drink there. Portions of it may go from thence to the ends of the earth by means of books; but the fullness is in one place alone. It is in such assemblages and congregations of intellect that books themselves, the masterpieces of human genius, are written, or at least originated.”

Of course, there was no need for University research funding in those early days. Newman again:

“The nature of the case and the history of philosophy combine to recommend to us this division of intellectual labour between Academies and Universities. To discover and to teach are distinct functions; they are also distinct gifts, and are not commonly found united in the same person. He, too, who spends his day in dispensing his existing knowledge to all comers is unlikely to have either leisure or energy to acquire new.”

Public funding for research (apart from a few special cases where specific non-university research institutes such as the Royal Society and the Royal Observatory were supported by the Crown and largely commissioned work from private individuals) is largely a 20th-century invention – indeed you can pin down a rough date shortly after the First World War, with the above-mentioned Department of Scientific and Industrial Research. But even here, the Department was more likely to commission and fund independent research bodies such as the National Physical Laboratory and the Building Research Establishment, occasionally bringing in university staff to work with them.

Two notable non-recipients of UGC (and DSIR) cash were the Universities of Oxford and Cambridge, both of which felt that their autonomy would be compromised by accepting state funding. But even these two, enviously and nervously eyeing the investment in laboratory equipment facilitated by grants to other institutions, petitioned the UGC to support them in 1922.

UGC grants mainly covered the administrative and structural costs of a university, with teaching supported by learners and their sponsors. The availability of (near) universal public funding for teaching in higher education is a post-Second World War invention, with a growth in local education authority funding for university fees from the mid ’40s onwards. A national scheme of student grants in the early ’60s, following the recommendation of the Anderson Committee and the legislation of the 1962 Education Act, built on the narrow availability of private and Board of Education scholarships. The 1962 act enshrined the right of all school leavers to local education maintenance grants in respect of their higher-level studies, with the exception of trainee teachers and mature students, both of whom were supported by the Board of Education. These interventions led to a rapid rise in the number of students who were able to take advantage of university provision.

Only with the passage of the Higher Education Act in 2004 did the onus for paying (at least some of) the cost of their university education (in the form of what at the time were called “top-up fees”) return to the learner in question.

But enough of these modern ideas of funding teaching and research! The position of employer needs has become more prominent since the Dearing report in 1997, but it’s been there since medieval times, with pretty much a 10–20 year cycle of interest through the 20th century. Indeed, giving life to the old Einstein maxim that the definition of madness is continuing to do the same thing and expecting a different outcome, successive movements and eventually governments have created new kinds of UK universities, to better meet the needs of employers:
  • The “redbrick” and “civic” universities, largely established by groups of industrialist benefactors, placed particular emphasis on meeting the technological demands of the fast-changing Victorian era.
  • The “Robbins Report”, or “plate-glass”, universities, where all Colleges of Advanced Technology (originally organised to meet the industrial and commercial needs of a given locality) gained degree-awarding powers.
  • The “New”, or “post-92”, universities, where polytechnics and HE colleges already embedded in local employment markets gained degree-awarding powers.
  • The Open University specifically allowed students to study whilst in full time employment.
  • And those readers sitting in “ancient” universities may want to consider the links between their seat of learning and the Church, the principal employer of university graduates for many centuries.

And as for the academic leadership of universities: to give just one example, the University of Cambridge’s Congregation appointed “proctors” to deal with the finance, infrastructure and PR activity of the medieval university.

With this in mind, we can surmise that the current state of the university system in the UK is a function of many interventions, by government and employers, over nearly 1000 years. But is what we have ended up with worth defending?

Selected background and further reading:

Anderson, Robert, “The Idea of a University Today“, (History and Policy, March 2010)

“A Brief History of the University of Cambridge“, (cam.ac.uk, accessed October 2010)

Dyhouse, Carol, “Going to University: Funding, Costs, Benefits” (History and Policy, August 2007)

Hutchinson, Eric, “The History of the University Grants Committee” (Minerva, vol. 13, no. 4, December 1975)

“A history of congregation and convocation“, (ox.ac.uk, accessed October 2010)

Salmon, Mike et al, “Look back at Anglia” (http://www.iankitching.me.uk, accessed October 2010)

Also, the legend that is Joss Winn pointed me to this amazing paper, which covers the changes of the 80s in much more depth.

Finlayson, Gordon, and Hayward, Danny, “Education towards Heteronomy: A Critical Analysis of the Reform of UK Universities since 1978” (http://www.jamesgordonfinlayson.net, accessed October 2010)


6 thoughts on “Whose university, and why? pt1.”

  1. I’m not convinced that vocational specialization within HE is actually serving employers well. Are universities really responding to the needs of employers or the desire of individuals to improve their CVs?

  2. @David I think it’s more a shift in perception and purpose rather than an activity. As I’ve said elsewhere, employers generally don’t know what they want, so the fact that we all troop along after what they think they want is both silly and a cornerstone of HE policy. I wonder if there were similar discussions between the Church and the University of Oxford in C14th? 🙂

  3. In my experience employers want independent self-starters who can manage their own time, are excellent team players and will do exactly what they are told to do on demand…

  4. @David I think there is a difference between what line managers, senior managers and HR people want, which is one of the things which gives rise to the beautifully phrased issue you raise.
     Senior managers want independent self-starters who can manage their own time, to work under their line managers, who they would prefer to do exactly as they are told in a more-or-less autonomous way but reporting back with detailed figures which will never be read, at least on a monthly schedule.
     Line managers want excellent team players who will do exactly what they are told to do on demand. In some cases, they like ones who can be assigned tasks and act autonomously within a tight specification without oversight, but those actually appear to be rather rare.
     HR people want everyone to accept low pay, long hours, be newly qualified with up-to-date technical skills and at least 5 years’ commercial experience.
     I think this multi-faceted beast then tries to influence universities to change curricula to fit their needs. One of the things that they all appear to agree on is that graduates need transferable skills. This is presumably so that the employers don’t need to spend money on re-training the graduates to be able to do the actual jobs the employers think they want to employ them to do. I say ‘think’ because the advertised jobs are seldom related in any strong way to the actual roles people end up in once employed. But anyway, the fascinating thing is that while those transferable skills might be thought to be, oh I don’t know, being an independent learner, what the universities seem to end up teaching as transferable skills are, well, writing CVs. Which may not be quite what the employers had in mind…
