This – incredibly – is only one of the blog posts I currently have on the boil that touches on the Church of Jesus Christ of Latter-day Saints, though this is – I think – merely a coincidence.
Like nearly the entire western world I’ve been thinking about fake news and the negotiation of constructed realities as performed online, and like maybe 40-50 smelly edtech hippies I’ve been wondering how to apply what I learned from #ds106 to this now rather pressing problem.
But then – via Cogdog-style happenstance – and prompted partially by the man-dog himself’s recent and intriguing post on a “Networked Narratives” course he is running with Mia Zamora for no other reason than it needs to be done (I hope to be there) – I got into a bad-old-days-of-blogging nostalgia-fest, looking up whether anyone had re-invented Google Reader yet. I’m on Inoreader at the moment, since you ask.
Whilst meandering, I stumbled across the term “bloggernacle” – which, well, I’d use anything called “bloggernacle” and I think I speak for us all in saying that. Turns out that there is a huge Mormon blogging scene. Open education folks will know that the LDS Church (Latter-day Saints, which I understand is the more accurate way to describe people of that faith) and OER are intertwined in various wonderful ways, so I was mildly interested to see whether the online activities of the two shared a common source. Instead I found a link to something altogether more #ds106-ian.
In mid-2005 several prominent LDS bloggers put together a group blog (called “Banner of Heaven”) based around a bunch of invented characters. The idea was primarily “to explore the potential of blogging as a story-telling form”, with subsidiary goals of reflecting back what they perceived as the primary concerns of LDS blogging at the time. This link is to what you might call the learning objectives of the exercise – read it. (Most of this post comes from a 2010 “behind the music” style retrospective by one of the original authors on By Common Consent. It was only at this point that the original text was made public.)
They came up with six characters:
SeptimusH – a shy inactive former missionary, who all too often ends up dealing with dead cows.
MirandaPJ – a feminist from Lewiston, Idaho, who confiscated her husband’s xbox.
JennMailer – a perky but insecure young woman with very traditional views.
Mari Collier – Miranda’s sister, kind and faithful, but with a troubled past.
Aaron B Cox – representing the more combative end of blogging and the more… unique… expression of scriptural fundamentalism
Greg Fox – a non-church member who loved to hang out with the others, but was often disappointed with what he found.
These were both (semi-)realistic positions current in the LDS online milieu at the time, and astutely drawn comic characters in their own right. I’m sure, coming from my position 10 years or more on, I’m missing a lot of the subtlety – and would never be able to spot the point where the stereotypes were amped up to the point of lunacy and people began to spot that something wasn’t right.
Yes – the group blog was not explicitly presented as “fake” – people believed that the characters were real, and began both to worry about the situations described, alternately empathizing and judging, and to share their own stories in response.
From an educational point of view, this is great stuff. But for an online community in the first flushes of blog enthusiasm, perhaps not so much. Another LDS blog, Nine Moons, did the inevitable exposé, and the initial comments, from the hip young things the joke was aimed at, are fairly good natured. But by the time they began the “guess the famous blogger” competition things started turning a little more sour, and those outside of the community began to take a view. (Church and Federal) Legal issues were brought up. A sense of community was lost. Hands – indeed – were wrung. Pearls were clutched.
But reading the comments to some of these posts, ten years on, is uncomfortable. There is a genuine sense of betrayal. People that were accepted as friends are no longer “real”. Ideas of what constituted a part of the lived experience of peers needed to be re-examined.
It’s not hyperbole to say that this “experiment” had a long and lasting effect on the community that was forming around it and other writers.
Here is the interesting part: no one really remembers much about Banner itself; instead, what everyone recalls is the outrage. Either you remember the deceit, or you remember the pound of flesh publicly exacted from the Bannerites. Few of us recall reading Banner or the ideas laid out by the characters. Bannergate has sucked the work dry.
The power of the reaction, the brutality of the analysis, destroying the meaning of the original text.
All of us have the urge to create a new reality online – either explicitly in coming up with something like the saga of Dr Oblivion, or in the work of Helen Keegan; or implicitly, in presenting a version of ourselves that is just plain nicer/funnier/smarter/more interesting than the decaying sack of bones, flesh and Doobie Brothers lyrics that exists in three-dimensional reality.
And when you do create a reality, do you pull away the mask and risk confusion and alienation? Or do you keep the story going, and risk ending with some of the horror we have seen emerging from known falsehoods in 2016 (this week’s shitshow: pizzagate)?
Pizzagate brought home to me that the fake news scare is just digital storytelling gone awry – and gives me hope that I know some people who may have a handle on what to do next.
Teaching digital storytelling is probably one of the most important things anyone can be doing right now… and for my part (without a role of my own) I want to volunteer to help out those of you who are taking this to the streets.
The only cure for a blank page is to start typing. The only cure for numb disbelief is to try and get a handle on the system of thinking that is difficult to believe.
We’ve had Brexit. Now Trump. Next is France. What’s happening? It has to be more than racism/xenophobia. It’s not that simple. Right??
So, in late 2016, we have been presented with a number of unlikely events, but only three that I would previously have described as almost impossible.
Peter Thiel spoke at the RNC in 2016.
Actual, undisguised, racism is a mainstream global political current in 2016.
The worst thing one can possibly be in 2016 is an “expert”.
So, on November 9th, I started reading around these issues. The common current seems to be something called “Neo-reactionism” (NRx), and the prime reference I kept seeing was a chap delighting in the nom de plume of Mencius Moldbug, who kept a blog (“Unqualified Reservations”) during the latter part of the last decade. If you ask questions about the basis of the alt-right generally, that blog is where you tend to get pointed. I’m not going to link to it here – no one needs to read that stuff (for clarity, I read the “gentle introduction” multi-post shitshow, the “open letter”, the “Formalist manifesto” and a couple of other posts). Here’s a (long) 2013 Scott Alexander summary of the “gentle introduction” which may suffice.
Another common reference is to someone I’ve been wanting to write about for ages, a former University of Warwick philosopher named Nick Land. Land’s research interests were around cyberculture (the “Cybernetic Culture Research Unit”). This is a useful (again long) 2009 summary by music journalist Simon Reynolds. Since leaving academia his best-known writing is a collection of blog posts entitled “The Dark Enlightenment” (again I’m not linking to a primary source, but I have read the collection).
(If you just want a general overview of Neoreactionism in 2016, this [by Dylan Matthews at Vox] is readable.)
So I started to read this stuff.
And I stopped there. Terrified.
Because the last thing I was expecting was that these guys were me. Us. Tonally, structurally – the same tools and tropes I’d use here to talk about education technology or whatever the hell else are used to talk about this…stuff. Moldbug likes to mix quotations from diverse and “forgotten” primary sources with low pop-culture references, “gen x”-style irony and song lyrics. Nick Land… well, I’ll link to this: “Meltdown“, presented at an academic conference in 1994 (this version mixed with video material by his long-term collaborators 0rphan Drift).
If you go to any random “radical education technology” conference – say, perhaps, #opened16 – none of these tropes would seem out of place. After I’d stopped being shocked, I started wondering why I was shocked. After all, these are people:
that have been in and around the internet and cyberculture since at least the late 90s.
that have hung out and argued in online political discussions similar to the ones I hung out in.
that think and write about technology and how it affects culture, and vice versa.
They are us. They are us.
But it’s OK, right, because I’m now going to tell you all of the monstrous stuff they believe that we obviously don’t. Right?
Not quite.
So, in NRx, the worst things in the world, the causes of all of our problems as a global society, are universities and media organisations. Universities, in particular, have a huge amount of power and influence – which is used to shape the very political direction of civilisation. Which is always to the left. The famous quote is “Cthulhu only swims left”. The less-famous idea is that the left (read “progressives”, “the establishment” – the two are used interchangeably within a grand idea of “the Cathedral”, not the one near the bazaar) deals with enemies to the political right swiftly and without mercy, but with enemies to the left with tolerance and grace.
So it’s fine to be an academic marxist, but if you wanted to be an academic conservative you’d be out on your ear.
[Actually, I should add that one surprising weakness in Moldbug (other than the obvious massive racism and such) is his treatment of Conservatism as a political movement. To him, Conservatism is just the progressive ideas of about 20-50 years ago, an attempt merely to stop the march of progress rather than propose anything new. This utterly ignores the corporatist, or “neoliberal” if you must, trend on the modern right, and makes it easier for him to situate his own critique within a third tradition that includes Mises and the whole Austrian School of Economics – more on that to come, economics fans!]
And that, followers, is the big secret. Because the alt-right are obsessed with crappy Matrix references, they call it the red pill. Proving they at least have the sense to ignore the sequels…
Now – let us pause here. I hang around with a lot of people who seriously dislike the modern university, and have strong feelings about modern mass media. A position that suggests these things are terrible is hardly remarkable. Indeed, I know many who hope to set up their own – alternative – university as a pure source of insight and education for a population that is otherwise reliant on an increasingly corrupt mainstream media to understand the world.
Well, so does Moldbug – he calls his grand ambition the “anti-versity”. He’s not working on it now though – he’s working on a start-up called “Urbit” that aims to allow people to reclaim ownership over their online life via a bespoke cloud hosting service. Funded by Peter Thiel.
You see why I was terrified?
Anyway – his issue with progressivism, and indeed democracy, is that he doesn’t think that it works. To back this up he points to the existence of crime, and of what journalists love to euphemise as “the underclass”. (As usual with Moldbug the weaker points of his argument are backed up either by vague state-of-the-world conjecture or contextless lengthy quotations from dead white European guys.)
So the obvious solution is to restore monarchy in the form of a joint-stock national corporation (with the aristocracy as stockholders). Of course. Reasons for this leap include that Kings are nice, and aristocrats are nice (there’s a mighty conservative aestheticism under here). Monarchies never suffered from war, strife or insurrection, they provided societal and economic stability, and did I mention that Kings are nice? Any contrary understanding you may have around monarchy is simply the lies you have been fed by “The Cathedral”.
(You remember that South Park episode about Scientology where they had the caption “this is what Scientologists actually believe”?)
That “Cathedral” thing, by the way, is a path into a religious metaphor about progressivism. It’s a religion! Because people believe stuff, and want other people to believe stuff. (Whereas neoreactionism is, I guess, a cult – what with the initiation ceremonies and secret revelation and all that.) And the religious underpinning of wider society is tied back to low-church protestantism via the American Civil War, which puts friend Moldbug on the side of the Anglicans and kind of makes me hope he reads Richard Hooker’s “Of The Laws Of Ecclesiastical Polity” as a crash course in arguing, minutely and at length (Moldbug’s blog posts are longer than mine!), for a clearly ridiculous position.
But back to everything you believe being wrong. Having taken the “red pill”, we next must prove ourselves by denying three sacred Cathedral truths: anthropogenic global warming, fiat currencies/any post-WW2 economics, and human biodiversity.
Moldbug himself undermines the first – it is accepted that (i) more carbon dioxide in the atmosphere makes the earth warmer (since 1896!), (ii) we are putting more carbon dioxide into the atmosphere than ever before, (iii) the Earth is (viewed on both a geological and human scale) warmer than it has ever been. There is discussion to be had on the precise relation between these variables, which is why climate modelling is a thing. His argument with this is the same one that creepy uncle of yours would make – apparently scientists only get grants if they agree with everything all other scientists think, and only a small cadre of brave, embattled fossil-fuel-company-owning billionaires can see the (inconvenient) truth. #slowhandclap
Next up – apparently we need to… pauses for dramatic effect… get back on the gold standard. A finite money supply is better than fiat money because stability, and every economic theory other than what is broadly called the “Austrian School” – you know, the “rational actor”, “business cycle” stuff that led to so much stability in the past? – is wrong. I don’t think it’s even worth our while arguing about this, but economics is a broad church (except, counterfactually, in academia – read your Mirowski, who is sage enough to note the central place of neo-classical economics – Austrian School plus one – within what some call the neoliberal economics which… hasn’t been good for us recently). Also – doesn’t this sound like Bitcoin? A similar set of roots ties together that whole world of crazy.
Not quite tied in at source, here, but relevant is the “rule of feet” idea: if a serf didn’t like a particular monarchic city he could always stroll off and serf for a different one – thus excusing a corporate disregard for surplus labourers and non-labourers with the lazy assumption that states that did care for temporarily non-working workers would grow faster (and thus provide more shareholder value) than those that didn’t.
But “Human Biodiversity” is scientific-sounding terminology that allows entitled white boys to say racist stuff. Let’s let Frank Zappa explain:
Eventually it was discovered
That God
Did not want us to be
All the same
This was
BAD NEWS
For the Governments of The World
As it seemed contrary
To the doctrine of
Portion Controlled Servings
Mankind must be made more uniformly
If THE FUTURE
Was going to work
Various ways were sought
To bind us all together
But, alas SAMENESS was unenforceable
It was about this time
That someone
Came up with the idea of TOTAL CRIMINALIZATION
Based on the principle that
If we were ALL crooks
We could at last be uniform
To some degree
In the eyes of THE LAW
That’s about the long and the short of it. People are fundamentally different, so we should treat them differently, and not expect equality. Which sounds almost reasonable until you realise it means, in this case, that “only people like me should have the opportunities that I have”. This is situated firmly on the “nature” side of the nature/culture debate that has been going on in social sciences since Plato, and uses the gloss of genetic analysis to make it not look like the backward leap it is. The (trigger warning: stupid people trying to sound smart whilst being racist) talk page of the Wikipedia article on Human Genetic Diversity is an instructive read on the way the argument plays out. There is actually an ongoing academic debate on how meaningful or otherwise the idea of “race” as a classification is in genetic terms (spoiler: maybe a little, but not very much), and for a taste of that some kind soul has curated a great set of links at the article on Lewontin’s Fallacy.
Interestingly, Austrian economics explicitly speaks out against the idea that people’s economic activity is sensibly considered in aggregate based on societal groupings. Which feels rather ultra-modern at the moment, what with our progressive distrust in everything from learning analytics to FiveThirtyEight, but utterly alien to the HBD world. Which can’t happen here…
Oh and – good news – you can justify misogyny via HBD too! and ableism! and homophobia!
You may think at this point that neo-reaction is simply philosophical cover for being a dick to people. I could see how one could argue that. Certainly, if you believe that “experts” have lied to you about everything, maybe they lied to you about the good points of not being a dick too.
—
The rise of neofascism, the death of the social contract of work, and corporatization of the university are not unrelated and here we are.
This concludes our little whistlestop tour of neoreaction and how it links in a number of places to concerns current in radical education technology. Which has been intermittently entertaining, but still unconnected with our 2016! WTF! starting point. (But keep an eye on Peter Thiel – he’s a big fan)
In the latter parts of his “Gentle Introduction”, Moldbug talks about how to bring about a neoreactionary revolution. These are the three steps to hell:
Passivism
The Anti-Versity, building an alternative power structure
Prepare to accept power when offered
Now I don’t think we should take them at face value (that Moldbuggian irony!), but basically this is a riff on the old Friedmanite shuffle – get out of the existing argument, build an alternative, and then wait for a crisis to come along so that your alternative is the one lying around when power is up for grabs.
The big difference to the “plan” is the absence of an anti-versity. It was a dumb idea anyway, in that universities have little power and less influence, and what would be done if the anti-versity discovered facts that did not accord with the chosen world view? Deliberately biased resources that correct the perceived bias of other resources are seldom pretty. Instead of a university at the head of an anti-cathedral imagine an alternative mainstream media. Where do people get their news these days, and who invests in it?
The passivity here explains the absence of active mainstream right-wing intellectuals (I mean, name one…), and thus an amplification of the perceived liberal effect of the media. For maximum awful we could ensure that newspapers can’t afford to pay journalists properly, and draw on the noble truth that a think piece gets more clicks than actual reportage. This shadow cathedral (death star?) has been sitting quietly off to the right warping public perception and turning politics into an antagonistic us-vs-them affair.
And then hail King Trump?
Is Trump a king? Well he does try to act like one… the royal court, the favoured children, the droit de seigneur, the whole Louis XIV decor… and Moldbug does call for a CEO as king (he suggested Elon Musk).
But conversely he’s actually not a very good CEO (by any reasonable measure), and he’s a bit – well – common. Aesthetics and decorum are a huge deal for the neo-reactionaries: they want nobles who are truly noble (with elegant, long, royal fingers…)
But he’s a placeholder. Now we’ve normalised the idea of CEO as global leader it’s easier to argue for a better CEO, using the intervening time and Trump’s love of being hated to remove democratic checks and balances as far as possible (maybe by securing power with the unelected administrators before the next guy abolishes them)
This presentation owes an enormous debt to the opportunity I have had to both work and converse with Cameron Neylon of Curtin University. I should clarify that the good bits of the material below should be seen as his influence, the shoddier stuff as my lack of understanding and subtlety. It is presented in a personal capacity.
Substantial pixels have, of late, been devoted to the cultural “demise of the expert” and consequent de-legitimising of academic forms of knowledge. But if we want to know why people don’t believe what we believe, we need to take a long hard look at some of the crazier things we do believe.
Likewise, Open Education, as it matures as an idea, has been urged by some voices to consider itself as an academic field. Audrey, in her #etug keynote, suggests that we examine the mechanisms, tropes and homunculi of academic prestige before we take that particular pill.
So I’ve been thinking about one of the new gods of academia – the citation index.
Whether we would have it or no, the purpose of [higher education] is changing. A decade ago the graduate of a college was thought to be fitted with the requisites of a cultural, liberal education, to be ready to begin [their] life work as a good citizen. Everywhere we see the demand for the expert worker, the professional […] who has devoted from two to four additional years to train […] in a special way in a particular field.
PLK and EM Gross of Pomona College writing in Science (October 1927)
I start by saluting PLK and EM Gross for writing a landmark article during a period of major institutional reorganisation. “College Libraries and Chemical Education” represents the birth myth of the science of bibliometrics – but was itself focused on identifying scholarly resources for reuse within undergraduate Chemistry education.
Gross and Gross took the latest volume of the Journal of the American Chemical Society as a starting point (“the most representative of American Chemistry”), and simply tabulated the number of references made therein to works in other journals. The academic journals most frequently cited in this periodical were deemed essential for the college library collection, as they had a demonstrably greater influence on the current state of American Chemistry.
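(To make the mechanics concrete, here is a minimal sketch – in Python, with invented reference data – of the Gross and Gross approach: tally how often each journal is referenced in one volume of a source journal, then rank by count.)

```python
from collections import Counter

# Hypothetical reference list harvested from one volume of a "source" journal.
# Each entry names the journal a reference points to; the data is invented.
references = [
    "Journal of the Chemical Society", "Annalen der Chemie",
    "Journal of the Chemical Society", "Zeitschrift für physikalische Chemie",
    "Annalen der Chemie", "Journal of the Chemical Society",
]

# Tabulate references per journal, much as Gross and Gross did by hand.
counts = Counter(references)

# Rank journals by how often they are referenced; the top of this list is
# what the college library would be advised to hold.
for journal, n in counts.most_common():
    print(f"{n:3d}  {journal}")
```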
Aside from this contribution to library collection building (or resource discovery, if you’d rather) and the wonderful suggestion that every undergraduate chemistry student should have a working knowledge of German, you would be forgiven for thinking that this paper was naught but a historical curiosity. But it was the first in a series of papers that led, directly or indirectly, to the sorry state of academia today.
Vannevar Bush’s 1945 piece in the Atlantic, “As We May Think”, is often hailed as a founding text of the internet (our opening keynote has written, and spoken, with his usual eloquence about this aspect). Bush – fresh from the interdisciplinary practicalities of administering the Manhattan Project – took as his ostensible subject a similar issue to Gross and Gross: the impossibility of keeping up with the literature.
The difficulty seems to be, not so much that we publish unduly in view of the extent and variety of present day interests, but rather that publication has been extended far beyond our present ability to make real use of the record. The summation of human experience is being expanded at a prodigious rate, and the means we use for threading through the consequent maze to the momentarily important item is the same as was used in the days of square-rigged ships.
In considering the future of threading through the maze of scholarship (or an associative trail if you’d rather), Bush considered examining the links between knowledge (or resources signifying knowledge, I guess) as a means of synthesising and creating a greater understanding – allowing for, simply put, better and more accessible research.
Wholly new forms of encyclopedias will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified. […] The chemist, struggling with the synthesis of an organic compound, has all the chemical literature before him in his laboratory, with trails following the analogies of compounds, and side trails to their physical and chemical behavior.
Bush postulated the development of a “memex” – essentially a large database of literature alongside the links between items – as a means of exploring a corpus. But our third great influence on contemporary academia, Eugene Garfield, was spurred by “As We May Think” to consider a much wider variety of uses.
Drawing on Bush’s ideas alongside HG Wells’ “World Brain” – Garfield’s idea was of an “informatorium” to serve “a new Renaissance during which the entire world will be thirsting for knowledge”. (As an aside, these early ideas of the future of scientific literature do not even consider the issue of copyright and ownership over knowledge, seeing it as a public treasury for all to enjoy. What a beautiful world this could be, what a glorious time to be free.)
Garfield developed the Science Citation Index, under the auspices of the Institute for Scientific Information (ISI). As is often the lot of those with a utopian vision, his early work met with almost universal indifference – compare Vannevar Bush’s struggles to establish what became the NSF. But his dream – and boundless energy – prevailed, and the work of the ISI as presented in the 60s almost reached the heights of the knowledge discovery solutions postulated by Bush and the Grosses.
Coverage
Although ISI was founded to support resource discovery, it has competition built into its underlying logic. In the 60s it was fair to suggest that not every paper in every journal could be indexed, and not every journal was. So the index started with the idea of a pre-sift. It is contended that the majority of references (80%) are made to a minority (20%) of journals, and that using methods similar to Gross and Gross one could identify and focus on indexing “the best” journals.
Even today the widely used Science Citation Index covers citations within only 3,745 journals; Elsevier’s Scopus covers a wider range of literature, including over 21,000 journals – impressively comprehensive until one considers that there is no complete list of currently or formerly published academic journals. Of course, both major indexes focus mainly on English-language publications in clearly defined, established disciplines from traditional publishers.
This serves, of course, to amplify inequalities built into a publishing system where (predominantly white, male, western) reviewers recommend to (predominantly white, male, western) editors that articles written by (predominantly white, male, western) academics are published. It is perhaps not too far a leap to suggest that academics with these characteristics are well cited. Laura Czerniewicz showed some truly humbling visualisations illustrating this in her OR2016 keynote.
It is unsafe to consider indexed papers as representing the sum total of world knowledge, or even as being representative of the human race. Indexers choose (predominately) one form of expression, and – as we shall see – do not take this or other assumptions into account when describing the products of this indexing.
Citation index entries as text – and a series of terrible monetary metaphors
Paul Wouters, in his 1999 thesis (“The Citation Culture”), draws a distinction between the “reference” – what actually happens in scientific writing – and the “citation” – what appears in a citation index. Though each citation in the index has a corresponding reference, there are references (for instance to grey literature, data, or source texts) that do not have corresponding citation index entries. And though each reference has a unique context, describing a precise and nuanced relationship between two texts, a citation index entry – for the purposes of indexing – is simply a citation index entry. At that point, all those words of wisdom sound the same.
The role of the citation might also be compared with that of money, especially if the evaluative use of scientometrics is taken into account. Whenever the value of an article is expressed in its citation frequency, the citation is the unit of a “currency of science”. (p108)
In economic terms citation index entries are fungible – each has equal value within the index, and multiples of them can be measured against each other in order to ascribe comparative value. Also, like a fiat currency, there are several “central banks” (ISI, Scopus) which create (and destroy! – not every journal stays on the list forever) citation index entries in response to demand and to policy needs such as the need to control inflation – there is a theoretically infinite number of potential entries, but these are controlled by alterations to the coverage of the index. But there are important ways in which the citation index economy does not function like a fiat currency.
The citation shares still another property with the signs of money and language: it can only function properly in the midst of other citations. Therefore, citations need to be mass-produced. A lone citation does not make sense. It derives its function mainly from its relations to other citations. In other words, it is self-referential. Whether one tries to map science or to evaluate it, one needs large amounts of citation data. (p109)
I’m unfairly picking up a very, very small theme in Wouters’ superb and comprehensive thesis here, but I believe it is an important one.
In “The Citation Process” (1984), Blaise Cronin touches on the practice of science as a mechanism of exchange, glossing work by Merton, Storer and others.
The commodity which scientists traditionally exchange is knowledge or information, and in drawing on the intellectual property of their peers, scientists have to enter the exchange system and ‘pay the going rate’, so to speak. The currency, to maintain the economic metaphor, is the ‘coin of recognition’. The exchange on which the social system hinges is information for recognition. The formal record of these transactions is the scientific establishment’s traditional ledger, the scholarly journal. The most common form of ‘currency’ is the citation. (p19-20)
Ledgers, with the advent of blockchain and other innovations in “fintech”, have become peculiarly fashionable in recent times. And indeed, it is possible to build a simple metaphorical model of scientific publishing as blockchain with aspects of a Ponzi scheme – with a distributed ledger (multiple journals) added to via proof of work (the writing and publication of a paper) hashed (rendered into academic language) in order that value is both realised and distributed in the form of the ‘coin of recognition’ to earlier contributors, with the promise that similar ‘coin’ will be forthcoming to the most recent participants.
This model almost continues to hold up if the monetary system around journals is taken into account. Doing good research, bluntly, is expensive and is becoming more so. In the majority of cases, one must then pay either to view or contribute to the journal – analogous to the contribution of electrical and processing power to blockchain creation. And – like blockchain – these costs, and the costs of conducting high quality underlying research, concentrate contributions into a smaller and smaller group of centres (universities) capable of getting a paper into a prestige journal as the proof of work becomes more arduous.
Of course, I should add that reputation economies have a literature of their own – though this year Cory Doctorow neatly described how much worse such an economy would be. And there is an appealing parallel to Google PageRank, and to the way that the SEO industry developed around this. Furthermore, Cameron Neylon has been mapping academic publishing to actual economic theory.
The black box(en)
Adding references adds to transparency – it’s why I’m adding all these links so you can see I am not making this all up. But citation indexing actually adds a couple of layers of opacity to our understanding of the scientific process.
The first is perhaps more a matter of obscurity than opacity – how do journals get selected to be on the index? Eugene Garfield would claim that this is the correct use of the Journal Impact Factor (the number of index entries received by articles published in that journal during the two preceding years, divided by the total number of articles published in that journal during the two preceding years – which requires, circularly, that the citation index entries happen in a journal that is already on the index!). But he cites a rather telling alternate explanation:
These days, both Web of Science and Scopus offer detailed criteria, predictably preserving this beautiful circularity alongside sundry cultural and language barriers. Basically, good journals are good because they are like other good journals.
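(As a back-of-the-envelope illustration of the impact factor arithmetic described above – all figures invented:)

```python
# A hypothetical journal's two-year Journal Impact Factor, as described above.
# The numbers are invented for illustration only.
entries_to_2014_15_articles = 1200   # index entries this year pointing at 2014-15 articles
articles_published_2014_15 = 400     # articles the journal published in 2014-15

impact_factor = entries_to_2014_15_articles / articles_published_2014_15
print(impact_factor)  # 3.0

# The circularity: those 1,200 index entries only exist because the citing
# journals are themselves already covered by the index.
```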
The second issue is simply how a reference becomes a citation index entry.
A reference is a polysemous thing – yes, it reflects a link between one piece of scholarship and “something else”, but it also holds contextual information (where in the paper is it?), intention information (am I being nice or naughty?), normative social information (is it the key reference in the field I am writing in?), specific social information (am I citing Martin Weller in the hope he’ll cite me?), transactional information (did Martin Weller cite me, so now I have to cite him?)…
All of this meaning is somehow compressed down into a simple, interchangeable (remember fungibility?) unit that can be combined with others – and this collection of units now somehow tells me something about academic quality. Hoeffel (the alternate explanation Garfield cites) attempts to sidestep this oddness by claiming it is all a proxy for what we all already know, but this is a proxy (as pointed out by Wilsdon and others) that has a deadly power in false precision.
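(To put that compression into data terms, a sketch – the field names are my own invention, not any real index’s schema:)

```python
# What a reference "means" in the text, versus what the index keeps.
# Field names and identifiers are invented for illustration only.
reference = {
    "citing_work": "10.1234/my-paper",
    "cited_work": "10.5678/weller-2014",
    "section": "Discussion",                     # contextual: where in the paper?
    "sentiment": "supportive",                   # intention: nice or naughty?
    "is_key_reference": True,                    # normative: central to the field?
    "social_motive": "hoping for a cite back",   # specific/transactional
}

# The citation index entry: all of the above collapses to a bare, fungible link.
citation_index_entry = (reference["citing_work"], reference["cited_work"])
print(citation_index_entry)
```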
So what can be done?
There are projects, such as Open Citations (formerly funded by Jisc) that sought to graph and link (via RDF) citations in Open Access literature. More recently, we have seen experiments in Semantometrics (another Jisc funded project) that seek to recognise the value of context in referencing practice in developing new indicators. CrossRef and open identifiers like ORCID improve data quality and add back some of the contextual information for those who wish to explore it. And there are “improvements” to impact factor style metrics made all the time.
On the other hand, James Wilsdon’s report for HEFCE – “The Metric Tide” – brought home in clear and uncompromising terms the human cost of reliance on oversimplified research metrics. And efforts exist to transfer the existing mess that represents practice around citation metrics to other forms of referent, such as research data.
Open access to research literature is key to all of the efforts to gain control and understanding over citation metrics. By allowing (along with appropriate formatting!) any paper to be indexed, it forces open one of the two black boxes that have held such sway over the past quarter-century of academic life.
The other, around the creation and destruction of meaning at the reference/index point of flux, requires rather more thought – and it will be difficult, if not impossible, to wean the educational superstructure off scientific-seeming measurements that add credence and objectivity to otherwise arbitrary decisions.
But what hath this to do with tillage – sorry, open education?
Quite.
In open education, we stand at the beginning of the discovery and “rating” revolution. Resources are becoming an academic achievement, and our only-too-human urge to see the maximum use made of our work is already leading to conversations about measuring reuse and “Amazon-style recommendations”.
We’re at the start. Academic citation indexes, and similar systems, are the end point. Proceed with caution.
This is – in the vaguest sense – a response to Martin Weller’s thoughts on “Open Education and the un-enlightenment” that I’m putting here just to stop people speculating that I’ve been replaced by a robot that only talks about citations and Yacht Rock. In deference to the post-fact world, I am not citing authorities in this post*.
Are we living in a “post-fact” world? And is this a new development, or a more prominent iteration of a previously identified trend?
From Brexit to Corbyn, Trump to Sanders, media commentators have been bashing us over the head with a half-brick labelled “post-fact” or “the new populism” as a way to give us a handle on a world in which the arbiters of information are continually under suspicion.
This itself represents a failure of higher education, as there already exists a theoretical toolkit to deal with this very state of affairs, and it has been widely taught at undergraduate level since at least the late 80s.
Post-modernism is what a group of academics in the latter half of the C20th decided to label our growing cultural distrust of the grand narrative, coupled with cultural-theory-informed (and, to a lesser extent, post-structuralist) approaches to analysing reality using techniques developed to discuss fiction.
What this has resulted in is a conspiracisation of reality – where explanations from those in positions of power are distrusted, and the tools and tropes of fiction are used to propose plausible alternate possibilities. If you like, it is the scientific method with the basis in scholarly literature removed and the concept of falsifiability abandoned.
I’m going to use the idea of the conspiracy theory as a way to understand what is going on, and I am using the term without an implicit value judgement. To me, a conspiracy theory:
Is a narrative of resistance – it implicitly distrusts received wisdom from any external source. It puts the case of a group party to hidden truth against a (postulated) far better resourced group that actively hides this truth, on behalf of an unaware mass population.
Is non-falsifiable – counterfactuals are simply distrusted rather than incorporated; significant counterfactuals are seen as a means to “suppress the truth”. Attack or ridicule are seen as an admission of culpability.
Is coherent – it sets up and positions itself with relation to dichotomous relationships, it attempts to explain a wide range of activity in a unified and non-contradictory way. (Conspiracy theories often use tools from fiction, such as plot structures, tropes, motivational fallacies and structural critiques of culture – in order to do this)
Is evangelical – ideas are designed to be spread memetically, codes and systems of reference are used to identify others and cohere a group culture.
Is plastic – the theory narrative will shift to encompass new ideas, or align with other theories. If a component of a theory is thoroughly repudiated, it is routed around.
Open Education itself is a conspiracy, if you like. Evil publishers are profiting from the unequal distribution of information; where they do take steps to address this, they are “openwashing”. They do this because they hate learning where a profit is not made. If we attain a critical mass, open education will replace the textbook publishing industry.
If you are sitting there thinking “who, us?”, ask yourself what information would falsify your belief in open education. What information would falsify your other beliefs?
So I’m making a semiotic shift by proposing another way to think about the “post-factual” within work on post-modernity, cultural theory and (in particular) the study of conspiracy theories – and I’m suggesting we also examine our own practice and beliefs with these tools.
In essence, we live in a culture that loves to tell itself stories – composed of similarly prolix sub-cultures. The sub-culture that constructs the most compelling narrative becomes the dominant culture, and other sub-cultures then develop opposing narratives that attempt to displace it.
Stories can be compelling without being true. And the fact that people choose to base their lives around compelling stories is neither unusual nor concerning. As a sub-culture (which I’m going to go ahead and put all of us gathered here in, though not on an exclusive mono-cultural basis** – structuralism! wheeee!) “the elite” privileges certain forms of “truth” within a narrative based on a deliberately developed high standard of proof.
I like high standards of proof, because I like being confident that other members of the “elite” sub-culture (that’s you, dear reader) will validate my contribution to a narrative. This is why it is so horribly hard to write this post without using references or appeals to authority – I have to get over the idea that the esteemed Prof Weller is going to trash my contributions.
Others do not have to, or indeed intend to, appeal to the “elite” sub-culture in constructing or contributing to a narrative. This doesn’t mean that proof or authority isn’t used, just that these may not be in a form that we are used to dealing with or responding to.
If we want to understand why we keep losing arguments (getting to the nub of the matter) we need to get better at understanding how these arguments work and how strategies to win them work. Or we need to come up with another form of argument that works for us better than it does for other people. Or we need to get better at widening our little group to include other people.
___________________________________________
* If you must, Fredric Jameson’s “Postmodernism, or, the Cultural Logic of Late Capitalism” is a useful starting point, David Aaronovitch’s “Voodoo Histories” is good on the nature of conspiracy theories, Hélène Cixous’ “Le Rire de la Méduse” is an underpinning set of ideas on multiple cultural narratives that more people should read, and any decent UK Cultural Studies anthology would be worth a look for a grounding in the ideas of the field.
** cos I’m an articulate extroverted middle-class able-bodied white heterosexual cis male western European with a university education that participates on a well-paid basis in an information economy – although this makes me FUCKING AWESOME at being a member of the “elite” it does not describe everyone who may subscribe to “elite” values here. Which in itself is a pretty brutal critique of “elite” culture…
(the bulk of this text is from a paper I wrote to support the work of the DCIP. I’m sharing it here in case anyone else finds it useful. All glaring issues and inaccuracies are my fault alone, please leave a note and I will update.)
Briefly, a citation is an in-text link to a reference in a list of references at the end of a work. Though there are some systems that focus only on citations (Harvard, Vancouver…) or only on references (ANSI/NISO, ISO/BS), within commonly used systems it is more usual to see coverage of both aspects alongside more general “style guide” material.
Many styles were developed around the requirements of particular publishers or journals, but have since expanded into widely used guidance. Some have been heavily commercialised, others are available to view online for free. There’s an argument to be made about open accessibility to what are, in essence, gateways to academic publishing – but here my focus is on openness in the sense of transparency. How, and why, are changes made to citation/referencing rules?
The bulk of this post is in the form of a list of citation/reference styles, alongside an indication of where they are currently used and the way they are administered. You’ll see (broadly) four categories of style administration:
Developed on behalf of a publisher or professional body by a specifically hired external author/editor.
Developed on behalf of a professional body or publisher(s) by a committee or other individual/group drawn from that body.
Developed by a standards organisation.
Unmaintained/consensus.
In the short term, if you wanted to improve or modify mainstream citation practice you would go via the two major standards organisations. Both – it could be argued – are overdue an update, and the mechanisms by which such an approach could be made are transparent and clearly defined. Both the NISO/ANSI and ISO/BS standards are likely to be relied on in the refinement of subject-area and, at a secondary level, journal-specific guidance. This would not be a speedy process, but with concerted lobbying it may be possible to achieve wide coverage for any changes in around five years.
However, there are two major obstacles to overcome. The first would be the near-impossibility of seeing complete coverage. Whilst the convergence of requirements towards a small set of standards has been an ongoing trend, there are many journals that – for unique reasons of specialism, or through sheer obstinacy – will continue to mandate specific presentational methods. These may include, but are not limited to, modifications of mainstream standards, previous versions of mainstream standards, or entirely distinct and unique methods. Short of contacting each “outlier” journal directly there would be no means of achieving complete coverage.
The second major obstacle concerns the likely development of research metrics over the next ten years. James Wilsdon’s “The Metric Tide” is simply the most prominent example of a trend away from an uncritical acceptance of citation-count based metrics – newer methods of analysis, such as semantometrics, examine contextual information gleaned from the position and sentiment of a citation. Citation (as opposed to reference) practice is primarily based on academic custom – changing ingrained habits could be very difficult indeed, and journals would likely be reluctant to depart from existing norms even if the “canonical” documentation of these norms was altered.
Those citation/reference methods in full
International Standards – these primarily deal with references, and may either be used directly by journals or inform the ongoing development of other style guides. As the projects of international standards organisations, these are openly constituted committees which are explicitly open to question and suggestion via well documented routes.
ANSI/NISO Z39.29 (last updated 2005) covers bibliographic references. This standard underlies other styles and is also used directly by PubMed/Medline. Note that JATS (another NISO standard) supports the XML markup of references in a number of styles. The standard is managed by committee/working group and suggestions are welcome via standard NISO contact details (nisohq@niso.org).
ISO/BS 690:2010 (last updated 2010) is titled “Information and documentation – Guidelines for bibliographic references and citations to information resources”. In the UK it is sometimes cited as the “Harvard British Standard”. The standard is managed by the ISO Identification and Description Committee (ISO/TC 46/SC), which can be contacted easily via the details on that page. ANSI provides the secretariat, and confusingly the named secretary (Todd Carpenter) works for NISO.
Citation styles – these could best be described as “conventions” rather than standards, though many have spawned wider style guides. In these latter cases, an invited editor will draw on other style guides in an attempt to be as inclusive as possible without being needlessly complex. The post by RD Harper on the “Chicago” process is instructive here on methods – note that “Chicago” and “Turabian” are aimed, as complete style guides, primarily at students. The “Vancouver” method is another outlier in that it has a close association with the ICMJE, with the maintenance of a committee alongside an invited author and strong links to the NCBI style guide.
Chicago – the Chicago Manual of Style is based on what is widely accepted as the “Chicago” method of citation (primarily author/date within parentheses, but there is also a footnote variant). The 16th edition of the manual (published in 2010) is managed by the University of Chicago Press and is aimed at general/student use. Russell David Harper was the invited editor, and offers an interesting perspective on the process of developing a style guide.
Turabian – A Manual for Writers of Research Papers, Theses, and Dissertations is a variation on the Chicago style [Author/date, and footnote]. The 8th Edition (published 2013) is also aimed at a general/student audience. It corresponds to the 16th ed of the Chicago Manual of Style, and each edition is managed by an invited editor. Originally developed by Kate Turabian, a former graduate school dissertation secretary at the University of Chicago, more recent editions have been managed by a range of editors and the 8th edition was updated by the “University of Chicago Press editorial staff”.
Oxford – The New Oxford Style Manual (sometimes referred to as the new “Hart’s Rules”) is known for a footnote/endnote citation style, though the manual also covers references. The current edition is the 3rd, which was published in 2016 and incorporates the 2014 version of “Hart’s Rules”. Anne Waddingham was editor-in-chief of the 2014 edition of Hart’s Rules; she is currently a freelance editorial consultant.
Harvard citation is a surname-and-date-in-parentheses method that refers to a common practice rather than a given publication. There is no central authority and “Harvard” does not constitute a full style guide. There is some doubt as to its origin (though an 1881 paper by Edward Laurens Mark is often given as the source), but it is not owned, and indeed is explicitly disowned, by Harvard University. This BMJ article offers a partial history of the practice.
Vancouver is closely associated with the work of the ICMJE, with their “Recommendations for the Conduct, Reporting, Editing, and Publication of Scholarly Work in Medical Journals” often given as an authority. It is an ordinal number parenthetical citation method. It links to the confusingly-titled NCBI publication Citing Medicine for referencing practice. The recommendations were updated in 2015, and are managed by an ICMJE sub-committee, with the standard ICMJE email address given as a point of contact. “Citing Medicine” complies with NISO Z39.29 and names Karen Patrias as the (invited) lead author of the 2007 second edition.
Legal citation very much exists within a separate world, and two common examples are included here for completeness only. OSCOLA, with an invited author and a clear mechanism for feedback, contrasts with the very commercial and closed auspices of the Bluebook.
Bluebook – “The Bluebook: A Uniform System of Citation” is the major US legal citation and reference guide, and has been developed on a commercial basis by four major US legal journals. The most recent edition (the 20th) was produced in 2015. Concerns about cost and access have led to the independent, openly accessible, Indigo Book, which offers a compatible style guide.
Major subject area guides – the APA and the MLA are two of the most widely used citation and referencing standards, used respectively across the majority of social sciences and arts/humanities subjects. These are style guides in the fullest sense, covering every aspect of academic writing marginalia. The development of these style guides is undertaken within the committee or secretariat structure of the society in question.
MLA – The humanities and arts equivalent to APA, the MLA Handbook as of 2016 is in its 8th Edition. The publication lead, Kathleen Fitzpatrick, is the Associate Executive Director and Director of Scholarly Communication at MLA. She offers an interesting perspective on the aims of the new guide in a recent article.
Other subject area style guides – generally much sparser than the “big two”, these are generally developed as a function of journal publication, and have more in common with guidance for submission. It can be difficult to find details of authorship, though the AMA and ACS (as the two largest in this category) are clearer about processes. At the other end, we are dealing with what are basically journal-specific guidance notes, and at this level it may be easier to encourage journals to adopt a standardised system where possible.
IEEE – The Institute of Electrical and Electronics Engineers’ “Editorial Style Manual”, and the parallel “Citation Reference”, are widely used in engineering disciplines. These appear to have been last updated in September 2009 by Deborah Graffox, who – at the time – worked for the IEEE press.
APS – the American Physical Society’s “Online Style Manual” is undated and appears unmaintained, though it mentions a previous edition in 2003. It is very difficult to see who was responsible for developing it; I would suspect the APS journals team, but the guide is not mentioned in their advice for authors, which instead points to two contradictory journal-specific guides that do not refer to the online style manual.
AMA – The AMA Manual of Style reached a 10th Edition in 2007. AMA is widely used in medical research. It operates an email address for feedback (stylemanual@jamanetwork.org) and is developed by an AMA authorial committee.
There are many other citation/reference style guides (I’ve heard numbers in excess of 2,000 bandied about) at subject, publisher and journal level. But these are the main offenders.
[in which the citation practices of the inventor of the worldwide web and the co-founder of OpenLibHums are compared – and I attempt to interest people in data citation.]
Tim Berners-Lee’s “Information Management: A Proposal” is “grey” literature (an internal memo at CERN, never formally published), and nonetheless one of the most influential and highly cited documents in history. However, in all of the published academic work devoted to this short memo, Sir Tim’s citation style has never received the critique it deserves.
Until now.
In-text citation broadly comes in two-and-a-half flavours – either the entire reference appears within a footnote or endnote (Oxford style), or a parenthetical key – using ordinal numbers (Vancouver) or author surname and date plus an identifier if needed (Harvard) – directs the reader to the appropriate part of a list of references at the end of the work.
Sir TBL’s innovation was to provide citations as short text, combining the ease of location of Vancouver with the immediacy of Harvard. Many of his references are to resources where an author name is not present, or where significant additional text is required alongside a (broadly ACM-style) reference.
For instance, [HYP88] refers to a special edition of “Communications of the ACM” entitled “Hypertext on Hypertext” written in Hyperties syntax and sold on floppy disk, containing the full text of eight papers from the July edition of Comm ACM.
(Not quite the first e-Journal, though: that was Richard H. Zander’s Flora Online in 1987, available by subscription on disc, and free via a BBS. This was primarily a directory of source code for academic software in botany.)
But back to TBL’s citation style. If we see a citation as working on two levels – both as an immediate aide-mémoire to those cognisant of the current state of a given field (a surname and a year being enough to indicate the work in question), and as a way for those less familiar with the field to find the article in question (by looking it up in a list of references) – his approach can be seen as a neat way of achieving both in a field where resources may be well known but citation is not a situational norm.
As is, of course, hypertext, which combines three levels of information that can indicate a source: the semantic context of the surrounding text, the raw URL viewable via a mouseover, and (hopefully) the full resource – or at least a means of obtaining it – via the link.
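(For comparison’s sake, a toy rendering of the two-and-a-half flavours alongside a TBL-style short key – the reference number and key here are invented:)

```python
# One reference rendered as three kinds of in-text citation.
author, year, number, short_key = "Bush", 1945, 3, "MEMEX45"

harvard = f"({author}, {year})"      # author-date, resolved via the reference list
vancouver = f"[{number}]"            # ordinal number, resolved via the numbered list
tbl_style = f"[{short_key}]"         # TBL-style mnemonic key: readable *and* locatable

print(harvard, vancouver, tbl_style)  # (Bush, 1945) [3] [MEMEX45]
```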
Kathleen Fitzpatrick (@Kfitz on twitter) considers the place of the academic citation in a world of hypertext in a recent article for the LA review of books. As the lead editor of the most recent edition of the venerable MLA Handbook, she is well placed to reflect:
All of our current citation formats were invented for a print-based universe, in which each book or article gave the impression of standing alone. Bibliographic notes and markers connect these many individual texts into a broader, ongoing conversation. But now that we live in a world in which no text need be an island, in which scholarly publications are increasingly delivered digitally and so can be literally interconnected via links and embeds, it is reasonable to ask whether citations are still necessary.
Her conclusion draws on the need for a future scholar to understand not just the name and authorship of the work referred to, but also the precise version of said work:
[P]ublications and other cultural objects are no longer quite as fixed in format as they were, and their very malleability may heighten the importance for future scholars of knowing precisely which version today’s researcher consulted.
A problem Martin Eve will know well, from his examination of multiple versions of David Mitchell’s Cloud Atlas. It turns out there are two substantially differing versions of the text of the novel, stemming from the UK and US editions and distinguishable by place of publication and ISBN.
We have unique identifiers for texts in the form of ISBNs but we have become complacent about assuming that all editions are equal on first publication. When we write of ‘Cloud Atlas’, to what are we referring? Is it the textual edition cited in the bibliography? Nominally, yes, but more often the assumption is that we mean this to refer to the ur-structure, the named entity of a text that is ‘the novel’.
(If you are wondering how he dealt with the two versions of the text within the paper – which he references as “Mitchell, D (2008). Cloud Atlas. London: Sceptre” and “Mitchell, D (2008). Cloud Atlas. New York: Amazon Kindle” – he uses ‘P’ for paper and ‘E’ for e-book in discussion, with what amounts to occasional full references in text as citations(!): the former as “(Mitchell, Cloud Atlas [Sceptre, 2008], ‘P’)”, while cunningly never referring to the latter in the text of the paper – but I’d guess it would be “(Mitchell, Cloud Atlas [Amazon Kindle, 2008], ‘E’)”.)
Clear?
So neither links nor citation/referencing are ideally adapted to dealing with such issues – although Tim Berners-Lee might have elegantly sidestepped all this by using [CLOUDATLASPRINT08] and [CLOUDATLASEBOOK08] in text 🙂
This is bad news for those submitting to a journal which rigidly applies a citation/reference style (which is most of them). And it is not really an issue that EndNote, Zotero or Mendeley (those masters of arcane styles) can help us with either.
The reason I am concerning myself with such arcana is that – I’m really boring – the citation of research data (of which there may be multiple versions) presents similar problems, except all the time rather than in rare cases like Cloud Atlas. With a paper comparing sensor readings taken milliseconds apart, conventional citation is just not going to cut it – which of course is why we use unique identifiers (though there is an issue in indicating relations between datasets, which is where PCDM comes in…). It’s another reason why we cannot simply bring traditional citation processes over to datasets.
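To make the dataset point concrete, here is a minimal sketch of what a version-aware data citation might look like. The record shape loosely follows the familiar creator / year / title / version / publisher / identifier pattern recommended for data citation; the dataset, the repository and the DOIs are entirely made up for illustration.

```python
from dataclasses import dataclass

@dataclass
class DatasetRef:
    """A made-up, minimal record: the version and identifier travel with the citation."""
    creator: str
    year: int
    title: str
    version: str
    publisher: str
    identifier: str  # ideally a persistent identifier such as a DOI

    def citation(self) -> str:
        return (f"{self.creator} ({self.year}). {self.title}. Version {self.version}. "
                f"{self.publisher}. {self.identifier}")

# Two "releases" of the same notional sensor dataset, distinguishable only by version
run_a = DatasetRef("Smith, J.", 2016, "Sensor readings, run 42", "1.1",
                   "Example Data Repository", "https://doi.org/10.1234/example.v1.1")
run_b = DatasetRef("Smith, J.", 2016, "Sensor readings, run 42", "1.2",
                   "Example Data Repository", "https://doi.org/10.1234/example.v1.2")
print(run_a.citation())
print(run_b.citation())
```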
[For those jumping up and down and shouting “The HE&R bill!” at me – patience. It’s not law yet, and I’ll get to it at the end.]
So just what is a “higher education institution” in the UK? Section 71 of the 1992 F&HE Act currently defines a HEI as one (or more) of these three things:
a designated body – designated by the Secretary of State where its full-time equivalent enrolment number for courses of higher education exceeds 55 per cent of its total full-time equivalent enrolment number.
Point three gives us the clue that an institution will not be a HEI just because it delivers higher education. Indeed, even if it delivers only HE, it will not become a HEI unless designated as such by the Secretary of State.
At this point, it may also apply for teaching and/or research degree awarding powers, currently via the QAA. And also for university status (currently via the QAA and Privy Council). And also for institutional designation for HEFCE funding and student fee loans. And it may submit an access agreement to OFFA so it can charge fees up to the maximum amount. Or, in any of those cases, it may decide not to. But it needs to be a HEI in order to even think about doing these things (unless it happens to be the Archbishop of Canterbury).
Prior to achieving HEI status, an institution will have one or more designated courses (eligible for student fee loans and, if applicable for the subject of study in question, HEFCE funding). The degree courses it delivers will be validated by a HEI partner, sub-degree courses may instead be validated by Pearson.
If you want to know the status of a particular institution as regards these various attributes, all of this is reflected on the HEFCE register of HE Providers. As long time readers of “Followers…” may recall.
The HE&R Bill takes some steps to clear up some of this mess, notably combining HEI status, HEFCE funding designation, T/RDAP and University title into a single process, and making the access agreement a mandatory component.
It took a while for this to occur to me as well – but all of these lists, designations and attributes have one thing in common. They refer to the “teaching” end of Higher Education. So what’s the deal for research?
Of course, there is nothing stopping anyone from conducting research into anything they like in any way they choose. Indeed, this may be paid for by any number of bodies for no other reason than they think that the institution/organisation in question might be quite good at it.
The two jewels in the research funding crown, however, are QR funding and research council funding. How does an institution get access to both edges of “dual support”?
QR funding is the stuff that is linked to the REF – you have to be in the REF and get the required score in order to get the money. It’s up to each institution to decide whether to enter the REF and to work out whether the return via QR plus the reputational benefit is equal to the effort put in.
The entry requirements for the REF itself are surprisingly hard to find. The long and terrifying list of FAQs doesn’t shed any light, and my best guess from the language in the guidance is that only “Higher Education Institution” status is required, as per the F&HE Act 1992 Section 71 definition at the top of this post (and as far as I am aware, the Archbishop of Canterbury didn’t submit to REF2014). No private providers submitted to REF2014 (not even the University of Buckingham), but I can’t find anything in the documentation to say that they couldn’t.
There is one main anomaly. The Institute of Cancer Research has designated courses at postgraduate level via the University of London and is not a HEI – but it DOES submit to the REF. I’m also unsure of the status of the Institute of Zoology (it claims to be HEFCE funded and submitted to the REF, but isn’t on the HEFCE register – I think it hides under UCL somewhere).
And Writtle College (shortly to become Writtle University College – many congratulations!) wouldn’t have had TDAPs (or University College status) at the start of the REF 2014 period. However they were designated as an HEI in 1994 so were eligible to submit to the REF from that point.
So you’d think that HEI status (or being the Institute of Cancer Research) might also be your path to research council funding. And you’d be wrong.
There are three groups of institutions eligible for research council funding:
Institutions in receipt of grant funding from a UK HE funding council (like HEFCE) can apply to any research council. Though the wording is unclear, this means designation at an institutional level (and thus eligibility for HEFCE QR in England).
Long-term established research institutes can apply to any research council. These are places where the research councils have made a long-term investment, and where they are a primary funder.
Independent Research Organisations (IROs) can apply to one or more research councils as agreed in either a “responsive” (whenever they decide to) or “managed” (with permission on each occasion) mode.
To become the latter, there are some fairly detailed eligibility criteria. An IRO needs to be a charity or similar – a legal entity that is not owned by a business, nor run for primarily research purposes by any part of the public sector – and it needs to have the internal capacity to conduct research. Even beyond this, the bar for IRO designation is high: an assessment is made for a five-year period (automatically renewed if funding is awarded) by the Grants Governance Committee across all research councils, even where designation is sought for a single research council.
Though IRO status is a high bar, it is not as high as the designations currently offered by HEFCE – certainly organisations as diverse as the Tate, RAND Europe and the Institute for Fiscal Studies currently hold IRO status but would absolutely not be eligible to become a HEI.
But so what? you may ask.
Remember that the QR funding “Research England” side of HEFCE becomes part of UKRI (the research councils’ umbrella body) as part of the HE&R Bill. Why would one sub-committee of an organisation run a completely different set of eligibility criteria to all of the others? Why would an institution need OfS designation to get QR funding, but not to get a research council project? Why would one part of UKRI run a competition only for a subset of the institutions funded by all the others?
The thrust of modern HE policy has been towards simplification of regulatory structures – what would make Research England the exception?
If you are that rare breed, a research manager in an FE college strongly committed to research (that probably delivers some HE on the side), I think you should be having a careful read of that IRO document…
1. [What’s the difference between a University and a University College? Size. A University has, since 2012, had to have more than 1,000 FTE students. Before 2005 you needed 4,000 FTE and RDAPs, with the latter requirement lost at that point.]↩
Meanwhile, every pathetic racist in the land feels emboldened to make life miserable and uncomfortable for everyone that looks even mildly foreign. Scotland wants out of the UK if it means leaving the EU, as does Gibraltar, as does Northern Ireland.
The wheels are already coming off the Vote Leave promises, there are widespread reports of “buyers remorse” amongst leave voters, and the heroes of the hour (Gove, Johnson Major, Farage, Carswell) are conspicuous by their absence.
And this is T+4 on our “independence day”. In Book of Revelation terms (which is really the only viable comparison we have) we still haven’t got to the end of the letters to the seven churches – and this with the unlikely figure of Nick Clegg as St John of Patmos.
There are so many questions, and so few answers, I feel justified in stepping into the bounds of probabilistic interpretation whilst (hopefully) dropping some constitutional science on those assembled.
What almost certainly won’t happen…
A rerun of the referendum. Yes – there were lies told. But movements to legally challenge the referendum result, and/or to require a new one (perhaps on modified rules), are doomed to failure. The reason is simple: the referendum wasn’t legally binding. Parliament is sovereign, so this and all other UK referendums (with the exception of the 2011 AV referendum, which had specific provision in this regard) are advisory only. That said, it is pretty clear advice that would be politically difficult to ignore…
A speedy negotiation and withdrawal. To leave the EU, parliament has to empower the Prime Minister of the day to invoke Article 50 of the Treaty on European Union (introduced by the Lisbon Treaty). We know now that we won’t have a prime minister (to all sensible intents and purposes) in post until September at the earliest. And even then, both houses of parliament would have to agree (by majority) to grant this power.
As both houses have significant majorities opposed to leaving the EU, and as a person who may well become prime minister (I know, I’m sorry…) has already expressed a desire to take things slowly, this would be no mean accomplishment. And though the EU may demand speedy resolution, not least to settle the uncertainty that will likely affect economies around the world, it has no power to force Britain to invoke Article 50.
Those who voted to “take control” may reflect here that this is possibly the last point at which we have control over our relationship with Europe, so, given our complete lack of preparedness to negotiate, time is very much not of the essence.
Anarchy in the streets. Angry citizens on both sides will march, chant and campaign. Millions of people will sign entirely useless petitions. Some idiots will be violent and deeply unpleasant, and they will be (rightly) arrested and imprisoned. But a population that looks to the government to protect it from the consequences (devaluation, economic slowdown, job losses…) of the referendum is unlikely to simultaneously enact full communism/a neo-fascist junta.
What probably won’t happen…
Brexit. Boom. I went there. For the reasons given above, I would be hugely surprised if Britain ever uses Article 50. And as there are no other ways to leave the EU, that would very much be that. Practically nobody in government wants to actually leave the EU; even Boris has been clear all along that he wants to use a vote to leave as a bargaining tool.
As we progress through a long, painful summer, it is likely that the public mood will tend towards “bregret” as the bad news continues. A new PM in September may opt to seal the deal via a snap election won on an explicit anti-brexit stance – possibly with some concessions from the EU to sweeten the deal.
Pariah lawmaking. As the international community attempts to navigate already difficult and delicate geo-political and economic waters, the need to punish and isolate the UK is unlikely to be at the forefront of its mind. The remorseless logic of capital, and the “function” of the market in identifying and pricing in risk, will make things far worse for us Brits; punitive policymaking and sanctions from other nation-states or international groupings are very unlikely to.
However, there is a small chance that, if the brexit vote sparks a wave of similar referendums in Europe, the EU may wish to make an example of the United Kingdom – though I’d guess it is more likely that we would already be a cautionary tale thanks to the invisible hand.
What probably will happen…
A UK general election in 2016. Yup – I heard you liked politics so I put some politics on your politics. A general election (already hinted at by Cameron and others) is our grand brexit get-out-of-jail-free card, and an incoming prime minister is likely to want a more recent “remain” mandate to counter the “leave” vote.
A new political landscape. Both major parties are fractured and in disarray – the Tory “leave” right has far more in common with UKIP than with the modernising “remain” wing. It’s been the longest relationship break-up in history but I’d be surprised to see an intact Conservative party as the clamour to remain grows.
Ditto, alas, Labour. Corbyn’s determination to hang on till the bitter end highlights the difference between the labour left and the centrists, who themselves probably have more in common with the Liberal Democrats or – indeed (whisper it) – the Tory centre.
What almost certainly will happen…
More chaos: of the economic, cultural and political varieties. Also sporting. Despite these reading almost like predictions, no-one has any idea what will happen or when. It will be an unsettled summer, reflected in financial markets and political paralysis.
Scottish independence. Sorry, England, but we’re not going to get away with this again. Even without a brexit, it’s clear to everyone that Scotland and England are on very different trajectories, and that Scotland can no longer trust a Westminster Government to act in its best interests.
Bonus round
Stuff that I am interested in but I have literally no answers to yet includes:
The very dodgy legal ground that depriving individuals of their EU citizenship rights would be on. We generally don’t restrict people’s rights (these days, he says, drawing a veil unconvincingly over our nasty imperial past) unless they are criminals.
Effects on European Central Bank investment: despite us not being in the Eurozone, the ECB invests heavily in British industry. Will it continue to do so during this interregnum?
The now legendary British Bill of Rights and the UK leaving the ECHR. I’m assuming Brexit panic will give the government the chance to finally abandon this insane idea, but I also have a horrible feeling it may be a sop to ardent “give me my country back” folks in the event of a non-brexit. Please no.
It’s been a quarter of a year since I last turned white pixels black with the aim of getting some sense out of the ongoing EU referendum debate. In the intervening time, both sides of the debate have opted to abandon theatre and scaremongering, simply stating facts soberly, and with the barest minimum of interpretation, in order that the Great British Public can make an informed choice come June 23rd.
As each side describes the other as desperate… desperate… (it’s the insult du jour, folks – the worst thing is being seen as wanting to win so badly you tell your supporters what they want to hear, be it that Grandma needs to be put in a home or that it’s all a massive gubmint conspiracy) a cavalcade of desperate – in many senses – voices are wheeled out for our edification.
There are people in high places in Britain who believe in all seriousness that a word or two from popular crowdpleasers like George Osborne and Michael Gove would affect the thinking of the populace. There are those who consider Kate Hoey and Jeremy Corbyn compelling public speakers.
Any debate that pits Jeremy Clarkson against our own Donald-Trump-with-a-Latin-A-level man of the people Alexander Boris de Pfeffel Johnson should by rights be one to savour, but when something as compellingly awful as Boris’ various attempts at alternate-future journalism is met with nary a shrug, we have to consider what else is going on.
The purpose of the Referendum, lest we forget, is to heal a decades-old rift in the Conservative party. Blue blood, blue Audi and blue rinse must come together as one.
And we already know it won’t work. The referendum has already failed before a single vote is cast.
Cameron has been promising a referendum for more than 10 years, both to fend off the forces of senile xenophobia in the form of UKIP and to placate his own fractious backbenches – now it appears he’ll lose either way. Meanwhile the rest of the country looks on in appalled fascination, contemplating arrant nonsense about either TTIP or mass immigration, depending on personal preference.
If you’d indulge me in a little critical theory, the various competing meta-narratives are only congruent in brief moments of intertextuality – with the sheer immanence of the spectacle itself a reflexive attempt to unify a fractured discourse. Or, as we say on Teesside: I canna be doin’ wi’ this bag o’ doyles, I swear down.
Two tribes
A wonderful survey by YouGov back in March demonstrated that it was possible to stereotype each voting tendency with a high degree of confidence.
If you are over 50, are of social groups C2, D and E, live in Yorkshire, East Anglia or the West Midlands, and have no formal education beyond the age of 16 – you are more likely to vote leave. Whereas if you are under 39, are of social groups A, B and C1, live in London, Scotland or Wales, and have a university degree – you are more likely to vote remain.
One can, and nearly everyone does, read their own prejudices into these gaps. There are equally profound party splits, with Conservative and UKIP voters more likely to vote leave, and Green, Liberal Democrat, SNP and Labour supporters likely to vote remain. (Incidentally, men and women are both split around equally on the subject)
But though it is clearly an interesting survey and one worthy of further study, it has exacerbated the already deep split in the electorate. It is not one ideology about the world and Britain’s place in it against another, it has become the old versus the young, the rich versus the poor, the university graduate against the labourer. Dangerous stuff.
How much?
By now, even the most avid news-avoider must be aware that the £350million a week figure being bandied about by leave campaigners is nonsense – it’s the equivalent of saying that a pint costs £20 because that’s the size of the note you gave the barmaid, whilst ignoring the change that you get back and, indeed, the value of the beer alongside the other delights of the pub.
In gross terms we are talking about a membership fee of £250m/week. In net terms our national contribution is £120 million a week, once our rebate, other EU spending in the UK and the amount we count as part of our international aid spending are taken into account.
Even this £120 million does not include any calculation of the other benefits we get from EU membership, such as an increase in foreign direct investment (the estimation made by the linked paper suggests that this outweighs the net membership costs by a factor of 10!) Or the savings that transnational companies make due to unified trading standards. Or… well, you can take it from here.
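For the avoidance of doubt, here is the back-of-a-beermat arithmetic using only the figures quoted above – the “change” line is simply the difference implied by those figures, not an official statistic.

```python
# The pint analogy, in numbers from the text above: the "price" quoted on the bus is the
# note handed over, before any change (rebate, EU spending in the UK, aid accounting) comes back.
headline_weekly = 350   # £m/week – the figure on the side of the bus
net_weekly = 120        # £m/week – the net contribution quoted above

change = headline_weekly - net_weekly  # what comes back, one way or another
print(f"'Change' handed back each week: ~£{change}m "
      f"(so the bus figure is ~{headline_weekly / net_weekly:.1f}x the net cost)")
```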
Coming over here, running our essential services…
I don’t even want to talk about immigration at this point – it’s depressing.
I could write paragraphs on how much the NHS benefits from migration, how reliant many of our industries are on low-paid migrant labour, and how economic migrants work hard doing the jobs we don’t want to do or are unable to do – from Professor of Theoretical Chemistry to central London barista – whilst paying far more in taxes than they ever receive in social security.
It wouldn’t make the blindest bit of difference.
Neither would a few hundred words on the plight of Syrian refugees, asking little more than the right to live in a house that hasn’t been bombed – leaving the world they had built lives and families in with literally nothing, forced to start afresh, drawing on only the skills (often including fluent English) and determination they themselves offer. As boats began to attempt to land on the east coast, one of our most popular politicians was most scared of human corpses despoiling our beaches.
If you are voting to leave to stem the flow of immigration, you can be assured that it won’t work. Actually – if you are lucky, it won’t work… we need the skills of foreign nationals to manage our country.
—
This is only going to get worse. There will be more Brexit nonsense, and less decent reporting to help you make up your mind. Here are a few of my favourite academic (because most readers of this blog tend to be academics) sources of analysis:
The always excellent LSE blogs have a European Policy portal, which covers more of the soft social science end of the debate. CEPR run VoxEU, which is great for economic analysis. Oxford have an excellent international politics blog. Based at King’s College, the “UK in a Changing EU” blog is a fantastic initiative. “The Conversation” have an EU referendum portal which, as with the previous example, is generally an easier read though not always of the same high quality.
In a more general context, a key question that still needs to be addressed is the origin of the [long-range correlations], which are common in a variety of systems. The underlying system must be sufficiently complex, described by a nonlinear differential equation (or many of them), and there must be a proper amount of feedback. However, the origins might largely variate from system to system, and it is difficult to generate universal models that could qualitatively describe, e.g., heart-beat intervals, magnetoconductance oscillations, and drumming intervals in the same footing.
So as the message discipline and the mask of respectability falls away, we might view this as a Democratic opportunity. People who think that words matter– that they should be used responsibly and not to manipulate people through subtle emotional cues embedded in euphemisms and dysphemisms–can celebrate the loss of [Frank] Luntz’s influence.
Frank Luntz, and men like him (in the UK you could look at Philip Gould or Peter Mandelson, there’s also Lynton Crosby, Karl Rove, Jim Messina … ) can be seen as the last of a dying breed of political messaging specialists or “spin doctors”. The great, devastating political campaigns of the 90s and 00s were successful only in their own terms – to the outside observer they led to a parade of “machine politicians” who sought power by surrendering ideals.
Luntz (and the others) worked by means of the focus group. Thousands of hours of recorded conversations gave them an insight into terms and language that “played well”; often the language that later appeared on billboards and in interviews started in the mouth of an ordinary member of the public, and its return via the ears and eyes was orchestrated precisely to bring about a “resonance” based on the repetition of language already perceived as “common sense”.
In other words, phraseology such as that used in the image above (“It’s not racist to impose limits on immigration” – in that case explicitly rendered in a “personal” hand, though amended, unofficially, in another) reinforces what was identified as an underlying pulse of popular discourse.
Repetition amplifies the sentiment – but it will never feel entirely natural. And the power of repetition relies on exact repetition, requiring a huge amount of message discipline. The latter has become a politico-industrial pseudo-science, devoted to the idea of communication without any of the communicative (empathetic) aspects.
So these nuggets of distilled phraseology are seen as a way to make a minimum viable impression on a carefully selected target market. Though the use of “found” phrases (from research or focus groups) is common, these are generally decided on centrally within an organisation before being fed out to often-nonplussed adherents and staff.
Effective? Possibly, in the short term. But, as the rise of Trump (and, indeed, Sanders) and the continuing bewildering relevance of Boris Johnson assert, perhaps an idea that suffers under the attentions of competing narratives of popular influence.
The theory of message discipline addresses both the quantity and quality of messages – not only must messages be carefully aligned to the language of the target group, but they must be presented uncluttered by other messages. (Lyotard fans should be pricking up their ears round about now.) A focus on a few simple messages striates the communicative space, but very broadly and with significant liminal possibilities for a demagogue to exploit. The larger and “broader” the grouping, the less likely a message discipline approach can capture the full spectrum of opinion and emotion, and the more likely that an off-message individual can find underutilised resonances to exploit.
This year’s GOP primaries demonstrated not just one (Trump), but a number (Cruz, Carson…) of counter-message candidates who were able to exploit a distrust of such a poorly-expressed and tightly constrained narrative from an “establishment” (itself a loaded, and counter-message, term). In Britain, Johnson’s opportunistic and self-centred embrace of Brexit can be seen as a similar attempt to capitalise on years of counternarrative positioning as bumbling, off-message Boris. Ditto the unexpected and unpredictable rise of Jeremy Corbyn.
Those of you who read the first quote above, and maybe the underlying paper (and you should!), may wonder where precisely I am going with this. The Trump phenomenon as anti-establishment posture has been done-to-death (alas not literally) all over the popular press. But I daresay none of them have considered fractal patterns within the hi-hat part of a Michael McDonald track in this context.
Jeff Porcaro is a machine. Seriously – it was his brother Steve who suggested the use of samples to power the legendary Linn LM-1 drum machine, and Jeff himself learned to programme one (notably and unmistakably on George Benson’s 1981 “Turn Your Love Around”). Basically any early 80s LA pop record that has ever made you think “wow… those drums…” – that was Jeff.
If you have any kind of a musical background you may now pick your jaw back up off the floor.
What I want to note here is both the fluid and utterly mesmeric way he can place any technique on any subdivision of the bar, effortlessly, every time – and the way he makes it sound so natural that you can’t help but move. Drummers are generally either technical players or groove monsters; Porcaro’s feel defined the early 80s because he managed to be both.
Last year (yes, I just said that so I could say “it’s been a year since they went away, Rasanen et al…“) a team of researchers analysed the timing and volume of that single-handed 16th-note hi-hat part and deduced that it very clearly wasn’t as exact as it initially sounds. Here are the numbers…
The A parts (the intro and verse) tend to slow down, the B parts (the chorus) speed up – both very slightly, but still measurably. This is done for musical reasons, to accentuate changes of mood in the song. Despite this you can still see a periodicity in the shorter spikes representing a “pushed” accent on the same 16th note of every two-bar phrase. This is the “long range correlation” which connects the precision of a virtuoso with the undeniable groove of a human being.
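If you want to poke at the method behind that finding yourself, here is a rough Python sketch of detrended fluctuation analysis – the standard way of estimating long-range correlation – run on made-up inter-onset intervals rather than the actual Porcaro data (which I don’t have). A scaling exponent near 0.5 means uncorrelated jitter; values approaching 1 indicate the kind of long-range correlation the researchers report.

```python
import numpy as np

def dfa(signal, scales):
    """Detrended fluctuation analysis: return the fluctuation F(n) for each window size n."""
    # Integrate the mean-subtracted series (the "profile")
    profile = np.cumsum(signal - np.mean(signal))
    fluctuations = []
    for n in scales:
        # Split the profile into non-overlapping windows of length n
        n_windows = len(profile) // n
        windows = profile[:n_windows * n].reshape(n_windows, n)
        x = np.arange(n)
        rms = []
        for window in windows:
            # Remove the local linear trend from each window, keep the residual RMS
            coeffs = np.polyfit(x, window, 1)
            detrended = window - np.polyval(coeffs, x)
            rms.append(np.sqrt(np.mean(detrended ** 2)))
        fluctuations.append(np.mean(rms))
    return np.array(fluctuations)

# Hypothetical stand-in data: inter-onset intervals (seconds) for a 16th-note hi-hat part,
# with purely random jitter – so we expect an exponent of roughly 0.5, unlike the real thing.
rng = np.random.default_rng(0)
intervals = 0.165 + 0.005 * rng.standard_normal(2048)

scales = np.array([4, 8, 16, 32, 64, 128])
F = dfa(intervals, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"DFA scaling exponent alpha ≈ {alpha:.2f}")
```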
Jeff could very easily have programmed the same part in, indeed the George Benson track above uses a broadly similar feel. But if he had, these micro-fluctuations in timing and power would be lost and the track would feel very different.
And no-one told him precisely what to play – he had a feel which he interpreted in his own way to the benefit of the song.
Message discipline could be compared to the hypothetical use of the drum machine: the human effect is lost, even though it can be closely simulated by expert programmers. Any movement, organisation or political party that designs in message discipline designs out the fluidity and freedom that allow for a virtuosic interpretation of values and ideals, to the detriment of wider goals. You get the precision, but what people really react to is the pocket – not a place where you hold a message but where a message gently holds you.