Baby’s first semantic blog citation metrics #oer16

Note: If you want to see semantometrics done for real, have a read about the Jisc-supported work by Petr Knoth and Drahomira Herrmannova. My approach is loosely inspired by their research – which I’ve been fortunate enough to see presented on a couple of occasions – but in terms of reliability, technical finesse and generally having a clue I’m very much a Robbie Dupree to their Doobie Brothers. I should also be clear that what follows is entirely my own effort.

In all honesty, and if we look at Vivien Rolfe’s superb systematic reviews of open education, we’d have to conclude that literature in the area leaves a fair bit to be desired. Faced with this I’ve heard it said on a number of occasions that “the good stuff is in the blogs”, and I decided that the time had come to test this.

If open education blogs do have academic merit, I would expect them to be cited in the more traditional literature, both around the subject each covers and further afield. This might seem circular – but as there are clearly gaps in the literature one might reasonably expect blog posts to be filling these.

For the purposes of this experiment (as presented at OER16) I looked at five blogs that I felt were consistently high quality, and that I had seen frequently referenced in conference presentations and similar:

Now the “scholarly graph” (the ways in which publications are interconnected by a network of citations) is notoriously hard to mine, unless you happen to be Thomson Reuters or are in a position to give them money. I am neither, so I needed to use the tools I had available to me, which are necessarily incomplete and variable in coverage.

Google Scholar is far from perfect, but it does let me search for references in a round-about way. What I did was search for the root domain of each blog as a text string, to generate a corpus of literature that (most likely) included a link or citation to a specific page of the blog. I then went to export details of the query, and found that a simple export to .csv is not possible. You can export individual records to a small selection of bibliographic software, but not the results as a whole.

I eventually found the marvellously named “Harzing’s Publish or Perish”, which appears to use some Tony Hirst-esque page-scraping magic to automate what would otherwise be a laborious task. I cleaned out duplicate records and self-citations (pages from the blog itself being returned, which happened quite a lot for popular individual posts), then rendered to the spreadsheet available here. (I also had a special issue with David Wiley – hey, don’t we all! – as many early papers cited his Open Publication License as a means of licensing their work. I got round this by searching only for citations to his “blog” directory.)

[I also searched for “edupunk” just out of interest – as this was a 2008 term coined to deal with a range of activities that included blogging instead of formal publication. To be honest, it didn’t tell me very much other than that the majority of publications on edupunk are in Spanish.]

Publish or Perish also returns a bunch of those research power statistics that occasionally come up in conversation – and I was tickled to play with the h-index for each corpus of papers I returned. The h- (or Hirsch) index is generally seen as an author-level metric, but it can be calculated to characterise any group of papers. In this instance it gives a reasonable measure of the kind of influence that the papers citing each blog have.

  • David Wiley – 23
  • George Siemens – 38 (! – this is very high for an education-related subject)
  • – 29
  • Martin Weller – 20
  • Audrey Watters – 12

All of these are – if you are the kind of person who cares about your h-index – pretty respectable showings. In particular it should be noted that Audrey is a journalist (and a damned fine independent one whom you should support!) who makes no pretense to be writing academic research or comment.
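For the curious, the h-index calculation is simple enough to sketch in a few lines of Python – the citation counts below are invented for illustration, not real data for any of the blogs above:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    # Sort descending, then walk down until a paper's citation count
    # falls below its rank in the sorted list.
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Invented counts for illustration -- not real data for any blog above.
print(h_index([10, 8, 5, 4, 3]))  # -> 4 (four papers with at least 4 citations)
```

The same function characterises a corpus of any size, which is exactly how Publish or Perish treats a set of search results as if it were one author’s output.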

[I should note here that I also looked at commonality between pairs of blogs – are any particular two likely to be cited together? The short answer is yes – people who cite George Siemens’ blog also tend to cite David Wiley’s blog. The data is on the spreadsheet.]

My next step was to find the five most common words used in the titles of the papers citing each blog, and compare these to the top five words used in the blog itself (using the standard list of English stop-words built into Voyant Tools). A proper look at this topic might take more words from each source, and employ a weighting based on the ratios between word counts, but at this point it was the Sunday evening before the conference.
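The same top-terms step can be reproduced outside Voyant in a few lines of Python. The stop-word list here is a tiny stand-in for Voyant’s much longer built-in English list, and the sample text is invented:

```python
from collections import Counter
import re

# A tiny stand-in stop-word list; Voyant's built-in English list is far longer.
STOP_WORDS = {"the", "a", "an", "and", "or", "of", "in", "on", "to", "for",
              "is", "are", "with", "as", "at", "by", "that", "this", "it"}

def top_terms(text, n=5):
    """Most common non-stop-words in a text, lower-cased."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return [term for term, _ in counts.most_common(n)]

# Invented sample -- stands in for a blog corpus or a list of citing-paper titles.
sample = ("Open educational resources and open pedagogy: OER reuse by students. "
          "Students remixing OER in open courses.")
print(top_terms(sample))
```

Run over both the blog text and the citing-paper titles, this gives two five-word lists per blog to set side by side.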


It’s possible to do a manual sort to look for any interesting patterns – in this case what struck me is how often citing papers talked about technology-related issues or *shudder* MOOCs, whereas the blogs were more likely to consider students or use terms like OER.


I decided to automatically compare titles to the list of common terms from the blog in question using a simple Excel formula. I took the average of an indicator of whether each title contained one of the common terms, to create an index of semantic prediction – basically, a higher number (with 1 being the highest possible) means that the terms commonly used in the blog were likely to be found in the titles of papers citing it. Here’s how that stacks up:

  • Wiley – 0.70
  • Siemens – 0.77
  • Connectivism – 0.58
  • Weller – 0.58
  • Watters – 0.12
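The Excel average-of-indicators amounts to the following – the terms and titles here are invented stand-ins for the real corpora, and the crude substring matching mirrors the quick-and-dirty original:

```python
def semantic_prediction(titles, common_terms):
    """Fraction of titles containing at least one of the blog's common terms.

    1.0 means every citing paper's title uses the blog's vocabulary;
    values near 0 suggest citations from semantically distant fields.
    Matching is crude substring containment, as per the quick Excel approach.
    """
    terms = [t.lower() for t in common_terms]
    hits = [any(term in title.lower() for term in terms) for title in titles]
    return sum(hits) / len(hits)

# Invented stand-ins, not the real corpus.
terms = ["open", "oer", "learning", "course", "students"]
titles = [
    "Open educational resources in teacher training",
    "Measuring student engagement in online courses",
    "A history of the printing press",
    "Learning analytics and OER reuse",
]
print(semantic_prediction(titles, terms))  # 3 of 4 titles match -> 0.75
```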

Now Petr Knoth and Drahomira Herrmannova postulate that a greater semantic distance between the citing paper and the cited resource implies a greater contribution to knowledge – works that bridge disparate areas of research could be seen as more valuable as new knowledge has been synthesised via a new connection formed (kinda rhizomatic, wouldn’t you say?)

By this measure Audrey’s work can be said to have the greatest contribution to knowledge, with David and George much more likely to be cited in the field in which they write. Whether this tells us anything meaningful in the grand scheme of things is open to question, not least because of the general shonkiness of my methods – this being just a first look which has much scope for improvement.

It also made me think about Cameron Neylon’s concerns about our poor understanding of the nature of citation. Citation is one of those things that seems simple at first thought, but has a huge layer of social and cultural practices built on top of it. Unless we have a better understanding of the reasons for each citation (as quote source, acknowledgement, attribution of ideas or methods, cultural norm in a domain of research…) we can’t really assume that all citations carry equal weight – though most serious citation metrics do so. Some of this may be categorisable using Knoth and Herrmannova’s deep text analysis alongside a carefully designed system of categories and indicators (another reason I am watching that project and other citation experiments with huge interest).

So – to answer my initial questions:

  • is all the good stuff in the blogs? There is clearly a lot of good stuff in blogs, which is frequently cited by literature that itself is highly cited. I’d love to look at other areas of research with similarly important blogs to compare – any suggestions would be welcome!
  • are blogs cited only within the domains they write in? Broadly no, though some blogs are more often cited in the domains they most closely identify with than others. Though Audrey’s blog citations showed a low level of semantic prediction, this may be because there were fewer citations overall, and I would like to refine the metric to account for this (possibly via some form of sampling?)
  • is this interesting enough to look at further? Absolutely!

Here are the slides from Viv’s and my presentation at OER16

The Chain

In the 90s, actuaries (and actuarial science) were at a bit of a low ebb as all kinds of public and private institutions had issues with pension liabilities. The actuaries – whose job it was to make detailed predictions of the likely long term demands on these funds and compare these to what the funds actually had in them – had taken a decades-long jag of Panglossian assurance that all was for the best in the best of all possible worlds.

One of the odder things they did was to allow fund managers to claim that equities (stocks and shares) were more valuable than any other kind of investment (say government bonds, commodities, derivatives…). This was an accounting convention which allowed “profits” from returns on these equities (dividends or profits from sales, and gains in realisable value during the inflationary years) to be booked several years in advance – thus reducing the net liabilities of the pension funds.

By the mid 90s the flaws in this approach were beginning to show – and by the time of the equities crash in the early 00s (that’d be the dot-com bubble bursting) it was clear that something had to be done, and pension funds began de-risking – selling equities (at a loss) and using the proceeds to buy less risky products with smaller, but near-guaranteed returns.

Or so you’d think.

What actually happened in many cases was – having been burned by equities, and egged on by the interests of the defined-benefit pension-holders themselves – pension fund managers sought newer forms of investment that could offer a comparable return but at a lower risk. And a product was there to meet their needs.

David X Li, another actuary, began his financial career investigating ways of quantifying risk. Drawing on the copula function (specifically Gaussian Copulas) he came up with a way of predicting correlation between financial securities. Basically, he could turn the correlation between the likelihood of mortgage holder A defaulting and the likelihood of mortgage holder B defaulting into a single number – simple enough for investment bankers to understand.
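Li’s actual machinery was richer than this, but the core trick – turning a correlation ρ into a joint default probability via a Gaussian copula – can be sketched with nothing but the standard library. The default probabilities below are invented for illustration, not real mortgage data:

```python
import random
from statistics import NormalDist

def joint_default_prob(p_a, p_b, rho, n=200_000, seed=42):
    """Monte Carlo estimate of P(both default) under a Gaussian copula.

    Each borrower defaults when a latent standard normal falls below
    the threshold matching their marginal default probability.
    """
    nd = NormalDist()
    thresh_a, thresh_b = nd.inv_cdf(p_a), nd.inv_cdf(p_b)
    rng = random.Random(seed)
    scale = (1 - rho ** 2) ** 0.5
    both = 0
    for _ in range(n):
        z1 = rng.gauss(0, 1)
        z2 = rho * z1 + scale * rng.gauss(0, 1)  # correlate via a shared factor
        if z1 < thresh_a and z2 < thresh_b:
            both += 1
    return both / n

# Independent borrowers: the joint probability is just the product (0.1 * 0.1).
print(joint_default_prob(0.1, 0.1, rho=0.0))
# Same marginals with rho = 0.5: joint defaults become much more likely --
# rho is the single number that drove the CDO market.
print(joint_default_prob(0.1, 0.1, rho=0.5))
```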

Even though Li specifically warned against it, this (comparatively) simple output from a (largely poorly understood) formula became the driving force of the market for collateralised debt obligations – CDOs. (Basically, a CDO piles a whole bunch of debt into a big box, and then slices it into segments rated at different levels of risk. You can buy a low rate of return with little risk of default (the “senior tranche”), or a high rate of return with a much larger risk of default.) Li’s formula supported the generation of synthetic CDOs (sometimes CDO-squared), wherein the more-difficult-to-sell lower tranches were put into new boxes and resliced into tranches, thus generating new “senior tranches” that could be sold.
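The “big box sliced into segments” can be made concrete: losses on the pool eat through the tranches from the bottom up, so the equity tranche absorbs the first losses and the senior tranche is only touched once everything below it is wiped out. A toy waterfall (tranche names and sizes invented):

```python
def tranche_losses(pool_loss, tranches):
    """Allocate a pool loss bottom-up across (name, size) tranches.

    The first tranche listed is the riskiest (equity); the senior tranche
    only takes losses once every tranche below it is wiped out.
    """
    losses = {}
    remaining = pool_loss
    for name, size in tranches:
        hit = min(remaining, size)
        losses[name] = hit
        remaining -= hit
    return losses

# Toy structure: 100 units of debt, sliced junior-first.
structure = [("equity", 5), ("mezzanine", 15), ("senior", 80)]
print(tranche_losses(8, structure))   # equity wiped out, mezzanine dented
print(tranche_losses(30, structure))  # senior finally takes a hit
```

A synthetic CDO just runs this again with the hard-to-sell lower tranches as the new pool, minting fresh “senior” slices from junior risk.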

And that – it has to be said – ended badly for the pensions of the world.

So where next for your savings and pensions? What about something that combined the volatility and risk of equity with the anonymity, fungibility and algorithmic obfuscation of the CDO? What if we could combine these features with an entire lack of regulation or state backing, and – just for fun – a huge user base in illegal activity? Ready to fill your boots?

Such a product already exists.

There is significant global debate as to what species of financial instrument a “blockchain-derived currency” really is – a debate that has implications for taxation and licensing by governments.

  • Certainly it is tradeable against other currencies, with a widely fluctuating exchange rate – though it is backed neither by commodities nor a government/central bank.
  • With no extrinsic value, and no value backed by a known trustworthy party, it is clearly not itself a commodity, although it behaves as such.
  • It is issued by what could be considered a decentralised autonomous organisation in return for (computational) work conducted – work that is essential to the continuing viability of the organisation. But unlike equity it offers no voting rights or investor protections.
  • There is even some discussion as to whether or not it is a derivative.

What blockchain-derived currencies do offer is a lower administrative cost for transactions, where these transactions are simple. For on-network transactions there are no direct costs – though obviously the conversion to another currency at either end has a cost, and the “hidden” costs of power, connectivity and processing time are not factored in.

In terms of security, blockchains prioritise anonymity and encryption over direct trust. Indeed, “trust” is something of a dirty word – the industries that blockchain is slated to disrupt are generally those based on the need for trust.

Consider, say, a bank: it offers some anonymity (it shouldn’t disclose the contents of my account to anyone without my permission), and some transparency (I am able to read about the strategy of the bank at my leisure, and monitor activity via publications and meetings). But the main reason to use a bank is one of trust – I can (generally) assume that a bank will not do anything to jeopardise the money I have invested in it, and if this trust is misplaced I can (generally) assume that a central bank or nation state will ensure that my losses are minimised.

By amping up the anonymity (it is impossible for anyone to know the content of my account, or link it to me) and transparency (all transactions are publicly listed), blockchain technology removes the need for trust – for all I (or indeed anyone) knows, Satoshi Nakamoto could be anyone or anything, and the other blockchain participants could be thieves and scoundrels or possibly even libertarians. The Ayn Rand forum accepts bitcoin payments – *shrug*

Bitcoin – to use the most common example – is a deflationary currency. There will only ever be 21m bitcoins: as transactions become more widespread the intention is to stop rewarding “mining” (the generation of the cryptographic hash that is used to publicly record a transaction) directly with coins, and instead allow miners to receive transaction fees. This change will happen automatically and was designed into the bitcoin system.
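Incidentally, the 21m cap is not a constant written down anywhere – it falls out of the reward schedule, in which the original 50-coin block subsidy halves every 210,000 blocks. A quick sum (a simplified sketch; the real protocol counts integer satoshis and rounds each halved reward down) shows the geometric series converging at 21 million:

```python
def total_supply(initial_reward=50.0, blocks_per_halving=210_000):
    """Sum the bitcoin block subsidies across all halvings.

    The real protocol works in integer satoshis and rounds each halved
    reward down, which is why the true cap is slightly under 21m.
    """
    total, reward = 0.0, initial_reward
    while reward >= 1e-8:  # one satoshi, the smallest unit
        total += reward * blocks_per_halving
        reward /= 2
    return total

print(round(total_supply()))  # -> 21000000 (to the nearest coin)
```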

So – like CDOs – a smart algorithm has removed the need for me to trust anyone, and – like the old accounting convention around equities – an implication of guaranteed growth and return is undermined by deflationary reality. All of this is governed by the design of the algorithm – requiring (ironically) that we trust only in the wisdom of an anonymous coder that the various variables and correspondences have been set correctly.

Cue myriad screams of “this sounds AWESOME, how can we apply it to edtech?”

When silicon valley sneezes, edtech catches a cold about three-to-five years later. So if you’ve recovered from the “year of the MOOC” (yep, every year since 2012 it seems) then brace yourself for the first of many years of the blockchain! Hurrah!

Blockchain has provided a handily fashionable hook to attach increasingly moribund ideas like micro-credentialing and badges to. Of course we have to actually trust the issuers of the badges in question, ensure that badges can be rescinded if this trust breaks down on either side, and despite this somehow have a means to incentivise miners to keep growing the chain.

It’s fair to say that the blockchain is as poor a fit for micro-credentials as learning itself is – though this has not stopped various initiatives to combine the two. Despite this, there are genuinely interesting uses of blockchain technology which may have educational resonance: Ethereum is one that I see as being worth keeping an eye on more generally, and Mark Johnson is (as usual) thinking rich and stimulating thoughts.

Audrey Watters is researching this area in more detail, and I would highlight her ongoing struggles as a means of staying abreast of the tide of distributed anonymised ledger based nonsense that this year will bring.



FOTA Brexit nonsense update 1

I’ve a horrible feeling that there will be a few of these over the coming months – primarily because so much nonsense is going to be talked and so few facts will be checked. 

The practice of calling a referendum (or plebiscite) to resolve major political questions via a direct expression of democratic will has a surprisingly brief history within UK government. Only two UK-wide referenda have ever been held: the first was the 1975 vote on membership of the European Community (“Do you think the United Kingdom should stay in the European Community (Common Market)?”), and the second was the 2011 vote on changes to the voting system for general elections (“At present, the UK uses the “first past the post” system to elect MPs to the House of Commons. Should the “alternative vote” system be used instead?”).

A third referendum will be held on 23rd June this year, again on European membership (“Should the United Kingdom remain a member of the European Union or leave the European Union?”).

So, in UK politics we tend to call nationwide referenda when:

  1. One or more parties are hopelessly split at an existential level.
  2. The issue at stake is so obscure and complex that only a very small number of wonks will understand it.

In the past everyone has taken the opportunity to air whatever prejudices they happen to have before the status quo option prevails and a significant number of UK politicians enter a decades-long sulk. The political mainstream will then take the result as a mandate to carry on doing whatever they were doing anyway, and the whole thing will be a massive waste of time, money and air.

There’s no real reason to suppose anything different will happen this time. But it’s fun to take the opportunity to nail some common myths being bandied about, and hopefully add a small leavening of fact to the huge number of words that will be deployed for very little purpose.

Myth 1 – “Brussels Bans X”

Easiest one first. Starting in the 80s, initially in one Boris Johnson’s columns for the Telegraph, it became fashionable to make up outright lies about “barmy Brussels” and supposed EU “diktats” or “laws” that stopped honest, brave, British folks from doing – well, anything. Because only about 100 people in the UK actually pay any attention to what the EU does on a day-to-day basis, these are repeated and embellished rather than fact-checked. The whole genre exists as an insight primarily into the often vivid imaginations of UK journalists.

The London office of the EU has painstakingly refuted pretty much all of these stories (latterly the preserve of the Express, Mail and Sun) on their superb “Euromyths” blog. Here’s an A-Z list that covers everything from 1992-2015. It’s at once impressive in scope, and depressing as to how much nonsense has been passed off as news and how much the heirs to Boris (and Boris himself) continue to dissemble. You can follow their day to day struggle on the blog.

Myth 2 – “… unelected bureaucrats…”

I mentioned “barmy Brussels” above – to any lover of British tabloid journalese the full phrase is “barmy Brussels bureaucrats”. The idea of the EU as the last refuge of the old meddling, lazy civil servant stereotype is difficult to shake off.

Allow me to drop some legislative process on those assembled. Like our own dear Westminster Government the EU has two legislative chambers, a presidential role and a civil service. However, in pretty much every way the EU is more democratic and more accountable than the Westminster equivalent.

  1. The Lower Chamber. In the UK we have the “House of Commons”, full of our representatives (MPs) that we vote for every five years. In the EU we have the “European Parliament”, full of our representatives (MEPs) that we vote for every five years. (We do ruin this somewhat by voting for UKIP people who take all the expenses they can but don’t do any actual work.) In Westminster there are political parties who generally vote as a bloc and ensure the government gets its way. In the European Parliament there are loose groupings which change in every parliament, but MEPs generally vote independently.
  2. The Upper Chamber. In the UK we have the “House of Lords”, made up of people who are appointed there primarily by dint of their birth, penchant for arse-licking or job (if they happen to be a Bishop in the Church of England). It’s probably the least democratic legislative body – outside of actual dictatorships – in the entire world. By contrast, the EU has the Council of Ministers (formally the Council of the European Union – not the same thing as the European Council or the Council of Europe), which is made up of the Ministers of State from each member state who have responsibility for the topic in question. (So if the topic was, say, immigration the UK would send Theresa May. Yikes.) This reflects the actual government of each country, voted for by popular vote.
  3. The Presidential Role. In the UK we have the Queen (gawd bless her) as the titular head of state. She basically waves at things, and rubber-stamps laws passed by parliament. Once a year she makes a speech (written by “her” government) that sets out what “her” government will do. The only way you get to be the UK head of state is to be born to the previous one. The EU has the “European Council”, which has a full-time president (elected by its members for a two-and-a-half-year term) and is made up of the heads of the constituent state governments – so David Cameron in our case. The European Council acts as the leadership of the EU; you become a member by being prime minister of an EU member country.
  4. The Civil Service. Dead easy – we have the Civil Service, the EU has the European Commission. Both help the legislative bodies and leadership in drafting and implementing decisions. The UK Civil Service is led by the Head of the Civil Service, supported by Permanent Secretaries (24), and employs some 447,000 people. The European Commission is led by a president, supported by a College (28 senior staff), and employs a little over 23,000 people.
  5. Other bits. The UK has a Supreme Court, the EU has a Court of Justice. The UK has an Audit Office, the EU has a Court of Auditors. The UK has a central bank that controls the pound, the EU has a central bank that controls the euro.

So to summarise, the EU has the same legislative and executive structures as the UK, except the EU ones are – in general – more democratic. Indeed, a recent ERS investigation into the state of EU democracy concluded with a set of recommendations… for Westminster!

Myth 3 – TTIP

If you’re on any form of social media, or if you read George Monbiot in the Guardian you’ll have heard something of these trade agreement negotiations. These are some things you think you know about TTIP:

  • It’s secret
  • It’s an EU plot
  • It’s a way to destroy democracy, sell off the NHS, and make kittens look sad.
  • If we leave the EU then TTIP will go away, somehow.

None of these things, in the grand tradition of EU journalism in the UK, is actually true.

Here’s the EU TTIP twitter account – have a read down it. (That’s just one of the many ways they are communicating – here’s the main website, here’s the complete set of negotiating texts, here’s a statement from the chief negotiating officer on the last (12th) round of talks.  There’s even a snapchat channel (!)) So it’s only a secret in the sense that very few people are reading it.

As for being an EU plot, you may have spotted that the EU are negotiating on our behalf with the US. We can see, read, and comment on the EU position – a recent speech by EU Trade Commissioner Cecilia Malmström suggests that comments are taken into account in developing that position. For instance, concerns about secret (ISDS) courts are addressed in the EU position – they won’t be secret, they’ll be live-streamed and led by an actual real judge. Concerns about public sector bodies like the NHS are addressed by exemptions written into the negotiating position.

The alternate (“brexit”, or “flexit”) position is that something very like TTIP would be negotiated between the UK and the US, the UK and the EU (and the UK and Canada). Rather than the fairly sensible Malmström (a centrist, vaguely New Labour-ish, Swedish politician) these negotiations would be led by Sajid Javid – a man who freely admits to the central place Ayn Rand’s “The Fountainhead” has played in his life. For those who want to preserve the NHS as we know it, it would seem that the EU negotiating team would get a far better deal.

There’s so much more to go into, but generally you are more likely – it seems to me – to be misinformed by our press than the EU.

#altc and me

The last working day of January saw the release of the ALTC2016 call for proposals, an eerie advance echo of mid-autumn at the very start of the year. In times past I’ve taken this as a spur to take the various ideas and half-baked inferences that usually turn up on here or on my twitter, and turn them into something I could probably present at a serious conference like ALTC.

In practice this meant I would come back to the idea at the start of the summer and spend those long summer evenings playing with video clips, or data collection, or textual analysis, or reference collecting…

… actually, let’s see some uncharacteristic FOTAlove for a moment. I’ve presented each year at ALT since 2010, and every year has been my absolute A game. Six years of FOTA awesomeness:

ALTC2010 – Flat-out nailed the xMOOC movement and attendant “employer needs” hype, more than a year before it happened. No big deal.

ALTC2011 – Future of open education, but with an old west theme, featuring Amber Thomas, Helen Beetham and Dave White in cowboy hats and me as a travelling preacher man? Oh and the first ever ALTC session trailer? Done.

ALTC2012 – Just a meticulously referenced and definitive history of sharing learning resources in the UK, co-authored with Amber Thomas. That’s all.

ALTC2013 – The hype-strewn rise of the global MOOC movement, told via deep text analysis of the very worst of bandwagon journalism, and with a little tasty soupçon of French critical theory.

ALTC2014 – You know that whole backlash about social media as a teaching tool? Privacy, ownership, twitterstorms, peak social. And memes. Lots of memes.

ALTC2015 – The actual future, the predicted future, why both of them suck. With added economic theory. (incidentally, the NMC Horizon Scan 2016 seems to fit the pattern…)

Now to paraphrase Jim Groom:


Six years of stuff that, looking back, I’m actually pretty damn proud of doing. Blog posts that have had actual serious citations as scholarly work. Multimedia – even live music. And a whole range of critical perspectives, methodological adaptations and detailed analysis. I’m not going to claim any of it changed lives or caused a revolution, but I am going to say that it made the conference, and indeed the edtech field, better and more interesting for some people.

So you’d think that I’d be planning something equally cool for ALTC2016 – but the sad fact of the matter is that I am almost certainly not even going to be attending. Simply put, I don’t work in #EdTech any more, not by choice but by the fact that my employer needs me (and more importantly, pays me) to work in other areas.

Even the most generous of observers would struggle to match what I presented with my actual job in any given year. But actually now working in an entirely non-cognate field means that the very generous leeway that I’ve been offered in the past (I want to single out @sarahjenndavies for her support and tolerance) is just not even going to be arguable this year.

So here’s a few things that I might have submitted this year (some of these may leak out as blog posts during the year if I get time)

  • I’m sure I’d have had more to say on the idea of debt, scholarly debt and the way we count it in academia.
  • Whither education (technology) research? Who is funding, and who is legitimising, people trying to figure out how to actually deliver “excellent teaching” in this data-driven “post-theory” world? (According to one source, Pearson!? – you know I can’t resist that)
  • Connecting those two, whatever the TEF turns out to be, or not to be, would be something I’d love to analyse.
  • Billionaire-driven edtech as a service – or why does a person, in receipt of a good living from ads on social media, feel obliged to try and fix education with money and bad tech? And why does it never actually work?
  • Online learning at a distance – updating Dave White’s 2010 work (that was actually done by Marion Manton and the team at Oxford Continuing Education, mind you!) to encompass the Second Great Online Learning Bubble.
  • The effects of [insert bleak 2016 event here, probably Brexit or President Trump or both] on global higher education, and the affordances or otherwise of technology in addressing these.

I certainly hope to write about all, or most, of these this year. I’m just sad that I’m unlikely to be at ALTC to share and discuss these and many other issues with the wonderful UK edtech community in 2016. Maybe my circumstances will change, but for now I can only encourage all of you to get on and submit something properly awesome to the conference.

Scholarly debt and deficit

“Violence and quantification are intimately linked” – David Graeber

For those of you who have yet to read Graeber’s “Debt: the first 5,000 years“, the central theses are:

  • Debt, seen in the sense of owing something to the general well-being of your community, is far older than both money and barter; and
  • At the various moments during history that this community-minded sharing has been disrupted by attempts to quantify debt, violence and poverty have been close behind.

For those of you who *have* read the book, you may be wondering why I have brought it up again four years after publication. Certainly, I do so with a wariness of the kind of historico-economic “cycles” that would suggest a means of predicting the future.

What Graeber calls the “axial age” period, placed between 800BCE and 600CE involved an explosion of scientific, philosophical and theological innovation – including the founding of all of today’s major world religions – and also saw the invention, primarily as a means of paying soldiers and collecting taxes from expanding empires, of a way of readily quantifying social and personal debt by means of coinage.

Easily portable, and with a known value backed up by military might, coins encouraged trade between soldiers and those whose territory they had occupied. Coins exchanged for food, supplies (including precious metals for the minting of coins) and slaves were the only means a conquered town or country were permitted to pay the required tribute to a distant imperial capital.

Coupled with the birth of mass literacy and the rise of rational calculation (after, for example, Pythagoras), this enabled an affectless data-driven means of managing human interaction – as Graeber describes “impersonal markets, born of war, in which it was possible to treat even neighbors as if they were strangers”.

Here a “stranger” is someone from outside of a given community. Their standing and trustworthiness are unknown – so rather than an inculcation into a communistic tradition of sharing and loyalty, all transactions are based around direct equivalence.

This was a great model for financing an expanding empire, and a strikingly modern economic plan. But when expansion ceased and empire faded, both coins and soldiers disappeared, and taxes could no longer be collected. Though the values of the old coins could still be used where value calculations were required, credit – based on interpersonal interaction, and often nominally denominated (if not actually settled) in equivalence to a currency or other reliably valuable commonplace (food, fancy goods, maidservants) – reasserted itself.

And as empires crumbled around the world, civilisation entered what are called medieval times, or the “dark ages”. Except something very interesting happened.

Nalanda – possibly the world’s earliest independent teaching institution focused on higher learning – was founded right at the beginning of this medieval period (in northern India in 427CE), and was followed by the growth of Madrasah in the Muslim world, the rise of Confucian institutes in China and eventually (the west was a little off the pace here) the Universities of Bologna (1088), Oxford, Salamanca, Paris, Padua and so on.

As I’ve written elsewhere, universities came into being when scholars and teachers banded together to support each other both in the practicalities of administration and in the expansion of knowledge and perspective. Relying largely on unspoken and unexamined ideas about the nature of their academic society (the first codification of this at Oxford, for example, was the Laudian Statutes in 1636), members shared and supported a common scholarly endeavour. What we can argue here is that this is the birth, in the wider sense, of scholarly debt.

This was a parallel to changes in the wider economic life of societies and the re-assertion of the idea of an unquantifiable debt to the home community – manifesting not in a quantifiable culture of repayment but in a general state of openness to share what was needed with other community members.

Surprisingly little has been written from a historical or anthropological perspective about the concept of scholarly debt.  Finnel (2014) and Allen et al (2014) both take an instrumentalist perspective, examining the rise of the practice in order either to re-assert the role of the librarian (yay! librarians!) or as groundwork for quantitative measurement.

We use the idea as shorthand for the whole “standing on the shoulders of giants” idea, but I would argue for a conception that involves standing and membership within this strange medieval idea of “academia”. “Scholarly debt” is the reason we review papers, present at conferences, heckle at conferences(!), contribute to policy debates and do “public intellectual” things and – hell, why not – share stuff with colleagues, collaborators, friends and fellow-travellers outside of an expectation of a quantifiable return on this activity.

Which is a really, really medieval way of behaving from an economic perspective. We have no way of knowing if the person whose paper we are giving a (hopefully) helpful in-depth peer review to will directly reciprocate – but we do know that someone in the community will, and that if we didn’t participate in peer review our standing in the community would be lowered.

(I’ve sneaked in peer review here – but a few of you will have spotted that it is a bit problematic in this context. The first recognisable academic journals – Philosophical Transactions of the Royal Society and Journal des scavans – came into being in 1665. These replaced an earlier method of review by private correspondence within distributed scholarly societies, which itself was a proxy for discussion and debate between peers. In the modern era, and until very recently, the published journal article has been the de facto standard for scholarly communication. However, in using peer review as an example I am highlighting one of many roles academics take on for the health of the academy or discipline rather than for their own direct benefit.)

For the following millennium this peculiar set of behaviours resisted challenge from a variety of new perspectives – from the disruptive entrepreneurs of the Enlightenment, the bureaucracy and managerialism of the industrial revolution, the white heat of technology and research exploitation, and the relativism of the post-modern turn. Though the global economy entered a period that seemed closer to the Axial age and Graeber’s military-currency-slavery nexus, academia seemed curiously separate – contributing on occasion but not altering practice (if you are thinking of Anathem here, I’m with you).

Historically, the shift between a community-focused reputational credit and a quantified system of debt within any given society has been marked by a great deal of violence. The quantification was imposed by violence (the need to pay the taxes of a conqueror in their own currency), and drew further violence (peasants’ revolts, insurgency, millenarianist religion) in response, especially as newly denominated debt led to the development of a non-landowning precariat, and debt peons were created.

So that sounds like this evening’s news for society as a whole (if you lump Trump and ISIS in with millenarianism, which is both largely accurate and wonderfully ironic), but what of academia, which has been resisting both modernity and quantified scholarly debt since the renaissance?

Alas, though violence (through quantification of debt) is dealt to academia, it is not resisted in kind. The latest technological push to quantify all of the things (that’d be “big data”, as previous attempts to quantify the unquantifiable failed because the data was just too small, I guess) has resulted simply in the normal response of attempting to gamify whatever system of equivalence is enforced with the violence of finance.

Even this gamification is a capitulation to the quantification of scholarly debt (what is the JIF of the journal you review for, how prestigious is that conference, how widely are you cited and by whom, how many Twitter followers do you have…) simply because what we owe to our institution – as a proxy of the academy as a wider ideal – now has an exchange rate set, and we are only quibbling around the exact equivalences.


So I was going to finish this on an upbeat note – a fight to be fought, a standard to be carried, and the permanence of scholarly regard as the last community/reputational credit. But Prof Smithcote (from the superb Cow Country) is probably correct that in the future we will all be regionally accredited – both in the short term by the seemingly unstoppable march of quantification and efficacy in previously unmeasurable areas, and in the long term as we re-establish our alternative during the next great dark age. And if you think that’s a step too far, just look at Trump’s numbers.

SD cards in SPAAACE

So Jisc Futurist Martin Hamilton took to the pages of the venerable Ariadne to call for the inception of a Jisc Space Program Project. As specified, it’s a fairly simple mission.

But the really silly thing here is how (comparatively) easy and cheap part two could be.

Starting with some numbers: the base price for getting a functioning satellite into space is just $8,000. If I sold my car and Brian Lamb sold his car then we’re nearly there. That money gets you a base kit for a TubeSat (a beer-can-sized hunk o’ metal and PCB that can broadcast a signal back to earth) and puts whatever you do with it into low earth orbit. Thinking bigger, you can have a CubeSat (a 10cm cube) up there for $12,500, including the base kit. Both of these come under the emerging spacefaring category of pico-satellites.

By simply affixing the SD card to the satellite frame of our choice (probably with a tried-and-tested space adhesive like duct tape), we’re already backing up gigabytes of data in space for a laughably small amount of money.

But low earth orbit ain’t enough for Hamilton – nope, to avoid the all-engulfing apocalypse he predicts he wants data on the actual moon. Funnily enough, NASA are already working on a project to send a CubeSat to the moon with a solar sail. A solar sail is just what it sounds like, a shiny sheet that allows the constant energy from the sun to push an object forward in space. It’s slow, but (comparatively) cheap and it will get you there. And all we need to add to our craft is a means of deploying such a sail when low orbit is reached.

Now we’re travelling in space we have to start thinking about steering and trajectories and suchlike. I’d be looking at something like these nano-thrusters (powered by Teflon, no less) to get us pointing in the right direction. And we’d now need some kind of on-board computer to control everything, maybe a smartphone. And thus a couple of solar cell arrays to power the thing.

(A note: the physics and the software development involved here are a huge deal… I’m largely ignoring this to make a point, but even if we could build on well-documented previous projects the cost of this would be enormous.)

(of course, Jisc are off the pace here: the University of Surrey already launched a smartphone into space as a functional satellite in 2013.)

So, we’re basically at the moon for a little over $12,500. Now all we need to do is land, or at least crash in a controlled way that protects our precious cargo.

In the bad old days a lunar lander had a whole assembly, with thrusters, just for this. But we don’t have enough room – so I’d propose that once we got near enough to the moon an inflatable cushion would cover the satellite. The small explosive charge needed to deploy this could also be used to jettison the solar sail and get us moving towards the surface of the moon at speed. As soon as we hit the rocks it would gently deflate, meaning that we wouldn’t bounce off and get lost in space.

And there Martin’s SD card will lie, undisturbed, for many years – until it is discovered and read by intelligent aliens as a testament to the former existence of a race of beings that didn’t quite make it.

9 things to watch out for in 2015 – redux

Round about this time last year I did “9 things to watch out for in 2015” – a seemingly popular post that dared to predict some things about this year that weren’t entirely obvious. I did this pretty well in 2014, so now’s the time to take a look at how I managed this time around…

… and I would have to say that my performance was credible rather than earth-shattering. On some points I’m spot on – on others, clearly dreaming.

Those who enjoy irony will note that I wrote quite a lot about not-so-good prediction this year, so maybe you could use the points I made against me in the comments to this piece.

1. Education policy in politics

“I’m not going to win any points by predicting a UK general election next year, or an unusual result that is likely to mark a decisive shift away from the two party politics that have dominated the country since the second world war… What I am predicting is HE policy being a point of clear distinction between parties.”

Yup, we had an election and that was a thing that happened. And the collapse of two party politics seems to be Labour tearing itself inside out.

But the parties *did* differentiate themselves on HE, sort of – remember Labour’s £6K fees idea? No, me neither.

But maybe half-a-mark here, for being accidentally right on some points.

2. Academia against the institution

“Expect more strikes, more protests, more hard questions asked of institutions – and hopefully some answers.”

Yup. The ongoing round of industrial action against the OU is telling in the incorporation of informed strategic and financial critique. As yet, few answers, but a feeling of managers being held to account.

3. W(h)ither the MOOC

“I’ll come straight out and predict that at least one major platform will either close entirely or move away from offering free and accessible online courses in 2015.”

And I read this today on Coursera’s blog: students paying for assessment from January 2016. That’s pretty much a pivot away from free and accessible online courses. I would never, if I’m frank, have expected that from Coursera (I’d kind of reckoned on FutureLearn as the platform to go under – but they still seem to be able to draw down money from the OU as required).

4. Teaching quality enhancement metrics

“learning gain will inform the key policy arguments about learning technology in 2015, and that we will see a welcome collaboration between SRHE and ALT in response.”

Oh God, and how. The Teaching Excellence Framework has defined the year in teaching quality enhancement – going from a dumb idea in the Conservative manifesto, to Jo Johnson (who wrote said manifesto) being made HE minister and actually trying to implement it, to George Osborne omnishambles-ly linking it to inflationary fee increases (at a time of deflation, LOL right?).

And then HEFCE (RIP, 4eva in r hearts) still came on like Learning Gain was a thing.

And then the green paper came out (eventually) in November, and it was as big a dog’s breakfast as any of us imagined. Now every other article in the UK HE press is about teaching quality metrics and it remains as stupid and bad an idea as ever.

(ALT and SRHE haven’t stepped up to critique it, though Wonkhe – increasingly the home of practical HE policy research – did a sterling job)

5. Independent researchers

“Like it or not, much of the significant work on education technology and education policy will be done by people in their own time, and with little or no external funding. My prediction here is that independent researchers in a number of non-science fields will begin to organise themselves for mutual support and benefit.”

No less a figure than the mighty James Wilsdon highlighted the need for investment in research into education policy in his seminal “metric tide” report. And although I know anecdotally of unfunded researchers in edtech, education policy and elsewhere grouping together to support each other (via conversations at various conferences), there’s been little visible movement.

It’s worth mentioning the steady and beautiful growth of OpenLibHums here – one of the best HE stories of 2015, and one supporting humanities scholarship in a tangible and sustainable way.

6. Authenticity 

“Many of the most powerful arguments made about the condition of academia in 2015 will be not be framed in financial or statistical language. They will be pure, beautiful and true.”

I had to sit back and think about this a bit, but what was that whole range of post-OpenEd15 blog posts but an authentic response to a movement appearing to focus on financial and statistical language? As well as a wonderful re-awakening of a range of critical perspectives on openness, this was a visible example of the power of authenticity, culminating in the best thing David Wiley has written for a long time.

7. Students as ______ ?

“Sometime in 2015 we will see the development of a proper position that sees the consumer aspect of these interactions as one part of a very complex whole.”

Not really, no. Though the NUS flirting with a critical position on TEF was a pleasant surprise.

8. Uncapturing the lecture

“I predict a resurgence of the lecture – as outreach, as destination and as the cornerstone of the higher education experience.”

Again, not really. Lectures retained the place that they always have had in HE. I could still see them becoming an “event”, and public lectures still form a more sustainable and valuable form of outreach than MOOCs will ever be, but as a 2015 key trend… no.

9.  Collaborative tools

“So in 2015 the technologies that will impress us most will be collaboration tools of various purposes.”

That’s pretty vague, for which I apologise. But a year that has seen the launch of Mike Caulfield’s federated wordpress and the Thompson Rivers University “SPLOTs” can be said to embody this trend of simple collaboration. And I’m increasingly enamoured of as a means of quickly deploying collaborative tools on demand.

So – I make that six out of nine, a 66% accuracy rate, so slightly better than a monkey with a wordpress would have done. Join me in a few days for some predictions about 2016 – perhaps FOTA has come of age as a means of predicting the future of UK HE policy and edtech, but I can be certain that love has come of age.

Take it away, Kenny Loggins!

Being a Compleat & Poetickal Account Of RECENT EVENTS at the COMMITTEE of BIS

(Interpretive note by D. Kernohan (BA Hons, English Literature))

This recently discovered MS, by an unknown writer, fits neatly into the established collection of “broadside” poetry – produced at various times throughout history to disseminate courtly news to the general populace. In early modern Britain these would be largely written for oral transmission – the quality of illumination in this MS suggests that this would have been a private copy made after the described events, perhaps by one of those named therein. Though “Morgan John” is mentioned, it is clear that this piece cannot be attributed to him.

At the time described (references to “Johnson the Lesser” suggest a date between 2015 and 2016) former acronyms (such as “HE”, “OIA” and “HEFCE”) would have been pronounced as single words rather than by individual letter. Those seeking an authentic reading should note that HEFCE was pronounced with a soft “C” (heff-see).


Come Professor! Librarian!, Administrator!

And hear of great valour and graduate data

Charge up thy flagons! Sit close by the fire!

I sing of great courage, and of evil so dire.

When JOHNSON THE LESS held the bright HE Crown

And the doctrine of TEF spread through country and town

A band of great knights, from far and from near

To the COMMITTEE OF BIS did deign to appear

Call’d Assize to skewer this JOHNSON THE LESS

And to his foul schemes he may here confess.

For two hundred days the fair KINGDOM OF HE

To him fain must fall, all upon bended knee

His edicts inscribed on vellum dyed green

Through commons and courts are too often seen


Would dare to engender such dastardly plans.

Brave knights filled the hall as battle commenced

Arméd with no swords, but with God’s Evidence


Proudly at battlefront thence did appear


Knights from the west who were ready to spar

For honour of HE, ‘gainst the forces of BIS

Stood EBDON OF OFFA, who would not be dismissed


JOHNSON, young brother to he who would rule

Had thought the nobles of proud HE to fool,

Disparaging feckless “lamentable” toil

The rites of the Learnéd he thought to despoil

With TEF, a dark magic for black HE days,

A measure of learning in numerous ways,

Ignoring the wisdom of QAA he proceeded

To focus attention where it was not needed

The Knights who said “NSS”, with vigour and pride

Disputed his claims, and called out “you lied!”

Evidence brandished, and numbers unfurl’d

To prove only they comprehended this world

Sarjeant JOHNSON would blacken with hearsay and rumours

And castigate STUDENTS as merely consumers.

But the brave souls of HEFCE and OFFA and OIA

Would not let such scurrilous insults pass by

Long through the morning they battled and tarried

With arguments made and rebuttals parried

Til’ even the Scribe of the Court, MORGAN JOHN,

Cried “enough! Good sirs! this cannot go on!”

The palace of Westminster reverberated

With cries of the lost, and with blood-lust now sated

E’en LORDS OF UUK in their Tavistock Palace

Did curse the dark TEF with vigour and malice


Noon broke o’er corpse-strewn committee room

Where the forces of BIS did fall to their doom

With arguments divers and figures presented

Twas prov’n that TEF was a Perverse Incentive

And JOHNSON THE LESS, with fire and with flame

Was driv’n from the Palace in anger and shame

He swore his revenge and rethought his defences

MORGAN JOHN filed his story, and claimed his expenses.

“Heart to heart” – meanderings in the future of open scholarship and open education

Two events in a row – #opened15 on open education and #KEevent15 on open research – have conspired within my mind to produce a series of thoughts that have more to do with the organisational dynamics of pressure groups than “open” anything.

As the campaigns for openness in academia reach fifteen years old they have mostly solidified into two positions. On the one hand, we have people who like being in campaigns and are distrustful of what they see as “solutionism” (the idea that there is a technical fix for every problem), and on the other we have people who like fixing things and are less enamoured of the kind of academic navel-gazing that they perceive as being “the wider movement”.

Every solution that is built is a compromise and a first step. Moving a bit along the way towards fixing a big and messy socio-cultural-economic problem is not as rewarding or exciting as it should be. Some worry that these first steps are taken down the wrong path, others worry that the steps are too tentative. And each step has a resonance, it cannot be un-taken, the echoes continue as the unintended consequences mount.

After 15 years our ears are ringing.

Most of us in the OER world have welcomed the blizzard of reflective posts capped recently with an old-fashioned David Wiley showstopper. The latter could be read as a better kind of project plan complete with outcomes and outputs, a plan presented – with bravery and without guile – for the community to discuss, and one that starts with a goal based in access to education and ends with a re-affirmation of the 5Rs of his famous definition.

The conference (#opened15, in an unseasonably sunny Vancouver), was effectively two co-located conferences – characterised by Martin Weller as “hardcore research” and “philosophy of open” – attended by two groups of people (“colonisers” and [shudder] “edupunks” as named by Rob Farrow). Or was it?

Certainly some of the reflections during and after the conference (Robin DeRosa’s was the key text for me) called for a reassertion of radical pedagogic and theoretical action against the closed horizons of the “big open education” of systemic open textbook adoption.

But really?

All activity needs critique, but this should not be at the expense of the activity. Even back in the Second Great MOOC Boom of 2012 I don’t think anyone was saying “don’t do MOOCs ever”. Not even me. Rather than “this is not what I meant” the cry was “this could be so much more”.

Michael Feldstein’s keynote point about being comfortable about winners who weren’t us was a bit of a wake-up call. BC Campus are pretty great at making open textbooks, but should (say) Pearson ever start making OER texts they would render the BC Campus efforts irrelevant. And as an open advocacy movement we should celebrate that – why should Clint Lalonde have to spend large parts of his life reflowing texts across multiple formats, when Pearson do that every day, all day?

It’s an uncomfortable thought experiment – even though, if like-for-like replacement were no longer a problem we had to solve, it would be possible to focus on precisely the pedagogic and remix culture that many yearned for at OpenEd15.

Rolin Moe expressed this conundrum beautifully:

“When we open the escape hatch from the reusability paradox and let the content out into a world unencumbered by copyright, we leave the safety of discussing open as a copyright problem and enter into a larger and more problematic space where open cannot be a use-value product nor a universal value. By opening the escape hatch and leaving the reusability paradox, we make open less absolute than when the hatch was closed.”

At KEevent15 (A ten-year anniversary conference for the Knowledge Exchange, a partnership between five European research infrastructure organisations) Sascha Friesike used the lens of Rogers’ (1962) theories of “diffusion of innovations” to examine the success (or otherwise) of 15+ years of Open Scholarship advocacy.

On the surface, and when compared with open education, one could easily assume that the range of high-quality open access journals, backed by funder mandates that increasingly extend to open research datasets, suggests a movement with several significant victories behind it and more within easy grasp. But I heard similar voices lamenting the lack of progress, and suggesting some combination of simpler tools and better marketing as the way forward.

As a “provocation”, it is difficult to know how seriously to take the notion of a diffusion deficiency as a full critique of open scholarship. Indeed Rajiv Jhangiani used and critiqued ideas from the same book, expressed via the “pencil metaphor” at OpenEd15. In both instances the metaphors are useful descriptions of the space, the conclusions either facile or unhelpful.

Rogers’ adoption curve makes a number of assumptions that need to be unpicked. Like the Gartner hype cycle it starts from the position of assumed total success, and has an uncritical (solutionist) view of innovation as an unproblematic good. It also focuses on unidirectional diffusion (from the “innovator” to the “end user”), and on a single innovation taken outside of the roaring tsunami of change that is years like 2015.

But fundamentally Rogers (like Von Neumann and Morgenstern before him) assumes a selfish (and thus mathematically predictable) adopter, who will seek the maximum individual utility in the innovations they choose to adopt. If marketing people choose to believe that of the customers they talk about, I have no complaints – our issues are that we are not marketers and that our “product” (if we must) has environmental rather than individual utility.

Looking at the open research area we see a great deal of emphasis on policy – changing the entire environment (forcing individuals to act in ways that they may individually find difficult, at least initially) rather than growing an activist base. Open Education has historically done the opposite (supporting individual innovators in a Von Hippel-ish way) – it is strange, to say the least, to see both fields meet in a joint concern about models of innovation and look to utility marketing (of all things) as a next step.

Especially as the individual academic has far less agency, and less scope to make decisions either rational or irrational, than at any other time in history. This is the greatest concern facing academia right now – sure, textbook costs, fee costs and the reproducibility crisis in research are all in the top four, but in many ways they represent the absence of academic freedom to experiment.

Of the three models discussed, the environmental model best fits the new fully managerialised education sector, while the democratic or lead-user model has the greatest emancipatory potential. Using marketing theory from the middle of the last century as a model seems like an admission of defeat.

After all, that’s how commercial publishers sell “their” content. And how’s that working out for them?

Amy Collier and Jen Ross presented at #opened15 on the idea of “not-yetness“, an attempt to describe the liminal, (Deleuzean) smooth space that allows for the potential for movement which is not complete. In our audit culture (Dahler-Larson) the very idea of incompleteness sits in opposition to a need for evidence-shaped data (efficacy?) supporting a progression.

The traditional critique of numerate academic research is that its specificity and validity are limited by an inability to show statistical similarity to the wider population. (But somehow when software vendors do it, it works.) We have moved beyond a distrust of evaluation that is unactionable to an inability to countenance an activity that is inevaluable – as numerous duplicate presentations about no statistical difference between student attainment with OER and with a non-OER text attest (or anything that involves proving with numbers that people do indeed use repositories of their own volition – seemingly the great silent project of open academic practice).

In open research the outstanding issue is one of genuine reuse – how can we incentivise academics to reuse information and data to build new research? Funding and publication pressures demand, always, new data – and in contemporary precarious academia you do what you have to do to stay in a job.

Conversely, in open education, too much reuse is the problem – how can you *stop* academics reusing published (and expensive) material when cheaper and more adaptable alternatives exist? The pressure within teaching is only that of time – the rise of adjunct culture permits only the barest minimum of preparation and textbooks are one way of ensuring that whichever poorly-paid post-grad teaching a given course can have a running start.

Both of these problems are soluble only with a serious look at the continued attack on academic agency, security and space to experiment. And that’s the open academic movement I want to be a part of.

“Keep the Fire” – notes on my #OpenEd15 presentation

[Slides] [Data] [Song]

Open Education, and indeed Education Technology more generally, exists in a perpetual “now” – or to be more precise, a perpetual near future. No matter how many times we attempt to put it into a narrative, it retains a deliberate ahistoricity that – after a while – begins to jar. Even to say that we have heard all this before has become a cliché, as there always seems to be one of those sessions at every conference.

It’s usually my session.

“Retain” is the first of Wiley’s “5Rs”. To give the complete language:

“The right to make, own and control copies of the content, eg download, duplicate, store and manage.”

In this formulation it implicitly refers to the virality of open content. Open content doesn’t have to exist in just one place, it can exist simultaneously in multiple (accessible and non-accessible) places. The comparison is with content licensed under a closed licence – it can’t be everywhere you want it to be. Even though Apple Music (say) will temporarily store music on your computer, you can never be said to “own” a copy. And most widely used software (such as Windows) is only licensed, never owned. Even the software in your car or tractor is only licensed to you – you can never truly be said to own your car.
Ownership implies a relationship between you, an object, and time. Something belongs to you until you decide not to own it. In the perpetual near-future of “edtech”, ownership is a concept that is almost obsolete.

What about a community? Can you “own” membership in a community? I would argue that you could – membership of the community of open educators is ours as long as we choose to claim it. No one – not even Stephen Downes – can refuse to let you be a part of this community.

But is the community active in time?

Maybe I need to unpack this a little. My first OpenEd was here in Vancouver in 2012 (I was meant to go to Barcelona in 2010 but I couldn’t for various personal reasons). But what did I miss by not being there in earlier years – what did other attendees bring to the conference in 2012 that I did not? How could I even find out what happened at, say, OpenEd2010? Or OpenEd2008? Or any of the predecessor conferences?

This is important, because a community of practice is a shared history of that practice. When we all complained about Sebastian Thrun “inventing” open education in 2011, this was an expression of the history of our practice. Some of us were able to talk about, say, David Wiley’s experiences with WikiClasses in 2005, or George Siemens’ experiments in 2008, and say: look, this is the same thing. We’ve done this, we learnt from it, we want to share what we learned.

But – and I mean here no criticism of a decade of hard-working conference organisers – we are actually quite bad at preserving what we have learned, and in particular we are bad at capturing what happened at these conferences.

Open Education is a field often criticised – for being without a research base, for being unconcerned with context and pedagogy, for being blind to the problems inherent in the idea of reuse, for being focused on content rather than community. These are major challenges to the integrity and nature of this field: and we have no real way to answer them.

I collected and tabulated 11 years of OpenEd conference activity. Not much, just session titles and presenter names. This took me more than a week, and necessitated me promising not one but two pints of beer to Alan Levine. In some cases abstracts and/or slides were available, in other cases not. I drew primarily on the (amazing) Internet Archive, which captures the old conference sites in various states, allowing for the vagaries of the technology underpinning them. (ColdFusion? Seriously…)

I did this to scratch an itch, to see something that no-one else had seen (a similar motivation, I think, drove Adam Croom to serendipitously do a similar job in a slicker way). Others may need to find old abstracts for other reasons. Validation, for instance: to prove that they were at OpenEdxx and presented on whatever it was. Or research: they’d seen a reference to a great presentation in previous years and wanted to read about it for themselves so they could build more research on top of it.

Some presentations here become papers (or blog posts). Some begin as papers or blog posts. But many more exist only as a moment in time: maybe a set of those fashionable slides with big images and not much text, and a presentation that held the room. Maybe every presentation is captured as a YouTube video, or an audio file – some years it is, some years it isn’t.

[Figure: opened15 graph – session tags across eleven years of OpenEd conferences]

And as you can see, there are patterns in there. I talked about the peaks in critique and sustainability talk in 2010 (linked to the end of many Hewlett grants at that time), the slow but inexorable growth of “policy” as a theme, and the dearth of interest in “reuse” since ’06. The tags are broad and subjective – in releasing the data I’m hoping people will feel bold in using their own tags to drive their own understanding. (The only non-obvious tag I used was “update” – just a way of indicating sessions primarily focused on providing an update on an ongoing project. I was heartened and surprised to see more updates this year than ever before – clearly there are more projects running than people may realise.)
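If you want to try your own tagging against the released data, the tallying itself is only a few lines of Python. A minimal sketch follows – note that the column names (“year”, “tags”), the semicolon separator and the sample rows are all my assumptions about the shape of the spreadsheet, not a documented schema:

```python
from collections import Counter, defaultdict

def tag_counts_by_year(rows):
    """Tally how often each tag appears per conference year.

    Each row is a mapping with "year" and "tags" keys; tags are
    assumed to be semicolon-separated (a guess at the data layout).
    """
    counts = defaultdict(Counter)
    for row in rows:
        for tag in row["tags"].split(";"):
            tag = tag.strip().lower()
            if tag:  # skip empty strings left by trailing separators
                counts[row["year"]][tag] += 1
    return counts

# A few made-up rows, just to show the shape of the output:
sample = [
    {"year": "2010", "tags": "policy; sustainability"},
    {"year": "2010", "tags": "policy"},
    {"year": "2015", "tags": "update"},
]
by_year = tag_counts_by_year(sample)
print(by_year["2010"].most_common())  # “policy” twice, “sustainability” once
```

With the real data you’d swap the made-up sample for rows read via `csv.DictReader` from whatever the actual export file is called, then plot the per-year counts however you like.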

But the real focus of the session was just to help the community get better at archiving findings and building on them. If open education ever grows beyond simple implementation, this is how it will happen.