Life with geeks

This post represents my personal opinions, and not those of current or former employers, projects, or programmes I am or have been responsible for. This post is available under a CC-BY license.
None other than Dave White (available now for the 2011-12 conference keynote season) started talking about geeks on Twitter the other night.

This is what I have learnt about geeks in the last 10 years – I’ve worked with them and hung out with them, and although I feel like I understand them I wouldn’t claim to be one. On Twitter I was in full-scale Adam Curtis mode (or maybe just trying to get a slide all to myself in a Dave White talk) and came up with the following soundbites:

“The geeks are 2%. They’ve always been 2%. They always will be 2%. They’ll always own the cutting edge.”

“Geeks are The Culture. They share everything, they don’t need profit, they trust each other, they have super-advanced tech, they are naive.”

“Geeks have their own currency – reputation. In that respect they’ve a lot in common with what academics used to be.”

So, to unpack that a bit: I fundamentally see geeks as those who are living now the life we will all be living in 3-5 years’ time. But they are doing so with a very different set of assumptions, values and interests.

Geeks are not technodeterminists.

It’s a cliché to paint a geek as having an interest in technology – technology for geeks is like bricks to a builder. It’s a staple. You can do all kinds of cool stuff with it, but in itself it’s barely worth thinking about. Show a geek and a non-geek technodeterminist a new gadget. The technodeterminist gibbers about UI and gigabits and pixels per inch. The geek asks “what can I do with it?” – a question more concerned with openness and interoperability than with specification.

Geeks are interested (almost unhealthily, in some cases) in human interactions and the ways in which they can be improved and better understood. Most of what is interesting in geek culture is based on their understanding of (or attempts to better understand) human interaction, and is expressed in the medium of technology. Most geeks do not have a formal background in the humanities, so insights are drawn from technical analogies and amplified/reinforced by popular philosophy/literature and *especially* the more interesting class of games.

Amongst themselves, they have perfected interactions to a terrifying level. Respect and reputation are key, but the unlocking capability is the ability to ask intelligent questions. If you can do this – even if you can’t understand the answers – you are accepted into the community. However, a poorly expressed question is often treated with derision and rudeness.

Geeks design systems of interaction based on mutual respect and trust, precise and concise communication of key ideas, and the assumption that everything will be shared. When these systems migrate into wider usage, these underlying assumptions can cause major problems. Facebook, for instance, assumes that you want to share pretty much everything with pretty much everyone – a default that becomes more and more problematic as the service becomes more mainstream.

Commerce, or even profit, is frowned upon. Those who manage to profit whilst maintaining geek credibility are tolerated; those who profit without retaining standing in the community are reviled. Geeks are more likely to work on something they think is cool (often with superhuman levels of effort and time commitment) than on something that simply pays their wages.

They are using technologies on a daily basis that you will be using, as I say, in 3-5 years’ time. But by the time you get there they will have moved on, to a technology that is more efficient or more open (usually both). Ideas and tools that excite them now are almost certainly not accessible to the rest of us; indeed, we’ll have very little chance of understanding them in their current state. UI comes later; the possibilities and efficiencies are what matter initially.

As I said above, I’m not a geek – just someone who knows some geeks and is dumb enough to think he understands them. I think there are some historical and cultural parallels, as Carl Vincent pointed out:

“[T]hey are equivalent to academics from 300yrs ago and engineers from 150yrs ago.”

but I’ll leave them for others to draw out.

#bebettr, @anna_debenham, skills vs literacies and my shady past.

This post represents my personal opinions only, and does not reflect those of my employer or the programmes and projects I am responsible for. It is made available under a CC-BY license.

Not for the first time, I think @jukesie is on to something. #BeBettr was the bare bones of an event, simply a room to keep the rain off and some interesting people talking, both on the stage and from the floor. No tedious “conference dinner”, no dodgy buffet lunch, no reams of paper, no trailing through the bowels of an expensive venue finding the next break-out room. No wireless to lure you into email and the backchannels. My suspicion is that we will see a lot more events like this in 2011, and a lot fewer multi-day quasi-academic conferences for edtech folk. This can only be a good thing.

When I go to a conference I expect to leave with my head spinning, and at least one blog post ‘twixt my teeth. I was especially keen (after very positive reports from #drumbeat) to see Anna Debenham speak, and I was not disappointed (here’s her talk on similar themes given to London Web Standards). She’s a young, talented web professional who feels she has been let down by formal ICT education, and she provides example after terrifying example to back up her narrative. Academia and the world of education have simply failed to keep up with the changing face of web technology – teaching children how to lay out a web page using HTML tables, or to design a site in PowerPoint, helps no one, and although these examples are taken from current GCSE and A-level syllabi, it has also been argued (by Leslie Jensen-Inman, in A List Apart) that HE is not much better. Anna herself chose not to apply to university based on a poor impression of what courses were focused on, and has instead developed a successful freelance career.

I agree with her diagnosis, but was wary of her solution – that ICT professionals should work to support the development of updated syllabi. I simply don’t think this is sustainable, as web technology and development practice are liable to keep changing at the same dizzying rate as they do currently. Will we still be teaching web design in a world dominated by platform-specific applications drawing information down from the cloud, bypassing the web entirely?

My suspicion is that any course that purports to teach you how to do “x” is probably a waste of time if it talks about industry-standard practice. It’s not education, it’s training – and thousands of commercial training providers are quite delighted that training becomes out of date very quickly, as they sip champagne cocktails and cavort in pools of gold coins.

Education isn’t training. Training shows you how; education supports you in understanding why, and in gaining the general principles and aptitudes that allow you to find out how for yourself in the future. (NB although this sounds pretty convincing, please remember that I am essentially a policy wonk who spent too long listening to experts, so I claim no special insights here.)

Back in 2003, I taught Music Technology to a group of Foundation year (level 0) students at a university not unadjacent to Pontypridd. I’d graduated from what became a MusTech degree in 2000, and even three years later most of the practices I had learnt had become laughably outmoded. I had learnt how to programme the venerable Akai S1000 sampler as a cutting-edge tool; I was now teaching on a platform where a plug-in was available that allowed you to play the S1000’s vintage, grungy, low-bandwidth samples.

However, the students were expecting to learn the latest industry-standard tools, and many of them were much, much better than me at using them. Clearly something had to give. So I didn’t teach any tools at all – I pointed students to newsgroups, websites and online tutorials… I showed them how to identify and use reliable advice. I talked about copyright, music publishing, composition, harmony, basic production principles and critical listening. I even did a seminar called “how to play keyboards – in 40 minutes”. I also (wonderfully) branched out of a critical listening session because of questions asking me to explain what a passing reference to “Marxism” meant – a response that prompted one keen student to opine that this revolution-of-the-proletariat stuff sounded brilliant, and why hadn’t it happened already? Students used the tools they were comfortable with, supported others in developing their own skills, and were supported in experimenting with new ideas.

But still, when one student asked me if this course he had signed up for enthusiastically would prepare him for a career as a sound engineer, I had to say no. I explained (truthfully) that it would make him a better sound engineer, a better musician, and give him the literacies he needed to work in a fast-changing field… but he left the course the next week and became a sound engineer in a club in Cardiff. Fair play to him (and I hope he at least took on board the stuff about ear protection!).

We can’t teach skills in HE. We’re not set up to do it, we’re not very good at doing it, and we need to stop selling ourselves as if we are. We shouldn’t run courses claiming to make people professional sound engineers, or professional web designers, or professional anything else. We give people the literacies they need in order to pursue these careers (or any others). The subject matter of a degree, I suspect in my more cynical moments, is largely a way of getting students to concentrate long enough to swallow the literacies pill.

What does it matter if you studied English Literature, Music Technology, Pharmacy, Modern History, Software Engineering or Horticulture? The important things are that you learnt how to find information yourself, make decisions about its validity and usefulness, and use that information to gain new knowledge, capability and expertise – to write and argue persuasively, use sources and quotation to back yourself up, and use technology to better express yourself – and that you had the confidence in your analysis to stand by it in the face of contrasting opinions. These are the graduate skills that most people end up using – as web designers, sound engineers, sales managers, doctors and business analysts – and we need to be a little more open about them at the start of the journey.