Getting out alive

[Image: No escape: decal of a struck-out person fleeing]

One Friday in May of 2011, I locked up my shared office, went to the pub with some colleagues and students, and said goodbye to my job as a senior lecturer in psychology.

On the following Tuesday (it was a bank holiday weekend) I started a three-month stint as an intern at a then-mid-sized software company. They were pretty clear that there wouldn’t be more work at the end of it; all I had going for me was that they were paying me — a lot less than my academic job paid, but hey, it was money. (Let’s not even start on the ridiculous exploitation of young people by companies looking for free labour, or how unpaid internships exclude those who can’t afford to work for free.)

Anyway, so … lunacy, right?

Maybe. But maybe it saved my life.

I cannot possibly supply a complete list of the things that drove me out of higher education. Some of the factors, in no particular order, were:

* the way the system effectively punished people for caring about (= preferentially putting time into) teaching, denying them a legitimate career route with equivalent promotion opportunities

* relatedly, teaching and educational research being seen as second-class citizens next to subject-specific academic research. I won lots of praise from colleagues and students for my interrogation of and challenges to typical lecture-theatre methods … and nothing else. Alanis Morissette was right: it really is like 10,000 SlideShare views when all you need is a peer-reviewed journal paper.

* seeing colleagues struggle with depression and stress-related illnesses, without support or sympathy from senior management; the relentless, meaningless “do more with less” that rubbed everyone rawer, year on year

* not being allowed to reform teaching on a large scale, because timetables/curriculum/this is how we’ve always done it

* widespread bullying by an incompetent manager, to the absolute indifference of senior management

* prioritisation of commerce over crafting quality learning experiences; massified, McDonaldised education

* the ceaseless piling of extra duties onto academic staff, with no thought for how overloaded they already were (see also: aggressive expansion of higher education; franchised degree programmes)

* resistance — in some cases hostility — to change and growth (personal and institutional)

* too much time, distance and obfuscation between what we did all day and how the organisation as a whole performed

* an itch I had that wasn’t being scratched: creating and building useful, beautiful things (there’s only so far even I can go with lecture slides)

* realising that there were plenty of people out there doing jobs that didn’t leave them exhausted (because they weren’t working 55-hour weeks), demoralised, or on the edge of a mental-health precipice, and who could see, almost every day, a connection between what they had done and how their organisation was moving forward

* the sad thought that maybe higher education was broken and that, despite having nearly boundless energy to do something about that, I couldn’t fix it on my own, or even alongside people who agreed with me

* weariness at being an ‘expert’ all the time. (Maybe being an expert sounds great to you. Maybe I just have imposter syndrome. But trust me: having undergraduates unquestioningly write down everything you say gets old.)

The drain of good people from higher education has become A Thing. Way back, I wrote about Mark Changizi’s decision to leave, and since then there have been several waves of “screw this, I’m out” from academic refuseniks who just didn’t want a piece of that anymore. I’ve written before about disruptivity and taking risks, and recently I talked about it in person too (slides from the talk here). There were some pretty low moments; I remember sitting in an all-day meeting that was an absolute waste of everyone’s time, never more so than when the person chairing the thing visibly gave up bothering with it, but kept us all there anyway. I remember thinking, “I have to get out of here, but I don’t know how.” At that point, had I needed to reapply for my own job, I might not even have been granted an interview. I was perishing, not publishing. I advocated passionately for teaching quality and threw myself into researching better teaching methods, and none of it was doing me a blind bit of good.

So I left.

It’s taken me three years to write about this, and even now I’m a little hesitant to talk about it, in case I accidentally explode, covering everyone around me in something unpleasantly bitter and acidic. That sounds pretty overdramatic, but teaching was, as the cliché goes, my vocation. I loved — LOVED — my students. I never understood the detachment and burnout you sometimes see in academia (where, fortunately, the consequences are rather less severe than in disciplines like nursing). Every single student had potential, even the ones who didn’t know why they were there — and you didn’t have to dig very deep to find a human being who was just trying to do well and figure out how they fitted into the world. They were all, individually, amazing people. I still want to write something for and about them. But this is not that post.

In the first year after I left, I fielded several calls and emails from other academics (mostly in the behavioural sciences) who wanted to get out too, but didn’t know how. This post is for you guys, and especially for D, who’s waited a long time for an answer, and probably gave up on me way back: let me help you remember all the things you’re capable of. You might never get all the dents out of your self-esteem after the years you’ve spent in academia, but I might be able to help you with it a little, if you’ll let me.

Let’s dispense with the easy stuff first. You (probably) have a PhD. I’ve alluded before to how people Out There are actually impressed by this. Lord knows nobody in academia gives a rat’s ass, and so neither do you, anymore. But think for a minute what that means. Firstly, you are an expert at something, however much you might not feel like one. This is huge. You have in-depth knowledge of something. Don’t gimme no backtalk about how that’s only useful in academia. That’s just the story you tell yourself because you’re unsure of what to do next, or because academia has left you with Stockholm Syndrome. You know stuff about stuff, and somewhere out there is an organisation with someone in it who wants you to do your thing, for them. As an example, I’ve taken what I knew about cognitive psychology and put it to work in software usability, user experience, and information architecture. I took a decade’s experience of running research with human participants and channelled it into learning how to research the ways that people interact with software. Somewhere out there is a practical application for the academic stuff you know how to do. It might even be at a non-profit, if you’re uncomfortable at the thought of doing the corporate dance. (It wasn’t all theory, either — I got to have fun too, storyboarding and scripting educational material like this short animation, which I decided would be more entertaining for everyone if we did it entirely in rhyme.)

A second consequence of having done a PhD is that you are persistent as fuck. This is one of the best-kept secrets of academia: even the basic entry criterion requires immense tenacity. Whether or not you have a doctorate, continuing to put one foot in front of the other for year after year, sometimes in the face of harsh criticism and crushing disappointment, is a measure of what Taekwon-do calls indomitable spirit. Treasure that; it was hard won, and it will get you ’most anywhere. You can absolutely call on it to get out of your academic job if you’re unhappy there.

You (probably) aren’t fazed by addressing groups of people. Don’t get me wrong: for the first few years, the really big lectures (nearly 450 students at Peak Psychology — those were the Cracker years) were pretty scary. Some weeks I was literally only one chapter ahead of the class, because when you first start teaching after your PhD, like the hedgehog, you know one big thing … but departmental teaching structure requires that, like the fox, you know many things. There’s a fine line between that existence and imposter syndrome. But eventually, you learn to be heard for 50 minutes or 100 minutes at a time. You learn to craft stories and handle questions and manage crowd control in a big room. Of course, it doesn’t stop there if you present your work at conferences, where the questions are a lot harder, sometimes even hostile.

And now think of all the times in so-called corporate jobs when people are called on to present information and take questions. You’ve been doing that for so long, it’s not even a thing anymore. You intuitively know how much information will fill an hour. You can plan and deliver a workshop without drama (unless the workshop is actually about drama). You might not love it, but it’s something you can do. If you are lucky, you will love doing it. Either way, someone out there is willing to pay you to do this.

You are adaptable and flexible in your thinking, because you aren’t afraid of complexity, ambiguity, or new information. The other day, my husband, also an academic by training, said “I’m really glad to have had an education and career that has involved always being on the edge of intellectual comfort. I think [that] makes it much easier to learn new things.” I couldn’t agree more. Academia is all about re-evaluating your position when new shit has come to light. Sure, this makes it harder for other people to get a straight answer, because your response is usually something like “Well … that depends. A, but on the other hand B, and in fact if we consider C …” It might be infuriating for others that you can’t give them something that’s pithily black and white, but living with this relativism represents a daily practice in flexible thinking, and in not reaching conclusions so entrenched that you can’t argue your way back out of them later as more information becomes available. You’ve become the kind of person who, if they really want to know something, reads or asks until they understand it. Maybe you’ll even get massively into the topic as you start to find out more. Do you know how many organisations out there are crying out for people who can do that, who are intellectually self-sufficient?

You can effectively argue a point, in writing and (probably) in person. Being able to read or listen to something someone else has written, and unpick and critique it, is immensely valuable. Our education system doesn’t really foster critical thinking skills as much as it might, but you have had plenty of practice defending your own work against this dispassionate taking-apart. If you’ve done much teaching (or marking of student work), then you’ve also had the experience of trying to explain to non-experts why an argument doesn’t stand up, or is subtly wrong. Out there, in Beyond Academialand, are many, many people, some of them quite senior, who need help with this — often because they want to get it right, but also sometimes because they don’t want to look like idiots. At least some of them are willing to pay you to do it.

You have immense resilience and can work as hard as anyone. It’s still sometimes a shock to me that in my post-academic life, I get to arrive at work around 9 and leave again around 5 or 6. I don’t take my work home with me, beyond idle mulling of the occasional knotty problem. I don’t work weekends. I don’t feel remotely guilty for not working evenings and weekends. Contrast that with the typical lives of academics, who pull long hours and spend so many evenings and weekends writing papers. Some of that is for love, but much of it is practicality — because who can get anything done during the week when there is teaching and admin to be done and people keep knocking on the door? There is no traffic-cop for academic workload; it just keeps piling up. If something else needs to be done, because the university or the subject governing body says so, then somehow it will just have to get done, and that means longer hours — or shoddier work. Like the triangle says, you can have it good and fast, or fast and cheap, or good and cheap, but you can’t have all three. As for resilience, it’s only anecdotal, but I have seen a lot of academics — and teachers generally — put off being ill until a time when it was less inconvenient (the holidays). Don’t tell me from resilience.

I’m not saying there aren’t jobs out there where your employer will cheerfully bleed you dry, or that there aren’t places with long-hours cultures that disadvantage anyone not young, single or rudely healthy. But either you’re used to those long hours anyway, or you’re highly motivated to work somewhere where they’re not expected. Your life can be better than this.

There will be some readjustment; this is unavoidable. The first interview I went to when I was trying to get out of academia was with a medium-sized, well-respected software company in Cambridge. I was fairly pleased with how I’d done until about halfway home, when I started to realise all the rookie errors I’d made. That trickle became a flood, until I thought I would die of embarrassment. It took me months to get over it, but they were right not to hire me, because I didn’t have enough experience: they couldn’t afford to spend time bringing me up to speed while they were trying to ship a product. As I gained more experience in software and digital, I had to learn the hard way to think in hours and days, not in semesters and years. To track time, and report back to people who needed to know what I’d been doing. I had to learn to work in a team again — to have conversations about work, with the people I worked with, every day. To own it when I messed up, instead of hoping nobody would notice (out here, they do notice. And that’s a good thing). To keep regular hours and be in the office every day. This might sound regimented compared to the largely unmonitored life of an academic, but it was surprisingly easy to adapt to. The same self-discipline that got me out of bed at 6am to give a 9am lecture (I had a one-hour commute, which, when it went wrong, went really wrong — and you can’t be late for 200 people) helped me adjust to keeping regular office hours.

Adjustment isn’t big or hairy enough to justify putting off leaving. Admittedly, going in somewhere as an intern was a relatively safe thing for me to do: the company’s expectations were fairly low, so it was an easy bar to clear, and they got someone with way more experience and knowledge than they were paying for. Everybody won.* I’m not saying I didn’t screw up a few times, but every single incident taught me something, and the enforced humility was good for me. And that’s the thing: a radical change of environment was good for me. Saying “I don’t know” and not being an expert — or having anybody treat me like one — was good for me. Those first few months, by the time I got to Friday I was totally exhausted, because I was having to learn so many new things. It was like being back in school again, and I mean that in the best ways, but without the feelings of inadequacy. Occasionally I would enjoy surprising people with interesting and relevant things from my background in psychology. And one thing I never expected about the transition was how much more seriously people take you when you’re older, even if you don’t really know anything (as opposed to when you’re 18 or 21 and absolutely everyone except you knows that you know nothing). It’s an unfairness, but given how much our world capitulates to the cult of youth, especially if you work in software, it’s one I can live with. And anyway, 37, my age when I went off to be an intern, is hardly old. Point is: my life is better in almost every way.**

This has now come full-circle: my husband just quit academia. Despite a publication record considerably more handsome than mine ever was, he grew tired of the succession of temporary contracts and empty promises of a permanent, senior position. He doesn’t quite know yet what he’s going to do; I’m finally returning the favour he did me when he agreed to be the breadwinner while I figured out how I was going to make a living. We don’t have kids, and I acknowledge that it would have been a lot harder for either of us to do this alone. But I’m not convinced any of it’s insurmountable.

Look: my old colleagues are tired and bruised after a long battle to see if anyone was going to have redundancy forced on them (end result: no, but four voluntary redundancies. I think the oldest is in their mid-40s. These are not people taking a sweet handout before sloping off into retirement — they are getting the hell outta Dodge). Nobody trusts management anymore, and who can blame them? Academic staff have seen their jobs go from does-anyone-want-to-get-a-coffee to locking-myself-in-the-office-and-taking-my-antidepressants in the span of a few years. This is not what we signed up for.

So go, or at least think seriously about going. Think hard about all those hard-won skills that you take for granted every single day. Skills employers want. Skills that someone somewhere else will pay you to use. I use what I learned in psychology (the theoretical stuff and teaching-related skills) every single day. I still use empirical data to justify decisions — the decisions just have different, more practical, and usually more immediate consequences. And I really like that. I like visible progress. I mean, there is literally a weekly chart of how much closer our team is getting to where we want to be. It’s incredibly motivating.

And if you choose to stay (and I have the utmost respect for those who do), make sure it really is a choice. Don’t tell yourself you’re no good for anything else, because that’s just not true. Call me (my personal inbox is in perpetual meltdown, but I have plenty of time to take calls on my lovely walking commute). Call any of the people who have decided academia’s no longer worth the pain. We think you’re amazing, and we’d be happy to remind you of that anytime.

[Edit: I neglected to thank the many people who helped me make this transition. As well as being eternally grateful to my husband for his love and encouragement, I am forever indebted to the friends who let me stay with them while I was interning (and who refused to take any money from me), and to the many friends and former colleagues who wished me luck and success, and offered connections they thought might be helpful (and who, if they had any doubts about my plans, kept those to themselves). Thank you all :) ]

* except the young person somewhere who might have had the internship instead. I like to think that my time as a lecturer (and for a while as an assistant Taekwon-do instructor), investing time in young people, might begin to make up for some of that, but it still bothers me.

** I miss teaching, I miss my old colleagues, and I miss my students. But I knew I would, and I jumped anyway.


How to be disruptive: a retrospective primer, with meerkats.

‘Disruptive’ doesn’t mean what it used to mean. Being disruptive used to mean you’d be in trouble pretty soon: with your teachers, your parents; with other kids’ parents. You know — grown-ups. Back then, being disruptive was seen as bad, and not something that would get you very far in life, beyond maybe the head teacher’s office.

Times change. ‘Disruptive’ has now acquired cachet, to the point where it seems in danger of becoming one of those overused words (see also content, innovation, gamification, strategy, etc.) signifying that the speaker might not actually make things for a living¹. But underneath all the buzzwords and hype is a kernel of truth: there’s loads of potential value in disrupting those patterns that keep you, or your organisation, down. Shake things up a little.

(I’m not talking about knock-and-run here, by the way — it’s much more like “hmm, I wonder what happens if I do … this?” It’s actually very science lab.)

Over the last couple of months, while I’ve been literally and metaphorically packing up my office, I’ve been thinking a lot about disruptivity and its role in my recent career. I’m using the word ‘disruptivity’ deliberately here, rather than ‘disruptiveness’ or ‘disruption’, since both of those seem to me to connote someone else having screwed something up in a way that is antisocial and anti-progress. Disruptivity is good disruption: it has agency, and can number among its antonyms complacency, stagnation, and that nice cozy place with the sofas, the ‘comfort zone’.

It’s easy to stay inside your comfort zone if you work in a big organisation: there are established procedures and methods, and a culture of handing these things down to the next person. In a big organisation, you really don’t have to think too hard if you don’t want to, because nearly any question you care to ask has an answer that begins “well, the last time we had to do that, …” I guess it’s probably not worth getting too pissy with organisations about this, because human beings have behaved this way for tens, maybe hundreds of thousands of years, and on the whole it’s served us pretty well. But the flip side of organisational memory is that procedures and practices have a way of stifling creative thinking by squashing people down into silos. You are a lecturer; you are a psychologist. You will give lectures in a prescribed format, in which you will talk about psychology as defined by the requisite accrediting body. You will go to psychology conferences and conduct psychology research. You are a subject-matter expert.

And yeah, I was pretty compliant to begin with. I mean, I didn’t know anything; who does, after spending most of their life in formal education? But, after several years’ consideration, my response to this way of working is “Yeah … no.”

There’s a technique in the cognitive psychology literature called analogical problem-solving [PDF], in which you take your bleeding-edge science problem and try to reframe it in a more familiar context. Analogical problem-solving allows you to take advantage of all the schemas and chunking you’ve developed by spending time in the familiar domain, thereby freeing up more of your cognitive resources to think about the problem at hand. It strikes me that an important prerequisite for disruptivity is the desire or ability to travel towards unfamiliar domains — exploring foreign spaces and the behaviour of the people who live/work there can actually help you think about your existing problems in new ways.

Here’s the thing: all the really cool people I meet are the ones sticking their heads out of their organisationally-sanctioned silos, and asking “Hey, what are those people over there doing, and might it be of value to us?” — the meerkats of the workplace, if you like. Curiosity is disruptive; it’s pretty hard to remain in your comfort zone when you’ve wandered out of your area of expertise and into someone else’s. That’s one of the reasons I enjoy doing peer observations with my lecturing colleagues: you see styles of teaching and ways of thinking about classroom interaction that you’d otherwise never be exposed to. Good conferences (you know, the ones where there’s space for conversation, not just showboating) are the same.

Perhaps not surprisingly, it actually seems easier to cross these borders on the Internet than in one’s own workplace, maybe because people who are into online networking have effectively put up a sign on their virtual office door that says “please, bother me — I love it!” Compare that with how it feels trying to set up a meeting about that thing with that guy across campus when you’re both really busy. Asynchronous digital media are intrinsically disruptive, because they put you next to people in countries, cultures and professions that you’d never otherwise know anything about — and those people are often meerkats with disruptive ideas of their own. Though I didn’t know it at the time, signing up to Twitter was one of the smartest things I could have done for my career. In fact, about half of those people currently most influential in my life have come to me through the Internets — and they are, without exception, cross-disciplinary ninjas, people for whom the idea of existing in just one silo is just plain ridiculous.

Like Richard Wiseman’s studies showing that ‘lucky’ people are really just those who notice new possibilities [PDF], a big part of embracing disruptivity comes from being open to the potential in life’s random encounters. Example: several years ago, through an old friend of my husband’s, I met the very lovely Rachel Potts. Her job had nothing to do with my job: she worked as a technical author for a software company, while I was a psychology lecturer. But we kept having the most awesome long conversations about communication. And one eventual consequence of this was that I ended up giving a talk at Technical Communication UK. Until then, I’d barely known that technical communicators even existed, much less that they might be interested in applying cognitive psychology to their own work. But boy, were they. And so, in making my own journey out into a scary new space, it seems like I disrupted a few other people’s complacency, too. (Of course, you can argue that by attending a professional conference, those people had signalled that they were looking for a bit of disruption in their working lives — but props nevertheless to the TCUK team for expanding their speaker base beyond the traditional edges of technical communication.)

If your social circle isn’t putting you together with people who understand your geek thang, just get out there and talk to people who work in a different area. From my conversations with technical writers (most of which were mediated via Twitter; don’t diss the 140), I learned about software simulation. That struck me as pretty cool, so I learned how to use Adobe Captivate and, with a little help and only minor drama, created some resources to help my students learn how to drive statistics software. Conversations with technical authors also helped steer me towards the field of user experience, which has come to form such a huge part of the way I think about interfaces, learning and cognition that I’m shifting careers to go work in UX. The consequences of disruptivity are sometimes unpredictable, but they may also be transformative.

Maybe you like the idea of disruptivity and the cultural exchange of visiting someone else’s sandbox, but career changes and meeting people all sounds a bit extreme? Well, you don’t even have to introduce yourself: just read the Internets. There are all manner of smart bloggers out there who might not do what you do, but who write about it so clearly that you get it, and you get why it’s relevant to you. If you work with people and/or ideas (and if you’re reading this, I’m pretty sure that you do), I’d particularly recommend Seth Godin and Rands, and also, though he’s perhaps more of an acquired taste, Merlin Mann.

Your search doesn’t even have to be all that targeted: for me, it started almost by accident with Lawrence Lessig’s ‘Free Culture’ talk from OSCON 2002². What possible value could a psychology lecturer find in a long talk about copyright? Okay, how about a perspective-shifting way of delivering lectures? Lessig’s presentation, and this talk by Dick Hardt, provided the change of reference-frame I needed to give my teaching a good kick up the ass. The ensuing domino effect of making those changes led to research, funding, and paid consultancy, plus a couple of international conferences. And it hooked me up with some really interesting and cool people, and they turned me on to a whole bunch of other new stuff to use in the classroom, like Pecha Kucha. Disruption begets disruption, and after that it’s pretty hard to go back into your silo.

No, check that: it’s impossible to go back into your silo. Disruptivity means rejecting the easy life. You will no longer be satisfied with the explanation that “this is how we do things around here”, because you will know that out there, someone else is doing it better, smarter, more efficiently. You will know this because I read a thing, wait, let me email you the link … You won’t win every argument this way, but you will go forth armed with evidence, and your organisation will be a better place for most of your interventions, which of course is what the whole disruptivity thing is all about.

Lastly, if you really want to be disruptive, leave. [If there’s one link from this post that encapsulates disruptivity, it’s that one. Click through and read; it’s only short.] Leaving isn’t an inevitable consequence of embracing disruptivity, but I’d say it’s a likely one. I mean, you can’t spend all that time out of your silo and not wonder about what else might be out there. But consider, too, that your decision to leave also changes things for the organisation you are leaving. It forces your manager to think about whether you need to be replaced. Co-workers who rely on you will have to seek out alternatives; maybe your decision to leave will prompt some of them to become meerkats. Everyone gets a reminder that there is life out there beyond the organisation’s walls, and I consider that to be an inherently good thing.

So, yeah: leaving an organisation can be your last gift of disruptivity. Make it a good one :)

[This post is dedicated to all my awesome colleagues at UCLan who have borne my clumsy attempts at disruptivity with incredible grace and patience. I will miss you more than I can say.]

[Students — you’re getting a post of your own. Watch this space.]

¹ I kid, mostly. I mean, I use these words a lot. But I also think that, when the buzzwords start flying, it’s useful to gauge the ratio between talk and eventual action. And there is a lot of talk on the Internets.

² In fact, if you want to live a more disruptive life, you should probably just hang out with my husband, since he was the one who turned me on to the Lessig talk, and he stumbles upon a lot of interesting and diverse content.


Getting learners to build things out of concrete (examples, that is).

This post is not about assessment.

A few weeks ago, before every other word on the internet became Wikileaks*, there was a lot of buzz about this piece in The Chronicle by someone who writes students’ essays for them, for money.

I’d like to think, gentle reader, that you, sitting alone at your computer with a cookie arrested halfway through its trajectory to your mouth, are reeling at this astonishing news: students buying their way through a degree? Say it ain’t so! But if you’re reading this blog at all, you probably already know about essay mills, so finish your cookie and let’s move on, because essay mills make me sad, and they probably make you sad, too.

Anyway, I got to reading the (numerous) comments on the Chronicle article and ended up at another essay-mill confessional. And this one absolutely stopped me in my tracks:

I doubt many experts spent most of a decade writing between one and five term papers a day on virtually every subject. I know something they don’t know; I know why students don’t understand thesis statements, argumentative writing, or proper citations.

It’s because students have never read term papers.

It’s so obvious, in hindsight: students never see enough essays to be able to abstract the rules of what makes a good one. I mean, think about the essay as a form — often the form — of undergraduate assessment: we’re basically asking students to build a working aircraft without ever having seen one. We give students some classes about Newtonian mechanics, show them a few force diagrams, then say “right, build me something that will fly me to France for lunch.” The students duly experiment, with most making emergency landings in fields, while we rally ’round, saying helpful things like “but you didn’t put the right kind of fuel in” and “why haven’t you finished building this wing?” Typical student responses include “there are different types of fuel?” and “but you didn’t cover that part of the wing in class.”**

No wonder some students find it easier to have a quiet word with their local aircraft retailer. And I’m not saying this to excuse the essay-mill companies, whom I deplore for the simple, selfish reason that they are devaluing university degrees, diminishing my own efforts and those of my students. I mean, I don’t think you will ever convince me that their aww-shucks-we’re-just-providing-exemplars-what-do-you-mean-students-are-handing-this-work-in-as-their-own schtick is anything other than a thin veneer of bullshit designed to stave off the lawyers. But I also think that if hacking the system is as easy as paying a few dollars here and a few dollars there to someone who will effectively learn for you, then, well, maybe the system isn’t very good. Simon Bostock has some nice thoughts here on why this problem won’t go away until universities wise up.

But as I say, this post isn’t really about assessment; it’s about learning. Quite a lot of our knowledge is rules-based, like knowing “I before E except after C”, and “don’t talk to Roger until he’s had his first coffee of the day”; we rely a lot on these rules of thumb to help us make sense of the world. Students’ whole lives are about learning rules: how to write an essay; how to format a references list; how to make sure the electricity in your flat doesn’t get cut off. Very, very broadly (it’s possible that this dirty shorthand explanation is going to upset some people), there are two ways of acquiring these rules: learning the abstract principles, and learning by experiencing concrete examples for oneself.

Guess which category most university education falls into.

None of this really cohered for me until I watched a colleague from a different department teaching a group of new students the Harvard style of academic referencing. While not the most stimulating topic, this is nevertheless pretty relevant, because it underpins much of students’ written work during their degree.

Here’s one way of teaching Harvard referencing:

* surname followed by initials

* year of publication

* title of article

* title of journal (italics), its volume (italics), page numbers.

These abstract rules work well as a recipe for writing out your own reference list, but they’re not that great if you’re actually trying to internalise the rules. They’re pre-digested; there’s no work left to do there, so the bits of information slide over us, and each other. There’s no friction. Also, there are a lot of pieces of information there: six(ish) basic components, but many more if you also include the order in which they must be assembled, and details like which bits get italicised and which don’t. That’s probably too many.
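
(An aside for the software-inclined, because it makes the point nicely: the abstract rules are so pre-digested that a machine can execute them perfectly without understanding anything at all. Here’s a minimal sketch in Python, mine and purely illustrative, with a made-up reference, showing the recipe being followed to the letter.)

```python
# A toy formatter that follows the abstract rules above to the letter.
# Asterisks stand in for italics, since this is plain text.
def harvard_journal_reference(surname, initials, year, article, journal, volume, pages):
    return f"{surname}, {initials} ({year}). {article}. *{journal}*, *{volume}*, {pages}."

# A made-up reference, purely for illustration:
print(harvard_journal_reference(
    "Badger", "B.", 1999, "On digging", "Journal of Sett Studies", 12, "34-56"
))
# Badger, B. (1999). On digging. *Journal of Sett Studies*, *12*, 34-56.
```

Executing the recipe is trivial; internalising it is the hard part, and following a recipe generates no friction at all.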

Here’s a different way of teaching referencing:

Aardvark, J.R. (1980). Ants, and how to eat them. Journal of Orycteropodidae Studies, 80, 11-17.

Barker, R. (1982). Rum babas, and what to do if you’ve got them. Reading: Goodnight From Him.

Halley, W. (1955). Rock Around The Clock. New York: Decca.

Izzard, E. (1998). Cake or Death? Gateaunomics, 10, 195-196.

Lemur, R.-T. (2010). Strepsirrhinoplasty. Antananarivo: Raft Press.

Leonard, E. (1996). Out of Sight. New York: Harper.

Shorty, G. (in press). Okay, so they got me. Los Angeles: Cadillac.

* What are the rules by which this reference list is organised? Name as many as you can.

Here, to understand the rules, we have to do a little work. But it’s sort of fun; working out the rules is a barely-game. And the thing about abstracting the rules for yourself in this way is that the process is messy, tracking its muddy footprints all over your memory. Which is exactly what you want.
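
(If you think like a programmer, the process might feel familiar: hypothesise a rule, test it against the examples, keep it or discard it. Here’s a toy sketch of my own, in Python, checking one candidate rule against the list above; the code is illustrative, not part of the exercise.)

```python
# Candidate rule: the entries are ordered alphabetically by surname.
# Test the hypothesis against the concrete examples.
surnames = ["Aardvark", "Barker", "Halley", "Izzard", "Lemur", "Leonard", "Shorty"]

if surnames == sorted(surnames):
    print("Rule survives: ordered by surname, not by year or title.")
else:
    print("Rule falsified: back to the examples.")
```

The muddy footprints come from the testing, not from being handed the rule.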

Here’s a half-baked thought: you can’t teach abstract principles nearly as well as people can teach themselves using concrete examples.

Science as a university subject relies on practicals as well as theory, but we still spend a lot of time telling students what the rules are, rather than letting them abstract those rules for themselves. For starters, I think this is a very paternalistic*** way of treating people who are supposed to be adults. But also, I’m pretty sure it constitutes poor practice, since putting in a little mental effort is rewarded in the long term by better retention and understanding. You thought your teachers were sadists, giving you worked example after worked example? Well, maybe they were — but my point is, they were actually helping you out, too.

But university is not high school. And the thing about being an ‘expert’ (and if you’re lecturing to university students, then you are, by many people’s definition, an expert — even if no-one fully understands what it is exactly that you are an expert in, other than that you “do something with computers”) … the thing about being an expert of any kind is that it’s so, so tempting to provide helpful short-cuts, like well-meaning parents who hand down sensible advice about life to their children. We’ve all been given that advice, and I’m pretty sure that we all learned more profoundly from the consequences of ignoring it than we ever would have if we’d listened in the first place. The trick that education often misses is that abstract rules are easy to ignore until we understand their relevance, by which time we’re usually pretty deep in our own personal concrete example. Or deep in something, anyway.

I recently spent some time with a friend who is trying to learn about organic chemistry but finding some aspects of it hard. I enjoyed chemistry at school (um, 20 years ago), so we sat down together for an hour to try and work through the IUPAC scheme for naming chemicals. Now, you can try to learn all the rules for naming molecules in organic chemistry, but there are lots – they go on for two or three pages of my friend’s textbook. That’s a lot of abstraction, and we know that concreteness helps us learn (stodgy academic explanation; human-readable explanation). So instead, we looked at some specific examples of structural formulae, along with their names, and tried to abstract the rules of naming based on the information we had. And you know what, it worked pretty well. In fact, naming in organic chemistry is basically a language and visualisation problem, not a chemistry one, so I learned the rules quicker than my friend did, because language and visualisation are more my bag than they are his. But I’ve yet to meet someone for whom the exemplar approach flat-out doesn’t work.

Of course, when I talk about abstracting the principles from a set of concrete examples, what I’m really talking about is pattern recognition. Pattern recognition will be one of the essential 21st-century skills. It’s not about finding information anymore — now it’s about finding the right information, rejecting the irrelevant stuff, and knowing how we might go about telling the difference. Hat-tip for that link goes to Lee Skallerup, who suggests:

Get students to analyze the writing (and the comments) to see what kinds of patterns emerge, what they can see if they take the time to look.

If we want to prepare students for the 21st-century workplace, we should be teaching pattern recognition, using exemplars and letting students figure out the rules for themselves; those are the skills they are going to need when they go out into the world. It shouldn’t take much effort to shoehorn this sort of activity into the classroom, or to get students to understand the basic process — we abstract rules from concrete examples all the time (take this discussion of what differentiates men’s and women’s shoulder bags, for instance). As @Dave_Ferguson points out in that post,

The effort to make the tacit knowledge more explicit encourages reflection and revision … Concrete examples help people work their way toward more general principles.

And here, try this on: “assessment should fundamentally be about building learners’ capacity to make informed judgements about their work” (@cathfenn, via @hypergogue). I couldn’t agree more with this: success as a teacher or learning facilitator is watching the learner walk away, not needing you anymore (and ideally, exceeding you). But to be able to get to the point of critiquing their own work, the learner must be able to move from tacit or implicit understanding of the rule to being able to describe it explicitly. And I’m struggling to think of a case in which having concrete examples would not make it easier for learners to explicate the rules in this way.

Once I started thinking about this abstract/concreteness issue, I started seeing it everywhere. For example, on my third-year module, I set students a piece of coursework that is opt-in, and pass/fail: in exchange for demonstrating that they’ve done some of the groundwork, successful students receive some additional information that will help them think about the topic. Quite often, students fail to grasp what is required of them (a failure of pattern recognition that may well originate in my explanation of the task), so I ask them to resubmit. Recently, a student emailed me work that wasn’t up to scratch, so I suggested she try again and resubmit, which she duly did. But when I got her work back a second time, it was still missing the point. So I thought about it for a bit, and then I sent her the additional material anyway. And you know what? I got a very nice, very grateful response, saying that she now realised exactly why her original submissions hadn’t been right. Three simple points of triangulation (two “wrong” answers and one “right” one) constituted enough information to start abstracting some rules.

Really, the more I think about it, the more I think that using concrete examples and letting students abstract the rules for themselves is really just another variation on show, don’t tell. Which is honestly the best advice for learning design — or communication of any kind — that I know. And hey, maybe if we want to assess learning in ways that are less easily hackable, we should engineer a system of assessment that requires students to show us, as well as telling us, what they’ve learned. Let’s have assessments that test (a) implicit knowledge of the rules, (b) explicit knowledge of the rules, (c) awareness of situations in which the rules may not apply, and (d) the learner’s awareness of their own progression in terms of grasping (a) through (c), because there’s nothing like getting students all metacognitively tooled-up.

Okay, so maybe this post was a little bit about assessment.

(PS — If you’d like to read more about abstraction and concreteness in learning design, you may find this short paper interesting.)

* Here I’m inserting a reference to current events, so you know I didn’t just record this sometime last year.

** Just so we’re clear, this is a metaphor; I’m fairly sure this is not how Aeronautical Engineering students are assessed. At least, I hope not.

*** I know it’s not the done thing to laugh at your own jokes, but recently I had occasion to tweet “But paternalism is good for you. Here, swallow this.” You have to seize these opportunities where you find them.


It Gets Better: how vocational educators can stay sane and relevant in HE.

The persistence of the myth that “those who can’t do, teach” is incredibly damaging. Academics who spend more time and effort on their teaching than on their research are often looked down on — not by their peers (most of whom tolerate, even champion, their passion and innovation), but by senior colleagues and managers — the people who make hiring decisions, funding decisions, promotion decisions. If you don’t work in the education faculty, geeking out on learning and teaching is very, very unfashionable, and marks you out as that poor relation of HE, the vocational educator.

My friend S recently completed her PhD, and is looking for work in academia. She’d like a teaching-focused job, because teaching is her passion: she gets it, and is by all accounts a stellar and highly-valued teacher — exactly the kind of person you’d want teaching your kids when they go off to university.

Getting a lecturing job these days is hard. A decade ago, I landed a permanent lectureship in one of the newer universities, straight out of my PhD and with no publication record. Man, those were the days — S is now competing for jobs against people who already have lecturing experience and a string of academic papers to their name. With excellent references, she gets interviews; but while a passion for teaching used to be very persuasive, now it’s all about your publication record — and S doesn’t really have one.

S doesn’t care so much for research (though this may just be a phase; coming out of my PhD, I didn’t either). She cares about teaching — her PhD is essentially educational research. She loves to watch students learn and develop their confidence; she likes refining the teaching process to make things work better next time around. She enjoys having meaningful discussions with people at conferences about how to change education for the better. She doesn’t believe that the impact of educational research should be restricted to people who read academic journals.

Talking with S rang a lot of bells with me.

I went back to my desk and started writing S an email to suggest ways in which her work could have an impact beyond the traditional academic route of publish-publish-publish, but very quickly realised it was turning into another link-heavy infofest that’s really a blog post. So here it is, and maybe it will even be of help to other people, too.

Be yourself. Start by reading this piece by Jo Van Every, in which she talks about how to make being an academic work for you:

“If your idea of a great academic career involves being a fabulous teacher and the pressure to publish seems unreasonable, [then] you should not even apply to Research Intensive institutions even in a bad labour market.”

Jo works in Canada, but the prevailing HE climate and infrastructure there is very like that of the UK. Jo’s job is to help academics make sense of our jobs as they are, and to help us shape what we want them to become. I chat with her sometimes on Twitter, where she’s @jovanevery; she’s a sweetie, and her website is full of great advice that will leave you feeling more in control of your career and less like you’re caught up in a system that doesn’t always speak to your values.

Acknowledge that academia doesn’t encourage sharing or nurture team players. Jo’s post points to a brilliant essay by Lee Skallerup:

“While more and more scholars are using sites like Academia.edu or SlideShare, and even self-publishing, this type of sharing isn’t rewarded when it comes time for decisions on hiring, tenure and promotion. We are taught instead to hoard our research and findings to share with a potentially smaller audience in venues with more ‘prestige’. Why not work to improve Wikipedia in whatever field you specialize in? […] But because the medium is ‘crowdsourced’ instead of peer-reviewed, career-wise, my work there would be meaningless.”

Boy, do I hear that. To paraphrase Alanis Morissette, it’s like 10,000 SlideShare views, when all you need is a peer-reviewed publication. The REF definition of ‘impact’ might never be broad enough to encompass this stuff; meanwhile, a lot of people remain disenfranchised because they’re not producing ‘the right kind of work’.

You don’t have to fight the system, though if you’re feeling punchy, Lee Skallerup’s blog is a great place to start. But there are other ways of getting your work out there.

The absolute best thing you can do is connect with other people who share your passions.

1. Connecting with like minds will remind you that you are not alone — on your worst days it will reassure you, and on your best days it will inspire you.

2. Connecting with others will earn you valuable exposure outside academia, and this will lead to opportunities you would never have been given in HE.

3. Innovation through conversation. While a lot of science is just about trudging alone through the mud, translational research needs people who are willing to sit at the sometimes-uncomfortable-but-never-dull junction between subject areas, and spot how the patterns in one field relate to another. The border between education and science is one such interface, and it’s a rich seam to be mined.

Below, I’ll try and elaborate on these points, based on my own experiences; if anyone would like to share their experiences of edu-networking in the comments, then that would be lovely.

Academia can be a lonely place, so take charge of your personal learning network. Even when you’re collaborating with colleagues, academia mostly consists of getting your head down and getting on with it, alone. This can be a good thing: geeking out is virtually the raison d’être of academia. But in between those inspirational, accidental conversations over coffee or in the corridor, there are often some pretty long spells in which you don’t really talk to anyone about anything of substance. This is partly about the pressure of time: it’s hard to have “OMG, yes! We could do …” conversations when you have a pile of essays to mark, an inbox the size of Mount Fuji, and are starting to forget what the faux-wood of your desk looks like beneath all that paper. But I suspect it’s also because academia self-selects for people who are actually quite happy to lock themselves away in an office or lab for days at a time.

This is where the personal learning network (or PLN) comes in. If you are the kind of person who feeds on thought-provoking, inspirational discussions about learning, then even if you have the best colleagues in the world (and mine come pretty close), you are still not going to be having these conversations as often as you’d like. Some of the best conversations arise through the friction of differing experience, and a lot of the time, the people you work with share the same experiences, the same knowledge, as you. Also, they might not prioritise a discussion about educational change over getting a paper written, or fighting their inboxes. Luckily, we now have the Internet — and the bits that aren’t full of cat pictures and porn are absolutely packed with people who thrive on conversations about how to make education better. This is your PLN; you just haven’t met them yet.

The great news for you is that Twitter loves anyone who can talk passionately and accessibly about education. Most people think of Twitter as a social networking tool, and certainly if you want to exchange meaningless statements about what you ate for breakfast or the reality TV show du jour, then there’s plenty of that going on. But what’s less well known is that there are also really dynamic professional networks emerging, centred around things like learning, education, and technology. This facet of Twitter is probably invisible when you start using it; if you don’t know anyone else who’s into what you’re into, then it’s hard to find the good stuff. But once you start following people who are talking about, and linking to, the things that interest you, then that’s when it all really takes off. If you’d like to get the general flavour of how Twitter works, watch this 2.5-minute video, made by the very fab Commoncraft.

I’ve been on Twitter for about 18 months, and in that time I’ve met some stellar people, had many great conversations, and been invited to speak at meetings and conferences. Twitter is an always-on edu-conference that I can dip in and out of as my time and my job allow. Note that most of my tweets, and those of the people I follow, actually provide links to content that isn’t limited to 140 characters; things like blog-posts, newspaper articles, and so on. Not that there isn’t value in one-off, pithy comments; but there’s a real culture among educators on Twitter of sharing information, and that’s one of the things that makes it so special. As with most professional networks, you get out what you put in; if you regularly tweet the good stuff, people will start to ‘follow’ you (not as stalkerish as it sounds) — if you tweet it, they will come.

Blogging is another great way of connecting with people. The obvious advantage of blogging, relative to short-form media like Twitter, is that it allows you to rehearse more complex arguments, which other people can then comment on in ways that exceed 140 characters. Blogging is also great for self-reflection; a lot of the time, I don’t fully know what I think about something until I’ve tried writing about it for an hour. Writing about your thoughts and experiences allows you space to be yourself and to tell stories; those moments of insight into what-this-person-is-really-like are the cement that binds your PLN together. In this way, blogging is a great tool for rounding out your online identity, since it’s easy to hitch your Twitter account to your blog presence, such that tweets appear on your blog, and when you blog, it appears in your tweets. People who find one source and like what you’re saying can easily track you to the other place and see what’s going on there.

Your PLN isn’t technology-specific. Obviously, blogging and Twitter aren’t the only ways of connecting with other educators. There are discussion groups on LinkedIn, actual face-to-face (f2f) conferences (and unconferences), Skype, text-chat widgets embedded in people’s blogs, and the good old-fashioned telephone: I’ve used all of these at various points to talk with people about the things I’m passionate about. I’ve also rolled up at conferences already knowing several of the attendees because we’d already chatted online; it’s nice not feeling like a complete stranger.

Some of this does take time; growing a network isn’t something you can do overnight. But it’s hugely valuable. Lee Skallerup again:

Through social media, I have reached a broad audience of academics, teachers, parents, professionals, non-profits and other people who are interested and care about education. I have been invited to contribute blog posts for a number of different sites. My writing has been featured on other sites, UVenus included. Suddenly, not only am I working on a topic I am passionate about, it seems to matter.

Sometimes it’s scary, being asked to contribute to something in your role as an ‘expert’, but that beats feeling irrelevant and disenfranchised every time.

And they will ask you to contribute. A background in learning-related science is perceived as genuinely valuable by learning professionals from other fields. There seems to be increasing emphasis, in the wider education community, on evidence-based practice. Obviously, as a scientist, I think this is brilliant. Oh sure, it sometimes gets misused, like when educators talk the science talk but don’t walk the walk. But there appears to be a real appetite for understanding the mind and brain, and for working out how education can make best use of all this new knowledge: brain-imaging, contemporary behavioural science, and all the rest of it. And they need people like you to act as guides and interpreters; there has probably never been such a good time to get your edu-sci nerd on.

Something I didn’t really expect was that speaking at a conference outside academia is not like attending an academic conference. Perhaps the most obvious difference is that, as a speaker, you probably won’t have to pay the conference registration fee. Depending on the conference and your role in it, the organisers may even cover your travel and accommodation. If you are a non-academic reading this and thinking “so what?”, pause a minute to consider that, for most academics, finding money to cover conference registration, travel costs, and accommodation is non-trivial. Very few departments will fund attendance if you are not presenting work, many departments cannot afford to send you even if you are presenting work, and some academics end up paying their own way, in part or entirely, just so they can go to the conference and gain valuable exposure. Everyone pays the registration fee, which typically runs $200–$700.

Your knowledge and skills have financial value, too. One of the reasons I think vocational educators are treated poorly within higher education is that nobody perceives them as being able to make a financial contribution. You want money and prestige? Bring in a research grant.

Except, it’s not that simple. The expansion of higher education, along with cuts in research and education funding, means that there are ever more people competing for their slice of a rapidly-shrinking pie. And grant applications aren’t just something you can dash off in a morning — depending on what you are proposing and the number of moving parts involved, it can take months.

Which brings us to the slightly euphemistic-sounding Income Generating Activity: increasingly, universities are looking to bring in money in ways other than grant funding and bums on seats. This is where you come in, because your background in education research applies to, well, just about anything. Want to work with teachers to improve classroom education? Develop a course for Lighthouse. Want to partner with industry or professional bodies to improve some facet of their business while also earning your department some much-needed money? Speak to your institution’s Knowledge Transfer people about consultancy work and knowledge transfer partnerships. Educational science has a lot to offer the professional world: public and private sectors can both benefit. (They’ll respect your Ph.D., too.)

Lastly, you are not alone. At the risk of hijacking a good cause, I’ve been blown away by Dan Savage’s recent It Gets Better project, aimed at reassuring LGBT teens that life really does get better, and encouraging them to hang on in there and not give up hope. The trials of higher education are orders of magnitude less serious, but I kind of feel the same way about those: I want to say to vocational educators in HE everywhere that it really does get better. Reach out; social networking will change your life if you let it.


How to become a big e-learning nerd by mistake

Inherit a class. Inherit a class containing a really, really, really dull, repetitive, and entirely necessary component. One that the students must repeat ad nauseam, because rote-learning is the only thing that’s going to make a difference. Anything involving students learning how to drive a piece of software will be perfect.

Teach that class for a decade. Teach it until you can’t stand the rote and repetition any more, and until you find yourself atop a soapbox — metaphorical or actual — proclaiming to anyone who will listen that it is madness to spend valuable face-time with students demonstrating tasks that a poorly-trained monkey could teach.

Serve out another year of repetition, swearing to yourself that This Will Be The Last Time, Damn It. (NB: this works particularly well if the class is one you must teach several times per year, to accommodate the small computer room and the very large number of students.)

Choose a suitable moment during the summer months to crack. Demand — nicely — face-time with the people who run the course*. Explain to them that developments in e-learning are now such that this repetitive task that is driving you crazy and that you cannot stand to teach for one more year without serious risk of going postal can now be experienced by students not for one brief, rushed, 60-minute session, but in their own time and as many times as they like. This will, of course, free up your class times to do something considerably more interactive, like having enough time to answer students’ questions, or discussing ideas as well as specifics.

Watch your colleagues savour the idea. You can tell the moment when they buy into it: it’s about three seconds before they want to know how you are going to achieve this thing.

Gird your loins to achieve this thing. You’re going to need software. And you’re going to need to know how to use it. And then you’re going to need time in which to make things. Software is just about money, and learning how to use it is largely about tenacity. Time, though, you’ll need to create for yourself; nobody is going to pay you to take an extended e-learning sabbatical. And you can forget about manpower: if you want this thing doing, you’re going to have to do it yourself.

Get recommendations. Rope in your network of contacts. Find out from eavesdropping on Twitter that there is something called Adobe Captivate and something else called Camtasia. Pay particular attention when people start enthusing about Camtasia, because in a moment it will turn out that your institution only has a licence for Captivate.

Cultivate your Learning Development people. Our LDU contains some of the nicest and most helpful people you could ever hope to meet. They are super-smart, and they are way nerdy — and nerdy is what you are going to need. In spades. Ideally, your learning development guys will have time to get to know the software a bit, so they can talk you through your early, clumsy steps, and feed you jaffa cakes and cups of tea when you are having a meltdown (thank you, Liz). For full bonus points, they should have managers who are wildly enthusiastic about e-learning and believe that it’s worth the significant investment of time taken to learn a new thing so they can support you effectively while you learn it (thank you, Jim).

I mean, failing that, there are a gazillion text and video tutorials out there, not to mention some enormously helpful people on Twitter. But the learning development guys rule, man; I can’t see person-to-person learning interaction going out of style anytime soon.

Recognise that you are on a learning curve. First of all, it is vital that your software does not always remind you to save individual files before closing the program. It is especially helpful if you can demonstrate this three times inside a week, so that you end up losing the equivalent of about two days’ work: this will provide you with a learning experience that is pretty much optimised.

Swear. Vigorously.

Become a virtuoso of the panic-save, performing Ctrl+S reflexively in your sleep, every three minutes.

Attend a workshop on Adobe Captivate. Devour as many hints and tips as you can, like the fact that it’s possible to record your demo and training simulations simultaneously. Have the blinding realisation that creating good Captivate demonstrations requires exactly the same skills as creating meaningful, transformative PowerPoint animations.

Hate yourself a little bit for thinking PowerPoint when you should have thought slideware.

Embrace everything you know about the psychology of watching things happen on a screen. (Hey, wouldn’t it be cool if there was a word for that? Like televisionomics.) Go to town on the Gestalt laws:

* objects that appear simultaneously will appear to be related
* objects that are the same colour or shape will appear to be related
* objects that are close together will appear to be related
… etc.

Remember to consider the brain’s many idiosyncrasies when processing the flow of visual information:

* inattentional blindness [YouTube] — we may not spot things that are peripheral or that we think are irrelevant;
* change blindness [flash video] — we may not notice changes that occur concurrently with a visual discontinuity such as a slide-change or other interruption;
* gradual change blindness [PDF] — changes that unfold slowly over several seconds may escape our notice altogether. (By the way, the intro to this paper provides a nice overview of the change blindness literature.)

Keep in mind at all times that the brain is really only capable of holding onto about four new things at once.

Wildly, wildly underestimate how much time it will take to create your new project. Plan to create five software demonstrations and five matching software training simulations. Since recording the demonstration takes just a few minutes, annotating it and creating the training simulation will surely only take two or three times as long … right?

Wrong.

Correcting the callouts and highlight boxes and animation timings so they don’t look like they were put together by committee is complicated. Also, writing really clear, unambiguous copy takes time. Start putting in the kind of hours that you can’t blog about for fear of reprimand.

Gamify it. Gamification is a big buzzword at the moment, as people try to budge learning from functional to fun. And god knows you are shovelling some pretty dry material down students’ throats with this software simulation. So lighten the mood a bit: use terminology like “your mission”, and reward correctly-answered questions with slides that say things like “You win a cake!” Include hand-drawn pictures.

Wonder how many students are going to email you to ask when they can collect their cake.

Then start worrying whether students will find the fun and the pictures patronising.

Attempt usability testing. Because of time constraints, this will be pretty crude, basically involving a colleague clicking through your simulation while you lurk behind them making notes about everything that causes them to tut. Helpful feedback will include things like giving people a button to click to move them onto the next screen so they don’t have to wait, and making users aware of the ‘view in fast-forward’ mode.

Recognise that, like any process of product development, this is a cycle. Try to be okay with the fact that each time you look at your work, you will find things wrong with it. Try to remember that it is okay to produce a first draft that is good enough; there will be time for refinements later.

Start fantasising about mass usability testing involving all the students in the class. Wonder about rigging up cameras in the computer labs; get cold feet when you start to consider the privacy, consent, and data protection issues.

Resolve to ask around for help with some kind of smaller-scale usability testing anyway.

Angst about making it accessible as well as usable. Enable the accessibility options. (Hat-tip to Karen Mardahl for making a potentially daunting issue seem straightforward and achievable.) Start wrestling with technical issues like whether uploading the interactive PDF is really okay if students can only access it from on-campus because remote access doesn’t support Adobe Reader 9. Have qualms about Flash in general. Briefly consider enabling the iPhone option; get distracted by the question of how many students actually have iPhones.

Breathe out.

Start brewing this blog-post while you brush your teeth before bed, then accidentally stay up until 3am writing about it.

Realise that this project has eaten your life, worn out your body, and driven you into the hairy, caffeinated arms of Diet Coke. Realise, too, that you have acquired a whole new set of skills, and that this opens the door for colleagues who may have been considering similar learning solutions, but who may not have known where to start. Resolve to run a demonstration session soon, and to buy the guys in the LDU some nice cookies.

Then cross your fingers, Ctrl+S again for good measure, and unleash your work on the students.

*Note for North American faculty: it is not unusual in UK universities to teach piecemeal sessions on someone else’s course or module, rather than designing, teaching, and managing a course or module entirely on your own (in fact, many UK faculty will do some of each).


How your meetings could be more like classes

Recently, I read a post by Rands about how to run a meeting, and was blown away. Not because of Rands’ excellent writing (though it is; it always is), but because in explaining the attentional dynamics of how to run meetings, he was really explaining how to manage a classroom. I had a bit of a lightbulb moment right there.

I’d never thought about meetings as places that could be like a classroom before, despite the fact that many of the meetings I attend are actually held in classrooms. (Collect one Dunce Point; do not pass GO, do not collect $200.) Oh sure, I understand that you need a facilitator to ensure that everyone who has something to say gets to say it, and that people whose verbosity exceeds their contribution don’t dominate the space. But what Rands is talking about is attention wrangling: making sure everyone stays focused and contributes, and that people go away with their knowledge and understanding improved, and with a clear idea of where they are going next.

This is absolutely what being an educator is all about.

Rands writes:

A referee’s job is to shape the meeting to meet the requirements of the agenda and the expectations of the participants. Style and execution vary wildly from referee to referee, but the defining characteristic is the perceptions of the meeting participants. A good referee is not only making sure the majority of the attendees believe progress is being made, they are aware of who does not believe that progress is being made at any given moment.

… which isn’t really all that far from:

An educator’s job is to shape the class to meet the requirements of the curriculum and the needs of the learners. Style and execution vary wildly from educator to educator, but the defining characteristic is the engagement of the learners. A good educator is not only making sure that the majority of the attendees are learning, they are aware of who is not learning at any given moment.

If you want to take this analogy further, you can think of traditional, top-down, boss-runs-everything meetings as primary education, where the teacher is very much in charge, and hands down information with minimal critique or interrogation from those in attendance. At the other end of the spectrum, adult education at its best is all about facilitating sessions with a light touch, allowing everyone to explore the material for themselves while staying on track. And gosh, I wish I attended more meetings like that. I mean, by the time someone’s old enough to attend a business meeting, they’re old enough to be treated like an adult, right?

Rands’ post made me think about the discussions we are having in higher education as we start questioning the old didactic model and moving towards something more interactive, student-led, and — whisper it — enjoyable. And I started wondering how well those arguments might be applied to the management of meetings in the workplace. Just as it’s a huge waste of resources to have students in class who are not actually learning (or who are doing so in functionally-limited ways), the cumulative workplace productivity that gets pissed away because the bodies in the room aren’t engaged doesn’t bear thinking about.

Disclaimer: I’m not exactly inventing the wheel here. While I want to believe that many of you work in places where meetings are managed sensibly, I’m assured that there are plenty of workplaces in which meetings are still very much a problem. So if you do work somewhere where meetings are useful, if not actually enjoyable, then the rest of this post may not be for you — though I hope you’ll appreciate it as an intellectual exercise, if nothing else.

The person leading the session must add value. Historically, education has involved sitting passively and listening for an hour or two at a time while someone dispenses information, a sort of pre-digital iTunes U on highly degradable reel-to-reel tape. Clearly, in an era where most things worth knowing find their way onto the Internet, and students have to pay to attend university*, such behaviour is nuts. Nevertheless, there remains a population of educators whose idea of teaching is to read aloud from their slides. While I can’t substantiate or quantify this with reference to the literature, I have noticed that when people find out this is something I’m interested in, many of them are quick to tell me about this lecturer they had at university who used to read aloud from … you get the idea. Old-school models of what classes should look like still persist.

Likewise, workplace meetings of the kind where one person talks and everyone else listens are still alive and kicking. Seth Godin argues that disseminating information is a legitimate type of meeting, but I’m less and less sure of this as my time starts feeling increasingly precious. (Though maybe I’m just becoming increasingly precious ;-P). Just as there is a grassroots movement underway to try to rid education of the kind of ‘teaching’ that is really reading aloud, so we should be taking the same approach to eradicate broadcast-style meetings. Surely in both cases it would be better to send round a document in advance, then take advantage of valuable face-time to have some sort of informed discussion?

Good session management means making sure everyone in the room understands why they are there. Devil’s advocates will by this point be arguing that not everyone reads documents that are sent around. Well, not everyone engages in information-dump meetings either. I mean, you can get me into the room and you can impose a no-laptop rule and whatever other sanctions you choose — but fundamentally, if I can’t see the point, I’m going to go off and be a tourist inside my own head, since that’s where all the really interesting stuff is happening. As educators, when we see this disengagement happening in the classroom, we try to do something about it by emphasising to those in the room the relevance of what is being discussed. Sadly, I can count on the fingers of one hand the number of facilitators I’ve encountered who have run meetings in this way, ensuring everyone is really engaged and taking the time to draw out the more recalcitrant attendees. And I think that’s kind of a shame.

As group size increases, monitoring and remediating disengagement gets harder. I hypothesise that there’s a direct relationship between a facilitator’s skill and what size group they can wrangle at once without disengagement setting in. I had originally written that larger groups are fine for broadcast-style meetings — but actually, larger groups just encourage anonymity, diffusion of responsibility, and loafing. And anyway, if you’re going to broadcast, why not circulate a video or document so people can watch or read it at a time that’s convenient for them? It’s worth considering the participant’s experience: small groups increase the potential for better-quality interactions between those present.

To keep people engaged, you have to sustain their attention. My most popular post on this site is When giving presentations, the only rule that matters is the rule of attention, and I’m pretty sure this whole argument applies to meetings too. If you don’t get people’s attention to start with, you won’t even get as far as being able to convince them of the relevance of what you are saying. But once you have their attention, you have to wrangle it, or it will just wander off again; attention is fickle. Moving things along every five, ten, or fifteen minutes will help; the brain is crazy for novelty.

Nevertheless, even an agenda won’t save you if each item on that agenda lasts for half an hour or more; even the most pertinent meetings can lose our attention if they go on too long. Here’s Seth Godin:

Understand that all problems are not the same. So why are your meetings? Does every issue deserve an hour? Why is there a default length?

Excepting the rule of attention, rules are a millstone. I’ve seen people discuss photocopying for half an hour, for no other reason than there was sufficient slack in the meeting schedule. Respect for other people’s time goes a long way: while this might be all you have to do today, the other person could be squeezing you in between studying, caring for an elderly relative, and working a part-time job. My nightmare is people who schedule one-to-one meetings lasting an hour or more to ‘chat’ about a single issue, with no plan or structure in mind. I mean, at least in a one-to-one tutorial, the ensuing discomfort could be offset by having some pre-prepared exercises to work through, giving the whole thing a bit of structure. Hey, there’s another tip from education: do the preparatory work — it’s a whole lot less excruciating for everyone concerned.

Rules do pervade education: parcelling up learning into arbitrarily-quantised chunks of 60 or 120 minutes is, objectively, pretty weird, when really what you’d like is to teach X until you are done teaching X, or until the students have run out of attention, then call a recess. But much as I find it hard to justify two-hour lectures, I understand that this rules-based architecture is driven by the practicalities of scheduling lecture theatre allocation across the whole campus, for a population of several thousand students, each of whom is pursuing one of a hundred or so different three-year degree courses. Suddenly, organising a one-hour meeting for seven people across different sections of your company doesn’t seem quite so bad, huh? ;o)

It’s worth distinguishing between ‘rules’ and ‘constraints’. By rules, I mean ‘hand-me-downs’: the things we do because the guy before us, or the guy before him, did them that way, and that we’re too lazy to change. Constraints are quite the opposite: these are deliberately-adopted restrictions designed to keep us on track and force us to be creative. Agendas, when adhered to, are one form of constraint; the curriculum can be another. There’s a whole organisational cult around the daily scrum meeting, which is short and time-limited and forces people to get to the point. I know people who work in teams that run a daily scrum, and from talking to them, it sounds excellent. However, it’s almost certainly less well-suited to academics, since the nature of our work means we’re mostly solitary, even when we are doing collaborative research — leaving aside that many of us don’t observe a standard 9-5, or have predictable hours day to day.

Two thoughts to finish with. First, as the estimable David Farbey pointed out at TCUK10,

“Team working is ‘I’ll do X, you do Y’ — not circulating a document for everyone to read.”

And the second, which just scrolled past on Twitter right now (synchronicity or apophenia? It doesn’t really matter): Meetings aren’t work. They’re what we do as a penance for not rolling along like clockwork.

Postscript: Okay, there’s one other rule I like, too: the rule of two feet, as practised at unconferences and barcamps. If, despite your best efforts, you’re not learning or contributing, go somewhere else where you can learn or contribute. I understand that this might be contentious (leave class? walk out of a meeting?), but I dare you to tell me that there’s never been a meeting, or a class, where the only thing stopping you from leaving was a vague sense of awkwardness that you ought to be there — and I happen to think it can be done gracefully, without being rude.

* Note for North Americans and others: until recently — the last decade or so — a university education in the UK was effectively free. Yes, really free, as in beer. Summary here; you can trace a lot of the bitterness in UK higher education to the moment that Tony Blair’s Labour government (yes, they’re the ones who’re supposed to be socialists) decided to turn universities into businesses. Important exception: Scotland, because it is awesome and now decides its own education funding policies, still does not charge Scottish students top-up fees. Pro tip for future students: be born in Scotland.


On success and reward in academia

So it’s been six months since I blogged here, which is frankly atrocious. Having said that, it doesn’t really feel like six months, because everything is whooshing past at such a rate (although interestingly, while we all like to agree that time is speeding up as we get older, the evidence for this is equivocal).

Anyway, time to fill the void. Hi, void. How are you?

VOID: HI, CHRIS. WHERE HAVE YOU BEEN?

I’m coming to that, but I need to tell you some stories on my way there.

VOID: OKAY. I LIKE STORIES.

(Aside: are you following @FEMINISTHULK on Twitter? You should; CAPS LOCK has never looked so attractive.)

This post is coming out of several conversations I’ve had recently about what it’s like to be an academic, and how academics spend their time. I think a lot of this is really not transparent to people who don’t work inside academia; a lot of the time, I don’t think it’s all that obvious to students, either.

First up, here’s Mark Changizi on why he just left academia:

You can’t write a grant proposal whose aim is to make a theoretical breakthrough.

“Dear National Science Foundation: I plan on scrawling hundreds of pages of notes, mostly hitting dead ends, until, in Year 4, I hit pay-dirt.”

Lots of research is by nature theoretical and speculative, the kind of thing you just need to chew on, indefinitely, until you make a breakthrough. But increasingly, funding bodies are turning away from this sort of thing in favour of applied research. Indeed, there’s a massive hoo-hah about HEFCE‘s new Research Excellence Framework (the thing that used to be the Research Assessment Exercise — that is, the attempt to objectively measure how “good” a university department’s research is) and exactly what they mean by ‘impact’.

It’s pretty hard for theoretical research to have impact. (I guess the clue is in the word ‘theoretical’.)

Mark again:

in what feels like no time at all, two decades have flown by, and (if you’re “lucky”) you’re the bread-winning star at your university and research discipline.

But success at that game meant you never had time to do the creative theoretical leaps you had once hoped to do. You were transformed by the contemporary academic system into an able grant-getter, and somewhere along the way lost sight of the more fundamental overthrower-of-dogma and idea-monger identity you once strived for.

Mark’s a theoretician, an absurdly talented one (I can’t even envy him for that, because he’s such a nice guy) — if anyone should be able to thrive within academia, it’s him. But he’s gone, because universities are changing from environments in which academics are free to consider ideas and theories into income-seeking machines.

Wait — you thought universities were about educating people? Well, keep reading, but you might want to be sitting down.

Mark’s experience is different from mine — he’s a theoretician, and I, after many years of not knowing how to describe what I do, have finally started calling myself an applied cognitive psychologist. (My mind is much better at applying theory to existing situations than it is at coming up with entirely new ideas about how the world works.) But what our experiences of academia have in common is that it’s hard to find anyone who will reward us for doing the things we do best, even when those things are ostensibly pillars of academia.

Example? Sure. Here are the things about my job that people senior to me notice whether I am doing:

* Authoring research papers (preferably in high-impact journals)
* Bringing in money through grant funding
* Bringing in money through other means (such as knowledge transfer or consultancy work)
* Attracting negative feedback from students
* Giving a class lower- or higher-than-average marks
* Completing the requisite admin tasks required for my teaching
* Meeting my personal development goals for the year
* Turning up to the relevant staff, admin and committee meetings

Here are some things about my job that nobody senior to me will notice whether I am doing unless something is on fire:

* Teaching well (unless I am namechecked by students right in front of them)
* Reviewing and revising my lecture notes from one year to the next
* Keeping up to date with developments in the theory and practice of teaching and learning
* Being involved in learning and teaching projects at a university-wide level
* Innovating in my teaching (and encouraging or helping others to innovate)

Above all, as I found myself explaining to an incredulous American friend last week, it is pretty much impossible to get promoted on the basis of being a stellar university teacher. I don’t actually think I’m a stellar teacher — but what I’m saying is, there’s no real incentive even to try, because all I’m doing, in striving for teaching excellence, is making work for myself: not only do I have to try to squeeze all this teaching innovation in, I also have to find time to do and write up my ‘real’ research.

So what have I been doing since February? I can’t believe I’m about to type this, but here goes:

leaving the office on time, and going to the gym.

This would be the bit where I proudly announce that I now have a life, right? But actually? I’m exhausted. And it’s not from going to the gym. I’m exhausted because it’s nearly impossible to do my job inside contracted hours if I care about teaching quality. Or if I have many research projects on the go that might one day lead to publications; I have about five of these, and they eat up time and money with no guarantee that the results will ever be publishable, assuming I even have the time and energy to write them up.

[Figure: teaching vs research time]

(Disclaimer: the above graph is purely conceptual, being based on no data whatsoever, but I think most academics would recognise it.)

Did you know that academics are estimated to work somewhere in the region of 55 hours a week? Why? Well, as I can now attest from personal experience, it’s the only way they can get anything done.

So where have I been? Mostly, trying not to have a breakdown. Trying to balance having a life with conducting teaching and research to a high standard. Trying to find a balance between using the summer to write up my research findings and taking the vacation time I’m owed (and which I never have time to take during term, because, hello, teaching and admin). Trying to rationalise what I can do, and what I can’t. Practising saying ‘no’.

It is hard. And the students are back in just over a month and I do not feel rested at all, and I haven’t done half the work I hoped to. And last summer was exactly the same.

So, void, that’s where I’ve been. Interesting times.

It’s not all doom and gloom. I’m learning things about myself, like for instance that I’m a ninja copy-editor — when you give me your poorly-written paper to co-author, I will turn it into something sublime, geeking out for hours while my fourth cup of coffee in a row goes cold. (Now I just need to figure out how to work this way with all my co-authors.) I’ve embarked on a big e-learning project, more about which soon. And I’m slowly getting more clarity about what I want and don’t want in my job. These are all good things.

And the gym? I’ll definitely keep going to the gym. Being fit is great, but more importantly, you should sponsor me to run a half-marathon for charity :)

Thanks for listening; it’s nice to be back.


Ask, Don’t Tell: the power of questions

Scientists are not trained to ask questions.

No, excuse me — scientists are absolutely trained to ask questions. In the lab. In the lab we are rabid information ferrets, and we will run up every trouserleg that the great wardrobe in the sky sees fit to provide.

Scientists who become lecturers are not trained to ask questions — at least, not questions of the classroom variety (remember, we’re preoccupied with being subject matter experts). We are trained to talk. And talk. Seriously, if you like to pontificate, you could do worse than become a scientist. It’s like our national sport or something.

And so, at the end of class, because we know we’re supposed to ask this, we ask “any questions?” and nobody says anything — instead, the entire class launches into a frenzied scramble for their bags and coats. Because “any questions?” is about the worst thing you could possibly ask, and my students know it, even if they don’t explicitly realise it.

And yet, in defiance of the mute, are-we-done-yet hordes, a small trickle of students invariably arrives afterwards to ask questions, or to share something interesting and relevant from their lives. And sometimes, it feels like more teaching and more learning happens in those little conversations than in the whole of the lecture preceding them.

I experienced for myself, and am trying hard not to propagate, the cycle of abuse that is didactic, teacher-led education. “Sit down and shut up” is a powerful message to impose on children — and it’s clearly a sticky one, because by the time my students arrive at university, that’s their expectation of what should happen in class1. Ironically, when students don’t want to interact in class, it’s actually even harder not to ask things like “any questions?”, because we do it out of habit, and stressful situations are great for dredging up our most-ingrained routines.

“If you want to improve any element of your life, learn how to ask better questions.” (via Paul at Brain Friendly Trainer).

I’m a huge fan of asking questions: they’re the fast track to learning (a) how interesting the other person is [seriously: people are fascinating] and (b) all the stuff they know that you don’t. And pretty much everyone likes talking about themselves and their thoughts, so asking questions is good social grease, too.

Asking great questions is also a brilliant habit to build in the classroom. It’s a skill I’ve been quite slow to develop, but I’m getting into it. So here are a few ways that I’ve tried to bring more questions into my classes:

I already posted about how I turned a two-hour lecture into a two-hour problem-based learning session. This was great for two reasons: first, I asked the students a ton of questions, which normally isn’t something we make much time for in lectures. Second, and even more exciting, was that the students then started asking their own questions. In front of 60 other students. Seriously, if I do nothing else of value this academic year, I’d almost be okay just with that. (Well, not really. But you know.)

I added media clips to my lectures as an excuse to ask concept-checking questions. I showed my students Jill Bolte Taylor’s TED talk about the subjective experience of having a stroke. Watch it if you haven’t already — not only does she bring great insights from her knowledge of the brain (she’s a neuroscientist), but she also gives the talk with great humour and humanity. And instead of giving students multiple-choice questions afterwards (Did the stroke attack (A) the left, (B) the right, or (C) both hemispheres of Jill Bolte Taylor’s brain?), I asked much harder questions: What are the physical and emotional consequences of a left-hemispheric stroke? How do you see objects that appear in your left visual field? Outline the path taken by information through your brain. And so on. (I let them talk through it with a partner first, before I threw it open to the class. Small steps.)

For reasons outlined here, I changed the format of student presentations to Pecha Kucha (which I must write about soon, because it completely deserves its own post). And we went from “mostly the student talking” to “the student talks for a while and then everyone pitches in with questions and discussion” — which, for the record, is a way better experience. For everyone. (I collected questionnaire data that says so, too.) Nothing makes a class interactive faster than getting students interested enough in the subject to ask each other questions.

I stopped telling and started asking. This wasn’t a class-specific intervention, just something I’ve consciously started trying to do over the last couple of years every time students get stuck: I answer their questions with questions of my own. It seems especially useful when working with very reticent students, but it’s also a handy tool when guiding students who are struggling to express their thoughts on paper: How do you know that? What evidence do you have? Why is that relevant?

What have I learned? Asking questions works. I’ve had really positive feedback from students about these sessions, and I know in my heart that I’m asking better questions and getting students to think more actively about the problems I’m setting. I’ve also learned that if your concentration lapses, even for a moment, it’s really hard not to reflexively ask “any questions?”, so deeply ingrained is the habit. (I guess the only solution to that is more practice.)

Yes, these activities are all things I should have been doing to begin with — but remember that didactic, scientific background, and show me a little mercy; breaking the cycle of abuse can take a while.

And now I want to add a whole session on “asking questions” to the teaching certificate.

.

Edit, one day after posting: One other thing that I learned, just today, is that sometimes it’s okay to ask if there are any [further] questions, if everyone is good and warmed up, and you have time to spare. Because they were, and we did – and students came up with some great questions. Stuff I had no idea about, but about which it was fun to speculate. But I think people really have to be in that headspace and comfortable with the idea of asking questions in class before this will work.

1 Okay, some of students’ reticence in class is also driven by not wanting to look like an idiot in front of their peers, in case the question is “a stupid one” … one day I might turn up at class wearing a t-shirt that says THERE ARE NO STUPID QUESTIONS — ONLY STUPID LECTURERS, but you just know that’s going to backfire in ways that are both immensely embarrassing and completely predictable.


The search for context in education and journalism (wicked problems, Wikipedia, and the rise of the info-ferret)

It’s a January evening, a schoolnight, and I’m sitting on my sofa thinking Stuff it. I’m tired and it’s dark and I worked hard today, damn it. It’s pretty hard, at that moment, to engage with the things I know are really good for me, like going to the gym, eating right, and engaging with decent journalism that actually says something worthwhile about the state of the world.

Ah, journalism. Why is it so hard to engage with good, wholesome news? You know, instead of the junk-food variety?

Well, for starters, it takes effort: something in short supply when you consider that UK academics apparently rack up an average 55-hour working week. So if I sometimes choose entertainment over learning, maybe it’s because I’ve been thinking really hard for 11 hours already.

Here’s the more interesting question, though: why should it take so much effort to engage with the news? I think the record will show that I did okay in school and that I know a few long words. I can follow an argument; on a good day, I can win one. But watching or reading the news and really, really getting it (not just at the who-shot-whom level, but understanding why) frequently eludes me.

For the longest time, whenever I read the news, I’ve often felt the depressing sensation of lacking the background I need to understand the stories that seem truly important.

I didn’t write that, but I could have. By the time I’d got old enough to be properly interested in the ongoing story that is Northern Ireland, no newspaper was interested in explaining the context to me. I knew it had to do with territory, nationality and religious differences, but who were ‘republicans’? What did they want? The newspapers all assumed that I knew a whole bunch of stuff that actually, I didn’t know. The dictionary was no real help, the Internet was still in short trousers, and Wikipedia didn’t yet exist. (Not that we had the Internet at home. We didn’t even have a computer.) And I was at that delicate age where I didn’t want to look stupid by asking what might have been a dumb question. (Actually, it wasn’t a dumb question at all, but I didn’t know that then.)

We would shy away from stories that seemed to require a years-long familiarity with the news and incline instead toward ephemeral stories that didn’t take much background to understand—crime news, sports updates, celebrity gossip. This approach gave us plenty to talk about with friends, but I sensed it left us deprived of a broader understanding of a range of important issues that affect us without our knowing.

Secret-that’s-not-really-a-secret: the guy who wrote this is a journalist. His name is Matt Newman, and he’s reporting here for Harvard’s Nieman Foundation about how modern journalism bypasses context in favour of immediate, juicy details.

News is complicated. To make sense of complicated things, we need context. And the newspapers aren’t delivering that context; even journalists say so.

In fairness, context is hard to come by when — as with Northern Ireland — your story pretty much defines the phrase wicked problem (see also its big brother, The Middle East). How much information is ‘enough’? How much background would you need to really understand the issues surrounding Obama’s healthcare reforms? Or the debate on university fees?

We need something, and traditional news media aren’t providing it.

But we have Google and Wikipedia, right? So there’s really no excuse for not being able to find out at least something about nearly everything. Apparently, when a big news story breaks, people really do converge on Wikipedia, looking for context; we are a generation empowered, as no generation before us, to find stuff out.

Except.

Except that I still get emails from my students that read What does [word from the course materials] mean? I used to write lots of replies of the biting-my-tongue variety, politely suggesting that the student take advantage of the resources at their disposal1, but eventually I got fed up with this, and wrote an FAQ in which I was somewhat more blunt, though I hope in a kind way.

My favourite was a student who emailed me after a deadline, apologising for the poor quality of the coursework he had submitted, and explaining that he hadn’t known what one of the words in the essay question meant — so he had just tried his best and hoped. This wasn’t a word that was archaic or obscure. This was a word widely employed in psychology and related subjects. It’s not in the paper dictionary on my desk (which, admittedly, is 20 years old), but it’s very, very easy to find and learn about online.

It’s not about having access to the information; all my students have Internet access at least some of the time. Too many (N > 0) of my students are just not in the habit of looking for information when they get stuck, like someone forgot to tell them that the Internet is good for more than just email and Facebook.

But students will surf Wikipedia and YouTube all day long, given half a chance, so what’s that about?

At Playful ’09, Tassos Stevens talked about the power of indeterminacy, and whether, if someone throws a ball, you can look away before you find out if the other guy catches it. Suspense is immensely engaging.

Wikipedia is like this: it’s a barely game, where the idea is to answer as many “Ooh, what does that mean?” questions as possible, using only the links from one article to the next. In suspensefulness terms, Wikipedia is an infinite succession of ball-throws, sort of Hitchcock: The Basketball Years. (Okay, so Tassos was talking about cricket, but my point stands.)

But education obviously doesn’t feel like a barely game, because students don’t behave there like they do when they’re surfing Wikipedia. So I guess we need more suspense. This might mean being less didactic, and asking more questions. Preferably messy ones, with no right answers.

I think that if we really want to turn our students into information ferrets, running up the trouserlegs of their canonical texts to see what goodness might be lurking there in the dark [This metaphor is making me uncomfortable — Ed.], then maybe we, like the news media, need to get better at providing context.

If students email me with simple queries rather than trying to figure things out on their own, maybe it’s because the education system hasn’t been feeding their inner info-ferrets. (Note to schools: teaching kids how to google is a completely different issue from teaching them to google and making it into a habit, and some days, it feels like you only deal in the former.)

We exist, currently, on the cusp: everything’s supposed to be interactive, but not everyone’s got their heads around this yet. (“Wait — you mean we’re supposed to participate? Actively??”) The old-school, didactic models of education and journalism (“sit down, shut up and listen; we know best”) are crumbling. And some of the solutions about how to fix journalism look a lot like the arguments being rehearsed in education about how to make it valuable and keep it relevant: develop rich content that your customers can help build and be part of; accept that you might need a model which permits the existence of premium and budget customers. (This is going to be highly contentious in higher education, and I still don’t know what I think about it. But I don’t think the idea is going away anytime soon.)

I ran one of the many iterations of this post past Simon Bostock and he wrote back: Newspapers have learned the wrong lesson of attentionomics. I think they’ve got it bang-on as far as briefly grabbing our attention goes,2 but I don’t think it’s doing much for our understanding of the news, and some days, I worry that education is headed the same way.

Jason Fry asks, if we were starting today, would we do this? This is a great question for journalism, but it’s also pretty pertinent to education: we still teach students in ways that make only marginal concessions to the Internet’s existence, treating it as little more than a dictionary, encyclopedia, or storage container.

Given that nearly anything can be found with a few keystrokes, if we had to redesign education from scratch, what would it look like?

More like Wikipedia. More ferret-friendly. And maybe upside-down.

.

[Acknowledgements: major kudos to Simon for linking to Ed Yong’s great piece on breaking the inverted pyramid in news reporting, for reading drafts of this post while I was losing my mind, and for the juicy, lowbrow goodness of LMGTFY, below.]

1 I suppose I could slam my students with Let Me Google That For You, but I prefer to save the passive-aggressive stuff for my nearest and dearest.

2 If this post were a headline, it would read STUDENTS TOO LAZY TO GOOGLE. (Admittedly this would be closely followed by SUB-EDITOR TOO DRUNK TO CRAFT ORIGINAL HEADLINE and BLOGGER CHEERFULLY IGNORES CLICHÉ.)


Why experts are morons: a recipe for academic success

This morning there was quite a bit of tweeting, back and forth, about this article and exactly how stupid it is.

“If our attention span constricts to the point where we can only take information in 140-character sentences, then that doesn’t bode too well for our future,” said Dr. Elias Aboujaoude, director of Stanford University’s Impulse Control Disorders Clinic.

Yup, you read that right. Some guy with a Ph.D. who works at one of the best universities in the world (and who’s sufficiently good at his job that they made him director of a clinic) is talking — to all appearances quite seriously — about the idea that the human attention span might shrink to the length of a tweet.

In other news, if the world were made of custard, global warming might lead to major dessertification, if we could just bake an apple crumble big enough.

Maybe there’s a good explanation. Maybe Dr Aboujaoude’s remarks were taken out of context by the San Francisco Chronicle. Or maybe they threw him this ad absurdum scenario and he ran with it because he’s a nice guy and thinks that even if reporters pose a dumb question, it would still be rude to call them on it.

Here’s my ill-conceived, half-baked thesis for the day: experts are morons.

Why? Well, we get very excited over stuff we think is new, because we’ve been too busy down in our own little silos. I pissed Harvey off earlier by posting, in good faith, a link to Tyler Cowen’s TED talk about the dangerous appeal of stories.

Kids, don’t even try to sell Harvey old rope. Even if you didn’t know it was old rope. He’ll know.

What I ended up saying to Harvey was essentially Look, there’s a movement afoot to try to get storytelling back into learning, to replace the content firehosing that passes for big education these days, McDonalds-style — and this talk serves as a useful reminder that stories are invariably a gross oversimplification of the evidence.

What I should have been saying was: Dude, I spent umpteen years becoming a subject matter expert, and at no point did anyone tell me that I needed to apply my critical faculties to delivering the material I researched so painstakingly. I’m new at this; cut me some slack!

(It turns out that Harvey and I were somewhat at cross-purposes; such are the limitations of 140-character ‘discussion’.)

Here’s the thing: academic success favours those who focus their critical faculties on developing their subject area expertise.

Below is a recipe for modest success in academic life and for becoming a legitimate ‘expert’. (Quantities and ingredients may vary according to your needs and experience.)

* You need to be bright-ish. Not supernova bright, just bright enough. (If you’re too bright in school, you’ll get bored; see next point.)
* You need to be well-behaved. (If you don’t behave, you’ll be labelled disruptive and that will do exactly what you think it will to your chances of academic success. Yes, even if you are bored because lessons are too easy.)
* It helps to crave everyone’s approval. (If you don’t care what your teachers or parents think, why would you try hard on subjects that don’t really interest you?)
* Questioning authority probably isn’t in your nature. (Or if it is, it’s a very specific kind of critical thinking, like “hey, maybe nukes aren’t that great an idea, mmkay?”) This will serve you well later, in your tenured career.
* You are comfortable letting other people set goals for you. (“You think I should go to university? Great!”)
* You acquire a certain nerd-like pleasure (flow, if you like) from gnawing on very specific questions.
* Your school years have conditioned you to understand that most people are mean, and best avoided.
* Metaphorically or actually, you have let a thousand cups of tea go cold while you geek out on your chosen subject.
* … okay, that much will get you through university and into a postgraduate programme (Masters or Ph.D.). At this point, it will be particularly helpful if you can screen out information about the world around you, because this will just distract and confuse you about the relevance of what you are doing. (Having a crisis of meaning is one of the fundamental stages of doing a Ph.D.)

If you survive this process and get your doctorate, you enter the world of teaching, admin, research, publication, and grant-getting — listed in increasing order of importance to your new employer. Your Ph.D., the entry requirement to academia that you have worked so hard on, also serves as your passport to teaching. Pause a moment to reflect on the weirdness of that statement: subject expertise is used as a measure of how competent you are to communicate that information meaningfully to non-experts.

(Some universities, mine included, are trying to address this systemic shortcoming by getting new lecturers to do a teaching certificate. This is a lot better than nothing, but it’s also quite possible to do the absolute minimum required to pass, then go on your merry way, unmoved and largely unchanged. At least we do ‘peer observation’, which is a nice way of seeing what other people are up to; it’s hard not to reflect on your own teaching when watching someone else deliver a session.)

Once you’re on the big shiny merry-go-round of teaching-admin-research-publication-grant-getting, it’s even harder to drag your ass out of the mire of just trying to keep up with your subject area and across the road into the big field of flowers that is good educational practice. And when you do manage to haul yourself over there (at the cost, by the way, of time spent on research/publication/grant application — and no-one is going to reward you for that choice), you get disproportionately excited when people show you some of the shiny things that exist in the world, because you’ve been far, far too busy becoming a subject expert to notice them. This can make educators look like big, dumb puppies — for example when we’re over-keen to co-opt neuroscience.

The other side-effect of being an ‘expert’ is that if you’re not naturally inclined to cause trouble, question the system, or think critically about more than subject-matter problems (and remember, you have floated to the top of an educational system that rewards exactly those qualities), then sometimes you end up saying really dumb stuff, because you’re too busy thinking “ooh, that would be interesting” — like what if we really could only take in 140 characters’ worth of stuff before our attention drifted — to fully consider the validity of the question.

None of this is an excuse for living up to the ‘woolly professor’ stereotype, but I hope it helps to explain to people like Harvey why experts sometimes sound like they’re rediscovering — or even reinventing — the wheel. And as for us ‘experts’ (and boy, am I uncomfortable with that label), we need to try harder to think about the practical applications of what we do — and to remember, once in a while, to apply those finely-honed critical thinking skills to something other than our own subject areas. We’re not really morons, but to the casual observer, it’s an easy mistake to make.

.

Obligatory afterword: there are a number of stellar educators who really do manage to apply their critical faculties to more than just their own subject area, and who manage to get through university and postgraduate qualifications despite asking really awkward questions and rocking the boat. If they ever isolate a gene for that, we should all go get spliced right away.
