A waste of time

[Image: pocket watch]

A while back I wrote a blog post about the apparent waste of time involved in things like reading email, loading web pages, etc. At the end of the post I suggested that the simplistic measure of time as money that I was using should be viewed with great suspicion, though it is precisely the kind of measure that we routinely use. This post is mostly about why we should be suspicious.

But first, the basic argument of my original post, restated and stripped to its bones, is simple. According to the vacation request form that I have to fill in (and fill in again after taking the vacation), an Athabasca University working day is 7 hours, or 25,200 seconds, long. There are about 1,200 employees at Athabasca University so, if each employee could save 21 seconds in a day (25,200/1,200 = 21), it would be like getting another employee. Equally, every time we do something that loses everyone 21 seconds a day for no good reason, the overall effect is the same as firing someone. I observed then that we have lately adopted a lot of ICT systems that waste a great deal more than that. Since then, things have been getting worse. We are about to move to an Office 365 system, for instance, that I am guessing will cost us the time of at least 5 people, maybe more, compared with our current aged Zimbra suite. It’s not rocket science: a minute of everyone’s day is easily accounted for in loading time alone, which I have checked and which seems to be roughly 20 seconds longer than in the old system, and most people will load it many times a day. At the start it will take far more than that, what with training, migration, confusion and all, and, if my experience of Microsoft’s Exchange system is anything to go by, it is going to carry on sapping minutes out of everyone’s day for the foreseeable future thanks to poor design and buggy implementation. So far, so depressing.
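
For what it’s worth, here is the same back-of-envelope arithmetic as a minimal Python sketch. The working day, the headcount and the one-minute loading estimate are the figures quoted above; everything else (the variable and function names in particular) is purely illustrative:

```python
# Back-of-envelope arithmetic from the paragraph above.
# Figures are the ones quoted there; names are illustrative only.

WORKING_DAY_HOURS = 7      # from the vacation request form
EMPLOYEES = 1200           # approximate Athabasca University headcount

seconds_per_day = WORKING_DAY_HOURS * 60 * 60      # 25,200 seconds
one_fte_in_seconds = seconds_per_day / EMPLOYEES   # 21 seconds per person per day

def fte_cost(seconds_lost_per_person_per_day):
    """How many full-time-equivalent employees a daily per-person loss amounts to."""
    return seconds_lost_per_person_per_day / one_fte_in_seconds

print(one_fte_in_seconds)  # 21.0: save this much per person per day and it is like hiring someone
print(fte_cost(60))        # ~2.9: a minute of extra loading time per person per day
```

On that reckoning, the extra minute of loading time alone accounts for nearly three of those five people, before any training, migration or confusion is counted.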

But what have we actually lost?

The simplistic assumption that time is money has a little merit when tasks are routine and mechanical. If you are producing widgets then time spent not producing widgets equates directly to widgets lost, so money is lost for every second spent doing something else. Even that notion is a bit suspect, though, inasmuch as there are normally diminishing returns on working more. Even if a task requires only the slightest hint of skill or judgement, the correlation between time and money is a long, long way from linear. Far more often than not, productivity is lower if you insist on uninterrupted working or longer hours than it would be if you insisted on regular breaks and shorter hours.  At the other end of the spectrum, it is also true, even in the most creative and open occupations, that it is possible to spend so much time doing something else that you never get round to the thing that you claim to be doing, though it is very hard to pin down the actual break-even point. For instance, a poet might spend 23.5 out of every 24 hours not actually writing poetry and that might be absolutely fine. On the other hand, if a professor spends a similar amount of time not marking student work there will probably be words. For most occupations, there’s a happy balance.

But what about those enforced breaks caused by waiting for computers to do something, or playing a mechanical role in a bureaucratic system, or reading an ‘irrelevant’ all-staff email? These are the ones that relate most directly to my original point, and all are quite different cases, so I will take each in turn, as each is illustrative of some of the different ways time and value are strangely connected.

Waiting for the machine

As I wait for machines to do something, I have from time to time tried to calculate the time I ‘lose’ to them. As well as time waiting for them to boot up, open a web page, open an application, convert a video or save a document, this includes various kinds of futzing, such as organizing emails or files, backing up a machine, updating the operating system, fixing things that are broken, installing tools, shuffling widgets, plugging and unplugging peripherals, and so on. On average, given that almost my entire working life is mediated through a computer, I reckon that an hour or more of every day is taken up with such things. Some days are better than others, but some are much worse. I sometimes lose whole days to this. Fixing servers can take much more. Because I work in computing and find the mental exercise valuable, futzing is not exactly ‘lost’ time for me, especially as (done well) it can save time later on. Nor, for that matter, is time spent waiting for things to happen. I don’t stop thinking simply because the machine is busy. In fact, it can often have exactly the opposite effect. I actually make a very deliberate point of setting aside time to daydream throughout my working day because that’s a crucial part of the creative, analytic and synthetic process. Enforced moments of inactivity thus do a useful job for me, like little inverted alarm clocks reminding me when to dream. Slow machines (up to a point) do not waste time – they simply create time for other activities but, as ever, there is a happy balance.

Bacn

[Image: pig, showing cuts of meat]

Bacn is a bit like spam except that it consists of emails that you have chosen or are obliged to receive. Like spam, though, it is impersonal, often irrelevant, and usually annoying. Those things from mailing lists you sometimes pay attention to, calls for conference papers that might be interesting, notifications from social media systems (like the Landing) that have the odd gem, offers from stores you have shopped at, or messages to all-staff mailing lists that are occasionally very important but that are mostly not – I get a great deal of bacn. Those ‘irrelevant’ all-staff emails are particularly interesting examples. They are actually very far from irrelevant even though they may have no direct value to the work that I am doing, because they are part of the structure of the organization. They are signals passing around the synapses of the organizational brain that help give its members a sense of belonging to something bigger, even if the particular signals themselves might rarely fire their particular synapses. Each one is an invitation to be a potential contributor to that bigger thing. They are the cloth woven from the interactions of an organization, a cloth that helps to define the boundaries of that organization and reflect back its patterns and values. The same is true of social media notifications: I only glance at the vast majority but, just now and then, I pick up something very useful and, maybe once every day or two, I may contribute to the flow myself. The flow is part of my extended brain, like an extra sense that keeps me informed about the zeitgeist of my communities and social networks and that makes me a part of them. Time spent dealing with such things is time spent situating myself in the sets, networks and groups that I belong to. Organizations seeking to reduce bacn, especially those that are largely online, had better beware of losing all that salty goodness, because bacn is a thin web that binds us. Especially in a distributed organization, if you lose bacn, you lose the limbic system of the organization or even, in some cases, its nervous system. Organizations are not made of processes; they are made of people, and those people have to connect, have to belong. Bacn supports belonging and connection. But, of course, it can go too far. It is always worth remembering that 21 seconds of bacn is another person’s time gone (for a large company, it might be a second or less) and that person might have been doing something really productive with all of that lost time. But to get rid of bacn makes no more sense than to get rid of brain cells because they don’t address your current needs. An organization, not just its members, has to think and feel, and bacn is part of that thinking and feeling. As ever, though, there is a happy balance.

Being a cog

[Image: cogs]

I’ve saved this one till last because it is not like the others. Being a cog is about the kind of thing that requires individuals to do the work of a machine. For instance, leave-reporting systems that require you to calculate how much leave you have left, how many hours there are in a day, or which days are public holidays (yes, we have one of those). Or systems for reclaiming expenses that require you to know the accounting codes, tax rates, accounting regulations, and approvers for expenses (yes, we have one of those too). Or customer relationship management systems that bombard you with demands that actually have nothing to do with you or that you have already dealt with (yes – we have one of those as well). Or systems that demand that you record the number of minutes spent using a machine that is perfectly capable of recording those minutes itself (yup). This is real work that demands concentration and attention, but it does nothing to help with thinking or social cohesion and does nothing to help the organization grow or adapt. In fact, precisely the opposite. It is a highly demotivating drain on time and energy that saps the life out of an organization, a minute or two at a time. No one benefits from having to do work that machines can do faster, more accurately and more reliably (we used to have one of those). It is plain common sense that investing in someone who can build and maintain better cogs is a lot more efficient and effective than trying (and failing) to train everyone to act exactly like a cog. This is one of those tragedies of hierarchically managed systems. Our ICT department has been set the task of saving money and its managers only control their own staff and systems, so the only place they can make ‘savings’ is in getting rid of the support burden of making and managing cogs. I bet that looks great on paper – they can probably claim to have saved hundreds of thousands or even millions of dollars although, actually, they have not only wasted tens of millions of dollars, but they have probably set the organization on a suicide run. But they could as easily have gone the other way and it might have been just as bad. Over-zealous cog-making is harmful, both because ICT departments have a worrisome tendency to over-do it (I cannot have assignments with no marks, for example, if I wish to enter them into our records system, which I have to do because otherwise the cog that pays tutors will not turn) and because systems change, which means many of the cogs inside them have to change too, and it is not just the devil’s work but an accounting nightmare to get them all to change at the right time. Well-designed ICT systems make it easy to take out a cog or some other sub-assembly and replace it, and they use tools that make cog production fast and simple. Poorly designed systems without such flexibility enslave their users, just as much as users are enslaved when they have to submit to cog-retraining every time their systems change. As ever, there is a happy balance.

Wasting time?

I’m not sure that time is ever lost – it is just spent doing other things. It can certainly be wasted, though, if those other things do not make a positive difference. But it is complicated. Here are just a few of the things I have done today – not a typical day, but then few of them are:

  • reading/responding to emails from staff, students and others: roughly 2.5 hours
  • writing a foreword for a book: roughly 2 hours
  • writing this post: roughly 1 hour
  • walking: roughly 45 minutes
  • making/consuming food and drink: about 30 minutes
  • reading/ making notes on books and papers: roughly 1 hour
  • replying to interview questions: approximately 45 minutes
  • checking my boat didn’t die in the rainstorm: roughly half an hour
  • cleaning and tidying: maybe half an hour
  • writing a book: about 20-30 minutes
  • replying to student posts: roughly 1 hour
  • marking: roughly 1 hour
  • waiting for computers: perhaps half an hour
  • grooming/washing/etc: maybe half an hour
  • checking/listening to the news and weather: roughly 45 minutes
  • taking an afternoon nap: about half an hour
  • Skyping: roughly 15 minutes
  • deleting spam from the Elgg community site: about 10 minutes
  • drying a wet dog: about 5 minutes
  • serious thinking: roughly 12 hours

There are still a couple of hours left of my day before I read a book and eventually go to sleep. Maybe I’ll catch a movie while reading some news after preparing some more food. Maybe I’ll play some guitar or try to get the hang of the sansula one more time. With a bit of luck I might get to chat with my wife (who has been out all day but would normally figure in the list quite a bit). But I hope you get the drift. I don’t think it makes much sense to measure anyone’s life in minutes spent on activities, except for the worst things they do. Time may be worth measuring and accounting for when it is spent doing the things that make us less than human, but it would be better to not do such things in the first place. I have put off responding to the CRM system today and only spent a few minutes checking admin systems in general because, hell, it’s Monday and I have had other things to do. It is all about achieving a happy balance.

What would you miss? Trends in media use in the UK

Really fascinating examination of Ofcom figures on recent changes in use of tools and media in the UK, with some intriguing demographic variations showing enormous differences between young and old, and between richer and poorer (barely discernible gender differences). There are extremely clear trends, though, that cut across demographics. Basically, cellphones/tablets (the two categories are blurring) and TCP/IP-based alternatives to familiar media with analogue antecedents (mainly phone, SMS, TV) are rapidly taking over in almost every segment, especially among the poorer and younger demographics, and the change is occurring incredibly fast. Even native digital technologies like laptops are on the verge of disappearing into a minor niche any moment now. And the title of the article picks out one interesting trend: younger people, in particular, would not miss their TVs much. Most would not even notice they had gone.

Address of the bookmark: http://ben-evans.com/benedictevans/2015/8/10/what-would-you-miss

Best Way to Take Notes In Class Isn't On Your Laptop, (bad) Research Finds

The best way to learn is not to have classes that demand that you take notes to remember their content in the first place. But, putting that very obvious objection to one side for a moment…

This describes one of those awful bits of research that pays no heed to the fact that there are infinitely many ways to take notes and many different purposes behind doing so, nor to the massive differences between individuals.

If:

  1. your intent is to remember what someone is telling you,
  2. you are determined to keep lots of distracting tools open while you are taking notes,
  3. you are not as good at typing as you are at writing with a pen or pencil,
  4. you have an uncontrollable urge to transcribe rather than reflect when taking notes with a computer,
  5. your tools do not include tablets with rich note-keeping features that you are reasonably proficient with, and
  6. you are a pretty average learner,

then, perhaps, on average (with notable exceptions), you might be better off using a pen or a pencil. Or, at least, you should learn how to use a laptop more effectively.

I’m not suggesting you should always use a laptop. There are plenty of occasions when pens etc are more useful (or at least more convenient), for instance if you are the rapporteur for a group or you are sharing a piece of paper, a flip chart or a whiteboard. There are good high-tech solutions for such things but they are expensive, often fragile, and typically come with a learning curve. Such things are not for everyone, at least, not all of the time. But to suggest that you should not take notes with a laptop is to completely miss the point. It ain’t what you do, it’s the way that you do it. See my previous post on a similarly harmful bit of nonsense for more reasons you might prefer to take notes electronically at least some of the time.

The mediaeval pedagogies are the cause of the problems, not the note taking or use of laptops during lectures. I don’t mind teachers suggesting that it is probably not a good idea to do something that demands effort while doing something else that also demands effort. That’s just common sense advice, like warning people not to text and drive. But, if teachers didn’t force people to learn in a hugely ineffective, coercive, power-crazed fashion in the first place, as though centuries of pedagogical research had never happened, none of this would be a problem at all. Nor would it be a problem if, instead of telling students to give up their really useful tools, teachers went to the trouble of helping them to learn how to use them more effectively. That would be more like teaching. Maybe the teachers would learn something in the process too.

 

Address of the bookmark: http://www.nbcnews.com/feature/freshman-year/best-way-take-notes-class-isnt-your-laptop-research-finds-n416831

What Maslow’s Hierarchy Won’t Tell You About Motivation

A simple, clear description of self-determination theory, with a few examples of how it might be applied. The example given is nothing like as good as the description – it is concerned with making people bend to your will through motivational trickery – but the description of SDT is good, as is the brief, dismissive debunking of Maslow’s hierarchy of needs (an armchair theory with little or no foundation in reality).

Address of the bookmark: https://hbr.org/2014/11/what-maslows-hierarchy-wont-tell-you-about-motivation

Keep laptops out of lecture halls, professor says

Another in a long line of ‘keep digital technologies out of the classroom’ nonsense. Sometimes I despair.

[Image: lecture with skeleton]

The sad thing is that this idea (banning the taking of notes in lectures using keyboards) is actually quite valid, in the context of an oppressive, coercive and ineffective pedagogy, given the very limited goals of this kind of transmissive model of learning. If you want your passive students to be able to parrot your wise thoughts back at you, and this is what you value in the assessment, and if you have so little imagination that you can’t figure out a better way to deliver that information than through a lecture, then this is roughly what it takes to make a lecture at least partially work in the manner intended.

To use lectures this way is unbelievably wasteful and stupid. Students will get much more of what you want them to get from just reading a book, or maybe reviewing your own lecture notes or, if you must, watching a recording of your last lecture. Of course, there are normally far better ways to learn than reading or watching, but there is usually a need for simply passing on information in a digestible manner in even the most active approaches to learning.

And yet…

It took me a few years of railing against lectures to realize that lectures are not the problem. I actually don’t mind even the most traditional stand-up-and-preach variety of lectures per se at all. As long as you are not labouring under the illusion that they are at all efficient as a means of helping people to fill their heads with information, and as long as you don’t force people to attend them (including by assessing them on the informational content afterwards), they can play a useful role as catalysts, way-points, and connectors. 

It’s no big deal to give up an hour or so of your time to attend a lecture. You will probably get some inspiration (even if not quite what the lecturer intends), the simple fact that you are devoting time exclusively to it will focus you on the topic of the lecture and give you uninterrupted time to reflect, and it’s a great way to meet people and talk about the topic with them afterwards. As long as you choose whether or not you attend, this can be very motivating. This is even true of rather dull lectures. As long as you don’t set out with the intent of retaining information from them (for which they are very ill suited) they are powerful tools in the pedagogical toolset. 

I do nearly always take notes, typically on a tablet or cellphone, when I attend lectures at conferences etc. A few of those notes may contain reminders about the content, links shared, references, etc: perhaps those might stick better if I wrote them by hand (and, sometimes, I’ll scribble them in the margins of the conference program for that reason). But, mostly, my notes contain my reflections and my responses, which are often quite tangential to the intent of the speaker or the content. I might be provoked by something mistaken or dumb, I might pick up a throwaway bit of wording that sparks a divergent train of thought, or I might see connections with something I have been doing, or maybe discover a different way of seeing the same thing or maybe, occasionally, discover something quite new. Handwritten notes are worse for that kind of thing: they are much more likely to be lost, cannot so easily be re-used, cannot incorporate images of slides or other reminders, cannot contain active hyperlinks and are not so easily indexed.

If you are treating lectures as a source of information then hand-written notes, especially with pictures and visual models of connections, are a good way to make the best of a very bad job. If instead you see lectures as catalysts for thought and creativity, as sparks to light flames, as spaces to reflect, or as conversation starters, then handwritten notes really aren’t that great at all.

 

Address of the bookmark: http://www.ctvnews.ca/lifestyle/keep-laptops-out-of-lecture-halls-professor-says-1.2530738

Personalization in Lumen’s “Next Gen” OER Courseware Pilot

I always enjoy reading posts by David Wiley. This is a good one on the progress of Lumen Learning but the main reason I am bookmarking it is for one of the clearest explanations I have seen of the central problem with far-too-common naive approaches to personalized learning. David uses the example of Google’s seldom-used ‘I’m feeling lucky’ button to explain why having a machine (or, as he puts it ‘a passionless algorithm’) make learning choices for you, even if they are pretty likely to be good ones from a short-sighted objectives-based perspective, is normally a bad idea.

I’d go a bit further. Having a human make those choices for you can be equally bad for learning. While human judgement might lead to better choices than a dispassionate algorithm, the problem in learning is not so much one of making the best choices to reach an objective, but of learning how to make those choices yourself. There is a risk that careless use of analytics by teachers to lead students in a particular direction might simply substitute a human for a machine. Beyond the most trivial of skills (not to trivialize trivial skills), effective teaching – the stuff that persists and transforms – is not about making choices on behalf of a learner. It is much more about provoking and responding (and a host of other things like caring, nurturing, challenging, soothing, inspiring, etc, none of which can be done well by machines).

Having teachers make choices is not what David is talking about, though. He rightly emphasizes the importance of engaging in ‘good old-fashioned conversations’, which are the very opposite of teacher control, and of simply using models from the machine to help inform those conversations. This is great. The more you know about someone, the richer the conversations can be and, as an expert with a good understanding of the model, a teacher should be able to interpret it wisely – an aid to decision-making, not a decision-maker in itself.

I’m not so sure about feeding the model back to the learner directly, though. In all but the most trivial of models there are some big risks of misapprehensions, misdirection, missing parts, and misattributions. Any model is just that – a simplification and abstraction of a much more complex whole. As long as it is understood that way by the learner then you would think all should be fine, but it is not so simple. For example, I was given one of those dreadful fitness tracker devices that uses just such a simple model. It miscounts steps, fails to understand the concept of cycling, sailing, swimming, playing a guitar or even of a standing desk, but nonetheless continues to present believable-looking statistics about my health to me and even tells me in pure Skinner fashion to get up and jog, without having the slightest idea about the state of my knees or ankles, let alone my distaste for jogging. I completely understand the crude and ugly behaviourist reward/punishment pedagogy it attempts to inflict on me, I am fully aware that it is often hundreds of percent wrong about my activity, and I completely get the limitations of the model. But it still draws me in. No matter how much I can intellectually explain that there is nothing inherently meaningful about it counting 500 or 15,000 steps in a day, those reassuring graphs affect me, and not in a good way. Sometimes I have found myself walking places in order to reach the machine’s target when I would otherwise have cycled (a much healthier alternative), and congratulating myself on a nice-looking graph when I know that all I have been doing is playing the guitar (which the machine identifies as walking – maybe it’s my foot tapping). It’s a sure sign of extrinsic motivation when, even though I am the only one that knows or cares, I cheat. Being aware of limitations is not enough.

Address of the bookmark: http://opencontent.org/blog/archives/3965

Wiping out species may boost evolution – study — RT News

Brief report on some interesting research that demonstrates that mass extinctions speed up the evolution of those species that are left. What is particularly nice about it is its support for the principle that evolvability is itself selected by evolution. As Kevin Kelly once memorably put it, ‘change changes itself’. In other words, the rules of change are themselves as much subject to evolution as anything else, and this is one of the central ratchets that leads to divergence and complexity. A crucial if fiendishly hard to implement principle for those of us seeking to seed practical self-organizing systems such as, say, the Landing, or who are looking for a better and more resilient way to run courses and universities.

Address of the bookmark: http://www.rt.com/news/312460-extinction-events-boost-evolution/

ToyRep 3D Printer – Costs Under $85 to Build Using Super Cheap 28BYJ-48 Motors

This is interesting – a fully functional 3D printer for (potentially) under $85. Of course, there are caveats. Though the printer itself seems very capable, even compared with those that cost at least ten or fifteen times as much, a fair amount of skill is needed to build it. Also, it does rely on a fair number of 3D printed parts, so you need to have access to a 3D printer to make one. That said, even if you had to rely on a company to produce those 3D parts for you, and even if you invested in a better printing head than the cheap one described here, it would still be possible to build one of these for a very few hundred dollars. This might not be the perfect solution for schools etc, where reliability and safety are paramount, but it looks like a great alternative for hobbyists wanting to explore Santa Claus machines.

Any moment now, 3D printing looks set to hit the mainstream. I’m still not quite sure what such machines can really do, given their current reliance on PLA or ABS filaments, their slow print speeds, and unreliable operation. I have spent a while browsing Thingiverse looking for projects and have been amused by printable guitars and violins (some glueing and extra components required).  I’ve had a few thoughts about designing bits and pieces like cord organizers, replacement parts for broken devices and instruments, home gadgets, etc, but I have yet to come up with any really compelling use cases that are not more trouble, nor significantly cheaper, than simply buying the things ready made. Most of the objects available on Thingiverse look a lot like uses of Sugru – great fun, ingenious, but embarrassingly amateurish, garish and crude.  And 3D printers are not compact things – you need to put them and their raw materials somewhere. For low-utilization scenarios it’s still more sensible, and not much more expensive, to simply send a design to a 3D printing service.

I feel almost certain that there are educational uses for such things. This is most obviously valuable for kids and those in physical design disciplines (architecture, engineering, interior design, sculpture, etc), and I can think of a few ways of using artefacts to help make concepts more concrete in a physical classroom (physical routers, logic gates, etc, for instance), but I have yet to work out a way to incorporate them into the things I teach online, all of which are conceptual and/or virtual.  I’m hoping that, when I get one, the possible will become more adjacent.

Address of the bookmark: http://3dprint.com/89620/toyrep-3d-printer

Grit: A Skeptical Look at the Latest Educational Fad – Alfie Kohn

One of two related articles by the ever-wonderful Alfie Kohn. The other, on ‘growth mindsets’ is at http://www.alfiekohn.org/article/mindset/

Both are cutting attacks on a couple of terrible education fads that appear to be gaining sway with politicians and that are both, as Kohn explains very well, not about supporting creative, interested, engaged learners but instead about ensuring conformity and control. As is often the case in Kohn’s articles, both swing round to his central agenda of promoting self-determination theory, and both are well informed by substantial research. Kohn is kinder to Carol Dweck’s growth mindset research than to the appallingly unsupported and unsupportable ‘grit’ nonsense promoted by the appropriately named Paul Tough, but the results are much the same: a focus on making individuals fit the system rather than changing the poisonous structure of educational systems themselves.

Read one, read both.

Address of the bookmark: http://www.alfiekohn.org/article/grit/

Teaching with the Internet; or How I Learned to Stop Worrying and Love the Google In My Classroom ~ Stephen's Web

[Image: lecture with skeleton]

Stephen Downes questions Adeline Koh’s questioning of the lecture form for keynotes. He’s right to question.

In a classroom, the lecture is imposed, regularly scheduled, controlled, and it epitomizes all that is wrong in regulated institutional learning. A classroom lecture is about making people learn what you want to make them learn. At least, that’s the norm. And a lecture is incredibly bad at playing that role – much worse than a book or a decent website. That’s why most good teachers don’t habitually lecture or, if they do, they keep lectures very short and situate them in other activities, as Koh suggests they should, and/or use them as ignition points for the real learning that goes on outside the classroom.

A keynote at a conference is not like that at all. With very few exceptions, every attendee makes a deliberate choice to attend and to devote a small chunk of time to being inspired and/or challenged. At least, we hope that’s what will happen. That is, at least, why we try to get keynote speakers with interesting things to say. It’s not a means of drumming facts into people. It’s a voluntarily chosen opportunity to see the world a bit differently, not unlike choosing to see a movie that you suspect will affect you. Personally, I do like to provide a bit of variety and audience engagement in my keynotes, especially if I can encourage attendees to engage face-to-face or online, but that’s really just to keep the interest rolling and to find ways of helping people take ownership of the things that matter to them in whatever it is that I am rabbiting on about. I do so because it’s pretty hard to spend an hour being consistently inspiring and it seems a pity to waste the opportunity to engage with a bunch of smart, interested, like-minded people if they have taken the trouble to attend.

A bad keynote is tedious. I have been bored to sleep by speakers (otherwise some of the greatest thinkers, with really interesting things to say) who just stood up and read at me or, worse, read from their notes while barely looking up. Why bother doing that? I’d much rather watch a movie. Even a bad keynote, though, is not entirely a waste of time. The real value of such a thing is not the boringly delivered lecture itself, but that you are sitting there with a load of other bored people with whom you can talk about it afterwards. It’s a shared focal point. This can help spark some interesting conversations, especially if some people managed to overcome their boredom and found inspiration in the words.

If lectures at schools and universities were run like keynotes, with voluntary attendance and carefully chosen inspirational speakers, it might not be a bad thing at all, though the rest of the accreditation framework would have to change too. There were some optional lectures in my first degree but I attended only one in the whole time I was there. I still remember that lecture quite vividly – it did change how I think and it really was inspiring – but there were dozens of others that I missed because they wouldn’t be on the exam (nor was the one I attended – it was just really interesting and someone I respected had suggested I might like it). I attended dozens of such lectures in my second degree because I was a far more mature learner and I was there to learn, not to pass the test: I attended because I was interested, not because I had to do so, and I got a huge amount out of them and the surrounding conversations. This is what we need – people that learn because they want to, not because we tell them they must, and not because we will punish them if they do not. Disaggregation of teaching and assessment is the crucial next step we absolutely have to take if we are to make institutional education as useful as it should, and easily could, be.

Address of the bookmark: http://www.downes.ca/post/64322