Great critique by Tom Worthington of an alleged for-credit MOOC from MIT that was anything but a MOOC. As Tom rightly points out, two instructors, 31 students, and online materials from EdX do not a MOOC make. As he notes, this kind of instructional process has been working pretty well for decades, including at Athabasca, as it happens. What is relatively novel, perhaps, is the fact that the course itself was supplied at no (extra) cost to the institution. Effectively (though not quite in this case, as it was an MIT course in the first place) this was a typical use of an OER course with accreditation and tuition wrapped around it, following a practice that has been common in many places – especially in developing countries – since the earliest MOOCs in 2008. Tom himself has created a great OER course on green computing that we use here at AU, which follows much the same pattern (though we have lightly adapted the Australian course for local use).
Less stress in online learning?
Tom observes that in this intervention, as in his own teaching, students tend to take the online option due to scheduling difficulties, not by preference, but that they are less stressed by the process than their face-to-face counterparts. This makes sense because there’s a lot more teaching presence in a course that is a) designed for online delivery (usually with great care and attention to detail) and b) supported by live teachers. Online learners in this kind of set-up are getting a huge amount of support for their learning, both from course designers/developers and from their own professors. Technically speaking, some of that exuberance of teaching will cancel out due to the inevitable tension between structure and dialogue implied by transactional distance theory, but the opportunities for feedback on coursework, at least, more than compensate for the high transactional distance caused by the industrial teaching approach of a pre-prepared online course. At least, I hope so, because (though mainly with courses we have developed ourselves and only rarely with OERs) this is exactly what Athabasca University has been doing for nearly 50 years, apparently with some success.
More stress in online teaching?
Personally, I have to admit, I normally hate teaching other people’s courses, although it is something I have often done. However well-developed they might be, there are always things I disagree with, factually and pedagogically, and I deeply dislike the straitjackets such structured courses create. This is perhaps a little hypocritical of me because I expect tutors on my courses to do exactly that, and routinely allocate my own faculty to teach courses that others have written, putting them in exactly that position. Whatever. Few seem to suffer my aversion to the same degree and many seem to positively relish it. I guess it makes it easier, with fewer choices to make. To each their own. But even I am very happy to take an existing OER (like Tom’s) and alter it to my own purposes, and am even happier to offer alternative OERs for my students to use within a pedagogical framework I have created. I think this is just common sense, giving both me and my students plenty of freedom to do what suits us best. Either way, re-use of existing well-designed courses is at least as great an idea as it was when Otto Peters came up with his industrial model of distance learning some decades ago.
The reputation of online learning
Tom notes that online and distance education has a bad reputation: to some extent, yes, sure, some people feel that way. Yes, there have been some bad examples of the modality that have resulted in bad press (ahem…Phoenix) and naive folk that have never experienced online learning do tend to believe that there is some magic that happens face to face that cannot be replicated online. They are right, as it happens: some things are difficult or impossible to replicate and it is a kind of magic. But the converse is also true – great things happen online that cannot be replicated face to face, and that’s a kind of magic too. And, just as not everyone gets a great online experience, for many ‘face to face’ learners the experience is uniformly dire, with large impersonal lectures, ill-conceived pedagogies delivered by untrained teachers, and considerably less human interaction than what would typically be found online. On balance, while it is not quite correct to say that there is no significant difference, because there really are some basic differences in the need for self-management and control, there is no significant difference in the outcomes we choose to measure.
But, to return to the point, although some look upon online degrees less kindly, there are many employers who actively prefer those that have learned online because it is strong proof of their self-determination, will-power, and desire to succeed. I can confirm this positive perception: our students at AU are, on average, streets ahead of their traditionally taught counterparts, especially when you consider that a great many do not have the traditional qualifications needed to get into a conventional institution. I am constantly amazed by the skill and perseverance of our amazing students. On my own courses, especially in graduate teaching, I do everything I can to enable them to teach one another, because they tend to come to us with an incredible wealth of knowledge that just needs to be tapped and channelled.
A workable model
Though Tom is a little critical, I see value in what MIT is doing here. For some years now I have been trying to make the case at AU that we should be offering support for, and the means to credential, MOOCs offered elsewhere. This would give freedom to students to pick ways of learning that suit them best, to gain the benefits of diversity, and allow us to provide the kind of tutorial support and accreditation that we are pretty good at, at only a fraction of the (roughly) $100K cost of developing a typical course. It would give us the freedom to extend our offerings quite considerably, and avoid the need to keep developing the same curricula that are found everywhere else, so that we could differentiate ourselves by not just the style of teaching but also the subjects that we offer. This can in principle be done to some extent already through our challenge process (if you can find an equivalent course, take it, then take our challenge paper for a lot less than the price of a full course) and we do have independent study courses at graduate level that can be used much this way, with tutor support. But we could make a lot more of it if we did it just a bit more mindfully.
I suspect everyone on Athabasca University’s staff will be very interested in these posts by Matthew Prineas, who we will welcome on September 5th as our new provost and VPA, that show a great understanding of at least some of the benefits and challenges of distance learning. Amongst other things, he has done some really good work on embedding OERs at UMUC, and has strong credentials (!) in the field of competency based methods of learning and accreditation. These things matter a great deal to our future. It also seems that he has a subtle appreciation of our distributed teaching approach, though I should note that there are more ways to skin this cat than the industrial model – we need to aim for post-industrial, where we achieve economies of scale not (just) by write-once-deliver-many teaching but by leveraging the value of human interaction on a large scale that distributed network technologies enable. It is great, though, that we’re getting a VPA who seems aligned with our mission and who reaches out to the world through social media. See, too, his Twitter posts at https://twitter.com/mprineas?lang=en
Tony rightly points out that our problems are more internal than external, and that the solutions have to come from us, not from outside. To a large extent he hits the nail right on the head when he notes:
“Major changes in course design, educational technology, student support and administration, marketing and PR are urgently needed to bring AU into advanced 21st century practice in online and distance learning. I fear that while there are visionary faculty and staff at AU who understand this, there is still too much resistance from traditionalists and those who see change as undermining academic excellence or threatening their comfort zone.”
It is hard to disagree. But, though there are too many ostriches among our staff and we do have some major cultural impediments to overcome, it is far less the people who impede our progress than our design itself, and the technologies – especially the management technologies – of which it consists. That must change, as a corequisite to changing the culture that goes along with it. With some very important exceptions (more on that below) our culture is almost entirely mediated through our organizational and digital technologies, most notably in the form of very rigid processes, procedures and rules, but also through our IT. Our IT should, but increasingly does not, embody those processes. The processes still exist, of course – it’s just that people have to perform them instead of machines. Increasingly often, to make matters worse, we shape our processes to our ill-fitting IT rather than vice versa, because the ‘technological debt’ of adapting them to our needs and therefore having to maintain them ourselves is considered too great (a rookie systems error caused by splitting IT into a semi-autonomous unit that has to slash its own costs without considering the far greater price paid by the university at large). Communication, when it occurs, is almost all explicit and instrumental. We do not yet have enough of the tacit flows of knowledge and easy communication that patch over or fix the (almost always far greater) flaws that exist in such processes in traditional bricks and mortar institutions. The continual partial attention and focused channels of communication resulting from working online mean that we struggle with tacit knowledge and the flexibility of embedded dialogue in ways old-fashioned universities never have to even think about. One of the big problems with being so process-driven is that, especially in the absence of richer tacit communication, it is really hard to change those processes, especially because they have evolved to be deeply entangled with one another – changing one process almost always means changing many, often in structurally separate parts of the institutional machine, and involves processes of its own that are often entangled with those we set out to change. As a result, for much of its operation, our university does what it does despite us, not because of us. Unlike traditional universities, we have nothing else to fall back on when it fails, or when things fall between the cracks. And, though we likely have far fewer than most traditional universities, there are still very many cracks to fall through.
This, not coincidentally, is exactly true of our teaching too. We are pretty darn good at doing what we explicitly intend to do: our students achieve learning outcomes very well, according to the measures we use. AU is a machine that teaches, which is fine until we want the machine to do more than what it is built to do or when other, faster, lighter, cheaper machines begin to compete with it. As well as making it really hard to make even small changes to teaching, what gets lost – and what matters about as much as what we intentionally teach – is the stuff we do not intend to teach, the stuff that makes up the bulk of the learning experience in traditional universities, the stuff where students learn to be, not just to do. It’s whole-person learning. In distance and online learning, we tend to just concentrate on parts we can measure and we are seldom even aware of the rest. There is a hard and rigid boundary between the directed, instrumental processes and the soft, invisible patterns of culture and belonging, beyond which we rarely cross. This absence is largely what gives distance learning a bad reputation, though it can be a strength if focused teaching of something well-defined is exactly what is needed, or if students are able to make the bigger connections in other ways (true of many of our successful students), when the control that the teaching method provides is worth all the losses and where a more immersive experience might actually get in the way. But it’s a boundary that alienates a majority of current and prospective students. A large percentage of even those we manage to enrol and keep with us would like to feel more connected, more a part of a community, more engaged, more belonging. A great many more don’t even join us in the first place because of that perceived lack, and a very large number drop out before submitting a single piece of work as a direct result.
This is precisely the boundary that the Landing is intended to be a step towards breaking down.
If we cannot figure out how to recover that tacit dimension, there is little chance that we can figure out how to teach at a distance in a way that differentiates us from the crowd and that draws people to us for the experience, rather than for the qualification. Not quite fair. Some of us will. If you get the right (deeply engaged) tutor, or join the right (social and/or open) course, or join the Landing, or participate in local meet-ups, or join other social media groups, you may get a fair bit of the tacit, serendipitous, incidental learning and knowledge construction that typifies a traditional education. Plenty of students do have wonderful experiences learning with others at AU, be it with their tutors or with other students. We often see those ones at convocation – ones for whom the experience has been deep, meaningful, and connected. But, for many of our students and especially the ones that don’t make it to graduation (or even to the first assignment), the chances of feeling that you belong to something bigger, of learning from others around you, of being part of a richer university experience, are fairly low. Every one of our students needs to be very self-directed, compared with those in traditional institutions – that’s a sine qua non of working online – but too many get insufficient support and too little inspiration from those around them to rise beyond that or to get through the difficult parts. This is not too surprising, given that we cannot do it for ourselves either. When faced with complicated things demanding close engagement, too many of our staff fall back on the comfortable, easy solution of meeting face to face in one of our various centres rather than taking the hard way, and so the system remains broken. This can and will change.
Moving on
I am much heartened by the Coates report which, amongst other things but most prominently and as our central value proposition, puts our leadership in online and distance education at the centre of everything. This is what I have unceasingly believed we should do since the moment I arrived. The call to action of Coates’s report is fundamentally to change our rigid dynamic, to be bold, to innovate without barriers, to evolve, to make use of the astonishingly good resources – primarily our people – to (again) lead the online learning world. As a virtual institution this should be easier for us than it would be for others but, perversely, it is exactly the opposite. This is for the aforesaid reasons, and also because the boundaries of our IT systems create the boundaries of our thinking, and embed processes more deeply and more inflexibly than almost any bricks and mortar establishment could hope to do. We need soft systems, fuzzy systems, adaptable systems, agile systems for our teaching, research, and learning community development, and we need hard systems, automated systems, custom-tailored, rock-solid systems for our business processes, including the administrative and assessment-recording outputs of the teaching process. This is precisely the antithesis of what we have now. As Coates puts it:
“AU should rebrand itself as the leading Canadian centre for online learning and twenty-first century educational technology. AU has a distinct and potentially insurmountable advantage. The university has the education technology professionals needed to provide leadership, the global reputation needed to attract and hold attention, and the faculty and staff ready to experiment with and test new ideas in an area of emerging national priority. There is a critical challenge, however. AU currently lacks the ICT model and facilities to rise to this opportunity.”
We live in our IT…
We have long been challenged by our IT systems, but things were not always so bad. Over the past few years our ICT model has turned 180 degrees away from one that will support continuing evolution and innovation, driven by people who know little about our core mission and who have failed to understand what makes us special as a university. The best defence offered for these poor decisions is usually that ‘most other universities are doing it,’ but we are not most other universities. ICTs are not just support tools or performance enhancers for us. We are our IT. It is our one and only face to our students and the world. Without IT, we are literally nothing. We have massively underinvested in developing our IT, and what we have done in recent years has destroyed our lead, our agility, and our morale. Increasingly, we have rented generic, closed, off-the-shelf cloud-based applications that would be pretty awful in a factory, that force us into behaviours that make no sense, that sap our time and will, and that are so deeply inappropriate for our unique distributed community that they stifle all progress, and cut off almost all avenues of innovation in the one area that we are best placed to innovate and lead. We have automated things that should not be automated and let fall into disrepair the things that actually give us an edge. For instance, we rent an absurdly poor CRM system to manage student interactions, building a call centre for customers when we should be building relationships with students, embedding our least savoury practices of content delivery still further, making tweaks to a method of teaching that should have died when we stopped using the postal service for course packs. Yes, when it works, it incrementally improves a broken system, so it looks OK (not great) on reports, but the system it enhances is still irrevocably broken and, by further tying it into a hard embodiment in an ill-fitting application, the chances of fixing it properly diminish further. And, of course, it doesn’t work, because we have rented an ill-fitting system designed for other things with little or no consideration of whether it meets more than coarse functional needs. This can and must change.
Meanwhile, we have methodically starved the environments that are designed for us and through which we have innovated in the past, and that could allow us to evolve. Astonishingly, we have had no (as in zero) central IT support for research for years now, getting by on a wing and a prayer, grabbing for bits of overtime where we can, or using scarce, poorly integrated departmental resources. Even very well-funded and well-staffed projects are stifled by it because almost all of our learning technology innovations are completely reliant on access, not only to central services (class lists, user logins, LMS integration, etc), but also to the staff that are able to perform integrations, manage servers, install software, configure firewalls, etc, etc. We have had a 95% complete upgrade for the Landing sitting in the wings for nearly 2 years, unable to progress due to lack of central IT personnel to implement it, even though we have sufficient funds to pay for them and then some, and the Landing is actively used by thousands of people. Even our mainstream teaching tools have been woefully underfunded and undermined: we run a version of Moodle that is past even its security update period, for instance, and that creaks along only thanks to a very small but excellent team supporting it. Tools supporting more innovative teaching with more tenuous uptake, such as Mahara and OpenSIM servers, are virtual orphans, riskily trundling along with considerably less support than even the Landing.
This can and will change.
… but we are based in Athabasca
There are other things in Coates’s report that are given a very large emphasis, notably advice to increase our open access, particularly through forming more partnerships with Northern Albertan colleges serving indigenous populations (good – and we will need smarter, more human, more flexible, more inclusive systems for that, too), but mainly a lot of detailed recommendations about staying in Athabasca itself. This latter recommendation seems to have been forced upon Coates, and it comes with many provisos. Coates is very cognizant of the fact that being based in the remote, run-down town of Athabasca is, has been, and will remain a huge and expensive hobble. He mostly skims over sensitive issues like the difficulty of recruiting good people to the town (a major problem that is only slightly offset by the fact that, once we have got them there, they are quite unlikely to leave), but makes it clear that it costs us very dearly in myriad other ways.
“… the university significantly underestimates the total cost of maintaining the Athabasca location. References to the costs of the distributed operation, including commitments in the Town of Athabasca, typically focus on direct transportation and facility costs and do not incorporate staff and faculty time. The university does not have a full accounting of the costs associated with their chosen administrative and structural arrangements.”
His suggestions, though making much of the value of staying in Athabasca and heavily emphasizing the importance of its continuing role in the institution, involve moving a lot of people and infrastructure out of it and doing a lot of stuff through web conferencing. He walks a tricky political tightrope, trying to avoid the hot potato of moving away while suggesting ways that we should leave. He is right on both counts.
Short circuits in our communications infrastructure
Though cost, lack of decent ICT infrastructure, and difficulties recruiting good people are factors in making Athabasca a hobble for us, the biggest problem is, again, structural. Unlike those working online, among those living and working in the town of Athabasca itself, all the traditional knowledge flows occur without impediment, almost always to the detriment of more inclusive ways of online communication. Face to face dialogue inevitably short-circuits online engagement – always has, always will. People in Athabasca, as any humans would and should, tend to talk among themselves, and tend to only communicate with others online, as the rest of us do, in directed, intentional ways. This might not be so bad were it not for the fact that Athabasca is very unrepresentative of the university population as a whole, containing the bulk of our administrators, managers, and technical staff, with less than 10 actual faculty in the region. This is a separate subculture, it is not the university, but it has enormous sway over how we evolve. It is not too surprising that our most critical learning systems account for only about 5% of our IT budget because that side of things is barely heard of among decision-makers and implementors that live there and they only indirectly have to face the consequences of its failings (a matter made much worse by the way we disempower the tutors that have to deal with them most of all, and filter their channels of communication through just a handful of obligated committee members). It is no surprise that channels of communication are weak because those that design and maintain them can easily bypass the problems they cause. In fact, if there were more faculty there, it would be even worse, because then we would never face any of the problems encountered by our students. Further concentrations of staff in Edmonton (where most faculty reside), St Albert (mainly our business faculty) and Calgary do not help one bit, simply building further enclaves, which again lead to short circuits in communication and isolated self-reinforcing clusters that distort our perspectives and reduce online communication. Ideas, innovations, and concerns do not spread because of hierarchies that isolate them, filter them as they move up through the hierarchy, and dissipate them in Athabasca. Such clustering could be a good part of the engine that drives adaptation: natural ecosystems diversify thanks to parcellation. However, that’s not how it works here, thanks to the aforementioned excess in structure and process and the fact that those clusters are far from independently evolving. They are subject to the same rules and the same selection pressures as one another, unable to independently evolve because they are rigidly, structurally, and technologically bound to the centre. This is not evolution – it is barely even design, though every part of it has been designed and top-down structures overlay the whole thing. It’s a side effect of many small decisions that, taken as a whole, result in a very flawed system.
This can and must change.
The town of Athabasca and what it means to us
Though I have made quite a few day trips to Athabasca over the years, I had never stayed overnight until around convocation time this year. Though it was a busy few days so I only had a little chance to explore, I found it to be a fascinating place that parallels AU in many ways. The impression it gives is of a raw, rather broken-down and depressed little frontier town of around 4,000 souls (a village by some reckonings) and almost as many churches. It was once a thriving staging post on the way to the Klondike gold rush, when it was filled with the rollicking clamour of around 20,000 prospectors dreaming of fortunes. Many just passed through, but quite a few stayed, helping to define some of its current character but, when the gold rush died down, there was little left to sustain a population. Much of the town still feels a bit temporary, still a bit of a campground waiting to turn into a real town. Like much of Northern Alberta, its fortunes in more recent years have been significantly bound to the oil business, feeding an industry that has no viable future and the morals of an errant crow, tied to its roller coaster fortunes. There are signs that money has been around, from time to time: a few nice buildings, a bit of landscaping here and there, a memorial podium at Athabasca Landing. But there are bigger signs that it has left.
Today, Athabasca’s bleak main street is filled with condemned buildings, closed businesses, discount stores, and shops with ‘sale’ signs in their windows. There are two somewhat empty town centre pubs, where a karaoke night in one will denude the other of almost all its customers.
There are virtually no transit links to the outside world: one Greyhound bus from Edmonton (2 hours away) comes through it, in the dead of night, and passenger trains stopped running decades ago. The roads leading in and out are dangerous: people die way too often getting there, including one of our most valued colleagues in my own school. It is never too far from being reclaimed by the forces of nature that surround it. Moose, bear, deer, and coyotes wander fairly freely. Minus forty temperatures don’t help, nor does a river that is pushed too hard by meltwaters from the rapidly receding Athabasca Glacier and that is increasingly polluted by the side-effects of oil production.
So far so bleak. But there are some notable upsides too. The town is full of delightfully kind, helpful, down-to-earth people infused with that wonderful Canadian spirit of caring for their neighbours, grittily facing the elements with good cheer, getting up early, eating dinner in the late afternoon, gathering for potlucks in one another’s houses, and organizing community get-togethers. The bulk of housing is well cared-for, set in well-tended gardens, in quiet, neat little streets. I bet most people there know their neighbours and their kids play together. Though tainted by its ties with the oil industry, the town comes across as, fundamentally, a wholesome centre for homesteaders in the region, self-reliant and obstinately surviving against great odds by helping one another and helping themselves. The businesses that thrive are those selling tools, materials, and services to build and maintain your farm and house, along with stores for loading your provisions into your truck to get you through the grim winters. It certainly helps that a large number of residents are employees of the university, providing greater diversity than is typically found in such settlements, but they are frontier folk like the rest. They have to be.
It would be unthinkable to pull the university out at this point – it would utterly destroy an already threatened town and, I think, it would cause great damage to the university. This was clearly at the forefront of Coates’s mind, too. The solution is not to withdraw from this strange place, but to dilute and divert the damage it causes and perhaps, even, to find ways to use its strengths. Greater engagement with Northern communities might be one way to save it – we have some big largely empty buildings up there that will be getting emptier, and that might not be a bad place for some face-to-face branching out, perhaps semi-autonomously, perhaps in partnership with colleges in the region. It also has potential as a place for a research retreat though it is not exactly a Mecca that would draw people to it, especially without transit links to sustain it. A well-designed research centre cost a fortune to build, though, so it would be nice to get some use out of it.
Perhaps more importantly, we should not pull out because Athabasca is a part of the soul of the institution. It is a little fitting that Athabasca University has – not without resistance – had its fortunes tied to this town. Athabasca is kind-of who we are and, to a large extent, defines who we should aspire to be. As an institution we are, right now, a decaying frontier town on the edge of civilization that was once a thriving metropolis, forced to help ourselves and one another battle with the elements, a caring bunch of individuals bound by a common purpose but stuck in a wilderness that cares little for us and whose ties with the outside world are fickle, costly, and tenuous. Athabasca is certainly a hobble but it is our hobble and, if we want to move on, we need to find ways to make the best of it – to find value in it, to move people and things away from it that it impedes the most, at least where we can, but to build upon it as a mythic hub that helps to define our identity, a symbolic centre for our thinking. We can and will help ourselves and one another to make it great again. And we have a big advantage that our home town lacks: a renewable and sustainable resource and product. Very much unlike Athabasca the town, the source of our wealth is entirely in our people, and the means we have for connecting them. We have the people already: we just need to refocus on the connection.
Interesting article on the rights of companies to moderate posts, following the recent Reddit furore that, in microcosm, raises a bunch of questions about the future of the social net itself. The distinction between freedom of speech and the rights of hosts to do whatever they goddam please – legal constraints permitting – is a fair and obvious one to make.
The author’s suggestion is to decentralize social media systems (specifically Twitter and Reddit though, by extension, others are implicated) by providing standards/protocols that could be implemented by multiple platforms, allowing the development of an ecosystem where different sites operate different moderation policies but, from an end-user perspective, being no more difficult to use than email.
The general idea behind this is older than the Internet. Of course, there already exist many systems that post via proprietary APIs to multiple places, from WordPress plugins to Known, not to mention those ubiquitous ‘share’ buttons found everywhere, such as at the bottom of this page. But, more saliently, email (SMTP), Internet Relay Chat (IRC), Jabber (XMPP), and Usenet news (NNTP) are prototypical and hugely successful examples of exactly this kind of thing. In fact, NNTP is so close to Reddit’s pattern in form and intent that I don’t see why it could not be re-used, perhaps augmented to allow smarter ratings (not difficult within the existing standard). Famously, Twitter’s choice of character limit is entirely down to fitting a whole Tweet, including metadata, into a single SMS message, so that part is already essentially done. However, standards are not often in the interests of companies seeking lock-in and a competitive edge. Most notably, though they very much want to encourage posting in as many ways as possible, they very much want control of the viewing environment, as the gradual removal of RSS from prominent commercial sites like Twitter and Facebook shows in spades. I think that’s where a standard like this would run into difficulties getting off the ground. That and Metcalfe’s Law: people go where people go, and network value grows proportionally to the square of the number of users of a system (or far more than that, if Reed’s Law holds). Only a truly distributed, ubiquitously used system could avoid that problem. Such a thing has been suggested for Reddit and may yet arrive.
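To make the NNTP point slightly more concrete, here is a minimal sketch (Python, standard library only) of a Reddit-style post expressed as a Usenet-style article. The X-Rating header and its up/down syntax are my own invention for illustration – experimental extension headers of this kind are allowed by the standard, but no such convention actually exists.

```python
# A sketch of a Reddit-style post as a Usenet/NNTP-style article. The
# standard headers (From, Newsgroups, Subject, Message-ID, References)
# already give us identity and threading; the X-Rating header is a
# hypothetical extension carrying votes.
from email import message_from_string

raw_article = """\
From: someone@example.org
Newsgroups: alt.discuss.moocs
Subject: Should universities credential MOOCs taken elsewhere?
Message-ID: <1234@example.org>
References: <1230@example.org>
X-Rating: up=42; down=3

Threading already works via the References header, so a reader could
score and collapse threads much as Reddit does, using only headers.
"""

article = message_from_string(raw_article)
votes = dict(part.strip().split("=") for part in article["X-Rating"].split(";"))
score = int(votes["up"]) - int(votes["down"])
print(article["Subject"], "->", score)   # ... -> 39
```

Whether anyone would adopt such a convention is another question entirely, but mechanically there is nothing here that the decades-old protocol could not already carry, threading included.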
As long as we are in thrall to a few large centralized commercial companies and their platforms – the Stacks, as Bruce Sterling calls them – it ain’t going to work. Though an incomplete, buggy and over-complex implementation played a role, proprietary interest is essentially what has virtually killed OpenSocial, a brilliant idea that was much along these lines but more open, and one that had virtually every large Internet company on board, bar one. Sadly, that one was the single most avaricious, amoral, parasitic company on the Web. Almost single-handedly, Facebook managed to virtually destroy the best thing that might have happened to the social web, one that could have made it a genuine web rather than a bunch of centralized islands. It’s still out there, under the auspices of the W3C, but it doesn’t seem to be showing much sign of growth or deployment.
Facebook has even bigger and worser ambitions. It is now, cynically and under the false pretence of opening access to third world countries, after the Internet itself. I hope the company soon crashes and burns as fast as it rose to prominence – this is theoretically possible, because the same cascades that created it can almost as rapidly destroy it, as the once-huge MySpace and Digg discovered to their cost. Sadly, it is run by very smart people who totally get networks and how to exploit them, and it has no ethical qualms to limit its growth (though it does have some ethical principles about some things, such as open source development – its business model is evil, but not all of its practices). It has so far staunchly resisted attack, notwithstanding its drop in popularity in established markets and a long history of truly stunning breaches of trust.
Do boycott Facebook if you can. If you need a reason, other than that you are contributing to the destruction of the open web by using it, remember that it tracks you hundreds of times in a single browsing session and, flouting all semblance of ethical behaviour, it attempts to track you even if you opt out from allowing that. You are its product. Sadly, with its acquisition of companies like Instagram and WhatsApp, even if we can kill the primary platform, the infection is deep. But, as Reed’s Law shows, though each new user increases its value, every user that leaves Facebook or even simply ignores it reduces its value by a similarly exponential amount. Your vote counts!
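For what it’s worth, the arithmetic behind that claim is easy to sketch. Metcalfe’s Law values a network of n users at roughly n² (the number of possible pairwise connections); Reed’s Law at roughly 2^n (the number of possible sub-groups), so a single departure wipes out half of the potential groups. A toy calculation, with a deliberately tiny and entirely illustrative network:

```python
# Toy comparison of network-value laws: Metcalfe (n^2, pairwise links)
# versus Reed (2^n, possible sub-groups). Absolute numbers are meaningless;
# the point is how much a single departing user removes under each law.
def metcalfe(n: int) -> int:
    return n * n           # value ~ number of possible pairwise connections

def reed(n: int) -> int:
    return 2 ** n          # value ~ number of possible sub-groups

n = 30                     # a deliberately tiny 'network'
for law in (metcalfe, reed):
    before, after = law(n), law(n - 1)
    lost = 1 - after / before
    print(f"{law.__name__}: one user leaving removes {lost:.0%} of the value")
# metcalfe: one user leaving removes 7% of the value
# reed: one user leaving removes 50% of the value
```

Reed’s Law almost certainly overstates the case in practice, but even the more conservative reading makes the same point: departures matter more than they look.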
I was writing about openness in education in a chapter I am struggling with today, and had just read Tony Bates’s comments on iQualify, an awful cloud rental service offering a monolithic locked-in throwback that just makes me exclaim, in horror, ‘Oh good grief! Seriously?’ And it got me thinking.
Learning management systems, as implemented in academia, are basically paywalls. You don’t get in unless you pay your fees. So why not pick up on what publishers infamously already do and allow people to pay per use? In a self-paced model like that used at Athabasca it makes perfect sense, and most of the infrastructure – role-based, time-based access and so on – and of course the content already exists. Not every student needs 6 months of access or the trimmings of a whole course but, especially for those taking a challenge route (just the assessment), it would often be useful to have access to a course for a little while in order to get a sense of what the expectations might be, the scope of the content, and the norms and standards employed. On occasion, it might even be a good idea to interact with others. Perhaps we could sell daily, weekly or monthly passes. Or we could do it at a finer level of granularity too/instead: a different pass for different topics, or for different components like forums, quizzes or assignment marking. Following the publishers’ lead, such passes might together cost 10 or 20 times the total cost of simply subscribing to a whole course if every option were purchased, but students could strategically pick the parts they actually need, so reducing their own overall costs.
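To see how the economics might play out, here is a deliberately crude sketch with entirely invented prices (none of these figures are real Athabasca fees): the full menu of passes, bought repeatedly, adds up to far more than a whole course, while a strategic challenge-route student buys only a sliver of it.

```python
# A crude sketch of the pay-per-use idea, with entirely invented prices.
# The publishers' trick: price the full menu of passes well above the
# whole-course fee, knowing that most buyers only ever want a few items.
whole_course_fee = 800            # hypothetical fee for a full six-month course

passes = {                        # hypothetical a la carte prices
    "content_week": 150,          # one week of access to course materials
    "forum_month": 100,           # one month in the discussion forums
    "quiz_bank": 150,             # access to self-test quizzes
    "assignment_marked": 300,     # one assignment marked with feedback
    "tutor_hour": 200,            # one hour of tutor time
}

# Buying every option for a whole six-month course: roughly 10x the fee.
everything = (26 * passes["content_week"]
              + 6 * passes["forum_month"]
              + passes["quiz_bank"]
              + 5 * passes["assignment_marked"]
              + 10 * passes["tutor_hour"])

# A strategic challenge-route student: a quick look at scope and norms,
# a self-check, and the assessment itself: well under the full fee.
challenge_route = (passes["content_week"]
                   + passes["quiz_bank"]
                   + passes["assignment_marked"])

print(whole_course_fee, everything, challenge_route)   # 800 8150 600
```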
This idea is, of course, stupid. This is not because it doesn’t make economic and practical sense: it totally does, notwithstanding the management, technical and administrative complexity it entails. It is stupid because it flips education on its head. It makes chunks of learning into profit centres rather than the stuff of life. It makes education into a product rather than celebrating its role as an agent of personal and societal growth. It reduces the rich, intricately interwoven fabric of the educational experience to a set of instrumentally-driven isolated events and activities. It draws attention to accreditation as the be-all and end-all of the process. It is aggressively antisocial, purpose-built to reduce the chances of forming a vibrant learning community. This is beginning to sound eerily familiar. Is that not exactly what, in too high a percentage of our courses, we are doing already?
If we and other universities are to survive and thrive, the solution is not to treat courses and accreditation as products or services. The ongoing value of a university is to catalyze the production and preservation of knowledge: that is what we are here for, that is what makes us worth having. Courses are just tools that support that process, though they are far from the only ones, while accreditation is not even that: it’s just a byproduct, effluent from the educational process that happens to have some practical societal value (albeit at enormous cost to learning). In physical universities there are vast numbers of alternatives that support the richer purpose of creating and sustaining knowledge: cafes, quads, hallways, common rooms, societies, clubs, open lectures, libraries, smoking areas, student accommodation, sports centres, theatres, workshops, studios, research labs and so on. Everywhere you go you are confronted with learning opportunities and people to learn with and from, and the taught courses are just part of the mix, often only a small part. At least, that is true in a slightly idealized world – sadly, the vast majority of physical universities are as stupidly focused on the tools as we are, so those benefits are an afterthought rather than the main thing to celebrate, and are often the first things to suffer when cuts come along. Online, such beyond-the-course opportunities are few and far between: the Landing is (of course) built with exactly that concern in mind, but there’s precious little sign of it anywhere else at AU, one of the most advanced online universities in the world. The nearest thing most students get to it is the odd Facebook group or Twitter interaction, which seems an awful waste to me, though a fascinating phenomenon that blurs the lines between the institution and the broader community.
It is already possible to take a high-quality course for free in almost any subject that interests you and, more damagingly, there will soon be sources of accreditation that are as prestigious as those awarded by universities but orders of magnitude cheaper, not to mention compellingly cut-price options from universities that can leverage their size and economies of scale (and, perhaps, cheap labour) to out-price the rest of us. Competing on these grounds makes no sense for a publicly funded institution whose role is not to be an accreditation mill but to preserve, critique, observe, transform and support society as a whole. We need to celebrate and cultivate the iceberg, not just its visible tip. Our true value is not in our courses but in our people (staff and students) and the learning community that they create.
Interesting and thoughtful argument from Savage Minds mainly comparing the access models of two well-known anthropology journals, one of which has gone open and seems to be doing fine, the other of which is in dire straits and almost certainly needs to open up, though it may be too late. I like two quotes in particular. The first is from the American Anthropologist’s editorial, explaining the difficulties they are in:
“If you think that making money by giving away content is a bad idea, you should see what happens when the AAA tries to make money selling it. To put it kindly, our reader-pays model has never worked very well. Getting over our misconceptions about open access requires getting over misconceptions of the success of our existing publishing program. The choice we are facing is not that of an unworkable ideal versus a working system. It is the choice between a future system which may work and an existing system which we know does not.”
I like that notion of blurring and believe that this is definitely the way to go. We are greatly in need of new models for the sharing, review, and discussion of academic works because the old ones make no sense any more. They are expensive, untimely, exclusionary and altogether over-populous. There have been many attempts to build dedicated platforms for that kind of thing over the years (one of my favourites being the early open peer-reviewing tools of JIME in the late 1990s, now a much more conventional journal, to its loss). But perhaps one of the most intriguing approaches of all comes not from academic presses but from the world of student newspapers. This article reports on a student newspaper shifting entirely into the (commercial but free) social media of Medium and Twitter, getting rid of the notion of a published newspaper altogether but still retaining some kind of coherent identity. I don’t love the notion of using these proprietary platforms one bit, though it makes a lot of sense for cash-strapped journalists trying to reach and interact with a broad readership, especially of students. Even so, there might be more manageable and more open, persistent ways (e.g. syndicating from a platform like WordPress or Known). But I do like the purity of this approach and the general idea is liberating.
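As a small illustration of that more open route: the syndication half is already almost trivial, since anything that publishes an RSS or Atom feed (WordPress and Known both do) can be pulled and re-posted elsewhere in a few lines. A sketch, assuming the third-party feedparser library and a purely hypothetical cross_post() function standing in for whichever platform API you choose:

```python
# Sketch of POSSE-style syndication (publish on your own site, syndicate
# elsewhere): pull entries from an open RSS/Atom feed and hand them to
# whatever social platform you like. cross_post() is a hypothetical stand-in
# for a real platform API call (Medium, Twitter, or anything else).
import feedparser   # third-party: pip install feedparser

FEED_URL = "https://example.org/student-paper/feed/"   # e.g. a WordPress feed

def cross_post(title: str, link: str, summary: str) -> None:
    # Placeholder: in reality this would call a platform's posting API.
    print(f"Would post: {title} -> {link}")

feed = feedparser.parse(FEED_URL)
for entry in feed.entries[:10]:                 # most recent items
    cross_post(entry.title, entry.link, entry.get("summary", ""))
```

The awkward half, as ever, is the other direction: getting the comments and interactions back out of the proprietary platforms and onto the open, persistent copy.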
It might be too radical an idea for academia to embrace at the moment but I see no reason at all that a reliable curatorial team, with some of the benefits of editorial control, posting exclusively to social media, might not entirely replace the formal journal, for both process and product. It already happens to an extent, including through blogs (I have cited many), though it would still be a brave academic that chose to cite only from social media sources, at least for most papers and research reports. But what if those sources had the credibility of a journal editorial team behind them and were recognized in similar ways, with the added benefit of the innate peer review social media enables? We could go further than that and use a web of trust to assert validity and authority of posts – again, that already occurs to some extent and there are venerable protocols and standards that could be re-used or further developed for that, from open badges to PGP, from trackbacks to WebMention. We are reaching the point where subtle distinctions between social media posts are fully realizable – they are not all one uniform stream of equally reliable content – where identity can be fairly reliably asserted, and where such an ‘unjournal’ could be entirely distributed, much like a Connectivist MOOC. Maybe more so: there is no reason there should even be a ‘base’ site to aggregate it all, as long as trust and identity were well established. It might even be unnecessary to have a name, though a hashtag would probably be worth using.
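To give a flavour of how the trust part might work, here is a toy sketch (not any real badge, PGP, or WebMention mechanism): a few seed editors are trusted outright, trust propagates with decay to the reviewers they vouch for, and a post’s authority is simply the summed trust of whoever endorsed it.

```python
# Toy web-of-trust scoring for an 'unjournal': a handful of seed editors are
# trusted outright; trust propagates (with decay) to reviewers they vouch for;
# a post's authority is the summed trust of its endorsers.
# Purely illustrative, not a real protocol.
from collections import deque

vouches = {                       # who vouches for whom
    "editor_a": ["reviewer_1", "reviewer_2"],
    "editor_b": ["reviewer_2", "reviewer_3"],
    "reviewer_2": ["reviewer_4"],
}
endorsements = {                  # which posts each person endorsed
    "reviewer_1": ["post_x"],
    "reviewer_3": ["post_x", "post_y"],
    "reviewer_4": ["post_y"],
}
seeds, decay = {"editor_a": 1.0, "editor_b": 1.0}, 0.5

# Breadth-first propagation of trust outward from the seed editors.
trust, queue = dict(seeds), deque(seeds)
while queue:
    person = queue.popleft()
    for vouched in vouches.get(person, []):
        propagated = trust[person] * decay
        if propagated > trust.get(vouched, 0):
            trust[vouched] = propagated
            queue.append(vouched)

# A post's authority is the total trust of its endorsers.
authority = {}
for person, posts in endorsements.items():
    for post in posts:
        authority[post] = authority.get(post, 0) + trust.get(person, 0)

print(trust)       # e.g. reviewer_2 -> 0.5, reviewer_4 -> 0.25
print(authority)   # e.g. post_x -> 1.0, post_y -> 0.75
```

A real web of trust would need signatures, revocation, and far more nuance, but even this crude version shows how editorial credibility could be distributed across social media posts rather than vested in a single journal site.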
I wonder what the APA format for such a thing might be?
A thought-provoking article on a man who gate-crashed some top-tier university courses in the US as well as here in Canada. This was both politically and personally motivated. The man not only attended lectures and seminars but also got access to networks and direct interaction with professors, and received many of the benefits of a traditional university education bar the accreditation and support.
Attendance at lectures is rarely monitored and it has long been touted as one of the best free entertainment options available, if you choose your lectures wisely. I think I first recall reading about it in Playpower, a book on alternative culture from the early 70s. As the article suggests, it is generally pretty well accepted in many institutions and, as long as it doesn’t harm paying students (in this case I’d guess it was actually beneficial), it is hard to see why anyone would object. It might be a little trickier to get away with attending small tutorials or workshops involving restricted resources (e.g. needing logins to university computers), and it would not always be easy to get the accompanying documentation and schedules, but this seems eminently do-able for many courses. I don’t think it scales well – most universities would probably start to institute controls if a large number of non-paying students started to join in, especially if they used up rival resources like handouts or materials, or hogged tutor time in ways that harmed others. In some subjects it would not work at all. But it is quite an interesting perspective on openness. Face-to-face can be open too – in fact, it might be the default.
Makes me wonder about closed online courses and open MOOCs. Are we really as open as we claim? I think not so much.
There are lots of caveats and some rather serious and mostly pointless restrictions (publishers have not figured out how to let go yet) but, on balance, this is pretty darn good news.
This is a very interesting, if (I will argue) flawed, paper by Margaryan, Bianco and Littlejohn using a Course Scan instrument to examine the instructional design qualities of 76 randomly selected MOOCs (26 cMOOCs and 50 xMOOCs – the imbalance was caused by difficulties finding suitable cMOOCs). The conclusions drawn are that very few MOOCs, if any, show much evidence of sound instructional design strategies. In fact they are, according to the authors, almost all an instructional designer’s worst nightmare, on at least some dimensions.
I like this paper but I have some fairly serious concerns with the way this study was conducted, which means a very large pinch of salt is needed when considering its conclusions. The central problem lies in the use of prescriptive criteria to identify ‘good’ instructional design practice, and then using them as quantitative measures of things that are deemed essential to any completed course design.
Doubtful criteria
It starts reasonably well. Margaryan et al use David Merrill’s well-accepted abstracted principles for instructional design to identify kinds of activities that should be there in any course and that, being somewhat derived from a variety of models and theories, are pretty reasonable: problem centricity, activation of prior learning, expert demonstration, application and integration. However, the chinks begin to show even here, as it is not always essential that all of these are explicitly contained within a course itself, even though consideration of them may be needed in the design process – for example, in an apprenticeship model, integration might be a natural part of learners’ lives, while in an open ‘by negotiated outcome’ course (e.g. a typical European PhD) the problems may be inherent in the context. But, as a fair approximation of what activities should be in most conventional taught courses, it’s not bad at all, even though it might show some courses as ‘bad’ when they are in fact ‘good’.
The authors also add five more criteria abstracted from literature relating rather loosely to ‘resources’, including: expert feedback; differentiation (i.e. personalization); collaboration; authentic resources; and use of collective knowledge (i.e. cooperative sharing). These are far more contentious, with the exception of feedback, which almost all would agree should be considered in some form in any learning design (and which is a process thing anyway, not a resource issue). However, even this does not always need to be the expert feedback that the authors demand: automated feedback (which is, to be fair, a kind of ossified expert feedback, at least when done right), peer feedback or, best of all, intrinsic feedback can often be at least as good in most learning contexts. Intrinsic feedback (e.g. when learning to ride a bike, falling off it or succeeding to stay upright) is almost always better than any expert feedback, albeit that it can be enhanced by expert advice. None of the rest of these ‘resources’ criteria are essential to an effective learning design. They can be very useful, for sure, although it depends a great deal on context and how it is done, and there are often many other things that may matter as much or more in a design, like including support for reflection, for example, or scope for caring or passion to be displayed, or design to ensure personal relevance. It is worth noting that Merrill observes that, beyond the areas of broad agreement (which I reckon are somewhat shoehorned to fit), there is much more in other instructional design models that demands further research and that may be equally if not more important than those identified as common.
It ain’t what you do…
Like all things in education, it ain’t what you do but how you do it that makes all the difference, and it is all massively dependent on subject, context, learners and many other things. Prescriptive measures of instructional design quality like these make no sense when applied post-hoc because they ignore all this. They are very reasonable starting frameworks for a designer that encourage focus on things that matter and can make a big difference in the design process, but real life learning designs have to take the entire context into account and can (and often should) be done differently. Learning design (I shudder at the word ‘instructional’ because it implies so many unhealthy assumptions and attitudes) is a creative and situated activity. It makes no more sense to prescribe what kinds of activities and resources should be in a course than it does to prescribe how paintings should be composed. Yes, a few basics like golden ratios, rules of thirds, colour theory, etc can help the novice painter produce something acceptable, but the fact that a painting disobeys these ‘rules’ does not make it a bad painting: sometimes, quite the opposite. Some of the finest teaching I have ever seen or partaken of has used the most appalling instructional design techniques, by any theoretical measure.
Over-rigid assumptions and requirements
One of the biggest troubles with such general-purpose abstractions is that they make some very strong prior assumptions about what a course is going to be like and the context of delivery. Thanks to their closer resemblance to traditional courses (from which it should be clearly noted that the design criteria are derived) this is, to an extent, fair-ish for xMOOCs. But, even in the case of xMOOCs, the demand that collaboration, say, must occur is a step too far: as decades of distance learning research has shown (and Athabasca University proved for decades), great learning can happen without it and, while cooperative sharing is pragmatic and cost-effective, it is not essential in every course. Yes, these things are often a very good idea. No, they are not essential. Terry Anderson’s well-verified (and possibly self-confirming, though none the worse for it) theorem of interaction equivalency makes this pretty clear.
cMOOCs are not xMOOCs
Prescriptive criteria as a tool for evaluation make no sense whatsoever in a cMOOC context. This is made worse because the traditional model is carried to extremes in this paper, to the extent that the authors bemoan the lack of clear learning outcomes. This doesn’t naturally fall out from the design principles at all, so I don’t understand why they are even mentioned, and it seems an arbitrary criterion that has no validity or justification beyond the fact that they are typically used in university teaching. As teacher-prescribed learning outcomes are anathema to Connectivism it is very surprising indeed that the cMOOCs actually scored higher than the xMOOCs on this metric, which makes me wonder whether the means of differentiation were sufficiently rigorous. A MOOC that genuinely followed Connectivist principles would not provide learning outcomes at all: foci and themes, for sure, but not ‘at the end of this course you will be able to x’. And, anyway, as a lot of research and debate has shown, learning outcomes are of far greater value to teachers and instructional designers than they are to learners, for whom they may, if not handled with great care, actually get in the way of effective learning. It’s a process thing – helpful for creating courses, almost useless for taking them. The same problem occurs in the use of course organization in the criteria – cMOOC content is organized bottom-up by learners, so it is not very surprising that such courses lack careful top-down planning, and that is part of the point.
Apparently, some cMOOCs are not cMOOCs either
As well as concerns about the means of differentiating courses and the metrics used, I am also concerned with how they were applied. It is surprising that there was even a single cMOOC that didn’t incorporate use of ‘collective knowledge’ (the authors’ term for cooperative sharing and knowledge construction) because, without that, it simply isn’t a cMOOC: it’s there in the definition of Connectivism. As for differentiation, part of the point of cMOOCs is that learning happens through the network which, by definition, means people are getting different options or paths, and choosing those that suit their needs. The big point in both cases is that the teacher-designed course does not contain the content in a cMOOC: beyond the process support needed to build and sustain a network, any content that may be provided by the facilitators of such a course is just a catalyst for network formation and a centre around which activity flows and learner-generated content and activity is created. With that in mind it is worth pointing out that problem-centricity in learning design is an expression of teacher control which, again, is anathema to how cMOOCs work. Assuming that a cMOOC succeeds in connecting and mobilizing a network, it is all but certain that a great deal of problem-based and inquiry-based learning will be going on as people post, others respond, and issues become problematized. Moreover, the problems and issues will be relevant and meaningful to learners in ways that no pre-designed course can ever be. The content of a cMOOC is largely learner-generated so of course a problem focus is often simply not there in static materials supplied by people running it. cMOOCs do not tell learners what to do or how to do it, beyond very broad process support which is needed to help those networks to accrete. It would therefore be more than a little weird if they adhered to instructional design principles derived from teacher-led face-to-face courses in their designed content because, if they did, they would not be cMOOCs. Of course, it is perfectly reasonable to criticize cMOOCs as a matter of principle on these grounds: given that (depending on the network) few will know much about learning and how to support it, one of the big problems with connectivist methods is that of getting lost in social space, with insufficient structure or guidance to suit all learning needs, insufficient feedback, inefficient paths and so on. I’d have some sympathy with such an argument, but it is not fair to judge cMOOCs on criteria that their instigators would reject in the first place and that they are actively avoiding. It’s like criticizing cheese for not being chalky enough.
It’s still a good paper though
For all that I find the conclusions of this paper very arguable and the methods highly criticizable, it does provide an interesting portrait of MOOCs using an unconventional lens. We need more research along these lines because, though the conclusions are mostly arguable, what is revealed in the process is a much richer picture of the kinds of things that are and are not happening in MOOCs. These are fine researchers who have told an old story in a new way, and this is enlightening stuff that is worth reading.
As an aside, we also need better editors and reviewers for papers like this: little tell-tales like the fact that ‘cMOOC’ gets to be defined as ‘constructivist MOOC’ at one point (I’m sure it’s just a slip of the keyboard as the authors are well aware of what they are writing about) and more typos than you might expect in a published paper suggest that not quite enough effort went into quality control at the editorial end. I note too that this is a closed journal: you’d think that they might offer better value for the money that they cream off for their services.
This is a sign of what appear to be some remarkable seismic shifts at Microsoft. To be fair, Microsoft has long been a contributor to open source initiatives, but .NET was, until fairly recently, seen as one of the crown jewels, only slightly less significant than Windows and Office, which makes me and the writer of this article wonder whether they might be heading towards open sourcing these at some point (the Windows mobile version is already free, albeit with many provisos, terms and conditions, but that’s just common sense, as otherwise no one would use the substandard pile of pants at all).
Note that they are apparently only open-sourcing the core of .NET, which is not that wonderful without all the accompanying framework and goodies. The open source Mono project has provided this functionality for many years thanks to Microsoft’s wisely open approach to treating it and C# as a specification rather than a completely closed technology in the first place but, and it’s a big but, there are few Windows .NET apps that can run on Mono under Unix without some significant tweaking or acceptance of limitations and bugs, because so much relies on the premium libraries, controls and other proprietary closed tools that only paying Windows users can take advantage of. It’s much better than it used to be, but Mono is still a shim rather than a solution. I’m guessing there are few that would use it in preference to, say, Java unless their primary target were Windows machines or they were inveterate C# or VB fans.
This is probably not a sign of deeper openness, however. Microsoft, like most others in the industry, clearly sees that the future is in net-delivered, cloud-based subscription services. Azure, Office365, Skype, Exchange Online and the like are likely to be where most of the money comes from in the years ahead. .NET is nothing like as effective at locking people in as providing a service that handles all the data, communication and business processes of an individual or organization. Moreover, if more .NET developers can be sucked in to developing for other platforms, that means more who can be pulled in to Microsoft’s cloud systems though, to be fair, it does mean Microsoft has to actually compete on even ground to win, rather than solely relying on market dominance. But it does have a lot of cash to outspend many of its rivals, and raw computer power, together with the money to support it, plays a large role in achieving success in this area.
The cloud is a new (well, extremely old but now accepted and dominant) form of closed system in which the development technology really shouldn’t matter much any more. I worry a great deal about this, though. In the past we were just locked in by data formats, closed licences and closed software (perniciously driven by upgrade cycles that rendered what we had purchased obsolete and unsupported), but at least the data were under our control. Now they are not. I know of no cloud-based services that have not at some point changed terms and conditions, often for the worse, few that I would trust with my data any further than I could throw them, and none at all that are impervious to bankruptcy, take-overs and mergers. When this happened in the past we always had a little lead time to look for an alternative solution and our systems kept running. Nowadays, a business can be destroyed in the seconds it takes to shut down or alter a system in the cloud.