I hate change, especially when it is inflicted upon me

For at least the past 5 or 6 years I have been hosting the websites I care most about, including this one, with a good-value, mostly reliable provider (OVH) that has servers in Canada. I don’t dislike the company and I’m still paying them, though the value isn’t feeling so great right now, because they are soon to retire their old VPS solution on which my sites are hosted, forcing me to either leave them or ‘upgrade’ to one of their new plans. Of course, the cheapest plan that can fit what I already have is more expensive than the old one. If I had the time, I might look for an alternative, but Canada is not well served by companies that provide cheap, reliable virtual private servers. There’s no way I’m moving my sites to US hosting (guys, stop letting rich corporations decide your laws for you, or at least elect someone to your presidency who’s not a dead ringer for the antichrist). I do have servers elsewhere but I live here, and I like Canada more than any other country.

My new hosting plan might be a bit better than the old one in some ways but worse in others. I am now paying $15/month instead of $10 for something I didn’t need improved and that, for the most part, is not much better than it was. I have already lost a day or two of my own time to migration (with just one site mostly migrated), and expect to lose more as I migrate the rest, not to mention significant downtime when I (inevitably) mess things up, especially because, of course, I am ‘fixing’ a few things in the process. In fairness, OVH have given me 6 months of ‘free’ hosting by way of compensation but, given the amount of work I need to put into it and the increased cost over the long term, it’s not a good deal for me.

I do understand why things must change. You cannot run the same old servers forever because things decay, and complexity (in management, especially) inevitably increases. This is true of all technologies, from languages to bureaucracies, from vehicles to software. But this seems like a sneaky way to impose a price hike, rather than an inevitable need. More to the point, if I need to change the technologies my sites run on, I want to be the one who makes those choices, and I want to choose exactly when I make them. That’s precisely why I put up with the pain and hassle of managing my ‘own’ servers. Well, that and the fact that I figure a computing professor ought to have a rough idea about real-world computing, and having my own server does mean I can help out friends and family from time to time.

Way back in time I used to run servers for a living so, though the pace of change (in me and in the technologies I use) makes it more difficult to keep up than it used to be, I am not too scared of doing the hard stuff. I really like the control that managing a whole server gives me over everything. If it breaks, it’s my fault but, when it works, that’s down to me too. I’ve always told myself that, worst case, all I need to do is zip up the sites and move them, lock, stock and barrel, somewhere else, so I am not beholden to proprietary tools and APIs, nor do I have much complexity to worry about when things need to change. I’ve also always known that this belief is over-simplistic and overly optimistic, but I’ve tried to brush that under the carpet because it’s only a problem when it becomes a problem. Now it’s a problem.

On the bright side, I have steadfastly avoided cloud alternatives because they lock you in, in countless ways, eventually making you nothing but a reluctant cash cow for the cloud providers. This would have been many times worse had I picked a cloud solution: I have one small server to worry about rather than dozens of proprietary services, and everything on it is open and standardized. But path dependencies can lock you in too. Though I rarely make substantial changes – that way madness lies – I have made a surprising number of small decisions about the system over the past few years that, on the whole, I have mostly documented but that, en masse, are more than a slight pain to deal with. This site was down for hours today, for instance, while I struggled to figure out why it had suddenly decided that it knew nothing about SSL any more. It turned out to be due to a change in the Let’s Encrypt certificates (which had to be regenerated for the new site), some messiness with permissions that didn’t quite work the same way on the new servers (my bad for choosing this time to upgrade the operating system, but it was a job that needed doing), and some automation that wanted to change server configuration files I had expected to configure myself.

This kind of process can also reveal digital decay that you might not have noticed happening. Right now, for example, there appear to be about 50 empty files sitting in my media folder for reasons I am unsure of; they were almost certainly there on the old server too. I think they may be harmless, but I am bothered that I might have migrated over something that is not working and that might cause more problems in future. More hours of tedious effort ahead.
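Tracking the empty files down is, at least, the easy part. Something along the lines of the following would do it – a minimal sketch rather than anything I actually ran, and the media path is an assumption (WordPress, for example, normally keeps uploads under wp-content/uploads):

    <?php
    // Minimal sketch: list zero-byte files under a media directory so they can
    // be inspected before deciding whether they matter. The path is an
    // assumption; adjust it to suit.
    $mediaDir = '/var/www/html/wp-content/uploads';

    $files = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($mediaDir, FilesystemIterator::SKIP_DOTS)
    );

    foreach ($files as $file) {
        if ($file->isFile() && $file->getSize() === 0) {
            // Report rather than delete: safer to look before removing anything.
            echo $file->getPathname(), PHP_EOL;
        }
    }

Listing rather than deleting is deliberate: until I know what put them there, removing them might just hide another problem.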

The main thing that all this highlights to me, though, is something I too often try to ignore: that I do not own what I think I own any more. This is my site, but someone else has determined that it should change. All technologies tend towards entropy, be they poems, words, pyramids, or bicycles. They persist only through an active infusion of energy. I suppose I should therefore feel no worse about this than when a drain gets blocked or a lock needs replacing, but I do feel upset, because this is something I was paying someone else to deal with, and because there is absolutely nothing I could have done (or at least nothing that would not have been much more hassle) to prevent it. I have many similar ‘lifetime’ services that are equally tenuous, ‘lifetime’ referring only to the precarious lifespan of the company in its current state, before it chooses to change its policies, gets acquired by someone else, or simply goes out of business. A few of the main lessons I have learned from having too many such dependencies:

  • to keep it simple: small, easily replaceable services trump big, highly functional systems every single time.
  • to always maintain alternatives. Even if OVH had gone belly-up, I have mirrors on lesser sites that would have kept me going in a worst-case scenario, though it would have been harder work and less efficient to go down that path.
  • don’t trust any company, ever. They are not people so, even if they are lovely now, there is no guarantee that they will be next year, or tomorrow. And their purpose is to survive, and probably to make money, not to please you. You can trust people, but you cannot trust machines.
  • this is even true of the companies you work for. Much as I love my university, its needs and purposes only partially coincide with mine. The days of the Landing, for instance, a system into which I have poured much energy for well over 10 years, are very likely numbered, though I have no idea whether that means it has months or years left to live. Not my call, and not the call of any one individual (though someone will eventually sign its death warrant). With luck and concerted effort, it will evolve into something more wonderful but that’s not the point. Companies are not human, and they don’t think like humans.
  • if possible, stick with whatever defaults the software comes with or, at least, make sure that all changes are made in as few places as possible. It’s an awful pain to have to track down the tweaks you made when you move to a new system unless they are all in one easy-to-find place.
  • open standards are critical. There’s no point in getting great functionality if it relies on the goodwill of a company to maintain it, except where the value is unequivocally transient. I don’t much mind a trustworthy agent handling my spam filtering or web conferencing, for instance, though I’d not trust one to handle my instant messaging or site hosting, unless they are using standards that others are using. Open source solutions do die, and do lose support, but they are always there when you need them, and it is always possible to migrate, even if the costs may be high.

This site is now running on the new system, with a slightly different operating system and a few upgrades here and there. It might even be a little faster than the last version, eventually. I (as it turns out) wisely chose Linux and open source web software, so it continues to work, more or less as it did before, notwithstanding the odd major problem. If this had been a Windows or even a Mac site, though, it would have been dead long ago.

I have a bit of work to do on the styling here and there – I’m not sure quite what became of the main menu and, for the aforementioned reasons, I am reluctant to mess around with the CSS. If you happen to know me, or even if you don’t but can figure out how to deal with the anti-spam stuff in the comments section of this page, do tell me if you spot anything unusual.

Finally, if I’ve screwed up the syndication then you will probably not be reading this anyway. I’ve already had to kill the (weak) Facebook integration in order to make it work at all, though that’s good riddance and I’m happy to see it go. Twitter might be another matter: another set of proprietary APIs and, potentially, another fun problem to deal with tomorrow.

Addendum: so it turns out that I cannot save anything I write here. Darn. I thought it might be a simple problem with rewrite rules but that’s not it. When you read this, I will have found a solution (and it will probably be obvious, in retrospect) but it is making me tear my hair out right now.

Addendum to addendum: so I did screw up the syndication, and it was a simple problem with rewrite rules. After installing the good old-fashioned WordPress editor everything seemed fine, but I soon discovered that the permalinks were failing too, so (though it successfully auto-posted to Twitter) links I had shared to this post were failing. All the signs pointed to a problem with Apache redirection, but all my settings were quadruple-checked correct. After a couple of hours of fruitless hacking, I realized that the settings were quadruple-checked correct for the wrong domain name: jondron.org, which redirects here to jondron.ca but is still running on the old server, so is not working properly yet. Doh. I had even documented this, but failed to pay attention to my own notes. It’s a classic technology path-dependency leading to increased complexity, of exactly the kind that I refer to in my post. The history of it is that I used to use jondron.org as my home page, and that’s how the site was originally set up. I chose to switch to jondron.ca a few years ago because it seemed more appropriate and, rather than move the site itself to a new directory, I just changed everything in its database to use the jondron.ca domain name instead. Because I had shared the old site with many people, I set up a simple HTTP redirect from jondron.org to point to this one, and retained the virtual host on the server for that purpose. All perfectly logical, and just a small choice along the way, but with repercussions that have just taken up a lot of my time. I hope that I have remembered to reset everything after all the hacks I tried, but I probably haven’t. This is how digital decay sets in.
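For anyone wondering what that in-database domain switch amounts to, here is a minimal sketch – not my actual migration script, and the details are illustrative assumptions. It only changes WordPress’s two canonical URL options; URLs embedded in post content need a separate, serialization-aware search and replace:

    <?php
    // Minimal sketch of pointing an existing WordPress install at a new domain
    // without moving its files. Run from the WordPress root; the URLs are
    // illustrative assumptions.
    require __DIR__ . '/wp-load.php';   // bootstrap WordPress

    // WordPress keeps its canonical URLs in the options table.
    update_option('siteurl', 'https://jondron.ca');
    update_option('home', 'https://jondron.ca');

    echo 'Site URL is now: ' . get_option('siteurl') . PHP_EOL;

The redirect from the old domain (an Apache virtual host, in my case) still has to be maintained separately, which is exactly the sort of small extra moving part that comes back to bite you years later.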

Development work

I don’t think of myself as a web developer or a server administrator (though I used to do rather a lot of that in a former career) but I do dabble a little. I ran my first web server in early 1993 and have been doing so ever since. Among the courses I teach are web server management and web programming. I also occasionally develop web server software, most recently using the very wonderful Elgg social framework. My PhD revolved around the creation of a series of web applications, CoFIND (Collaborative Filter in N Dimensions), originally built using ASP (1997-2002) and, with funding received after the PhD was over, later re-imagined in PHP (2002-2005). This adaptive social bookmarking system was one of the earliest (if not the earliest) uses of tag clouds in an educational application (though such things did not even have a name back then) and one of the earliest (if not the earliest) examples of open-corpus adaptive hypermedia, and it employed social recommendation for learning in what I still think are pretty neat ways, although the interface made it almost entirely unusable outside an experimental context. In the late 1990s I helped build a learning management system, combining home-brewed PHP, ASP and Lotus Notes, at the University of Brighton, which included CoFIND as one of its components. In the early 2000s I built a Plone-based community networking site. Throughout the early to mid 2000s I built a series of experimental web applications designed to allow learners to help one another to learn, using self-organizing principles drawn from natural evolution, the dynamics of cities, and marketplaces. Since the mid to late 2000s I have mostly focused on building tools that, though less novel in design, are usable enough to enter a production environment.

These are a few currently running servers that I am fully or partially responsible for building and maintaining:

  • Athabasca Landing
    I am the architect and technical lead of the Landing, a social learning commons used at Athabasca University to provide a soft, controllable, social space designed to fill the gaps left between the hard, function-driven systems of the university and to support the spread of knowledge within and beyond the university. The site supports over 7000 users and over 400 groups, and features in several papers and our book, Teaching Crowds. It is built using Elgg. I have specified and managed the development of over 50 plugins for the site, around 10 of which I have written myself, most of which are available as open source to the Elgg community. Nearly all the posts on this site were originally posted on the Landing and imported here via RSS. The Landing has been the basis of a lot of my research over the past five years, as well as a platform for my teaching, and it takes up a fair bit of my time.
  • Teaching Crowds
    A WordPress site for the book of the same name by me and Terry Anderson
  • Curtin Commons
    An Elgg-based site that I built to support a MOOC run by Curtin University in 2014 (due to run again later this year). It uses a superset of the tools and plugins of the Landing, made to support a very different kind of community.
  • Ahead Academy (under development)
    An Elgg-based site to support Australian school children in beyond-the-school activities
  • Oasis for learning
    An Elgg-based site to support graduate students and their teachers. Not much developed, it has been ticking over for a few years now.
  • Sugar Cube
    A WordPress site for the Bowen Island cabin and gallery/event space run by my wife. She does the hard work – I just handle some of the tech side.

See also a couple of my old servers, built for research many years ago, that are still running. Sadly, the more radical versions of CoFIND are no longer running (they used an old, now insecure and unsupportable version of ASP) but there is a later and less exciting version of it there. There’s also a version of Dwellings, a system designed to replicate the dynamics of cities, based on the ideas of Jane Jacobs. At some point I hope to add a couple of other old systems.

I run an assortment of other small sites for friends and relatives.

From Representation to Emergence: Complexity's challenge to the epistemology of schooling – Osberg – 2008 – Educational Philosophy and Theory

This is my second post for today on the subject of boundaries and complex systems (yes, I am writing a paper!), this time pointing to a paper by Osberg, Biesta and Cilliers from 2008 that applies the concepts to knowledge and education. It’s a fascinating paper, drawing a theory of knowledge out of complex systems that the authors rather deftly fit with Dewey’s transactional realism and (far less compellingly) a bit of deconstructionism.

I think this sits very firmly within the connectivist family of theories (Stephen Downes may disagree!), albeit from a slightly different perspective. The context is the realm of complex (mostly complex adaptive) systems, but the notion of knowledge as an emergent and shifting phenomenon born of engagement – a process, not a product – and the significance of the connected whole in both enabling and embodying it all are firmly in the connectivist tradition. It comes at things from a different angle, but one that is well grounded in theory and that reaches quite a similar conclusion, aptly put:

education (becoming educated) is no longer about understanding a finished universe, or even about participating in a finished and stable universe. It is the result, rather, of participating in the creation of an unfinished universe.

The authors begin by defining what they describe as a ‘representational’ or ‘spatial’ epistemology that underpins most education. This is not quite as simplistic as it sounds – they include models and theories in this, at least. Their point is that education takes people out of ‘real life’ and therefore must rely on a means to represent ‘real life’ to do its job properly. I think this is pushing it a bit: yes, that is true of a fair amount of intentional teaching, but a lot of what goes on in education systems is unintentional, or emerges as a by-product of interaction, or happens in playgrounds, cafes, or common rooms; that is very different, and it is not just incidental to the process but quite critical to it. To pretend that educational systems are nothing but the explicit things we intentionally do to people is, I think deliberately, to create a bit of a straw man – they make much the same point themselves. I guess it is done to distinguish this from their solution, which is an ‘emergentist’ epistemology.

The really interesting stuff for me comes from Cilliers’s contribution (I’m guessing) on boundaries, which makes the simple and obvious point that complex systems (as opposed to complicated ones) are inherently incompressible, so any model we make of them is inaccurate: in leaving out the tiniest thing we make it impossible to make deterministic predictions, save in that we can create boundaries to focus on particular aspects we might care about and come up with probabilistic inferences (e.g. predicting the weather). Those boundaries are thus, of necessity, created (or, more accurately, negotiated), not discovered. They are value-laden. Thus:

“…models and theories that reduce the world to a system of rules or laws cannot be understood as pure representations of a universe that exists independently, but should rather be understood as valuable but provisional and temporary tools by means of which we constantly re-negotiate our understanding of and being in the world”

They go on…

We need boundaries around our regularities before we can model or theorise them, before we can find their rules of operation, because rules make sense only in terms of boundaries. The point is that the setting of the boundary creates the condition of possibility for a rule or a law to exist. When a boundary is not naturally given, as is the case with natural complex systems, the rules that we ‘discover’ also cannot be understood as naturally given. Rules and ‘laws’ are not ‘real’ features of the systems we theorise about. Theories that attempt to reduce complexity to a system of rules or laws, like our models which do precisely this, therefore cannot be understood as pictures of reality.

So, the rules that we find are pragmatic ones – they are tools, rather than pictures of reality, that help us to renegotiate our world and the meaning we make in and of it:

“From this perspective, knowledge is not about ‘the world’ as such, it is not about truth; rather, it is about what we can do in the world, how we can change it. One could say ‘acquiring’ knowledge does not ‘solve’ problems for us: it creates problems for us to solve.”

At this point they come round to Dewey, whose transactional model is not about finding out about the world but leads to a constantly emerging and ever renegotiated state of being.

“…in acting, we create knowledge, and in creating knowledge, we learn to act in different ways and in acting in different ways we bring about new knowledge which changes our world, which causes us to act differently, and so on, unendingly. There is no final truth of the matter, only increasingly diverse ways of interacting in a world that is becoming increasingly complex.”

One of the more significant aspects of this, that is not dwelt on anything like enough in this paper but that forms a consistent subtext, is that this is a fundamentally social pursuit. This is a complex system not just of individuals negotiating an active relationship with the world, but of people doing it together, as part of a complex system that drives its own adaptation, at every scale and within every (overlapping, interpenetrating) boundary.

They continue with an, I think, unsuccessful attempt to align this perspective with postmodernist/poststructuralist/deconstructionist theory, claiming that Dillon’s differentiation between the radical relationality of complexity and poststructuralist theorists is illusory, because a complex system is always in a state of becoming without being, so it is much the same kind of thing. Whether or not this is true, I don’t think it adds anything significant to the arguments.

The paper rushes to a rather unsatisfactory conclusion – at last hitting the promised topic of the title – about the role of this emergentist epistemology in schooling:

Acquisition is no longer the name of the game …. This means questions about what to present in the curriculum and whether these things should be directly presented or should be represented (such that children may acquire knowledge of these things most efficiently or effectively) are no longer relevant as curricular questions. While content is important, the curriculum is less concerned with what content is presented and how, and more with the idea that content is engaged with and responded to …. Here the content that is engaged is not pre-given, but emerges from the educative situation itself. With this conception of knowledge and the world, the curriculum becomes a tool for the emergence of new worlds rather than a tool for stabilisation and replication

This follows quite naturally and makes sense, but it diminishes the significance of a pretty obvious elephant in the room: the educational institution itself is one of those boundaried systems, and it plays a huge role in and of itself, not to mention in interaction with other boundaried systems, regardless of the processes enacted within its boundaries. I think this is symptomatic of a big gap that the paper very much implies but barely attempts to address, which is that processes, structures, rules, tools, objects, content (whatever that is!), media, and a host of other things are all parts of those complex systems. Knowledge is indeed a dynamic process, a state of becoming or of being, but it incorporates a great many things, only a limited number of which are in the minds of individuals. It’s not about people learning – it’s about that whole, massive, complex adaptive system itself.

Address of the bookmark: http://onlinelibrary.wiley.com/doi/10.1111/j.1469-5812.2007.00407.x/abstract