I hate change, especially when it is inflicted upon me

For at least the past 5 or 6 years I have been hosting the websites I care most about, including this one, with a good-value, mostly reliable provider (OVH) that has servers in Canada. I don’t dislike the company and I’m still paying them, though the value isn’t feeling so great right now, because they are soon to retire their old VPS solution on which my sites are hosted, forcing me to either leave them or ‘upgrade’ to one of their new plans. Of course, the cheapest plan that can fit what I already have is more expensive than the old one. If I had the time, I might look for an alternative, but Canada is not well served by companies that provide cheap, reliable virtual private servers. There’s no way I’m moving my sites to US hosting (guys, stop letting rich corporations decide your laws for you, or at least elect someone to your presidency who’s not a dead ringer for the antichrist). I do have servers elsewhere but I live here, and I like Canada more than any other country.

My new hosting plan might be a bit better than the old one in some ways but worse in others. I am now paying $15/month instead of $10 for something I didn’t need to be improved, and that is mostly not much better than it was. I have lost a day or two of my own time to migration already (with just one site mostly migrated), and expect to lose more as I migrate more sites, not to mention significant downtime when I (inevitably) mess things up, especially because, of course, I am ‘fixing’ a few things in the process. In fairness, OVH have given me 6 months of ‘free’ hosting by way of compensation but, given the amount of work I need to put into it and the increased cost over the long term, it’s not a good deal for me.

I do understand why things must change. You cannot run the same old servers forever because things decay, and complexity (in management, especially) inevitably increases. This is true of all technologies, from languages to bureaucracies, from vehicles to software. But this seems like a sneaky way to impose a price hike, rather than an inevitable need. More to the point, if I need to change the technologies my sites run on, I want to be the one who makes those choices, and I want to choose exactly when I make them. That’s precisely why I put up with the pain and hassle of managing my ‘own’ servers. Well, that and the fact that I figure a computing professor ought to have a rough idea about real world computing, and having my own server does mean I can help out friends and family from time to time.

Way back in time I used to run servers for a living so, though the pace of change (in me and technologies I use) makes it more difficult to keep up than it used to be, I am not too scared about doing the hard stuff. I really like the control that managing a whole server gives me over everything. If it breaks, it’s my fault, but it’s also my responsibility when it works. I’ve always told myself that, worst case, all I need to do is to zip up the sites and move them lock, stock, and barrel somewhere else, so I am not beholden to proprietary tools and APIs, nor do I have much complexity to worry about when things need to change. I’ve also always known that this belief is over-simplistic and overly optimistic, but I’ve tried to brush that under the carpet because it’s only a problem when it becomes a problem. Now it’s a problem.

On the bright side, I have steadfastly avoided cloud alternatives because they lock you in, in countless ways, eventually making you nothing but a reluctant cash cow for the cloud providers. This would have been many times worse if I had picked a cloud solution. I have one small server to worry about rather than dozens of proprietary services, and everything on it is open and standardized. But path dependencies can lock you in too. Though I rarely make substantial changes – that way madness lies – I have made a surprising number of small decisions about the system over the past few years. I have mostly documented them but, en masse, they are more than a slight pain to deal with. This site was down for hours today, for instance, while I struggled to figure out why it had suddenly decided that it knew nothing about SSL any more. It turned out to be a combination of things: the Let’s Encrypt certificates had to be regenerated for the new site; permissions didn’t quite work the same way on the new servers (my bad for choosing this time to upgrade the operating system, but it was a job that needed doing); and some automation wanted to change server configuration files that I expected to configure myself. This kind of process can also reveal digital decay that you might not have noticed happening. Right now, for example, there appear to be about 50 empty files sitting in my media folder, almost certainly carried over from the old server, for reasons I am unsure of. I think they may be harmless, but I am bothered that I might have migrated something that is not working and that might cause more problems in future. More hours of tedious effort ahead.
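Incidentally, hunting down files like those 50 empty ones is quick on any Unix-like server. A sketch (the throwaway directory here is just a demonstration; on the real server you would point `find` at the site’s media folder instead):

```shell
# Demo of finding empty files, the sort of check that turned up my
# mystery 50. Uses a temporary directory so it is safe to run anywhere.
dir=$(mktemp -d)
touch "$dir/empty-upload.jpg"               # an empty file, like mine
printf 'real data' > "$dir/real-upload.jpg" # a normal, non-empty file

find "$dir" -type f -empty                  # lists only the empty file
find "$dir" -type f -empty | wc -l          # counts them: 1

rm -rf "$dir"
```

Adding `-delete` to the `find` command would remove them, though given my uncertainty about what they are for, I am not doing that yet.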

The main thing that all this highlights to me, though, is something I too often try to ignore: that I do not own what I think I own any more. This is my site, but someone else has determined that it should change. All technologies tend towards entropy, be they poems, words, pyramids, or bicycles. They persist only through an active infusion of energy. I suppose I should therefore feel no worse about this than when a drain gets blocked or a lock needs replacing, but I do feel upset, because this is something I was paying someone else to deal with, and because there is absolutely nothing I could have done (or at least nothing that would not have been much more hassle) to prevent it. I have many similar ‘lifetime’ services that are equally tenuous, ‘lifetime’ referring only to the precarious lifespan of the company in its current state, before it chooses to change its policies or gets acquired by someone else, or simply goes out of business. A few of the main things I have learned through having too many such things are:

  • to keep it simple: small, easily replaceable services trump big, highly functional systems every single time.
  • to always maintain alternatives. Even if OVH had gone belly-up, I still have mirrors on lesser sites that would keep me going in a worst-case scenario, though it would have been harder work and less efficient to have gone down that path.
  • to never trust any company, ever. They are not people so, even if they are lovely now, there is no guarantee that they will be next year, or tomorrow. And their purpose is to survive, and probably to make money, not to please you. You can trust people, but you cannot trust machines.
  • to remember that this is even true of the companies you work for. Much as I love my university, its needs and purposes only partially coincide with mine. The days of the Landing, for instance, a system into which I have poured much energy for well over 10 years, are very likely numbered, though I have no idea whether that means it has months or years left to live. Not my call, and not the call of any one individual (though someone will eventually sign its death warrant). With luck and concerted effort, it will evolve into something more wonderful, but that’s not the point. Companies are not human, and they don’t think like humans.
  • to stick, if possible, with whatever defaults the software comes with or, at least, to make sure that all changes are made in as few places as possible. It’s an awful pain to have to find the tweaks you made when you move to a new system unless they are all in one easy-to-find place.
  • to insist on open standards. There’s no point in getting great functionality if it relies on the goodwill of a company to maintain it, except where the value is unequivocally transient. I don’t much mind a trustworthy agent handling my spam filtering or web conferencing, for instance, though I’d not trust one to handle my instant messaging or site hosting unless they are using standards that others are using. Open source solutions do die, and do lose support, but they are always there when you need them, and it is always possible to migrate, even if the costs may be high.

This site is now running on the new system, with a slightly different operating system and a few upgrades here and there. It might even be a little faster than the last version, eventually. I (as it turns out) wisely chose Linux and open source web software, so it continues to work, more or less as it did before, notwithstanding the odd major problem. If this had been a Windows or even a Mac site, though, it would have been dead long ago.

I have a bit of work to do on the styling here and there – I’m not sure quite what became of the main menu and (for aforementioned reasons) am reluctant to mess around with the CSS. If you happen to know me, or even if you don’t but can figure out how to deal with the anti-spam stuff in the comments section of this page, do tell me if you spot anything unusual.

Finally, if I’ve screwed up the syndication then you will probably not be reading this anyway. I’ve already had to kill the (weak) Facebook integration in order to make it work at all, though that’s good riddance as far as I’m concerned. Twitter might be another matter, though. Another set of proprietary APIs and, potentially, another fun problem to deal with tomorrow.

Addendum: so it turns out that I cannot save anything I write here. Darn. I thought it might be a simple problem with rewrite rules but that’s not it. When you read this, I will have found a solution (and it will probably be obvious, in retrospect) but it is making me tear my hair out right now.

Addendum to addendum: so I did screw up the syndication, and it was a simple problem with rewrite rules. After installing the good old-fashioned WordPress editor everything seemed fine, but I soon discovered that the permalinks were failing too, so (though it successfully auto-posted to Twitter) links I had shared to this post were failing. All the signs pointed to a problem with Apache redirection, but all my settings were quadruple-checked correct. After a couple of hours of fruitless hacking, I realized that the settings were quadruple-checked correct for the wrong domain name (jondron.org, which actually redirects here to jondron.ca, but that is still running on the old site so not working properly yet). Doh. I had even documented this, but failed to pay attention to my own notes. It’s a classic technology path-dependency leading to increased complexity of exactly the kind that I refer to in my post.

The history of it is that I used to use jondron.org as my home page, and that’s how the site was originally set up, but I chose to switch to jondron.ca a few years ago because it seemed more appropriate and, rather than move the site itself to a new directory, I just changed everything in its database to use the jondron.ca domain name instead. Because I had shared the old site with many people, I set up a simple HTTP redirect from jondron.org to point to this one, and had retained the virtual host on the server for this purpose. All perfectly logical, and just a small choice along the way, but with repercussions that have just taken up a lot of my time. I hope that I have remembered to reset everything after all the hacks I tried, but I probably haven’t. This is how digital decay sets in.
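For the record, the kind of virtual host at the root of all this is tiny, which is exactly why it is so easy to forget about. An Apache redirect of this sort looks something like the following sketch (the domain names are real; treat the rest as illustrative rather than my actual configuration):

```apache
# A minimal Apache virtual host that does nothing but bounce the old
# domain to the new one. Illustrative sketch, not my exact config.
<VirtualHost *:80>
    ServerName jondron.org
    ServerAlias www.jondron.org
    # Permanent (301) redirect of every path to the new domain
    Redirect permanent / https://jondron.ca/
</VirtualHost>
```

Harmless in itself, but a stray vhost like this is precisely the kind of small, logical, long-forgotten decision that quadruple-checked settings on the wrong domain are made of.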

Turns out the STEM ‘gender gap’ isn’t a gap at all

Grace Hopper and Univac, image from en.wikipedia.org/wiki/Grace_Hopper

At least in Ontario, it seems that there are about as many women as men taking STEM programs at undergraduate level. This represents a smaller percentage of women taking STEM subjects overall because there are way more women entering university in the first place. A more interesting reading of this, therefore, is not that we have a problem attracting women to science, technology, engineering, and mathematics, but that we have a problem attracting men to the humanities, social sciences, and the liberal arts. As the article puts it:

“it’s not that women aren’t interested in STEM; it’s that men aren’t interested in poetry—or languages or philosophy or art or all the other non-STEM subjects.”

That’s a serious problem.

As someone with qualifications in both (incredibly broad) areas, and interests in many sub-areas of each, I find the arbitrary separation between them to be ludicrous, leading to no end of idiocy at both extremes, and little opportunity for cross-fertilization in the middle. It bothers me greatly that technology subjects like computing or architecture should be bundled with sciences like biology or physics, but not with social sciences or arts, which are way more relevant and appropriate to the activities of most computer professionals. In fact, it bothers me that we feel the need to separate out large fields like this at all. Everyone pays lip service to cross-disciplinary work but, when we try to take that seriously and cross the big boundaries, there is so much polarization between the science and arts communities that they usually don’t even understand one another, let alone work in harmony. We don’t just need more men in the liberal arts – we need more scientists, engineers, and technologists to cross those boundaries, whatever their gender. And, vice versa, we need more liberal artists (that sounds odd, but I have no better term) and social scientists in the sciences and, especially, in technology.

But it’s also a problem of category errors in the other direction. This clumping together of the whole of STEM conceals the fact that in some subjects – computing, say – there actually is a massive gender imbalance (including in Ontario), no matter how you mess with the statistics. This is what happens when you try to use averages to talk about specifics: it conceals far more than it reveals.
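The point is easy to demonstrate with invented numbers (these are deliberately made up, not Ontario’s actual enrolment figures):

```python
# Toy illustration (invented numbers, not real enrolment data) of how an
# aggregate can conceal a specific imbalance: near-parity in "STEM"
# overall, alongside a large gap in one subject within it.
enrolment = {
    # subject: (women, men)
    "biology":   (6000, 4000),
    "chemistry": (3000, 3000),
    "computing": (1000, 4000),
}

women = sum(w for w, m in enrolment.values())
men = sum(m for w, m in enrolment.values())
print(f"STEM overall: {women / (women + men):.0%} women")  # looks balanced

w, m = enrolment["computing"]
print(f"Computing:    {w / (w + m):.0%} women")  # the average hides this
```

The aggregate comes out near 50/50 while computing sits at 20% women: the average conceals exactly what matters.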

I wish I knew how to change that imbalance in my own designated field of computing, an area that I deliberately chose precisely because it cuts across almost every other field and did not limit me to doing one kind of thing. I do arts, science, social science, humanities, and more, thanks to working with machines that cross virtually every boundary.

I suspect that fixing the problem has little to do with marketing our programs better, nor with any such surface efforts that focus on the symptoms rather than the cause. A better solution is to accept and to celebrate the fact that the field of computing is much broader and vastly more interesting than the tiny subset of it that can be described as computer science, and to build up from there. It’s especially annoying that the problem exists at Athabasca, where a wise decision was made long ago not to offer a computer science program. We have computing and information systems programs, but not any programs in computer science. Unfortunately, thanks to a combination of lazy media and computing profs (suffering from science envy) who promulgate the nonsense, even good friends of mine who should know better sometimes describe me as a computer scientist (I am emphatically not), and even some of our own staff think of what we do as computer science. To change that perception means not just a change in nomenclature, but a change in how and what we, at least in Athabasca, teach. For example, we might mindfully adopt an approach that contextualizes computing around projects and applications, rather than its theory and mechanics. We might design a program that doesn’t just lump together a bunch of disconnected courses and call it a minor but that, in each course (if courses are even needed), actively crosses boundaries – to see how code relates to poetry, how art can inform and be informed by software, how understanding how people behave can be used in designing better systems, how learning is changed by the tools we create, and so on.

We don’t need disciplines any more, especially not in a technology field. We need connections. We don’t need to change our image. We need to change our reality. I’m finding that to be quite a difficult challenge right now.


Address of the bookmark: http://windsorstar.com/opinion/william-watson-turns-out-the-stem-gender-gap-isnt-a-gap-at-all/wcm/ee4217ec-be76-4b72-b056-38a7981348f2

Originally posted at: https://landing.athabascau.ca/bookmarks/view/2929581/turns-out-the-stem-%E2%80%98gender-gap%E2%80%99-isn%E2%80%99t-a-gap-at-all

Evidence mounts that laptops are terrible for students at lectures. So what?

The Verge reports on a variety of studies that show taking notes with laptops during lectures results in decreased learning when compared with notes taken using pen and paper. This tells me three things, none of which is what the article is aiming to tell me:

  1. That the institutions are teaching very badly. Countless decades of far better evidence than that provided in these studies shows that giving lectures with the intent of imparting information like this is close to being the worst way to teach. Don’t blame the students for poor note taking, blame the institutions for poor teaching. Students should not be put in such an awful situation (nor should teachers, for that matter). If students have to take notes in your lectures then you are doing it wrong.
  2. That the students are not skillful laptop notetakers. These studies do not imply that laptops are bad for notetaking, any more than giving students violins that they cannot play implies that violins are bad for making music. It ain’t what you do, it’s the way that you do it. If their classes depend on effective notetaking then teachers should be teaching students how to do it. But, of course, most of them probably never learned to do it well themselves (at least using laptops). It becomes a vicious circle.
  3. That laptop and, especially, software designers have a long way to go before their machines disappear into the background like a pencil and paper. This may be inherent in the medium, inasmuch as a) they are vastly more complex toolsets with much more to learn about, and b) interfaces and apps constantly evolve so, as soon as people have figured out one of them, everything changes under their feet. Another vicious circle.

The extra cognitive load involved in manipulating a laptop app (and stopping the distractions that manufacturers seem intent on providing, even if you have the self-discipline to avoid proactively seeking them yourself) can be a hindrance unless you are proficient to the point that it becomes an unconscious behaviour. Few of us are. Tablets are a better bet, for now, though they too are becoming overburdened with unsought complexity and unwanted distractions.

I have for a couple of years now been taking most of my notes at conferences etc. with an Apple Pencil and an iPad Pro, because I like the notetaking flexibility, the simplicity, the lack of distraction (albeit that I have to actively manage that), and the tactile sensation of drawing and doodling. All of that likely contributes to making it easier to remember stuff that I want to remember. The main downside is that, though I still gain laptop-like benefits of everything being in one place, of digital permanence, and of it being distributed to all my devices, I have, in the process, lost a bit in terms of searchability and reusability. I may regret it in future, too, because graphic formats tend to be less persistent over decades than text. On the bright side, using a tablet, I am not stuck in one app. If I want to remember a paper or URL (which is most of what I normally want to remember, other than my own ideas and connections that are sparked by the speaker) I tend to look it up immediately and save it to Pocket so that I can return to it later, and I do still make use of a simple notepad for things I know I will need later. Horses for courses, and you get a lot more of both with a tablet than you do with a pencil and paper. And, of course, I can still use pen and paper if I want a throwaway single-use record – conference programs can be useful for that.

Address of the bookmark: https://www.theverge.com/2017/11/27/16703904/laptop-learning-lecture

Originally posted at: https://landing.athabascau.ca/bookmarks/view/2871283/evidence-mounts-that-laptops-are-terrible-for-students-at-lectures-so-what

Teens unlikely to be harmed by moderate digital screen use

The results of quite a large study (120,000 participants) appear to show that ‘digital’ screen time, on average, correlates with increased well-being in teenagers up to a certain point, after which the correlation is, on average, mildly negative (but not remotely as bad as, say, skipping breakfast). There is a mostly implicit assumption, or at least speculation, that the effects are in some way caused by use of digital screens, though I don’t see strong signs of any significant attempts to show that in this study.

While this accords with common sense – if not with the beliefs of a surprising number of otherwise quite smart people – I am always highly sceptical of studies that average out behaviour, especially for something as remarkably vague as engaging with technologies that are related only insofar as they involve a screen. This is especially the case given that screens themselves are incredibly diverse – there’s a world of difference between the screens of an e-ink e-reader, a laptop, and a plasma TV, for instance, quite apart from the infinite range of possible different ways of using them, devices to which they can be attached, and activities that they can support. It’s a bit like doing a study to identify whether wheels or transistors affect well-being. It ain’t what you do, it’s the way that you do it. The researchers seem aware of this. As they rightly say:

“In future work, researchers should look more closely at how specific affordances intrinsic to digital technologies relate to benefits at various levels of engagement, while systematically analyzing what is being displaced or amplified,” Przybylski and Weinstein conclude. 

Note, though, the implied belief that there are effects to analyze. This remains to be shown. 

Address of the bookmark: https://www.eurekalert.org/pub_releases/2017-01/afps-tut011217.php

Moral panic: Japanese girls risk fingerprint theft by making peace-signs in photographs / Boing Boing

As Cory Doctorow notes, why this headline should single out Japanese girls as being particularly at risk – and that this is the appeal of it – is much more disturbing than the fact that someone figured out how to lift fingerprints that can be used to access biometric authentication systems from photos taken using an ‘ordinary camera’ at a considerable distance (3 metres). He explains the popularity of the news story thus:

I give credit to the news-hook: this is being reported as a risk that young women put themselves to when they flash the peace sign in photos. Everything young women do — taking selfies, uptalking, vocal fry, using social media — even reading novels! — is presented as a) unique to young women (even when there’s plenty of evidence that the trait or activity is spread among people of all genders and ages) and b) an existential risk to the human species (as in, “Why do these stupid girls insist upon showing the whole world their naked fingertips? Slatterns!”)

The technical feat intrigued me, so I found a few high-res scans of pictures of Churchill making the V sign, taken on very good medium or large format film cameras (from that era, 5″x4″ press cameras were most common, though some might have been taken on smaller formats and/or cropped) with excellent lenses, by professional photographers, under various lighting conditions, from roughly that distance. While, on the very best, with cross-lighting, a few finger wrinkles and creases were partly visible, there was no sign of a single whorl, and nothing like enough detail for even a very smart algorithm to figure out the rest. So, with a tiny fraction of the resolution, I don’t think you could just lift an image from the web, a phone, or even from a good compact camera to steal someone’s fingerprints unless the range were much closer and you were incredibly lucky with the lighting conditions and focus. That said, a close-up selfie using an iPhone 7+, with focus on the fingers, might well work, especially if you used burst mode to get slightly different images (I’m guessing you could mess with bas relief effects to bring out the details). You could also do it if you set out to do it. With something like a good 400mm-equivalent lens in bright light, low ISO, cross-lighting, a large-sensor camera (APS-C or bigger), high resolution, good focus, and a small aperture, there would probably be enough detail.
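For what it’s worth, a back-of-envelope pinhole calculation supports that intuition. The ridge spacing and camera numbers below are my own assumptions, not from the article; the model ignores diffraction, focus, and motion blur, so it is generous to the attacker:

```python
def mm_per_pixel(distance_m, focal_mm, sensor_width_mm, pixels_across):
    """Size of one pixel projected onto the subject plane (mm),
    using a simple pinhole-camera model."""
    return (distance_m * 1000) * sensor_width_mm / (focal_mm * pixels_across)

RIDGE_SPACING_MM = 0.5            # typical fingerprint ridge period (approx.)
NEEDED_MM_PER_PX = RIDGE_SPACING_MM / 2  # ~Nyquist: at least 2 px per ridge

# A 50 mm lens on a 36 mm-wide full-frame sensor, 6000 px across, at 3 m:
print(mm_per_pixel(3, 50, 36, 6000))   # 0.36 mm/px: ridges unresolvable

# The 400 mm telephoto case, same sensor, resolution, and distance:
print(mm_per_pixel(3, 400, 36, 6000))  # 0.045 mm/px: easily enough detail
```

A normal lens at 3 m gives roughly 0.36 mm per pixel, well short of the ~0.25 mm needed to resolve ridges, while the 400mm-equivalent gets there with room to spare, which is consistent with what the Churchill scans showed.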

Address of the bookmark: https://boingboing.net/2017/01/12/moral-panic-japanese-girls-ri.html

Setapp – Netflix-style rental model for apps for Mac

Interesting. For US$10/month, you get unlimited access to the latest versions of what is promised to be around 300 commercial Mac apps. Looking at the selection so far (about 50 apps), these appear to be of the sort that usually appear in popular app bundles (e.g. StackSocial etc.), in which you can buy apps outright for a tiny fraction of the list price (quite often at a 99% reduction). I have a few of these already, for which I paid an average of 1 or 2 dollars apiece, albeit that they came with a bunch of useless junk that I did not need or already owned, so perhaps it’s more realistic to say they average more like $10 apiece. Either way, they can already be purchased for very little money, if you have the patience to wait for the right bundle to arrive. So why bother with this?

The main advantage of Setapp’s model is that, unlike apps bought in bundles, which often nag you to upgrade to the next version at a far higher price than you paid almost as soon as you get them, you always get the latest version. It is also nice to have on-demand access to a whole library at any time: if you can wait for a few months, apps like these will probably turn up in a cheap pay-what-you-want bundle anyway, but they are only rarely available when you actually need them. I guess there is a small advantage in the curation service, but there are plenty of much better and less inherently biased ways to discover tools that are worth having.

The very notable disadvantage is that you never actually own the apps – once you stop subscribing or the company changes conditions/goes bust, you lose access to them. For ephemerally useful things like disk utilities, conversion tools, etc. this is no great hassle but, for things that save files in proprietary formats or supply a cloud service (many of them), this would be a massive pain. As there is (presumably) some mechanism for updating and checking licences, this might also be an even more massive pain if you happen to be on a plane or out of network range when either the app checks in or the licence is renewed. I don’t know which method Setapp uses to ensure that you have a subscription but, one way or another, lack of network access at some point in the proceedings could really screw things up. When (with high probability) Setapp goes bust, you will be left high and dry. Also, I’m guessing that it is unlikely that I would want more than a dozen or thereabouts of these in any given year, so each would cost me about $10 every year at the best of times. Though that might be acceptable for a major bit of software on which one’s livelihood depends, for the kind of software that is currently on show, that’s quite a lot of money, notwithstanding the convenience of being able to pick up a specialist tool when you need it at no extra cost.
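The arithmetic behind that per-app estimate is trivial, but worth making explicit (the dozen-apps figure is my own guess, as above):

```python
# Back-of-envelope cost of the subscription per app actually used
# (my rough numbers; only the $10/month fee comes from Setapp).
monthly_fee = 10          # USD per month
apps_actually_used = 12   # "a dozen or thereabouts" in a given year

annual_cost = monthly_fee * 12
per_app_per_year = annual_cost / apps_actually_used
print(per_app_per_year)   # 10.0 USD per app, per year, indefinitely
```

Against bundle prices of a dollar or two per app, owned outright, $10 per app every single year is a hard sell.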

This is a fairly extreme assault on software ownership, but closed-source software of all varieties suffers from the same basic problem: you don’t own the software that you buy. Unlike use-once objects like movies or books, software tends to be of continuing value. The obvious solution is to avoid closed-source altogether and go for open source right the way down the stack: that’s always my preference. Unfortunately, there are still commercial apps that I find useful enough to pay for and, unfortunately, software decays. Even if you buy something outright that does the job perfectly, at some point the surrounding ecosystems (the operating system, network, net services, etc.) will most likely render it useless or positively dangerous. There are also some doubly annoying cases where companies stop supporting versions, lose databases, or get taken over by other companies, so software that you once owned and paid for is suddenly no longer yours (Cyberduck, I’m looking at you). Worst of all are those that depend on a cloud service over which you have no control at all and that will almost certainly go bust, or get taken over, or be subject to cyberattack, or government privacy breaches, or be unavailable when you need it, or that will change terms and conditions at some point to your extreme disadvantage. Though there may be a small niche for such things and the immediate costs are often low enough to be tempting, as a mainstream approach to software provision, it is totally unsustainable.


Address of the bookmark: https://setapp.com/

Pebble dashed

Hell.

Pebble made my favourite smart watches. They were somewhat open, and the company understood the nature of the technology better than any of the mainstream alternatives. Well, at least they used to get it, until they started moving towards turning them into glorified fitness trackers, which is probably why the company is now being purchased by Fitbit.

So, no more Pebble and, worse, no more support for those that own (or, technically, paid for the right to use) a Pebble. If it were an old fashioned watch I’d grumble a bit about reneging on warranties but it would not prevent me from being able to use it. Thanks to the cloud service model, the watch will eventually stop working at all:

“Active Pebble watches will work normally for now. Functionality or service quality may be reduced down the road. We don’t expect to release regular software updates or new Pebble features.”

Great. The most expensive watch I have ever owned has a shelf life of months, after which it will likely not even tell the time any more (this has already occurred on several occasions when it has crashed while I have not been on a viable network). On the bright side (though note the lack of promises):

“We’re also working to reduce Pebble’s reliance on cloud services, letting all Pebble models stay active long into the future.”

Given that nearly all the core Pebble software is already open source, I hope that this means they will open source the whole thing. This could make it better than it has ever been. Interesting – the value of the watch would be far greater without the cloud service on which it currently relies. 


Address of the bookmark: https://www.kickstarter.com/projects/597507018/pebble-2-time-2-and-core-an-entirely-new-3g-ultra/posts/1752929

Open Whisper Systems

The Signal protocol is designed for secure, private, encrypted messaging and real-time calling. The protocol, designed by Open Whisper Systems, is used in an increasingly large range of tools (including by Facebook and Google), but their own app is the most interesting application of it. 

The (open, GPL) Signal app is a secure, private messaging and voice chat app for iOS and Android, offering guaranteed and strong end-to-end encryption without having to sign up for a service with dubious privacy standards or further agendas (e.g. Facebook, Apple, Google, Whatsapp, Viber, etc.). There are no ads and no account details kept by the company, there is no means for them (or anyone) to store or intercept messages or calls, and the organization is funded by donations and grants. The app uses your phonebook to discover other contacts using Signal – I don’t have many yet, but hopefully a few of my contacts will see this and install it. Call quality seems excellent – as good as Skype used to be before Microsoft maimed it – though I haven’t used it enough yet to assess its reliability. One disadvantage is that, if you have more than one phone and phone number, there seems to be no obvious way to link them together. That’s a particular nuisance on a dual-SIM phone.

It needs a real, verified phone number to get started but, once you have done that, you can link it to other devices too, including PCs (via Chrome or a Chrome-based browser like the excellent Vivaldi), using a simple QR code (no accounts!) so this is a potentially great replacement for things like Whatsapp, Skype, Allo, Viber, etc. No video calling yet, though you can send video messages (and most other things).


Address of the bookmark: https://whispersystems.org/#page-top

Get that “new Mac” smell all the time with a $24 scented candle

Some time ago, while comparing the virtues of paper and electronic books, I predicted that the current generation would one day wax lyrical about the smell of a new iPhone much as those from my generation get gooey over the scent of old books.

That day has arrived.

Address of the bookmark: http://www.alphr.com/apple/1004449/get-that-new-mac-smell-all-the-time-with-a-24-scented-candle

Sole and Despotic Dominion

Cory Doctorow is on excellent form discussing the evils of DRM and the meaning of ownership. The title is lifted from William Blackstone, referring to what it means to own something –  “that sole and despotic dominion which one man claims and exercises over the external things of the world, in total exclusion of the right of any other individual in the universe.” Doctorow’s central argument here is that, at least in the US (where DMCA 1201 denies people the right to break DRM locks), the presence of copyrighted DRM’d code in almost every object manufactured, from books to rectal thermometers, means that they cannot ever be owned by anyone other than their manufacturer, protected by law and unaccountable to anyone. 

“DMCA 1201 gave publishers and movie studios and game companies the power to make up their own private laws and outsource their enforcement to the public courts and police.”

Among the results of this are that security researchers cannot reveal flaws that may be dangerous or even deadly (think cars, insulin pumps, etc, not to mention the Internet of Hackable Things) while criminals can exploit them freely. It means that companies like Volkswagen can conceal cheating on emissions tests, that makers of thermostats can prevent you from controlling heat in your own home, that books you bought can be taken away from you on a whim or an error, that printer manufacturers can introduce code to break your printer if you don’t use their cartridges the way they want you to use them, that security agencies can demand that manufacturers let them use your webcam to spy on you, that abandoned games on a long extinct platform cannot be ported to modern hardware, that your watch will stop working if its manufacturer goes bust, and so on. It means that, mostly without our consent or knowledge, we no longer own what we own. As Doctorow puts it:

“There’s a word for this: feudalism. In feudalism, property is the exclusive realm of a privileged few, and the rest of us are tenants on that property. In the 21st century, DMCA-enabled version of feudalism, the gentry aren’t hereditary toffs, they’re transhuman, immortal artificial life-forms that use humans as their gut-flora: limited liability corporations.”

Address of the bookmark: http://www.locusmag.com/Perspectives/2016/11/cory-doctorow-sole-and-despotic-dominion/