The ultimate insomnia cure: new GDPR legislation soothingly read by Peter Jefferson

The BBC’s Shipping Forecast is one of the great binding traditions of British culture, a lullaby for many a Brit since time immemorial (i.e. long before I was born). Though I never once paid attention to its content in all the decades I heard it, eleven years after leaving the country I could still probably recite the majority of the 31 sea areas surrounding the British Isles from memory.

For as long as I can recall, the gently soothing voice of the Shipping Forecast was that of Peter Jefferson (who apparently retired in 2009, after 40 years). In this magnificently somnolent rendering, he immortalizes excerpts from the General Data Protection Regulation that recently came into force in the EU. My eyelids start drooping about 30 seconds in.


Address of the bookmark: https://blog.calm.com/relax/once-upon-a-gdpr

Originally posted at: https://landing.athabascau.ca/bookmarks/view/3327075/the-ultimate-insomnia-cure-new-gdpr-legislation-soothingly-read-by-peter-jefferson

Tim Berners-Lee: we must regulate tech firms to prevent ‘weaponised’ web

TBL is rightfully indignant and concerned about the fact that “what was once a rich selection of blogs and websites has been compressed under the powerful weight of a few dominant platforms.” The Web, according to Berners-Lee, is at great risk of degenerating into a few big versions of CompuServe or AOL, sucking up most of the bandwidth of the Internet and most of the attention of its inhabitants. In an open letter, he outlines the dangers of putting so much power into hands that either see it as a burden or actively exploit it for evil.

I really really hate Facebook more than most, because it aggressively seeks to destroy all that is good about the Web, and it is ruthlessly efficient at doing so, regardless of the human costs. Yes, let’s kill that in any way that we can, because it is actually and actively evil, and shows no sign of getting any nicer. I am somewhat less concerned that Google gets 87% of all online searches (notwithstanding the very real dangers of a single set of algorithms shaping what we find), because most of Google’s goals are well aligned with those of the Web. The more openly people share and link, the better it gets, and the more money Google makes. It is very much in Google’s interest to support an open, highly distributed, highly connected Web, and the company is as keen as everyone else to avoid the dangers of falsehoods, bias, and the spread of hatred (which are among the very things that Facebook feeds upon), and, thanks to its strong market position and careful hiring practices, it is more capable of doing so than pretty much anyone else. Google rightly hates Facebook (and others of its ilk) not just because it is a competitor, but because it removes things from the open Web, probably spreads lies more easily than truths, and so reduces Google’s value.

I am somewhat bothered that the top 100 sites (according to Wikipedia, based on Alexa and SimilarWeb results) probably get far more traffic than the next few thousand put together, and that the long tail pretty much flattens to approximately zero after that. However, that’s an inevitable consequence of the design of the Web (it’s a scale-free network subject to power laws), and ‘approximately zero’ may actually translate to hundreds of thousands or even millions of people, so it’s not quite the skewed mess that it seems. It is, as TBL observes, very disturbing that big companies with big pockets purchase potential competitors and stifle innovation, and I agree that (like all monopolies) they should be regulated, but there is no way they are ever going to get everything or everyone, at least not without the help of politicians and evil legislation, because it’s a really long tail.
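
To make the shape of that tail concrete, here is a minimal sketch in Python. The Zipf exponent and the number of sites are invented for illustration, not measured values; the point is only that a power-law distribution can let the top 100 sites dwarf the next few thousand while the tail still sums to an enormous audience:

```python
# Toy model of power-law (Zipf-like) traffic across websites.
# The exponent and the number of sites are illustrative assumptions.

def zipf_traffic(start: int, stop: int, s: float = 1.1) -> float:
    """Sum of 1/rank^s over ranks start..stop-1 (a Zipf-like traffic share)."""
    return sum(1 / rank ** s for rank in range(start, stop))

N = 1_000_000                      # hypothetical number of websites
total = zipf_traffic(1, N + 1)

top_100 = zipf_traffic(1, 101) / total
next_5000 = zipf_traffic(101, 5101) / total
the_rest = zipf_traffic(5101, N + 1) / total

print(f"top 100 sites:  {top_100:.0%} of all traffic")
print(f"next 5,000:     {next_5000:.0%}")
print(f"remaining tail: {the_rest:.0%}  # 'approximately zero' each, huge in aggregate")
```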

It is also very interesting that even the top 10 – according to just about all the systems that measure such things – includes the unequivocally admirable and open Wikipedia itself, and also Reddit, which, though now straying from its fully open model, remains excellently social and open. In different ways, both give more than they take.

It is also worth noting that there are many different ways to calculate rank. Moz.com (based on the Mozscape web index of 31 billion domains and 165 billion pages) has a very different view of things, for instance, in which Facebook doesn’t even make it into the domains listing, and sits way below WordPress and several others in the popular pages list, a direct result of its being a closed and greedy system. Quantcast’s perspective is somewhat different again, albeit focused only on US sites, which are a small but significant portion of the whole.

Most significantly, and to reiterate the point because it is worth making, the long tail is very long indeed. Regardless of the dangers of a handful of gigantic platforms casting their ugly shadows over the landscape, I am extremely heartened by the fact that, now, over 30% of all websites run on WordPress, which is both open source and very close to the distributed ideal that TBL espouses, allowing individuals and small communities to stake their claims, make a space, and link (profusely) with one another, without lock-in, central control, or inhibition of any kind. That 30% puts any one of the big monoliths, including Facebook, very far into the shade. And, though WordPress’s nearest competitor (Joomla, also open source) accounts for a ‘mere’ 3% of all websites, there are hundreds if not thousands of similar systems, not to mention a huge number of pages (50% of the total, according to W3Techs) that people still roll for themselves.

Yes, the greedy monoliths are extremely dangerous and should, where possible, be avoided, and it is certainly worth looking into ways of regulating their activities, nationally and internationally, as many governments are already doing and should continue to do. We must ever be vigilant. But the Web continues to grow and to diversify regardless of their pernicious influence, because it is far bigger than all of them put together.

Address of the bookmark: https://www.theguardian.com/technology/2018/mar/11/tim-berners-lee-tech-companies-regulations

Originally posted at: https://landing.athabascau.ca/bookmarks/view/3105535/tim-berners-lee-we-must-regulate-tech-firms-to-prevent-weaponised-web

Facebook has a Big Tobacco Problem

A perceptive article listing some of Facebook’s evils and suggesting an analogy between the tactics used by Big Tobacco and those used by the company. I think there are a few significant differences. Big Tobacco is not a single company bent on profit no matter what the cost. Big Tobacco largely stopped claiming it was doing good quite a long time ago. And Big Tobacco only kills and maims people’s bodies. Facebook is aiming for the soul. The rest is just collateral damage.

Address of the bookmark: https://mondaynote.com/facebook-has-a-big-tobacco-problem-f801085109a

Originally posted at: https://landing.athabascau.ca/bookmarks/view/3046034/facebook-has-a-big-tobacco-problem

Facebook’s days may be numbered as UK youth abandon the platform

The end of Facebook couldn’t come soon enough but, though we have been reading headlines not unlike this for around a decade, this malignant tumour in the lungs of the Web still grows, sucking the air out of all good things.

Despite losses in the youth market (not only in the UK), as the article notes, Facebook has deep pockets and is metastasizing at a frightening rate. Instagram and WhatsApp are only the most prominent recent growths, and no doubt far from the last. The main tumour itself is still evolving too, backed by development funding that beggars belief. It would take a lot to cure us of this awful thing. On the optimistic side, however, Metcalfe’s Law works just as well in reverse as it does going forward: networks can grow exponentially, but they can shrink just as fast. Perhaps these small losses will be the start of a cascade. Let’s hope so.
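
For what it’s worth, here is a minimal sketch in Python of why Metcalfe’s Law cuts both ways, using the usual pairwise-connections proxy for network value and an invented user count: because value grows roughly with the square of the number of users, losing users destroys value much faster than linearly.

```python
# Metcalfe's Law values a network at roughly n^2 for n users. A toy
# illustration of the reverse direction; user numbers are invented.

def metcalfe_value(n: int) -> int:
    """Possible pairwise connections: the usual n^2-ish proxy for value."""
    return n * (n - 1) // 2

users = 1_000_000
for loss in (0.10, 0.20, 0.50):
    remaining = int(users * (1 - loss))
    value_lost = 1 - metcalfe_value(remaining) / metcalfe_value(users)
    print(f"lose {loss:.0%} of users -> lose {value_lost:.1%} of network value")
# Losing 10% of users costs ~19% of value; losing half costs ~75%.
```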


Address of the bookmark: http://www.alphr.com/facebook/1008480/facebook-youth-numbers-drop-over-55-rise

Originally posted at: https://landing.athabascau.ca/bookmarks/view/3037494/facebook%E2%80%99s-days-may-be-numbered-as-uk-youth-abandon-the-platform

Turns out the STEM ‘gender gap’ isn’t a gap at all

Grace Hopper and Univac (image from en.wikipedia.org/wiki/Grace_Hopper)

At least in Ontario, it seems that there are about as many women as men taking STEM programs at undergraduate level. STEM subjects account for a smaller percentage of women’s enrolments overall only because far more women than men enter university in the first place. A more interesting reading of this, therefore, is not that we have a problem attracting women to science, technology, engineering, and mathematics, but that we have a problem attracting men to the humanities, social sciences, and the liberal arts. As the article puts it:

“it’s not that women aren’t interested in STEM; it’s that men aren’t interested in poetry—or languages or philosophy or art or all the other non-STEM subjects.”

That’s a serious problem.

As someone with qualifications in both (incredibly broad) areas, and interests in many sub-areas of each, I find the arbitrary separation between them to be ludicrous, leading to no end of idiocy at both extremes, and little opportunity for cross-fertilization in the middle. It bothers me greatly that technology subjects like computing or architecture should be bundled with sciences like biology or physics, but not with the social sciences or arts, which are far more relevant and appropriate to the activities of most computer professionals. In fact, it bothers me that we feel the need to separate out large fields like this at all. Everyone pays lip service to cross-disciplinary work but, when we try to take that seriously and cross the big boundaries, there is so much polarization between the science and arts communities that they usually don’t even understand one another, let alone work in harmony. We don’t just need more men in the liberal arts – we need more scientists, engineers, and technologists to cross those boundaries, whatever their gender. And, vice versa, we need more liberal artists (that sounds odd, but I have no better term) and social scientists in the sciences and, especially, in technology.

But it’s also a problem of category errors in the other direction. This clumping together of the whole of STEM conceals the fact that in some subjects – computing, say – there actually is a massive gender imbalance (including in Ontario), no matter how you mess with the statistics. This is what happens when you try to use averages to talk about specifics: it conceals far more than it reveals.
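
As a toy illustration of that point, here is how an apparently balanced aggregate can conceal a wildly unbalanced field. The numbers below are invented for the sketch, not Ontario data:

```python
# Invented numbers (not Ontario data): an aggregate 50/50 STEM split can
# coexist with a large imbalance in a single field such as computing.

enrolment = {                # field: (women, men), all hypothetical
    "biology":     (7000, 3000),
    "chemistry":   (2500, 2500),
    "computing":   (1000, 4000),
    "engineering": (1500, 2500),
    "maths":       (2000, 2000),
}

women = sum(w for w, m in enrolment.values())
men = sum(m for w, m in enrolment.values())
print(f"STEM overall: {women} women vs {men} men")   # perfectly balanced

w, m = enrolment["computing"]
print(f"computing:    {w} women vs {m} men ({w / (w + m):.0%} women)")
```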

I wish I knew how to change that imbalance in my own designated field of computing, an area I deliberately chose precisely because it cuts across almost every other field and does not limit me to doing one kind of thing. I do arts, science, social science, humanities, and more, thanks to working with machines that cross virtually every boundary.

I suspect that fixing the problem has little to do with marketing our programs better, nor with any such surface efforts that focus on the symptoms rather than the cause. A better solution is to accept and to celebrate the fact that the field of computing is much broader and vastly more interesting than the tiny subset of it that can be described as computer science, and to build up from there. It’s especially annoying that the problem exists at Athabasca, where a wise decision was made long ago not to offer a computer science program: we have computing and information systems programs, but no programs in computer science. Unfortunately, thanks to a combination of lazy media and computing profs (suffering from science envy) who promulgate the nonsense, even good friends of mine who should know better sometimes describe me as a computer scientist (I am emphatically not), and even some of our own staff think of what we do as computer science. To change that perception means not just a change in nomenclature, but a change in how and what we, at least in Athabasca, teach. For example, we might mindfully adopt an approach that contextualizes computing around projects and applications, rather than its theory and mechanics. We might design a program that doesn’t just lump together a bunch of disconnected courses and call it a minor but that, in each course (if courses are even needed), actively crosses boundaries – to see how code relates to poetry, how art can inform and be informed by software, how understanding how people behave can be used in designing better systems, how learning is changed by the tools we create, and so on.

We don’t need disciplines any more, especially not in a technology field. We need connections. We don’t need to change our image. We need to change our reality. I’m finding that to be quite a difficult challenge right now.


Address of the bookmark: http://windsorstar.com/opinion/william-watson-turns-out-the-stem-gender-gap-isnt-a-gap-at-all/wcm/ee4217ec-be76-4b72-b056-38a7981348f2

Originally posted at: https://landing.athabascau.ca/bookmarks/view/2929581/turns-out-the-stem-%E2%80%98gender-gap%E2%80%99-isn%E2%80%99t-a-gap-at-all

terra0 – a forest that will one day buy itself

I love this art project – a forest that owns itself and that makes money on its own behalf, eventually with no human control or ownership. From the blurb…

“The Project emerged from research in the fields of crypto governance, smart contracts, economics and questions regarding representations of natural systems in the techno-sphere. It creates a framework whereby a forest is able to sell licences to log its own trees through automated processes, smart contracts and blockchain technology. “

But it gets better…

“The terra0 project creates a scenario whereby the forest, augmented through automated processes, utilitizes itself and thereby accumulates capital. A shift from valorisation through third parties to a self-utilization makes it possible for the forest to procure its real counter-value and eventually buy itself. The augmented forest is not only owner of itself, but is thus in the position to buy more ground and therefore to expand.”

Wonderful, immensely thought-provoking, deeply subversive.
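
Out of curiosity, here is a minimal sketch in Python of the economic loop the blurb describes, in which licence sales gradually pay off the forest’s founders until it owns itself. The prices are invented, and the real project implements this with Ethereum smart contracts, not Python:

```python
# A purely illustrative model of the terra0 loop described above: the
# forest sells logging licences, accumulates capital, and eventually buys
# itself back from its human founders. All names and numbers are invented.

class SelfOwningForest:
    def __init__(self, purchase_price: float, licence_price: float):
        self.debt = purchase_price       # what the forest owes its founders
        self.capital = 0.0
        self.licence_price = licence_price

    def sell_logging_licence(self) -> None:
        self.capital += self.licence_price

    @property
    def owns_itself(self) -> bool:
        return self.capital >= self.debt

forest = SelfOwningForest(purchase_price=10_000, licence_price=250)
sales = 0
while not forest.owns_itself:
    forest.sell_logging_licence()
    sales += 1
print(f"The forest buys itself after {sales} licence sales.")
```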

Address of the bookmark: http://paulkolling.de/projects/terra0

Originally posted at: https://landing.athabascau.ca/bookmarks/view/2906218/terra0-a-forest-that-will-one-day-buy-itself

Black holes are simpler than forests and science has its limits

Mandelbrot set (Wikipedia, https://en.wikipedia.org/wiki/Mandelbrot_set)

Martin Rees (UK Astronomer Royal) takes on complexity and emergence. This is essentially a primer on why complex systems – accounting, as he says, for 99% of what’s interesting about the world – are not susceptible to reductionist science despite being, at some level, reducible to physics. As he rightly puts it, “reductionism is true in a sense. But it’s seldom true in a useful sense.” Rees’s explanations are a bit clumsy in places – for instance, he confuses ‘complicated’ with ‘complex’ once or twice, which is a rookie mistake, and his example of the Mandelbrot Set as ‘incomprehensible’ is not convincing and rather misses the point about why emergent systems cannot be usefully explained by reductionism (it’s about different kinds of causality, not about complicated patterns) – but he generally provides a good introduction to the issues.
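
The Mandelbrot Set does, though, make a compact illustration of simple rules producing emergent intricacy: the whole thing falls out of iterating z ← z² + c. A minimal sketch follows, as a crude ASCII rendering; the iteration limit and grid resolution are arbitrary choices:

```python
# Membership test for the Mandelbrot set: iterate z <- z^2 + c and see
# whether z escapes. One trivially simple rule; an endlessly intricate result.

def in_mandelbrot(c: complex, max_iter: int = 50) -> bool:
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:            # escaped: c is not in the set
            return False
    return True

# Crude ASCII rendering over -2..1 (real axis) and -1.2..1.2 (imaginary axis).
for im in range(-12, 13):
    row = "".join(
        "#" if in_mandelbrot(complex(re / 20, im / 10)) else " "
        for re in range(-40, 21)
    )
    print(row)
```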

These are well-trodden themes that most complexity theorists have addressed in far more depth and detail, and that usually appear in the first chapter of any introductory book in the field, but it is good to see someone who, from his job title, might seem to be an archetypal reductive scientist (he’s an astrophysicist) challenging some of the basic tenets of his discipline.

Perhaps my favourite works on the subject are John Holland’s Signals and Boundaries, which is a brilliant, if incomplete, attempt to develop a rigorous theory to explain and describe complex adaptive systems, and Stuart Kauffman’s flawed but stunning Reinventing the Sacred, which (with very patchy success) attempts to bridge science and religious belief but which, in the process, brilliantly and repeatedly proves, from many different angles, the impossibility of reductive science explaining or predicting more than an infinitesimal fraction of what actually matters in the universe. Both books are very heavy reading, but very rewarding.

Address of the bookmark: https://aeon.co/ideas/black-holes-are-simpler-than-forests-and-science-has-its-limits

Originally posted at: https://landing.athabascau.ca/bookmarks/view/2874665/black-holes-are-simpler-than-forests-and-science-has-its-limits

A Universe Explodes: A Blockchain Book, from Editions At Play

A really nice project from the Editions at Play team at Google, in which blockchain is used both to limit the supply of a digital book (only 100 copies made) and, as the book is passed on, to make it ‘age,’ in the sense that each reader must remove two words from each page and add one of their own before passing it on (which they are obliged to do). Eventually, the book decays to the point of being useless, though I think the transitional phases might be very interesting in their own right.
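
The arithmetic of that rule is pleasingly stark: two words out, one word in means every page shrinks by a net one word per transfer. Here is a minimal sketch of the decay in Python, with an invented example page; the real book, of course, manages ownership on a blockchain:

```python
import random

# The rule described above: each transfer removes two words from a page
# and adds one, so every page shrinks by a net one word per transfer.

def transfer(page: list[str], new_word: str) -> list[str]:
    """One ownership transfer: drop two random words, then add one."""
    page = page.copy()
    for _ in range(2):
        if page:
            page.pop(random.randrange(len(page)))
    page.append(new_word)
    return page

page = "in the beginning the universe exploded into being".split()
owners = 0
while len(page) > 1:
    page = transfer(page, f"word{owners}")
    owners += 1
print(f"page reduced to {page} after {owners} transfers")
```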

I had been thinking that something very vaguely along these lines would be an interesting idea and had started making notes about how it might work, but it seemed so blindingly obvious that somebody must already have done it. Blockchain technologies for publishing are certainly being considered by many people, and some are being implemented. The Alliance of Independent Authors seems to have the most practical plans for using blockchain for that purpose. A similar idea comes with the means to partially compensate publishers for such things (as though they needed even more undeserved profits). Another interesting idea is to use Counterparty tokens on the blockchain to replace ISBNs. However, A Universe Explodes is the only example I have so far found of building in intentional decay. It is one of a range of wonderfully inventive and inspiring books at the brilliant Editions at Play site that could only possibly exist in digital media.

Though the use of blockchain for publishing is a no-brainer, it’s the decay part that I like most, and that I was thinking about before finding this. Removing and adding words is not an accurate representation of the typical decay of a physical book, and it is not super-practical at a large scale, delightful though it is. My first thoughts were, in a pedestrian way, about how to build in a more authentic kind of decay. It might, for instance, be possible simply to overlay a few more pixels with each reading, or to incrementally grey out or otherwise visually degrade the text (which might have some cognitive benefits too, as it happens). That relies, however, on a closed application system or a somewhat inflexible representation (e.g. a vector format like SVG to represent the text, or even a bitmap); otherwise it would be too easy to remove such additions simply by using a different application. And, of course, it would be bad for people with a range of disabilities, although I guess you could perform similar mutilations of other representations of the text just as easily. That said, it could be made to work. It is nowhere near as good as making something free of DRM, of course, but it is a refinement that might be acceptable to greedy publishers and that would at least allow us to lend, give, or sell books that we have purchased to others.
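
A minimal sketch of that grey-out idea, in Python, operating on an SVG string: the element names are standard SVG, but the decay rate and the whole scheme are my own invention, and a real implementation would need a closed reader application to stop people simply editing the opacity back.

```python
# Illustrative 'visual decay' for an SVG page: each reading multiplies the
# text opacity down a little, so the book slowly fades with use.

OPACITY_DECAY = 0.97        # invented per-reading multiplier

def age_svg_text(svg: str, readings: int) -> str:
    """Return the SVG with every <text> element faded by `readings` uses."""
    opacity = OPACITY_DECAY ** readings
    return svg.replace("<text ", f'<text opacity="{opacity:.3f}" ')

page = '<svg xmlns="http://www.w3.org/2000/svg"><text x="10" y="20">To be, or not to be</text></svg>'
print(age_svg_text(page, readings=40))   # noticeably faint after 40 readings
```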

My next thought was that you could, perhaps more easily and certainly more interestingly, make marginalia (graphics and text) a permanent feature of the text once ownership was transferred, which would be both annoying and enlightening, as it is in physical books. One advantage would be that it reifies the concept of ownership – the intentional marks made on the book are a truer indication of the chain of owners than anything more abstract or computer-generated. It could also be a really interesting and useful way to tread a slightly more open path than most ugly DRM implementations, inasmuch as it could allow the creation of deliberately annotated editions (with practical or artistic intent) without the need for publisher permission. That would be good for textbooks, and might open up big untapped markets: for instance, I’d quite often rather buy an ebook annotated by one of my favourite authors or artists than the original, even if it cost more. It could be interestingly subversive, too. I might even purchase one of Trump’s books if it were annotated (and re-sold) by journalists from the Washington Post or Michael Moore, for example. And it could make a nice gift to someone to provide a personally embellished version of a text. Combined with the more prosaic visual decay approach, this could become a conversation between annotators and, eventually, become a digital palimpsest in which the original text all but disappears under generations of annotation. I expect someone has already thought of that but, if not, maybe this post can be used to stop someone profiting from it with a patent claim.

In passing, while searching, I also came across http://www.eruditiondigital.co.uk/what-we-do/custos-for-ebooks.php, which is both cunning and evil: it lets publishers embed Bitcoin bounties in ebooks that ‘pirates’ can claim and, in the process, alert the publisher to the identity of the person responsible. Ugly, but very ingenious. As the creators claim, it turns pirates against other pirates by offering incentives, while keeping the whole process completely anonymous. Eeugh.

Address of the bookmark: https://medium.com/@teau/a-universe-explodes-a-blockchain-book-ab75be83f28

Originally posted at: https://landing.athabascau.ca/bookmarks/view/2874113/a-universe-explodes-a-blockchain-book-from-editions-at-play

Evidence mounts that laptops are terrible for students at lectures. So what?

The Verge reports on a variety of studies showing that taking notes on a laptop during lectures results in decreased learning compared with taking notes using pen and paper. This tells me three things, none of which is what the article is aiming to tell me:

  1. That the institutions are teaching very badly. Countless decades of evidence, far better than that provided in these studies, show that giving lectures with the intent of imparting information like this is close to being the worst way to teach. Don’t blame the students for poor notetaking; blame the institutions for poor teaching. Students should not be put in such an awful situation (nor should teachers, for that matter). If students have to take notes in your lectures then you are doing it wrong.
  2. That the students are not skillful laptop notetakers. These studies do not imply that laptops are bad for notetaking, any more than giving students violins that they cannot play implies that violins are bad for making music. It ain’t what you do, it’s the way that you do it. If their classes depend on effective notetaking then teachers should be teaching students how to do it. But, of course, most of them probably never learned to do it well themselves (at least using laptops). It becomes a vicious circle.
  3. That laptop and, especially, software designers have a long way to go before their machines disappear into the background like pencil and paper. This may be inherent in the medium, inasmuch as a) they are vastly more complex toolsets with much more to learn about, and b) interfaces and apps constantly evolve, so, as soon as people have figured one out, everything changes under their feet. Another vicious circle.

The extra cognitive load involved in manipulating a laptop app (and in fending off the distractions that manufacturers seem intent on providing, even if you have the self-discipline not to seek them out yourself) can be a hindrance unless you are proficient to the point that using it becomes unconscious behaviour. Few of us are. Tablets are a better bet, for now, though they too are becoming overburdened with unsought complexity and unwanted distractions. For a couple of years now I have been taking most of my notes at conferences and the like with an Apple Pencil and an iPad Pro, because I like the notetaking flexibility, the simplicity, the lack of distraction (albeit that I have to actively manage that), and the tactile sensation of drawing and doodling. All of that likely contributes to making it easier to remember the stuff I want to remember. The main downside is that, though I still gain the laptop-like benefits of everything being in one place, of digital permanence, and of distribution to all my devices, I have, in the process, lost a little searchability and reusability. I may regret it in future, too, because graphic formats tend to be less persistent over decades than text. On the bright side, using a tablet, I am not stuck in one app. If I want to remember a paper or URL (which is most of what I normally want to remember, other than my own ideas and connections sparked by the speaker) I tend to look it up immediately and save it to Pocket so that I can return to it later, and I do still make use of a simple notepad for things I know I will need later. Horses for courses, and you get a lot more of both with a tablet than you do with a pencil and paper. And, of course, I can still use pen and paper if I want a throwaway single-use record – conference programs can be useful for that.


Address of the bookmark: https://www.theverge.com/2017/11/27/16703904/laptop-learning-lecture

Originally posted at: https://landing.athabascau.ca/bookmarks/view/2871283/evidence-mounts-that-laptops-are-terrible-for-students-at-lectures-so-what