Democratech: reflections on the human nature of blockchain

[Image: mediaeval blockchain voting]

At short notice I was invited to be guest of honour and keynote speaker at Bennett University’s International Conference on Blockchain for Inclusive and Representative Democracy yesterday. I was not able to attend the entire conference – my opening keynote was at 9:30pm here in Vancouver and I eventually needed to sleep – but I made it for a few hours. I was impressed with the diversity and breadth of the work going on, mainly in India, and the passionate, smart people in attendance. It was a particular pleasure to hear from Ramesh Sharma, who I have known for many years in an online learning context, here speaking of very different things, and I really loved the ceremonial lighting of the lantern – the sharing of the light – with which the conference began. It is a powerful and connecting metaphor.

Like most geeks I do have the occasional thought about blockchain and democracy but I can’t describe myself as an expert or even an enthusiastic amateur in either field. So, rather than speaking about things the delegates knew far more about than I, and given the compressed time-frame for preparing the keynote, I chose to ground the talk in familiar territory, taking a broad-brush view of how to think of the technological ecosystem into which the technologies must fit. It led to some new thoughts here and there: in particular, I rather like the idea of technologies in general acting as a kind of distributed ledger of human cognition. The result was these slides – Democratech: reflections on the human nature of blockchain.

In rough note form (not a polished academic work and not particularly coherent!), the text below is approximately what I spoke about for each of the slides:

1 In this talk I will be using ideas from my most recent book: here it is. You can download it for free or buy it in paper or electronic form if you wish. See http://teachingcrowds.ca. It is at least as much about the nature of technology as it is about the nature of education, and that’s what I want to talk about today: what kind of a technology is blockchain, and why does it matter?

2 “Technology” is a fuzzy term that can mean many things to different people. I spend a whole chapter in the book exploring many definitions of what “technology” means. To save time, I am going to use what I conclude to be the best definition, from Brian Arthur: “orchestrating phenomena to our use”.

3 I prefer to think of this as “organizing stuff to do stuff”, because it makes it clearer that the stuff that it organizes nearly always includes stuff already organized to do stuff: as Arthur observes, almost all if not all technologies are assemblies of other technologies, at least when they are put to use.

Technologies are made of technologies, at every scale, and they are parts of webs of technologies that stretch far into time and space. Kevin Kelly calls this massively interconnected network the technium. And, as he puts it, technology can be thought of as both a thing and a verb or, as Ursula Franklin puts it, fish and water – a slippery thing to pin down. It is something we do and something we have done. In fact it is typically both.

4 By this definition, democracies are technologies too – in fact, hugely complex assemblies of technologies. They orchestrate phenomena using systems, physical objects, and assemblies of them, to approximate a fair voice for all in the governance of where we dwell. So are words, and language, and, as Franklin notes, there are technologies of prayer.

5 If you take nothing else from this speech, take this: only the whole assembly matters. The parts are very important to the designer and make a big difference to how a technology works and is experienced, but it is how the parts are assembled and act together that makes the technology as it is experienced, as it is instantiated. That includes what we do with them – more on that in a moment.

If you are not convinced, think about some of the parts of the computer you are looking at now: some are sharp, some contain harmful chemicals, and there’s a good chance that there is a deadly amount of electricity flowing through them, and yet we gain benefit from them, not loss of life, because we assemble them in ways that (at least normally) eliminate the harm by adding technologies to prevent it: counter technologies. Often, a large part of what we recognize as a technology is in fact a counter technology to other parts of it – think of cars, for example, where many of the components are simply there to stop other components blowing up, seizing, or killing people.

6 Technologies create what Stuart Kauffman calls “adjacent possibles” – empty niches that further technologies can fill, individually or in conjunction with others, including others that already exist. Every new technology makes further technologies possible, adding new parts to new assemblies. This accounts for the exponential growth in technologies over the past 10,000 years or so: technologies evolve from and with other technologies, almost never out of nothing.

Those adjacent possible empty niches are fundamentally unprestatable, as Kauffman puts it: no one can imagine all the possible assemblies into which we might put something as simple as a screwdriver. A stirrer of paint, a back scratcher, a scribe, a pointer, a stabbing weapon, a weight, a missile, a crow bar… And this is true of every technology. All can be assembled differently, in indefinitely many assemblies, to make indefinitely many wholes. This is true at the finest of scales. Though there may be some very close resemblances between instances, you have never written your own signature, nor washed your clothes, nor eaten your food the same way twice. Only machines can do that, but they are part of our technologies as much as we are part of them: the machine may behave consistently but the technology through which we use it – the instantiation in which we participate – most likely does not.

Technologies also come with path dependencies that can harden and distort assemblies, because the soft must shape itself around the hard. What exists shapes what can exist.

7, 8 When instantiated, we are participants in, not just users of, the technology. Using a technology is also a technology, whether we are organizing it or being part of the organization.

9, 10 We are coparticipants in a largely self-organizing web of technology that is part organic, part process, part physical object, part conceptual, part structural. Technologies democratize cognition, though they also embed and harden the values of the powerful, and the uses to which they are put are too often to subdue, constrain, or abuse our fellow humans. It is always important to remember that the technology that matters is seldom its most obvious components: it is the assembly they are in. As they are used, they are different technologies to everyone who uses them, because they are parts of different assemblies: the production line is a very different technology for its boss, its workers, its shareholders, and the consumers of what it produces, orchestrating different phenomena to different users. This means that technologies – as instantiated – are never neutral. They have histories, contexts, and propensities.

11 And our input matters: it is not just the method but the way things are done that matters. Every assembly can be a creative assembly, and it is possible to do it well or badly. And so we all create new adjacent possibles for one another.  Through technologies we participate in the collective cognition of the human race: in effect, technologies form the distributed ledger of our shared cognition. But all of us assemble and interpret in the ways we use technology, whether we form part of it (hard technique) or are the organizers (soft technique).

12 Blockchain is a technology capable of achieving great good: potentially accountable, yet equally interesting for the ways it can support anonymity; free from central control, yet also interesting in the context of an existing system of trust; good for both privacy and transparency; and so on. It has indefinitely many adjacent possibles, from the exchange of property to the assertion of identity, from enabling reliable voting to making supply chains accountable.

13 But all technologies are what Neil Postman called Faustian bargains. When you invent the ship you invent the shipwreck, as Paul Virilio put it. “The Monkey’s Paw”, by W.W. Jacobs, is a tale of horror in which a monkey’s paw grants three wishes to a modest couple, who ask only to pay off their mortgage with their first wish. Moments later, they learn that their son has died in a horrible accident at the factory in which he works, and that the company will pay compensation: the exact cost of the outstanding mortgage. And so the story goes on. Technologies are like that.

Blockchain can be subverted by organized crowds (botnets and human), malware, cracking, etc, and quantum computing means all bets are off about reliability and security. It is possible to lose votes as easily as it is to lose millions in bitcoin. Blockchain can conceal criminal activity, and, conversely, enable a level of surveillance never seen before. Remember, this is all about the assembly, and blockchain is a very versatile component. It’s a super-soft technology that connects many others. Blockchain makes new forms of democracy possible, but it also enables new forms of tyranny.

To understand blockchain we must understand the larger assembly of technologies of which it forms only a part. Never forget that it is only ever the assembly that matters, not the parts. This is and has always been true of all the technologies of democracy. Paper voting, say, in its raw form is incredibly and fundamentally unreliable, prone to loss, error, abuse, corruption, coercion, loss of privacy, etc, and it is terribly, terribly inefficient and insecure. However, we throw in a lot of counter technologies – systems to assure reliability, safes, multiple counts, policing procedures, surveillance, electronic counts, observers, etc – and the process is now so well evolved that it works often enough. Paper is not the technology of interest: it is the whole system that surrounds it. Same for blockchain.
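To make this concrete, here is a minimal, purely illustrative sketch – my own, in Python, not anything presented at the conference – of the hash-chained ledger that sits at the core of a blockchain. The names (Block, append, is_consistent) are invented for the example. Notice how little this component does on its own: it can report that the chain has been altered, but consensus, identity, key management, auditing and every other counter technology that makes it trustworthy belongs to the surrounding assembly.

```python
import hashlib
import json
from dataclasses import dataclass


@dataclass
class Block:
    """One entry in a toy hash-chained ledger."""
    index: int
    data: str        # e.g. an encrypted or anonymized ballot record
    prev_hash: str   # hash of the previous block, chaining the ledger together

    def digest(self) -> str:
        payload = json.dumps([self.index, self.data, self.prev_hash]).encode()
        return hashlib.sha256(payload).hexdigest()


def append(chain: list, data: str) -> None:
    """Add a new block whose prev_hash commits to everything before it."""
    prev_hash = chain[-1].digest() if chain else "0" * 64
    chain.append(Block(len(chain), data, prev_hash))


def is_consistent(chain: list) -> bool:
    """Detect tampering within the chain itself – nothing more."""
    return all(chain[i].prev_hash == chain[i - 1].digest()
               for i in range(1, len(chain)))


ledger = []
append(ledger, "ballot: candidate A")
append(ledger, "ballot: candidate B")
print(is_consistent(ledger))            # True
ledger[0].data = "ballot: candidate C"  # quietly rewrite history...
print(is_consistent(ledger))            # False – but only if someone actually checks
```

Even this toy version can only say that something has changed; who is allowed to append, whether anyone ever runs the check, and what happens when tampering is found are all matters for the counter technologies around it.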

14 Understanding technologies means that we must know the adjacent possibles but, remember, we can only ever see the most brightly lit of these from where we currently stand. The creative potential, for both good and evil, is barely visible at all. Someone, somehow, somewhere, will find new assemblies that achieve their ends, whether it benefits all of us or not. Sadly, those most able are typically those least trustworthy, thanks to the fundamental inequalities of our societies that reward greed and that give most to those who already have most. Anything is weaponizable, including democracy, as (here in Canada) our neighbours south of the border are discovering to their cost. And it means understanding what happens at scale: the environmental impacts, and the counter technologies to deal with them. But, as René Dubos put it, fixing problems with counter technologies is a philosophy of despair, because every counter technology we create is another Faustian bargain that creates new problems to solve, and new adjacent possibles we never foresaw.

15 We must understand where blockchain fits in the massive web of the collective technium – the Ricardian contracts, the oracles, the legal frameworks that surround them, the ZKP techniques, the privacy laws, the voting practices, the laws of ownership, and so on. It is unwise to simply drop it in as a replacement for what we already do, because it will harden what should not be hardened – when we automate we tend to simplify – and create new relationships that may be incompatible with, or positively dangerous to, existing technologies of democracy. But, as we reinvent it, we must always remember the unprestatable adjacent possibles we create, the things we reinforce, the things we lose. And we must remember that someone, somewhere is seeing adjacent possibles we did not imagine, assemblies we have yet to conceive, and they may not be friendly to democratic ideals.

16 To understand this means we must look far beyond the bits and bytes and flashing lights; we must make empathetic leaps into the hearts and minds of our coparticipants in the technium. We are technologies, as much a part of blockchain as it is part of the broader web of the technium.

What kind of technologies do we want to be?

We are (in part) our tools and they are (in part) us

[Image: anthropomorphized hammer using a person as a tool]

Here’s a characteristically well-expressed and succinct summary of the complex nature of technologies, our relationships with them, and what that means for education by the ever-wonderful Tim Fawns. I like it a lot, and it expresses much of what I have tried to express about the nature and value of technologies, far better than I could do it and in far fewer words. Some of it, though, feels like it wants to be unpacked a little further, especially the notions that there are no tools, that tools are passive, and that tools are technologies. None of what follows contradicts or negates Tim’s points, but I think it helps to reveal some of the complexities.

There are tools

Tim starts provocatively with the claim that:

There are no tools. Tools are passive, neutral. They can be picked up and put down, used to achieve human goals without changing the user (the user might change, but the change is not attributed to the tool).

I get the point about the connection between tools and technology (in fact it is very similar to one I make in the “Not just tools” section of Chapter 3 of How Education Works) and I understand where Tim is going with it (which is almost immediately to consciously sort-of contradict himself), but I think it is a bit misleading to claim there are no tools, even in the deliberately partial and over-literal sense that Tim uses the term. This is because to call something a tool is to describe a latent or actual relationship between it and an agent (be it a person, a crow, or a generative AI), not just to describe the object itself. At the point at which that relationship is instantiated it very much changes the agent: at the very least, they now have a capability that they did not have before, assuming the tool works and is used for a purpose. Figuring out how to use the tool is not just a change to the agent but a change to what the agent may become that expands the adjacent possible. And, of course, many tools are intracranial so, by definition, having them and using them changes the user. This is particularly obvious when the tool in question is a word, a concept, a model, or a theory, but it is just as true of a hammer, a whiteboard, an iPhone, or a stick picked up from the ground with some purpose in mind, because of the roles we play in them.

Tools are not (exactly) technologies

Tim goes on to claim:

Tools are really technologies. Each technology creates new possibilities for acting, seeing and organising the world.

Again, he is sort-of right and, again, not quite, because “tool” is (as he says) a relational term. When it is used a tool is always part of a technology because the technique needed to use it is a technology that is part of the assembly, and the assembly is the technology that matters. However, the thing that is used – the tool itself – is not necessarily a technology in its own right. A stick on the ground that might be picked up to hit something, point to something, or scratch something is simply a stick.

Tools are not neutral

Tim says:

So a hammer is not just sitting there waiting to be picked up, it is actively involved in possibility-shaping, which subtly and unsubtly entangles itself with social, cognitive, material and digital activity. A hammer brings possibilities of building and destroying, threatening and protecting, and so forth, but as part of a wider, complex activity.

I like this: by this point, Tim is telling us that there are tools and that they are not neutral, in an allusion to Culkin’s/McLuhan’s dictum that we shape our tools and thereafter our tools shape us.  Every new tool changes us, for sure, and it is an active participant in cognition, not a non-existent neutral object. But our enactment of the technology in which the tool participates is what defines it as a tool, so we don’t so much shape it as we are part of the shape of it, and it is that participation that changes us. We are our tools, and our tools are us.

There is interpretive flexibility in this – a natural result of the adjacent possibles that all technologies enable – which means that any technology can be combined with others to create a new technology. An iPhone, say, can be used by anyone, including monkeys, to crack open nuts (I wonder whether that is covered by AppleCare?), but this does not make the iPhone neutral to someone who is enmeshed in the web of technologies of which the iPhone is designed to be a part. As the kind of tool (actually many tools) it is designed to be, it plays quite an active role in the orchestration: as a thing, it is not just used but using. The greater the pre-orchestration of any tool, the more its designers are co-participants in the assembled technology, and it can often be a dominant role that is anything but neutral.

Most things that we call tools (Tim uses the hammer as an example) are also technologies in their own right, regardless of their tooliness: they are phenomena orchestrated with a purpose, stuff that is organized to do stuff and, though softer tools like hammers have a great many adjacent possibles that provide almost infinite interpretive flexibility, they also – as Tim suggests – have propensities that invite very particular kinds of use. A good hardware store sells at least a dozen different kinds of hammer with slightly different propensities, labelled for different uses. All demand a fair amount of skill to use them as intended. Such stores also sell nail guns, though, that reduce the amount of skill needed by automating elements of the process. While they do open up many further adjacent possibles (with chainsaws, making them mainstays of a certain kind of horror movie), and they demand their own sets of skills to use them safely, the pre-orchestration in nail guns greatly reduces many of the adjacent possibles of a manual hammer: they aren’t much good for, say, prying things open, or using as a makeshift anchor for a kayak, or propping up the lid of a tin of paint. Interestingly, nor are they much use for quite a wide range of nail hammering tasks where delicacy or precision are needed. All of this is true because, as a nail driver, there is a smaller gap between intention and execution that needs to be filled than for even the most specialized manual hammer, due to the creators of the nail gun having already filled a lot of it, thus taking quite a few choices away from the tool user. This is the essence of my distinction between hard and soft technologies, and it is exactly the point of making a device of this nature. By filling gaps, the hardness simplifies many of the complexities and makes for greater speed and consistency which in turn makes more things possible (because we no longer have to spend so much time being part of a hammer) but, in the process, it eliminates other adjacent possibles. The gaps can be filled further. The person using such a machine to, say, nail together boxes on a production line is not so much a tool user as a part of someone else’s tool. Their agency is so much reduced that they are just a component, albeit a relatively unreliable component.

Being tools

In an educational context, a great deal of hardening is commonplace, which simplifies the teaching process and allows things to be done at scale. This in turn allows us to do something approximating reductive science, which gives us the comforting feeling that there is some objective value in how we teach. We can, for example, look at the effects of changes to pre-specified lesson plans on SAT results, if both lesson plans and SATs are very rigid, and infer moderately consistent relationships between the two, and so we can improve the process and measure our success quite objectively. The big problem here, though, is what we do not (and cannot) examine by such approaches, such as the many other things that are learned as a result of being treated as cogs in a mechanical system, the value of learning vs the value of grades, or our places in social hierarchies in which we are forced to comply with a very particular kind of authority. SATs change us, in many less than savoury ways. SATs also fail to capture more than a minuscule fraction of the potentially useful learning that also (hopefully) occurred. As tools for sorting learners by levels of competence, SATs are as far from neutral as you can get, and as situated as they could possibly be. As tools for learning or for evaluating learning they are, to say the least, problematic, at least in part because they make the learner a part of the tool rather than a user of it. Either way, you cannot separate them from their context because, if you did, it would be a different technology. If I chose to take a SAT for fun (and I do like puzzles and quizzes, so this is not improbable) it would be a completely different technology than for a student, or a teacher, or an administrator in an educational system. They are all, in very different ways, parts of the tool that is in part made of SATs. I would be a user of it.

All of this reinforces Tim’s main and extremely sound points, that we are embroiled in deeply intertwingled relationships with all of our technologies, and that they cannot be de-situated. I prefer the term “intertwingled” to the term “entangled” that Tim uses because, to me, “entangled” implies chaos and randomness but, though there may (formally) be chaos involved, in the sense of sensitivity to initial conditions and emergence, this is anything but random. It is an extremely complex system but it is highly self-organizing, filled with metastabilities and pockets of order, each of which acts as a further entity in the complex system from which it emerges.

It is incredibly difficult to write about the complex wholes of technological systems of this nature. I think the hardest problem of all is the massive amount of recursion it entails. We are in the realms of what Kauffman calls Kantian Wholes, in which the whole exists for and by means of the parts, and the parts exist for and by means of the whole, but we are talking about many wholes that are parts of or that depend on many other wholes and their parts that are wholes, and so on ad infinitum, often crossing and weaving back and forth so that we sometimes wind up with weird situations in which it seems that a whole is part of another whole that is also part of the whole that is a part of it, thanks to the fact that this is a dynamic system, filled with emergence and in a constant state of becoming. Systems don’t stay still: their narratives are cyclic, recursive, and only rarely linear. Natural language cannot easily do this justice, so it is not surprising that, in his post, Tim is essentially telling us both that tools are neutral and that they are not, that tools exist and that they do not, and that tools are technologies and they are not. I think that I just did pretty much the same thing.

Source: There are no tools – Timbocopia

My keynote slides for Confluence 2023 – Heads in the clouds: being human in the age of cloud computing

[Image: heads in clouds]

These are the slides from my keynote today (or, in my land, yesterday) at Confluence 2023, hosted by Amity University in India. It was a cloud computing conference, so quite a way outside my area of greatest expertise, but it gave me a chance to apply the theory of technology developed in my forthcoming book to a different context. The illustrations for the slides are the result of a conversation between me and MidJourney (more of an argument that MidJourney tended to win), which is quite a nice illustration of the interplay of hard and soft technologies, the adjacent possible, soft technique, and so on.

Unsurprisingly, because education is a fundamentally technological phenomenon, much the same principles that apply to education also apply to cloud computing, such as: build from small, hard pieces; valorize openness, diversity, and connection; seek the adjacent possible; remember that the whole assembly is the only thing that matters; and so embrace the central principle that how you do it matters far more than what you do.

Slides from my Confluence 2023 keynote