Paper: Cognitive Santa Claus Machines and the Tacit Curriculum

This is my contribution to the inaugural issue of AACE’s new journal of AI-Enhanced Learning: Cognitive Santa Claus Machines and the Tacit Curriculum. If the title sounds vaguely familiar, that may be because you have seen my post offering some further thoughts on cognitive Santa Claus machines, written not long after I had submitted this paper.

The paper itself delves a bit into the theory and dynamics of genAI, cognition, and education. It draws heavily on how the theory from my last book has evolved, adding a few refinements of its own here and there, most notably its distinction between use-as-purpose and use-as-process. Because genAIs are not tools but cognitive Santa Claus machines, this distinction helps to explain how the use of genAI can simultaneously enhance and diminish learning, both individually and collectively, to varying degrees that range from cognitive apocalypse to cognitive nirvana, depending on what we define learning to be, whose learning we care about, and what kind of learning gets enhanced or diminished. A fair portion of the paper is taken up with explaining why, in a traditional credentials-driven, fixed-outcomes-focused institutional context, generative AI will usually fail to enhance learning and, in many typical learning and institutional designs, may even diminish our individual (and ultimately collective) capacity to learn. As always, it is only the whole assembly that matters, especially the larger structural elements, and genAI can easily short-circuit a few of those, making the whole seem more effective (courses seem to work better, students seem to display better evidence of success) while the things that actually matter get left out of the circuit.

The conclusion describes the broad characteristics of educational paths that will tend to lead towards learning enhancement: first of all, focusing our energies on education’s social role in building and sharing tacit knowledge; then on ways of using genAI to do more than we could do alone; and, underpinning all of this, on expanding our definitions of what “learning” means beyond the narrow confines of “individuals meeting measurable learning outcomes”. The devil is in the detail, and there are certainly other ways to get there than by the broad paths I recommend, but I think that, if we start with the assumption that our students are neither products nor consumers nor vessels for learning outcomes but co-participants in our richly complex, ever-evolving, technologically intertwingled learning communities, we probably won’t go too far wrong.

Abstract:

Every technology we create, from this sentence to the Internet, changes us but, through generative AI (genAI), we can now access a kind of cognitive Santa Claus machine that invents other technologies, so the rate of change is rising exponentially. Educators struggle to maintain a balance between sustaining pre-genAI values and skills and using the new possibilities genAIs offer. This paper provides a conceptual lens for understanding and responding to this tension. It argues that, on the one hand, educators must acknowledge and embrace the changes genAI brings to our extended cognition while, on the other, we must valorize and double down on the tacit curriculum, through which we learn ways of being human in the world.