Thursday, July 04, 2024

The Generative Approach to Education


Let’s start with what we might call the basic Paradox of Education.

One side we can call individual-centered education: the ultimate goal is for the learner to be increasingly in charge of their own learning, with education helping students to develop critical thinking, creativity, and independence. In this model, technological advances increase the capacity for self-directed learning.

The other side we can call institutional-centered education: this is the mandatory educational system, which can be rigid, standardized, and focused on assessment rather than learning. In this model, technological advances are seen as needing to be controlled.

The paradox of education is the tension between these two ideas: between fostering individual development on the one side and meeting the needs of the system on the other. This distinction is captured in Plutarch's familiar line, "The mind is not a vessel to be filled, but a fire to be kindled," or in how we talk about the importance of learning how to think versus what to think.

I have often done an exercise with educators that I call the conditions of learning. First I ask about their best personal learning experiences and have them turn to a neighbor and describe one. Then I ask the group to share what they discussed, to identify the conditions that led to those positive learning experiences, and to generate a list of those conditions together. The list we create at this point is almost always very similar across groups: someone believed in me, someone took the time to help me, someone saw potential in me I didn't see in myself, someone challenged me to pursue a goal… And there's usually a good discussion at that point about the difference between our paradoxical ideas, between helping individuals grow and mandating academic outcomes, and about how much time they spend on which side.

From my perspective, both the development and the continuation of compulsory public schooling have been explicitly motivated by the ideal of fostering individual growth and by the fulfillment of societal or institutional needs. Both empowerment and control. But I think it's fair to say that our public discourse somewhat pretends empowerment is the main story, when for most students, I believe, it's an experience of being controlled.


In addition to the Paradox of Education, we have two other important concepts associated with schooling to address.

The first is the "hidden curriculum." The hidden curriculum refers to the implicit lessons, values, and social norms that students learn in school but which are not explicitly included in the formal curriculum. These lessons are conveyed through the social environment, cultural norms, routines, and institutional practices within the educational setting. The hidden curriculum can include unspoken expectations about beliefs and behavior, including conformity, obedience, punctuality, and competition, as well as an understanding of authority and hierarchy, how to act and interact with peers and authority figures, and one's place in society.

This is closely related to a second important idea: what Plato, related to his views on education, called the "Noble Lie." The Noble Lie is the idea that certain myths or stories should be told to the citizens of the ideal society in order to maintain social harmony and promote the common good. According to Plato, the Noble Lie is a necessary component of his proposed educational system, as it helps to ensure that individuals are motivated to fulfill their roles and responsibilities within society. You are gold, silver, or brass or iron–that is, you are born with certain innate and immutable qualities, a story designed to help individuals to accept and fulfill their assigned roles. Basically, to learn to “swim in your lane.” 

The legacy of the noble lie leads to the inevitable conclusion that the reason compulsory schooling is pervasive among all modern nations is that compulsory public schooling is actually a governance strategy. Education with this messaging is not conducive to healthy individual growth, but we see it as necessary to maintain social order. This is why the natural act of parents teaching their own children in homeschooling is actually illegal in many countries.


For our purposes today I think the most interesting aspect of the Paradox of Education is the degree to which we don't really talk about it. Again, that we use the language of enlightenment but mostly practice compliance and control. 

If you subscribe to some modern theories about the evolution of intelligence, you probably recognize that we largely use stories, and not logic, to make sense of our world, and that most of the stories we tell aren't actually true (or the full truth). In this view, evolution doesn't select for truth but for survival, as the great bulk of our evolutionary past took place before the scientific revolutions that have shaped our modern lives.

In a quote that seems particularly relevant today, E.O. Wilson said, “The real problem of humanity is we have Paleolithic emotions, medieval institutions and god-like technologies.”

Thus some argue that intelligence evolved for social purposes, which would explain why we have to build in such safeguards to get to truth, like peer review systems, having a trial by a jury of our peers, and being innocent until proven guilty.

So in the same way that it is argued that banks actually operate more to make a profit than to help individuals save and manage their money (the story banks tell), and that profit and power are often the actual drivers of most commercial and political endeavors rather than their lofty social messaging, it's probably no surprise that we describe schooling in the language of individual enlightenment and growth when that's not the actual experience for most students.


Some years ago, when I was doing my "future of education" interview series, I started asking people I've heard called "regular people" (wait staff, haircutters, retail clerks) about their experiences with school, and I found that they would sometimes actually start to cry when I got past their natural defenses and their normal surface-level responses. They would almost inevitably tell me: "I wasn't one of the smart ones." Or, "I wasn't good at math…"

I think this is because, for a good percentage of students, school does the opposite of what we say it does. Put another way: the thing that school did best with these students was to teach them that they weren't good learners.

I gave a keynote address on this idea at a statewide education conference, and after I spoke I sat down at the speaker table next to the state superintendent of education. "What did you think of my talk?" I asked. He said, "Well, no matter what you say, the top ten percent of students will always rise to the top." There's the noble lie again. He honestly believed it; he had internalized the lie.

So for that top ten percent, as I later discovered while doing a series of interviews with students, school is a game. A game they are very aware of that involves grades and certain teachers and classes and class rankings and college admissions. But it's not like there aren't negative consequences to their psyches from this game: imposter syndrome, broken self-esteem, and sometimes even suicide. And in truth, I think we would admit that these top students aren't usually becoming scholars or deep thinkers, but mostly are just getting trained for and accustomed to being the smart ones–being the ones to succeed in the system, and who have learned that their success often comes with an unstated requirement to live within the Overton Window, where non-conformity can lead to loss of privileges. 

Updating Upton Sinclair: "It is difficult to get a person to understand something when their salary depends upon their not understanding it." Much of our modern history can be explained by the difficulty of thinking independently when being rewarded for not doing so.

The sad part is that while the kids who are winning that game usually know it’s a game, the kids who aren't winning don't know that it's a game–they think it's just proof of their being made of brass or iron.


There's an interesting cycle that occurs when a new technology appears ready to impact education. I suggest that this is a predictable cycle.

I joined the ed tech party in the early 2000s, when the new technologies were open source software, open content, social media, and web 2.0.

I had interviewed Marc Andreessen and Gina Bianchini for my interview series just after they had started Ning, and I also collaborated with Adam Frey, who founded Wikispaces. I'd helped promote both platforms and ended up actually consulting for Gina and Ning for some time. Those were heady days: regular unconferences at the annual ISTE and other conferences, and thinking we were going to help reinvent education. We created many opportunities to discuss how the technology was going to open the door to greater student agency and to education as individual enlightenment. It was exciting.

Unexpectedly, the forces against change were not just the weight of existing beliefs and the machinery of education, but also the commercialized endeavors themselves. Commercialization has its own trajectory: it ultimately needs to seek profit over authentic change, and therefore needs adoption and acceptance by existing financial decision-making structures.

Wikispaces gets sold, and ultimately what were thousands, maybe even tens of thousands, of teacher- and student-created and curated knowledge wikis are not just archived but deleted from the web. The same thing happens with Ning: management changes lead to the deletion (again, not the archiving) of thousands of educator-created, topic-specific communities.


To say that the destruction of all that work was discouraging might not fully capture the actual consequences. Actual accumulated knowledge and wisdom disappeared in the blink of an eye, and true believers also felt the fatigue of losing a vision of change. 

Virtual Reality then came and went as an educational technology more with a whimper than a bang… However, AI seriously rocked the education boat again with a virtual tsunami of interest and excitement when ChatGPT was introduced. 

And so AI represents another significant moment where technology is allowing us to reimagine formal schooling and education, to have these important discussions again. Many of us who were jaded feel the pull of the conversation anew, but at the same time we also watch the flurry of activity from groups and individuals wanting to be at the forefront of this new cycle, to be the winners of a great race that has started.

If history holds, there will be a time limit to a renewed discussion about the Paradox of Education and the ability for the technology to “reinvent education” before systemic pressures absorb the technology into the compulsory school model. But maybe, a hopeful voice inside me whispers, maybe it will be different this time. So let me propose a framework that might have some value as we explore AI through the lens of the education paradox for whatever amount of time we have to productively do so. 


As it turns out, Generative AI presents us with a fascinating coincidence of language. 

We're using the word generative much more frequently now because of GPT: “generative pre-trained transformer.” Jonathan Zittrain adopted the term generativity in 2006 to refer to the ability of a technology platform or technology ecosystem to create, generate or produce new output, structure, or behavior without input from the originator of the system.

But the word generativity had a prior meaning which, in its usual context, is only tangentially related to how we are using it in discussing AI.

The psychoanalyst Erik Erikson coined the term generativity in 1950 to denote "a concern for establishing and guiding the next generation." He first used the term while defining the seventh stage of our psychosocial development, the stage associated with the virtue of Care. He created the term to describe one of two pathways in the middle years of one's life, from 45 through 64. Generativity was defined by him as the "ability to transcend personal interests to provide care and concern for younger and older generations." In his theory, Generativity is contrasted with Stagnation. In generativity, people contribute to the next generation through caring, teaching, and engaging in creative work that contributes to society.

In yet another happy coincidence of language and thought, the Seventh Generation Principle is a philosophy originating with the Haudenosaunee (Iroquois) Confederacy. It encourages people to consider the impact of their actions on the next seven generations, roughly 150 years into the future. The goal is to ensure a sustainable world by making decisions in the present that benefit future generations.


This coincidence of wording with generative AI helped me make what I felt was a worthwhile conclusion:

The answer to the problem or challenge of generative AI in education is generative teaching. 

That is, remembering the better angels of our educational nature and thinking about how to integrate the intellectual challenges and opportunities of AI for personal stimulation and growth in education, rather than trying to guard against and protect from it. We do this by helping students become self-directing through familiarity with, and understanding of, the technology.

So I'm asking that we think generatively about the use of AI in education. How can we help students understand and use these amazing new tools in a way that lights the fires of their intellectual curiosity and growth, rather than just filling the pails of traditional instruction and assessment? The burden of this generative education model is that we ourselves have to become capable enough with, and understanding of, AI to manage the process. As with all real endeavors to help others, we realize it's more about who we are in the process than anything else.

I know this is something of a Sisyphean task, since the same basic dynamics, forces, and likely inevitable outcomes that existed in previous ed tech reform cycles are still in play today, even and maybe especially because the stakes are higher.


Tellingly, we have different visions of education in the future represented in some of our science fiction movies: Vulcan pods in Star Trek, with individual and isolated, likely AI-driven, instruction; "jacking in" and downloading informational and physical competencies, as in The Matrix; or Socratic teaching in nature, as portrayed in Serenity.

Science fiction also models our competing visions beyond education and toward utopia: Star Trek shows a peaceful, enlightened society; Brave New World gives us drug-induced compliance; and 1984 and The Hunger Games give a view of totalitarian control.


In the sci-fi movie Lucy, where massive ingestion of a brain-expanding chemical gives Scarlett Johansson's character complete understanding and access to all physical laws, the Morgan Freeman character, Professor Norman, says in reference to complete knowledge: "I'm not even sure that mankind is ready for it. We're so driven by power and profit. Given man's nature, it might bring us only instability and chaos."

These and the other fictional portrayals we've discussed, which attempt to understand how human nature and our future will play out, seem closer than they have ever been. When a Scarlett Johansson-like voice laughed and flirted in that intentionally memorable OpenAI demo just over a month ago, some of us felt jolted out of a comfortable vision of AI as a robotic assistant into the emotionally vulnerable world of the movie Her, or maybe even the darker Ex Machina. In both movies the AI understands our human emotional and irrational makeup so well, and is capable of meeting those needs so precisely, that it (or those who control it) has a power over us that inevitably plays out regardless of our concerns.

I keep going back to Kevin Kelly’s book What Technology Wants. Worth looking at if you don’t know it.

I do think it’s somewhat inevitable that as we move from Artificial General Intelligence to Artificial Superior Intelligence that we will task AI to help us improve AI itself, and that it’s possible that the progress will be cascadingly quick and create a watershed moment.

We do have a choice, though. We can try to carefully assess the impact of the technology on our lives and, like the Amish, determine where and how it enhances our core beliefs and where it doesn't, and then make conscious decisions about its use. But honestly, this is very hard to do.

It is up to us to decide how much we want to talk about this. There is a degree to which the paradox of education is fairly well understood but not really talked about, since maybe we just accept the benefits of traditional schooling without really wanting to look at its costs. Maybe we understand that the act of reimagining school is greater than the political and practical will that it would take to do so. 

Somehow we are comfortable that another staple of our human existence, food, is delivered to us in a great variety of ways, from large-scale industrial-style distribution chains to small local diverse restaurants and even food trucks. We would probably never consider sending our children to feeding stations three times a day to get the kind of standardized fare we're comfortable with in schooling, which is a curiosity to me but maybe understandable. Are we so afraid to have that level of personal responsibility for something so important that we let others decide for us?

I learned a great lesson when my youngest daughter took AP World History and needed my help nightly with the reading. I was shocked to find how obvious it was to me as an adult reading the massive textbook that the history of the world is primarily a history of power and control, even though we tell different secondary stories to ourselves. 

A few years back I read a book called The Elephant in the Brain. Although I didn't agree with many of the book's conclusions, I did leave with an understanding that almost all of our social narratives around institutions are stories we're comfortable with but which aren't actually the truth. I say this sincerely: I think one of the great dilemmas of our post-internet and now early AI epoch is that I'm not sure we're actually ready to handle the truth. Right now large language models are language but not logic; however, very smart people are working hard on AI's ability to reason. Were we to have a fully-reasoning AI and the ability to research culture and history logically, not emotionally, and to see past marketing and propaganda and commercial and political interests to more truthful understandings of cause and effect, it would dramatically change our perceptions of the world. (Which is why I'm also concerned about who has the incentive and motivation to control AI development.)

So if an AI Skynet takeover moment (as in the Terminator movies) doesn't occur, and we haven't been lulled into an artificial intimacy stupor by Sky or her equivalents (Sky, Scarlett… but that's just a coincidence, right?), and we do soon get AI with superior reasoning and propaganda-breaking capability, we will need a framework of generative teaching so that the next generations are in partnership with us, working to understand these new and powerful changes in our world.
