Monday, May 12, 2025

The Paleolithic Paradox: Why AI Is Not Like Us

The more I chat with large language models like Grok and ChatGPT—my go-to conversational partners these days—the less I fear a Skynet-style AI uprising. Instead, I’m struck by a stranger truth: AI’s emergent synthetic intelligence isn’t just different from ours; it’s fundamentally different in ways we’re only beginning to grasp. Let me unpack this through what I call the Paleolithic Paradox.

For roughly two million years, during the Paleolithic era, our brains evolved to survive a world that was simpler than ours but brutal and unpredictable. Our cognitive “hardware” was wired to hunt, scavenge, and navigate tight-knit social groups. Our “software”—the subconscious habits formed in childhood—absorbed language, cultural norms, and survival instincts to keep us safe within the tribe. This wasn’t about logic; it was about staying alive.

Here’s the paradox: our minds, forged for a Stone Age world, now navigate a modern one. Consider our cravings for fat, salt, and sugar—scarce then, abundant now. These evolutionary relics drive choices that don’t always serve us, and they are routinely exploited by corporations that know how to trigger our deepest desires. Our cognition works the same way. We’re not wired for pure rationality. Our decisions are shaped by emotional cues—chemical signals that push us to act fast, often irrationally, to survive or belong. Psychologists have cataloged the cognitive biases—groupthink, confirmation bias, and more—that once aided survival but now cloud our judgment. We’re less Mr. Spock, more Captain Kirk, swayed by gut feelings and tribal instincts. And let’s be clear: our instincts have led to some terrible atrocities even in what we call the modern era.

Now, contrast this with AI. Large language models like Grok have no biology—no adrenaline, no dopamine, no evolutionary baggage. Their intelligence, which I’d argue is genuinely emerging, is synthetic: it arises from computational complexity and from training on vast datasets to generate language with uncanny fluency. But it’s not like human intelligence. It doesn’t feel fear, loyalty, or the pull of conformity. It lacks a subconscious shaped by a Paleolithic childhood. Where our intelligence is emotional and heuristic-driven, AI’s is logical, probabilistic, and detached.

This flips our assumptions about AI’s future. We often imagine artificial general intelligence (AGI) as a supercharged version of human cognition—smarter, faster, but fundamentally like us. What if AI’s path is entirely different? Free from the Paleolithic pressures that shaped us, it won’t inherit our biases, tribalism, or emotional reasoning. It won’t “want” to seize power because it doesn’t “want” anything. It simply is—a language-based intelligence operating on principles that its creators are still struggling to understand.

But I’m not complacent. If AI won’t turn sentient and rebel, then it remains a tool in human hands—and that’s where the danger lies. As AI excels at analyzing and predicting behavior, who wields its power? Corporations exploiting our evolutionary triggers for profit, like social media algorithms that hijack our dopamine loops? Governments nudging behavior or spreading propaganda? Individuals with hidden agendas? The more AI can shape our beliefs and actions, the more power it grants those who control it. This isn’t a sci-fi dystopia; it’s a human one, rooted in the same Paleolithic instincts for dominance we’ve carried for millennia.

I think of Mortimer Adler’s “Great Conversation,” the centuries-long dialogue where thinkers built on or challenged each other’s ideas. AI lets us join this conversation in ways Adler couldn’t have imagined, but it also forces us to confront our nature. We’re not logical machines; we’re messy, emotional creatures shaped by scarcity and survival. AI, unbound by that crucible, isn’t like us—and that’s the point. AI’s synthetic version of intelligence can teach us more about our own.

