Monday, March 23, 2026

New Webinar: "Human-Centered AI Use in a Machine-Centered World"


Human-Centered AI Use in a Machine-Centered World
A Library 2.0 / Learning Revolution Workshop with Reed Hepler

OVERVIEW

In an era where artificial intelligence increasingly shapes how we think, work, and relate to one another, this 90-minute workshop asks a fundamental question: How do we remain fully human while engaging with increasingly powerful machines? Drawing on the prophetic insights of Neil Postman's Technopoly and Joseph Weizenbaum's Computer Power and Human Reason, this workshop challenges the prevailing narrative that AI adoption is inevitable, neutral, and inherently progressive. Instead, we explore how to use AI tools deliberately, ethically, and in service of human flourishing—not merely technological efficiency.

This workshop is designed for educators, professionals, and thoughtful technology users who sense that something essential is at risk in our rush toward automation. We examine how machine-centered thinking—where speed, scale, and optimization dominate—threatens to eclipse human-centered values like contemplation, nuance, privacy, and authentic relationships. Participants will develop frameworks for critical resistance: not rejecting AI wholesale, but using it selectively and intentionally while safeguarding the irreducibly human elements of knowledge work, creativity, and ethical judgment.

The session synthesizes insights from multiple domains: the philosophical critique of technological determinism, practical frameworks for evaluating AI-generated content, strategies for deliberately safeguarding privacy in AI-pervaded environments, and ethical principles for navigating the tension between efficiency and integrity. Through discussion and collaborative application, participants will move from abstract concern to concrete practice—developing personal and institutional approaches that center human agency, dignity, and wisdom.

By engaging with Postman's warning that we risk becoming "a culture without a moral foundation" and Weizenbaum's insistence that "there are certain tasks which computers ought not be made to do," participants will develop a philosophical foundation for their AI practices. This foundation supports practical skills: evaluating AI content for evidence of human reasoning, implementing privacy-protective workflows, and creating ethical guidelines that prioritize human values over technological capabilities. The result is a coherent approach to AI that neither demonizes the technology nor surrenders to its logic—but instead places it firmly in service of human purposes, under human control, and subject to human judgment.

LEARNING OBJECTIVES

By the end of this workshop, participants will be able to:

  • Analyze how machine-centered thinking shapes institutional and personal AI adoption, and identify alternatives grounded in human-centered values
  • Evaluate AI-generated content not merely for accuracy but for evidence of human reasoning, ethical consideration, and authentic intellectual engagement
  • Apply Postman's and Weizenbaum's critiques of technological determinism to contemporary AI challenges in education, work, and civic life
  • Implement privacy-protective practices when using AI tools, understanding both technical vulnerabilities and philosophical implications of data exposure
  • Articulate ethical frameworks for deciding when AI use serves human flourishing and when it undermines essential human capacities

The recording and presentation slides will be available to all who register. 

DATE: Tuesday, March 31st, 2026, 2:00 - 3:30 pm US - Eastern Time

COST:

  • $129/person - includes live attendance, any-time access to the recording and the presentation slides, and a participation certificate. To arrange group discounts (see below), to submit a purchase order, or for any registration difficulties or questions, email admin@library20.com.

TO REGISTER: 

Click HERE to register and pay. You can pay by credit card. You will receive an email within a day with information on how to attend the webinar live and how you can access the permanent webinar recording. If you are paying for someone else to attend, you'll be prompted to send an email to admin@library20.com with the name and email address of the actual attendee.

If you need to be invoiced or pay by check, if you have any trouble registering for a webinar, or if you have any questions, please email admin@library20.com.

NOTE: Please check your spam folder if you don't receive your confirmation email within a day.

SPECIAL GROUP RATES (email admin@library20.com to arrange):

  • Multiple individual log-ins and access from the same organization paid together: $99 each for 3+ registrations, $75 each for 5+ registrations. Unlimited and non-expiring access for those log-ins.
  • The ability to show the webinar (live or recorded) to a group located in the same physical location or in the same virtual meeting from one log-in: $399.
  • Large-scale institutional access for viewing with individual login capability: $599 (hosted either at Learning Revolution or in Niche Academy). Unlimited and non-expiring access for those log-ins.

REED C. HEPLER

Reed Hepler is a digital initiatives librarian, instructional designer, copyright agent, artificial intelligence practitioner and consultant, and PhD student at Idaho State University. He earned a Master's Degree in Instructional Design and Educational Technology from Idaho State University in 2025. In 2022, he obtained a Master's Degree in Library and Information Science, with emphases in Archives Management and Digital Curation, from Indiana University. He has worked at nonprofits, corporations, and educational institutions encouraging information literacy and effective education. Combining all of these degrees and experiences, Reed strives to promote ethical librarianship and educational initiatives.

Currently, Reed works as a Digital Initiatives Librarian at a college in Idaho and also has his own consulting firm, heplerconsulting.com. His views and projects can be seen on his LinkedIn page or his blog, CollaborAItion, on Substack. Contact him at reed.hepler@gmail.com for more information.
 
OTHER UPCOMING EVENTS:

March 24, 2026

March 26, 2026

April 3, 2026

April 7, 2026

April 9, 2026

April 10, 2026

April 15, 2026

April 24, 2026

April 28, 2026

April 30, 2026

May 1, 2026

May 8, 2026

May 22, 2026

Sunday, March 22, 2026

Undervaluing Librarians

I've been thinking about why libraries, and especially school libraries, declined at the exact moment information became the defining challenge of our time. I don't have a tidy answer. But I want to try out a reading of the situation that I think holds some explanatory power, and that might tell us something uncomfortable about what's coming next.

The surface-level story is simple enough. The internet made information abundant, which made libraries seem redundant. Budgets tightened. Positions were cut. School librarians were hit hardest, sometimes the first professionals eliminated when districts needed savings. That's the version most of us know.

But sit with the irony for a moment. The explosion of freely available information, much of it unreliable, much of it deliberately misleading, should have been the librarian's greatest moment. Here was a world suddenly drowning in information and desperate for the skills librarians as information specialists had spent decades developing: how to evaluate sources, how to distinguish credible from questionable, how to navigate complex information systems with a critical eye. The need didn't diminish. It intensified. But the profession shrank, both in status and membership.

I think the explanation lies in a gap between two stories that we have been telling simultaneously for a long time, and the fact that almost nobody noticed they were different stories.

Two Stories

The story I’ve heard librarians tell, especially school librarians, went something like this: we help people become independent learners. We give students access to information outside the mandated curriculum. We create space for curiosity and self-directed inquiry. In a building organized around compliance and standardized outcomes, the library was the one room where a student could, at least in theory, follow a question wherever it led.

That story was and is true. The good school librarians who remain genuinely have been the one adult in the building whose job description was compatible with curiosity.

The story the school has told is different. It has gone like this: we have books.

That's it. And current library controversies are about which books they do or don’t have. The institutional justification for the library, I think it’s fair to say, has never been the intellectual function the librarian performed. It was the physical resource the library contained. The school basically saw inventory. A countable collection, a physical space, a line item that could be measured and, when necessary, cut.

I’m guessing that the librarians believed (or wanted to believe) that the institution shared their story. They thought when they said "we teach information literacy and support independent learning," the people making budget decisions heard the same thing. I don’t think they did. They heard "we house books." So the moment the books became unnecessary, the institutional justification evaporated. I haven’t been a principal, or a school board member, or even a librarian, but I think it’s fair to say that the librarian's actual value, helping a person navigate information independently and critically, had no line item for most schools. It was never what the school was actually purchasing.

The Pivot That Didn’t Land

This explains something that always puzzled me about the library profession's response to the internet. From what I saw, librarians tried to pivot. They genuinely did. They talked about information literacy, digital citizenship, and media literacy. They made the capability argument with real passion and real expertise.

But they were making a capability argument to an institution that could only understand resource arguments. A budget line should have been defensible with "I teach students to think critically about what they read." But I think that argument didn't work in an era of increasingly mandated curricula. Instead the line got defended with "we have 14,000 volumes and a computer lab." When the volumes became irrelevant and the computer lab moved into every student's pocket, the argument collapsed, not because the capability wasn't needed, but because the institution was never organized around it.

And then came the makerspaces.

I want to be careful here because I know many librarians who built wonderful makerspaces and did genuinely creative work with them. But I have always thought that the makerspace movement in libraries, when you looked at it honestly, was a survival strategy dressed up as innovation. 3D printers, laser cutters, robotics kits: these are wonderful things. But they are not information science; they are more aligned with the vocational arts (which were also disappearing). The presence of makerspaces in a library seemed like an unconscious confession: we can no longer justify this space with our actual expertise, so we are filling it with something the institution will fund.

In this interpretation, it was, painfully, a return to the original institutional logic. We have stuff. Just different stuff. The librarian stopped arguing "you need what I know" and started arguing "you need this room and this stuff." Which worked, in some cases, for a while. But it also completed the abandonment of the very claim that made the profession distinctive.

The Information Ecosystem Turns Adversarial

Here is where I think the librarian's story stops being a professional tragedy and starts being a civilizational warning. I know, I’ve switched to a pretty big canvas.

The internet didn't just make information abundant. It made information commercial. Google's original mission was to organize the world's information. That might be a librarian's mission statement, almost word for word. But something happens to idealistic missions when they become embedded in business models, and it happens reliably enough that I've come to think of it as a kind of law: any system that can be exploited for profit eventually will be, and the exploitation will be proportional to the system's size, scope, and reach.

Search results became ad delivery mechanisms. Ranking algorithms optimized for engagement, not accuracy. The information environment didn't just grow larger; it arguably grew adversarial. The system was no longer trying to help you find what you needed. It was trying to keep you in the ecosystem. That was round one.

AI is round two (or twenty, depending on how you want to count all the technology in between), and it's worse. Large language models aren't just delivering information shaped by advertising incentives. They're generating information shaped by whatever the model's ecosystem rewards. Right now, the AI companies are in their idealistic phase. They talk about helpfulness, truthfulness, and making knowledge accessible to everyone. The mission statements read like library charters.

And here is the parallel that keeps me up at night: those idealistic stories are true. Just as the librarian's story was true. The best people at AI companies surely believe in expanding access to knowledge, just as the best librarians genuinely believed in fostering independent inquiry. The truth of the story is not the problem. The problem is that truth is not what decisions get made on.

The business model will assert itself. The pressure to keep users engaged, to serve partner interests, to optimize for retention and revenue over accuracy and independence, all of that is coming. It isn't cynicism to say so. It's pattern recognition. It's watching what happened to search, to social media, to every information system that started with an idealistic mission and ended up governed by the logic of its business model. The idealistic narrative will survive as long as it's useful for growth. The moment it conflicts with profitability, it will be rewritten.

What the Librarian's Story Tells Us

So here is what I think the decline of the librarian, and the real decline in library use and relevance, actually reveal, if we're willing to look at them clearly.

We had a profession whose members felt a duty to the information consumer. Not to a publisher, not to an advertiser, not to a shareholder. The librarian's institutional obligation was to the person asking the question. That kind of alignment is vanishingly rare in the information ecosystem now. The people building AI products have obligations to investors and growth metrics. The people consuming AI outputs largely have no trained intermediary helping them understand what they're actually receiving.

And many librarians, especially school librarians, lost twice. Many have lost their institutional home because the institution only ever valued the container, not the function. And then they lost their story. The language of intellectual empowerment, of democratized access, of helping people find and evaluate information, that language now belongs to companies governed by dynamics that will, over time, subordinate everything to the demands of the business model.

I want to be clear that I'm not offering this as a definitive history. I'm offering it as a lens, a way of making sense of something that has always struck me as deeply strange: that we dismantled the one profession structurally aligned with the needs of the information patron, right before the information ecosystem became structurally aligned against them.

If that reading holds any truth, then the librarian's story isn't just an institutional casualty. It's a preview. And the question it leaves us with is the one that matters: if the profession built around serving the information needs of individuals couldn't survive the institution it was embedded in, what makes us think the idealistic promises of AI companies will survive theirs?