Monday, June 30, 2025

New Webinar - "Patron Behavior Management: Six Key Decision Points"

Patron Behavior Management:
Six Key Decision Points

Part of the Library 2.0 Service, Safety, and Security Series with Dr. Steve Albrecht

OVERVIEW

When it comes to the wide range of patron behavioral issues, it’s not always easy to know what to do. There is no perfect solution for every situation. It helps to apply some structure to how staff and the PICs (persons in charge) decide how to handle issues that range from fairly simple to highly vexing.

There are Six Decision Points that every library employee, at every level, can use quickly and intuitively to keep the peace. They are:

  • What does my Intuition tell me to do?
  • What does our current Code of Conduct tell me to do?
  • Which of our service, safety, or security Policies apply in this situation?
  • What does the Law require I do? Are there local/municipal, state, or federal laws being broken by the patron?
  • What is our Usual Approach to this situation, as in what does the culture we have created for our library suggest I do?
  • Is my response going to be Reasonable?

LEARNING AGENDA:

  • How to apply these six decision points to almost every patron behavioral concern.
  • How to research what your policies, your Code of Conduct, and applicable laws suggest you do to create a safe and peaceful library for all users.
  • How to train all staff to use these six reminders as their “think fast, on your feet” guidelines.
  • The importance of identifying the elements of the “library culture” for each branch in your system.

DATE: Thursday, July 10th, 2025, 2:00 - 3:00 pm US - Eastern Time

COST:

  • $99/person - includes live attendance, any-time access to the recording and the presentation slides, and a participation certificate.
  • To arrange group discounts (see below), to submit a purchase order, or for any registration difficulties or questions, email admin@library20.com.

TO REGISTER: 

Click HERE to register and pay. You can pay by credit card. You will receive an email within a day with information on how to attend the webinar live and how you can access the permanent webinar recording. If you are paying for someone else to attend, you'll be prompted to send an email to admin@library20.com with the name and email address of the actual attendee.

If you need to be invoiced or pay by check, if you have any trouble registering for a webinar, or if you have any questions, please email admin@library20.com.

NOTE: Please check your spam folder if you don't receive your confirmation email within a day.

SPECIAL GROUP RATES (email admin@library20.com to arrange):

  • Multiple individual log-ins and access from the same organization paid together: $75 each for 3+ registrations, $65 each for 5+ registrations. Unlimited and non-expiring access for those log-ins.
  • The ability to show the webinar (live or recorded) to a group located in the same physical location or in the same virtual meeting from one log-in: $299.
  • Large-scale institutional access for viewing with individual login capability: $499 (hosted either at Library 2.0 or in Niche Academy). Unlimited and non-expiring access for those log-ins.

DR. STEVE ALBRECHT

Since 2000, Dr. Steve Albrecht has trained thousands of library employees in 28+ states, live and online, in service, safety, and security. His programs are fast-paced and entertaining, and they provide tools that can be put to use immediately in the library workspace with all types of patrons.

He has written 27 books, including: Library Security: Better Communication, Safer Facilities (ALA, 2015); The Safe Library: Keeping Users, Staff, and Collections Secure (Rowman & Littlefield, 2023); The Library Leader’s Guide to Human Resources: Keeping it Real, Legal, and Ethical (Rowman & Littlefield, May 2025); and The Library Leader's Guide to Employee Coaching: Building a Performance Culture One Meeting at a Time (Rowman & Littlefield, June 2026).

Steve holds a doctoral degree in Business Administration (D.B.A.), an M.A. in Security Management, a B.A. in English, and a B.S. in Psychology. He is board-certified in HR, security management, employee coaching, and threat assessment.
He lives in Springfield, Missouri, with seven dogs and two cats.

More on The Safe Library at thesafelibrary.com. Follow on X (Twitter) at @thesafelibrary and on YouTube @thesafelibrary. Dr. Albrecht's professional website is drstevealbrecht.com.


OTHER UPCOMING EVENTS:

July 8, 2025

July 11, 2025

July 18, 2025

August 2025

The conference is being postponed until August. More information will be posted here when the date is solidified.

Friday, June 27, 2025

Students and AI Webinar Report

RECORDING:

PRESENTATION FILE:

STUDENTS AND AI.pdf

ADDITIONAL LINKS:

CHAT LOG:

https://drive.google.com/file/d/1s5paMte4iUD3_QEzWCGv3sN2pvdBTDO8/view?usp=sharing

PRE-WEBINAR SURVEY RESULTS:

IN-WEBINAR SURVEY RESULTS:

Grok-Produced Summary of Responses: A Framework for Good Practices for AI Use with Students

This framework synthesizes webinar responses on creating conditions for effective and responsible AI use in education. It is structured around three pillars: promoting positive outcomes, preventing negative outcomes, and fostering responsible AI use.

1. Promoting Positive Outcomes

Leverage AI to enhance engagement, personalize learning, and prepare students for future skills.

  • Increase Engagement:
    • Use AI to create interactive quizzes, games, or creative projects (e.g., presentations, videos, songs) that align with students’ interests.
    • Encourage students to compare AI outputs from different platforms (e.g., ChatGPT vs. Gemini) to foster critical analysis and discussion.
    • Allow students to co-create AI-powered activities or share how they use AI tools, promoting ownership and excitement.
    • Match AI tasks to real-world applications, like composing resumes or pursuing interest-driven research.
  • Enable Personalized Learning:
    • Use AI to tailor content to students’ reading levels, languages, or learning needs (e.g., generating summaries, practice quizzes, or explanations).
    • Encourage students to refine AI prompts to customize outputs, fostering prompt engineering skills.
    • Provide AI as a personal tutor for iterative feedback on writing, math, or other subjects, allowing students to revise and learn at their own pace.
    • Offer flexible activity options and accommodations to support diverse learners.
  • Support Skill Preparation:
    • Teach students to use AI for step-by-step processes (e.g., math problem-solving, research strategies) to build foundational skills.
    • Design tasks that develop communication, collaboration, creativity, and digital literacy through AI use.
    • Use AI to prepare students for tests (e.g., SAT) or create enrichment activities to address skill gaps.
    • Guide students in crafting and evaluating prompts to enhance questioning and problem-solving skills.
  • Provide 24/7 Learning Support:
    • Integrate AI chatbots or tools into learning management systems for instant access to explanations or feedback.
    • Encourage students to use AI to clarify concepts, summarize notes, or identify knowledge gaps at any time.
    • Provide guides for responsible AI use across devices, ensuring accessibility at home and school.
  • Foster Agentic Learning:
    • Involve students in setting learning goals, co-developing success criteria, and choosing how to demonstrate understanding.
    • Encourage independent problem-solving by having students design prompts, propose projects, or map out learning goals.
    • Create a classroom culture that embraces risk-taking, feedback, and reflection on AI use.
  • Enhance Generative Teaching:
    • Use AI to generate lesson ideas, differentiated activities, or formative data analysis to meet diverse student needs.
    • Create tiered tasks or language supports based on students’ levels or accommodations (e.g., IEPs/504 plans).
    • Model lifelong learning by reflecting on AI’s role in teaching and experimenting with prompt formulation.

2. Preventing Negative Outcomes

Address concerns about cheating and the erosion of critical thinking, information literacy, and authentic learning.

  • Mitigate Cheating and Uphold Academic Integrity:
    • Define clear guidelines for when and how AI can be used in assignments, emphasizing transparency (e.g., declare AI use, submit prompts).
    • Focus assessments on the learning process (e.g., reflections, drafts, oral defenses) rather than just the final product.
    • Redesign assignments to prioritize open-ended discussions, in-class writing, or tasks AI cannot easily complete.
    • Teach the value of learning and curiosity, reducing the incentive to cheat by fostering engagement and growth over perfection.
    • Use oral exams, presentations, or process-focused tasks (e.g., annotated bibliographies, scaffolding) to verify student understanding.
  • Preserve Critical Thinking, Reasoning, and Writing Skills:
    • Embed critical thinking across the curriculum by having students analyze, critique, or edit AI-generated outputs for errors or biases.
    • Require students to explain their reasoning, processes, or prompt strategies in writing, orally, or through projects.
    • Incorporate tactile, in-class, or non-AI activities (e.g., handwritten essays, group discussions) to reinforce foundational skills.
    • Teach metacognition and AI literacy, helping students reflect on how AI influences their thinking and how to use it as a tool, not a replacement.
    • Design open-ended assignments that encourage reflection, process writing, or application of learning to new contexts.
  • Strengthen Information Literacy:
    • Teach students to verify AI outputs by cross-checking with reputable sources (e.g., academic journals, library databases).
    • Provide structured lessons on evaluating credibility, bias, and authorship of AI-generated content, including discussions on hallucinations and algorithmic bias.
    • Involve librarians to integrate information literacy across disciplines, teaching students to prompt AI for sources and analyze their validity.
    • Use activities like comparing AI responses to traditional research or tracking AI citations to build critical evaluation skills.
    • Embed AI literacy as part of digital literacy, addressing how AI is trained and its limitations.
  • Ensure Authentic Learning:
    • Design assignments that require personal opinions, creativity, or real-world application (e.g., experiential projects, hands-on tasks).
    • Test assignments to ensure AI cannot easily complete them, focusing on processes like planning, drafts, or reflections.
    • Encourage collaborative learning, group discussions, or peer reviews to emphasize human interaction and idea-sharing.
    • Build time for students to reflect on what they learned, how AI influenced their understanding, and what they still wonder.
    • Align AI use with learning goals, ensuring technology supports, rather than replaces, authentic engagement with content.

3. Fostering Responsible AI Use

Create a culture of transparency, reflection, and ethical AI integration.

  • Promote AI Literacy and Transparency:
    • Educate students on how AI works, its strengths, weaknesses, and potential biases, using real-time demonstrations (e.g., analyzing AI prompts in class).
    • Model responsible AI use by openly acknowledging when and how educators use AI (e.g., drafting syllabi, generating ideas).
    • Encourage students to document their AI use (e.g., submit prompts, compare AI outputs to their work) to build accountability.
  • Encourage Reflective and Ethical Practices:
    • Integrate reflection activities where students assess how AI helped or hindered their learning and how they could improve their prompts.
    • Discuss ethical issues, academic integrity policies, and the societal implications of AI use (e.g., dependency, privacy).
    • Foster a growth mindset by normalizing mistakes and emphasizing learning through struggle, not just AI-generated answers.
  • Balance AI and Human Interaction:
    • Limit over-reliance on AI by incorporating in-class, collaborative, or tactile activities that prioritize human connection.
    • Use AI as a thinking partner or mentor model, encouraging students to paraphrase, critique, or build on AI outputs rather than copying them.
    • Maintain personal engagement with students through discussions, feedback, and conversations to understand their challenges and progress.
  • Support Faculty and Institutional Collaboration:
    • Provide professional development on AI literacy, prompt engineering, and integrating AI into teaching workflows.
    • Collaborate with librarians to embed information literacy and AI literacy into the curriculum.
    • Develop clear institutional policies on AI use, including acceptable behaviors, consequences, and guidelines for assignments.

Implementation Considerations

  • Start Small: Begin with low-stakes AI tasks (e.g., brainstorming, summarizing) to build student and teacher confidence.
  • Iterate and Reflect: Regularly assess how AI impacts learning outcomes and adjust strategies based on student feedback and performance.
  • Ensure Equity: Provide universal access to AI tools and training to avoid disparities in technology access or skills.
  • Model Lifelong Learning: Teachers should experiment with AI, share their learning process, and demonstrate adaptability to new tools.

This framework balances excitement about AI’s potential with proactive strategies to address risks, ensuring students use AI responsibly while developing essential skills for the future.
