Our April roundup focused on where the lines are drawn, and May is showing us exactly how those boundaries look in practice. Educators are actively defining acceptable use policies, and students are stepping up to build their own ethical frameworks. Major technology providers are adding specific features to their platforms to address concrete student needs, moving the conversation past theoretical debate and straight into action. Read on for the details!
A recent commentary in EdSource highlights the risks of rushing generative artificial intelligence into classrooms without proper guardrails. USC professor Morgan Polikoff notes that 20% of students’ in-school AI interactions involve troubling behaviors like cheating or bullying, and he argues that shortcutting the struggle to master new skills will have negative long-term consequences for individuals and society. To counter this trend, Polikoff suggests that state policies should shift more assessments into the classroom and that teachers should eliminate busywork assignments that are easily offloaded to algorithms.
Reporting from Education Week showcases a unique approach to technology adoption at Percy Julian Middle School in Illinois, where a group of eighth graders spent a year testing Google’s Gemini and presented their findings directly to their teachers. These students recognized the software’s limits, and they emphasized that AI cannot replace the human connection of a supportive educator who makes them feel important and loved. By involving students in the decision-making process, the school fostered a nuanced discussion about ethics, and students gained a deeper understanding of acceptable use.
A feature in NEA Today explores how future teachers are approaching these new tools as they prepare to enter the workforce. College students studying education see the software’s utility for brainstorming lesson plans, and they appreciate its ability to translate materials for multilingual learners. At the same time, they maintain strong reservations about the programs’ reliability: future teachers warn about models generating fabricated information, and they raise valid privacy concerns about using AI tools to write individualized education programs.
El Paso Matters details how local school districts are actively drafting official guidelines for AI usage in their classrooms. The Socorro Independent School District adopted a policy requiring students to get teacher permission before using these tools, and the district is currently finalizing a comprehensive handbook. Meanwhile, the Ysleta Independent School District requires students to take digital citizenship courses that cover basic literacy and responsible use. These structured approaches aim to keep students the primary thinkers while engaging with new software, and they keep the focus on learning rather than shortcuts.
Google announced several updates to its education tools in a recent Keyword blog post. The company expanded NotebookLM, doubling the limits on sources and notebooks for users with specific education add-ons. Gemini is now an official AI provider for the Moodle learning management system, enabling features like text summarization directly within the platform. Google is also rolling out a feature that lets graduating students migrate their school-issued Google Photos to a personal account, so their photo libraries stay with them after they leave an organization.
A separate post from Google’s Keyword blog highlights the widespread adoption of the Google AI for Education Accelerator. In less than a year, over 400 higher education institutions across all 50 states have joined the program to help students and faculty build job-ready skills. The Texas A&M University System hosted an AI Learnathon to train staff, while students at the University of Virginia are applying their new skills to help local small businesses. These practical applications demonstrate how universities are equipping students for the workforce, and they show a clear commitment to bringing these tools into everyday academic life.
Taken together, these stories point to a grounded phase of AI in the classroom. The conversation is moving away from basic exploration and toward concrete decision-making. When should students use these tools? How do we build practical guidelines to keep them safe? Those questions are starting to shape both classroom practice and administrative agendas. Looking to build your own AI-enabled learning experiences that hold up in real classrooms? Let’s talk.