
The Latest Findings in AI and Learning – February 2026

Policy debates are heating up as the AI race enters a new phase in which the top spot once again seems to be up for grabs. The popularity of Claude Code has forced OpenAI to pivot its focus, and Gemini 3 (which has lately upended the hierarchy of AI providers) is adding features specifically designed to help you migrate from other services to Google's. All of this competitive activity continues to unfold in fits and starts as schools and workplaces grapple with the ethical, logistical, and financial ramifications of adopting AI. With that context in mind, here are the latest updates on how these changes in AI are impacting education:

Rapid Adoption and Professional Evolution

The integration of AI in classrooms is accelerating at a remarkable pace, with the percentage of teachers using AI-driven tools nearly doubling between 2023 and 2025. Only 34% of teachers reported using AI in 2023, yet by 2025 half of all teachers had participated in at least one professional development session on the technology. This surge is largely driven by emerging AI feature sets in ubiquitous educational platforms like Canva, Google, and Khan Academy, as well as the technology's high value for labor-intensive tasks such as lesson planning, differentiation, and providing feedback. Educators increasingly view AI as a personal assistant that can save nearly six hours of work per week – equivalent to about six weeks over a full school year.
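For readers who want to check that math, here is a quick back-of-the-envelope sketch; the roughly 36-week school year and 40-hour work week are our own assumptions rather than figures from the survey:

```python
# Back-of-the-envelope check of the "six hours per week ≈ six weeks per year" claim.
# The 36-week school year and 40-hour work week are assumptions, not survey figures.

hours_saved_per_week = 6
weeks_in_school_year = 36   # assumed length of a typical U.S. school year
hours_in_work_week = 40     # assumed full-time work week

total_hours_saved = hours_saved_per_week * weeks_in_school_year  # 216 hours
work_weeks_saved = total_hours_saved / hours_in_work_week        # 5.4 work weeks

print(f"{total_hours_saved} hours saved ≈ {work_weeks_saved:.1f} work weeks per school year")
```

Under those assumptions the savings come out to roughly five and a half full-time work weeks, which is consistent with the "about six weeks" framing.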

Balancing Innovation with Cognitive and Emotional Risks

Despite these efficiencies, a 2026 report from Brookings warns that the risks of generative AI currently overshadow its benefits. A primary concern is cognitive offloading, a phenomenon in which students lean so heavily on AI that their critical thinking, creativity, and content knowledge may decline. Emotional development is also under scrutiny: nearly 1 in 5 high schoolers report having had a romantic relationship with an AI or knowing someone who has, and 42% have used AI for companionship. The report also warns that the sycophantic nature of AI could stunt the development of empathy and resilience.

Policy and Equity in the Classroom

The legislative landscape remains fractured. While there is bipartisan consensus on risks such as student data security and overreliance, federal regulation is a point of contention. Recent executive orders have complicated state-level oversight, raising concerns that the absence of clear national standards could exacerbate existing divides. Financial barriers further threaten equity: underfunded districts may be forced to rely on free, less accurate AI models while wealthier schools can afford advanced, more reliable versions. In response, educational organizations are calling for educator-led advisory committees to oversee AI procurement and ensure that human connection remains central to the learning experience.


As we look toward the future of learning, the prevailing sentiment is one of careful introduction. While AI provides undeniable superpowers for administrative efficiency and personalized learning, it is not a replacement for human thought or the emotional intelligence fostered in a physical classroom. The axe we will continue to grind at Filament is that the path forward requires a focus on holistic AI literacy for both students and staff, ensuring they can navigate the technology’s biases while maintaining their own authorship and critical perspective. Ultimately, the success of AI in education will depend not on the tools themselves, but on the human oversight that guides their ethical and pedagogical application. Looking to implement AI in your own portfolio? Let’s talk.
