Education
08 October 2025

Rider University And Kentucky Schools Launch AI Workshops

Universities and public schools are rolling out new AI initiatives to boost student learning and mental health, while teaching critical skills and ethical awareness.

As artificial intelligence (AI) continues to reshape the educational landscape, schools and universities across the United States are experimenting with new ways to help students harness its power—while also grappling with its limitations and ethical challenges. This October, two very different institutions, Rider University in New Jersey and Bullitt County Public Schools in Kentucky, are at the forefront of this trend, each rolling out initiatives aimed at empowering students to use AI as a tool for learning, collaboration, and even emotional support.

At Rider University, the Moore Library is opening its doors to a series of hands-on workshops designed to teach students from all majors how to "leverage" artificial intelligence in their academic and professional lives. According to a university-wide email cited by The Rider News, the workshops will be held in person over four sessions: two on October 8th, one on October 9th, and another on October 21st, 2025. The sessions cover a range of topics, including using Google's NotebookLM for studying, employing AI tools like Google Gemini and Napkin to enhance group projects, and exploring career development tools such as Google Career Dreamer. Importantly, all these Google Workspace AI features are available to Rider students at no cost.

Dean of the Library Sharon Whitfield, a self-proclaimed avid AI user, is leading the charge. She sees AI not as a replacement for human creativity, but as an assistant—"kind of looking at AI more as what it’s supposed to be, as an assistant that works with you." Whitfield uses AI for tasks that don’t require much creativity, such as proofreading emails, but she’s quick to point out the technology’s current limitations. "AI writes at a seventh-grade level," she cautions, warning that students who rely on AI to fully generate their work risk producing "banal" and unoriginal content. "Programs will not produce writing that is 'creative' or capable of being a 'great argument.'"

Whitfield’s approach is rooted in what she calls a "centaur-based" relationship—a partnership where human input and critical thinking are essential to coax the best results from AI. She encourages students to experiment with prompts, tailoring their questions to suit the intended audience. "I will kind of take my prompt and say from a certain lens, what do you think about this?" she explains, emphasizing the importance of thoughtful user input.

But the workshops aren’t just about technical skills. Whitfield and her team are also keen to address the ethical dimensions of AI. "There’s a bias to AI, and that typically happens with all technologies. So we want them to also think about that," she notes. Another concern is intellectual property: each time a student submits original work to an AI program, that content becomes part of the vast datasets used to train and improve the software, potentially eroding the creator’s ownership. Whitfield hopes the workshops will equip students to "take control over the shaping and use of AI," urging them to "continue to prompt ... to ask more questions to use this tool more effectively."

For students like Trevor Janusas, a junior communications major at Rider, the workshops are a welcome initiative. "There’s so much rampant use of AI within every part of people’s school lives … Why not learn how to make ourselves better from it instead of just completely cheating?" Janusas uses generative AI systems like ChatGPT to create study guides and assist with assignments, but he admits the convenience comes at a cost. "It might be taking me back as a learner because I’m not doing as much of the thinking as I used to do in high school." The library’s goal, according to Whitfield, is to help students reach the "next level"—using AI to enhance notes, develop new perspectives, and stay critical of the technology’s outputs.

Meanwhile, in Bullitt County, Kentucky, another AI-driven educational experiment is unfolding, but with a different focus: student well-being. According to reporting from the Courier Journal, Bullitt County Public Schools has begun implementing an AI chatbot app, co-designed by high school students and software developers, to support peers facing mental health and academic challenges. On a recent September morning, students peppered a developer with questions about the app during a school presentation, underscoring their engagement with the project.

The chatbot initiative is part of a broader effort to make mental health resources more accessible, especially for students who might be hesitant to seek help in person. The app is designed to offer guidance, answer questions, and direct users to appropriate support services—including the 24/7 988 Suicide & Crisis Lifeline (call or text 988) and online chat resources at suicidepreventionlifeline.org/chat. The hope is that AI technology can provide a discreet, always-available first line of support for students in need, bridging gaps in traditional counseling services.

While the Bullitt County program is still in its early stages, it reflects a growing recognition of AI’s potential to address not only academic but also emotional and psychological needs. The initiative is careful to supplement, rather than replace, human intervention—acknowledging that technology can’t fully replicate the empathy and nuance of a trained counselor or teacher, but can serve as a valuable tool in the broader support network.

Both the Rider University and Bullitt County initiatives highlight the importance of teaching students not just how to use AI, but how to use it wisely. Whether it’s learning to craft effective prompts, understanding the risks of over-reliance, or recognizing the ethical and privacy implications of sharing information with AI systems, educators are increasingly focused on fostering digital literacy alongside technical proficiency.

As AI becomes more deeply embedded in everyday life, these programs offer a glimpse into the future of education—one where technology is neither villain nor panacea, but a partner in the ongoing quest for knowledge, creativity, and well-being. The challenge, as both Whitfield and the Bullitt County educators acknowledge, is to strike the right balance: leveraging AI’s strengths while remaining vigilant about its shortcomings and potential pitfalls.

For now, the message to students is clear: AI is here to stay, but how it shapes learning and support will depend on the questions we ask, the boundaries we set, and the values we prioritize. As these pioneering programs show, the next chapter in education will be written not by machines alone, but by the humans who guide them.