A voice came over the speakers in English teacher Dan Eigenberg’s class last year: “Hey everyone, welcome back. Excited to jump into another deep dive with you.” His students listened as an AI-generated podcast discussed their current text.
Eigenberg said he uses AI as a planning partner for podcast creation and lesson development.
“I’ll kind of talk to ChatGPT, type in and have it help me maybe smooth out some of the rough edges,” Eigenberg said. “It always feels like a conversation.”
Eigenberg said he uses the AI-generated podcast through NotebookLM as a resource for his students.
“I can’t get to every student to give them one-on-one attention, so it works as a really good, almost, like a one-on-one tutor,” Eigenberg said.
According to NASA’s website, artificial intelligence refers to computer systems capable of performing tasks that typically require human reasoning, planning or learning.
Math teacher Kate Gabel, who graduated from the University of Nebraska-Lincoln in May, said she first used AI as a learning tool during college. She said professors showed increased interest in AI during her junior year, when they had students download Colleague AI to assist them in a lesson-planning assignment.
“A lot of teachers actually encouraged it (AI) to help students study,” Gabel said.
That practice with AI carried into her teaching. Recently, she said she asked ChatGPT to generate 10 absolute value equations, arranged in a printable PDF, for station-based activities. The next day, students moved between stations solving each equation.
Junior Gracie Streets said she first noticed teachers using AI this year and wants clear expectations for both students and teachers.
“It is hypocritical if students have restrictions, and the adults that we are supposed to be able to trust do not have restrictions,” Streets said. “[They should] lead by example, because if a lot of students find out teachers are using AI to teach, a lot of students are going to think that it’s OK to use AI.”
Assistant Principal of Curriculum and Instruction Jeff Aman said educators have received district guidance on AI use. While the document remains internal, Aman said educators may use AI for workplace efficiency or instructional purposes but are prohibited from entering personally identifiable information. He said the BVNW blended learning team meets once a week to discuss technology integration.
Aman said technology in general offers new ways to provide feedback and review material.
“It allows us to have multimedia presentations or multimodal ways of learning information,” Aman said. “When I was still a teacher, practice quizzes online with immediate feedback for students was an incredibly powerful tool.”
Beyond quizzes, English teacher Valerie Golden said she used AI to generate an image of a character her AP Literature class was reading about, showcasing AI as a tool.
“My purpose in doing that was to show one of the creative ways that AI can be used to generate ideas and images where it’s not stealing somebody else’s thoughts or opinions,” Golden said.
She said it worked well with the story they read in class.
“Because the character was so clearly described, I could get AI to make a really clear picture,” Golden said.
Senior Chase Allen, a student in Golden’s class, said he appreciates the effort but has concerns with AI use in education.
Allen said he opposes using AI for creative work, arguing it takes opportunities away from artists.
“Every time I see AI, I cringe due to it not only putting artists out of their careers, but it lacks personality, creativity and depth,” Allen said.
He said he would have preferred a student-created illustration or a class activity where students drew the character based on their analysis.
Like Allen, some teachers also emphasize the importance of students doing the work themselves. When Eigenberg began teaching at BVNW in 2014, he said students often turned to CliffsNotes or SparkNotes, but now, with the rise of AI, shortcuts look different.
“I’ve really seen more willingness to cheat—to have AI do the work for you, rather than taking the time to put in the work,” Eigenberg said.
Both Golden and Eigenberg said detecting AI misuse has become part of the job. Eigenberg said he uses multiple AI-detection tools when he is suspicious of student work and follows up with conversations. Golden said she uses a similar process.
“I think that English teachers are pretty good detectors of it (AI) on their own,” Golden said. “I’ve told my students: if I have a hunch, and multiple detectors agree with me, then it becomes a conversation.”
Beyond BVNW, CAPS Technology Solutions instructor Jill Riffer said blocking access to AI across the district could create inequities.
“I worry that when we block access to AI, it becomes an issue,” Riffer said.
Over the summer, Riffer said she took numerous classes on generative AI, including coding-specific courses. She said teachers should be trained to help students use AI appropriately.
“If we teach proper use, we can help students use AI ethically, recognizing the biases [and] the errors in the AI,” Riffer said.
This year, she said she added an AI course in CodeHS, a programming platform used in Blue Valley. She also said she revised her course syllabus to include AI guidelines requiring students to cite all AI tools used. Students must document their prompts, the programs used, AI responses and identify which parts of their work were AI-assisted.
Riffer said detecting AI misuse became easier when she noticed patterns last year. She said students would submit coding assignments beyond their skill level or using programming concepts not yet covered in class.
“Students who are novice coders that magically can code beyond their capabilities and submit that code as their own, or are using concepts that haven’t been introduced yet were obviously using a tool,” Riffer said.
She said she also spotted overly complex solutions, such as basic coding problems solved with 20 lines of code instead of the expected three or four.
To address this issue, Riffer said she incorporated “unplugged activities” this year. These exercises are completed without computers or AI to ensure students understand coding fundamentals before using technological assistance.
“If you only know how to put a prompt into a tool and can never reason why the code works, you’re not going to be able to be successful in this new tech workforce,” Riffer said.
Similarly, Eigenberg said schools should teach students how to use AI responsibly rather than prohibit it.
“We owe it to students to help teach them how to effectively use this technology,” Eigenberg said. “I think we need to figure out with the district and with schools how to safely unlock AI so students can access it as a tool and as a resource.”
He also said AI should support learning rather than replace it.
“I think adults and kids use [AI] instead of thinking and instead of doing the hard work to learn, to grow, to be better,” Eigenberg said. “ChatGPT is helping me do things quicker, not doing things for me.”
Ultimately, he said schools must prepare students to work alongside AI.
“At the end of the day, AI is not going away, and societally speaking, we need to figure out what that looks like and how to raise the next generations to be able to outthink AI, or AI will replace us,” Eigenberg said.
