AI is headed back to school

Artificial intelligence is everywhere, from Google search results to social media—and now it’s showing up in classrooms. How is AI changing the way students learn in 2025, and what does it mean for teaching? CU Boulder Today spoke with Jeremiah Contreras, associate teaching professor of accounting at the Leeds School of Business, about how AI is being integrated into courses, how students can use it responsibly and what skills will matter most in an AI-driven world.

Jeremiah Contreras

AI was the story of 2023 and 2024. How is it showing up in classrooms as students head back to school in 2025?

In 2025, AI is no longer a novelty—it’s becoming a core part of the educational experience. While students have been using it on their own at increasing rates, at Leeds every student in our Business Core courses now engages with AI in some form, whether using generative AI to analyze a case study, brainstorming ideas through dialogue with AI, or learning about a topic with a custom learning agent created by faculty. The goal in the classroom is to model effective uses of AI for learning, rather than simply having students get answers through these chatbots.

You’ve helped lead a campuswide effort to integrate AI across CU Boulder’s business curriculum. What lessons from higher ed should K–12 educators consider as they face pressure to adopt AI tools?

The biggest lesson is: Start with purpose, not tools. Don’t just add AI because it’s trendy. Decide what problem it’s solving or what skill it’s building. Provide teacher training before expecting classroom adoption, and integrate AI into existing learning goals rather than treating it as an add-on. At the same time, create a supportive space for experimentation: Pilot in a few classrooms, learn from the process, and scale up intentionally.

What are the biggest misconceptions parents, teachers or even students have about using AI in the classroom?

One misconception is that teaching students how to use AI to get answers is the most important aspect. The reality is that AI inherently reduces effort, but learning requires struggle. Students who use it well often work harder, because they’re iterating, fact-checking and refining their thinking. Finally, some believe AI will “replace” teaching, when the most effective uses can actually deepen teacher-student interaction.

How do you teach students not just to use AI, but to use it ethically and critically?

There is a huge difference between learning to use AI and using AI to learn. We not only need to teach students how to use AI (known as AI literacy), but when we embed AI into a course, we should be using the power of AI to ask questions of the student as opposed to simply providing an answer.

This is the magic behind custom learning agents, which are simply AI tools that have been given a specific task. We give the AI the job of walking a student through a learning process: instead of providing answers, it helps students think about a topic and guides them step by step. With more complexity, these agents can create entire simulations in which students engage with different characters throughout an assignment.

We should also explicitly address ethics in our AI-related activities, rather than isolating it to one lecture. Students should learn to ask: Where did this data come from? Who benefits? Who might be harmed?

Some educators fear AI will discourage original thinking or be used for shortcuts. What have you seen in practice? Are students getting lazier … or sharper?

When AI is used appropriately, students can actually learn more deeply. We’ve seen them explore ideas in more depth, test different approaches to a problem, and take risks when learning. There is far less fear talking with an AI to explore a topic than talking with a teacher or even a teaching assistant. The problem comes when assignments are purely product-focused, such as grading a paper that could be written with AI in an unchecked environment. When we take the time to assess the process a student goes through and the thinking behind the work, AI becomes a tool for deeper engagement rather than simply a shortcut to an answer.

Colorado recently passed one of the nation’s first comprehensive AI laws. How should schools—K-12 and universities—start preparing for a regulated AI environment?

Schools need to get comfortable with documenting and explaining how AI is used. That means knowing which tools are being deployed, where the data goes, and how decisions are made. This shouldn’t just be about compliance, but rather about building a culture of responsible AI use that can stand up to public scrutiny. If AI is used to assist in any type of assessment, that process must be disclosed and have a way for students to question the process. The Colorado law covers any AI system that affects the “well-being or opportunities” of individuals, which includes grades.

You’ve emphasized that AI should be a helper, not a crutch. What are some concrete ways students can use AI to boost and not bypass their learning?

Students should learn to use AI for brainstorming, cleaning up their outlines, or helping review their writing. It should not be used as a replacement for learning to do those things. There are also very powerful new learning tools in many AI products, such as ChatGPT’s “Study and Learn” feature or Gemini’s “Guided Learning” tool. These can help students use AI as a coach for practice problems or as a debate partner that challenges their arguments. Most importantly, students should use it to get feedback on drafts or ideas, then decide which feedback to keep. AI should expand our thinking, not replace it.

What advice do you have for school administrators or faculty who feel overwhelmed by the pace of AI change and don’t know where to start?

Start now. You can start small or take big steps forward, but make sure your approach builds toward effective uses of AI in education rather than throwing AI into the mix in an uncoordinated way. Pick one or two use cases that solve real pain points: maybe grading rubrics or rewriting lesson plans. Build a peer learning group so educators can share wins and challenges. The perfect AI plan doesn’t exist; momentum comes from experimenting, reflecting and iterating, and it takes people who are willing to play with the tools and start experimenting.

Looking ahead, what skills will define the most successful students in this AI-driven world, and how can schools help nurture them right now?

The most successful students will be those who can ask better questions, evaluate AI outputs critically and adapt quickly to new tools. Most importantly, they will need strong human skills around collaboration, communication, creativity, ethical reasoning and learning to trust their judgment. Schools can nurture this by making AI a regular part of projects, emphasizing reflection and encouraging students to work on problems with no single right answer.