AI in the Classroom: Policies and Assessment
Moving Beyond “Yes” or “No” to GenAI
When ChatGPT launched in 2022, instructors often took one of two approaches to GenAI in their classes: allow it or ban it. While a ban may be appropriate in certain courses, it is difficult for instructors to enforce. As GenAI tools have become more advanced and widespread, our approach to teaching with GenAI should expand too.
The binary choice between allowing or banning AI in the classroom no longer serves students or educators effectively. Today’s reality requires nuanced policies that prepare students for a world where AI collaboration is the norm, not the exception.
Crafting Nuanced AI Policies
Class policies for GenAI should establish clear boundaries while acknowledging that students will encounter GenAI in their careers. Policies might vary by assignment type or learning goal. Students may be allowed to use GenAI to develop and refine an outline for a project or troubleshoot broken code, while being restricted from using it for final submissions or individual assessments.
Effective AI policies recognize that different assignments serve different pedagogical purposes. A policy that works for a collaborative project may be entirely inappropriate for an individual skill assessment. This granular approach helps students understand not just what they can and cannot do, but why these distinctions matter.
Defining Use Cases and Boundaries
Class policies should also distinguish between use cases of GenAI. Students can use GenAI to debug code, generate test cases, explain concepts, or write entire functions. Which of these are allowed? Class policies should address each use case and explain which uses of GenAI are acceptable and when.
Consider creating a spectrum of AI use rather than binary permissions. For example, using AI to understand error messages might be encouraged, using it to optimize existing code might be permitted with attribution, while having AI write complete solutions might be prohibited. This approach mirrors real-world professional environments where AI assistance varies by context and responsibility.
The Importance of Educational Rationale
Students should also understand why certain uses of GenAI are limited or prohibited in their courses. Instructors may choose to restrict GenAI use to ensure students develop essential skills, practice critical thinking, or avoid encountering hallucinations. Helping students understand when GenAI use is appropriate versus inappropriate in their courses will equip them to use GenAI responsibly after the semester ends.
When students understand the pedagogical reasoning behind AI restrictions, they’re more likely to engage authentically with learning objectives. Rather than viewing policies as arbitrary rules, they begin to see them as scaffolding for their professional development.
The Problems with AI Detection Tools
GenAI detection tools may seem like an attractive way to enforce class policies, but they have significant problems. Researchers at Stanford University found that GenAI detection tools are biased against non-native English writers (https://hai.stanford.edu/news/ai-detectors-biased-against-non-native-english-writers). AI detectors may lead to false accusations and negatively impact student motivation and trust (https://teach.its.uiowa.edu/news/2024/09/case-against-ai-detectors). Over-reliance on AI detection tools can also shift an instructor’s focus from supporting learning to policing AI use.
The fundamental issue with detection-based approaches is that they create an adversarial relationship between instructors and students. Instead of fostering learning and growth, these tools can damage the trust that’s essential for effective education. False positives can devastate student confidence and motivation, while the mere presence of detection tools can create anxiety that interferes with learning.
Assessment Design for the AI Era
Instead of focusing on AI detection and limitations, consider updating assessments to discourage inappropriate AI use. Assessments that require students to show their work, explain their reasoning, and demonstrate understanding through multiple modalities or checkpoints make it harder for students to rely too heavily on AI.
Modern assessment strategies should make AI collaboration visible and educational rather than hidden and problematic. When students must explain their thought processes, defend their choices, and demonstrate understanding in real-time, AI becomes a tool that enhances rather than replaces learning.

Iterative and Multi-Modal Assessment Strategies
Iterative assignments, in which students respond to follow-up questions or make revisions, give students a good opportunity to start with a GenAI tool and use their own knowledge to improve the output or identify errors. Pairing in-class assessments like quizzes, exams, or presentations with out-of-class assessments prevents students from becoming too dependent on GenAI.
This approach recognizes that AI can be a valuable starting point while ensuring that students develop the critical thinking skills necessary to evaluate, improve, and apply AI-generated content. Students learn to be critical consumers and collaborators with AI rather than passive recipients of its output.
Building AI Literacy Through Policy
Effective AI policies do more than regulate behavior – they build AI literacy. Students who understand when and how to use AI appropriately in academic settings are better prepared to make these judgments in their professional careers. They develop metacognitive skills about their own learning process and the role AI can play in enhancing rather than replacing their capabilities.
The goal isn’t to eliminate AI from education but to integrate it thoughtfully in ways that support learning objectives and prepare students for their future careers. By moving beyond binary policies toward nuanced, educational approaches, we can help students develop the skills they need to thrive in an AI-enhanced world.
Ready to implement thoughtful AI policies in your computer science courses? Explore how zyBooks’ interactive learning platform can support assessment strategies that encourage authentic learning while embracing the possibilities of AI collaboration.