Our students need knowledge, skills and experiences to help them thrive in a world powered by generative AI
Generative AI is at our fingertips and rapidly maturing. Keeping pace with the technical landscape, emerging capabilities, and new integrations will be a challenge but is essential, especially at a Research 1 university.
To participate in a complex and dynamic society that will increasingly depend upon AI, UW–Madison students will need skills such as prompt engineering, problem-solving, bias detection and intellectual curiosity. At the same time, they will need emotional intelligence, flexibility, and the ability to collaborate with humans and machines.
The principles and resources shared here are intended to help instructors address generative AI within the courses they teach. Please contact us if you have questions, comments or additions!
Receive updates on generative AI in teaching and learning
On this page: UW–Madison guiding principles | Events | AI syllabus statements | Guides to AI in teaching | Customized support
UW–Madison guiding principles
This guidance is based on the work of cross-disciplinary teams at UW–Madison including faculty, instructional academic staff, academic technologists and student affairs specialists. Campus guidance and resources will continue to evolve to reflect developments in this rapidly changing space.
Advance accessibility and equity
Generative AI offers both benefits and risks related to accessibility and equity. Generative AI tools may present opportunities for students who are multilingual, who struggle with writer’s block or writing anxiety, or who are entering a new discipline. At the same time, AI can generate racist, sexist, and other kinds of biased responses and potentially promote such biases. It may also present accessibility barriers for students with certain disabilities. Financial equity is another concern – some students will be able to afford to buy premium subscriptions while others will not. UW–Madison has adopted Microsoft Copilot as the campus equitable access tool for question/answer generative AI.
Protect data and intellectual property
Instructors and administrators are responsible for protecting student privacy and intellectual work, as well as securing FERPA-protected data. They must also follow university policies that apply to the use of generative AI.
With this in mind:
- Students should not be required to submit drafts of assignments they create to an AI tool unless doing so is an intentional part of the assignment.
- Students should not be asked to submit personal information to AI tools.
- Instructors should not submit student work into AI tools for the purposes of automating feedback and comments.
- Instructors are encouraged to communicate to students that assignment prompts and other instructor-created materials are protected intellectual property, and that submitting them to an AI tool without permission may violate intellectual property rights. (For more on communicating with students, see AI Statements for Course Syllabi.)
Consider educational uses
Generative AI can support student learning and open new opportunities for teaching. Students can dialogue with AI chatbots in ways that generate new knowledge and insights and that mimic conversations with peers or even instructors.
Potential uses for students
- At various stages of the learning process – to brainstorm ideas, to summarize or distill complex thinking, to learn specific genre conventions, to check grammar, to test out ideas or formulas.
- To clarify complex readings, issues or points of confusion.
- To support student research – finding, generating and analyzing data quickly; encouraging multimodal approaches to communication (e.g., generating websites or posters).
Potential uses for instructors
- Help with designing course materials, including syllabi, questions, quizzes and prompts for learning activities.
- Explaining complex concepts in multiple ways for diverse learners.
- Generating announcements or newsletters for large classes or cohorts.
- While feedback on a student’s entire work should not be automated, AI can assist instructors with generating ideas for praise, considering additional approaches, and offering stylistic or grammatical suggestions on a specific paragraph.
Learn more in the guide Exploring AI in Teaching
Consider adapting learning experiences and assessments
When considering how to adapt learning experiences and assessment, the following strategies may have the added advantage of helping to foster a sense of belonging for students, which promotes student success.
- Promoting a growth mindset with learning tasks and assessments that are well scaffolded so students feel capable of success without relying exclusively on AI.
- Developing authentic assessments and activities that can be completed without AI, such as in-class discussion; personal narratives; assignments focused on local and/or very recent events; applications of learning to real-world scenarios.
- Considering metacognitive learning strategies, such as writers’ memos, reflections on learning and sharing thinking processes.
- Building in collaborative learning, such as peer review and scaffolded group projects.
- Considering flipped classroom opportunities using real-time educational tools such as Top Hat, time in class to brainstorm or work on assignments, and active learning strategies.
- Using process-over-product approaches, such as sharing Google drafting histories, scaffolded assignments and assessments, check-ins on progress, and conferences with students to gauge learning. The Writing Across the Curriculum Sourcebook includes suggestions for scaffolding writing assignments.
Learn more in the guide Planning AI Use in Your Course
Discuss course expectations and academic integrity with students
The proliferation of generative AI tools necessitates intentional and thoughtful discussions about AI, both broadly (what it is and what it can do) and more specifically (how it relates to student learning and academic integrity).
Instructors are encouraged to communicate with students about generative AI (including how it relates to academic integrity), both at the start of a course (in course syllabi, in Canvas and in conversation) and throughout the semester.
Here are some things to consider when thinking through communications with students:
- Clearly communicate expectations regarding the use of generative AI tools early and often. Some instructors might have a policy for the entire course, or have specific instructions for individual assignments and assessments. Recognize that students may receive different guidelines from other instructors. View sample syllabus statements.
- Provide clear instructions, examples and explanations to help students understand the expectations. Explain when generative AI can be used (e.g., initial queries, topic development, help structuring a written assignment or project) and when AI might hinder their learning or development of essential skills and knowledge (e.g., relying on AI to complete a math assignment that builds skills necessary for more advanced work).
- If instructors plan to discourage or prohibit the use of AI, it is particularly important that they explain why to students in the context of learning objectives for the course.
- If allowed, consider when/how/in what way AI-generated content should be cited and whether to require students to turn in, or at least retain, chat transcripts.
- Establish a clear and transparent dialogue with students early (and often) to help avoid an instructor-student dynamic built on mistrust. Using a “misconduct” lens can create a climate of policing/suspicion.
- Incorporate the topic of generative AI into a broader discussion on academic integrity and professional ethics within an instructor’s discipline.
- Participate in conversations about academic integrity and generative AI with colleagues in your department. Departments are being encouraged to facilitate conversations within their specific discipline to clarify shared expectations and identify strategies for promoting professional ethics within the discipline. Academic associate deans, CTLM and Writing Across the Curriculum are available to support these conversations.
Address potential misconduct using established policies and procedures
Instructors are encouraged to take a proactive approach to prevent misconduct, using the teaching and communication strategies outlined above.
Avoid the detection arms race – generative AI detection tools are imperfect at best, carry the risk of false positives, have been shown to be biased against non-native English speakers, and will not prevent students from using these tools.
If an instructor suspects a student has not followed their established course guidelines on generative AI, they should address it as they would any case of suspected academic misconduct, beginning by meeting with the student to discuss concerns.
Events
AI Learning Labs
Got questions about AI? Want some hands-on help with activities and assignments? These informal sessions offer personalized support.
Navigating Author Responsibility and Copyright
On Oct. 22, hear from University Libraries about how you can help students develop ethical practices when using generative AI.
Navigating Ethics and Privacy in the Age of AI
On Nov. 18, hear a panel discussion exploring the critical ethical and privacy considerations related to generative artificial intelligence in education.
AI syllabus statements
It's important to share your expectations for AI use with your students. Your course syllabus is a great place to start.
Read more
Guides
Intro to AI in Teaching
- What is generative AI?
- How might it be useful in teaching and what concerns should be considered?
- What are UW–Madison’s AI tools and policies?
Exploring AI in Teaching
- How might AI enhance students’ learning?
- How can I promote academic integrity?
- My course involves writing – what can I do?
Planning AI Use in Your Course
Thinking about giving AI a try?
Consult this step-by-step approach. You can also download it as a handout.
Customized support
CTLM provides generative AI support that is tailored to the questions, interests, and needs of academic departments as well as individual instructors. If you’re curious about AI but unsure whether or how to get started, please contact us! We offer one-on-one consultations, customized departmental workshops, and more.