AI in Teaching & Learning
The approaches and resources below have been assembled to help faculty reimagine or reaffirm aspects of our teaching that are impacted by recent advances in generative artificial intelligence. The practical orientation of this page is complemented by the broader principles articulated in the accompanying “Framing Thoughts” page.
Teaching with Generative AI at UNI
Approaches
- Communicate with students about AI and learning in the context of your class.
Faculty inhabit a wide range of disciplinary and pedagogical traditions and values, so our students will benefit from ongoing communication and transparency about our expectations. Brené Brown reminds us that “Clear is kind,” and this is especially true when our classes draw upon novel approaches and tools, when our student body is increasingly diverse, and when those students are navigating the distinct choices individual faculty make for their classes.
Clarity around AI is enhanced when we offer concise syllabus statements, craft step-by-step assignment descriptions, repeatedly model the practices we wish to encourage, and initiate and sustain conversations about these choices. You might opt to arrive at some course policy statements collaboratively as group agreements, established with student input at the outset of a course or for specific assignments. Consider incorporating a class session, module, or unit to address specific information literacy contexts that will inform student choices and responsibilities in their coursework with you.
- AI does not think, but it can be used to support and develop critical thinking skills.
AI tools can jumpstart problem solving and knowledge work, allowing learners to move more quickly to the essentially human work of higher-order thinking, such as:
- looking carefully into “plausible” AI outputs to appraise their accuracy and truthfulness,
- engaging in thoughtful skepticism and dialectics,
- enhancing one’s own information literacy and awareness of intrinsic biases,
- addressing the needs of specific people and places.
As with any drafting process, reflection and revision are acts that keep humans in the loop and help learners realize the benefits of iterative work. Not only does learning depend upon these processes of trial and error, of rethinking and rewriting, but critical thinking skills are developed in conversation with and on behalf of other people. Above all, we should aspire to build a class community that values, develops, and thrives on human interactions.
- Rethink assignments to promote learning processes.
Many traditional and innovative assignments will need to acknowledge the ready availability of generative AI tools. A range of adaptations is possible - from restrictive to permissive - depending on a variety of practical, personal, and content-related considerations:
- Insulation strategies - when the intent is to prohibit, detect, or discourage the use of AI, or to design work that eludes it
- Limitation strategies - when AI use is permitted for defined purposes that must be properly circumscribed and documented
- Amplification strategies - when AI tools are embedded, leveraged, or used creatively as part of the learning process
If one learning goal is to build facility with an AI application, students will need clear parameters and opportunities to practice their skills. Along these lines, one of the most powerful tactics is to help students develop strategies and skills for writing and scaffolding AI prompts.
- Generative AI requires us to revisit some academic ethics and related policies.
Recent amendments to the UNI Student Academic Ethics Policy (3.01) are intended to clarify, for students and faculty, how to frame and respond to recent and continuing developments in generative AI. These updates (not yet published to the policies webpage) will allow faculty latitude in defining what constitutes “authorized use” of AI tools in the coursework for their classes. With that latitude comes a responsibility to communicate clearly with students as described above.
The use of AI detectors to identify instances of plagiarism or cheating involves us in an ongoing technical and commercial “arms race” and raises ethical issues of its own that will present challenges for the foreseeable future. AI detection is also fundamentally different from conventional plagiarism detection, which highlights the purportedly plagiarized passage, links to the source content, and allows a faculty member to judge whether something is plagiarism or just, say, poor citation. By contrast, AI detectors can only estimate the likelihood that a passage was assembled by AI, and at present that capability is undercut in practice by false positives, by variability among different detector platforms, and by a proliferation of strategies (and AI tools!) for disguising the source of AI-generated text.
Whenever the use of generative AI is allowed or encouraged in a given class or assignment, it is important to establish and teach an up-to-date, professionally accepted protocol for AI citations. Here are current recommendations from MLA and APA.
- AI brings along some ethical baggage of its own.
- Bias: The products of generative AI tools can be expected to reflect the human biases embedded in their colossal training datasets, as well as the perspectives of their programmers.
- Accuracy: AI algorithms aim for plausibility and defer to the prompter’s guidance; truth and accuracy are not assured.
- Privacy: Commercial interests - including data-mining, content ownership, and subscription revenues - accompany these tools, and their use typically requires granting access to one’s personal data and creations.
- Access: These products are available only to those who can afford and access them, which presents impediments to equity. On the other hand, AI affords assistive capacities that can facilitate the participation of learners who might otherwise be excluded.
- Efficiency: The energy and resource demands of the data centers that support AI data storage and model training are not sustainable, and these systems are especially inefficient in comparison with the human brain.
- Capitalize on AI to reinforce your pedagogy.
Course interactions are opportunities to:
- model effective AI prompting techniques,
- use AI to power real-time demonstrations and simulations,
- leverage AI to support differentiated instruction for the variety of needs that emerge among your students,
- establish the relevance of course content by using AI to quickly generate examples in response to student interests/questions/ideas,
- reinforce the importance of grounding inquiry in information literacy.
- Consider using AI to make your own job easier.
AI tools can facilitate faculty efforts by:
- contributing to the work of course/lesson planning and organization,
- creating and augmenting practice opportunities for students,
- quickly drafting study or discussion questions,
- generating immediate feedback on some student submissions,
- supplementing or creating alternatives to existing materials, assignments, and assessments.
Resources
Artificial Intelligence in Education: Sources to Stimulate Engagement & Discussion at UNI
AI Tools (ChatGPT) FAQ - a user-friendly site from Carnegie Mellon University
Advancing meaningful learning in the age of AI - from Oregon State University