Contributors: Megan McFarland

In this curated teaching guide, you will find a variety of resources to support your exploration of generative AI tools like ChatGPT, to build your understanding of their capabilities, and to help you strategize their integration into your courses. It’s crucial to recognize that generative AI tools are in a constant state of evolution, and this resource will be regularly updated to reflect that.

What is Generative AI?

Generative AI is a branch of artificial intelligence that involves machines generating content, including text, images, and more, based on patterns and data via user-entered prompts, such as questions or requests. In this way, generative AI is similar to a search engine but with the additional ability to synthesize multiple sources of information.

Generative AI works by analyzing vast datasets and identifying patterns to generate contextually relevant content. For example, ChatGPT uses a language model trained on a diverse range of internet text to generate written responses to user prompts.
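The pattern-learning idea above can be illustrated with a toy bigram model: it counts which word tends to follow which in a small sample of text, then generates new text by repeatedly sampling a plausible next word. This is a deliberately simplified sketch (real systems like ChatGPT use neural networks trained on vastly larger datasets, and the corpus and function names here are invented for illustration), but the core loop of learning patterns from data and then predicting the next word is the same.

```python
import random
from collections import defaultdict

# A tiny corpus standing in for the vast training data a real model uses.
corpus = "the cat sat on the mat and the dog sat on the rug".split()

# "Training": record which words follow which (the patterns in the data).
transitions = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    transitions[current_word].append(next_word)

def generate(prompt_word, length=6, seed=0):
    """Generate text from a prompt by sampling likely next words."""
    random.seed(seed)  # fixed seed so the toy example is repeatable
    words = [prompt_word]
    for _ in range(length - 1):
        candidates = transitions.get(words[-1])
        if not candidates:  # no observed continuation; stop early
            break
        words.append(random.choice(candidates))
    return " ".join(words)

print(generate("the"))
```

Every word the model produces comes from patterns in its training data, which is also why such systems struggle with topics absent from that data, as discussed below.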

Generative AI tools are versatile and can be used to:

  • Answer prompts and questions.
  • Summarize information.
  • Refine and revise content.
  • Foster creativity.
  • Assist with coding and debugging.
  • Manipulate datasets.
  • Facilitate interactive gaming experiences.

Generative AI tools also come with limitations, especially given the emergent nature of this technology. Some limitations include:

  • Inconsistencies in integrating genuine research into generated text or generating responses that are erroneous, oversimplified, unsophisticated, or biased when posed with questions or prompts. While many AI tools, such as Scite, can produce content with reference lists, these references may not always align with the generated text and may even be “hallucinated,” or imaginary. More recently developed tools like GPT-4 exhibit more sophisticated research integration capabilities.
  • Challenges in responding to prompts about current events. Generative AI tools are only as strong as their training data, and it takes time to integrate new information. For example, ChatGPT’s training data currently extends only to 2021, although efforts are underway to update its knowledge base.

Generative AI and Academic Integrity

The remarkable capabilities and widespread accessibility of generative AI tools have sparked both excitement and fear within higher education, albeit not always in equal measure.

Promoting authentic learning and discouraging cheating, or “non-learning,” are two common goals for educators working with generative AI. Authentic learning involves immersive experiences that closely resemble real-world scenarios, fostering critical thinking, problem-solving, and practical skills. It encourages students to apply their knowledge in meaningful contexts, enhancing engagement and retention. In contrast, non-learning often involves rote memorization, surface-level comprehension, and minimal connection to real-life applications. Because it rewards repeating information rather than understanding and applying it, non-learning can inadvertently promote cheating and academic dishonesty. With this in mind, it is clear that the conditions that either support or discourage cheating have existed, and will continue to exist, regardless of generative AI.

While exploring the applications of generative AI to enhance teaching quality, it is also vital to remain focused on upholding principles of academic integrity and ethical conduct. Each instructor’s approach to generative AI in the classroom will vary according to their knowledge, skill set, and familiarity with this emerging technology, as well as the appropriate applications within their discipline. One way to define your approach for both yourself and your students is through an AI syllabus statement. In our Syllabus Template, you will find several suggested approaches and sample syllabus language, which can be adopted or adapted to align with your specific context.

Please note that the provided language is merely a suggestion. We encourage faculty to consult with their respective departments or schools to determine if there are any required AI syllabus statements or specific guidelines applicable to their discipline. Any and all generative AI approaches should be aligned with PSU’s Academic Misconduct Policy.

In an industry response to concerns around academic integrity and generative AI use, a multitude of AI detection tools are now readily available. These tools claim to distinguish AI-generated writing from student-generated writing, although their accuracy varies considerably. While many tools claim high accuracy rates in identifying AI-generated content, it is not uncommon for third-party evaluations to reveal a significant rate of false positives. As such, even detectors with strong records in identifying AI-generated content may mislabel human-authored text as AI-generated. False positives carry the risk of significantly eroding student trust and motivation. Perhaps most alarming, early research and anecdotal evidence indicate that false positives are more likely to occur among students who are English Language Learners or students with cognitive, developmental, or psychiatric disabilities.

To incorporate generative AI effectively while fostering authentic learning and discouraging cheating, consider the following general strategies:

  • Engage Students in Ethical AI Discussions: Begin by discussing the ethical use of AI, including its benefits and potential pitfalls, with students. Encourage students to reflect on AI’s role in education and in your discipline.
  • Collaborate with Students: Involve students in defining ethical AI use within your course. This collaborative approach empowers students to take responsibility for maintaining academic integrity.
  • Transparently Share AI-Generated Content: When using generative AI tools like ChatGPT, share the initial AI-generated responses with students before assignments. Encourage them to assess, evaluate, and improve these responses to promote higher-order thinking.

Designing authentic learning assessments with students’ lived experiences in mind can be an excellent way to provide guardrails around unethical AI use, while also offering clearer insight into what your students really know. Here are some ideas on how to get started:

  • Design Higher-Order Thinking Assessments: Create tests and assignments that require critical thinking, analysis, synthesis, and creativity. These tasks are less susceptible to AI-driven cheating, as they demand students’ unique perspectives and insights.
  • Incorporate Multimedia Elements: In your assessment directions, encourage students to incorporate multimedia components into their work, such as videos, presentations, or infographics, which are challenging for AI to generate comprehensively.
  • Connect to Real-World Contexts: Make it challenging for AI to generate relevant responses without students’ authentic input by designing projects that relate to current events, specific class discussions, local issues, or students’ personal experiences.
  • Chunk Assignments and Emphasize Revision: Divide high-stakes, long-term assessments such as projects into smaller tasks with opportunities for planning, revision, and peer collaboration. This approach discourages last-minute AI-generated submissions.

By following these guidelines, faculty can harness the potential of generative AI to enhance learning while maintaining the integrity of their educational environments.

For more information on this subject, check out Encouraging Academic Integrity Through Course Design at OAI+.

Enhancing Teaching and Learning with Generative AI

Like any piece of technology, generative AI is just one of many tools you may choose from when designing your course. Some of the many potential instructional applications are:

    • Facilitate responses to frequently asked student questions or emails.
    • Generate exam questions and multiple-choice options.
    • Draft lesson plans and assignment guidelines.
    • Create reusable feedback comments for assignments.
    • Develop examples for students to evaluate and compare against their own work.
    • Demonstrate how generative AI can be a strong tutoring resource for reviewing complex concepts.
    • Produce real-time feedback on writing, particularly in language learning courses.
    • Condense qualitative student feedback from course evaluations.

In addition to being a powerful tool for faculty, generative AI can make thinking and learning accessible to a wider range of students, including those with disabilities. Some ways you and your students might consider using generative AI are:

    • Using tools such as ChatGPT to create models or exemplars of assignments. Students may use these models to frame their own work or practice evaluating AI-generated work.
    • Using planning AI, such as Goblin Tools, to break down a complex assignment into manageable chunks.
    • Drafting writing from an outline, or vice versa, to support task initiation.
    • Treating generative AI as a “second brain” and asking it for help getting started on hard or daunting tasks.
    • Collaborating on a research strategy.
    • Guiding students to use generative AI as an advanced proofreading and editing tool similar to Grammarly.