Strategies for Teaching with AI to Promote Learning

This resource on teaching with AI was created by Dr. Mandy Olejnik at Miami University, with input from professional development offerings she’s led with Rena Perez at the Howe Center for Writing Excellence.

Introduction to Teaching with AI

In late 2022, OpenAI launched ChatGPT, an artificial intelligence (AI) chatbot (then powered by its GPT-3.5 model) that uses natural language processing to create humanlike conversational dialogue. AI tools themselves are not new to society or higher education, as AI capabilities are built into virtual assistants like Alexa and Siri, speech-to-text features on smartphones, fraud detection systems in banking, and more. Indeed, new tools and technologies have mediated student learning throughout all of human history—think, for example, of the invention of clay tablets in 3200 BCE, or the invention of the pencil with an eraser in the late 1800s. 

One way writing scholars and experts have approached the rise of AI technologies in educational settings is to treat them like any other new and impactful technology, one that demands nuance, new literacies, and a learning curve for both students and faculty. There are also important ethical considerations of using and working with these tools that all users should keep in mind. Below are some ways that AI tools like ChatGPT and Google Bard can be used to aid student learning in the classroom, especially in terms of student writing and writing development, as well as some limitations and ethical considerations to be aware of.

Some Cautions around AI Usage

First, note that there are a number of data and privacy issues around AI. When users input information into these tools, they cannot know how their input will be used. Scholars or creative writers inserting their own in-progress work, for example, may in fact be helping “train” AI on what academic or creative writing features and patterns look like. It is possible that a user’s writing will be replicated and reconfigured for someone else writing about a similar topic without any attribution (and there are a number of lawsuits about this). 

Although ChatGPT currently has a setting under “data control” where users can opt out of having their information saved and used to train ChatGPT’s models (see figure 1 below), users realistically have no way of knowing how or when their data is used, and should note that the platform also cautions users not to share personal or sensitive information.

Figure 1. ChatGPT setting to opt in or out of saving chat history and training AI models.

Second, AI is not immune to human bias (for more information, read “Assessing Bias in Large Language Models” from Miami University). AI tools are trained on input people provide, which is created by actual humans with inherent biases, even as there are efforts to cross-check for bias (work also undertaken by humans with their own biases).

Third, there are numerous ethical concerns regarding the actual production and sustainment of AI tools. AI programs like ChatGPT are trained by people working in what are referred to as “digital sweatshops” set up across the Global South, where workers perform the “crowdwork” necessary to train AI (see this article from the Washington Post for more). Additionally, there are serious environmental impacts of sustaining AI platforms, such as data centers in Iowa needing a steady supply of water for cooling on hot days (see this article from AP for more). 

Finally, it’s important to consider equity and access. ChatGPT only offers its most up-to-date model (GPT-4) to “Plus” subscribers, which at the time of this writing costs $20/month. GPT-3.5 is free, but students who can afford to pay for the premium version have broader access to resources, including the use of the most up-to-date DALL-E image generator. Not all students have this same service and access free of charge.

To sum up, there are numerous cautions instructors should note when considering AI use in the classroom. Given the above, instructors should have conversations with students about how these tools operate and are trained, and how data is (or could be) used. These are important nuances to keep in mind, especially if instructors require usage of AI tools for course assignments.

Policies for AI Usage in the Classroom

Regardless of how instructors may feel about AI tools, these tools are quickly becoming ubiquitous, both in and out of academic settings. Thus, both students and instructors benefit from a clear syllabus statement about AI usage that clarifies the instructor’s values and what counts as acceptable classroom use. 

There is a growing, crowd-sourced list of syllabus policies for generative AI tools available online. Policies range from outright bans on AI usage to specified allowable uses; instructors can choose based on the specific context of their courses. 

As instructors consider their own policies, they should ask questions such as:

  • What can AI tools do, and what can’t they do? How do AI capabilities relate to the types of course assignments, and which aspects can AI tools alone fully produce? (If there are components AI tools can actually produce, might instructors want to reconfigure what students are being asked to do?)
  • How can instructors ask students to bring valuable rhetorical, moral, and creative perspectives that AI tools cannot? What is the “human component” needed for specific courses and topics?
  • How might students want to use AI tools in assignments: to write the final product itself? To assist with drafting and coming up with ideas? To edit or provide editing feedback? (These usages have some important differences and might impact usage policies.)

Ways AI Tools Can Be Used for Generative Learning

Below are some suggestions and strategies for ways that AI can be used to support, rather than replace, student learning. AI tools can be used to…

1. Provide practice and support for writing. With their various features, AI tools can provide a host of different types of writing support for students—and for writers overall. When prompted, ChatGPT itself states the following ways it can support people learning to write:

Figure 2. ChatGPT sharing the ways it can help the prompter learn how to write.

ChatGPT can support students in various stages of the writing process, from the beginning stages with writing prompts to the middle stages with research assistance to the ending stages with editing and proofreading. While learning to write cannot be supported only by these tools, such learning can be supported with AI as one tool in a wider network of writing support, which also includes teachers in the classroom, peers in the classroom, and outside peers such as writing consultants at a writing center. Students may be able to use ChatGPT to get started, or for some advice after peer feedback, and so on. Most important: ChatGPT should only be used as a part of a larger set of supports.

2. Offer Writing Feedback. AI tools can offer writing support and feedback to students in mere seconds, if students feel comfortable sharing their writing given the privacy concerns noted above. When prompted, tools like ChatGPT can offer feedback following specific criteria. For example, take a look at the following:

Figure 3. ChatGPT providing feedback on a paragraph with specific parameters.

When the writer prompts ChatGPT to summarize the main point of their written paragraph, ChatGPT obliges, offering the writer a sense of how well they communicated their intended message. Writers can ask for different kinds of feedback. Most important: users need to know what and how to prompt these tools, which is its own kind of rhetorical literacy.

3. Combat Writer’s Block. When faced with the ever-daunting blank page, writers can also chat with AI tools about their ideas, and be prompted with helpful questions. For example, when inputting “I’m trying to write a research journal article about AI and the writing across the curriculum movement, and I’m having a hard time getting started,” ChatGPT responded:

Figure 4. ChatGPT offering encouragement for getting started with writing.

The AI tool (in this example, ChatGPT) offers encouragement and acknowledgement of the difficult task, and provides some helpful questions to the user (all while relating back to what the user originally said). In this way, the writer is provided with an action plan to move forward—perhaps at a time when they couldn’t reach out to a peer or faculty member for assistance. Writing instructors themselves would likely offer similar suggestions and instruction, which speaks to the utility of students having a resource like this that they can access.

4. Assist with brainstorming. While students should absolutely exercise their own creative muscles when writing and thinking about ideas and topics, AI tools can still help them solidify and narrow down their thoughts, as well as bring new ideas and perspectives to students’ attention.

5. Problem-solve coding. In more technical, mathematically based learning contexts, AI tools can help learners problem-solve while coding, working through specific errors and tasks. Take the following example:

Figure 5. ChatGPT answering a specific question about coding.

The user prompted ChatGPT with a specific question about the shade of certain lines, and ChatGPT was able to provide the answer with updated code, written in a clear way that helped the learner complete the task. 
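To make this concrete, here is a hypothetical sketch of the kind of exchange described (not the actual plot-shading example from the figure): a learner pastes a function with an off-by-one bug, and the tool returns a corrected, more idiomatic version with an explanation.

```python
# Hypothetical example of the kind of fix an AI tool might suggest.

# The learner's original function skips the last list element:
def buggy_total(values):
    total = 0
    for i in range(len(values) - 1):  # off-by-one: loop stops one element early
        total += values[i]
    return total

# A corrected, more idiomatic version the tool might return,
# along with an explanation of why the original range was wrong:
def total(values):
    return sum(values)

print(buggy_total([1, 2, 3]))  # 3 (misses the final element)
print(total([1, 2, 3]))        # 6
```

The value for learning is less the fixed code itself than the accompanying explanation, which students can check against what they already know from class.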

Additionally, learners can prompt tools to ask for more directive responses, like the following:

Figure 6. ChatGPT breaking down a coding process.

Here, too, ChatGPT provided a breakdown of the process with an example that the learner could follow and apply to their own learning context. Students can use such breakdowns as an aid in their learning, studying the example and seeing how the approach applies to other problems. 

6. Compare AI-generated writing to human writing. While many instructors would agree that students using and turning in AI-generated text in its entirety is wrong and not conducive to learning, working from and adjusting AI-generated text might still be helpful.

Take the following scenario as an example, where a user instructs ChatGPT to “write me a 2,000 word research paper on the history of composition studies in the United States, starting with the Harvard English A course”:

Figure 7. ChatGPT writing a paper in response to a general writing prompt.

A careful reader situated in the discipline of composition studies (the discipline represented here) would note that ChatGPT’s response gets a few things wrong. If a student were to turn in this essay as-is, it would be factually incorrect, as well as unethical. But what if a student prompted ChatGPT in this way and then annotated its response as follows:

Figure 8. An annotated version of writing completed by ChatGPT.

In this example, the writer is directly commenting on and challenging ChatGPT’s produced product, as well as cross-referencing class resources. This is a way to utilize AI that might be generative—and that shows the importance of an actual human who has all of the context.

Overall, these are some general strategies and ways that AI tools might be useful in student learning. Below are some sample activities and assignments that might be effective to try in college courses. 

Example Classroom Activities Using AI for Learning

The following are some more specific activities around AI that instructors can incorporate into the classroom. 

Resources for Larger-Scale Assignments and Activities:

Ideas for Smaller-Scale Assignments and Activities: 

Students can use AI tools in-class to…

1. Talk with AI tools about their understanding of a topic. As LLMs with almost instant access to large amounts of information, AI tools can be a great resource for students gathering information about a topic quickly. Although ChatGPT, as an example, was trained only on data up to September 2021, it can still provide information that can get students started and that they can cross-reference with other sources of information from class and from the internet. 

2. Annotate AI-produced text. Another helpful activity might be to ask ChatGPT or Google Bard to “write an essay” and then have students comment on the text actually produced, like the previous example in strategy 6. The student can comment on what ChatGPT produces, correcting areas where it isn’t entirely accurate and recognizing where the LLM might misunderstand what the prompt actually asks. Annotation exercises like this can help students critically and carefully examine text, and apply the knowledge they’ve gained elsewhere in the course or their degrees.

3. Examine AI-produced examples vs. human-produced and discuss effectiveness. Providing example texts is an effective pedagogical strategy, and one way AI might help is by generating examples of text. Instructors might prompt the LLM to write or produce the kind of text they want, and then mix that in with other student-generated (or actual professionally generated) examples. Students can examine the set and determine what they like about each, perhaps learning that the AI-generated examples have some distinct differences from the human-generated ones (and why that matters).

4. Use AI as a place to start research. One of the strengths of generative AI tools is their ability to compile a lot of information within seconds. Students might prompt tools like ChatGPT or Google Bard to get started with their research. Since these tools draw on information scraped from the internet, the information they provide may or may not be entirely accurate. It’s therefore essential to cross-check any information received from such platforms, but they could serve as an interesting place to see some initial patterns and ideas.

5. Ask AI to help reverse outline. Sometimes students struggle with writing too much or not fully understanding their points after writing their way through a paper. A common pedagogical approach is creating a reverse outline where students take what they’ve actually written and condense it down to an outline, and either compare that outline to the one they originally wrote or otherwise see if it constructs the narrative they aimed to articulate. Students could create their own reverse outline and then have an AI tool produce one as well, prompting them to compare the two outlines and really reflect on their writing and what their words are actually saying.

Overall, there are several ways that AI tools can be used in the classroom, depending on course context and instructor comfort. The above-linked resources offer a wealth of information and ideas for doing so. 
