Discussion boards, a text-based communication format dating back to the 1970s, are a popular tool to integrate into the classroom. However, anyone who has tried to use these boards in their teaching knows how stilted the conversations become when unenthusiastic students have no interest in impromptu discussion. As an instructor, you’re constantly trying to craft prompts that will force a simulacrum of interaction so you can essentially grade bean-counter posts for participation points.
Well, this semester, ChatGPT finally killed off my efforts to use these discussion boards in a traditional attempt to simulate the real world. In our classes, students took our prompts, which were intentionally simple in order to encourage one- to two-paragraph interpersonal exchanges, and dropped them into ChatGPT. They then copied and pasted the results into the discussion boards, often, I suspect, without even reading them. Several of the posts even included the pasted-in admission that the writer was an AI.
To combat this problem, we modified some of our prompts this summer to try to prevent students from using AI to avoid learning. I’m sharing some of our strategies in the hope that they help you out as you adapt your course to a world of generative AI.
1. Use prompts that force a personal opinion. This suggestion for modifying assignments is a common one, and it is one of the ways we identified students who were copying and pasting from ChatGPT. The AI tends to avoid the first person. It will also often decline to give a personal opinion and instead list a range of possible opinions. As instructors, we identified these posts by their vagueness and their list format.
2. Have students include their source(s) as an attachment. The more insidious ChatGPT posts falsify information and make up citations. This turned our simple formative assessments, designed to be easy to mark and to give students quick feedback, into an absolute time vortex as we attempted to determine whether the cited sources were valid. Moving forward, we are requiring students to attach copies of their sources and indicate where the information is located within each source. This way, we do not have to hunt down the (possibly imaginary, AI-generated) source ourselves and can quickly check the veracity of the information. For some of our students, this will also help them understand how to use evidence in a written assignment.
3. Use current or local events. This option does not work with every AI, but many models have training-data cut-off dates, so they know little about recent or local happenings. Remember that AIs lie with confidence, so you will still need to double-check the veracity of any sources students cite.
4. Have them take and caption a photo. GPT-4 can describe photos, but if you have the student take a photo with themselves in the frame, it helps to verify that they actually performed the activity, since it still takes some skill to convincingly photoshop yourself into an image. You can then have students explain what is happening in the photograph or write a professional-style caption for it. As a biologist, I already use this strategy in the laboratory for the results sections of lab reports, but it can be extended to other types of assignment.
5. Draw a diagram or chart. This works particularly well for assessing how students integrate multiple concepts across modules or chapters. Tables, Venn diagrams, flowcharts, concept webs, interconnected food webs, molecular interaction networks: all of these can be drawn by hand and photographed, or created in software, and posted in lieu of text-based descriptions.
6. Build and explain a 3D model. This is another technique I have found effective. Using playdough or other items around the house, students can build models of core concepts and explain these three-dimensional representations in videos. This takes the student away from text-based answers, and if they program a robot using AI to do this for them, they should probably drop your class and join a tech start-up. As a twist on this idea, stop-motion is another fun activity, though it takes a little longer for students to complete than a standard video. Using stop-motion, students can plan and photograph a storyboard of moving elements; in biology, for instance, this forces them to think more critically about molecular movement inside the cell. Students can then caption their stop-motion videos, much as they would a photograph or graph.
7. Include timestamps from lecture videos. A simple technique I have used with success is requiring students to include a timestamp from the lecture video that relates to their discussion board question or comment. With a timestamp, you can quickly check your own video to ensure the student has not just pulled (potentially false) information from the AI ether.
8. Scrap the discussion boards. While still in use, discussion boards are a somewhat dated technology, and they only work as intended if you have an entire class of internally motivated students. You can use other technologies instead, such as collaborative documents and video response platforms, if your institution subscribes to them. Video responses in particular have yet to be convincingly emulated by AI. You might want to start thinking ahead, though, because deep-fake videos suggest that this eventuality is on the horizon.
I already have a plan for when this happens: retire and live the rest of my days off the grid with a pile of books and some goats. If you have any other suggestions or techniques, we would love to hear them. Good luck.
Sara Cline is a professor of biology at Athens State University.