We need to address the generative AI literacy gap in higher education

18 March 2024
Holding open the question of generative AI’s role in higher education presents an opportunity for us to model our access values to our students, colleagues and the wider public, writes Kyle Jensen

Developing generative AI literacy can be a challenge because humans have difficulty conceptualising large numbers. Such numbers expose the limits of our knowledge and strain our ability to solve problems at scale. When I teach students and colleagues about generative AI, they are genuinely shocked to learn that the applications are trained on hundreds of billions of words of text. Understanding the scale of generative AI operations is important, I explain, because the models behind these applications need vast quantities of text to produce reliable output.
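To make that scale concrete, here is a rough back-of-envelope sketch in Python. The corpus size (about 300 billion tokens, the figure publicly reported for GPT-3’s training data) and the reading speed are illustrative assumptions rather than claims made in this article.

```python
# Back-of-envelope arithmetic: how long would a single person need to
# read a GPT-3-scale training corpus?

corpus_tokens = 300_000_000_000   # assumption: ~300bn tokens, the figure
                                  # publicly reported for GPT-3's training data
words_per_minute = 250            # assumption: a typical adult reading speed

minutes = corpus_tokens / words_per_minute
years = minutes / (60 * 24 * 365)

print(f"About {years:,.0f} years of round-the-clock reading")
# -> About 2,283 years
```

However the assumptions are tuned, the order of magnitude is the point: no individual could read what these models are trained on, which is why intuition alone is a poor guide to how they behave.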

Rather than delve into the mechanics of how large language models process billions of texts, students and colleagues will often shift the conversation to the ethical problems associated with sourcing and processing those texts. Avoiding a confrontation with the limits of one’s knowledge is understandable, of course. But this move can be dangerous because it assumes that generative AI policies can be developed without a working knowledge of how such applications function. This argumentative tendency results in policies that adapt poorly to evolving learning environments and, in turn, create structurally reinforced literacy gaps that will only grow with time.

Another hurdle is that we humans often rely on stories to make sense of things that confuse us. Often, stories increase our knowledge base and help us solve problems in thoughtful ways. But stories can also make our problems worse by misdirecting our attention and spurring unnecessary conflict. Unfortunately, the majority of stories about generative AI fall into the latter category. If generative AI moves forward at its current rate, the prevailing story goes, we will witness the erosion of democracy, the obsolescence of human intelligence and the extinction of humanity. Think HAL from 2001: A Space Odyssey or Skynet from the Terminator series. Because these stories trade in hyperbole, most people (including faculty, staff and students) struggle to develop a measured perspective on generative AI. They become preoccupied with AI detection and sound alarms over the regulation of AI research. Once this type of framework is established, principled implementation gets bogged down and literacy gaps widen.

How to address the generative AI literacy gap

Arizona State University, where I am on the faculty and direct the university’s writing programmes, is addressing this literacy gap by establishing an enterprise partnership with OpenAI. The partnership has been described as further proof of ASU’s well-earned reputation as a leading innovator in higher education. And that’s true. But what sometimes gets lost in the shuffle is the role that ASU’s access mission plays in these types of educational initiatives.

ASU’s charter supports students who are often excluded from university communities and helps local communities define, address and adapt to exigent social issues that affect how they live in the 21st century. This mission galvanises our faculty, staff and students because it tells a story about higher education that is, by definition, inclusive.

When it comes to generative AI, ASU activates its access mission by focusing on improving generative AI literacy among faculty, staff and students. We define generative AI literacy as a dynamic commitment to learning how generative AI works so that we can contribute to the technical developments and policy debates it drives and address the ethical problems it raises. Focusing on literacy is key because it requires us to learn about generative AI technologies with greater imagination and scope.

Universities that are trying to narrow the generative AI literacy gap might begin their work by developing a story about generative AI that is tied to their missions and identities. How these stories get told will vary depending on the institution. Regardless of the details, having a university-driven story in place will focus attention on a local problem-solving process and thus facilitate more measured approaches to generative AI applications.

How is ASU facilitating this process? Beginning in spring of 2023, faculty across the university worked alongside ASU’s EdPlus to develop professional development videos focused on generative AI literacy. This training course features faculty and staff members who explain how generative AI works in an accessible manner. The videos address ethical issues associated with generative AI, of course. But the bulk of them explain the basics of how generative AI applications function and describe how such functions might extend ASU’s educational mission. To date, more than 1,500 colleagues across the university have taken this course.

Institutions looking for a similar path might start by identifying faculty with expertise in either generative AI or theories of language. In brief sessions, ask them to explain the basics of generative AI in a manner that a non-specialist can understand. Once those explanations are in place, have them meet with other faculty who specialise in literacy development, storytelling, curriculum design and media production. This type of professional learning community will be especially well positioned to imagine a campus-wide initiative that scales generative AI literacy to faculty, staff, students and the community.

In addition to the structured professional development led by EdPlus, ASU has held a number of formal and informal in-person events across the campus. For example, ASU’s Learning Engineering Institute currently hosts weekly meetings with generative AI experts to help faculty stay up to date on key issues associated with the technology. Faculty across the university also meet regularly to develop instructional methods for incorporating generative AI into university classes. So far, conversations have focused on responsible usage policies, research-based initiatives on how students interact with generative AI technologies, instructional strategies that increase technological literacy, and the implications of generative AI for labour equity and academic integrity. These collaborations have produced research studies that measure the effectiveness of instructional methods that consciously incorporate generative AI technologies. The results of these studies will be distributed via the channels described above and to individual units that want guidance on how to bolster generative AI literacy among faculty, staff and students.

For universities interested in adopting a similar approach, the key is to bring together a core group of faculty and staff who have a baseline of generative AI literacy, expertise in faculty development and experience with research development or event coordination. Ask this group to create outcomes that align with both the university’s mission and the story you are telling about generative AI literacy. For example, one of our guiding outcomes is to understand generative AI research via its key terms, technical nuances, historical problem-solving processes and ethical challenges. Meetings should focus on these outcomes, and the effectiveness of proposed research initiatives should be measured by whether those outcomes were met. Again, the advantage of this approach is that it increases generative AI literacy by focusing the attention of faculty, staff and students on local problem-solving processes.

There is more to say, of course. But the bottom line is that ASU’s focus on increasing generative AI literacy helps us not only to approach new enterprise partnerships in a principled manner but also to measure their success against the access mission that guides our collective work. Holding open the question of generative AI’s role in higher education presents an opportunity for us to model our access values to our students, colleagues and the wider public. We will continue to do so in the classes we teach, the public and professional outreach opportunities we provide and the research we conduct. If generative AI is poised to transform how we communicate and work on a broad scale, we cannot afford to do anything less.

Kyle Jensen is the director of writing programmes and a professor of English at Arizona State University.

