Universities must think smarter when devising edtech strategies for the future

16 March 2023
The ideal vision is one where AI and faculty work together to deliver the best outcomes, rather than a two-tier system where the less privileged are left with a low-cost, automated education

At any given time, there will be a range of new technologies whose proponents claim them to be transformational. In the early 2000s, this included open educational resources, virtual learning environments and virtual worlds such as Second Life. During the pandemic, the focus was mainly on technologies that allowed us to deliver education at a distance, such as virtual and hyflex classrooms. We are now witnessing the emergence of new trends, including virtual, augmented and mixed reality, blockchain, NFTs, AI, datafication and analytics.

Not all these emerging technologies will be transformative, and many will affect only certain niches of the curriculum, if at all. But one thing is clear: simply embracing new edtech trends – such as investing in VR headsets – cannot be a strategy in and of itself. A critical task for decision-makers within higher education is therefore to identify which technologies have the potential to make a significant and long-lasting impact. One set of emerging technologies with strong potential for disruptive change is data analytics and AI.

As the student experience is increasingly digitised, we have access to new information about those who choose to study with us in the form of a data footprint. Here at Imperial College Business School we have hired a dedicated team of data specialists to collect and unify such data, from application through to becoming an alumnus. Once this unified dataset is built, we can apply sophisticated mathematical modelling to reveal previously unseen patterns, providing unique insight into the aspects of the student experience we care about most – performance, satisfaction and graduate outcomes.

The goal of using this technology is to facilitate the development of precision education, allowing us to make better-informed, evidence-based decisions regarding school strategy, recruitment, resourcing and educational design. In turn, this means we can provide students with a more precise and tailored learning journey based on their individual skill sets and personal objectives, as well as develop sophisticated impact measurement tools. This is a significant investment but, in time, it will have the power to transform us into a data-driven organisation.
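To give a flavour of the kind of modelling involved, the sketch below is a toy illustration only, not Imperial's actual pipeline: a small, hypothetical set of unified student records is clustered with pandas and scikit-learn to surface broad engagement and outcome profiles of the sort that could inform tailored support and resourcing decisions.

```python
# Illustrative sketch only (not Imperial's actual pipeline): cluster a
# hypothetical unified student dataset to surface patterns in engagement,
# performance and satisfaction that might inform precision-education decisions.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical unified records, one row per student journey
students = pd.DataFrame({
    "vle_logins_per_week":  [12, 3, 8, 1, 15, 6],
    "average_grade":        [72, 55, 68, 48, 81, 63],
    "satisfaction_score":   [4.2, 3.1, 3.9, 2.8, 4.6, 3.5],
    "careers_appointments": [2, 0, 1, 0, 3, 1],
})

# Standardise features so no single metric dominates the distance measure
features = StandardScaler().fit_transform(students)

# Group students into broad engagement/outcome profiles
kmeans = KMeans(n_clusters=2, n_init=10, random_state=42)
students["profile"] = kmeans.fit_predict(features)

# Summarise each profile; in practice this is where evidence-based decisions
# about tailored support, resourcing or curriculum design would begin
print(students.groupby("profile").mean().round(2))
```

Any real deployment would, of course, rest on far richer data, rigorous validation and careful governance of how student information is used.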

Looking to the future, such datafication, machine learning and precision education may be the pathway to the development of an educational AI, capable of complementing or even replacing humans in performing certain tasks.

Of course, using AI in higher education isn’t a new idea – it’s already being implemented across a range of functions, from automating admin and general management tasks to enhancing learning, particularly in the context of marking and assessment. But recent technological progress is pushing the boundaries of what tech can do. For example, ChatGPT has already taken the education sector by storm, threatening to disrupt traditional learning practices by changing the way we teach and learn.

Thus, it doesn’t seem too far a stretch to imagine an educational future in which AI tutors are developed to guide students through their personalised learning journey, learn about their individual abilities, identify gaps in their learning and provide tailored feedback and support.

The emergence of AI tutors would represent a transformative shift in the higher education landscape. It would allow institutions to deliver high-quality programmes at scale and at lower cost, making learning more widely accessible and affordable for students. It would also give faculty the time and freedom to build relationships with their students, bringing the depth of understanding and adaptability that remain uniquely human capabilities.

This is a highly attractive prospect, but there are challenges. For example, despite its advancements, AI is not yet sophisticated enough to perform tasks that go beyond providing programmed responses to predetermined inputs. Such widespread adoption would require the development of a general AI capable of performing at a more complex and intellectual level.

Of even greater concern are the ethical considerations around protecting students and how their data is used and shared. Additionally, although AI promises better-quality education, we shouldn't underestimate the challenge of changing people's mindsets about what high-quality education looks like. We have already fought this battle in convincing traditional educationalists of the merits of online education – and it took a pandemic to settle the argument.

We also need to be careful to consider the potential for these new technologies to be adopted inappropriately within the sector. According to David Lefevre, professor of practice at Imperial College Business School: “There is a fear among people in roles like mine that technology will create a two-tier educational system in which there exists a low-cost, largely automated education system for some, and a high-touch, more human experience for those who can afford it.”

According to David, this moves our discussion into the realm of politics. Is it fair that some students have access to better-resourced education than others? If not, what’s the solution? Trying to level the playing field by restricting access to elite institutions? Spending more on less-equipped institutions to ensure they have a comparable level of resource?

I share David’s concerns, but this scenario is not inevitable. As key stakeholders within the sector, it is within our power to engage intelligently with these new technologies, grasp the wider issues and intervene. Decisions about technology adoption and application should be pedagogically driven, and we ought to be philosophical in our decision-making about what adds value and what does not.

For example, using ChatGPT to create high-level lesson plans can be valuable: it can help faculty save time and streamline processes. But the technology doesn't understand what it's writing, so it can be factually incorrect and reproduce problematic biases. As with other AI technologies, we still need to review and correct AI-generated text, and this editing requires subject-matter expertise. Ultimately, the ideal vision for AI in higher education is one where AI and faculty work together, capturing the best of human and digital prowess for the best student outcomes.
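As a hedged illustration of that workflow (not a recommendation of any particular tool or model), a faculty member with access to OpenAI's API might draft a lesson plan outline as follows; the model name and prompt are assumptions, and the output is only a starting point for expert review.

```python
# Illustrative sketch only: drafting a high-level lesson plan with OpenAI's
# chat API (openai>=1.0). The model name and prompt are assumptions; any
# output must be reviewed and corrected by a subject-matter expert before use.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name; substitute whatever is available
    messages=[
        {"role": "system",
         "content": "You are an assistant drafting outline lesson plans "
                    "for a postgraduate business-school module."},
        {"role": "user",
         "content": "Draft a one-hour lesson plan introducing data-driven "
                    "decision-making, with learning outcomes and activities."},
    ],
)

draft = response.choices[0].message.content
print(draft)  # a starting point for faculty review, not a finished plan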

As a final comment, it is worth noting that we have faced a similar scenario in the past, when the media predicted Moocs would replace traditional learning. One of the fears was that we would end up with a relatively small number of elite universities delivering high-touch, high-quality education to those who could afford it, and low-cost, low-support Mooc-style courses delivered to the masses. This didn't come to fruition. Today, Moocs are recognised more for offering lifelong learning than degrees, and their student engagement and course completion rates are abysmal. We can draw the same parallel with low-cost automated support here.

Sarah Grant is director of digital education and leads Imperial College Business School’s award-winning Edtech Lab. Her responsibilities also include piloting and implementing new educational technologies and seeking out opportunities to enhance the student experience and disrupt the educational landscape.

This article has been written with input from David Lefevre, professor of practice in management and entrepreneurship at Imperial College Business School. David’s research focuses on the applications of analytics and AI to education.

