
Designing courses in an AI era: what we can learn from our students

“I’m studying [insert topic here]. Motivate me.”

The above isn’t a message to a tutor (thankfully), but a prompt from OpenAI’s Top 20 Chats for Finals. The list, released earlier this year, reveals the creative ways students are using generative AI in their learning. It’s striking that the most popular examples don’t involve using AI to cut corners, but to make learning more effective. We’re seeing similar patterns among our own students. In a recent survey of 1,556 participants on the Introduction to Generative AI course, the most common use of AI was to understand concepts, with 1,087 students (~70%) saying they had used AI as a personal tutor.

As a learning designer, I find this data both fascinating and thought-provoking. It challenges us to think about how learning design might evolve when students are already using generative AI to adapt their learning. Give learners an e-module today and a significant number will personalise that content in some way. They might ask a chatbot to explain concepts in simpler language or create quizzes and new opportunities to apply knowledge. They may even adapt materials to fit their current mood. One prompt from OpenAI’s Top 20 Chats really caught my attention:

“I’m not feeling it today. Help me understand this lecture knowing that’s how I feel.”

This serves as a powerful reminder that students don’t arrive at every topic full of motivation and focus (as much as it may pain us to admit it!). However, through AI they’re finding ways to adapt resources to stay engaged. It’s a sophisticated use, and one that highlights a key strength of generative AI in education – its ability to adapt and personalise interactions to meet learners’ needs on different levels. This is something that has always been difficult to achieve with one-size-fits-all digital resources and it’s exactly the kind of opportunity that is worth exploring in AI-augmented learning design.

Adapting to a changing landscape

This individual example reflects a wider trend in how students are approaching their studies, with generative AI usage on the rise. The Higher Education Policy Institute (HEPI) 2025 survey revealed that 92% of respondents reported using AI tools in their learning this year, up from 66% last year. Usage is also strong among Imperial students. In our Introduction to Generative AI questionnaire, just under one third reported using AI daily (32%), and a further 45% said they use it at least 1–2 times per week.

What does this all mean for the way we design courses? Should we simply accept that students are using AI to make adaptations and leave them to it? My answer is a strong no! Understanding how students are using AI should motivate us to update our practices. As pedagogical experts, we have a responsibility to explore opportunities to integrate AI in ways that support learners and improve the effectiveness of our digital resources.

There are several pedagogical and practical reasons why our involvement is essential.

1. Generative AI is not automatically beneficial for learning

Not every student’s use of AI supports effective learning. Sometimes students just want to get the job done with as little effort as possible, and even the most well-intentioned learners may be using AI in ways that aren’t helpful. Commercial models like ChatGPT are general-purpose chatbots – they’re not designed specifically for education, and some of their built-in behaviours would make a teacher’s toes curl. They make things up, exhibit bias, talk too much and over-edit. This is not ideal when learners are trying to grasp new concepts or develop key skills.

2. Carefully designed AI interactions can improve learning

If students are already turning to AI to personalise course content, our role is to help them do it well. Through intentional design we can offer learners AI support that is more accurate, inclusive and pedagogically sound than what they may find elsewhere. Well-designed AI interactions will make our digital resources more responsive to learners’ needs and help to keep students within Imperial’s learning environment, where we can maintain quality and protect our data.

3. Digital equality demands our attention

There’s also a strong digital-equality incentive to act. We can no longer assume that learners experience our resources in the same way across cohorts. By incorporating AI into our course design, we can help ensure that all students benefit from responsive, personalised content, not just those with advanced AI knowledge or the means to pay for premium tools.

4. AI literacy as an essential skill set

Including more AI interactions in our resources also requires us to think carefully about students’ AI literacy and to ensure these skills are embedded within the curriculum. Through our design choices, we can model appropriate uses of AI and support students in developing the skills to critically evaluate AI output.

5. Practical challenges

AI interactions work best when they are thoughtfully embedded within learning sequences and informed by course content. This is where limitations in our current provision become clear. At present, Imperial’s approved AI tools – Copilot and dAIsy – offer secure access to advanced GPT models, which is a valuable starting point, but they are both stand-alone platforms. A more integrated future, where AI is embedded within Imperial’s systems, would enable richer and more seamless learning experiences. Our move to Canvas may open new opportunities in this direction, and it will be important to explore them carefully.

Closing thoughts: An opportunity to build better digital resources

The message is clear – students are using generative AI in new and creative ways to support their learning. The question, therefore, isn’t whether this change is happening, it’s whether we are designing with it or despite it. If we choose the former, we have an opportunity to improve our digital resources and build more inclusive, accurate and pedagogically effective AI interactions for our learners. If we choose the latter, students will continue to rely on external tools that may not be appropriate for education. Generative AI can support learning, but only if we take an active role in shaping how it appears within our courses.

If our students can ask, “Motivate me,” perhaps our response should be, “We’re working on it.”

References

OpenAI (2024) Top 20 Chats for Finals. Available at: https://edunewsletter.openai.com/p/top-20-chats-for-finals (Accessed: 11 December 2025).

Higher Education Policy Institute (HEPI) (2025) Student Generative AI Survey 2025. Available at: https://www.hepi.ac.uk/reports/student-generative-ai-survey-2025/ (Accessed: 11 December 2025).


Thoughts as we begin our second year

Students and staff in the spring sunshine at South Kensington campus, Dangoor Plaza.

We’re about to embark on a second year of the AI Futurists initiative.

Since June 2024, the five embedded AI Futurists have been devising ways to further Imperial’s AI capabilities in the education space. This work began with uncovering activities related to GenAI in Education within the AI Futurists’ local faculties and departments, and exploring opportunities to connect people to share practice and join up. Each of the AI Futurists has responsibilities and interests within their ‘day jobs’ related to genAI. These range from identifying training gaps and designing learning to fill them, to collaboratively developing and deploying AI tools to assist with learning and teaching across taught courses.

In my role as Lead AI Futurist, I am tasked with facilitating the work of the group and with prioritising and developing ideas for streams of work. I also horizon-scan to develop our awareness of HE sector initiatives related to genAI more broadly. This work aligns well with my role as Senior Teaching Fellow for Library Services, where I work with library teams to deploy learning across taught courses at Imperial using our bespoke Information and Digital Literacy framework, which we collaboratively amended in 2024 to address genAI across students’ areas of competency.

Our approach to being AI Futurists has been, first and foremost, rooted in curiosity. We have taken the stance of asking questions first, rather than being quick to provide answers. In the past year we have learned as we’ve gone along, taking care to channel our enthusiasm into planning. Addressing the fervour around generative AI has at times felt like a Sisyphean task, with new tools and use cases emerging weekly to sift through, along with a scary raft of ethical challenges.

Among other things, in the past year we’ve developed a Special Interest Group, delivered two cross-faculty AI hackathons, deployed the Business School Faculty Bot, and will soon launch a staff-facing Introduction to Generative AI at Imperial course. We’ve established relationships with stakeholders and friends, including the EDU, ICT, the Centre for Academic English and internal comms teams, to align AI initiatives with staff and student needs.

We had the opportunity to contribute a panel discussion to the Imperial Festival of Learning and Teaching, where, along with three student panellists, we considered and debated whether genAI is a sustainable partner and what a sustainable future for generative AI in education might look like.

We’ve only begun to scratch the surface of AI and the wide range of ethical concerns associated with it. I consider this to underpin everything we do as we engage further with these tools and the companies that produce and supply them. In recent weeks, for example, we’ve seen deeply troubling media reports of young people turning to generative AI for mental health support, and have become aware of a new level of the disturbing potential for AI-generated harm.

We’ve taken the responsibility of being AI Futurists to heart. To that end, we’ve had worthwhile and perplexing discussions about what it means to be true ‘Futurists’ while still needing to address existing needs and fill gaps.

Our priorities for the coming year include:

• Strategic AI curriculum development beyond immediate course-level changes
• Dedicated research time to assess AI’s pedagogical impact
• Broader faculty engagement, enabling proactive rather than reactive AI strategy development
• Prioritising student collaboration, for example by convening a student-focused forum, in collaboration with ICU, to discuss students’ perspectives on generative AI’s impact on learning
• Experimental tool development in teaching, learning and assessment
• Explorations of the societal impact of genAI, which we expect will involve cross-disciplinary work with the Social Sciences and Humanities

We fully expect these aims to evolve and be subject to the influence of the perpetually moving target that is genAI in education.


References

Hall, R. (2025) ‘Sliding into an abyss’: Experts warn over rising use of AI for mental health support. Available at: https://www.theguardian.com/society/2025/aug/30/therapists-warn-ai-chatbots-mental-health-support (Accessed: 2 September 2025).

Imperial College Library Services (2025) Information and digital literacy. Available at: https://www.imperial.ac.uk/admin-services/library/learning-support/information-and-digital-literacy–/ (Accessed: 2 September 2025).

Papageorgiou, V. (2025) Conceptualising and envisioning the ‘sustainable teacher’ within the contemporary university. Society for Research into Higher Education. Available at: https://srhe.ac.uk/wp-content/uploads/2025/04/Papageorgiou_NRreport.pdf (Accessed: 1 September 2025).

Peck, E., McCarthy, B. and Shaw, J. (2025) The future of the campus university: 10 trends that will change higher education. HEPI Policy Note 64. Oxford: Higher Education Policy Institute. Available at: https://www.hepi.ac.uk/wp-content/uploads/2025/05/The-future-of-the-campus-university.pdf (Accessed: 1 September 2025).