AI in Education: What About Staff Use?

There has been, and continues to be, a lot of chatter and panic across many sections of society with the rise of AI, not least in education — schools, colleges and universities alike. The predominant focus has been on student use of AI: on plagiarism, on the tweaking and adapting of assessment, and on detection. The question that should be asked, though, is: what about staff use? Really, how much is too much from our side as faculty and teaching staff?

I raised this recently at an internal conference where a senior, respected colleague was encouraging the faculty to embrace AI in teaching and support. It was a fair and well-balanced talk. However, the concern still seemed to be mainly about student use (and misuse), yet more and more staff are using AI to create lecture slides, video content and materials. Following the common saying, "What is good for the goose is good for the gander", students can rightly ask us the same question: where are the checks and balances for staff use (and misuse)?

So today, that just happened. Directly and very publicly.

Earlier, The Guardian shared a story [1] about a student cohort at a UK university who noticed their course looked like it had been generated by AI. They cited suspicious file names, artificial voiceovers and unchecked content mistakes. The students challenged it in a live online lesson while recording the session. Outraged and indignant, they said they could more easily have asked ChatGPT themselves than attended the lessons. Damning.

To me, this did not look like students being anti-AI. They are already well aware that AI is changing work and life around us, and they are embracing its use; studies show this [2]. However, despite the transformational uses of AI that we see everywhere, there is clearly an important point being made here about university teaching and learning: a question of quality and authenticity in education.

So what happens when education becomes 'processed'? When convenience replaces care? When lecturers allow AI to do too much of the heavy lifting? Does it not resemble food culture? Once, home-cooked meals were made from freshly prepared ingredients. Then we moved to sauces and additives from jars, then to ready meals, and then to fast food. It saved time, but arguably the quality has changed and somewhat diminished, and society now pays the price in declining health.

So the question remains: if we move too far toward AI-generated teaching, what will be the cost to knowledge, learning and the student experience?

This is not a criticism, but something to reflect on and openly debate. Together.

[1] The Guardian: https://lnkd.in/ewXsH2TK
[2] Student perceptions of AI: https://lnkd.in/e-BD3FV4

*Accompanying image proudly created with ChatGPT! 🙂


About Mohamed Yaseen

Digital Learning Consultant in the Higher Education sector. Governor of a leading city school in Leicester. #Education #Technology #DigitalLearning #PeopleDevelopment #Mediation #Productivity #PublicSpeaking
