#Take5 #115 How Much AI-Generated Text Can Students Use?

This #Take5 is brought to you by Eva Shackel, a learning developer at Bath Spa University. Eva is co-chair of the ALDinHE Community of Practice on AI and is currently researching students’ use of GenAI. Last year she wrote her university’s guidance on AI use, and here are her reflections on the decision-making process behind what to allow.

Bath Spa’s Lessons from the Academic Phrasebank 


It’s a full academic year since I wrote Bath Spa University’s guidelines for students on using Generative AI (GenAI). At the time, I felt quite daring in stating that not only could students use it, but they could also cut and paste from it. This approach was shaped by my own experiences with ChatGPT, and an ongoing internal dialogue with an imaginary academic integrity panel. 

From the outset, it seemed to me that debating whether students should be allowed to use GenAI was moot. There’s no point in banning something we can’t effectively police. Moreover, the term ‘use’ was being interpreted in fundamentally different ways. Asking GenAI to write whole essays is clearly inappropriate and easily detectable. However, using it for feedback, such as inputting different versions of a paragraph and asking which is better and why, is much harder to detect and, I would argue, wholly different. 

I made it up

As a learning developer, like many of us, I initially used ChatGPT to generate short pieces of academic writing to use as illustrative examples when teaching topics like referencing. It can quickly generate text tailored to a specific student cohort, which meant my PowerPoints and exercises were much more bespoke than the generic examples I’d relied on previously. At the same time, I was writing a PhD proposal on how students use GenAI. Though I asked it not to write the proposal for me, at times in our dialogue ChatGPT would say something I was really tempted to cut and paste, but which wasn’t a substantial enough idea to warrant referencing. This is where that imaginary academic integrity panel came into play. When I found myself using ChatGPT’s phrase “There is a pressing need for research…”, my defence lay in precedent. If it’s OK to cut and paste from the Academic Phrasebank, why not from ChatGPT?


I started to think about what it was that made the Phrasebank acceptable to use. Firstly, it doesn’t contain facts. Secondly, it typically offers parts of sentences or short phrases. This led me to propose that students could similarly use short phrases or sentence fragments from ChatGPT, provided they don’t contain factual information. 

The next question was: how much student text could reasonably be composed of such fragments? Typically, a Phrasebank phrase introduces evidence or studies within a paragraph, and students might use three or so studies per paragraph. So the final guidance suggested a maximum of three such fragments per paragraph. In practice, some examples of phrases that would be allowed by this approach include: ‘Emerging technologies provide…’ and ‘This approach incorporates diverse perspectives…’, as well as the aforementioned ‘There is a pressing need for research…’

Say what?

At the time, this felt quite bold, and to my knowledge, Bath Spa’s guidance is unique in allowing a limited amount of cutting and pasting from GenAI. There’s also a lot in there about using GenAI in ways that don’t involve direct copying, such as asking it for feedback or having it act as a virtual assistant. The guidelines have been published for students and inform our team’s workshop content. I am currently updating them to cover image creation, a hot topic given that many of our students are studying creative and art-based subjects, and producing some differentiated resources for PhD students.

So far, the reception to this approach from academics has been largely positive, from the AI early adopters at least. However, one academic has said that “students won’t read it properly, and will just think you’re giving them permission to cut and paste wholesale”. It’s also been suggested that students need a certain amount of knowledge to be able to work with AI, and that as learning developers we need to support writing as a learning process (see this guide on doing just that). To find out if they’re right, and to learn how students are interpreting what I’ve written, the next stage is to seek feedback more formally from students. I am also curious to hear from the wider learning development community, and from academic integrity panels, real or imagined, whether they think my defence is viable. You can contact me at e.shackel@bathspa.ac.uk.

About Eva


Eva currently works at Bath Spa University in the Academic Skills team. She started her career teaching Sociology, and has held a wide range of roles in learning development. She has published research into written feedback and is currently researching students’ use of GenAI.
