
Expert comment: Why academia should embrace - not ban - ChatGPT


A leading expert in Artificial Intelligence (AI) says schools, colleges and universities should embrace ChatGPT as a valuable tool for learning rather than prohibit it due to the threat of plagiarism.

David Reid, Professor of AI and Spatial Computing at Liverpool Hope University, believes the advent of ChatGPT proves that AI needs to be taught in schools and says that not doing so would be a great disservice to the workforce of tomorrow.

Launched by OpenAI and released to the public in November 2022, ChatGPT is a chatbot capable of producing an answer or response to almost any question it is asked, and it immediately caused concern within academia.

Unlike a search engine such as Google, ChatGPT does not simply pull information from the internet; it reconstructs frequently associated concepts into fresh text, which makes plagiarism much more difficult to detect.
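To illustrate the distinction, the sketch below uses the small, openly available GPT-2 model (an assumed stand-in for illustration; ChatGPT itself is far larger and not freely downloadable) to show how a language model generates text by predicting likely continuations of a prompt rather than retrieving passages from the web.

    # Minimal sketch: generate text with the openly available GPT-2 model.
    # This illustrates the general technique, not ChatGPT itself.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    prompt = "The causes of the First World War include"
    inputs = tokenizer(prompt, return_tensors="pt")

    # Each new word is drawn from a probability distribution learned from the
    # training text, so the output is reconstructed rather than looked up.
    output = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_k=50)
    print(tokenizer.decode(output[0], skip_special_tokens=True))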

There is already plenty of evidence of ChatGPT-based cheating in universities, which has led to the tool being banned in schools across several US states.

However, Professor Reid believes using the technology will become an essential life skill and is calling for AI to be taught from primary school age to ensure we equip future generations with the skills and knowledge to use it effectively.

“There are two ways the education sector can react to ChatGPT,” he explains.

“We can try and ban it, but that will be difficult because we have already seen that AI-based plagiarism detection tools like Turnitin are failing to identify submissions from ChatGPT.

“Or, we try to educate people about it like we did with computers.

“ChatGPT can be a useful learning tool so we should be educating students on how best to use it at school or university, but we should also be doing so because it is going to become more common in the workplace.

“It is going to be an essential life skill so everyone needs to be AI literate just as they are computer literate.

“I personally think it should be taught in primary school and there is no question whatsoever that it should be taught in secondary school.”

Professor Reid says incorporating AI into the school curriculum should involve:

 - Highlighting the strengths and weaknesses of tools like ChatGPT, so that students neither blindly accept their answers at face value nor fear using them.

 - Demonstrating how the data sets ChatGPT is trained on may be inherently biased, which means the tool can also serve a useful function by holding a mirror up to society's thoughts and opinions.

 - Illustrating how data is transforming our fundamental way of interacting with computers: rather than thinking of them simply as machines that execute algorithms, we can begin to think of them as belief machines.

 - Helping students understand, at a basic level, what tools like ChatGPT are doing ‘under the hood’ and why they make the decisions they do.

 - Highlighting how ChatGPT could be a co-worker in all sorts of circumstances and jobs.

Rather than viewing ChatGPT as a threat, Professor Reid considers it to be education’s ‘great disruptor’.

If the sector is to embrace the technology as a tool, it will need to adapt, and assessment is one area he has identified as needing a change of approach.

Professor Reid says ChatGPT could be incorporated into essays or exams – just as calculators are used in maths – but believes universities will have to reformulate the types of questions they ask of students.

“At the moment, ChatGPT is poor at causal reasoning and constructing logical arguments,” he adds.

“It can contradict itself when arguing a specific point, its answers are often overly verbose, it makes up phantom references, and it argues untruths in a convincing way.

“It is also very poor at introspection, so when you point out its mistakes it simply accepts them rather than defending its position.

“However, one of its great strengths is that you can quickly refine its answers in an interactive way. All of this means we need to rethink how we assess students. 

“To contain its use, we should ask questions that require proof of a logical ‘chain of thought’. They could be asked in ways that are hard to define textually, such as through diagrams, pictures, videos, or physical objects and hardware.

“But a deeper question also exists – why not let students use it as part of the assessment under certain conditions?

“Big companies such as Google, Meta and IBM are all preparing their own chatbots, and the AI driving these systems is advancing rapidly, which ultimately means that trying to detect the hand of AI in future assignments is going to become far more difficult.”

Please note: All expert comments hosted on the Liverpool Hope University website provide the views and opinions of the author(s), and not the views and opinions of Liverpool Hope University.


Published on 23/02/2023