
THE MORNING LEAD

Students urged to steer clear of ChatGPT essays as bot reopens debate on college assessment

The AI chatbot has taken social media by storm and left some educators concerned about the future of assessment.

HIGHER EDUCATION STAKEHOLDERS have said that colleges have to face up to emerging AI software such as ChatGPT, which has caused a storm on social media in recent weeks due to its ability to respond to almost any writing prompt.

The freely accessible software takes the form of a chatbot that responds to commands and prompts: it can explain scientific concepts, write scenes for a play, summarise lengthy articles or even write lines of computer code.
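As an illustration only, a prompt of the kind described above can also be sent to the model behind ChatGPT through OpenAI's Python client. The model name and prompt in the sketch below are assumptions, since most students use the free web chat interface rather than the API.

```python
# A minimal sketch of prompting the model behind ChatGPT via OpenAI's Python
# client. The model name and prompt are illustrative assumptions; the article
# describes the free web chat interface, not API access.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # any chat-capable model would do
    messages=[
        {"role": "user", "content": "Explain photosynthesis in simple terms."}
    ],
)

print(response.choices[0].message.content)
```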

The bot can go so far as to mimic the writing styles of various newspapers and poets, which has startled educators. New York City’s education department recently banned ChatGPT on its networks because of “concerns about negative impacts on student learning”.

Its performance has reopened the debate on the risks linked to artificial intelligence (AI) technologies.

But while college students may be tempted to use the software for assignments, third-level stakeholders have downplayed the potential of ChatGPT to transform university assessment as we know it.

Billy Kelly is the chair of the National Academic Integrity Network, which helps third-level institutions to effectively prevent and respond to academic misconduct. Speaking to The Journal, Kelly said that ChatGPT is only capable of producing essays “that might pass [in] the early years of university”.

“If you ask it a generic question, it will give you a credible answer,” he said. “All it’s really doing is predicting text. It doesn’t know if the text is right or not.”
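To illustrate what “predicting text” means, the sketch below uses the small, open GPT-2 model from the Hugging Face transformers library as a stand-in, an assumption made purely for illustration; ChatGPT itself is a far larger, closed model, but the underlying idea of scoring the most likely next token is the same.

```python
# A minimal sketch of next-token prediction, using the open GPT-2 model as a
# stand-in for the much larger model behind ChatGPT (an illustrative assumption).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The main cause of the First World War was"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# The model only estimates which token is most likely to come next;
# nothing in this step checks whether that continuation is correct.
next_token_id = logits[0, -1].argmax()
print(tokenizer.decode(next_token_id))
```

Nothing in that step verifies whether the predicted continuation is factually accurate, which is the limitation Kelly describes.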


The software cannot justify its choices; it cannot explain, for example, why it picked the particular words that make up its responses.

Kelly said that a major giveaway that an essay has been written by ChatGPT is that it would contain “no references whatsoever.”

“It won’t really have substance … if you’re not saying where these ideas are coming from … that should be a red flag.”

Clodagh McGivern, the Vice President for Academic Affairs at the Union of Students in Ireland (USI), told The Journal that colleges must make students aware of what resources they may or may not use to write an assignment. “Using sites that generate a piece of work for students could be seen to some as a form of contract cheating,” she said.

Contract cheating is the practice of students using a third-party tool or service to write assignments.

“If that is the case, it is crucial that the policies in each HEI [higher-education institution] are updated as technology advances and higher education changes. Students should be informed of the resources they can and can’t use when completing assessments.

“The first time students hear about artificial intelligence from their colleges shouldn’t be when they are called before a disciplinary committee. HEIs must continue to educate students on their policies and help them understand why these policies have been adapted and the need to abide by them.”

But McGivern also advised students to “stay away from using AI to complete their assignments.

“If the institution you attend views this as a form of academic misconduct, the penalties can be severe. Penalties can range from grades being reduced, students having to repeat the assignment with the possibility of their new result being capped, or even losing their place on a course.

“Students using AI to complete their course work may also stunt their academic growth as they aren’t achieving their learning outcomes.”

She said students should instead seek out support services in their institution such as essay-writing workshops.

A group of Australian universities recently said they would overhaul exam formats to banish AI tools, regarding them as simply a new way of cheating.

But David Boud, the Co-Director of the Centre for Research in Assessment and Digital Learning at Australia’s Deakin University, told The Journal that colleges should instead change how they teach to better prepare students for the growing presence of technologies such as AI in their everyday lives.

“What’s happened in a number of places, including Australia, is there has been some knee-jerk reactions,” Boud said. This, he said, is “denying the reality of something that’s going to change what we do [every day].”

Universities have to teach their students to recognise both the strengths and limitations of tools like ChatGPT, he said. One potentially useful application of such software is producing summaries of lengthy research papers.

An unhelpful response to the rise of AI, he said, would be to tell students: “Don’t go there.”

Kelly also pointed out that ChatGPT is not the first online tool that could be used to cheat in an assignment. “Before ChatGPT, there were loads of paraphrasing tools out there which students could have used to conceal plagiarism.”
