NOVEMBER 13 — As many educators know, the advent of ChatGPT, Gemini and a myriad of other Artificial Intelligence (AI) tools has thrown a spanner into the entire system.
These AI chatbots are like virtual 3D printers in that they can generate almost any kind of report, material, presentation and so forth.
Most controversially (in the context of education), AI systems allow students to produce immediate answers to assignment questions.
Given that more and more assignments are set and submitted in soft copy, this is a huge issue. In the past, the only way to get past anti-plagiarism software was to get someone else to write your assignment for you.
You couldn’t just copy-and-paste from all over the Web because the “non-originality” flags would pop up faster than mushrooms in the morning.
With ChatGPT, however, students can simply plonk the assignment question into the dialogue box and, voilà, the system answers everything for them in no time.
TLDR: AI makes it easy and fast to cheat on online assignments nowadays.
So how do university departments prevent or “catch” cheating given that AI systems have run rampant?
For now there are basically only two main options when it comes to addressing cheating in online assignments.
The first is for lecturers to verbally “grill” students if they suspect something amiss with their work.
Face to face, if a student can’t respond to questions adequately, that would suggest the writing may not be entirely his own.
The problem with this approach is, obviously, practicality. If you have many students (fyi, the average university class can range from 20 to 200 students), surely you’re not going to speak to each and every one who submits a “suspiciously” well-written assignment?
And if you’re only speaking selectively to some, it would be legitimate to ask why you suspected this student and not another.
The second is to use AI-detection software to inform lecturers if certain paragraphs in an assignment answer were generated via AI. You could say this is the academic version of an anti-deep fake.
However, just like with deep-fake images, AI software can improve over time. Not unlike how criminals can get smarter and smarter to deceive the police, AI programs can develop better and better ways to remain undetected by their opposite AI-detection cousins.
Now for the bad news.
Malaysia is presently home to tens of thousands (more than 40,000 as of today) of students from China, many of whom struggle with writing in English.
Hence, many of them use online translation software; they write their assignments in Chinese and then translate them into English prior to submission.
Here’s the rub: Almost all of these online translation programs employ AI, and hence the translated work will likely be flagged by AI-detection software.
So, if nothing further is done, this means that many students will be in trouble for the crime of translating their work from Chinese (or another language) to English.
I understand some universities (or some of their individual departments) have forsworn dealing with this issue altogether. For them it’s: hang it, if students want to cheat with AI, that’s none of our business; if they want to learn, they learn; if not, it’s on them.
Another “extreme” approach is to drop online assignments and go almost full hard-copy. I really hope whoever decides on this approach doesn’t preach about “eco-sustainability” or saving trees (smile).
I suppose, in the end, almost the only way forward between the Scylla of ignoring the problem and the Charybdis of going fully offline is to include some, well, trust and intuition during assessments.
At the end of the day, lecturers will likely need to look beyond what the AI-detection software tells them and use their gut-feel and overall interactions with the class(es) to decide if a student is cheating or not.
More importantly, maybe lecturers need to be a bit flexible about how much “cheating” or what kind of use of online material should matter.
At present there are no easy answers. But, who knows, maybe that’s a good thing?
* This is the personal opinion of the columnist.