There should be national guidelines on the use of artificial intelligence (AI) in tests and papers submitted in higher education, the National Union of Students (LSVb) tells Nieuwsuur. Colleges and universities differ on whether students may use such tools: one educational institution encourages their use, while another prohibits it.
The most popular tool among students is ChatGPT, a bot you can ask questions and that draws on sources from all over the internet. In seconds, ChatGPT can write a research paper at academic level. It can also translate and proofread. But when teachers suspect a text was written by a chatbot, proving it in practice is a huge challenge.
With old-fashioned plagiarism, copied paragraphs could be traced one by one, but the AI-based plagiarism detectors that some teachers use are far from foolproof.
No sources found
“Sometimes I feel like an auditor,” says Associate Professor Marielle Attinger, director of education at the Open University’s law school. “If I recognize a particular structure from ChatGPT and then look at the footnotes, I see different sources.” When Attinger checks those sources, they often don’t exist at all. “So it’s poor-quality content, copied one-to-one from ChatGPT. The student doesn’t meet the requirements.”
While schools expect teachers to be vigilant against plagiarism and fraud, there is still no national policy on the use of AI in education. Every educational institution is reinventing the wheel, which leads to differences in where students may and may not use AI.
The student union LSVb objects to this arbitrariness. “Say you start out in higher vocational education, where you are taught how to work with artificial intelligence. But if you later go to study elsewhere, you are suddenly a fraudster, even though you apply the same method,” says Abdulkadir Karbas, president of the LSVb.
For example, the University of Amsterdam prohibits AI “unless expressly stated otherwise.” Erasmus University Rotterdam considers it plagiarism or ghostwriting when a student uses an AI program without the examiner’s permission.
The University of Twente requires students to indicate on a test whether they used AI, even if they didn’t. The Amsterdam University of Applied Sciences also requires a justification: “Because AI-generated content does not always have specific sources but is trained on huge amounts of data, you indicate which AI model was used, such as ChatGPT,” is the rule there.
Student Suusje Helwegen prefers not to use AI in her work. She thinks the risk is too great: “What if they kick you out of college?”
Some educational institutions have no policy at all. “We receive a lot of complaints from students,” says LSVb president Karbas. “Students who don’t know how to deal with it. Most of them just want to pass and not be labeled as fraudsters. That’s why it’s so important for educational institutions to make agreements on the use of AI.”
The Association of Universities of Applied Sciences stresses that institutions apply different rules. “When it comes to AI, we are still finding our way,” says chairman Maurice Lemmen.
Suspicion often remains suspicion
Attorney Casper van Vliet, of CumLaud Legal, offers legal assistance to students under suspicion. “Many AI texts have never been written before in that exact form, so it’s very difficult for a plagiarism scanner to detect them. So suspicion often remains suspicion.”
Van Vliet regularly handles cases of students who have been flagged as fraudsters. He says they are often cleared for lack of evidence.
Cat-and-mouse game
The Association of Universities of Applied Sciences calls it a cat-and-mouse game. “Detection methods keep evolving, and so do the AI programs,” says Lemmen.
But it can be done differently, believes Associate Professor Attinger. In her view, too much attention is paid to the risks. When she suspects fraud, she now takes her own approach: “I don’t put my energy into reporting it. I prefer to put my energy into teaching students what the technology does and how they can use it for good.”
The Ministry of Education is monitoring the development of AI in education, but also says: “Educational institutions themselves are responsible for the quality of education and testing.” The ministry adds that all MBO, HBO and WO institutions (secondary vocational, higher vocational and university education) are now working on a sector-wide vision.