Generating potential trouble: AI on campus

By John Cox

Youngstown State University students have been caught and penalized for using generative artificial intelligence to complete assignments. 

Certain programs, such as ChatGPT, pose a concern for instructors. Because these models are trained on vast amounts of data, a detailed prompt is enough for the AI to write a student’s assignment for them, with varying degrees of success.

YSU staff is learning more about these programs: what they do, how their data is collected and how to respond when they are used in an academically dishonest way.

Mark Vopat, head of the YSU-OEA faculty union and a philosophy professor, attended a meeting with the founder of GPTZero, an AI tool that can detect when text has been created by another AI.

“A grad student at Princeton had written a program called GPTZero. He came up with an incredibly accurate detector of AI,” Vopat said. “He’s already formed a company with some of his fellow grad students and they’re offering his program along with an [application programming interface] that can be used for other programs.”

Blackboard currently supports SafeAssign, a plagiarism-detection program. Rosalyn Donaldson, director of IT Training Services, said that because generative AI doesn’t cite its sources, AI-written work could slip past the program.

“Generative AI does not necessarily cite information that is found on the web. It’s difficult to determine whether or not it’s plagiarized,” Donaldson said. “It’s up to the student in their sense of due diligence to do work [and] to not use those tools, but we have tools in place already to find issues of plagiarism.” 

Assistant Dean of Students Erin Hungerman said the use of AI by students violates academic integrity rules regarding plagiarism and the use of outside sources. Students who are caught and admit to the charge can be penalized by their professor.

“If students admit to the charge, the professor is able to work with them to come up with whatever type of penalty would be appropriate,” Hungerman said.

Hungerman also said if students deny the charge, the matter goes to an academic integrity hearing, where a panel of faculty members and students determines the likelihood of the offense and, if the student is found responsible, assigns an appropriate penalty.

All students found to have violated academic honesty policies by using AI software are referred to the Dean of Students Office. Hungerman said she’s already dealt with violations.

“We’ve had a handful of cases that’ve been routed to our office where faculty members have been able to determine and students have eventually admitted to using AI for at least a portion of the writing assignments,” Hungerman said.

Donaldson said students should be cognizant that the process of completing assignments is itself an important part of learning.

“When you write a thesis or dissertation, the research you’re reading is that of others, but you’re assimilating that information into knowledge and using critical thinking in order to formulate an idea,” Donaldson said. “If you’re using generative AI, you don’t know if the information is correct or from a reputable source.”

Vopat said he agrees that assignments aren’t only about the finished product.

“If I assign a paper in one of my courses, is it the paper that I’m concerned about or should I be concerned with the entire process?” Vopat said. “I want my students to take an idea, research that idea, outline, draft a paper and revise that draft.” 

According to Vopat, students using ChatGPT to cheat and bypass the learning process are cheating themselves out of the higher education they’re paying for.

“Parents say, ‘Go to a university to get a good job.’ That’s not the only reason you go. It’s not just about a job or a grade on a paper; it’s about learning to get to that end,” Vopat said.