By Nicarlyle Hanchard The Jambar
Since generative artificial intelligence became publicly accessible, Youngstown State University, like many other institutions, has had to tailor its response to student usage.
Through the Office of Academic Affairs, the university’s AI policy focuses on furthering the university’s mission of “inspiring individuals, enhancing futures, and enriching lives.” The policy stresses ethical usage as the technology is integrated.
Alongside the existing challenges of university education, professors must now consider AI use when assigning and grading coursework. Mark Vopat, professor of philosophy and president of YSU’s Ohio Education Association chapter, said AI use must be regulated based on the nature of the course.
“I don’t think there is a one-size-fits-all sort of approach to AI. I think that in a lot of classrooms, particularly the type of classes I teach, that AI doesn’t actually help but actually hurts the learning process,” Vopat said. “In other areas I know that it can be valuable, so in computer programming, the computer science department, the way in which programming is being done now I think it’s helpful.”
Vopat also said that while he recognizes legitimate uses for the technology in other areas, he is wary of students using AI to avoid doing work or engaging with difficult ideas.
“I worry about it being used as a replacement for the grappling with difficult subjects and difficult concepts, you know, and in part, that is the way we learn, is by having to wrestle with new ideas, new concepts, before we go and try to find an answer or someone to help,” Vopat said. “When we are talking about how AI should be used right now, unfortunately, the majority of the time I see it being used outside of specific classes that are utilizing it as part of the course, is that students are using it to avoid doing work.”
YSU has developed courses, such as Artificial Intelligence in Business, taught by Joe Palardy, professor of economics in the Williamson College of Business Administration, to help students navigate the changing technological space. He also said as the technology changes, so does the course content.
“What I was going to teach, or what I did teach last year, almost has to be different from what I’m teaching this year,” Palardy said. “The basic idea for the class, at least from a historical perspective, was we would teach some basic prompt engineering, right? How you work with the AI tools, what they’re capable of and then it was a lot of experimentation.”
Palardy said the class’ assignments expose students to the technology’s capabilities, sharpen their evaluative skills and probe its limits.
“It’s when you’re working a lot with AI, the skill sets that you often use sometimes are different, and one of the skill sets that you use is evaluation. Instead of generating stuff, you’re offloading a lot of the generation to the AI, but you still have to validate it,” Palardy said.
Continuing the idea of AI as a tool to aid learning, Jill Tall, associate professor in the department of Chemical and Biological Sciences, said she was introduced to AI as a collaborative tool. She said she encourages students to use the technology in a similar manner or to help simplify terms. She also said that regardless of students’ familiarity with AI, she does not believe her role as a professor would change.
“I really feel that my role as a faculty member is to foster critical thinking and curiosity, and plant those seeds to our undergraduates and let them grow as they become, in my case, physicians. Most of the students I work with are pre-health, so regardless of AI and its use, I feel that ultimately, in the classroom and in my laboratory and working with students in support hours, my role is to help them foster that curiosity,” Tall said.
Tall also said she believes it would be a disservice to students if they are not exposed to the technology before entering the workforce. Similar to Vopat and Palardy, Tall said she understands that the technology’s use and integration may differ based on areas of study.
“I do believe there’s something to be said for the discipline specific nature of the use of AI,” Tall said. “But it’s here, and our students need to have some awareness of it and just like if a new type of biological technique was developed, I think it’s our responsibility to show our undergraduate students what this technique is all about.”
Anwarul Islam, professor and director of the civil engineering program, holds a view that combines those perspectives. Islam said students should use AI to familiarize themselves with the technology, but thinks large language models can hinder the learning process.
“We want to see how you are evolving through the process, how much you are learning, how you are learning and using your reference materials and your own thought process into writing an essay,” Islam said. “On the other hand, if you use ChatGPT, you are not doing anything like that. ChatGPT is not using your thought process, it is a command-based output. So, whatever command you put in, that’s how ChatGPT will read, and then it will give you the output based on your command.”
He said the field of engineering is based on mathematical interpretations and other design concepts with which AI could assist students, but they must be aware of possible inaccuracies. Islam said students should use the available AI technologies to their advantage, but said he believes limitations must be placed on the extent of its use.
