By Christopher Gillett
Jambar Contributor
OpenAI announced Sora, an artificial intelligence video-generation technology, on Feb. 15. Sora generates more realistic videos than previous AI video tools.
In several industries, there are worries that AI video software such as Sora could be used to commit crimes, cheat, spread disinformation and eliminate jobs.
OpenAI posted a YouTube video Feb. 17 showing Sora-generated videos, created from text prompts, that mimic animations, drone shots, historical footage and movie cinematography.
While the company has said Sora will eventually be available to the general public, there is no announced release date or waitlist.
Joseph Palardy, an economics professor at Youngstown State University, is involved in crafting YSU’s AI policy. Palardy said a set of AI principles will be presented to the Academic Senate in April.
According to Palardy, AI deepfake technology could allow a student to frame a faculty member, but such unethical activities were already addressed by the university’s Acceptable Use of University Technology Resources policy and Student Code of Conduct before the technology was developed.
Palardy said one challenge will be regulating AI in syllabi.
“The faculty member has to know what’s possible [with AI] in order to put it down in their syllabus or specify in class that ‘okay, this is not what we want students to do,’” Palardy said.
Palardy said the speed of AI innovation makes regulating AI video technology a challenge.
“Any faculty member who is doing videography work or having their students create videos should learn how to use or at least experiment or play around with Sora so they know what it’s capable of,” Palardy said. “The technology’s changing so fast, it’s hard to make sure we get every possible iteration within the syllabus.”
Communications professor Dan McCormick is a studio coordinator and broadcast engineer at YSU who has worked as a professional videographer. McCormick said the potential of AI video is limited, especially at YSU.
“[AI is] not necessarily thinking it up like a human thinks up something. It’s more or less pulling from everything that’s ever been produced,” McCormick said. “If I wanted [students] to get b-roll or video of campus, there’s a good chance there wouldn’t be enough out there for an AI generator to pull from to create. Now, that’s today. Very soon that might not be an issue.”
Paul Ditchey, a senior lecturer and freelance videotape operator, said cheating with AI videos in and out of the classroom isn’t a guaranteed success.
“People have had their Pulitzer Prize taken away because they made up a story or something like that,” Ditchey said. “If you send [a student] out to shoot something locally and they came back with drone footage of the Amazon jungle, you’d probably know they didn’t shoot it themselves.”
Ditchey also said AI videos look different from video shot on a camera.
“They either have a little choppiness to them, or the colors aren’t quite right, or the movement of whoever’s in it doesn’t seem to be fluid enough,” Ditchey said. “You just look at it and go, ‘It’s not real. I just can’t tell you why.’”
According to McCormick, authentic-looking videos created by AI have the most potential.
“If they could pull that off, that’s the area where it’s a little bit scary because it looks like someone could’ve taken that video,” McCormick said. “That could be a security camera or that could be someone’s cell phone.”
McCormick also said he’s worried about the potential for AI video software to replace jobs.
“It makes me a little nervous as a video professional, because a lot of jobs that would pay decent money might be eliminated,” McCormick said. “Just because part of it is scary doesn’t mean that I’m against it. I’m just curious how it’s going to be used. It seems very easy to do the wrong thing — be it cheating on a project or framing someone for murder.”
Editor’s note: Paul Ditchey is Jambar TV’s advisor. Ditchey is not involved in the editorial process.