Ohio professors weigh how to adapt curriculum as student AI use soars

Ygal Kaufman / Ideastream Public Media
Professor Carelicha Duckworth speaks to her intro-level computer science class at Tri-C, where the curriculum revolves around teaching students to use AI tools responsibly.

Youngstown State University philosophy professor Mark Vopat bans use of generative AI in his classroom. But that doesn’t mean students don’t try to use it.

"A bunch of students actually plagiarized their paper by having an AI write their data ethics paper, which, that one right there, I didn't even know what to say," Vopat said.

Beyond academic dishonesty, however, Vopat said he thinks AI is preventing students from wrestling with big ideas, as philosophers do.

"I actually want them to struggle with the material ... hard is good. Hard is what allows us to move information from sort of a short-term memory into a long-term memory," Vopat said. "It's what allows them to learn how to make connections between difficult concepts."

Mark Vopat
Mark Vopat, a professor of philosophy at Youngstown State University.

The advent of AI tools like ChatGPT over the last several years has changed the higher education landscape, with recent surveys suggesting between 80 and 90% of college students use them for coursework. But for professors especially, it's upending longstanding practices and patterns of teaching, and some are struggling to adapt. And a recent national study of faculty found 95% were worried that student overreliance on AI erodes critical thinking skills and jeopardizes academic integrity.

As tools become more sophisticated, Vopat questioned how professors will be able to tell if students used AI. Normally with violations of academic ethics guidelines, he'd simply fail them on an assignment or test.

"But there are times when it will be then taken to the next level, which would be to report to the university and ask that there be a stricter penalty, which could be suspension or removal from the university," he said. "Right now with AI, it's becoming more and more difficult to even do that unless the student admits to it, right? Because generally without evidence, students say, 'No, I wrote that.' And there are legitimate problems detecting it."

AI in the classroom

Most colleges and universities in Ohio have broad policies allowing professors to adopt their own approach to AI in the classroom.

On one end of the spectrum are the Vopats of the world. On the other is Kent State University Professor Stefanie Moore. She encourages students to experiment with AI in her Digital Public Relations and Social Media class, as long as they cite it. During a recent classroom visit, students debated the merits of Coca-Cola's AI-generated Christmas TV commercial, and talked about their experiments with AI.

"Students have been charged with identifying a use case, so how they could use AI to automate a task or process," Moore explained.

Moore’s research suggests modern PR professionals use AI frequently in their work, and she wants her students to be prepared for that reality. Her attitude is that professors have to adapt their curriculum to fit the changing times.

Stefanie Moore
Kent State University Professor Stefanie Moore, right, with some students in her digital public relations class at a recent Greater Cleveland PRSA event.

"We really have to take a step back in terms of our assignments and retooling them to really focus on the critical thinking and maybe changing how students are doing writing work, building it into the classroom, maybe coaching them to start with their own drafts, and then using AI to augment what they do," she explained. "So there's a lot of things that we've been working on to make the assignments more valuable in terms of critical thinking and having them use their own creativity."

Senior Tate Bryan thought the content strategy developed by a popular AI tool called Claude was helpful for marketing her blog about Hollywood’s golden age.

"It gives a content calendar, content pillars, a social media strategy, and it does SEO and growth," she explained. "It knows terms that we learn in class. So it's built for the profession."

But the students also butted up against the programming’s limitations.

Senior Busola Sobola tried to use AI tools to develop a video marketing campaign for shea butter targeting African women like her. They weren’t that accurate.

"This obviously is an American, an African-American woman. This is not an African woman, definitely," she said, laughing, as she showed me a picture the AI had generated.

The voices it made up also weren’t right.

"I gave it another prompt: Can you make it more Nigerian or have an African accent? Then it started using some Nigerian slangs, but still saying the words in the Western accent," she said.

Multiple studies suggest AI tools replicate human racial bias across a range of systems and fields, from healthcare to language models.

AI and the future, for students and professors

Despite its limitations, some industries and modern workplaces are demanding knowledge of AI, said Carelicha Duckworth, a preceptor in the computer science department at Cuyahoga Community College (Tri-C). So, she created her intro-level computer science class curriculum from scratch last year to teach students how to use a mix of AI tools. One of the main class projects late last year was for students to use AI to build a plan to upgrade the IT infrastructure for a fictional veterinary clinic.

Duckworth said the assignment mimics what many students could be doing post-college.

Ygal Kaufman / Ideastream Public Media
Professor Carelicha Duckworth, center right, looks at a student's assignment in her computer science class at Tri-C.

"In order for these students to understand and appreciate technology, I think we have to remain innovative, right? So for me, it was a no-brainer. We are going to use AI responsibly," she said.

Duckworth's class is a mix of college-age and high-school students taking the class for college credit. Some said AI use on assignments is banned outright at their current high schools. Andre Donerson, a junior at Cleveland Central Catholic High School, said while he's long been interested in computer science, his exposure to AI in the classroom is informing his future career.

"I've now kind of migrated to maybe trying to find a major specifically focused on AI," Donerson said.

Some universities are charging ahead with the technology. Bowling Green State University and the University of Akron have recently announced AI-focused majors and programs. And Ohio State University last year said it will embed AI into every class, and wants every student to graduate “fluent” with it. Vopat, at YSU, questions the rush.

"Everyone's presenting as this great tool, but it's only a tool once you know and understand how to use it or have the foundation," he said. "I don't want AI writing things for me if I don't know how to write well, right?"

If AI does all the hard work for students, Vopat worries they’ll leave college without the critical thinking skills and knowledge they’ll need to change the world for the better.