- As AI use becomes more commonplace in higher education, students are now raising concerns about professors’ use of AI tools for tasks like grading and lesson planning. College professors told Fortune that the use of AI for tasks such as class preparation and grading has become “pervasive.” Some students argue that this diminishes the value of their education and raises transparency and fairness issues.
AI use continues to cause trouble on college campuses, but this time it’s professors who are in the line of fire. While it was once faculty at higher-education institutions who were up in arms about students’ use of AI, now some students are growing increasingly irked by their professors’ reliance on it.
On forums like Rate My Professors, students have complained about lecturers’ overreliance on AI.
Some students argue that instructors’ use of AI diminishes the value of their education, especially when they’re paying high tuition fees to learn from human experts.
The average cost of yearly tuition at a four-year institution in the U.S. is $17,709. For students attending an out-of-state public four-year institution, that average jumps to $28,445 per year, according to the research group Education Data.
However, others say it’s unfair that students can be penalized for AI use while professors fly largely under the radar.
One student at Northeastern University even filed a formal complaint and demanded a tuition refund after discovering her professor was secretly using AI tools to generate notes.
College professors told Fortune the use of AI for things like class preparation and grading has become “pervasive.”
However, they say the problem lies not in the use of AI itself but rather in faculty members’ tendency to conceal just why and how they are using the technology.
Automated Grading
One of the most contentious uses of AI is grading students’ work.
Rob Anthony, part of the global faculty at Hult International Business School, told Fortune that automating grading was becoming “more and more pervasive” among professors.
“Nobody really likes to grade. There’s a lot of it. It takes a long time. You’re not rewarded for it,” he said. “Students really care a lot about grades. Faculty don’t care very much.”
That disconnect, combined with relatively loose institutional oversight of grading, has led faculty members to seek out faster ways to process student assessments.
“Faculty, with or without AI, often just want to find a really fast way out of grades,” he said. “And there’s very little oversight…of how you grade.”
However, if more and more professors simply let AI tools pass judgment on their students’ work, Anthony worries the result will be a homogenized grading system in which students increasingly get the same feedback.
“I’m seeing a lot of automated grading where every student is essentially getting the same feedback. It’s not tailored, it’s the same script,” he said.
One college teaching assistant and full-time student, who asked to remain anonymous, told Fortune they were using ChatGPT to help grade dozens of student papers.
The TA said the pressure of managing full-time studies, a job, and a mountain of student assignments forced them to look for a more efficient way to get through their workload.
“I had to grade something between 70 to 90 papers. And that was a lot as a full-time student and as a full-time worker,” they said. “What I would do is go to ChatGPT…give it the grading rubric and what I consider to be a good example of a paper.”
While they said they reviewed and edited the bot’s output, they added that the process felt morally murky.
“In the moment when I’m feeling overworked and underslept… I’m just going to use artificial intelligence grading so I don’t read through 90 papers,” they said. “But after the fact, I did feel a little bad about it… it still had this sort of icky feeling.”
They were particularly uneasy about how AI was making decisions that could impact a student’s academic future.
“I am using artificial intelligence to grade someone’s paper,” they said. “And we don’t really know… how it comes up with these ratings or what it is basing itself off of.”
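For illustration only, here is a minimal sketch of the kind of rubric-based grading workflow the TA describes: feeding a grading rubric and a strong example paper to ChatGPT, then asking it to score each submission. It assumes the OpenAI Python client; the model choice, rubric text, and helper function are hypothetical stand-ins, not anything the TA shared.

```python
# Hypothetical sketch of rubric-based LLM grading (not the TA's actual code).
# Assumes the OpenAI Python client (`pip install openai`) and an API key in OPENAI_API_KEY.
from openai import OpenAI

client = OpenAI()

RUBRIC = """Thesis clarity (0-25), use of evidence (0-25),
structure (0-25), citations and formatting (0-25)."""  # placeholder rubric

EXAMPLE_PAPER = "..."  # an essay the grader considers a good model answer

def grade_paper(paper_text: str) -> str:
    """Ask the model to score one paper against the rubric and the example."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model choice
        messages=[
            {"role": "system", "content": "You are a teaching assistant grading undergraduate essays."},
            {"role": "user", "content": (
                f"Grading rubric:\n{RUBRIC}\n\n"
                f"Example of a strong paper:\n{EXAMPLE_PAPER}\n\n"
                f"Grade the following paper against the rubric and justify each score:\n{paper_text}"
            )},
        ],
    )
    return response.choices[0].message.content
```

As the TA notes, any output from a script like this would still need to be reviewed and edited by a human before a mark is recorded.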
‘Bots Talking to Bots’
Some of that frustration stems from students’ own use of AI, professors say.
“The voice that’s going through your head is a faculty member that says: ‘If they’re using it to write it, I’m not going to waste my time reading.’ I’ve seen a lot of just bots talking to bots,” Anthony said.
A recent study suggests that almost all students are using AI to help them with assignments to some degree.
According to a survey conducted earlier this year by the UK’s Higher Education Policy Institute, almost all students (92%) now use AI in some form, up from 66% in 2024.
When ChatGPT was first released, many schools either banned the use of AI outright or placed restrictions on it.
Students were some of the early adopters of the technology after its release in late 2022, quickly finding they could complete essays and assignments in seconds.
The widespread use of the tech created distrust between students and teachers as professors struggled to identify and punish AI use in coursework.
Now, many colleges are encouraging students to use the tech, albeit in an “appropriate way.” Some students still appear to be confused—or uninterested—about where that line is.
The TA, who primarily taught and graded intro classes, told Fortune “about 20 to 30% of the students were using AI blatantly in terms of writing papers.”
Some of the signs were obvious, like those who submitted papers that had nothing to do with the topic. Others submitted work that read more like unsourced opinion pieces than research.
Rather than penalizing students directly for using AI, the TA said they docked marks for failing to include evidence or citations.
They added that papers written by AI were marked favorably when automated grading was used.
They said when they submitted an obviously AI-written student paper into ChatGPT for grading, the bot graded it “really, really well.”
Lack of Transparency
For Ron Martinez, the problem with professors’ use of AI is the lack of transparency.
The former UC Berkeley lecturer and current Assistant Professor of English at the Federal University of Paraná (UFPR) told Fortune he’s upfront with his students about how, when, and why he’s using the tech.
“I think it’s really important for professors to have an honest conversation with students at the very beginning. For example, telling them I’m using AI to help me generate images for slides. But believe me, everything on here is my thoughts,” he said.
He suggests being upfront about AI use, explaining how it benefits students, such as allowing more time for grading or helping create fairer assessments.
In one recent example of helpful AI use, Martinez began using large language models like ChatGPT as a kind of “double marker” to cross-reference his grading decisions.
“I started to think, I wonder what the large language model would say about this work if I fed it the exact same criteria that I’m using,” he said. “And a few times, it flagged up students’ work that actually got… a higher mark than I had given.”
In some cases, AI feedback forced Martinez to reflect on how unconscious bias may have shaped his original assessment.
“For example, I noticed that one student who never talks about their ideas in class… I hadn’t given the student their due credit, simply because I was biased,” he said. Martinez added that the AI feedback led him to adjust a number of grades, typically in the students’ favor.
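A rough sketch, under the same assumptions as the earlier example, of the “double marker” cross-check Martinez describes: the same criteria he uses are sent with each paper, and any paper where the model’s mark diverges sharply from the instructor’s is flagged for a second look. The scoring scale, threshold, and data format here are illustrative assumptions, not his actual workflow.

```python
# Hypothetical "double marker" cross-check (illustrative only, not Martinez's actual workflow).
from openai import OpenAI

client = OpenAI()

CRITERIA = "Argument quality, evidence, organization, style; score 0-100."  # placeholder criteria

def model_mark(paper_text: str) -> int:
    """Ask the model for a single numeric mark using the same criteria the instructor applied."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model choice
        messages=[{
            "role": "user",
            "content": f"Criteria: {CRITERIA}\n\nReturn only an integer mark (0-100) for this paper:\n{paper_text}",
        }],
    )
    return int(response.choices[0].message.content.strip())

def flag_discrepancies(papers: dict[str, str], human_marks: dict[str, int], threshold: int = 10) -> list[str]:
    """Return the students whose model mark differs from the instructor's by more than the threshold."""
    flagged = []
    for student, text in papers.items():
        if abs(model_mark(text) - human_marks[student]) > threshold:
            flagged.append(student)
    return flagged
```

In this setup the model never sets the grade; it only surfaces papers the instructor may want to re-read, which is how Martinez describes catching his own unconscious bias.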
While some may despair that the widespread use of AI could upend the entire concept of higher education, some professors are already starting to see students’ use of the tech as a positive.
Anthony told Fortune he had gone from feeling “this whole class was a waste of time” in early 2023 to “on balance, this is helping more than hurting.”
“I was beginning to think this is just going to ruin education, we are just going to dumb down,” he said.
“Now it seems to be on balance, helping more than hurting… It’s certainly a time saver, but it’s also helping students express themselves and come up with more interesting ideas, they’re tailoring it, and applying it.”
“There’s still a temptation [to cheat]…but I think these students might realize that they really need the skills we’re teaching for later life,” he said.
This story was originally featured on Fortune.com