Another AI Side Effect: Erosion of Student-Teacher Trust

AI is exacerbating a feeling, building since the pandemic, that the classroom dynamic has grown transactional.

By Greg Toppo | September 22, 2025
Eamonn Fitzmaurice/The74, Getty

William Liang was sitting in chemistry class one day last spring, listening to a teacher deliver a lecture on “responsible AI use,” when he suddenly realized what his teachers are up against.

The talk was about a big, take-home essay, and Liang, then a sophomore at a Bay Area high school, recalled that it covered the basics: the grading rubric, along with suggestions for using generative AI honestly, as a “thinking partner” and brainstorming tool.

As he listened, Liang glanced around the classroom and saw that several classmates, laptops open, had already leaped ahead several steps, generating entire drafts of their essays.

Liang said his generation doesn’t engage in moral hand-wringing about AI. “For us, it’s simply a tool that enables us not to have to think for ourselves.”

For us, it’s simply a tool that enables us not to have to think for ourselves.

William Liang, student

But with AI’s awesome power comes a side effect that many would rather not consider: It’s killing the trust between teachers and students.

When students can cheaply and easily outsource their work, he said, why value a teacher’s feedback? And when teachers, relying on sometimes unreliable AI-detection software, believe their students are taking such major shortcuts, the relationship erodes further.

It’s an issue that researchers are just beginning to study, with results that suggest an imminent shakeup in student-teacher relationships: AI, they say, is forcing teachers to reconsider how they view students, assessments and, more broadly, learning itself.

If you ask Liang, now a junior and an experienced writer (he has penned pieces for The Hill, The San Diego Union-Tribune and the conservative Daily Wire), AI has already made school more transactional, stripping many students of their desire to learn in favor of simply completing assignments.

“The incentive system for students is to just get points,” he said in an interview.

While much of the attention of the past few years has focused on how teachers can detect AI-generated work and put a stop to it, a few researchers are beginning to look at how AI affects student-teacher relationships.

Researcher Jiahui Luo of the Education University of Hong Kong found that college students in many cases resent the lack of “two-way transparency” around AI. While they’re required to declare their AI use and, in a few cases, even submit chat records, Luo wrote, the same level of transparency “is often not observed from the teachers.” That produces a “low-trust environment,” where students feel unsafe to freely explore AI.

In 2024, after colleagues at Drexel University asked him to help resolve an AI cheating case, researcher Tim Gorichanaz, who teaches at the university, analyzed college students’ online posts spanning December 2022 to June 2023, shortly after OpenAI unleashed ChatGPT onto the world. He found that many students were beginning to feel the technology was testing the trust they felt from instructors, in many cases eroding it, even if they didn’t rely on AI.

While many students said instructors trusted them and would offer them the benefit of the doubt in suspected cases of AI cheating, others were surprised when they were accused nonetheless. That damaged the trust relationship.

For many, it meant they’d have to work on future assignments “defensively,” Gorichanaz wrote, anticipating cheating accusations. One student even suggested, “Screen recording is a good idea, since the teacher probably won’t have as much trust from now on.” Another complained that their instructor now implicitly trusted AI plagiarism detectors “more than she trusts us.”

It's creating this situation of mutual distrust and suspicion, and it makes nobody like each other.

Tim Gorichanaz, Drexel University

In an interview, Gorichanaz said instructors’ trust in AI detectors is a big problem. “That’s the tool that we’re being told is effective, and yet it’s creating this situation of mutual distrust and suspicion, and it makes nobody like each other. It’s like, ‘This is not a good environment.’”

For Gorichanaz, the biggest problem is that AI detectors simply aren’t that reliable; for one thing, they are more likely to flag the papers of English language learners as being written by AI, he said. In one Stanford University study, detectors “consistently” misclassified non-native English writing samples as AI-generated, while accurately identifying the provenance of writing samples by native English speakers.

“We know that there are these kinds of biases in the AI detectors,” Gorichanaz said. That potentially puts “a seed of doubt” in the instructor’s mind, when they should simply be using other ways to guide students’ writing. “So I think it’s worse than just not using them at all.”

‘It is an enormous wedge in the relationship’

Liz Shulman, an English teacher at Evanston Township High School near Chicago, recently had an experience similar to Liang’s: One of her students covertly relied on AI to help write an essay on Romeo and Juliet, but forgot to delete part of the prompt he’d used. Next to the essay’s title were the words, “Make it sound like an average ninth-grader.”

Asked about it, the student simply shrugged, Shulman recalled in a piece she co-authored with Liang.

In an interview, Shulman said that just three weeks into the new school year, in late August, she had already had to sit down with another student who used AI for an assignment. “I pretty much have to assume that students are going to use it,” she said. “It is an enormous wedge in the relationship, which is so important to build, especially this time of the year.”

It is an enormous wedge in the relationship, which is so important to build.

Liz Shulman, English teacher

Her take: School has transformed since the long COVID lockdowns of 2020, with students recalibrating their expectations. It’s less relational, she said, and “much more transactional.”

During lockdowns, she said, Google “infiltrated every classroom in America; it was how we pushed out documents to students.” Five years later, if students miss a class because of illness, their “instinct” now is simply to check Google Classroom, the widely used management tool, “rather than coming to me and saying, ‘Hey, I was sick. What did we do?’”

That’s a bitter pill for an English teacher who aspires to shift students’ worldviews and beliefs, and who relies heavily on in-class discussions.

鈥淭hat’s not something you can push out on a Google doc,鈥 Shulman said. 鈥淭hat takes place in the classroom.鈥

In a sense, she said, AI is contracting where learning can reliably take place: If students can simply turn off their thinking at home and rely on AI tools to complete assignments, that leaves the classroom as the sole place where learning occurs. 

“Because of AI, are we only going to ‘do school’ while we’re in school?” she asked.

‘We forget all the stuff we learned before’

Accounts of teachers resigned to students cheating with AI are “concerning” and stand in contrast to what a solid body of research says about the importance of teacher agency, said Brooke Stafford-Brizard, senior vice president for Innovation and Impact at the Carnegie Foundation.

Teachers, she said, “are not just in a classroom delivering instruction — they’re part of a community. Really wonderful school and system leaders recognize that, and they involve them. They’re engaged in decision making. They have that agency.”

One of the main principles of a Carnegie blueprint for improving secondary education is a “culture of trust,” which calls on schools to nurture supportive learning and “positive relationships” for students and educators.

“Education is a deeply social process,” Stafford-Brizard said. “Teaching and learning are social, and schools are social, and so everyone contributing to those can rely on that science of relational trust, the science of relationships. We can pull from that as intentionally as we pull from the science of reading.”

Education is a deeply social process. Teaching and learning are social, and schools are social.

Brooke Stafford-Brizard, Carnegie Foundation

Gorichanaz, the Drexel scholar, said that for all its newness, generative AI presents educators with what’s really an old challenge: how to understand and prevent cheating.

“We have this tendency to think AI changed the entire world, and everything’s different and revolutionized and so on,” he said. “But it’s just another step. We forget all the stuff we learned before.”

Specifically, long-standing research identifies four key reasons why students cheat: They don’t understand an assignment’s relevance to their lives, they’re under time pressure, they’re intimidated by high stakes, or they don’t feel equipped to succeed.

Even in the age of AI, said Gorichanaz, teachers can lessen the allure of taking shortcuts by solving for these conditions: figuring out, for instance, how to intrinsically motivate students to study by helping them connect with the material for its own sake. They can also help students see how an assignment will help them succeed in a future career. And they can design courses that prioritize deeper learning and competence.

To alleviate testing pressure, teachers can make assignments more low-stakes and break them up into smaller pieces. They can also give students more opportunities in the classroom to practice the skills and review the knowledge being tested.

And teachers should talk openly about academic honesty and the ethics of cheating.

“I’ve found in my own teaching that if you approach your assignments in that way, then you don’t always have to be the police,” he said. Students are “more incentivized, just by the system, to not cheat.”

With writing, teachers can ask students to submit smaller “checkpoint” assignments, such as outlines and handwritten notes and drafts that classmates can review and comment on. They can also rely more on oral exams and handwritten blue book assignments.

Shulman, the Chicago-area English teacher, said she and her colleagues are not only moving back to blue books, but to doing “a lot more on paper than we ever used to.” They’re asking students to close their laptops in class and assigning less work to be completed outside of class.

As for Liang, the high school junior, he said his new English teacher expects all assignments to come in handwritten. But he also noted that a few teachers have fallen under the spell of ChatGPT themselves, using it for class presentations. As one teacher clicked through a slideshow last spring, he said, “It was glaringly obvious, because all kids are AI experts, and they can just instantly sniff it out.”

He added, “There was a palpable feeling of distrust in the room.”
