‘Lines are getting blurred’ between student AI use and cheating, educators say

Pen-and-paper tests may become the latest weapon in anti-cheating technology as artificial intelligence is rendering students’ take-home exams and homework assignments obsolete.

“The cheating is off the charts. It’s the worst I’ve seen in my entire career,” Casey Cuny, a 2024 recipient of California’s Teacher of the Year award, told the Milwaukee Independent. “Anything you send home, you have to assume is being AI’ed.”

The prolific use of AI has forced high school and college educators to re-evaluate “many of the teaching and assessment tools that have been used for generations” such as writing outside the classroom, the Independent explained. 

“We have to ask ourselves, what is cheating?” said Cuny, who has taught English for 23 years. “Because I think the lines are getting blurred.” 

“Learning with AI instead of cheating with AI” 

Some teachers such as Cuny have adapted by assigning most work inside classrooms under strict guidelines. 

By monitoring student laptop screens, blocking site-specific access and discussing AI usage in lessons and study aids, Cuny works “to get kids learning with AI instead of cheating with AI,” according to the Independent. 

Jolie Lahey, one of Cuny’s former students, is now an 11th grader at Valencia High School in Southern California. She grew accustomed to working with ChatGPT and other AI programs under Cuny’s guidance but now faces “No AI” policies from her teachers this academic year, she said. 

“It’s such a helpful tool,” she said of AI. “And if we’re not allowed to use it, that just doesn’t make sense. It feels outdated.” 

Because schools have mostly left AI policies to individual teachers, guidelines can vary widely from district to district, school to school or even within schools. 

“Some educators, for example, welcome the use of Grammarly.com, an AI-powered writing assistant, to check grammar,” the Independent explains. “Others forbid it, noting the tool also offers to rewrite sentences.” 

“I wonder, is this cheating?” 

Even where AI policies are consistent and specific, students say they still struggle to interpret educators’ intent. 

For example, college sophomore Lily Brown struggles to apply a rule from one of her class syllabi: “Don’t use AI to write essays and to form thoughts.” 

“Sometimes I feel bad using ChatGPT to summarize reading, because I wonder, is this cheating? Is helping me form outlines cheating?” said Brown, who is majoring in psychology at an East Coast liberal arts school. 

“If I write an essay in my own words and ask how to improve it, or when it starts to edit my essay, is that cheating?” 

Rebekah Fitzsimmons is chair of the AI faculty advising committee at Carnegie Mellon University’s Heinz College of Information Systems and Public Policy. Over the summer, she helped create “detailed new guidelines for students and faculty that strive to create more clarity,” the Independent noted. 

“A lot of faculty are doing away with take-home exams. Some have returned to pen and paper tests in class, [Fitzsimmons] said, and others have moved to ‘flipped classrooms,’ where homework is done in class.” 

Ultimately, the responsibility lies with faculty to adapt alongside the fluctuating technological environment, argues Emily DeJeu, assistant teaching professor at Carnegie Mellon’s Tepper School of Business. 

“To expect an 18-year-old to exercise great discipline is unreasonable,” she said. “That’s why it’s up to instructors to put up guardrails.” 

Promoting AI usage “responsibly” nationwide 

As previously reported by The Lion, more than half of Americans in Pew Research surveys express reservations over increased AI usage in daily life. 

At the same time, a growing share of U.S. residents report working with AI across various professions, including data processing (63%), accounting, banking, finance, insurance or real estate (10%), and information and technology (12%). 

“U.S. adults are generally pessimistic about AI’s effect on people’s ability to think creatively and form meaningful relationships,” the survey center noted, adding “53% say AI will worsen people’s ability to think creatively, compared with 16% who say it will improve this.” 

The federal government has taken steps to increase AI education among the nation’s youth, with President Trump signing an executive order in April to “promote AI literacy and proficiency” for an “AI-ready workforce.” 

More recently in September, First Lady Melania Trump hosted the White House Task Force on AI Education, noting a duty for leaders and parents to “manage AI’s growth responsibly.” 

“During this primitive stage, it is our duty to treat AI as we would our own children – empowering, but with watchful guidance,” she said. 

Brooke Rollins, U.S. Secretary of Agriculture, noted “our young people are ready” to win the international race in developing AI with such tools as the First Lady’s Presidential AI Challenge. 

“I want to say that President Trump has been very clear, the United States will lead the world in artificial intelligence, period, full stop,” she said. “Not China, not any of our other foreign adversaries, but America.” 

“It’s literally just science” 

One promising example of innovative AI usage is Timeback, an AI app developed by Joe Liemandt that aims to help students learn more efficiently. 

“As a country we spend $20,000 or more per kid, and what do we get? Sixty percent of eighth graders can’t read, they feel stupid, and they hate school,” Liemandt said. 

In contrast, AI apps such as Timeback can track progress, address learning gaps and reteach subjects so students can accomplish more in less time, according to Liemandt. 

“People think it’s witchcraft, but it’s literally just science,” he said. “It’s how we can take a kid the conventional school system calls ‘two years behind’ and catch them up in 40 to 60 hours. It’s how we give kids their time back.” 

Instead of dismissing all AI concerns from parents – including increased screen time, diminished academic performance and cyberbullying – Liemandt argues the way it’s used will determine its benefits to education. 

“If you deploy ChatGPT to every student in America, we will become the dumbest country on the planet,” he said. 

Meanwhile, he told Colossus, a “combination of high support and high standards” provided by AI tools will empower students to outpace the traditional classroom model, which he says yields only a 5% retention rate. 

“When you teach them 10 times better and then give them these cool workshops where they can learn these life skills that they really care about, they love it.”