Three years after the public launch of ChatGPT, artificial intelligence is no longer a novelty on college campuses—it's a daily reality. Universities across Minnesota and beyond are now at a critical juncture, as educators and students navigate the complex landscape of AI in higher education. The technology is forcing a fundamental reevaluation of teaching methods, assignment design, and the very definition of academic integrity.
While some professors see AI as an innovative tool for learning, others view it as a direct threat to critical thinking. This division has led to a patchwork of policies and classroom approaches, with some instructors reverting to traditional pen-and-paper exams while others design projects that require students to use AI ethically and effectively.
Key Takeaways
- Many university professors are now creating their own classroom policies on AI use, leading to inconsistent rules across departments.
- To combat potential cheating, instructors are increasingly using in-class handwritten essays, oral exams, and assignments based on personal experience.
- Forward-thinking educators are integrating AI into their curriculum, teaching students how to use it as a tool for data analysis and idea generation.
- The debate over AI is prompting a broader conversation about the essential skills students need to learn before graduation.
A Campus Divided: The Two Sides of AI Adoption
On university campuses, the response to AI tools like ChatGPT is a tale of two philosophies. One camp is pushing back, concerned that the technology undermines the core purpose of education. The other is leaning in, arguing that proficiency with AI is a necessary skill for the modern workforce.
Dal Liddle, an English professor at Augsburg University, falls firmly into the first category. He has banned the use of AI in his writing classes, believing it prevents students from developing crucial job skills and the ability to think critically. To enforce this, he employs a mix of old and new methods.
"Universities teach job skills and critical thinking and build knowledge," Liddle stated, adding that in his view, AI "actively prevents all three."
His approach includes in-class paper-and-pencil tests, requiring handwritten writing samples, and using Google Docs to track a student's writing process. This allows him to see whether text appears suddenly, a common sign of AI-generated content.
The Return to Traditional Assessment
Liddle is not alone. Many educators are turning back the clock on assessment methods. Cat Saint-Croix, who teaches philosophy at the University of Minnesota, has her students complete logic assignments and exams by hand in the classroom. Her policy permits using AI to brainstorm ideas, much like discussing a topic with a friend, but prohibits it from producing any of the final work.
A Shift in Grading
Some professors, like Augsburg University physics chair Moumita Dasgupta, have decreased the weight of homework assignments in final grades, acknowledging that students can easily use AI to complete them. Instead, she relies more on frequent in-class quizzes and oral exam components.
The sentiment is echoed by Keenan Hartert, a biology professor at Minnesota State University, Mankato. He believes that essential human elements of education, such as mentoring and grading, should never be outsourced to AI. His tests are all conducted in person on paper. For final projects, he now requires a spoken component and assigns tasks rooted in personal experience, such as tracing a genetic trait through a student's family history.
"At that point it is like, ‘Why cheat on this?’" Hartert explained. "It’s going to be more effort to make up something [so personal]."
Embracing AI as a Teaching Tool
While some professors build defenses against AI, others are actively integrating it into their lesson plans. They argue that since AI is already a fixture in many industries, students must learn to use it responsibly. Corey Nelson, a marketing lecturer at the University of Minnesota’s Carlson School of Management, uses AI daily in his own work and expects his students to learn how to leverage it.
In a recent project, Nelson had students analyze hundreds of Amazon product reviews for e-bikes. They used ChatGPT to identify trends in the data and then compared those findings with their own analysis performed in Excel. The goal was not to replace thinking but to use AI as a powerful analytical assistant.
Discipline-Specific Challenges
The usefulness of AI varies significantly by field. According to Claire Halpert, director of the University of Minnesota's Institute of Linguistics, AI models are less effective in niche subjects like linguistics because they haven't been trained on a large volume of specialized texts. In contrast, fields like marketing and computer science are already heavily influenced by AI.
Shilad Sen, who chairs the computer science department at Macalester College, takes a balanced approach. His department has a "no AI" policy for introductory courses to ensure students master the fundamentals. However, in his upper-level classes, students are taught to use AI to solve real-world problems drawn from his own work as an AI scientist.
One such assignment challenged students to determine how AI could be used to match job candidates with open positions, a complex task that requires both technical skill and ethical consideration.
The Future of Learning and Academic Integrity
The rise of AI is forcing a profound conversation about the purpose of higher education. "AI really forces us to return to that question of, ‘What are we doing here?’" Halpert said. The consensus is that universities must focus on teaching skills that AI cannot replicate: critical analysis, ethical judgment, and creative problem-solving.
Many professors now begin their semesters with frank discussions about ethics. Dasgupta tells her physics students that cheating with AI will not help them pass in-class exams or succeed in their careers. Others appeal to students' financial investment, reminding them of the high cost of tuition.
Student Perspectives
Student opinions on AI are just as varied as those of their professors. Some express moral or environmental objections to the technology. Galin Jones, director of the University of Minnesota’s School of Statistics, noted the range of student views he has encountered.
"I remember one young woman saying, ‘I hate it,’" Jones recalled. "Other people are like, ‘Yeah, I’m using it to just help me cheat in classes that I don’t really care about.’"
Even students who see its benefits are wary. Macalester student Sheila Bhowmik used ChatGPT to help understand a dense academic text and found it remarkably effective. However, the experience left her concerned about her ability to resist using it as a shortcut under pressure.
As universities move forward, the challenge will be to create a framework that harnesses AI's potential while safeguarding academic integrity. This may involve more in-class writing labs, oral defenses of written work, and a renewed emphasis on what Hamline University Provost Wes Kisting calls "good, old-fashioned literacy"—the ability to critique and analyze information, regardless of its source.