A new study from the University of Southern California reveals a significant trend among college students: they are more likely to use generative artificial intelligence to get direct answers for assignments rather than to understand the material. The research highlights that students who feel less confident in a subject or are less connected to their peers are the most frequent users of AI for this purpose.
Key Takeaways
- A USC study found students primarily use generative AI for "executive help-seeking" (getting answers) instead of "instrumental help-seeking" (learning a concept).
- Students with lower course confidence and less peer interaction are more likely to rely on AI for quick solutions.
- AI adoption is high: one survey found 85% of students used generative AI for coursework in the past year.
- Faculty encouragement of thoughtful AI use can shift student behavior towards learning-oriented approaches.
- Students are calling for clearer, standardized policies from universities on acceptable AI use.
A Shift in Student Help-Seeking Behavior
As generative AI tools become integrated into academic life, universities have focused on teaching students to use them ethically. However, research from the USC Center for Generative AI and Society suggests a gap between institutional goals and student practice. The study, which surveyed 1,000 U.S. college students, found that AI is often used as a shortcut.
Researchers made a distinction between two types of help-seeking. Instrumental help-seeking is when a student tries to understand a topic, such as asking for clarification on a concept from class. In contrast, executive help-seeking is a means to an end, like finding a quick answer to complete an assignment without engaging in the learning process.
Understanding Help-Seeking Behaviors
The USC study frames student actions along these two lines: instrumental use focused on learning, and executive use focused on task completion. The findings indicate a strong preference for executive use of generative AI, suggesting students see it as a tool for efficiency rather than education.
When seeking to learn, students reported they were most likely to consult the internet or an instructor. Tutors and peers were ranked lower, just above generative AI. For getting quick answers, however, students again turned to the internet first, followed closely by generative AI and peers, with instructors and tutors last. This pattern suggests students are more comfortable turning to technology than to people for direct assistance.
Who Relies on AI the Most?
The USC research identified specific student profiles that were more likely to depend on AI for answers. A primary factor was a student's perceived competence in a course. Those who felt less confident were significantly more likely to use generative AI to complete their work.
Social engagement also played a crucial role. Students who were averse to asking their peers for support showed a much higher tendency to engage with AI chatbots. This points to a potential link between a student's sense of belonging in the classroom and their reliance on technology over human interaction.
High Rate of AI Adoption
According to a survey by Inside Higher Ed and Generation Lab, 85% of students reported using generative AI for their coursework in the past year. Common uses included brainstorming (55%), asking questions as if it were a tutor (50%), and studying for exams (46%).
Furthermore, the study found that students with weaker internet search skills were less equipped to find information independently and thus more dependent on generative AI. Conversely, those with strong search abilities were less likely to turn to AI for executive help. Trust in the technology was another predictor; students who trusted AI's output were more inclined to use it for answers.
Conflicting Data on AI and Human Support
While the USC study points to a preference for technology, national data on student help-seeking habits is not entirely consistent. Different studies present a more complex picture of how students balance AI with traditional support systems.
For instance, a 2025 study from Tyton Partners found that while two-thirds of students use AI chatbots weekly, 84% of students said they first turn to a person, such as a peer or instructor, when they need help in a course. Only 17% reported using AI tools as their primary source of assistance. This finding contrasts with the USC data suggesting a preference for technology over human help.
Another analysis from the Center for Studies in Higher Education at the University of California, Berkeley, noted that fewer students reported helping their classmates after the pandemic, which could contribute to an increased reliance on individual tools like AI.
Support for Marginalized Students
A separate report from WGU Labs discovered that students from marginalized backgrounds, including first-generation students and students of color, were more open to using AI for academic support. The report theorized this could be linked to a perceived lack of adequate support from traditional institutional resources.
The Role of Educators and Institutional Support
The way students interact with AI is not fixed. The USC research strongly suggests that pedagogy can significantly influence student behavior. When professors actively encourage the thoughtful and ethical use of generative AI, students are more likely to use it for learning rather than just for answers.
This finding underscores the social impact that instructors can have on technology adoption in the classroom. By framing AI as a tool for exploration and critical thinking, educators can guide students away from a purely transactional relationship with the technology.
The study's authors suggest a need for greater learning support for all students. This includes not only teaching them to use generative AI effectively but also strengthening fundamental skills like internet searching. They also recommend increasing faculty assistance and fostering a stronger sense of classroom community.
Students Call for Clearer AI Policies
Amid the rapid adoption of AI, students themselves are asking for more guidance. The Inside Higher Ed survey revealed that nearly all students believe colleges should address threats to academic integrity posed by AI.
A majority of students expressed a desire for clear and standardized policies outlining when and how AI can be used. More than half of those surveyed asked either for strict guidelines or, conversely, for greater flexibility around transparent AI use. Though those preferences diverge, both camps signal that the current ad-hoc approach is insufficient, leaving students uncertain about academic expectations.