A Cairo University study has found that AI assistance may be undermining critical thinking and creative problem-solving among university students, adding to growing concerns that our increasing reliance on the technology may be fundamentally changing how we think and learn. The experiment, led by Professor of Political Science Dr. Mazen Hassan, alongside Assistant Lecturer Engi Amin, Associate Professor Sarah Mansour, and Assistant Professor Zeyad Kelani, was conducted with nearly 100 senior university students at Cairo University.
The research was prompted by dramatic changes they’ve witnessed in student behavior. “AI is now being used by 100% of students,” Dr. Hassan, who has taught for 22 years, told EnterpriseAM. “At least for brainstorming their research papers and summarizing the readings. It is also being predominantly used in writing term papers and answering assignments, which is very worrying.”
The research team chose to focus on three variables — innovation, effort, and risk behavior — because of their fundamental importance to higher education. To measure innovation, they used a computer-based game where subjects faced a business problem — increasing the sales of a street vendor selling lemonade. Students had to experiment with different variables, like the color of the lemonade, the sugar concentration, and booth placement. “It’s a matter of trial and error, learning from experiences, sometimes taking risks, and eventually they are driven by [income].”
The study design was rigorous: Over a one-month period, participants submitted three graded assignments, with the treatment group using ChatGPT to write the essays, while the control group completed the assignments without AI assistance. One of the study’s most surprising — and perhaps counterintuitive — results was that ChatGPT users became more inclined to take risks. “In the real world,” explains Dr. Hassan, “we have to invest time and effort [into testing] each possible strategy. A [strategy] costs a lot of money and a lot of time.” But this is not the case when you’re relying on the predictions of a chatbot. “It’s a cheap method that encourages the user to try and test multiple solutions to the questions until they get a result,” says Dr. Hassan. “It’s cheap, and it’s fast.”
But the changes extend beyond just how students complete assignments — Dr. Hassan reports that students have been exhibiting less critical thinking. How do they know that it’s due to AI use? “All of a sudden, we’ve been seeing better English being used, which is an indicator of the use of AI,” we were told. When Dr. Hassan polls his students about their reading habits, the results are stark: almost none of them say they’ve actually done the reading — instead, they ask ChatGPT to summarize it for them.
Why summaries are harming us: Dr. Hassan emphasizes that this shift away from traditional reading carries serious consequences, observing that “students’ motivation, innovative drive, and eagerness to learn” have been on the decline since the ascent of AI. “Reading for 30 minutes as opposed to just reading a summary for a couple of minutes generates ideas in one’s mind, opens up completely new avenues of thinking that one would not have thought of, as opposed to the narrow-minded targeted reading of a [summarized] text,” he explains.
Will these effects on their critical thinking skills be permanent? “From one angle you can say they will be permanent, because people are using it constantly. So it’s not a short intervention where we expect it will produce short-term results that would eventually fade away,” he told us. But he also suggests the cautiously optimistic possibility of what statisticians call “regression to the mean” — where, as the novelty of the technology fades, students, and people more broadly, will revert to relying on themselves. “[But] so far I don’t see reasons for optimism, especially looking at the university students,” he adds.
What can educators do — if anything? Dr. Hassan argues that completely banning ChatGPT is not feasible — that we would be fighting a force beyond our means. Instead, he proposes several strategies: “We might want to go back to old-style exam-based testing at universities, as opposed to the drive in the past few decades, where the shift has been towards assignments, take-home exams, research papers,” he suggests, since educators can no longer be confident that students are completing these assignments themselves. Dr. Hassan also recommends more frequent faculty-student meetings to track the development of ideas.
But the challenge runs deeper than just assessment methods: “How do we encourage students to read? This is the toughest task,” Dr. Hassan admits. He believes that professors must become more creative in designing exams that truly test critical engagement with material rather than just comprehension of summaries. “I know that students get copies of past exams and then ask AI to predict what this year's exam would look like. And then they memorize the answer,” he reveals. “So we have to be constantly innovative in our own exam questions, always changing the strategy. This is what we do in political science — we include events that have happened maybe one day before the exam, which is very difficult for AI to predict (I hope).”
But the implications extend far beyond academia. As MIT researcher Nataliya Kosmyna and others have documented, AI assistance appears to reduce brain connectivity and impair memory formation. Cairo University’s research provides complementary evidence that the effects extend to innovation and critical thinking as well, representing one of the first empirical, peer-reviewed studies of ChatGPT’s impact on innovation, effort, and risk behavior in a real-world academic setting. “We want to make sure that the degree that we give to students is [one] that they have earned,” says Dr. Hassan, especially as this upcoming generation becomes “the ones that are asked by society and by governments, by everyone, to produce new solutions to either economic problems or social problems.”
Kosmyna noted receiving more than 4,000 emails from educators worried that AI is creating a generation that doesn’t have any usable knowledge or understanding of the material. The Cairo University research team’s findings suggest the need for urgent attention to how these technologies may be reshaping — or regressing — human cognitive capabilities.