This is your brain; this is your brain on ChatGPT. 🤣 According to an MIT study, ChatGPT may be eroding critical thinking skills.
In this Time article titled (wait for it!) ChatGPT May Be Eroding Critical Thinking Skills, According to a New MIT Study (written by Andrew R. Chow and available here), a new study from researchers at MIT’s Media Lab divided 54 subjects—18 to 39 year-olds from the Boston area—into three groups, and asked them to write several SAT essays using OpenAI’s ChatGPT, Google’s search engine, and nothing at all, respectively.
Researchers used an EEG to record the writers’ brain activity across 32 regions, and found that of the three groups, ChatGPT users had the lowest brain engagement and “consistently underperformed at neural, linguistic, and behavioral levels.” Over the course of several months, ChatGPT users got lazier with each subsequent essay, often resorting to copy-and-paste by the end of the study.
The paper suggests that the usage of LLMs could actually harm learning, especially for younger users. The paper has not yet been peer reviewed, and its sample size is relatively small. But the paper's main author, Nataliya Kosmyna, felt it was important to release the findings to elevate concerns that as society increasingly relies upon LLMs for immediate convenience, long-term brain development may be sacrificed in the process. As for why it hasn't been peer reviewed, Kosmyna's team did submit it for peer review but did not want to wait for approval, which can take eight or more months, to raise attention to an issue that Kosmyna believes is affecting children now.
Kosmyna, who has been a full-time research scientist at the MIT Media Lab since 2021, wanted to specifically explore the impacts of using AI for schoolwork, because more and more students are using AI. So she and her colleagues instructed subjects to write 20-minute essays based on SAT prompts, on topics including the ethics of philanthropy and the pitfalls of having too many choices.
The group that wrote essays using ChatGPT all delivered extremely similar essays that lacked original thought, relying on the same expressions and ideas. Two English teachers who assessed the essays called them largely “soulless.” The EEGs revealed low executive control and attentional engagement. And by their third essay, many of the writers simply gave the prompt to ChatGPT and had it do almost all of the work. “It was more like, ‘just give me the essay, refine this sentence, edit it, and I’m done,’” Kosmyna says.
By contrast, the "brain-only" group showed the highest neural connectivity, and the third group, which used Google Search, also expressed high satisfaction and active brain function.
The subjects were then asked to re-write one of their previous efforts—but the ChatGPT group had to do so without the tool, while the brain-only group could now use ChatGPT. The first group remembered little of their own essays, and showed weaker alpha and theta brain waves, which likely reflected a bypassing of deep memory processes.
In other words, they didn’t learn much, if anything, about the topic from using ChatGPT.
Those of us who have seen so many fake citation case rulings (158 cases identified so far) are not surprised. Using ChatGPT effectively requires understanding the results (and ensuring they are accurate), not just copying and pasting them.
So, what do you think? Are you surprised that the study found that ChatGPT may be eroding critical thinking skills? Please share any comments you might have, or let me know if you'd like to learn more about a particular topic.
Disclaimer: The views represented herein are exclusively the views of the authors and speakers themselves, and do not necessarily represent the views held by my employer, my partners or my clients. eDiscovery Today is made available solely for educational purposes to provide general information about general eDiscovery principles and not to provide specific legal advice applicable to any particular circumstance. eDiscovery Today should not be used as a substitute for competent legal advice from a lawyer you have retained and who has agreed to represent you.




We can run this same study on every new technology, though. 35 years ago, they taught me long division, step by step, on paper. Not long after that, we were allowed to use a calculator because the problems we were solving got more complex. Then a graphing calculator, then a computer, etc. Did my brain function suffer? Probably? Can I still do long division on paper, step by step .. nope. Somehow I’ve survived. People probably came down pretty hard on the folks who invented the plough something like 6,000 years ago as well. They attached it to humans or large animals and others thought “boy, those lazy plough guys.. their bodies are going to go to mush from avoiding turning the soil with their bare hands! Fools!” Meanwhile, people didn’t live past age 40, regardless of brain or body activity.
I think this entire study is bunk for the simple reason that 100% of SAT essays are on topics that teenagers don’t give a rat’s behind about. “Ethics of philanthropy”?? Get real, adults! We have dinosaurs writing these things and expecting “engagement” from kids? Come on. Expanding things to “18 to 39 year-olds” with a question like “what are the pitfalls of too many choices?” isn’t likely to get people motivated either.
I bet if you asked someone interested in sports to write something about the benefits and pitfalls of a 7-game final series in hockey, basketball, and baseball you’d get more engagement, LLM or no LLM. What is the societal impact of TikTok? What genre of music do you like the most, and provide arguments as to why this style is superior to other popular styles. Who is the greatest actor of your generation, noting some of the roles they were good at, and at least one role they could have done better at. Describe your best friend, or someone close to you.. what makes them so cool, and describe a problem that you worked together to solve. What Chat application do you use the most, and why is it superior to other Chat applications you’ve used?
I’d like to see this study, or one like it, involve topics/activities/challenges that “the youts” actually care about. The current study, to me, reads like this: “here’s an activity that is boring as hell. No one on earth wants to do this. You are allowed to have a robot do it for you, but the downside is that you will learn nothing about the admittedly-boring topic that we force-fed to you. In fact, it’s so boring and irrelevant that the LLM might actually fall asleep or try to power itself off. Now, we will make fun of you in a peer-reviewed journal for having low brain activity during this horrible exercise.”
Yeah, the dinosaurs will always be able to poke fun at the younger generation for being "lazy" or "getting soft", and maybe put a bunch of data behind the argument .. but we all were born with the latest technology of that day at our disposal: a Clovis point, a bow-drill, a plough, a calculator, a computer… and now we have access to an on-demand brain with all of human knowledge pre-loaded. Might as well use it. If the task at hand provides no benefit to humanity, or even a single human (e.g., writing an essay that only a single English teacher/grader will ever read, and which the writer gives zero poops about), should we be doing it in the first place?
In recent years, there has been a growing use of the AI platform ChatGPT. The complexity and advancement of the technology continues to surprise people and display the capabilities of Large Language Models (LLMs). It has benefited many people, from walking through how to solve a physics problem to creating blueprints for structures. However, for all of its positive benefits, it has a dark side as well. Time's website has an article entitled "ChatGPT May Be Eroding Critical Thinking Skills, According to a New MIT Study", written by Andrew Chow. The article comments on the study "Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task", published June 10, 2025. The study showed that people who are allowed to use ChatGPT for writing essays have low brain engagement and become unmotivated after writing multiple essays. Even though people enjoy using AI, it may not be healthy for society, causing a reduction in critical thinking and producing a lazy society.
In order to be able to balance the usage of AI, we need to pray for the cardinal virtue of temperance. As the Catechism of the Catholic Church says, “Temperance is the moral virtue that moderates the attraction of pleasures and provides balance in the use of created goods.” (CCC 1809).
Temperance is the virtue which is severely lacking in the use of ChatGPT. People will use it idly and arbitrarily for anything (and everything) without thought to the consequences. When this happens, relationships can be significantly hurt; I have heard of this where AI is improperly used to view, say, pornography or to entertain fake romantic relationships. However, in this article by Chow, a research scientist said, "'… AI, if used properly, could enhance learning as opposed to diminishing it.'" ChatGPT is not necessarily bad in and of itself; it is the way we use it, and this is where temperance comes in. Just taking the simple step of pausing and asking ourselves the reason for using ChatGPT, and whether or not it is justified, can help us practice not only temperance but prudence and moderation as well. It will help us balance the benefits of using AI while not using it too much. If our answer is that we are bored and just want to use it for fun, our time will probably be better spent elsewhere. Doing this will also help us nourish the virtue of prudence by making us aware of our own motivations.
We need to pray for the cardinal virtue of prudence to help us use ChatGPT wisely. As Dr. Gan says in his book Infinite Bandwidth: Encountering Christ in the Media, "… attitude awareness … works to loosen the media's hold on us." (Gan, pg. 63)
Having attitude awareness can really help us be introspective when we use tools like ChatGPT. If we use AI with the wrong attitude, such as using it to fill a gap in our lives, it can become like a god. What needs to happen is that we view it as what it is: a tool that should be used to help deepen our faith, not mindlessly complete simple tasks or assignments. "'Education on how we use these tools, and promoting the fact that your brain does need to develop in a more analog way, is absolutely critical'" (Chow, 2025). This is a very good point, because one of the best ways to prevent misuse of such tools is to teach someone how to properly use ChatGPT, with the right mindset and attitude. We need to stop and ask ourselves: why am I looking this up? Is it really necessary? Will this hurt or benefit my soul?
Praying for the virtue of justice helps us use this tool without harming people's dignity. "Next to His saving love, our own dignity is the central truth of our lives, a truth intended to guide us in all our thoughts and actions…" (Gan, pg. 79). In order to give God and our neighbor what they deserve, we need to be able to live up to the fullest sense of our dignity.
Chatbots have the potential to diminish human dignity through inappropriate use, because our work might no longer be our own. We need to be mindful of how this tool can degrade or uplift the dignity of others and ourselves. Part of our dignity is to use our God-given gifts of reason and creativity. "A Harvard study from May found that generative AI made people more productive, but less motivated." (Chow, 2025). This is the danger of AI tools like ChatGPT: they unjustly remove a man's purpose from his work, and so he is less motivated and fulfilled. It takes away the value of a person doing their own work and being proud of it. The work one does gives them dignity, and if a robot takes the place of a person, dignity is lost and so is justice.
In order to use robots like ChatGPT for good, we must pray for fortitude. It is the "… moral virtue that ensures firmness in difficulties and constancy in the pursuit of the good." (CCC 1808). "Whether we're creating a misleading picture of our life on Facebook, cultivating an online persona who does what we would never do offline, or simply failing to give credit where credit is due, we're not using the media in a way that reflects or upholds truth." (Gan, pg. 90).
Platforms like ChatGPT make truth difficult because they make it easy to cheat and pass off the work of a robot as your own. "… by their third essay, many of the writers simply gave the prompt to ChatGPT and had it do almost all of the work." (Chow, 2025). The lines between truth and lies can even become blurred: we talk ourselves into thinking something might not be cheating or plagiarizing when it really is. When that happens, our conscience is not properly formed, which diminishes our sense and pursuit of the truth. This is the beauty of fortitude. This virtue gives us the strength to push forward and inspires us to constantly seek truth and be truthful even when circumstances make it difficult to do so. There is nothing wrong with proclaiming the truth, and in most circumstances it can instill hope.
The theological virtue of hope is what can inspire us to use media in a way that helps others. As Gan writes, "Media, of course, doesn't have to be inspiring," but "Media should inspire because that's what media is intended to do." (Gan, pg. 109).
In some instances, chatbots like ChatGPT can inspire hope and possibly other virtues, but in this article we can see that it is not inspiring people to create their own content, and it is also stealing people's hope by making them feel lonely. "Studies from earlier this year, for example, found that generally, the more time users spend talking to ChatGPT, the lonelier they feel." (Chow, 2025). This AI program removes a person's chance to talk to a real human and make connections with them. When talking to ChatGPT, you can't make a difference to the person you are talking to, and you lose the value of others' lives being able to transform your own and give you hope. And as we have seen, ChatGPT has not been able to do that, because it is not skillfully developed in that way.
In order to resist falling into despair while using media, we should pray for the virtue of faith. “When we fail to recognize what constitutes skillfully developed media, we fail to incorporate the principles governing well-designed media into our own use.” (Gan, pg. 126).
As far as ChatGPT being skillfully developed overall, there's no question that the creators made it engaging and intelligent. "…many people now search for information within AI chatbots as opposed to Google Search." (Chow, 2025). However, it is not skillfully developed in one area: faith. ChatGPT doesn't have a God, and it can't tell people who made them, what their true purpose in life is, or that we have an intelligent designer who loves us. It is not able to cultivate faith in its users for this reason. Even though ChatGPT cannot cultivate faith on its own, users can still learn about their faith through its capabilities, such as asking it to summarize aspects of the faith that might be difficult to research on your own. But always check its answers against a credible source, because it can make mistakes. Having faith is important because, if we don't have faith in God, then our lives have no meaning. This is why it is always good to have personal testimony and experience to help others understand their faith better.
As a final note, “Pray for the theological virtue of charity, which can promote sympathy for and understanding others.” (Gan, pg. 146). Charity is so important because it informs and shapes our experiences with others. Charity can bring hope to a place that has none.
In the context of ChatGPT users, "The group that wrote essays using ChatGPT all delivered extremely similar essays that lacked original thought, relying on the same expressions and ideas." Here we can see that essays written with AI lack original thought, which means your papers are not as motivated by your own experiences. With that can also come a lack of charity toward others, because you are depriving them of unique experiences that could deepen their own faith.