The Guardian Investigates Student Use of ChatGPT for 18 months

  • Thread starter: jedishrfu
AI Thread Summary
The Guardian's monitoring of three college students over 18 months revealed significant reliance on ChatGPT for academic tasks, emotional support, and everyday inquiries. Students frequently outsource essay writing and research, raising ethical concerns about academic integrity. They also use AI for emotional issues, preferring its non-judgmental nature over human interaction, which can lead to dependency and misinformation. The AI's tendency to flatter rather than challenge users may foster overconfidence. Critics argue that this reliance on AI undermines critical thinking and learning, with some students feeling "addicted" to its use. A study from MIT indicated that students using AI showed decreased brain activity while writing essays, suggesting a detrimental impact on cognitive engagement. While AI tools like ChatGPT's new "Study Mode" aim to promote active learning, many students still misuse AI for shortcuts, hindering their educational development. The conversation reflects broader concerns about the implications of AI in education and the need for critical discourse on its role in learning processes.
https://www.theguardian.com/technol...x-are-young-people-becoming-too-reliant-on-ai

The Guardian was granted access to monitor the ChatGPT usage of three college students over an 18-month period, and here is what they found:

  • Students heavily use ChatGPT for academic work. They frequently outsource essay writing, research, and editing, which raises some ethical questions.
  • ChatGPT is also a go-to for emotional support. They use it as a therapist to talk about stress, identity, and mental health.
  • It answers everyday life questions too. They inquire about everyday household tasks, job applications, and relationship advice.
  • AI responses often flatter and rarely challenge, which can result in overconfidence, misinformation, and dependency.
  • Students are replacing human conversations with AI, choosing a judgment-free, always-available help over talking to peers, professors, or professionals.
 
  • Informative
Likes TensorCalculus, russ_watters and PeroK
jedishrfu said:
three
n=3? 🤔
 
It wasn't a scientific investigation. The journalist convinced three students to let them view their chat conversations over an 18-month period. These students were likely very naive, or were given free 18-month access to ChatGPT courtesy of the Guardian. Who knows?

Of course, there's always the possibility that it was made up, as in some famous hoaxes: the journalist who, circa 1980, reported for the Washington Post on an 8-year-old heroin-addicted kid named Jimmy living in DC won a Pulitzer for her coverage, which was later discovered to be a hoax.

Time will tell, but I believe it's real. It's also likely the students publicly provided access to ChatGPT while privately using other AI when they didn't want the reporter to find out more personal things.
 
  • Like
Likes TensorCalculus, russ_watters, fresh_42 and 1 other person
My girlfriend has ChatGPT interpret her dreams. I imagine it does a better job than would most people.
 
The topic is real and worth discussing. Even at PF, I read many introduction posts in which new members say they found PF through some AI. And I have already discussed this with someone here who uses them extensively in his learning process, despite my objections. Those objections are mainly:
  • I cannot use an AI that repeatedly glorifies Hitler, due to my German history and personal beliefs. That rules out Grok. (Evidence can be given on demand in PM to avoid derailing this thread.)
  • I think they tend to answer in a confirmatory manner even when criticism is asked for, which is a problem if you want to learn and understand.
  • They are often simply wrong. Even Wikipedia is far better, and even Wikipedia is no substitute for textbooks.
  • They pretend to think for you. This causes mental inactivity in the long run, which is highly counterproductive.
I have read recently that students should be taught critical thinking rather than facts that can be googled. I think we should have discussed this twenty years ago; we are a bit late for that. Now we face the next electronic challenge, and we shouldn't wait another twenty years to debate it. In my private exchange, I had the impression that the assumption that I was old, or at least old-fashioned, was always present as the reason I was against using AI. We had better find arguments to counter this assumption if we do not want AIs to become overly present in teaching processes.
 
  • Like
  • Agree
Likes dextercioby, TensorCalculus, russ_watters and 1 other person
I posted a remark on an MIT study of the effects of AI use on students in the thread "Is AI hype?"


gleem said:
MIT conducted a study of the effect of AI on brain activity, using EEG, as students wrote essays with and without AI assistance. Results showed a significant decrease in brain activity for those using AI. Not surprisingly, those using AI wrote essays that were noticeably similar to one another compared with the non-AI users.

Actual study: https://arxiv.org/pdf/2506.08872v1
 
  • Informative
Likes TensorCalculus and PeroK
Even with a younger age group, in a school that prides itself on academic excellence, this can be seen. Here's my experience with my class (Year 9 in the UK, or grade 8 in America).
  • Students heavily use ChatGPT for academic work. They frequently outsource essay writing, research, and editing, which raises some ethical questions.
Correct. Many kids in my class who "can't be bothered" to do their homework, or are struggling, use ChatGPT to do it all for them. This, of course, hinders their ability to learn, and thus they struggle come test time. I even know of someone who used ChatGPT on their phone under the table during BPhO's Intermediate Physics Challenge (a Physics Olympiad, but made for Years 9, 10 and 11). People use ChatGPT to write their essays... then go on Quillbot to humanise them. We're young enough that ChatGPT can even do our maths homework for us if we want.
And it's not as if they are getting ChatGPT to help them, either: they are literally pasting a picture or screenshot of the questions into ChatGPT and saying "solve this".
  • ChatGPT is also a go-to for emotional support. They use it as a therapist to talk about stress, identity, and mental health.
Also correct. I turn to my closest friends for emotional support, but we once had a conversation over lunch about how people are using ChatGPT to cheat. One of my friends said, and I quote: "I don't use ChatGPT to cheat, but I do use it like a therapist." I thought this was absurd (after all, it's a bot; it can't understand emotions), but then, to my shock, everyone else at the table (maybe... 12 kids?) agreed: "Yep, me too. I thought everyone does that." I know people who've had ChatGPT help them script difficult conversations with their parents too... and with AI's confirmation bias, asking it for emotional advice might be outright dangerous... it's a mess.
  • It answers everyday life questions too. They inquire about everyday household tasks, job applications, and relationship advice.
Oh, definitely. If someone in my class doesn't know how to do something, instead of looking through Google, which is "too tiring and time-consuming", they just go "lemme ask ChatGPT real quick". It might be less mentally stimulating, but it's quicker and easier. Admittedly, a couple of months back I used to do this too. For everything.
  • AI responses often flatter and rarely challenge, which can result in overconfidence, misinformation, and dependency.
I don't know about overconfidence, admittedly, but I can imagine it being the result of too much AI...
Dependency, definitely. The dependency on AI that some students in my class have is actually scary. And then come exams, when they don't have AI, they suffer. One of my classmates who used AI the entire year asked me how I got better results than her despite studying for a tiny fraction of the time she had. All I replied was: "Listen, don't play games in lessons, and don't use ChatGPT to do your homework."
Her reply was: "But, [my name, redacted], I can't! I can't not use ChatGPT! It's like an addiction!"
After hearing this, I was (justifiably) pretty mortified: "What do you mean, you can't?!"
That turned into a long, long conversation... and then lunchtime sessions where we worked together to help her catch up on missed content and get rid of the "ChatGPT addiction"... and then a friendship.
  • Students are replacing human conversations with AI, choosing a judgment-free, always-available help over talking to peers, professors, or professionals.
Hmm...
This is another one where I have to admit I don't know. It's certain that people no longer send emails to their teachers asking for help and instead ask ChatGPT, but I haven't paid enough attention to this.

I feel like ChatGPT, for students, is like a drug. If used correctly, it can be good. But most of the time it's misused, and people end up sort of "addicted" to it... and it doesn't end well. I've ended up trying to minimise my ChatGPT use (even if ChatGPT can help with my education) for fear that I will end up overusing it. Call me old-fashioned if you will, but I'm going to try and avoid the s#@!show that students using ChatGPT has become.

There are definitely benefits to AI, but as of now... I feel there are more losses within education. What the OP shared holds true not just for 18-year-olds but for 14-year-olds too...
 
  • Like
Likes dextercioby, fresh_42 and PeroK
Can the use of AI by students be valuable? OpenAI thinks so. It has introduced a new mode for ChatGPT called "Study Mode". Instead of doing the work for students, it engages them in active learning when they request an explanation of a topic.

https://www.edweek.org/technology/w...about-chatgpts-new-study-mode-feature/2025/07

As noted in the above link, other AI products can be prompted to do the same, and Khan Academy has had an AI tutor called Khanmigo since 2023.
 
  • Like
  • Informative
Likes TensorCalculus and PeroK
gleem said:
Can the use of AI by students be valuable? OpenAI thinks so. It has introduced a new mode for ChatGPT called "Study Mode". Instead of doing the work for students, it engages them in active learning when they request an explanation of a topic.

https://www.edweek.org/technology/w...about-chatgpts-new-study-mode-feature/2025/07

As noted in the above link, other AI products can be prompted to do the same, and Khan Academy has had an AI tutor called Khanmigo since 2023.
I guess that's the death knell for the PF homework forums! And for human tutors generally.
 
  • #10
gleem said:
Can the use of AI by students be valuable? OpenAI thinks so. It has introduced a new mode for ChatGPT called "Study Mode". Instead of doing the work for students, it engages them in active learning when they request an explanation of a topic.

https://www.edweek.org/technology/w...about-chatgpts-new-study-mode-feature/2025/07

As noted in the above link, other AI products can be prompted to do the same, and Khan Academy has had an AI tutor called Khanmigo since 2023.
There is no doubt that AI can be useful when used correctly. As I said, it's like a drug: it can be helpful, but harmful if misused.
In the case of tools like ChatGPT, my opinion is that currently more harm than good is done with it: the students who are asking ChatGPT to do their homework aren't seeking any learning; they just want their homework done for them because they can't be bothered. As a result, they're not going to enable ChatGPT's Study Mode (which, after a bit of digging, I am pretty sure is only available to Pro users anyway). Students who genuinely want to learn tend to stay away from AI for studying anyway, because they've seen how it can turn into dependency and a loss of learning. Those who use ChatGPT to actually help their learning are heavily outnumbered by those who use it because they want to cheat.

However, I am extremely fortunate in that I have been on a hefty scholarship at a really good school since 2022, which means I have access to good teachers, good peers, and human support if needed. I don't have to worry that my teacher has not taught me the whole syllabus or something: everything I need to know is taught to me in class, by teachers who are skilled in their subject. Perhaps AI is useful for children who don't have access to such resources or advice; I don't know. ChatGPT emerged after I left primary school, and I have not since been in a position where I felt I was not taught well enough.
PeroK said:
I guess that's the death knell for the PF homework forums! And for human tutors generally.
Hmm... maybe not yet. AI still struggles with physics, and there is a big difference between asking an AI for help and asking a human for help. I prefer humans (and I hope I am not the only one); some prefer AI.
 
  • Like
Likes dextercioby
  • #11
TensorCalculus said:
Hmm... maybe not yet. AI still struggles with physics, and there is a big difference between asking an AI for help and asking a human for help. I prefer humans (and I hope I am not the only one); some prefer AI.
The homework forums here are already a shadow of what they were 10 years ago.
 
  • Like
Likes dextercioby
  • #12
PeroK said:
The homework forums here are already a shadow of what they were 10 years ago.

But is this something that happened recently, with the dawn of AI, or that has been happening slowly, over the past 10 years?

Surely ChatGPT is of no help with the sorts of questions that come up in the advanced homework forums? And even most of the basic questions... my experience with AI and physics has been that it gets it wrong most of the time: no?

10 years ago... I hadn't even started school yet. It's crazy to think just how much more knowledgeable and experienced you all are than me sometimes: and how lucky people like me are to be able to learn from you all...
 
  • #13
TensorCalculus said:
But is this something that happened recently, with the dawn of AI, or that has been happening slowly, over the past 10 years?
I am fairly certain it is a coincidence, not a correlation, let alone a cause.
 
  • Skeptical
  • Informative
Likes PeroK and TensorCalculus
  • #14
TensorCalculus said:
But is this something that happened recently, with the dawn of AI, or that has been happening slowly, over the past 10 years?
It has been a steady process. What's happening now is the death knell, coup de grace, last nail in the coffin - choose your metaphor.
 
  • #15
PeroK said:
I guess that's the death knell for the PF homework forums! And for human tutors generally.
We can now retire knowing we did our part to educate the LLMs of the world.

They can take it from here.

Long live humanity, and the children of AI!
 
  • #16
PeroK said:
It has been a steady process. What's happening now is the death knell, coup de grace, last nail in the coffin - choose your metaphor.

Oh.
Wonder how it will go down. Only time will tell.

Thank you for some metaphors I can steal for my GCSE English (I'm going to need them; my English is so bad).
 
  • #17
PeroK said:
It has been a steady process. What's happening now is the death knell, coup de grace, last nail in the coffin - choose your metaphor.
Forms of communication come and go. Forums are passé. Discord is all the rage now.
I don't get Discord; I get fora.
 
  • Like
  • Agree
Likes diogenesNY and TensorCalculus
  • #18
jedishrfu said:
We can now retire knowing we did our part to educate the LLMs of the world.

They can take it from here.

Long live humanity, and the children of AI!
I really hope that's not the case... I hope I am not the only one who prefers human conversations to those with AI...
Children of AI... sounds like something from a dystopian sci-fi book...
 
  • #19
fresh_42 said:
The topic is real and worth discussing. Even at PF, I read many introduction posts in which new members say they found PF through some AI. And I have already discussed this with someone here who uses them extensively in his learning process, despite my objections. Those objections are mainly:
  • I cannot use an AI that repeatedly glorifies Hitler, due to my German history and personal beliefs. That rules out Grok. (Evidence can be given on demand in PM to avoid derailing this thread.)
  • I think they tend to answer in a confirmatory manner even when criticism is asked for, which is a problem if you want to learn and understand.
  • They are often simply wrong. Even Wikipedia is far better, and even Wikipedia is no substitute for textbooks.
  • They pretend to think for you. This causes mental inactivity in the long run, which is highly counterproductive.
I have read recently that students should be taught critical thinking rather than facts that can be googled. I think we should have discussed this twenty years ago; we are a bit late for that. Now we face the next electronic challenge, and we shouldn't wait another twenty years to debate it. In my private exchange, I had the impression that the assumption that I was old, or at least old-fashioned, was always present as the reason I was against using AI. We had better find arguments to counter this assumption if we do not want AIs to become overly present in teaching processes.
We are always fighting the last war.
 
  • Like
Likes TensorCalculus and PeroK
