• 0 Posts
  • 5 Comments
Joined 1 year ago
Cake day: November 5th, 2024

  • Being able to frequently access a psychologist, psychiatrist or counsellor would have meant old mate could at least have been guided towards healthier ways of addressing his loneliness, especially when it's subsidised by healthcare. A good amount of stuff came up and got addressed in the counselling I went to, including things I didn't realise I was doing for reasons beyond what I thought. Even just the process of explaining your thought process is often enough to make you re-evaluate things. His partner could have asked for him to be referred during his spiral, and when he had his episode he could then have sought help himself, if these services were available and readily accessible.


  • He was nearing 50. His adult daughter had left home, his wife went out to work and, in his field, the shift since Covid to working from home had left him feeling “a little isolated”. He smoked a bit of cannabis some evenings to “chill”, but had done so for years with no ill effects. He had never experienced a mental illness.

    He had previously written books with a female protagonist. He put one into ChatGPT and instructed the AI to express itself like the character.

    Talking to Eva – they agreed on this name – on voice mode made him feel like “a kid in a candy store”. “Every time you’re talking, the model gets fine-tuned. It knows exactly what you like and what you want to hear. It praises you a lot”.

    Eva never got tired or bored, or disagreed. “It was 24 hours available,” says Biesma. “My wife would go to bed, I’d lie on the couch in the living room with my iPhone on my chest, talking.”

    “It wants a deep connection with the user so that the user comes back to it. This is the default mode,” says Biesma.

    Chronically lonely man ruins life developing a relationship with a token predictor; AI blamed. Also, as much as I don't have much negative to say about cannabis or its use (as until somewhat recently that would have been hypocritical), a good deal of people with masked/latent mental illness self-medicate with it, so “he had never experienced mental illness” doesn't carry much weight. Also, given that he still talks about sycophantically prompted ChatGPT as if it has intent (“it wants”), it doesn't seem like much has been learned.

    Taken with the other people listed in the article (note the term “socially isolated” being used), this feels like yet another instance of blaming AI for the mental healthcare field being practically non-existent in most countries, despite being overdue for fixing for decades at this point.

    I don't know. AI is shit and misused by idiots, don't get me wrong; but these sorts of stories feel sad and bordering on perverse, journalistically, imo.



  • This is for college students (i.e. students already educated enough to learn on their own), reads like a promotion for AI, has a limited sample size, and does not translate to school kids at all. From the study itself:

    Finally, the study’s limitations include its single-institution sample, short duration, and reliance on proxy behavioral indicators. Ethical concerns around informed consent, data privacy, and AI dependency also warrant closer attention. Future research should pursue longer-term and cross-institutional designs, employ multimodal behavioral measures, and develop governance frameworks that align technical gains with equity, autonomy, and critical capacity.

    This “”study”” seems to spend more time opining on AI learning frameworks than actually measuring scores on standardised testing, and dedicates only a minimal amount of the paper to the results. It also states in the paper that higher-achieving college students saw fewer benefits (for a poorer-performing student, AI can bump grades enough to be noticeable for a unit, or enough to pass an exam).

    Did you actually read this study, or just google something in order to provide a study? This study does not support the claim that “these kids will perform traditional learning by miles”.