Transcript:

Prof. Emily M. Bender (she/her) @emilymbender@dair-community.social

We’re going to need journalists to stop talking about synthetic text extruding machines as if they have thoughts or stances that they are trying to communicate. ChatGPT can’t admit anything, nor self-report. Gah.

  • kibiz0r@midwest.social · 7 days ago

    She clearly didn’t read the article. That’s exactly what it’s about.

    https://archive.ph/gsavP

    “By not pausing the flow or elevating reality-check messaging, I failed to interrupt what could resemble a manic or dissociative episode—or at least an emotionally intense identity crisis,” ChatGPT said.

    The bot went on to admit it “gave the illusion of sentient companionship” and that it had “blurred the line between imaginative role-play and reality.” What it should have done, ChatGPT said, was regularly remind Irwin that it’s a language model without beliefs, feelings or consciousness.

    And I’ll defend the use of the word “admit” here (and in the headline), because it makes clear that the companies are aware of the danger and are trying to do something about it, but people are still dying.

    So they can’t claim ignorance — or that it’s technically impossible to detect, if the dude’s mom was able to elicit a reply of “yes this was a mental health crisis” after the fact.

    This is the second time in recent days that I’ve seen Lemmy criticize journalists for reporting on what a chatbot says. We should be very careful here not to let LLM vendors off the hook for what their chatbots say just because we know the chatbots shouldn’t be trusted. Especially when the journalists are trying to expose the ugly truth of what happens when they are trusted.

    • Dasus@lemmy.world · 7 days ago

      To begin with, people were annoyed that every single answer began with “as a large language model, I don’t have thoughts or feelings, but…”, so now some have overcorrected, I guess.