Meta “programmed it to simply not answer questions,” but it did anyway.

  • Terrasque@infosec.pub · 3 months ago

    That’s like saying “car crash” is just a fancy term for accident, or “cat” is just a fancy term for animal.

    Hallucination is the technical term for this failure mode in this type of AI, and it’s inherent to how it works at its core.

    And now I’ll let you get back to your hating.

    • CileTheSane@lemmy.ca · 3 months ago

      “Hallucination” is also wildly misleading. The AI doesn’t believe something that isn’t real; it simply guessed words that turned out to be wrong.