• theunknownmuncher@lemmy.world

    This just demonstrates a fundamentally flawed understanding of LLMs… they don’t know anything; they generate the text that is statistically likely to follow. They will still generate whatever is most statistically likely even for things they “don’t know”.
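
    A toy sketch of what “statistically likely” means here (the vocabulary and logits are made up, not from any real model): the decoder just picks the highest-probability token, whether or not there is any true answer behind it.

    ```python
    import numpy as np

    def softmax(logits):
        # Convert raw scores into a probability distribution
        e = np.exp(logits - logits.max())
        return e / e.sum()

    # Hypothetical next-token candidates after a prompt like
    # "The capital of Atlantis is" -- a question with no real answer.
    vocab = ["Paris", "Atlantica", "unknown", "London"]
    logits = np.array([1.2, 2.7, 0.3, 0.9])

    probs = softmax(logits)
    best = vocab[int(np.argmax(probs))]
    print(f"Most likely next token: {best} (p={probs.max():.2f})")
    # Prints "Atlantica" -- a confident-sounding continuation, chosen because
    # it is the statistically likeliest token, not because it is true.
    ```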

    • Asafum@feddit.nl

      I loved the confident response I got after a few corrections.

      “Thank you for your patience and assistance in correcting my mistakes; I will finally provide you with the correct response to your question…” and then it continues with a completely incorrect response lol