• Feathercrown@lemmy.world
    7 hours ago

    Hmm, interesting theory. However:

    1. We know this is a known issue with language models; it happens all the time with weaker ones, so there is an alternative explanation.

    2. LLMs are running at a loss right now; the company would lose more money than it gains from you, so there is no motive.

    • MotoAsh@piefed.social
      6 hours ago

      Of course there’s a technical reason for it, but they still have an incentive to try to sell even a shitty product.