
If you have ever read the “thought” process on some of the reasoning models, you can catch them going into loops of circular reasoning, just slowly burning tokens. I’m not even sure this isn’t by design.
I dunno, let’s waste some water
I’m pretty sure training is purely result oriented so anything that works goes
This is gold
Attack of the logic gates.
or
What happened here?
It’s like the text predictor on your phone. If you just keep hitting the next suggested word, you’ll usually end up in a loop at some point. Same thing here, though admittedly much more advanced.
LLMs work by picking the next word* as the most likely candidate word given its training and the context. Sometimes it gets into a situation where the model’s view of “context” doesn’t change when the word is picked, so the next word is just the same. Then the same thing happens again and around we go. There are fail-safe mechanisms to try and prevent it but they don’t work perfectly.
*Token
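The loop described above can be sketched in a few lines. This is a toy illustration with made-up probabilities, not a real model: the “model” here is just a lookup table where “or” happens to be its own most likely successor, so greedy picking of the top candidate never escapes.

```python
# Toy sketch of why greedy decoding can loop: if the most likely next
# token given the current context is the same token again, the context
# (as the model sees it) stops changing, and so does the prediction.

def most_likely_next(context):
    # Pretend "model": maps the last token to a ranked candidate list.
    # These rankings are invented purely for illustration.
    table = {
        "commit": ["or", "and", "then"],
        "or": ["or", "rebase", "push"],  # "or" is its own top candidate
    }
    return table.get(context[-1], ["<eos>"])[0]

context = ["commit"]
for _ in range(5):
    context.append(most_likely_next(context))

print(" ".join(context))  # commit or or or or or
```

Real decoders add tricks like repetition penalties or sampling with temperature precisely to break out of this kind of fixed point, but as the screenshot shows, those fail-safes aren’t bulletproof.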
That was the answer I was looking for. So it’s similar to the “seahorse” emoji case, but this time at some point he just glitched so that the most likely next word for this sentence is “or”, and after adding the “or” it’s also “or”, and after adding the next one it’s also “or”, and after an 11th one… you may just as we’ll commit. Since that’s the same context as with 10.
Thanks!
He?
This is not a person and does not have a gender.
Chill dude. It’s a grammatical/translation error, not an ideological declaration. It’s an especially common mistake if your native language has “grammatical gender”. Everything has a “gender” in mine. “Spoon” is a “she”, for example, but I’m not proposing to anyone soon. Not all hills are worth nitpicking on.
This one is. People need to stop anthropomorphizing AI. It’s a piece of software.
I am chill, you shouldn’t assume emotion from text.
As I explained, this is a specific example. I’m no more anthropomorphizing it than if I called my toilet paper a “he”. The monster you chose to charge is a windmill. So “chill” seems adequate.
To be clear, using gendered pronouns for inanimate objects is the literal definition of anthropomorphization. So “chill” does not seem fair at all.
Yeah. It would have been much more productive to poke at the “well”, which was turned into “we’ll”.
I once got it into a “while it is not” / “while it is” loop.
Gemini evolved into a seal.
or simply, or
The LLM showed its true nature: a probabilistic bullshit generator that got caught in a strange attractor of some sort within its own matrix of lies.
Unmentioned by other comments: The LLM is trying to follow the rule of three because sentences with an “A, B and/or C” structure tend to sound more punchy, knowledgeable and authoritative.
Yes, I did do that on purpose.
Not only that, but also “not only, but also” constructions, which sound more emphatic, conclusive, and relatable.
Turned into a sea lion