Lavalamp too hot [image]
swiftywizard@discuss.tchncs.de to Programmer Humor@programming.dev · 13 hours ago
Alex@lemmy.ml · 12 hours ago
If you have ever read the "thought" process of some of the reasoning models, you can catch them going into loops of circular reasoning, just slowly burning tokens. I'm not even sure this isn't by design.
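For anyone who hasn't watched one of these traces: the loops are often near-verbatim repeats, so you can spot them mechanically. A minimal sketch of that idea, assuming nothing about any real model's API (the function, trace, and thresholds here are all made up for illustration):

```python
from collections import Counter

def looks_circular(thought_tokens, n=8, threshold=3):
    """Heuristic loop check on a reasoning trace.

    If any n-gram of tokens appears `threshold` or more times, the
    model is probably going in circles. Purely illustrative; real
    loop detection would need to be fuzzier than exact-match n-grams.
    """
    ngrams = Counter(
        tuple(thought_tokens[i:i + n])
        for i in range(len(thought_tokens) - n + 1)
    )
    return any(count >= threshold for count in ngrams.values())

# Hypothetical trace: the model keeps "re-verifying" the same step.
trace = ("let me verify the sum again . 2 + 2 = 4 . "
         "wait , let me verify the sum again . 2 + 2 = 4 . "
         "wait , let me verify the sum again . 2 + 2 = 4 .").split()
print(looks_circular(trace))  # True: the same 8-gram repeats 3 times
```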
Feathercrown@lemmy.world · 1 hour ago
Why would it be by design? What does that even mean in this context?

swiftywizard@discuss.tchncs.de (OP) · 12 hours ago
I dunno, let's waste some water

gtr@programming.dev · 24 minutes ago
They are trying to get rid of us by wasting our resources.

MajorasTerribleFate@lemmy.zip · 20 minutes ago
So, it's Nestlé behind things again.
SubArcticTundra@lemmy.ml · 11 hours ago
I'm pretty sure training is purely result-oriented, so anything that works goes.
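That matches how outcome-based RL for reasoning models is usually described: the reward looks only at the final answer, so a trace that wanders in circles but lands on the right answer scores the same as a tight one. A toy sketch of that reward shape, with every name hypothetical:

```python
def outcome_reward(trace, final_answer, target):
    """Outcome-only reward: the reasoning trace itself is never scored.

    A 10,000-token circular ramble and a 50-token direct derivation
    earn identical reward as long as the final answer matches, so
    nothing in training pressures the loops out (unless a length
    penalty is added explicitly).
    """
    return 1.0 if final_answer == target else 0.0

# Both traces earn the same reward:
print(outcome_reward("short direct derivation", "4", "4"))  # 1.0
print(outcome_reward("verify again... " * 500, "4", "4"))   # 1.0
```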