swiftywizard@discuss.tchncs.de to Programmer Humor@programming.dev · 22 days ago
Lavalamp too hot (image)
dream_weasel@sh.itjust.works · 5 hours ago
This kind of stuff happens on any model you train from scratch, even before training for multi-step reasoning. It seems to happen more when there's not enough data in the training set, but it's not an intentional add. Output length is a whole deal.