Users of OpenAI’s GPT-4 are complaining that the AI model is performing worse lately. Industry insiders say a redesign of GPT-4 could be to blame.
The model has become inbred because it's now impossible to scrape the web without ingesting AI-generated content, which is full of "hallucinations" and other weird artifacts. The last opportunity to get "uncontaminated" training data was sometime in mid-2022.
Not to say that it’s causing this particular problem, but this issue will emerge eventually. Garbage in = garbage out. Eventually GPT-19 will grow a mighty Habsburg chin.
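Here's a toy sketch of that feedback loop (purely illustrative, nothing to do with OpenAI's actual pipeline): fit a distribution to data, sample from the fit, fit again to those samples, repeat. Each "generation" trains only on the previous generation's output, so estimation noise compounds and the distribution drifts away from the original human data.

```python
import random
import statistics

# Toy illustration of "model inbreeding" / model collapse:
# each generation is trained only on the previous generation's samples.
random.seed(42)

# Generation 0: "uncontaminated" human-written data (standard normal).
data = [random.gauss(0.0, 1.0) for _ in range(200)]

for generation in range(10):
    mu = statistics.fmean(data)       # fit the "model" (just mean/stdev here)
    sigma = statistics.stdev(data)
    print(f"gen {generation}: mean={mu:+.3f}  stdev={sigma:.3f}")
    # Next generation's "web scrape" is just the current model's own output.
    data = [random.gauss(mu, sigma) for _ in range(200)]
```

Run it and watch the mean and stdev wander away from 0 and 1 with nothing correcting them. A real LLM has far more going on, obviously, but the basic garbage-in-garbage-out dynamic is the same.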
Maybe not yet, but…
- Spez will turn Reddit into a bot farm and sell this as training data
- Musk will turn Twitter into a bigoted cesspool and sell this as training data, which will subsequently be flagged for low quality (also: a bot farm)
- Threads is a corporate ad dashboard (and we already know how easy it is to generate that kind of copy with GPT) and Zuck will sell this as training data
- Facebook is either dead or only good for boomers and Poles
- blogs are dead
- The Fediverse is out there waiting to be scraped, but it's possibly too small to sustain a big model
We're getting there, hopefully.