• LandedGentry@lemmy.zip · 17 hours ago

We get what you’re saying, but they seem to be talking about the experience some people want, private corporate-owned algorithm or otherwise. They’re not saying those algorithms are good for society or anything, just that they are good at predicting what people want to see.

    • FauxLiving@lemmy.world · 17 hours ago

      They’re good at predicting what people want to see, yes. But that isn’t the real problem.

      The problem isn’t that they predict what you want to see; it’s that they use that information to give you results that are 90% what you want to see and 10% what the owner of the algorithm wants you to see.

      X uses this to mix in alt-right feeds. Google uses it to mix in messages from the highest bidder on its ad network, and Amazon uses it to mix in recommendations for its own products.

      You can’t know what they’re adding to the feed, or how much is a genuine recommendation based on your needs and wants versus content artificially boosted to serve the needs and wants of the algorithm’s owner.

      Is your next TikTok really the highest-ranked piece of recommended content, or is it something being boosted on behalf of someone else? You can’t know.
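      The mixing described above can be sketched in a few lines. This is a hypothetical illustration, not any real platform’s internals — the names, the fixed 90/10 ratio, and the shuffle step are all assumptions made for the example:

      ```python
      import random

      def build_feed(organic, boosted, boost_ratio=0.10, size=10):
          """Hypothetical sketch: blend organically ranked items with
          owner-boosted items at a fixed ratio. Once shuffled together,
          the two kinds of item look identical to the viewer."""
          n_boosted = round(size * boost_ratio)
          n_organic = size - n_boosted
          feed = organic[:n_organic] + boosted[:n_boosted]
          random.shuffle(feed)  # interleave so boosted items are indistinguishable
          return feed

      # Toy data standing in for ranked recommendations and paid placements.
      organic = [f"organic_{i}" for i in range(20)]
      boosted = [f"sponsored_{i}" for i in range(5)]
      feed = build_feed(organic, boosted)
      ```

      The point of the sketch is the last step: after the shuffle, nothing in the feed itself tells you which item was ranked for you and which was placed for someone else.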

      This has become an incredibly important topic now that people are using these systems to drive political outcomes that have real effects on society.

      • LandedGentry@lemmy.zip · 16 hours ago

        You’re very fixated on something we all agree with and missing the thrust of the point.

        People want an algorithm, whether it’s parasitic or manipulative or whatever. Most people do not care enough to object. They will pick it over a Mastodon/Lemmy/etc. experience to get curation. That’s all we’re saying.

        • FauxLiving@lemmy.world · 13 hours ago

          I’m carrying on multiple conversations in this thread, so I’ll just copy what I said in a different thread:

          Of course people like these features; these algorithms are literally trained to maximize how likable their recommendations are.

          It’s like how people like heroin because it perfectly fits our opioid receptors. The problem is that you can’t simply trust that the person giving you heroin will always have your best interests in mind.

          I understand that the vast majority of people are simply going to follow the herd and use the thing that is most like Twitter, recommendation feed and all. However, I also believe that it is a bad decision on their part and that the companies that are intaking all of these people into their alternative social networks are just going to be part of the problem in the future.

          We, as the people who are actively thinking about this topic (as opposed to the people just moving to the blue Twitter because it’s the current popular meme in the algorithm), should be considering the difference between good recommendation algorithm use and abusive use.

          Having social media controlled by private entities that use black-box recommendation algorithms should be seen as unacceptable, even if people like it. Bluesky’s user growth is fundamentally due to people recognizing that Twitter’s systems are being used to push content they disagree with. Yet they’re simply moving to another private social media network that’s one sale away from being the next X.

          It’d be like living under a dictatorship and deciding that you’ve had enough so you’re going to move to the dictatorship next door. It may be a short-term improvement, but it doesn’t quite address the fundamental problem that you’re choosing to live in a dictatorship.