• Melmi@lemmy.blahaj.zone · 6 months ago (edited)

    Google destroys their own search engine by encouraging terrible SEO nonsense and then offers the solution in the form of these AI overviews, cutting results out of the picture entirely.

    You search for something on the web nowadays and half the results are written by AI anyway.

    I don’t really care about the “human element” or whatever, but AI is such a hype train right now. It’s still early days for the tech, it still hallucinates a lot, and I fundamentally can’t trust it—even if I trusted the people making it, which I don’t.

    • belated_frog_pants@beehaw.org · 6 months ago

      You will never be able to consistently find the truth when the AI is optimized to produce any result at all, as long as something gets shown with ads next to it.

      AI has no way of understanding truth. It’s autocomplete trained on anything it can find, true or not.
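
      To make the “autocomplete” point concrete, here’s a toy next-token sketch in Python. The probability table is entirely made up for illustration (a real LLM learns billions of weights from scraped text), but the mechanism is the same: extend the prompt with whatever continuation is most likely in the training data. Whether the result is true never enters the computation.

      ```python
      # Toy "autocomplete": pick the most likely next word given the last two.
      # These probabilities are invented for illustration; a real model learns
      # them from whatever text it was trained on, accurate or not.
      next_word_probs = {
          ("the", "moon"): {"is": 0.5, "landing": 0.3, "hoax": 0.2},
          ("moon", "is"): {"made": 0.6, "round": 0.4},
          ("is", "made"): {"of": 1.0},
          ("made", "of"): {"cheese": 0.6, "rock": 0.4},  # popular beats true
      }

      def autocomplete(prompt: str, steps: int = 4) -> str:
          words = prompt.split()
          for _ in range(steps):
              context = tuple(words[-2:])
              candidates = next_word_probs.get(context)
              if not candidates:
                  break
              # Greedy decoding: highest-probability word wins, true or not.
              words.append(max(candidates, key=candidates.get))
          return " ".join(words)

      print(autocomplete("the moon"))  # -> "the moon is made of cheese"
      ```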

  • RickRussell_CA@beehaw.org · 6 months ago

    Oh good, now when I search I’ll have to wade through the effluent of AI-produced pablum to find actual human journalism.

  • FluffyPotato@lemm.ee · 6 months ago

    Google has been unusable in English for at least 2 years now. Searching in Estonian makes Google behave like it did when it was still good. I wonder how long that’s going to last.

  • Admiral Patrick@dubvee.org · 6 months ago

    Sigh. AI has basically added a rocket booster to the enshittification train.

    Hopefully this doesn’t impact DDG.

    • 👍Maximum Derek👍@discuss.tchncs.de · 6 months ago

      I mentioned this in another thread, but I do worry that Google is eventually going to infect the APIs that metasearch engines like DDG, Kagi, SearXNG, etc. depend on.

      In my experience, a lot of the sysadmins who run high-traffic sites treat all bots as scrapers that have to be blocked or slowed to a crawl, then make special allowances for googlebot, bing/msnbot, and a few others (the robots.txt sketch below shows the pattern). That means there is a massive uphill climb (beyond the technical one) to building a new search engine from scratch. With Google and MS both betting the farm on LLMs, I fear we’re going to lose access to two of the most valuable inverted indexes of the web.
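
      For illustration, a minimal robots.txt of the kind those sysadmins deploy. The user-agent tokens are the crawlers’ real ones, but the policy itself is a hypothetical example, and many sites enforce the same thing at the CDN or firewall, where a crawler can’t simply ignore it:

      ```
      # Let the incumbent crawlers index everything (empty Disallow = allow all)
      User-agent: Googlebot
      Disallow:

      User-agent: Bingbot
      Disallow:

      # Everyone else -- including any new engine's crawler -- is shut out
      User-agent: *
      Disallow: /
      ```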

      • MajorHavoc@programming.dev · 6 months ago (edited)

        Yeah. This is going to suck worse before it gets any better. The good news is that all the useful content (outside of walled gardens) is going to be here in the Fediverse.

        The other good news is that the state of cybersecurity investment is abysmal, and the walled-garden content is going to get breached/leaked/pirated a lot for a long time to come.

      • Admiral Patrick@dubvee.org · 6 months ago

        I fear that as well. I use SearXNG at home, so I’m expecting it to start dying a death of a thousand cuts soon.

        Was thinking about standing up (or contributing to) either YaCy or Stract, but you made a good point about the bot allowances for the Googlebot et al. crawler UAs. Wonder how frowned upon it would be to spoof the crawler UA in a self-hosted one? (Sketch of the spoof below.)
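
        For what it’s worth, the spoof itself is trivial; a minimal Python sketch (the URL is a placeholder, the UA string is Googlebot’s published token):

        ```python
        import requests

        # Googlebot's published User-Agent token. Sending it is easy, but many
        # sites verify "real" Googlebot with a reverse-DNS check on the
        # requesting IP (*.googlebot.com / *.google.com), so a self-hosted
        # crawler fails that check no matter what UA it sends.
        GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                        "+http://www.google.com/bot.html)")

        resp = requests.get("https://example.com/",
                            headers={"User-Agent": GOOGLEBOT_UA})
        print(resp.status_code)
        ```

        So it would likely only fool sites that don’t verify, on top of being pretty hostile to the sites’ wishes.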

        • Butterbee (She/Her)@beehaw.org · 6 months ago

          I’ve just started using SearXNG… you expect it to die soon? Is it because you expect other search engines to follow suit until there are no search engines anymore, only hallucination machines?