• FaceDeer@fedia.io · 7 months ago

    Have you ever tried Bing Chat? It does that. LLMs that do web searches and make use of the results are pretty common now.

    • Bitrot@lemmy.sdf.org · edited · 7 months ago

      Bing uses ChatGPT.

      Despite using search results, it also hallucinates, like when it told me last week that IKEA had built a model of aircraft during World War 2 (uncited).

      I was trying to remember the name of a well-known consumer goods company that had made an aircraft and also had an aerospace division. The answer is Ball, the jar and soda can company.

      • NateSwift@lemmy.dbzer0.com · 7 months ago

        I had it tell me a certain product had a feature it didn’t and then cite a website that was hosting a copy of the user manual… which didn’t mention said feature. Having it cite sources makes it way easier to double-check whether it’s spewing bullshit, though.

      • FaceDeer@fedia.io · 7 months ago

        Yes, but it shows how an LLM can combine its own capabilities with information taken from web searches.

        The question I’m responding to was:

        I wonder why nobody seems capable of making a LLM that knows how to do research and cite real sources.

        And Bing Chat is one example of exactly that. It’s not perfect, but I wasn’t claiming it was. Only that it was an example of what the commenter was asking about.

        As you pointed out, when it makes mistakes you can check them by following the citations it has provided.