• RememberTheApollo_@lemmy.world · 4 days ago

    I don’t know how people can be so easily taken in by a system that has been proven wrong about so many things. Just yesterday I got an AI search response that dramatically understated an issue by citing an unscientific, ideologically driven website with every interest and reason to minimize said issue. The actual studies showed a 6x difference. It was blatant AF, and I can’t understand why anyone would rely on such a system for reliable, objective information. I’ve noted several incorrect AI responses to queries, and people mindlessly citing those responses without verifying the data or its source. People gonna get stupider, faster.

    • cley_faye@lemmy.world · 4 days ago

      I don’t know how people can be so easily taken in by a system that has been proven to be wrong about so many things

      Ahem. Wasn’t there an election recently, in some big country, with an uncanny similarity to that?

    • hansolo@lemm.ee · 4 days ago

      I like to use GPT to create practice tests for certification exams. Even when I give it very specific guidance to double-check what it thinks is a correct answer, it will gladly tell me I got questions wrong, and I have to ask it to triple-check before it concedes that the right answer is the one I actually gave.

      • RememberTheApollo_@lemmy.world · 4 days ago

        And in that amount of time it probably would have been just as easy to type up a correct question and answer yourself, rather than repeatedly corral an AI into checking itself against an answer you already know. Your method works for you because you have the knowledge. The problem lies with people who don’t, and who will accept and use incorrect output.

        • hansolo@lemm.ee · 4 days ago

          Well, it makes me double-check my knowledge, which helps me learn to some degree, but it’s not what I’m trying to make happen.

    • WaitThisIsntReddit@lemmy.world · 4 days ago

      That’s why I only use it as a starting point. It spits out “keywords” and a fuzzy gist of what I need, and then I can verify or experiment on my own. It’s just a good place to start, or a reminder of things you once knew.