• lol3droflxp@kbin.social · 1 year ago

    I get that this is expensive. However, it should also work from RAM if you accept slower speeds, I guess. The question, of course, is whether it's still usable then.
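    A rough back-of-envelope sketch of why RAM is so much slower: autoregressive decoding is largely memory-bandwidth bound, since each generated token streams the model's weights once. The sizes and bandwidths below are illustrative assumptions, not measurements of any specific hardware or model.

    ```python
    # Upper-bound estimate for decode speed when weight streaming dominates.
    # All numbers here are illustrative assumptions.

    def tokens_per_second(model_bytes: float, bandwidth_bytes_per_s: float) -> float:
        """Approximate tokens/s if every token reads all weights once."""
        return bandwidth_bytes_per_s / model_bytes

    GB = 1e9
    model_size = 70 * GB         # a ~70 GB quantized large model (assumption)
    ddr_ram = 50 * GB            # typical dual-channel desktop RAM bandwidth (assumption)
    gpu_hbm = 900 * GB           # high-end GPU memory bandwidth (assumption)

    print(f"CPU/RAM: ~{tokens_per_second(model_size, ddr_ram):.2f} tok/s")
    print(f"GPU:     ~{tokens_per_second(model_size, gpu_hbm):.2f} tok/s")
    ```

    Under these assumptions the gap is more than an order of magnitude, which is why RAM-only inference tends to fall from "slow" to "barely usable".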

    • abhibeckert@beehaw.org · 1 year ago (edited)

      GPT-4 is already kinda slow - it works best as a "conversational" tool where you ask follow-up questions and clarify things that have already been said. That's painful when you have to wait 10 seconds for a response. I couldn't imagine it being useful if it took minutes.