
  • as a starting point to learn about a new topic

    No. I’ve used several models to “teach” me about subjects I already know a lot about, and they all frequently get facts wrong. Why would I then trust one to teach me about something I don’t know?

    to look up a song when you can only remember a small section of lyrics

    No, because traditional search engines do that just fine.

    when you want to write a block of code that is simple but monotonous to code yourself

    See this comment.

    suggest plans for how to create simple structures/inventions

    I guess I’ve never tried this.

    Anything with a verifiable answer that you’d ask on a forum can generally be answered by an LLM, because they’re largely trained on forums and there’s a decent chance the training data included someone asking the question you are currently asking.

    Kind of, but here’s the thing: it’s rarely faster than just using a good traditional search, especially if you know where to look and how to use advanced filtering features. Also (and this is key), verifying the accuracy of an LLM’s answer requires about the same amount of work as not using an LLM in the first place, so I default to skipping the middleman.

    Lastly, I haven’t even touched on the privacy nightmare that these systems pose if you’re not running local models.


  • Creating software is a great example, actually. Coding absolutely requires reasoning. I’ve tried using code-focused LLMs to write blocks of code, or even some basic YAML files, but the output is often unusable.

    It rarely makes syntax errors, but it will do things like reference libraries that haven’t been imported or hallucinate functions that don’t exist. It also constantly misunderstands the assignment and creates something that technically works but doesn’t accomplish the intended task.
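
    To make that concrete, here’s a minimal sketch of the kind of failure I mean, assuming a hypothetical prompt like “read a YAML config and print the service names” and the PyYAML library. The commented-out line is a deliberately made-up call of the sort I keep getting back; the lines below it are what actually works:

    ```python
    import yaml  # PyYAML; the LLM output often skips this import entirely

    # Typical hallucinated output: yaml.load_config() does not exist in PyYAML
    # config = yaml.load_config("services.yaml", validate=True)

    # Working version using the real API
    with open("services.yaml") as f:
        config = yaml.safe_load(f)  # parse the YAML document into Python objects

    # Print each service name, assuming a top-level "services" mapping
    for name in config.get("services", {}):
        print(name)
    ```

    The hallucinated call looks perfectly plausible, which is exactly why checking the output ends up costing as much time as writing it myself.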