Except when it hallucinates, draws from biased sources, or straight-up responds with false information.
I’d rather look through the available links myself and trace things back to their direct source. AI isn’t trained to look specifically for factual information. Unfortunately, a lot of people aren’t trained for that, either. But we can still educate ourselves. Relying on a bot puts one more layer between the information you receive and the source that created it.
I’d rather get my information from as close to the original source as possible. Only then can I determine if the source is even worth trusting in the first place.
Thanks for replying. I prefer when people actually articulate their disapproval of something rather than just downvoting it, as it helps the other person understand more.
Your comment is very reasonable, and it makes me think I perhaps give them too much credit when it’s a subject I’m not an expert in. We have embraced LLMs at work as software engineers at a small company, and they save us so much time on the stuff we do over and over again. But that’s because we know the subject matter, so it’s much easier to see when they’re hallucinating. I should be more cautious when using them for stuff I’m not familiar with.
I work for a good company, and we save so much time building enterprise software with LLMs as tools that we recently got a pay rise and a reduction in hours on the same day.
😂.
That made me chuckle. Naughty LLM.
On some level, though, I don’t really get the disdain for them. Search is a nightmare now, and it’s a lot easier to just have the LLM do it for you.
When I use LLMs for search, I always ask for sources and then follow up.