May 28, 2025, 05:36 PM
92fstech
Poll: Searching the internet, AI or Old School
quote:
As opposed to asking a LLM a question, getting an answer that may or may not be accurate, and then just blindly/lazily accepting the answer with the assumption that it must be correct. Which is how many/most people seem to be approaching using a LLM AI, despite the fact that there's a significant chance that it's either inaccurate or even outright hallucinated/bullshitted altogether.
You summed up my concerns perfectly in this paragraph. The problem isn't the tool, it's people's lazy and irresponsible usage of it.
May 29, 2025, 10:45 AM
220-9er
They're different sides of the same coin.
May 29, 2025, 10:47 AM
HRK
quote:
Originally posted by 92fstech:
quote:
As opposed to asking a LLM a question, getting an answer that may or may not be accurate, and then just blindly/lazily accepting the answer with the assumption that it must be correct. Which is how many/most people seem to be approaching using a LLM AI, despite the fact that there's a significant chance that it's either inaccurate or even outright hallucinated/bullshitted altogether.
You summed up my concerns perfectly in this paragraph. The problem isn't the tool, it's people's lazy and irresponsible usage of it.
Isn't that a universal problem with people assessing information?
May 29, 2025, 12:19 PM
92fstech
quote:
Isn't that a universal problem with people assessing information?
Absolutely. But AI just makes it easier, and it facilitates laziness on the content creation as well as the content consumption sides.
It also makes it harder to discern. When an idiot writes or produces something, there are usually a lot of other clues that the person is an idiot. AI can produce a technically perfect presentation with absolutely incorrect information.
Early AI was pretty easy to discern from a human writer because it was so sterile and stilted. Lately, it's gotten better at sounding human and sometimes it's hard to tell.
Gun reviews are one example. When I'm considering a new gun, I like to read real-life reviews from someone who has actually used it. I can research all the stat and measurement comparisons I want, but I value that real-life perspective on how the gun handles and a human impression of its quality. In the last few years, when I'm looking for that kind of stuff, my search results take me to all kinds of AI-generated articles that are basically just the stats and measurements vomited out in prose. Worse, some of these publications have an author's name ascribed to them, so either the writer got lazy and let AI do the work, or the publication is cheap and dishonest.
The old-fashioned way is better and more useful. There are tons of excellent, thoughtful reviews right here on this forum. Check out some of cslinger's recent posts in the pistols section, for example. I don't want to live in a world where stuff like that goes away and gets replaced by some soulless bot regurgitating numbers from a manual.
May 30, 2025, 11:45 AM
architect
quote:
Originally posted by RogueJSK:
quote:
Originally posted by 92fstech:
I want to know and have personally chosen every word I put in a document like that.
110% agreed.
quote:
I think there are legitimate applications where AI can serve a valuable purpose, but overall our overdependence is just going to make us lazier and dumber, because that's what people do.
Also agreed.
A good example is something like generating computer code. LLM AIs are apparently quite good at generating programming code.
A fairly recent report claims that malware can be injected into AI-generated code through prompt manipulation. You can even "pollute" a code repository so that the malware is injected into any download.
Bottom line: AI may be useful for coding, but it exposes an additional attack surface that must be examined. If the injected code is subtle enough, auditing it will likely be more work than writing the code yourself.
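For what it's worth, the repository-pollution risk described above is a supply-chain problem, and one standard (if partial) mitigation is hash pinning: record a cryptographic digest of each dependency artifact after a human has reviewed it, and refuse anything that doesn't match. A minimal sketch in Python (the thread doesn't name a language, and the artifact contents here are made up for illustration):

```python
import hashlib

def verify_artifact(data: bytes, expected_sha256: str) -> bool:
    """Return True only if the artifact's SHA-256 matches the pinned value.

    `expected_sha256` is recorded once, after a human review of the
    artifact, so a later tampered download fails verification.
    """
    return hashlib.sha256(data).hexdigest() == expected_sha256

# Pin recorded at review time (hypothetical artifact contents).
pinned = hashlib.sha256(b"reviewed package contents").hexdigest()

print(verify_artifact(b"reviewed package contents", pinned))        # untampered copy
print(verify_artifact(b"reviewed + injected payload", pinned))      # tampered copy
```

This won't catch a malicious suggestion the first time it's reviewed, which is exactly why the audit burden the post describes doesn't go away; it only catches later tampering. Package managers offer the same idea natively, e.g. pip's `--require-hashes` mode.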