News Focus

Re: blackhawks post# 532429

Monday, June 30, 2025 11:30:35 PM

Post# of 575593
If you click on "Learn more", you get a very brief guide to generative AI.

Now here's something interesting. I think this explains the "hallucinations" of LLMs better than anything I've seen so far:

It may make things up. When generative AI invents an answer, it's called a hallucination. Hallucinations happen because unlike how Google Search gets information from the web, LLMs don't gather information at all. Instead, LLMs predict which words come next based on user inputs.

That is clear and easy to understand. Actually helpful. This is where you find it:

https://support.google.com/websearch/answer/13954172?sjid=1474837517766669377-NA#zippy=%2Cai-can-will-make-mistakes%2Calways-evaluate-responses%2Cuse-code-with-caution
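
As a toy illustration (my own sketch, not anything from Google's page, and using a made-up miniature corpus), here is what "predict which words come next" means in its simplest possible form: a bigram model in Python that only counts which word tends to follow which. A real LLM uses a neural network trained on vastly more text, but the core loop is the same.

    from collections import Counter, defaultdict

    # Toy corpus -- purely made up for illustration.
    corpus = ("the cat sat on the mat . "
              "the dog sat on the rug . "
              "the cat chased the dog .").split()

    # Count which word follows which (a bigram model).
    following = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        following[prev][nxt] += 1

    def predict_next(word):
        # Return the word most often seen after `word` -- no lookup,
        # no facts, just frequency statistics.
        seen = following.get(word)
        return seen.most_common(1)[0][0] if seen else "."

    # Generate text one predicted word at a time.
    word = "the"
    output = [word]
    for _ in range(6):
        word = predict_next(word)
        output.append(word)
    print(" ".join(output))   # prints: the cat sat on the cat sat

Run it and you get "the cat sat on the cat sat": fluent-looking, completely meaningless, and produced without consulting any source of facts. That, in miniature, is why an LLM can confidently make things up.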
