u/Relative-Bottle-8498


Gemini is sorely lacking recently.

I have noticed that recently, Gemini is constantly hallucinating basic information, such as phone specs from last year. When I ask it about something, it tends to say: "[INSERT THING HERE] doesn't exist. However, did you mean [THE EXACT SAME THING]?" This goes on for a while until I give up.

It seems to prioritise pleasing its users over giving factual information. For example, I was debugging my Chromebook recently, and it kept telling me that what I wanted to do was possible. I spent two hours following its instructions, then finally asked ChatGPT to research whether Gemini was hallucinating, and lo and behold, it was! Even after I told Gemini it was wrong, explained the issue, and asked it to fact-check itself with research, it still hallucinated.

Has anyone else been having these problems recently?
