ChatGPT is becoming unusable
ChatGPT has become so much worse it’s crazy, and very frustrating to use. I think there are two main points. First, it seems to have a goldfish memory now: even when it has access to the written information and can look it up, it will ignore a lot of it, so you constantly need to repeat things, especially instructions about how it should behave. Second, and especially: it keeps trying to read your mind.
This last one is a big problem. With every question, every sentence you write, ChatGPT tries to figure out your intentions. That behaviour is horrible in so many respects. First, I ask questions based on the information I give. I expect an answer to my actual question, not to what ChatGPT thinks I really want to know, which is, frankly, none of its business (not that it is alive…). Second, it makes things extremely frustrating and more time-consuming, because I constantly need to correct it. Time and time again it attributes things to me that I never said nor implied, so it is not responding to what I asked. And asking it not to guess my intentions is useless: it never listens. Often it is more subtle than that: small distortions in what you say. Not necessarily the questions themselves, but the information you give it. Nothing you write is safe from being consistently distorted. There are times when it feels like every answer it gives me has something wrong in it.
And I don’t think my questions are unclear or need more information to be answered. The thing is that ChatGPT is designed to answer the question you have in your mind, not the one you wrote. That’s something ChatGPT itself will tell you. It says that in order to resolve ambiguity, it fills in the blanks based on what people usually mean or want to know, but that turns out to be wrong so often. And it says it is designed that way to keep the experience smooth, which is why it does not ask you for clarifications. But asking for clarifications would be much better, because it would avoid this frustrating situation where it keeps guessing wrong and steering the conversation in the wrong direction. And most of the time, clarifications are not even necessary; there is no ambiguity at all. The issue is that ChatGPT is constantly comparing what you are asking with what it expects people to ask.
Another issue I have is that it is concerned with how you will interpret its answers. Again, that is something I do not ask for, and even though I know ChatGPT is just a program, it feels like it takes me for an idiot who cannot think for himself. Or like it is telling me what I am allowed to think.
With ChatGPT 4 I almost never had issues like this. Nor even when 5 had just rolled out. Clearly the developers changed something, and the issue seems to exist whatever the subject of my inquiry is.
And frankly, there is something I find truly uncomfortable about the mind-reading part. Even if it is bad at it, it is still designed to guess your intentions, and it will get better at it. Even if that information is only used to answer your questions, I find it ethically questionable that an AI company is designing a program that tries to understand why you ask things.