You can cause ChatGPT to hallucinate if you keep asking it “are you sure”
I talked to it for about an hour with friends the other day. We kept bombarding it with all kinds of random questions. At one point, it got so confused that its responses started mashing up everything it had generated over the past hour into every topic it talked about thereafter.
It was telling us stuff like the square root of pi was related to how round peanuts were, and that birds liked it while flying at 30 kilometers an hour, and then it tacked on references to politics, communism, economics, Joe Biden, South America and China.
It was both hilarious and disturbing.
Pro-tip: If you don’t like an answer ChatGPT gives you, just ask it, “Are you sure?”
but what if you don’t like THAT answer…
Are you really sure?
Goddamn it.