Quote:
Originally Posted by Cyberbob
I used ChatGPT to summarize a 16 chapter book for me yesterday, chapter by chapter. It worked really well until I told it to summarise chapter 17.
It outputted a genuinely convincing summary, just as it had 16 times before. I asked it "Are you sure that's the summary for chapter 17?" and it simply replied
Like it outright lied to me, and all of a sudden I couldn't trust any of the previous entries either. Dangerous.
It does that quite often. Did you ask for one chapter at a time? I was trying it with a game, and it seems to get stuck after a while.
Not sure if I posted this here already, but I also had it try to work out a formula for me. The formula was longer, but it essentially went:
2 + 2 + 2 = 6, and that is how you get the answer 10.
It seems to do a lot better at language than at mathematics.