Old 18th April 2023, 11:27     #17
Cyberbob
 
I used ChatGPT to summarize a 16-chapter book for me yesterday, chapter by chapter. It worked really well until I told it to summarize chapter 17.
It output a genuinely convincing summary, just as it had 16 times before. I asked it "Are you sure that's the summary for chapter 17?" and it simply replied

Quote:
I apologize for the confusion. I made a mistake in my previous response. That book does not have 17 chapters. It has 16 chapters, and my previous response mistakenly referred to a non-existent chapter. I apologize for any inconvenience this may have caused.
Like it outright lied to me, and all of a sudden I couldn't trust any of the previous summaries either. Dangerous.
__________________
ɹǝʌo sᴉ ǝɯɐƃ ʎɥʇ