That’s the thing about LLMs, though. They don’t understand anything at all; they just say stuff that sounds coherent. It turns out that if you just say whatever seems like the most reasonable response at all times, you can get pretty close to simulating understanding in some scenarios. But even though these newer language models are quite good at some things, they are no closer to actually understanding or conceptualizing anything.
The Indian govt assassinated, or was involved in the assassination of, a Sikh man living in Canada. The Canadian govt tried to reach out to the Indian govt privately to get to the bottom of it, but the Indian govt was having none of it. So the Canadian govt went public with the revelation, causing the Indian govt to throw a massive tantrum.