• 0 Posts
  • 11 Comments
Joined 1 year ago
Cake day: June 12th, 2023

  • Let’s put it this way: If in our lifetime we can simulate the intelligence of a vinegar fly as general intelligence, that would be a monumental landmark in AGI. And we’re far, far, far away from it.

    I get what you mean here, and I agree with it if we’re talking about current “AI”, which isn’t anywhere close. I know, because I’ve programmed some simple “AIs” (mainly ML models) myself.

    But your comparison to ancient Egypt is somewhat lacking, considering we had the aptly named Dark Ages between then and now.

    Lots of knowledge was lost throughout humanity’s history, but ever since the printing press, and more recently the internet, came into existence, this problem has all but disappeared. As long as humanity doesn’t nuke itself back to the Dark Ages, I reckon we aren’t that far away from AGI, or at least something close to it. Maybe not in my lifetime, but another ~2000 years seems a little extreme.


  • Whenever I hear someone say that something is impossible with current technology, I think about my grandma. When she was a kid, only some important people had telephones. Doctors, police, etc.

    In her lifetime we went from that to today, and, since she’s still alive, she’ll see even further into the future.

    Whenever someone calls something impossible, I think about how far technology will progress in my own lifetime, and I know that they’ve got no idea what they’re talking about. (Unless, like you said, it’s against the laws of physics. But sometimes even then I’m not so sure, because it’s not like we understand those entirely.)