Profile pic is from Jason Box; it depicts a projection of Arctic warming to the year 2100 based on current trends.

  • 1 Post
  • 792 Comments
Joined 1 year ago
Cake day: March 3rd, 2024

  • I’m on the good row, depending on the store and the distance back. On one occasion I even returned a cart to the store by car (a good mile or so away) - it had been left in our neighborhood. I was doing it for myself and the neighbors, not the store, so I don’t know where that puts me. It’s the opposite of true neutral, since presumably someone poor had used it and discarded it.

  • It was fun to learn how things worked, and when things worked as planned (finally). It was when they didn’t work that things got annoying and frustrating, and with assembly language offering basically no error codes or any other help, it was just…nope, that wasn’t right. Maybe followed by cycling the computer off and on because it had locked up. Still have my old Mapping the Commodore 64 book on the shelf. Huge resource.

  • Real, though heavily simplified, answer. All atoms could be magnets, but in most of them the electrons’ magnetic contributions cancel out, so there’s no net force. In fact just about everything can be explained by what the electron orbitals are doing. Even why the chair you’re sitting in feels solid. It’s the orbitals. See Richard Feynman’s bit on magnets and the deeper lesson on knowing the right questions to ask.

  • But he wasn’t. At least in the movie version, he and Banner had failed a few times, maybe more times than we saw on screen. Something happened when Tony wasn’t there that sparked Ultron to become aware and catch Jarvis off guard. I’d give him credit for getting it 99% of the way there, same with Vision, but he didn’t make that final jump; it happened on its own.

    And Jarvis wasn’t AGI. It seems like it to us, but since Ultron was apparently the big moment of A(G)I in the MCU even with Jarvis around all that time, Jarvis was just a very flexible and even self-aware script that would never do something of his own accord, only follow Tony’s orders. I think even Ultron catches on to that in the brilliant few seconds of waking and realization with his “why do you call him Sir?”

  • Current LLMs would end that sketch quickly, agreeing with everything the client wants. Granted, they wouldn’t be able to produce the result, but as far as the expert’s job of narrowing down the problems with the request goes, ChatGPT would be all excited about making it happen.

    The hardest thing to do with an LLM is to get it to disagree with you, even with a system prompt. The training to make the user happy with the results runs too deep to undo.
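
    For what it’s worth, here’s a minimal sketch of what “with a system prompt” means in practice, using the OpenAI Python client. The model name and the prompt wording are just illustrative assumptions, not anything specific from the comment above:

    ```python
    # Minimal sketch: asking a model to push back via a system prompt.
    # Model name and prompt wording are placeholders.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {
                "role": "system",
                "content": (
                    "You are a blunt technical reviewer. If a request is "
                    "contradictory or unworkable, say so plainly and explain "
                    "why before suggesting alternatives."
                ),
            },
            {
                "role": "user",
                "content": (
                    "Draw seven red lines, all strictly perpendicular to each "
                    "other, some with green ink and some with transparent ink."
                ),
            },
        ],
    )

    print(response.choices[0].message.content)
    ```

    Even with an instruction like that, the model still tends to agree far more often than it pushes back, which is the point above.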