• @Corkyskog@sh.itjust.works

    This reminds me of how someone illustrated a machine-learning problem, what I want to say is called "gradient descent" getting stuck in local minima. This was way back in the 2000s, before all the more recent AI stuff.

    Basically, the problem, as I remember it being described in a TED Talk, was this: think of a sphere whose surface is full of tunnels, where only one tunnel leads to the core (the answer). Some tunnels get really close to the core, but only one actually reaches it. The AI would get stuck diving down one of these near-miss holes, burning insane amounts of computational power digging for the answer, never realizing that if it backed up a bit and went down the hole next door, it could reach the core.
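    (Not from the talk, just my own toy sketch of the idea in Python.) Plain gradient descent on a bumpy one-dimensional function settles into whichever "tunnel" it starts above; started from x = 2.0 it finds the shallow local minimum and never notices the deeper one next door:

        # Toy landscape: f has a shallow local minimum ("a near-miss tunnel")
        # near x = 1.347 and the true global minimum ("the core") near x = -1.473.
        def f(x):
            return x**4 - 4 * x**2 + x

        def grad(x):
            return 4 * x**3 - 8 * x + 1  # derivative of f

        x = 2.0    # start on the slope above the shallow tunnel
        lr = 0.01  # step size
        for _ in range(1000):
            x -= lr * grad(x)  # always step downhill, never back up

        print(f"converged to x = {x:.3f}, f(x) = {f(x):.3f}")
        # -> converged to x = 1.347, f(x) = -2.619 (the shallow local minimum)
        # Restarting from x = -2.0 lands at x = -1.473, f(x) = -5.444 instead:
        # the deeper minimum "one tunnel over".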

    One way they tackled this was by developing the game "Foldit", which let regular old users manipulate the proteins themselves. People who had the distributed-computing screensaver running at home (Rosetta@home, if I remember right) would watch the folding display skip right over what seemed to be the correct shape and get frustrated that they couldn't help guide it, which is exactly what Foldit let them do.

    This might be a different TED Talk, but it is about the same subject.