Apple fans are starting to return their Vision Pros

The return window for the very first Apple Vision Pro buyers is fast approaching, and some have taken to social media to explain why they won’t be keeping their headsets.

  • linearchaos@lemmy.world
    10 months ago

    Some people are returning it because they expected using VR to be immediately comfortable. The headset is heavier and more poorly strapped/distributed than the ‘alternatives’, but it’s also graphically far more stunning. I honestly hope they stay in the game and push the competitors to up theirs. Maybe we can get pancake lenses, foveated rendering, and eye tracking in a $1500 package.

    • darth_helmet@sh.itjust.works
      10 months ago

      So, the Quest Pro? Foveated rendering only matters if you don’t have the graphics throughput to render it all, so I don’t totally buy that it’s key to a good VR headset so much as it helps you get away with cheaper silicon. Maybe the TDP ends up enough lower that it enables a slimmer design.
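
The throughput point is easy to see with a back-of-the-envelope sketch. The Python below is purely illustrative: the field of view, panel resolution, and shading tiers are made-up assumptions, not specs of the Vision Pro, Quest Pro, or any other headset; it just shows why shading coarsely away from the gaze point slashes fragment work.

```python
# Rough sketch of the throughput argument for foveated rendering.
# All numbers here are illustrative assumptions, not real headset specs.

def pixels_in_cone(half_angle_deg, total_fov_deg, total_pixels):
    """Approximate pixel count inside a cone around the gaze point,
    assuming pixels are spread uniformly over the field of view (a simplification)."""
    frac = (half_angle_deg / (total_fov_deg / 2)) ** 2  # covered area scales ~ angle squared
    return min(1.0, frac) * total_pixels

TOTAL_FOV_DEG = 100          # assumed per-eye field of view
TOTAL_PIXELS = 3660 * 3200   # assumed per-eye panel resolution

# Assumed shading tiers: full rate near the gaze point, coarser further out.
tiers = [
    (5,  1.0),     # within 5 deg of gaze: shade every pixel
    (15, 1 / 4),   # 5-15 deg: shade at quarter rate
    (50, 1 / 16),  # beyond 15 deg: shade at 1/16 rate
]

shaded = 0.0
prev_pixels = 0.0
for half_angle, rate in tiers:
    ring_pixels = pixels_in_cone(half_angle, TOTAL_FOV_DEG, TOTAL_PIXELS) - prev_pixels
    shaded += ring_pixels * rate
    prev_pixels += ring_pixels

print(f"Shaded fragments per eye: {shaded:,.0f} of {TOTAL_PIXELS:,}")
print(f"Fragment work vs. naive rendering: {shaded / TOTAL_PIXELS:.1%}")
```

With these assumed tiers, fragment shading drops to roughly a tenth of brute-force rendering, which is the ‘cheaper silicon’ / lower-TDP argument in a nutshell.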

      • Kage520@lemmy.world
        10 months ago

        I think foveated rendering also helps with immersion. Being able to blur things you aren’t specifically looking at, or that are farther away, is a closer match to reality.
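
The ‘blur what you’re not looking at’ idea maps onto a simple acuity-falloff model. The sketch below uses a rough hyperbolic falloff with an assumed half-resolution constant; it is not how any shipping headset is documented to choose its blur, just an illustration of matching blur to where the eye can’t resolve detail anyway.

```python
# Illustrative only: pick a peripheral blur radius from angular distance to the gaze point.
# The falloff constant E2_DEG is an assumed value; real perceptual models and headsets differ.

E2_DEG = 2.5  # assumed eccentricity (degrees) at which resolvable detail drops to half

def relative_acuity(eccentricity_deg: float) -> float:
    """Rough hyperbolic falloff of visual acuity away from the fovea (1.0 at the gaze point)."""
    return E2_DEG / (E2_DEG + eccentricity_deg)

def blur_radius_px(eccentricity_deg: float, max_blur_px: float = 12.0) -> float:
    """More blur where acuity is lower, no blur at the gaze point."""
    return max_blur_px * (1.0 - relative_acuity(eccentricity_deg))

for ecc in (0, 2, 5, 10, 20, 40):
    print(f"{ecc:>2} deg from gaze -> acuity {relative_acuity(ecc):.2f}, "
          f"blur ~{blur_radius_px(ecc):.1f} px")
```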

        • rhythmnova@lemmy.world
          10 months ago

          Reality doesn’t downsample when you’re not looking at it; your eye does that.

          • orgrinrt@lemmy.world
            10 months ago

            As far as I understand (and do correct me if I’ve got it wrong), your eyes can still tell they’re looking at very small, very rapidly blinking lights in close proximity on a flat array, which is why that exact experience mostly feels like the uncanny valley, and why software enhancement/approximation of the effect could be beneficial.

            • rhythmnova@lemmy.world
              10 months ago

              Delayed response, but if you’re talking about the general experience of VR being an uncanny-valley one, then no, I don’t agree. It’s very common for people who use VR to say that they forgot for a moment that it wasn’t real.

          • kava@lemmy.world
            10 months ago

            “reality doesn’t downsample when you’re not looking”

            As far as you know. Maybe that’s the reasoning behind weird stuff in quantum mechanics. The cat is both alive and dead until you open the box and look at it.

            • treesquid@lemmy.world
              10 months ago

              The whole point of the cat thing was to point out the absurdity of the claim that reality isn’t real until you know about it. The cat is already in whatever state you observe when you open the box. It’s not both alive and dead; it’s either alive or dead. The thought experiment isn’t serious, and it’s not supporting the idea that the cat is somehow magically in both states just because you haven’t yet manipulated the lid of a wooden cube.

              • kava@lemmy.world
                10 months ago

                When we talk about the cat being both alive and dead, it’s a simplification to help visualize a quantum phenomenon where particles exist in multiple states simultaneously until measured or observed.

                Schrödinger came up with the cat to represent the absurdity of quantum mechanics because he thought it was absurd, but that doesn’t mean his metaphor isn’t a useful one. Particles like electrons or photons can exist in a state of superposition, where they hold multiple potential states (e.g., spin up and spin down) at the same time. This isn’t just a theoretical curiosity; it’s been experimentally verified in numerous quantum experiments, such as the double-slit experiment.

                The act of measurement in quantum mechanics forces a system to ‘choose’ a definite state from among its superposed states, a process known as wave function collapse. Before measurement, the system genuinely exists in all its possible states simultaneously, not in one state or the other. This is a fundamental aspect of the quantum world.
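
For a concrete version of the superposition-and-measurement claim, here is a minimal numerical sketch of a single qubit: the state is described by two amplitudes at once, and repeated ‘measurements’ each yield one definite outcome with the Born-rule probabilities. The amplitudes are an arbitrary illustrative choice.

```python
import math
import random

# A single qubit in superposition: |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
# The amplitudes below are an arbitrary illustrative choice.
a = math.sqrt(0.3)   # amplitude for outcome 0 ("alive")
b = math.sqrt(0.7)   # amplitude for outcome 1 ("dead")

p0, p1 = abs(a) ** 2, abs(b) ** 2   # Born rule: outcome probabilities
assert abs(p0 + p1 - 1.0) < 1e-9

def measure() -> int:
    """Each measurement yields one definite outcome, sampled with the Born-rule weights."""
    return 0 if random.random() < p0 else 1

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure()] += 1

print(f"Predicted: P(0)={p0:.2f}, P(1)={p1:.2f}")
print(f"Observed over 10,000 runs: {counts[0] / 10_000:.2f}, {counts[1] / 10_000:.2f}")
```

A sampler like this can’t on its own distinguish a genuine superposition from simple ignorance of a pre-existing state; interference experiments such as the double-slit are what rule the latter out.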

      • daltotron@lemmy.world
        10 months ago

        I don’t really look at it as a symptom of a lack of graphics throughput, but more as a benefit of eye tracking, which also potentially benefits, say, the immersion of others by portraying your facial expressions more realistically, or something to that effect. You could also use it as a kind of peripheral for games or software, and Apple currently uses it as a mouse, so it’s not totally useless. But I also can’t imagine that most developers are going to be imaginative enough to make good use of it, if we can’t even think of good uses for basic shit, like haptic feedback.

        Perhaps it breaks even in terms of letting them save money they otherwise would’ve spent on rendering, but I dunno if that’s the case, since the camera has to be pretty low latency, and you still have to dedicate hardware resources to the eye tracking and the foveated rendering itself to get it to look good. Weight savings, then? I just don’t really know. I guess we’ll see, if it gets more industry adoption.
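
The latency concern can be made concrete with rough arithmetic. Every number in the sketch below (refresh rate, tracker latency, render time) is an assumed illustrative value rather than a measurement from any headset; the point is only that the whole gaze-to-updated-foveal-region loop has to fit within a frame or two to go unnoticed.

```python
# Back-of-the-envelope gaze-to-photon budget for foveated rendering.
# Every number here is an assumed, illustrative value.

refresh_hz = 90
frame_budget_ms = 1000 / refresh_hz   # ~11.1 ms per frame

eye_tracker_ms = 5.0                  # assumed camera capture + gaze estimation latency
render_ms = 7.0                       # assumed time to render the frame with the new foveal region
scanout_ms = frame_budget_ms / 2      # rough average wait until the pixels actually light up

gaze_to_photon_ms = eye_tracker_ms + render_ms + scanout_ms
print(f"Frame budget: {frame_budget_ms:.1f} ms")
print(f"Estimated gaze-to-photon latency: {gaze_to_photon_ms:.1f} ms "
      f"({gaze_to_photon_ms / frame_budget_ms:.1f} frames)")
```

If that loop stretches much longer, the high-detail region lags behind where the eye lands after a saccade, which is exactly the ‘has to be pretty low latency’ constraint.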