I swear I’m not Jessica

blahaj.zone account for @[email protected]

  • 22 Posts
  • 211 Comments
Joined 2 months ago
Cake day: October 30th, 2024

  • I doubt a single image can say something so broad-reaching about someone’s cognition. You’d need multiple images viewed under more standardized conditions to know anything with confidence. People could be seeing this on all sorts of displays, from an OLED smartphone screen with a blue light filter enabled to an HDR monitor with custom color calibration. The image could fill different proportions of their field of view, and they could see it in very different emotional contexts. A single data point can’t tell you anything about a person’s cognition.

    On top of that, I have no idea what you mean by high-level vs. low-level vision. Are you talking about bottom-up vs. top-down processing more generally?

    Is it specifically testing to see how your brain changes the color or brightness of the red/orange based on depth cues?

    The spacing of the lines on the circles gives the perception of a red/orange circle behind an object with slits cut out of it. Many of us see an occluded circle because it’s useful to be able to identify objects that are partially hidden. It’s like spotting an animal hiding in a bush at 100 meters, where you can’t rely on binocular cues to perceive depth.
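    For anyone curious, here’s a minimal sketch (Python with Pillow) of how a stimulus like that could be constructed: a solid circle covered by evenly spaced occluding bars, so only slivers of it stay visible. The geometry, colors, and bar spacing are my own guesses, not taken from the actual image:

    ```python
    # Sketch of an occluded-circle stimulus: a red circle covered by
    # gray bars with narrow gaps, the kind of display that can drive
    # amodal completion (perceiving a whole circle behind an occluder).
    # All dimensions and colors below are assumptions for illustration.
    from PIL import Image, ImageDraw

    W, H = 400, 400
    BAR_WIDTH = 30   # width of each occluding bar (assumed)
    GAP = 10         # visible slit between bars (assumed)

    img = Image.new("RGB", (W, H), "white")
    draw = ImageDraw.Draw(img)

    # The "hidden" red/orange circle.
    draw.ellipse([100, 100, 300, 300], fill=(230, 60, 30))

    # Occluding bars laid over it, leaving narrow slits.
    x = 0
    while x < W:
        draw.rectangle([x, 0, x + BAR_WIDTH, H], fill=(120, 120, 120))
        x += BAR_WIDTH + GAP

    img.save("occluded_circle.png")
    ```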

    I’m guessing the color/brightness change is similar to seeing the sun or the moon through a tree canopy. I’ve noticed they look dimmer and more orange in a way similar to this image, and the moon looks bigger and more yellow when it’s seen on the horizon or through trees.

    Regardless, a single image on 196 can’t fully demonstrate which way you lean on anything.