In the laboratory, conventional visual search tasks typically involve finding targets in 2D displays of exemplar images of objects. Real-world visual search, by contrast, frequently involves viewing 3D objects in 3D environments from non-perpendicular angles, as well as relative movement between observers and the items in a search array. Both factors cause the projected images of the objects to change in lawful but unpredictable ways.
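To make the geometric point concrete, the minimal Python sketch below (not from the study; the pinhole camera model, focal length, and viewing angles are illustrative assumptions) shows how the 2D projection of a fixed 3D shape changes systematically, yet lawfully, with viewing angle.

```python
# Minimal sketch: how the 2D projection of a simple 3D shape changes
# with viewing angle under an assumed pinhole camera model.
import numpy as np

def rotate_y(points, angle_deg):
    """Rotate 3D points about the vertical (y) axis by angle_deg degrees."""
    a = np.radians(angle_deg)
    R = np.array([[ np.cos(a), 0, np.sin(a)],
                  [ 0,         1, 0        ],
                  [-np.sin(a), 0, np.cos(a)]])
    return points @ R.T

def project(points, focal=1.0, depth=4.0):
    """Perspective-project 3D points onto a 2D image plane."""
    z = points[:, 2] + depth  # place the object in front of the camera
    return focal * points[:, :2] / z[:, None]

# Corners of a unit cube centered at the origin.
cube = np.array([[x, y, z] for x in (-0.5, 0.5)
                           for y in (-0.5, 0.5)
                           for z in (-0.5, 0.5)])

# The same object yields systematically different 2D images
# when viewed head-on (0 degrees) versus at 45 degrees.
print(project(rotate_y(cube, 0)).round(2))
print(project(rotate_y(cube, 45)).round(2))
```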
Additionally, observers do not always need to memorize a target before searching; they can instead refer to it during the search, for instance by holding a photo of a person while looking for them in a crowd. The study extended the traditional visual search task to examine how having such an external reference affects search performance, as well as the effects of perspective changes brought about by discrete viewing-angle changes (Experiment 1) and continuous rotations of the search array (Experiment 2).
The results showed that searching for 3D objects seen at a non-zero viewing angle yielded performance comparable to searching from 2D exemplar views of the objects, and that searching in rotating virtual-reality arrays yielded performance comparable to searching in stationary arrays. In general, discrete or continuous viewpoint changes had no impact on the search process as measured by eye-movement patterns, accuracy, response time, or self-rated confidence, suggesting that visual search does not require an exact match between retinal images. Seeing the target object during the search also increased observers' confidence and search accuracy, but it prolonged search time because, as the eye movements showed, observers actively looked back at the reference.
The study therefore characterized visual search as an embodied process involving a real-time flow of information between the observer and the environment.
Reference: jov.arvojournals.org/article.aspx?articleid=2783647