In the late 1930s, Hans Wallach ( 5– 7) published three papers in which he moved listeners and sound sources to test his main hypothesis that "Two sets of sensory data enter into the perceptual process of localization, (1) the changing binaural cues and (2) the data representing the changing position of the head" ( 7). Differences in the arrival time of sound at the two ears and interaural level differences were the two "binaural cues" assumed for determining sound-source location. Wallach ( 7) argued that vision, kinesthetic sense, and vestibular function provided head-position cues. He had listeners rotate in a chair in the azimuthal plane, judging the location of sounds presented from different azimuthal loudspeakers. Using 3D geometry and trigonometry, Wallach showed that certain rotational relationships between listeners and sound sources could disambiguate front–back reversal errors, produce azimuthal illusions, and allow listeners to judge sound-source elevation.

For reasons that are not well established, there was almost no follow-up evaluation of Wallach's ( 7) hypothesis that sound-source localization is based on the integration of two cues (auditory-spatial and head-position cues). Boring ( 2), the famous psychologist of the time, summarized his history of sound-source localization with, "Wallach has made it quite clear that localization is not purely auditory, but the product of an integration of auditory, kinesthetic and, when the eyes are open, visual factors." Several studies over the past decade have returned to Wallach's papers, delving more deeply into his hypotheses (e.g., refs.). All of this work has involved azimuthal and elevation sound-source localization judgments when listeners and sound sources move, and it has indicated both the considerable strength of some of Wallach's ideas and some of the weaknesses. What is missing from this literature is the role of listener and sound motion in distance perception. The work in ref. 4 starts to fill in that gap in important ways.

Parallax occurs when the position or direction of an object appears to differ when viewed from two different locations. In Fig. 1, the position of the tractor or the bell tower would appear different if viewed from the car on the left compared with the car on the right (the angle β is smaller than the angle α). If the car moves from left to right, then motion parallax will cause the nearby tractor to appear to move past the car faster than the far-away bell tower (the change in the angle over time is faster for the tractor than for the bell tower). Thus, motion parallax leads to slow-moving objects being perceived as farther away than fast-moving objects. In vision, if one fixates between the tractor and the bell tower as the car moves, the nearby tractor will appear to move opposite to the car's direction, but the bell tower will appear to move in the same direction. Thus, direction of motion can indicate near or far objects. A near object (tractor) may occlude a far object (bell tower) in one view ( Fig. 1). Occlusion might be a cue to judge object distance. In vision, parallax offers several possible cues for depth perception. If only the sound of the objects were used, then motion parallax, but only motion parallax, might provide a cue for distance perception (i.e., when listeners move, sound from near sound sources might appear to move faster across space than sound from far sound sources).

The authors of ref. 4 tested whether motion parallax could be a cue for relative sound-source distance judgments. To do so, they needed a paradigm in which the only cue for judging relative distance was auditory motion parallax: that is, all other auditory distance cues were controlled for. While distance perception has not received a lot of attention in the literature ( 14), most research indicates that there are several possible cues for judging sound-source distance: ( i) expectation/experience: with a priori knowledge about the sound from a source, softer sounds will be judged as farther away than louder sounds; ( ii) sound from reflective surfaces (in a room) will reach a listener's ears slightly after the direct sound arrives, and the ratio of the direct-to-reflected sound level can indicate relative sound-source distance; ( iii) when distances between sound source and listener are large, the atmosphere reduces the level of sound before it reaches the ear in a frequency-dependent manner, so changes in a sound's spectrum can indicate relative distance. In most cases human listeners are not very accurate in making absolute distance judgments, generally underestimating the true distance of a sound source ( 14).

The authors ( 4) first show in a simple experiment that when listeners rotated their heads, they could differentiate a far sound source from a near source better than when they did not move their heads.
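The parallax geometry described above (a moving observer passing a near tractor and a far bell tower) can be sketched numerically: over the same translation, the line of sight to a near object sweeps a much larger angle than the line of sight to a far one. The layout, distances, and function name below are hypothetical illustrations, not values from the article:

```python
import math

def bearing_deg(car_x, obj_x, obj_lateral):
    """Angle (degrees) between the road (x axis) and the line of sight
    from the car at (car_x, 0) to an object at (obj_x, obj_lateral)."""
    return math.degrees(math.atan2(obj_lateral, obj_x - car_x))

# Hypothetical layout: the car drives 5 m along the road; the tractor
# sits 10 m off the road and the bell tower 200 m off the road.
objects = {"tractor": (20.0, 10.0), "bell tower": (20.0, 200.0)}

for name, (ox, oy) in objects.items():
    sweep = bearing_deg(5.0, ox, oy) - bearing_deg(0.0, ox, oy)
    print(f"{name}: line of sight sweeps {sweep:.2f} deg over the 5 m drive")
```

With these made-up numbers the tractor's bearing sweeps roughly 7 degrees while the bell tower's sweeps under 2 degrees, which is the sense in which near objects "move faster" past a moving observer than far objects.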