Similar documents (20 results found)
1.
The ability to judge heading during tracking eye movements has recently been examined by several investigators. To assess the use of retinal-image and extra-retinal information in this task, the previous work has compared heading judgments with executed as opposed to simulated eye movements. For eye movement velocities greater than 1 deg/sec, observers seem to require the eye-velocity information provided by extra-retinal signals that accompany tracking eye movements. When those signals are not provided, such as with simulated eye movements, observers perceive their self-motion as curvilinear translation rather than the linear translation plus eye rotation being presented. The interpretation of the previous results is complicated, however, by the fact that the simulated eye movement condition may have created a conflict between two possible estimates of the heading: one based on extra-retinal solutions and the other based on retinal-image solutions. In four experiments, we minimized this potential conflict by having observers judge heading in the presence of rotations consisting of mixtures of executed and simulated eye movements. The results showed that the heading is estimated more accurately when rotational flow is created by executed eye movements alone. In addition, the magnitude of errors in heading estimates is essentially proportional to the amount of rotational flow created by a simulated eye rotation (independent of the total magnitude of the rotational flow). The fact that error magnitude is proportional to the amount of simulated rotation suggests that the visual system attributes rotational flow unaccompanied by an eye movement to a displacement of the direction of translation in the direction of the simulated eye rotation.
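To see why the heading error should scale with the amount of simulated rotation, one can compute where the retinal flow field vanishes when a yaw rotation is left uncompensated and folded into the translation estimate. The sketch below is a minimal illustration, not the authors' stimulus or analysis: it assumes forward translation toward a single fronto-parallel plane, and the speed, depth, and rotation rates are arbitrary values.

```python
import numpy as np

def pseudo_foe(omega, T=1.0, Z=10.0, half_width=0.5, n=2001):
    """Horizontal image position (tangent units) where flow vanishes for
    forward translation at speed T toward a fronto-parallel plane at depth Z,
    plus an uncompensated yaw rotation omega (rad/s). Parameters are assumed."""
    x = np.linspace(-half_width, half_width, n)
    # Horizontal flow on the horizontal meridian: u = x*T/Z - omega*(1 + x**2)
    u = x * T / Z - omega * (1.0 + x ** 2)
    return x[np.argmin(np.abs(u))]

# The apparent focus of expansion shifts by roughly omega*Z/T, i.e. in
# proportion to the simulated rotation, matching the reported error pattern.
for omega in (0.005, 0.01, 0.02):
    print(omega, pseudo_foe(omega), omega * 10.0 / 1.0)
```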

2.
In eight experiments, we examined the ability to judge heading during tracking eye movements. To assess the use of retinal-image and extra-retinal information in this task, we compared heading judgments with executed as opposed to simulated eye movements. In general, judgments were much more accurate during executed eye movements. Observers in the simulated eye movement condition misperceived their self-motion as curvilinear translation rather than the linear translation plus eye rotation that was simulated. There were some experimental conditions in which observers could judge heading reasonably accurately during simulated eye movements; these included conditions in which eye movement velocities were 1 deg/sec or less and conditions that made available a horizon cue, which exists for locomotion parallel to a ground plane with a visible horizon. Overall, our results imply that extra-retinal, eye-velocity signals are used in determining heading under many, perhaps most, viewing conditions.

3.
Useful medical diagnostic information has been reported from low-frequency rotational testing of the horizontal vestibulo-ocular reflex (VOR) of patients with vestibular disorders. Servocontrolled rotating systems have been used as the only practical method to generate stimuli over lower VOR frequency response ranges, the decade from 0.01 to 0.1 Hz. Active head movements have been used for testing the human VOR at higher frequencies, exceeding 0.5 Hz. We examined whether active head movements could also be used to test the VORs of subjects over lower frequency ranges, extending to 0.02 Hz. We used a swept-frequency, active head movement protocol to generate a broad-band stimulus. Eye position was recorded with electro-oculography. Head velocity was recorded with a rotational sensor attached to a headband. Six individual test epochs from human subjects were concatenated to form complex, periodic waveforms of head and eye velocity, 75 seconds in duration. Broad-band cross-spectral signal processing methods were used to compute horizontal VOR system characteristics from these waveforms extending from 0.02 to 2 Hz. The low-frequency VOR data appeared to originate from amplitude modulation of high-frequency active movements, acting as carrier signals. Control experiments and processing of simulated data from a known system excluded the possibility of signal processing artifacts. Results from six healthy subjects showed low-frequency gains and phase values in ranges similar to those from published rotational chair studies of normal subjects. We conclude that it is feasible to test the human VOR over extended low-frequency ranges using active head movements because of amplitude modulation of the head and eye signals.
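As a rough illustration of the broad-band cross-spectral approach, the sketch below estimates a gain and phase spectrum from synthetic head- and eye-velocity records; the sampling rate, noise level, and lag are assumptions, and the authors' exact windowing and artifact controls are not reproduced.

```python
import numpy as np
from scipy.signal import chirp, csd, welch

fs = 100.0                                  # sampling rate (Hz), assumed
t = np.arange(0, 6 * 75, 1 / fs)            # six 75-s epochs, concatenated

# Synthetic stand-ins for the recorded signals: a swept-frequency head
# velocity and a compensatory eye velocity with a small lag and some noise.
head_vel = chirp(t, f0=0.02, t1=t[-1], f1=2.0)
eye_vel = -0.9 * np.roll(head_vel, 5) + 0.05 * np.random.randn(t.size)

# Cross-spectral transfer-function estimate H(f) = P_he(f) / P_hh(f).
f, P_hh = welch(head_vel, fs=fs, nperseg=16384)
_, P_he = csd(head_vel, eye_vel, fs=fs, nperseg=16384)
H = P_he / P_hh

band = (f >= 0.02) & (f <= 2.0)             # frequency range reported
gain = np.abs(H[band])                      # VOR gain spectrum
phase_deg = np.degrees(np.angle(H[band]))   # VOR phase spectrum
```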

4.
When we make saccadic eye movements or goal-directed arm movements, there is an infinite number of possible trajectories that the eye or arm could take to reach the target. However, humans show highly stereotyped trajectories in which velocity profiles of both the eye and hand are smooth and symmetric for brief movements. Here we present a unifying theory of eye and arm movements based on the single physiological assumption that the neural control signals are corrupted by noise whose variance increases with the size of the control signal. We propose that in the presence of such signal-dependent noise, the shape of a trajectory is selected to minimize the variance of the final eye or arm position. This minimum-variance theory accurately predicts the trajectories of both saccades and arm movements and the speed-accuracy trade-off described by Fitts' law. These profiles are robust to changes in the dynamics of the eye or arm, as found empirically. Moreover, the relation between path curvature and hand velocity during drawing movements reproduces the empirical 'two-thirds power law'. This theory provides a simple and powerful unifying perspective for both eye and arm movement control.
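The core computation can be illustrated with a toy linear plant: if the motor command is corrupted by noise whose standard deviation scales with the command, the command sequence that reaches the target while minimizing endpoint variance can be found as a weighted least-squares problem. The plant dynamics, noise scale, and movement duration below are assumptions, and only the endpoint variance (not a post-movement window) is minimized, so this is a sketch of the principle rather than the paper's model.

```python
import numpy as np

# Toy second-order plant: state = [position, velocity], Euler-discretized.
dt, N = 0.01, 50                      # 10-ms steps, 0.5-s movement (assumed)
tau = 0.05                            # viscous time constant (assumed)
A = np.array([[1.0, dt], [0.0, 1.0 - dt / tau]])
B = np.array([0.0, dt])
c = 0.2                               # signal-dependent noise scale (assumed)

# Influence of command u_k on the final state: phi_k = A^(N-1-k) B.
phi = np.stack([np.linalg.matrix_power(A, N - 1 - k) @ B for k in range(N)])

# With noise std = c*|u_k|, the final-state variance is
# sum_k (c*u_k)^2 * ||phi_k||^2, a quadratic in u with diagonal weights W.
W = c ** 2 * np.sum(phi ** 2, axis=1)

# Constraint: the mean final state must be [target position, zero velocity].
M = phi.T                             # 2 x N
b = np.array([0.1, 0.0])

# Minimize u^T diag(W) u subject to M u = b (closed-form Lagrange solution).
Winv_MT = (M / W).T                   # rows phi_k / W_k
u = Winv_MT @ np.linalg.solve(M @ Winv_MT, b)

# Noise-free rollout of the optimal commands; the mean velocity profile
# comes out smooth, as the minimum-variance theory predicts.
x = np.zeros(2)
velocity = []
for k in range(N):
    x = A @ x + B * u[k]
    velocity.append(x[1])
```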

5.
Movements of the head and eyes are known to be intimately related. Eye position has also been shown to be closely related to the electromyographic activity of dorsal neck muscles; however, extraocular muscle proprioception has not generally been considered to play a part in the control of such movements. We have previously shown that, in the pigeon, imposed movements of one eye modify the vestibular responses of several dorsal neck muscles in ways that are dependent on stimulus parameters such as the amplitude and velocity of imposed eye movement. The present study examines more closely the interactions between imposed eye movements and different muscle pairs. The three neck muscle pairs studied each responded to afferent signals from the extraocular muscles in discrete and specific ways which appeared to be correlated with their different actions. Complementary effects of imposed eye movements in the horizontal plane were seen for both the complexus and splenius muscle pairs, with imposed eye movements in one direction producing the largest inhibition of the ipsilateral muscle's vestibular response and imposed eye movements in the opposite direction the largest inhibition of the contralateral muscle's vestibular response. During roll-tilt oscillation (ear-up/ear-down) in the frontal plane, similar complementary effects of imposed eye movement were seen in the complexus muscle pair, but the splenius muscle pair showed little tuning, with similar inhibition for imposed eye movement directed either upwards or downwards. In contrast to these complementary effects, the biventer cervicis muscle pair showed no vestibular modulation during vestibular stimulation in the horizontal plane and their spontaneous activity was not altered by imposed eye movement. During roll-tilt oscillation (ear-up/ear-down) in the frontal plane, imposed eye movement directed vertically upwards increased both muscles' vestibular responses and imposed eye movement directed vertically downwards inhibited both muscles' vestibular responses. Section of the ophthalmic branch of the trigeminal nerve (deafferenting the eye muscles) abolished the effects of imposed eye movement on the neck muscle pairs. In conjunction with further control experiments, these results provide compelling evidence that proprioceptive signals from the extraocular muscles reach the neck muscles and provide them with a functionally significant signal. We have previously shown that signals from the extraocular muscles appear to be involved in the control of the vestibulo-ocular reflex. It follows from the experiments reported here that proprioceptive signals from the extraocular muscles are also likely to be involved in the control of gaze.

6.
Eye movements during natural tasks suggest that observers do not use working memory to capacity but instead use eye movements to acquire relevant information immediately before it is needed. Results here, however, show that this strategy is sensitive to memory load and to observers' expectations about what information will be relevant. Depending upon the predictability of what object features would be needed in a brick sorting task, subjects spontaneously modulated the order in which they sampled and stored visual information, using working memory more when the task was predictable and reverting to a just-in-time strategy when the task was unpredictable and the memory load was higher. This self-organization was evidenced by subjects' sequence of eye movements and also by their sorting decisions following missed feature changes. These results reveal that attentional selection, fixations, and use of working memory reflect a dynamic optimization with respect to a set of constraints, such as task predictability and memory load. They also reveal that change blindness depends critically on the local task context, by virtue of its influence on the information selected for storage in working memory.

7.
Shared motor error for multiple eye movements
Most natural actions are accomplished with a seamless combination of individual movements. Such coordination poses a problem: How does the motor system orchestrate multiple movements to produce a single goal-directed action? The results from the current experiments suggest one possible solution. Oculomotor neurons in the superior colliculus of a primate responded to mismatches between eye and target positions, even when the animal made two different types of eye movements. This neuronal activity therefore does not appear to convey a command for a specific type of eye movement but instead encodes an error signal that could be used by multiple movements. The use of shared inputs is one possible strategy for ensuring that different movements share a common goal.

8.
Three experiments are reported in which Ss produced rapid wrist rotations to a target while the position of their eyes was being monitored. In Experiment 1, Ss spontaneously executed a saccadic eye movement to the target around the same time as the wrist began to move. Experiment 2 revealed that wrist-rotation accuracy suffered if Ss were not allowed to move their eyes to the target, even when visual feedback about the moving wrist was unavailable. In Experiment 3, wrist rotations were equally accurate when Ss produced either a saccadic or a smooth-pursuit eye movement to the target. However, differences were observed in the initial-impulse and error-correction phases of the wrist rotations, depending on the type of eye movement involved. The results suggest that aimed limb movements use information from the oculomotor system about both the static position of the eyes and the dynamic characteristics of eye movements. Furthermore, the information that governs the initial impulse is different from that which guides final error corrections.

9.
The synaptic organization of the saccade-related neuronal circuit between the superior colliculus (SC) and the brainstem saccade generator was examined in an awake monkey using a saccadic, midflight electrical-stimulation method. When microstimulation (50-100 microA, single pulse) was applied to the SC during a saccade, a small, conjugate contraversive eye movement was evoked with latencies much shorter than those obtained by conventional stimulation. Our results may be explained by the tonic inhibition of premotor burst neurons (BNs) by omnipause neurons that ceases during saccades to allow BNs to burst. Thus, during saccades, signals originating from the SC can be transmitted to motoneurons and seen in the saccade trajectory. Based on this hypothesis, we estimated the number of synapses intervening between the SC and motoneurons by applying midflight stimulation to the SC, the BN area, and the abducens nucleus. Eye position signals were electronically differentiated to produce eye velocity to aid in detecting small changes. The mean latencies of the stimulus-evoked eye movements were: 7.9 +/- 1.0 ms (SD; ipsilateral eye) and 7.8 +/- 0.9 ms (SD; contralateral eye) for SC stimulation; 4.8 +/- 0.5 ms (SD; ipsilateral eye) and 5.1 +/- 0.7 ms (SD; contralateral eye) for BN stimulation; and 3.6 +/- 0.4 ms (SD; ipsilateral eye) and 5.2 +/- 0.8 ms (SD; contralateral eye) for abducens nucleus stimulation. The time difference between SC- and BN-evoked eye movements (about 3 ms) was consistent with a disynaptic connection from the SC to the premotor BNs.

10.
Nitric oxide (NO) production by neurons in the prepositus hypoglossi (PH) nucleus is necessary for the normal performance of eye movements in alert animals. In this study, the mechanism(s) of action of NO in the oculomotor system has been investigated. Spontaneous and vestibularly induced eye movements were recorded in alert cats before and after microinjections in the PH nucleus of drugs affecting the NO-cGMP pathway. The cellular sources and targets of NO were also studied by immunohistochemical detection of neuronal NO synthase (NOS) and NO-sensitive guanylyl cyclase, respectively. Injections of NOS inhibitors produced alterations of eye velocity, but not of eye position, for both spontaneous and vestibularly induced eye movements, suggesting that NO produced by PH neurons is involved in the processing of velocity signals but not in eye position generation. The effect of neuronal NO is probably exerted on a rich cGMP-producing neuropil dorsal to the nitrergic somas in the PH nucleus. On the other hand, local injections of NO donors or 8-Br-cGMP produced alterations of eye velocity during both spontaneous eye movements and the vestibulo-ocular reflex (VOR), as well as changes in eye position generation exclusively during spontaneous eye movements. The target of this additional effect of exogenous NO is probably a well-defined group of NO-sensitive cGMP-producing neurons located between the PH and the medial vestibular nuclei. These cells could be involved in the generation of eye position signals during spontaneous eye movements but not during the VOR.

11.
Eye movements are often misdirected toward a distractor when it appears abruptly, an effect known as oculomotor capture. Fundamental differences between eye movements and attention have led to questions about the relationship of oculomotor capture to the more general effect of sudden onsets on performance, known as attentional capture. This study explores that issue by examining the time course of eye movements and manual localization responses to targets in the presence of sudden-onset distractors. The results demonstrate that for both response types, the proportion of trials on which responses are erroneously directed to sudden onsets reflects the quality of information about the visual display at a given point in time. Oculomotor capture appears to be a specific instance of a more general attentional capture effect. Differences and similarities between the two types of capture can be explained by the critical idea that the quality of information about a visual display changes over time and that different response systems tend to access this information at different moments in time.

12.
Presents a visual–spatial approach to the study of attention dysfunction. The hypotheses of broadened and narrowed attention were tested by comparing peripheral visual discrimination of 10 acute schizophrenic and 11 chronic schizophrenic inpatients and 16 normal Ss (hospital staff) within 2 regions of the functional visual field. Pairs of visual stimuli were presented at 4 display angles. Measures of response accuracy, response latency, and latency of eye movement to peripheral stimuli were obtained. Results indicate that acute schizophrenics generally discriminated peripheral signals more accurately than chronic schizophrenics or normals. Normals discriminated signals more accurately than chronic schizophrenics. Results suggest the differential use of selective strategies. Limitations in the use of peripheral information among chronic schizophrenics imply a reduction in the amount of information transmitted in a selective act and a reduction in the economy of selective activities. In contrast to normals, acute schizophrenics utilized more efficient selective strategies over a greater spatial area, implying greater transmission of information within discrete selective acts. Results also indicate that schizophrenics initiated eye movements earlier than normals and that response latency was greater for acute schizophrenics than for normals. Results are interpreted as providing partial support for P. H. Venables's (1964) theory of input dysfunction.

13.
We studied the eye movements evoked by applying small amounts of current (2-50 microA) within the oculomotor vermis of two monkeys. We first compared the eye movements evoked by microstimulation applied either during maintained pursuit or during fixation. Smooth, pursuitlike changes in eye velocity caused by the microstimulation were directed toward the ipsilateral side and occurred at short latencies (10-20 ms). The amplitudes of these pursuitlike changes were larger during visually guided pursuit toward the contralateral side than during either fixation or visually guided pursuit toward the ipsilateral side. At these same sites, microstimulation also often produced abrupt, saccadelike changes in eye velocity. In contrast to the smooth changes in eye velocity, these saccadelike effects were more prevalent during fixation and during pursuit toward the ipsilateral side. The amplitude and type of evoked eye movements could also be manipulated at single sites by changing the frequency of microstimulation. Increasing the frequency of microstimulation produced increases in the amplitude of pursuitlike changes, but only up to a certain point. Beyond this point, the value of which depended on the site and on whether the monkey was fixating or pursuing, further increases in stimulation frequency produced saccadelike changes of increasing amplitude. To quantify these effects, we introduced a novel method for classifying eye movements as pursuitlike or saccadelike. The results of this analysis showed that the eye movements evoked by microstimulation exhibit a distinct transition point between pursuitlike and saccadelike effects and that the amplitude of eye movement that corresponds to this transition point depends on the eye movement behavior of the monkey. These results are consistent with accumulating evidence that the oculomotor vermis and its associated deep cerebellar nucleus, the caudal fastigial nucleus, are involved in the control of both pursuit and saccadic eye movements. We suggest that the oculomotor vermis might accomplish this role by altering the amplitude of a motor error signal that is common to both saccades and pursuit.
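The abstract does not spell out the classification rule, so the sketch below is purely hypothetical: a simple velocity- and acceleration-threshold test of the kind often used to separate saccadic from smooth changes in eye velocity, with made-up threshold values rather than the authors' criteria.

```python
import numpy as np

def classify_evoked_movement(eye_pos, fs, vel_thresh=50.0, acc_thresh=2000.0):
    """Label an evoked eye-movement segment as 'saccadelike' or 'pursuitlike'.

    eye_pos: eye position samples (deg); fs: sampling rate (Hz).
    The thresholds (deg/s and deg/s^2) are illustrative assumptions,
    not the published classification method.
    """
    vel = np.gradient(eye_pos) * fs            # eye velocity (deg/s)
    acc = np.gradient(vel) * fs                # eye acceleration (deg/s^2)
    if np.max(np.abs(vel)) > vel_thresh and np.max(np.abs(acc)) > acc_thresh:
        return "saccadelike"
    return "pursuitlike"
```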

14.
The posterior parietal cortex has long been considered an 'association' area that combines information from different sensory modalities to form a cognitive representation of space. However, until recently little has been known about the neural mechanisms responsible for this important cognitive process. Recent experiments from the author's laboratory indicate that visual, somatosensory, auditory and vestibular signals are combined in areas LIP and 7a of the posterior parietal cortex. The integration of these signals can represent the locations of stimuli with respect to the observer and within the environment. Area MSTd combines visual motion signals, similar to those generated during an observer's movement through the environment, with eye-movement and vestibular signals. This integration appears to play a role in specifying the path on which the observer is moving. All three cortical areas combine different modalities into common spatial frames by using a gain-field mechanism. The spatial representations in areas LIP and 7a appear to be important for specifying the locations of targets for actions such as eye movements or reaching; the spatial representation within area MSTd appears to be important for navigation and the perceptual stability of motion signals.
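The gain-field idea can be captured in a few lines: a retinotopic tuning curve whose amplitude is scaled by a planar function of eye position. The tuning width, gain slope, and stimulus values below are assumptions chosen only to illustrate the mechanism, not parameters from the recordings.

```python
import numpy as np

def gain_field_response(retinal_x, eye_x, pref_x=0.0, sigma=10.0,
                        gain_slope=0.02, baseline=1.0):
    """Toy gain-field neuron: a Gaussian retinotopic tuning curve whose
    amplitude is scaled linearly by horizontal eye position.
    All parameters are illustrative assumptions (positions in degrees)."""
    tuning = np.exp(-(retinal_x - pref_x) ** 2 / (2 * sigma ** 2))
    gain = baseline + gain_slope * eye_x       # planar eye-position gain
    return tuning * gain

# The same retinal stimulus yields different responses at different gaze
# angles; across a population, this lets downstream areas read out
# head-centered (eye-position-corrected) stimulus location.
print(gain_field_response(retinal_x=5.0, eye_x=-20.0))
print(gain_field_response(retinal_x=5.0, eye_x=+20.0))
```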

15.
Over the past half century, research on human decision making has expanded from a purely behaviorist approach that focuses on decision outcomes, to include a more cognitive approach that focuses on the decision processes that occur prior to the response. This newer approach, known as process tracing, has employed various methods, such as verbal protocols, information search displays, and eye movement monitoring, to identify and track psychological events that occur prior to the response (such as cognitive states, stages, or processes). In the present article, we review empirical studies that have employed eye movement monitoring as a process tracing method in decision making research, and we examine the potential of eye movement monitoring as a process tracing methodology. We also present an experiment that further illustrates the experimental manipulations and analysis techniques that are possible with modern eye tracking technology. In this experiment, a gaze-contingent display was used to manipulate stimulus exposure during decision making, which allowed us to test a specific hypothesis about the role of eye movements in preference decisions (the Gaze Cascade model; Shimojo, Simion, Shimojo, & Scheier, 2003). The results of the experiment did not confirm the predictions of the Gaze Cascade model, but instead support the idea that eye movements in these decisions reflect the screening and evaluation of decision alternatives. In summary, we argue that eye movement monitoring is a valuable tool for capturing decision makers' information search behaviors, and that modern eye tracking technology is highly compatible with other process tracing methods such as retrospective verbal protocols and neuroimaging techniques, and hence it is poised to be an integral part of the next wave of decision research.

16.
Eye movements in reading and information processing: 20 years of research
Recent studies of eye movements in reading and other information processing tasks, such as music reading, typing, visual search, and scene perception, are reviewed. The major emphasis of the review is on reading as a specific example of cognitive processing. Basic topics discussed with respect to reading are (a) the characteristics of eye movements, (b) the perceptual span, (c) integration of information across saccades, (d) eye movement control, and (e) individual differences (including dyslexia). Similar topics are discussed with respect to the other tasks examined. The basic theme of the review is that eye movement data reflect moment-to-moment cognitive processes in the various tasks examined. Theoretical and practical considerations concerning the use of eye movement data are also discussed.

17.
Visual inputs to the brain are mapped in a retinocentric reference frame, but the motor system plans movements in a body-centered frame. This basic observation implies that the brain must transform target coordinates from one reference frame to another. Physiological studies revealed that the posterior parietal cortex may contribute a large part of such a transformation, but the question remains as to whether the premotor areas receive visual information, from the parietal cortex, readily coded in body-centered coordinates. To answer this question, we studied dorsal premotor cortex (PMd) neurons in two monkeys while they performed a conditional visuomotor task and maintained fixation at different gaze angles. Visual stimuli were presented on a video monitor, and the monkeys made limb movements on a panel of three touch pads located at the bottom of the monitor. A trial began when the monkey put its hand on the central pad. Then, later in the trial, a colored cue instructed a limb movement to the left touch pad if red or to the right one if green. The cues lasted for a variable delay, the instructed delay period, and their offset served as the go signal. The fixation spot was presented at the center of the screen or at one of four peripheral locations. Because the monkey's head was restrained, peripheral fixations caused a deviation of the eyes within the orbit, but for each fixation angle, the instructional cue was presented at nine locations with constant retinocentric coordinates. After the presentation of the instructional cue, 133 PMd cells displayed a phasic discharge (signal-related activity), 157 were tonically active during the instructed delay period (set-related or preparatory activity), and 104 were active after the go signal in relation to movement (movement-related activity). A large proportion of cells showed variations of the discharge rate in relation to limb movement direction, but only modest proportions were sensitive to the cue's location (signal, 43%; set, 34%; movement, 29%). More importantly, the activity of most neurons (signal, 74%; set, 79%; movement, 79%) varied significantly (analysis of variance, P < 0.05) with orbital eye position. A regression analysis showed that the neuronal activity varied linearly with eye position along the horizontal and vertical axes and could be approximated by a two-dimensional regression plane. These data provide evidence that eye position signals modulate the neuronal activity beyond sensory areas, including those involved in visually guided reaching limb movements. Further, they show that neuronal activity related to movement preparation and execution combines at least two directional parameters: arm movement direction and gaze direction in space. It is suggested that a substantial population of PMd cells codes limb movement direction in a head-centered reference frame.
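The two-dimensional regression described here amounts to fitting firing rate as a plane over horizontal and vertical eye position. The sketch below illustrates that fit on simulated trials; the baseline rate, slopes, and noise level are assumptions, not values from the study.

```python
import numpy as np

# Simulated trials: firing rate modulated linearly by eye position
# (baseline, slopes, and noise level are illustrative assumptions).
rng = np.random.default_rng(0)
eye_h = rng.uniform(-20, 20, 200)          # horizontal eye position (deg)
eye_v = rng.uniform(-20, 20, 200)          # vertical eye position (deg)
rate = 25.0 + 0.6 * eye_h - 0.3 * eye_v + rng.normal(0.0, 3.0, 200)

# Two-dimensional regression plane: rate ~ b0 + b_h*eye_h + b_v*eye_v.
X = np.column_stack([np.ones_like(eye_h), eye_h, eye_v])
b0, b_h, b_v = np.linalg.lstsq(X, rate, rcond=None)[0]
print(b0, b_h, b_v)                        # recovered intercept and slopes
```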

18.
According to Einstein's equivalence principle, inertial accelerations during translational motion are physically indistinguishable from gravitational accelerations experienced during tilting movements. Nevertheless, despite ambiguous sensory representation of motion in primary otolith afferents, primate oculomotor responses are appropriately compensatory for the correct translational component of the head movement. The neural computational strategies used by the brain to discriminate the two and to reliably detect translational motion were investigated in the primate vestibulo-ocular system. The experimental protocols consisted of either lateral translations, roll tilts, or combined translation-tilt paradigms. Results using both steady-state sinusoidal and transient motion profiles in darkness or during near-target viewing demonstrated that semicircular canal signals are necessary sensory cues for the discrimination between different sources of linear acceleration. When the semicircular canals were inactivated, horizontal eye movements (appropriate for translational motion) could no longer be correlated with head translation. Instead, translational eye movements totally reflected the erroneous primary otolith afferent signals and were correlated with the resultant acceleration, regardless of whether it resulted from translation or tilt. Therefore, at least for frequencies in which the vestibulo-ocular reflex is important for gaze stabilization (>0.1 Hz), the oculomotor system discriminates between head translation and tilt primarily by sensory integration mechanisms rather than frequency segregation of otolith afferent information. Nonlinear neural computational schemes are proposed in which not only linear acceleration information from the otolith receptors but also angular velocity signals from the semicircular canals are simultaneously used by the brain to correctly estimate the source of linear acceleration and to elicit appropriate oculomotor responses.
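One standard way to formalize this canal-otolith integration is an internal gravity estimate that is rotated by the canal-derived angular velocity and subtracted from the otolith signal; what remains is attributed to translation. The sketch below is a minimal version of that idea with assumed sign conventions and ideal, noise-free sensors, not the authors' specific nonlinear scheme.

```python
import numpy as np

def estimate_translation(gia, omega, dt, g0=np.array([0.0, 0.0, 9.81])):
    """Split a gravito-inertial acceleration signal (otolith-like, head frame)
    into gravity and translation using angular velocity (canal-like).

    gia:   (N, 3) gravito-inertial acceleration samples, m/s^2
    omega: (N, 3) angular velocity samples, rad/s, head frame
    Sign conventions and the initial gravity vector g0 are assumptions.
    """
    g_hat = g0.astype(float).copy()
    translation = np.zeros_like(gia, dtype=float)
    for k in range(len(gia)):
        # A world-fixed vector such as gravity appears to rotate by -omega
        # in head coordinates: dg/dt = -omega x g.
        g_hat = g_hat - np.cross(omega[k], g_hat) * dt
        # Whatever the otolith-like signal reports beyond the gravity
        # estimate is attributed to translational (linear) acceleration.
        translation[k] = gia[k] - g_hat
    return translation
```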

19.
A shaky hand holding a video camera invariably turns a treasured moment into an annoying, jittery memento. More recent consumer cameras thoughtfully offer stabilization mechanisms to compensate for our unsteady grip. Our eyes face a similar challenge in that they are constantly making small movements even when we try to maintain a fixed gaze. What should be substantial, distracting jitter passes completely unseen. Position changes from large eye movements (saccades) seem to be corrected on the basis of extraretinal signals such as the motor commands sent to the eye muscles, and the resulting motion responses seem to be simply switched off. But this approach is impracticable for incessant, small displacements, and here we describe a novel visual illusion that reveals a compensation mechanism based on visual motion signals. Observers were adapted to a patch of dynamic random noise and then viewed a larger pattern of static random noise. The static noise in the unadapted regions then appeared to 'jitter' coherently in random directions. Several observations indicate that this visual jitter directly reflects fixational eye movements. We propose a model that accounts for this illusion as well as the stability of the visual world during small and/or slow eye movements such as fixational drift, smooth pursuit and low-amplitude mechanical vibrations of the eyes.

20.
Binocular coordination of eye movements is essential for stereopsis (depth perception) and to prevent double vision. More than a century ago, Hering and Helmholtz debated the neural basis of binocular coordination. Helmholtz believed that each eye is controlled independently and that binocular coordination is learned. Hering believed that both eyes are innervated by common command signals that yoke the eye movements (Hering's law of equal innervation). Here we provide evidence that Hering's law is unlikely to be correct. We show that premotor neurons in the paramedian pontine reticular formation that were thought to encode conjugate velocity commands for saccades (rapid eye movements) actually encode monocular commands for either right or left eye saccades. However, 66% of the abducens motor neurons, which innervate the ipsilateral lateral rectus muscle, fire as a result of movements of either eye. The distribution of sensitivity to ipsilateral and contralateral eye movements across the abducens motor neuron pool may provide a basis for learning binocular coordination in infancy and adapting it throughout life.
