Sensory Systems Involved in Visual Processing


Sensory Systems

Systems involved in visual processing are constantly bombarded with stimuli. In order to function effectively, these systems must selectively attend to information that is pertinent to the task at hand, and several biological and cognitive functions are engaged in the execution of any visual task. Moreover, the visual system does not passively absorb all external stimuli; instead, it introduces systematic bias in order to attach meaning to stimuli and to process raw visual data into usable information (Rosenzweig et al., 2004). Several factors influence the processing of visual information, including visual masking (Macknik, 2006), the spatial location and color of stimuli (Grabbe & Pratt, 2004), the manner in which stimuli are grouped (Min-Shik & Cave, 1999), and the expectation of the presentation of visual stimuli (Anderson & Carpenter, 2006).

Visual masking occurs when visual targets become invisible through modification of the context in which they appear, without modification of the targets themselves (Macknik, 2006). Macknik (2006) described the minimum set of conditions necessary for maintaining awareness of the visibility of stimuli that are not being attended to. First, in order for targets to be visible, spatiotemporal edges must be present, and they must be encoded by fleeting bursts of spikes observable in the early visual system; visibility fails if these bursts are inhibited. Another requirement for visibility is a rise in activity within the visual hierarchy and further processing within the occipital lobe. Macknik also emphasizes the important role that lateral inhibition plays in visibility, because it produces interactions between spatially adjacent stimuli and shapes the temporal form of responses to stimuli. Lateral inhibition essentially acts as a filter that discards or enhances visual information before it is sent through the optic nerve to the brain. Lateral inhibition has also been shown to increase in strength for both monoptic and dichoptic stimuli throughout the visual hierarchy (Macknik, 2006).
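To make the filtering idea concrete, the sketch below models lateral inhibition as a simple center-surround operation on a one-dimensional luminance profile. This is an illustrative toy model only, not the circuit described by Macknik (2006); the inhibition strength and the step-edge input are assumptions chosen for demonstration.

```python
import numpy as np

# Toy model of lateral inhibition: each unit is excited by its own input and
# inhibited by a fraction of its neighbors' input. This is an illustrative
# sketch, not the mechanism described by Macknik (2006).

def lateral_inhibition(signal, inhibition=0.4):
    """Output of each unit = its input minus a fraction of its neighbors' input."""
    padded = np.pad(signal, 1, mode="edge")
    neighbors = padded[:-2] + padded[2:]          # left neighbor + right neighbor
    return signal - inhibition * neighbors / 2.0  # subtract averaged inhibition

# A step edge in luminance: a dark region followed by a bright region.
luminance = np.array([10, 10, 10, 10, 50, 50, 50, 50], dtype=float)
response = lateral_inhibition(luminance)
print(response)
```

In the output, the strongest and weakest responses fall on either side of the luminance step, which is the kind of edge enhancement classically attributed to lateral inhibition.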

Both color information and position information are involved in the processing of visual stimuli. Grabbe & Pratt (2004) examined whether these factors are equivalent in their influence on visual processing. In their study, participants observed a briefly flashed array of letters and were then asked to report a letter of a certain color from a specific region of the display; they were also asked to report any other letters they could remember from the presented stimuli. The results indicated that, among the additional letters reported, more letters shared the target's location than shared its color or were of a neutral color. This suggests that location information takes priority over color information when participants must perform letter selection based on these two factors. Furthermore, according to Grabbe & Pratt (2004), "position information had a unique role in top-down-guided visual selection, and that it predominates over color when selection is required on both dimensions." This priority of the spatial dimension emerges when the task instructions do not explicitly indicate that there is only one selection dimension; location appears to be the default dimension under these circumstances (Grabbe & Pratt, 2004).

The actual pathways along which spatial information and feature information, such as color, travel from the eye to the brain may be separate but parallel (Grabbe & Pratt, 2004). This would mean that there is a fundamental anatomical and functional difference in the way these types of information are processed. Furthermore, "selection happens by differential activation (excitation, inhibition, or both) of certain representations (not necessarily location representations)" (Grabbe & Pratt, 2004). As for the specific anatomical areas responsible for each kind of information, spatial information is routed into posterior parietal areas, while nonspatial information is directed into inferior temporal cortical areas.

Attentional focus may also help explain why location information took priority over color information in the study by Grabbe & Pratt (2004). When visual targets fall within the focus of attention, detection reaction times decrease and discrimination accuracy increases. The researchers suggested that, in top-down guided selection tasks, spatial attention is first directed to the appropriate region of the visual display, and color selection then follows (Grabbe & Pratt, 2004).

The mechanisms involved in feature-specific attention to color were investigated by Muller et al. (2006) through the examination of selective stimulus processing using an electrophysiological measure called the steady-state visual evoked potential (SSVEP). In this study, participants observed a display of intermingled red and blue dots that randomly and continually shifted their positions. The red and blue dots flickered at different frequencies, which elicited distinct SSVEP signals in the visual cortex. Selective attention to either the red or the blue dots enhanced the amplitude of the SSVEP at the corresponding flicker frequency. These signals were anatomically localized to early levels of the visual cortex through source modeling (Muller et al., 2006). This amplification of the signal associated with the attended color provides empirical evidence for the rapid identification of feature information during visual search tasks (Muller et al., 2006).
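The logic of frequency tagging can be illustrated with a short simulation: each color's flicker rate "tags" its response, so attention to one color should raise the spectral amplitude at that color's frequency. The sketch below uses synthetic data; the sampling rate, flicker frequencies, attentional gain, and noise level are all assumptions made for illustration, not the parameters used by Muller et al. (2006).

```python
import numpy as np

# Synthetic illustration of SSVEP frequency tagging. The flicker frequencies,
# attentional gain, and noise level are assumptions for this sketch, not
# values taken from Muller et al. (2006).

fs = 500.0                               # sampling rate in Hz (assumed)
t = np.arange(0, 4.0, 1.0 / fs)          # 4 s of simulated EEG
f_red, f_blue = 7.5, 12.0                # assumed tag frequencies for the two colors

attend_red_gain, ignore_gain = 1.5, 1.0  # attention boosts the attended stream
eeg = (attend_red_gain * np.sin(2 * np.pi * f_red * t)
       + ignore_gain * np.sin(2 * np.pi * f_blue * t)
       + np.random.randn(t.size))        # broadband noise

# SSVEP amplitude = spectral amplitude at each tag frequency.
spectrum = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

amp_red = spectrum[np.argmin(np.abs(freqs - f_red))]
amp_blue = spectrum[np.argmin(np.abs(freqs - f_blue))]
print(f"SSVEP amplitude at {f_red} Hz (attended): {amp_red:.3f}")
print(f"SSVEP amplitude at {f_blue} Hz (ignored):  {amp_blue:.3f}")
```

The attended frequency shows the larger amplitude, mirroring the attention-related SSVEP enhancement reported in the study.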

Perceptual grouping functions may also play an important role in determining location in visual selection tasks (Min-Shik & Cave, 1999). Theories of visual information processing generally posit two hierarchical, functionally independent mechanisms (Min-Shik & Cave, 1999): an early, preattentive, parallel mechanism and a later, attentive, serial one. The first mechanism is evident when participants detect a target with no increase in reaction time as the number of distractors increases, in tasks where the target is defined by a single feature such as color. Based on this observation, Treisman proposed the feature integration theory of attention (FIT).

This theory maintains that a preattentive stage of the visual system processes all the information pertaining to primitive visual features of stimuli, such as orientation, color, brightness, and depth (Min-Shik & Cave, 1999). This information is processed automatically and in parallel across the entire visual field. However, when object recognition must be based on a conjunction of features, spatial attention is required, and the preattentive stage cannot execute the required selection.
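The behavioral signature that motivates FIT, flat search times for single-feature targets versus set-size-dependent search times for conjunction targets, can be sketched with a toy simulation. The baseline reaction time, per-item cost, and noise level below are invented for illustration; they are not parameters from Treisman's work or from Min-Shik & Cave (1999).

```python
import numpy as np

# Toy simulation of feature vs. conjunction search slopes. All numbers are
# assumptions chosen for illustration.

rng = np.random.default_rng(0)
set_sizes = np.array([4, 8, 16, 32])     # number of items in the display

base_rt = 450.0                          # ms, assumed baseline
feature_slope = 0.0                      # preattentive, parallel: flat slope
conjunction_slope = 25.0                 # attentive, serial: ms cost per item
noise_sd = 20.0

def mean_rt(slope, n_trials=200):
    """Simulated mean RT per set size: baseline + slope * set size + noise."""
    rts = base_rt + slope * set_sizes[:, None] + rng.normal(0, noise_sd, (set_sizes.size, n_trials))
    return rts.mean(axis=1)

print("set size:          ", set_sizes)
print("feature search:    ", mean_rt(feature_slope).round(1))      # roughly flat
print("conjunction search:", mean_rt(conjunction_slope).round(1))  # rises with set size
```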

Min-Shik & Cave (1999) used this theory as a basis for their research into the role that perceptual grouping plays in visual processing. They contrasted two types of visual search models, distinguished by whether or not they emphasize the role of perceptual grouping. The first type proposes that visual search begins with a preattentive stage in which the visual field is segmented into distinct objects according to gestalt properties such as contiguity, similarity, and proximity. Attention can then operate on these preattentively organized perceptual units. In this class of visual search model, the location of the target is treated as equivalent to all its other properties, including color, movement, and shape (Min-Shik & Cave, 1999). The second type of visual search model proposes that location plays a special role in visual selection and that spatial information organizes the representations necessary for the search task (Min-Shik & Cave, 1999).

The study by Min-Shik & Cave (1999) demonstrated grouping processes based on the selection of certain locations. Their results indicated no significant evidence of a task-irrelevant color grouping effect in a simple feature search. However, the findings did indicate a location-based grouping effect in a conjunction search in which the target was defined by nonspatial features. Furthermore, "arranging elements into groups affected conjunction search but not feature search" (Min-Shik & Cave, 1999). Overall, these results support the idea that spatial attention operates in visual search tasks by inhibiting nonselected locations according to grouping principles rather than on an individual basis (Min-Shik & Cave, 1999).

Expectation also plays a crucial role in the processing of visual information, and this expectation depends almost exclusively on previous experience (Anderson & Carpenter, 2006). This is demonstrated empirically by the fact that expectation influences response time to a visual stimulus. Anderson & Carpenter (2006) used this observation as a basis for their investigation into the effects of experience on visual processing. In their study, the probability of a visual target changed suddenly during the experiment, and the latency of the eye movement response adapted and changed continuously as a result, eventually stabilizing in a way that reflected the new probability. The researchers modeled this change on the assumption that the brain discards old, irrelevant information about the probability of an event by a certain factor relative to new probability information (Anderson & Carpenter, 2006). This factor represents a compromise between responding rapidly and accurately to genuine changes in the environment and not hastily discarding information that may still be valuable (Anderson & Carpenter, 2006).
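One simple way to picture this kind of discounting, offered purely as an illustrative sketch and not as the model Anderson & Carpenter (2006) actually fit, is an exponential-forgetting estimate of target probability: on each trial the old estimate is down-weighted by a forgetting factor and the newest outcome fills the gap. The forgetting factor, trial counts, and probabilities below are assumptions.

```python
import random

# Generic exponential-forgetting estimate of target probability. This is an
# illustrative sketch only, not the model fitted by Anderson & Carpenter (2006).

random.seed(1)
forgetting = 0.95          # weight retained by old evidence on each trial (assumed)
p_estimate = 0.5           # initial belief that the target appears at location A

# 200 trials at p = 0.8, then a sudden switch to p = 0.2.
true_p = [0.8] * 200 + [0.2] * 200

for trial, p in enumerate(true_p, start=1):
    outcome = 1.0 if random.random() < p else 0.0
    # The old estimate decays by the forgetting factor; the new outcome fills the gap.
    p_estimate = forgetting * p_estimate + (1.0 - forgetting) * outcome
    if trial % 100 == 0:
        print(f"trial {trial:3d}: true p = {p:.1f}, estimate = {p_estimate:.2f}")
```

The estimate first tracks the original probability, then re-stabilizes near the new one after the switch, echoing the gradual adaptation of response latencies described in the study.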

Moreover, Anderson & Carpenter (2006) demonstrated that the visual environment…
