The ability of animals to visually memorize and categorize a large number of pictures is well established. Determining the kinds of information animals use to accomplish these goals has been more difficult. This experiment examined the contribution of spatial frequency information to picture memorization by pigeons. A series of grayscale pictures was notch-filtered to eliminate different portions of the pictures' spatial frequency spectrum. The results indicated that the higher spatial frequencies in the pictures were most important to accurate recognition, suggesting that the detection of fine detail at the high range of pigeon visual acuity was a critical component of the birds' memorized representations. Subsequent tests with band-pass and hybrid conflict stimuli confirmed this conclusion. It is suggested that cognitive and task demands may determine how spatial frequency is used by pigeons, with higher frequencies more important to item memorization, while lower spatial frequencies may contribute to categorization in other types of discrimination tasks.
Several long lines of research have focused on animals’ ability to visually memorize (Cook, Levison, Gillett, & Blaisdell, 2005; Fagot & Cook, 2006), categorize (Herrnstein, Loveland, & Cable, 1976), and extract conceptual information from (Cook, Kelly, & Katz, 2003) feature-rich pictorial stimuli. These pictures are commonly color or black and white photographs of scenes, people, animals, and other everyday items. Animals seem largely sensitive to the same kind of visual information as humans, readily carving nature into classes of objects that are bounded by perceptual similarity (Astley & Wasserman, 1992).
Determining which kinds of information animals use to accomplish this goal has been a difficult question to answer (Cook, Wright, & Drachman, 2013; Brooks, Ng, Buss, Marshall, & Freeman, 2013). This is especially true of birds, whose visual systems perform many of the same functions as those of mammals, yet, through divergent evolution, have an almost entirely different neural architecture (Husband & Shimizu, 2001). This neural divergence from mammals is reflected in their multifoveal retinal organization (Remy & Watanabe, 1993), primarily nuclear cortical structure, and a reliance on the collothalamic visual pathway for pattern recognition and other sophisticated visually guided tasks (Husband & Shimizu, 2001). A complete account of how such divergent organizational principles lead to a similarly rich visual interaction with the environment is important for creating a general account of visual cognition.
One general way in which visual processing has been hypothesized to function is by forming representations in terms of spatial frequency rather than as two-dimensional distributions of luminances (Campbell & Robson, 1968; DeValois & DeValois, 1988). Spatial frequency is a way to deconstruct the periodic distributions of light and dark across an image; high spatial frequencies correspond to features such as sharp edges and fine details, whereas low spatial frequencies correspond to features such as global shape and broader swaths of luminance. Physiologically, cells in several areas of the primate visual system, including the visual cortex (DeValois, Albrecht, & Thorell, 1982) and the lateral geniculate nucleus (Derrington & Lennie, 1984), are tuned to particular spatial frequencies.
Sensitivity to spatial frequency has been investigated in the neurophysiology of cells in the pigeon optic tectum within the collothalamic pathway (Jassik-Gerschenfeld & Hardy, 1979). Lesions within this pathway to the nucleus rotundus (Macko & Hodos, 1984) and the entopallium (Hodos, Macko, & Bessette, 1984) correspondingly disrupt visual acuity. Contrast sensitivity functions have been measured using both behavioral and physiological methods for several species of birds, and birds have been found to have poorer contrast sensitivity than humans (Hodos, 1993; Ghim & Hodos, 2006). Pigeons have much better visual acuity in peak viewing conditions than rats (Prusky & Douglas, 2005), though their acuity remains worse than that of primates (Hodos, 2012).
Although it is possible to gain a physiological or psychological index of visual system function with artificial sine-wave grating stimuli, it is also important to test animals with stimuli that approximate the kind of complexity and structure that occur in the world around them. For comparative psychologists, these naturalistic stimuli have often been real-world photographs, which capture the kinds and distributions of visual features found most often in the real world.
Recently, Lea, De Filippo, Dakin, and Meier (2013) reported that pigeons are more sensitive to low, rather than high, spatial frequency information when categorizing pictures of cat and dog faces. This was a surprising result, because previous experiments have suggested that while pigeons can demonstrate some sensitivity to global features in visual stimuli, they are dispositionally local processors, most readily attending to the small details within stimuli rather than their overall content (Cavoto & Cook, 2001). These local details to which pigeons appear most sensitive ought to be found in the high spatial frequencies of an image, and Lea et al. found the opposite. However, one potentially critical detail is that Lea et al. used a categorical discrimination in which the shared similarity across category members may have been most apparent to the birds at the global level.
Here, we examined how spatial frequency contributes to accuracy in a picture memorization task by pigeons. This task should require the birds to utilize the most diagnostic level of information at spatial scales critical to picture memorization, rather than to direct attention to features shared across images belonging to the same category. We speculated that when the experiment was refocused on picture memorization, pigeons would be more sensitive to high spatial frequency information. If we correctly assume that local details map onto high spatial frequency information, this result would be more in line with previous experiments that have tested pigeons' hierarchical attention (Cook, 2001).
To test this hypothesis, we adopted a method used previously to examine which spatial areas in pictures contain diagnostic information (Gibson, Wasserman, Gosselin, & Schyns, 2005; Gibson, Lazareva, Gosselin, Schyns, & Wasserman, 2007). In that method, pigeons were taught a task in which they had to classify a number of pictures. Once they reached a high level of accuracy, images were passed through random sets of different sized windows or “bubbles” that revealed only portions of each image on test trials. For example, if the base picture was a car, one test image might randomly contain a small portion of the door, a front wheel, and the windshield; another test image might reveal a view of the side paneling, the rear wheel, and the roof. Each revealed area was then correlated with classification accuracy. In such a procedure, areas that carry more information should be more strongly correlated with the birds’ ability to classify each picture than areas that carry less information.
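The bubbles logic described above can be sketched in a few lines. The following Python/NumPy fragment is our own illustration, not code from Gibson and colleagues; the function name, parameters, and their values are all hypothetical. It generates a random mask of Gaussian windows that reveals only scattered portions of a picture:

```python
import numpy as np

def bubbles_mask(shape, n_bubbles=5, sigma=10.0, seed=None):
    """Generate a random 'bubbles' mask: a sum of Gaussian windows that
    reveals only scattered portions of an image, in the spirit of the
    technique referenced in the text. Parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    rows, cols = shape
    yy, xx = np.mgrid[0:rows, 0:cols]
    mask = np.zeros(shape)
    for _ in range(n_bubbles):
        # Drop each Gaussian "bubble" at a random location.
        cy, cx = rng.integers(0, rows), rng.integers(0, cols)
        mask += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(mask, 0.0, 1.0)  # 1 = fully revealed, 0 = hidden
```

A test image would then be the pixelwise product `image * bubbles_mask(image.shape)`, and each revealed region could be correlated with classification accuracy across many random masks.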
In the present experiment, we applied similar logic to investigate the role of spatial frequency in picture perception. Thus, rather than dichotomously filtering images into “high” or “low” spatial frequencies, we utilized a selective set of notch filters, each of which removed a small portion of the frequency spectrum from an image and preserved the rest. Notch filtering creates much subtler changes to the pictures than more commonly employed techniques, which often leave the resulting picture looking quite different and unrealistic compared to the original. We reasoned that this filtering technique would allow for more robust testing across repeated exposures by avoiding this kind of generalization decrement.
To complement our experiments with these notch-filtered stimuli, we conducted two additional tests to determine to which frequency ranges the pigeons were most sensitive. Here, we manipulated the pictures to create two direct tests of the use of high and low spatial frequency information: band-pass high and low filtered pictures and conflict “hybrid” pictures (after Lea, De Filippo, Dakin, & Meier, 2013 and Oliva, Torralba, & Schyns, 2006). As mentioned, high-pass and low-pass filtered stimuli are the more traditional presentation of spatial frequency filtered stimuli, in which a large region of either high or low frequency space is removed. Despite our concerns, we thought it was important to include a test with these stimuli. Conflict stimuli, in which a stimulus has multiple competing sources of information that each elicit a different response, are frequently used in hierarchical tests of perception because they allow for direct competition between the sources of information. The resulting choice responses provide direct evidence as to which features more powerfully control discrimination. Our hybrid spatial frequency stimuli combined the low spatial frequencies from one image with the high spatial frequencies from a second image. Each of the two pictures combined in this way had been associated with a different learned response choice. Thus, pigeons that primarily used information from one range of spatial frequencies would classify the hybrid picture differentially based on that information.
Four 1-year-old male White Carneaux pigeons (Columba livia) were tested. They were individually maintained at 80-85% of their free-feeding weights in a 12:12 L:D colony with free access to grit and water.
Testing was conducted in a computer-controlled chamber. Stimuli were presented on a 30.5 x 22.9 cm LCD monitor operating at 1024 × 768 resolution visible behind a 26 × 18 cm infrared touchscreen (EloTouch; Harrisburg, PA) that recorded pecks. A ceiling light was illuminated at all times, except during time-outs. A central food hopper (Coulbourn Instruments) was located under the touchscreen. Stimulus manipulations were conducted in Matlab (Mathworks, Natick, MA). All other experimental events were controlled using custom-written programs in Visual Basic (version 6, Microsoft).
Discriminative stimuli consisted of 40 pictures (20 colored & 20 grayscale) 14.3 x 8.9 cm in size presented at 400 x 300 pixel resolution. These pictures consisted of a wide range of content, including landscapes, natural and man-made objects, people, and animals. Only the spatial frequencies of the grayscale images were altered in the experiments below.
Spatial frequency filtering was conducted using a custom Matlab script. Images were first transformed into frequency space using the fft2 function and shifted so that the zero-frequency component was located at the center of the spectrum using fftshift. These frequencies were then multiplied by a combined frequency filter of equivalent size (see below), after which the inverse shift (ifftshift) and inverse Fourier transform (ifft2) were applied. To equilibrate the luminance of each filtered stimulus, the filtered image was brightened or dimmed to match the luminance of the original image.
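As a rough illustration of this pipeline, the following Python/NumPy sketch mirrors the Matlab sequence described above (fft2 → fftshift → multiply by filter → ifftshift → ifft2). The function name and the additive mean-luminance matching are our own assumptions, not the authors' code:

```python
import numpy as np

def apply_frequency_filter(image, freq_filter):
    """Multiply an image's centered frequency spectrum by a filter, invert
    the transform, and rescale to the original mean luminance. A sketch of
    the pipeline described in the text (the original used Matlab);
    `freq_filter` is a 2-D array the same shape as `image` with values in
    [0, 1]."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))   # zero frequency at center
    filtered = spectrum * freq_filter                # remove selected bands
    result = np.real(np.fft.ifft2(np.fft.ifftshift(filtered)))
    # Luminance equilibration, here approximated as matching the mean.
    result += image.mean() - result.mean()
    return result
```

With an all-pass filter (all ones), this pipeline returns the original image unchanged, which is the logic behind the unfiltered control stimuli described later.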
Stimuli were notch-filtered by eliminating a single octave of the spatial frequency spectrum from the grayscale pictures. Octaves were defined as a range of spatial frequency values for which the upper end of the octave was twice the value of the lower end. To filter the images, we created concentric, ring-shaped filters the same size as the image. These filters, each of which represented a preserved spatial frequency band, were then joined to create a frequency filter. However, in each frequency filter, a selected ring of content (i.e., a “notch”) was left out to create a stimulus with part of the frequency spectrum removed. Each notch represented one of thirteen overlapping spatial frequency octaves (see Figure 1, Panel B for examples). Spatial frequencies tested ranged between 0.34 cycles/deg and 44.75 cycles/deg based on an estimated average viewing distance of 6 cm; because the pigeons were free to move in the operant chamber, this value reflects an assumption about the average distance from which they viewed the displays. These frequency ranges were controlled by manipulating the matrix generated by the Matlab freqspace function; so that these procedures can be replicated precisely, we have included the upper and lower bounds of each freqspace filter in Table 1 of the Supplemental Information.
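A minimal sketch of constructing one such ring-shaped notch filter follows, assuming radial frequencies normalized so that 1.0 is the Nyquist limit, in the spirit of Matlab's freqspace convention (the paper's exact notch bounds are in its Supplemental Information; the function name is ours). An octave notch simply sets the upper bound to twice the lower bound:

```python
import numpy as np

def notch_filter(shape, low, high):
    """Build a centered frequency-domain mask that removes one annular
    band (a 'notch') of normalized radial frequencies in [low, high) and
    preserves everything else. For an octave notch, high == 2 * low."""
    rows, cols = shape
    # Normalized frequency axes, with 0 at the center of the spectrum.
    fy = np.linspace(-1, 1, rows, endpoint=False)
    fx = np.linspace(-1, 1, cols, endpoint=False)
    radius = np.sqrt(fx[None, :] ** 2 + fy[:, None] ** 2)  # radial frequency
    mask = np.ones(shape)
    mask[(radius >= low) & (radius < high)] = 0.0          # carve out the notch
    return mask
```

Such a mask would be applied to the fftshift-centered spectrum before inverting the transform, as in the filtering pipeline described in the text.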
Band-pass stimuli were altered versions of eight grayscale pictures in which information was removed either between 4.20 and 44.75 cycles/deg (band-pass low stimuli) or between 0.34 and 4.20 cycles/deg (band-pass high stimuli; see Figure 2, panel A for examples). Each stimulus was tested separately with high or low spatial frequencies removed.
Hybrid stimuli were combinations of the band-pass pictures in which one low-frequency band-pass picture was superimposed or combined with a different high-frequency band-pass picture (see Figure 2, panel A for examples). Each of the pictures that were combined had been trained with a different choice response, so we could identify which portion of the spatial frequency spectrum was most important for each pigeon.
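A hybrid of this kind can be sketched as follows (Python/NumPy; the function name and the normalized `cutoff` value are illustrative stand-ins for the 4.20 cycles/deg split used in the actual stimuli):

```python
import numpy as np

def make_hybrid(image_a, image_b, cutoff=0.2):
    """Combine the low spatial frequencies of image_a with the high
    spatial frequencies of image_b, as in the hybrid-stimulus tests.
    `cutoff` is a normalized radial frequency (hypothetical value)."""
    rows, cols = image_a.shape
    fy = np.linspace(-1, 1, rows, endpoint=False)
    fx = np.linspace(-1, 1, cols, endpoint=False)
    radius = np.sqrt(fx[None, :] ** 2 + fy[:, None] ** 2)
    low_mask = (radius < cutoff).astype(float)   # keep lows from image_a
    high_mask = 1.0 - low_mask                   # keep highs from image_b
    spec_a = np.fft.fftshift(np.fft.fft2(image_a))
    spec_b = np.fft.fftshift(np.fft.fft2(image_b))
    hybrid = spec_a * low_mask + spec_b * high_mask
    return np.real(np.fft.ifft2(np.fft.ifftshift(hybrid)))
```

Because the two masks are complementary, combining an image with itself reconstructs the original; with two different images, each choice key's associated picture contributes only one half of the spectrum, which is what lets the choice response reveal which frequency range controls behavior.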
Pigeons were trained in a four-alternative forced-choice memory task. Each trial started with a peck to a centrally-located 2.5 cm circular white warning signal. This signal was replaced with a randomly selected training stimulus. Pigeons were required to peck at this picture a variable number of times (20 to 24 pecks). Upon completion of this observing response, four different colored squares appeared, which served as choice response keys. These colored keys were presented at the corners of the picture in a fixed location on every trial. One of the four choice options was randomly assigned to be correct for each picture, with ten pictures trained to each choice response. A single peck to a choice response removed all stimuli from the screen and initiated feedback. For baseline training trials, a correct choice resulted in 2.4 s access to mixed grain, while pecks to any other choice option resulted in a 5 s dark timeout. Trials were separated by a 3 s lighted interval following either consequence. At the time of spatial frequency testing, the pigeons were all highly experienced at this picture memorization task.
After pigeons reached asymptotic levels of accuracy, we began including test trials. We only conducted spatial frequency tests with the grayscale images (half of the stimulus set) in order to uniformly apply the spatial frequency filter and avoid artifacts due to filtering each color channel independently. These grayscale images had been introduced into baseline training more than five months in advance of these experiments in order to prepare for experiments in which spatial frequency content was varied. Test trials with notch-filtered stimuli were included as non-differentially reinforced probe trials; a peck to any choice key resulted in reinforcement. Test trials were otherwise identical to training trials. To ensure that the frequency filtering script itself did not create perceptual artifacts, we also included grayscale pictures that were run through the filtering algorithm with no spatial frequencies removed. The pigeons were tested for 100 sessions. Each session consisted of 120 trials (105 training trials, 13 test trials, and 2 control trials).
Band-pass and hybrid stimuli were inserted into training sessions as non-differentially reinforced probe trials. The pigeons were tested for 32 sessions on hybrid stimuli and 26 sessions on band-pass stimuli. Test sessions consisted of 120 training trials with 12 hybrid trials and 4 band-pass trials. An issue with response collection prevented us from analyzing several sessions of band-pass data in the high condition.
Before testing filtered stimuli, we confirmed each pigeon had reached a high level of accuracy at classifying each of the pictures. Overall accuracy at classifying the grayscale pictures was quite high (86.1%). Nevertheless, to ensure that meaningful results were analyzed during the notch-filter tests, we removed data from the small number of pictures that were consistently classified below 60% accuracy (for bird #1B, 4 pictures; for bird #2C, 2 pictures). Because we were measuring psychophysical functions and wanted as little noise as possible in our measurements, we also applied two general filters on the data. We removed data collected from the first 20 trials of each session because a small warm-up effect was observed (birds were 9% more accurate following the warm-up period). We also removed trials with reaction times of greater than 10 s, which were likely indicative of a loss of stimulus control. This 10 s filter resulted in the removal of less than 3.5% of total trials. With these warm-up and long response trials removed, overall accuracy was slightly higher (90.8%).
Accuracy results from notch-filtered grayscale pictures are presented for each pigeon in the top panel of Figure 1. Although each pigeon shows a different degree of sensitivity to the spatial frequency manipulation overall (for example, #4S shows relatively little deficit as a result of filtering, but #3O shows a large decline in accuracy), all show the same general reliance on high spatial frequency information.
To help equilibrate these individual differences, we normalized data from each of the pigeons by creating a Z-score across each of the spatial frequency notches that were tested. Although these transformed data are fundamentally equivalent to the untransformed data, this analysis creates a more accurate representation of the impact of each spatial frequency notch relative to discrimination with other notches across the birds. As a result, we have labeled the axis of this normalized analysis “relative importance” in the bottom panel of Figure 1.
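This normalization amounts to a per-bird z-score across the notch conditions. A minimal sketch follows (the rows-as-birds, columns-as-notches layout is our assumption):

```python
import numpy as np

def relative_importance(accuracy_by_notch):
    """Z-score each bird's accuracy across the notch conditions so that
    birds with different overall sensitivity can be compared on a common
    'relative importance' scale. Rows are birds, columns are notches
    (hypothetical layout)."""
    acc = np.asarray(accuracy_by_notch, dtype=float)
    means = acc.mean(axis=1, keepdims=True)
    stds = acc.std(axis=1, keepdims=True)
    return (acc - means) / stds
```

Each bird's row then has mean 0 and unit variance, so a notch's value reflects its impact relative to the other notches for that bird rather than the bird's overall accuracy.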
This analysis showed that each pigeon had a highly similar pattern of relative notch importance, with a characteristic ramp across the low values to a single peak of primary importance at the high spatial frequency notches. To statistically verify these trends, we conducted a linear mixed-effects model with spatial frequency notch as a fixed effect and bird as a random factor. This analysis revealed a highly significant effect of spatial frequency notch, F(12, 36) = 5.54, 95% CIs = .28, .68 (all significant effects in this paper were evaluated with an alpha level of .05). The peak at spatial frequency Notch 11 corresponds to the removal of spatial frequency information between 11.18 and 22.37 cycles/deg.
As can be seen in Figure 2 (bottom left panel), all of the birds demonstrated above-chance accuracy on high band-pass stimuli, while accuracy on low band-pass stimuli was barely above chance. To analyze these data, we used a linear mixed-effects model with filter condition (baseline, high, or low) as a fixed effect and bird as a random effect. This analysis confirmed a highly significant effect of filter condition, F(2, 6) = 137.80, 95% CIs = .81, .99. To examine the source of this effect, we conducted two-tailed paired t-tests across each of the eight pictures tested in each condition for each pigeon. Three of the four pigeons showed significantly higher discrimination with pictures that preserved high spatial frequency information than with pictures that preserved low spatial frequency information; the remaining bird (#4S) showed a marginally significant difference (p = .065). Thus, the pigeons showed a clear and significant tendency to recognize pictures in which high spatial frequencies were preserved better than those in which only low spatial frequencies were present.
Three of the four birds also showed significantly worse discrimination with the high spatial frequency items than with baseline items (for the remaining bird, #4S, p=.059). This generalization decrement from baseline to high spatial frequency pictures indicates that, although most of the discriminative information was present in the high spatial frequency ranges, the pigeons still clearly saw these images as perceptually altered from the originals.
Results from tests with conflict hybrid stimuli are presented in Figure 2 (bottom right panel). Because information from two pictures was superimposed in each stimulus, data are reported in terms of the birds’ likelihood of making one of three available choices: choice of the key corresponding to either the “high” or the “low” information present in the stimulus, as well as “irrelevant” responses, which correspond to the likelihood of making a response to one of the two alternative (i.e., incorrect) keys. To locate these various response alternatives on the same scale, the “irrelevant” response rate, which has a 50% likelihood of occurring by chance, was halved to align it with the “high” and “low” responses, which each have a 25% likelihood of occurring by chance.
As one might expect from the results with notch filtering and band-pass filtering, most of the discriminative responses made by the pigeons were to the high spatial frequency content of these hybrid images. To analyze these data, we used a linear mixed-effects model to investigate response likelihood with response key as a fixed factor and bird as a random factor. This analysis revealed a highly significant effect of response key, F(2, 6) = 76.13, 95% CIs = .69, .98, indicating that the birds were more strongly controlled by the high spatial frequency content of the images. To confirm these effects, pairwise t-tests were conducted between the likelihood of making each type of choice (“high”, “low”, or “irrelevant”) for each bird across the 48 total hybrid stimuli. These tests confirmed significant differences in each pigeon between both the “High” vs. “Low” and the “High” vs. “Irrelevant” comparisons, but showed no significant difference between the likelihood of making a “Low” response and an “Irrelevant” response. Thus, while the choice responses of each bird were clearly influenced by the high spatial frequency information present in each conflict stimulus, it appears that the low spatial frequency information provided essentially no more control over responding than would be expected by chance.
Three measures examining the spatial frequency sensitivity of pigeons indicated that the higher frequencies made the greatest contribution to accurate identification in a picture memorization task. First, after applying a notch filter across the different spatial frequencies that comprised memorized pictures, the pigeons showed a strong reliance on high spatial frequency content, peaking between 11 and 22 cycles/deg. Compared to acuity estimates with similarly young birds, this falls into the range of maximal acuity for the frontal visual field of about 12-18 cycles/deg (Hodos, 2012). Thus, it appears as though the pigeons were using the highest spatial frequency information they could extract in order to memorize the pictures.
Second, when we next tested with band-pass filtered stimuli in which only the high (which contained the optimum range noted above) or low (which removed the optimum range noted above) spatial frequencies were preserved in the pictures, the pigeons performed well above chance at recognizing the stimuli in which high spatial frequencies were preserved, but were quite poor at recognizing the stimuli in which only low spatial frequencies were preserved. Third, when we tested with hybrid stimuli in which the birds were shown pictures consisting of conflicting low and high spatial frequency information from different pictures, pigeons were controlled by the high spatial frequency content of such stimuli. Thus, across each of these three separate tests, we found converging and convincing evidence that the pigeons were memorizing and recognizing these pictorial stimuli primarily based on their high spatial frequency content.
These results with photographic stimuli nicely accord with other approaches to investigating the spatial scale at which pigeons preferentially process items. Previous studies have found that pigeons most readily attend to the local level of hierarchical stimuli (Cavoto & Cook, 2001) and demonstrate a characteristic insensitivity to occluded objects without the use of specialized training, implying that they have difficulty identifying objects when specific details are missing (Fujita & Ushitani, 2005; but see Nagasaka, Lazareva, & Wasserman, 2007). While it is possible to train pigeons to report information at the global scale (Cook, Goto, & Brooks, 2005), and pigeons do show some sensitivity to global form in well-controlled experiments (Goto, Wills, & Lea, 2004), the majority of pigeon research seems to support a particulate view that agrees with our finding on high spatial frequency reliance. This focus on high spatial frequency information may be why pigeons are disproportionately sensitive to local information in hierarchical displays and also why it is so difficult to train pigeons to report information that requires integrating multiple local features into global percepts.
These previous studies, as with the current one, have dealt mainly with representation rather than perception. Identifying what level of detail pigeons use to classify a set of memorized pictures involves a number of stages in the identification process, including both the scale at which features were first encoded and the scale at which features are recognized during recall. Thus, although our pigeons may be perfectly capable of processing the low spatial frequency information present in each picture, they do not naturally do so when presented with high spatial frequency information. Instead, they preferentially attend to the local features present in each image.
Despite this local processing, one interesting finding of the present research is that even the most important frequencies for each bird did not carry all of the information. That is, the birds were not simply attending to some singular detail in the pictures, even though such a strategy would be perfectly suited to the kind of memorization task in which they were engaged. This fact is evidenced by several separate findings. First, accuracy with notch-filtered stimuli did not fall to chance, even at the most sensitive spatial frequency ranges; one might expect that if the birds were relying on a single feature that occupied a specific spatial frequency level, they would show chance responding when that feature was removed. Second, there was a clear decrement on both high-frequency band-pass and hybrid stimuli when compared to baseline. So, despite the fact that the birds seemed not to rely on the low spatial frequencies to inform their choices, they clearly noticed the removal of the concomitant low spatial frequency information. Thus, although the birds primarily relied on high spatial frequencies, they did not do so exclusively.
One interesting question to consider is how generalizable this high spatial frequency reliance might be to other experimental contexts. In the current task, we asked our pigeons to memorize arbitrary sets of pictures, but in many experiments, shared perceptual similarity among the items is used to promote categorization or concept learning. While it is possible that pictures serving as items in categorical tasks share some of the same high spatial frequency details, it is also likely that some of the shared elements that mediate successful transfer between category members are found at lower spatial frequencies (Soto & Wasserman, 2010). Indeed, both pigeons and people asked to classify increasing numbers of pictures show characteristic gains in the amount of generalization that they demonstrate to new members of the same class (Wasserman & Bhatt, 1992). This may be because categorization promotes differential attention to other spatial scales or features in the same stimulus. In a memorization task using arbitrary items (or very few items), identifying and attending to the featural level at which the pictures are most distinct (and therefore, least confusable) is the most efficient strategy. So, it may be that the sensitivity to high spatial frequencies is best revealed when asking birds to memorize, whereas sensitivity to lower frequencies is revealed with different cognitive demands, such as categorization.
While we know of no direct evidence that tests this hypothesis about memorization vs. categorization promoting attention to new spatial scales in animals, there is good evidence that both animals and humans change the way that they process stimuli depending on the task used to present them. For example, when both animals and humans are taught to complete different kinds of tasks with equivalent stimuli, they often process those stimuli in different ways and attend to different details of those same stimuli (e.g., Gibson et al., 2005). This attentional effect is almost certainly true in the spatial frequency domain, where different kinds of spatial frequencies might be critically important for performing different kinds of scene categorizations (Oliva & Torralba, 2001), a task that pigeons have recently been shown to accomplish (Kirkpatrick, Bilton, Hansen, & Loschky, 2014). It would not be unreasonable to suggest that in cases where animals are forced to categorize across a larger set of objects, the most obvious similarities between those objects lie not in the individual features that might strongly identify each class member, but instead in the shared space that robustly identifies each as a member of a particular class.
It is also possible that, apart from task context, different species of animals may preferentially encode different kinds of information in pictorial stimuli. This may be due to neurophysiological differences in visual systems, or the natural history of each animal and the spatial scale at which important items exist for them. But, whether this is due to a species or phylogenetic difference is difficult to isolate. Most of the experiments that have examined visual cognition in birds have used relatively few species, such as pigeons, though several groups have examined avian visual cognition in a variety of other species (Emery & Clayton, 2004). Pigeons in particular may be environmentally local animals because the majority of their food is in the form of seeds which must be picked out against a background of similar looking dirt and gravel. Conducting more comparative tests in the future will be useful to more broadly address whether these picture perception results are typical of pigeons specifically or of birds generally. It is also possible that attention to high spatial frequency during picture memorization is a general strategy of all animals with complex visual systems, even those with substantially different neural and retinal architecture.
Birds as a class of laterally-eyed animals often have two foveae or fovea-like areas in each eye, one for the monocular lateral field of view, and one for the binocular frontal field of view (Remy & Watanabe, 1993). The two visual fields project to different neural pathways (Shimizu & Bowers, 1999) and may serve different ecological functions: the lateral visual field is well-disposed for detecting distant, moving objects (Hayes, Hodos, Holden, & Low, 1987; Maldonado, Maturana, & Varela, 1988; Martinoya, Rivaud, & Bloch, 1983), whereas the frontal field may be better suited for nearby pattern-recognition tasks (Nye, 1973). Given this, it is possible that each visual field relies on a different range of the spatial frequency spectrum to process the relevant aspects of a visual scene, although acuity experiments using a mandibulation procedure have estimated that the lateral and frontal fields may have similar degrees of acuity (Hahmann & Güntürkün, 1993). In future research, it will be important to investigate the spatial frequency sensitivity of these visual fields to better understand the perceptual processes underlying avian vision.
Identifying the features that control behavior in experimental tasks that use complex stimuli is essential for asking deeper cognitive questions. For example, Fagot and Cook (2006) estimated the size of pigeon and baboon memory to be in the hundreds or thousands of exemplars. Yet, to understand what that number of memorized items represents, it is critical to understand which features most strongly comprise the cognitive representations of such exemplars. To answer questions about these representations means traversing through the high dimensional space that comprises naturalistic pictures. To more efficiently address these issues, we have recently begun to explore novel methods of generating and choosing stimuli to test, allowing the pigeons to subselect features by evolving discriminanda with the use of a genetic algorithm (Cook & Qadri, 2013). Such adaptive testing procedures have recently been used to study more complex, scene-like structures using neural responses (Vaziri, Carlson, Wang, & Connor, 2014). This genetically inspired technique allows more focused testing in profitable areas of this large feature space.
It is important, however, when studying cognitive processes such as memory and categorization, to use stimuli like photographs that approximate the complexity and diversity of the natural world. Using such stimuli, we gain ecological validity and a better understanding of the intricacy of these cognitive processes, at the cost of some loss of experimental control. Combining such analytic approaches with high-dimensional stimuli may be a useful tool for understanding visual cognition in a wide range of species. Conducting further studies with birds, which have such different neurophysiological organization, provides an exciting opportunity to study the generalities of the cognitive importance of visual information in the spatial frequency domain.
This research was supported by a grant from the National Eye Institute (#RO1EY022655). The authors thank Ashlynn Keller and M. Ali Qadri for their helpful comments on previous drafts.
Partial filters created by using the Matlab freqspace function were assembled into frequency filters that were applied to the entire image.
All images included the highest (.999435 to 1.000) and lowest (0.000 to 0.007705) spatial frequency ranges. Due to the additive filtering method, several notches had additional, randomly located gaps that were imperceptibly small (notch sizes of less than .0003). As discussed in the text, a secondary control set of images was tested in which images were passed through our filtering program with no programmed notches removed; birds showed no decrement on those images.
To convert these notch filters to cycles per degree, frequencies were halved and multiplied by the total image size in degrees of visual angle (89.55) for an assumed 6 cm viewing distance.
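As a worked check of this conversion (Python; the function name is ours), halving a normalized freqspace frequency and multiplying by the 89.55-degree image extent reproduces the bounds reported in the text, e.g. 0.007705 → ≈0.34 cycles/deg and 0.999435 → ≈44.75 cycles/deg:

```python
def to_cycles_per_degree(normalized_freq, image_degrees=89.55):
    """Convert a normalized freqspace frequency (1.0 = Nyquist) to
    cycles/deg using the conversion stated in the text: halve the value,
    then multiply by the image's total extent in degrees of visual angle
    (89.55 deg at the assumed 6 cm viewing distance)."""
    return normalized_freq / 2.0 * image_degrees
```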
Matthew S. Murphy, Salem State University.
Daniel I. Brooks, Tufts University.
Robert G. Cook, Tufts University.