1.  Between the waves: Harvard Pigeon Lab 1955-1960. 
Six or more graduate students were active in and around the Pigeon Lab in the spring of 1955 and also in 1960, but when I arrived in the fall of 1955 there were none. Looking back in 2001 at that period, I can appreciate the unique opportunity for research that I had, as well as the exceptional and productive groups of graduate students who had recently finished their study and research and those who would carry on the tradition of excellence.
doi:10.1901/jeab.2002.77-319
PMCID: PMC1284865  PMID: 12083684
2.  Duration and rate of reinforcement as determinants of concurrent responding
The duration and frequency of food presentation were varied in concurrent variable-interval variable-interval schedules of reinforcement. In the first experiment, in which pigeons were exposed to a succession of eight different schedules, neither relative duration nor relative frequency of reinforcement had as great an effect on response distribution as they have when they are manipulated separately. These results supported those previously reported by Todorov (1973) and Schneider (1973). In a second experiment, each of seven pigeons was exposed to only one concurrent schedule in which the frequency and/or duration of reinforcement differed on the two keys. Under these conditions, each pigeon's relative rate of response closely matched the relative total access to food that each schedule provided. This result suggests that previous failures to obtain matching may be due to factors such as an insufficient length of exposure to each schedule or to the pigeons' repeated exposure to different concurrent schedules.
doi:10.1901/jeab.1977.28-145
PMCID: PMC1333626  PMID: 16812021
concurrent schedules; amount of reinforcement; rate of reinforcement; choice; pigeons
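The matching result from the second experiment can be written compactly. As an illustrative sketch (the symbols are not taken from the paper), let B_1 and B_2 be the response rates on the two keys, r_1 and r_2 the reinforcement frequencies, and d_1 and d_2 the reinforcer durations, so that relative total access to food is the relative product of frequency and duration:

\frac{B_1}{B_1 + B_2} = \frac{r_1 d_1}{r_1 d_1 + r_2 d_2}

On this reading, varying frequency or duration alone is the special case in which the other factor is equal on the two keys.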
3.  An analysis of rats' drinking-tube contacts under tandem and fixed-interval schedules of food presentation
Rats' lever presses and drinking-tube contacts were studied under fixed-interval schedules of food presentation and under a tandem schedule composed of three fixed intervals. One group of rats was exposed first to the tandem schedule, next to fixed-interval schedules of comparable interpellet intervals, and once again to the tandem schedule; a second group of rats was exposed first to a fixed-interval and then to the tandem schedule. Under the tandem schedule, lever presses occurred at a higher rate and were more uniformly distributed in time than under the fixed-interval schedule. Tube contacts emitted by rats exposed first to a fixed-interval schedule consisted mostly of tongue contacts, which occurred at a high rate shortly after food; tube contacts emitted by rats exposed first to the tandem schedule consisted mostly of paw contacts, which occurred at a lower rate at times other than shortly after food. Changing the schedule from fixed interval to tandem decreased the frequency of tongue contacts for all rats. Under schedules of food presentation with comparable interpellet intervals, the schedule of food presentation, rather than the rate of food delivery per se, determined the topography and temporal locus of drinking-tube contacts.
doi:10.1901/jeab.1976.25-361
PMCID: PMC1333475  PMID: 16811920
schedule-induced drinking; drinking-tube contacts; tandem schedules; fixed-interval schedules; lever pressing; rats
4.  Short-component multiple schedules: effects of relative reinforcement duration
Pigeons were exposed to multiple variable-interval 2-min variable-interval 2-min schedules of food presentation in which relative duration of food presentation was manipulated. When components alternated every 5 sec and were scheduled on separate response keys, relative response rates closely matched relative reinforcement duration in three of four pigeons. On the other hand, relative response rates were insensitive to relative reinforcement duration when components scheduled on a single response key alternated every 5 sec, and when components scheduled on separate response keys alternated every 2 min. Thus, both rapid alternation and spatial separation of components were necessary to produce approximate matching of relative responding to relative reinforcement duration. This finding contrasts with previous findings that only rapid component alternation is necessary for matching when relative rate of reinforcement is manipulated.
doi:10.1901/jeab.1975.24-183
PMCID: PMC1333397  PMID: 16811870
reinforcement duration; component duration; relative response rates; multiple schedules; key peck; pigeons
5.  Behavioral interactions in multiple variable-interval schedules
In Experiment I, two groups of four pigeons each were exposed to multiple schedules in which one component was always a variable-interval schedule with a mean interreinforcement interval of 30 or 180 seconds. The other component was either an equal variable-interval schedule or extinction. Response rates in the unchanged component always increased when reinforcement was no longer scheduled in the changed component, and decreased in seven of eight cases when the variable-interval schedule was re-introduced. The per cent rate change in the unchanged component was inversely related to the frequency of reinforcement and to the ongoing response rate in the unchanged component. Rate changes in the unchanged component were not consistently correlated with changes in any single feature of the relative-frequency interresponse-time distributions. In Experiment II, the same pigeons were exposed to variable-interval schedules and multiple variable-interval variable-interval schedules with equal mean interreinforcement intervals. Response rates were similar under both conditions.
doi:10.1901/jeab.1974.22-471
PMCID: PMC1333295  PMID: 16811810
6.  A detailed analysis of the effects of d-amphetamine on behavior under fixed-interval schedules
Pigeons were exposed to fixed-interval schedules of food reinforcement with durations of 300 sec, 100 sec, or 40 sec. A range of doses of d-amphetamine was administered to each pigeon, and the resulting behavior was analyzed at several levels of detail. Average rates in different portions of the intervals predicted the magnitude of the drug's effect, but a finer analysis showed that average rates did not adequately characterize the behavior in some parts of the intervals. The probability of responding in different parts of an interval without drug was also a good predictor of the magnitude of the effect of d-amphetamine, and at the same time was more descriptive of the interval-to-interval performance. Analyses of the control performance indicated that responding in individual intervals could be described as consisting of two parts: a very low, or zero, rate at the beginning of the interval followed by an abrupt transition to a slightly, but reliably, positively accelerated rate maintained until reinforcement.
doi:10.1901/jeab.1974.21-519
PMCID: PMC1333225  PMID: 4838199
7.  Second-order schedules with fixed-ratio components: variation of component size
Key pecking by pigeons was reinforced with food under second-order schedules with fixed-ratio units. A constant total number of key pecks was required for reinforcement under each condition, but the size and, inversely, number of fixed-ratio components were varied. The total response requirement of 256 pecks was divided into fixed-ratio units of 128, 64, 32, 8, and 2 responses. A brief stimulus, which always preceded food reinforcement, was presented upon completion of each fixed-ratio unit. Under most conditions, the pattern of within-unit responding was typical of that under simple fixed-ratio schedules. Overall response rate was an inverted U-shaped function of component size. That is, response rates were highest under moderate-sized units (fixed-ratio 128 and 64). This relationship is consistent with previous determinations of rate as a function of fixed-ratio value for simple fixed-ratio schedules.
doi:10.1901/jeab.1971.15-303
PMCID: PMC1333841  PMID: 16811516
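As a quick arithmetic note on the design: because the total requirement was held at 256 pecks, unit size and number of units per reinforcer varied inversely, so units of 128, 64, 32, 8, and 2 responses correspond to 2, 4, 8, 32, and 128 brief-stimulus presentations per food delivery (e.g., 256 / 64 = 4).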
8.  Information on conditioned reinforcement 
doi:10.1901/jeab.1970.14-361
PMCID: PMC1333748
9.  Stimulus control and the response-reinforcement contingency
Pigeons were trained under a schedule in which reinforcement was made available at varying periods of time after a prior reinforcement. The first key peck after a reinforcer became available started a timer, and a second key peck, separated from the first by at least a specified minimal time interval, produced the reinforcer. It was shown that a contingency that contains a minimal interresponse time does not necessarily weaken stimulus control by an exteroceptive stimulus.
doi:10.1901/jeab.1969.12-561
PMCID: PMC1338648  PMID: 16811376
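A minimal sketch of the contingency described above, under stated assumptions (the function name and parameters are illustrative, and the abstract does not say whether a premature second peck resets the timer; this sketch assumes it does not):

    # Illustrative sketch, not the authors' procedure verbatim.
    def reinforced_peck(peck_times, available_at, min_irt):
        """peck_times: sorted times of key pecks since the last reinforcer (s).
        Returns the time of the peck that produces the reinforcer, or None."""
        timer_start = None
        for t in peck_times:
            if t < available_at:
                continue                  # pecks before availability are ineffective
            if timer_start is None:
                timer_start = t           # first peck after availability starts the timer
            elif t - timer_start >= min_irt:
                return t                  # a peck at least min_irt after the first is reinforced
        return None

    # Example: reinforcer available 20 s after the last one, minimal interval 5 s
    print(reinforced_peck([12.0, 21.3, 23.0, 27.1], available_at=20.0, min_irt=5.0))  # 27.1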
10.  Some effects of discriminative training with equated frequency of reinforcement
Pigeons were exposed to a multiple schedule which provided equally frequent reinforcement in the presence of two stimuli but which produced markedly different rates of key-pecking. Generalization gradients were displaced away from the stimulus associated with the lower rate of key-pecking. Another group of pigeons had similar training, except that a low rate of key-pecking was established in a stimulus with a much higher frequency of food reinforcement. In this case, the generalization gradients were not affected by the training on the schedule producing a low response rate.
doi:10.1901/jeab.1968.11-415
PMCID: PMC1338503  PMID: 5672250
11.  The relation between response rates and reinforcement rates in a multiple schedule
In a multiple schedule, exteroceptive stimuli change when the reinforcement schedule is changed. Each performance in a multiple schedule may be considered concurrent with other behavior. Accordingly, two variable-interval schedules of reinforcement were arranged in a multiple schedule, and a third, common variable-interval schedule was programmed concurrently with each of the first two. A quantitative statement was derived that relates as a ratio the response rates for the first two (multiple) variable-interval schedules. The value of the ratio depends on the rates of reinforcement provided by those schedules and the reinforcement rate provided by the common variable-interval schedule. The following implications of the expression were evaluated in an experiment with pigeons: (a) if the reinforcement rates for the multiple variable-interval schedules are equal, then the ratio of response rates is unity at all reinforcement rates of the common schedule; (b) if the reinforcement rates for the multiple schedules are unequal, then the ratio of response rates increases as the reinforcement rate provided by the common schedule increases; (c) the limit of the ratio is equal to the ratio of the reinforcement rates. Satisfactory confirmation was obtained for the first two implications, but the third was left in doubt.
doi:10.1901/jeab.1968.11-271
PMCID: PMC1338485  PMID: 5660708
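One expression consistent with the three implications above, offered only as a hedged reconstruction (assuming each multiple-schedule response rate takes a relative-rate form in which the common schedule's reinforcement rate r_c appears in the denominator), is

B_1 = \frac{k r_1}{r_1 + r_c}, \qquad B_2 = \frac{k r_2}{r_2 + r_c}, \qquad \frac{B_1}{B_2} = \frac{r_1 (r_2 + r_c)}{r_2 (r_1 + r_c)}

With r_1 = r_2 the ratio equals 1 for every value of r_c (implication a); with unequal reinforcement rates the ratio moves monotonically away from 1 as r_c increases (implication b); and as r_c \to \infty the ratio approaches r_1 / r_2, the ratio of the reinforcement rates (implication c).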
12.  Some effects on generalization gradients of tandem schedules
The relationship between training conditions and stimulus generalization gradients was examined using tandem schedules of reinforcement. Schedules were selected so that frequency of reinforcement and rate of responding were varied somewhat independently of each other. A peak shift in the generalization gradient was obtained when extinction had been associated with one of the stimuli. No comparable peak shift was obtained when there were equal response rates in the training stimuli even with dissimilar frequencies of reinforcement. The data imply that response rates at the end of training, rather than reinforcement frequency per se, determine the characteristics of the generalization gradient.
doi:10.1901/jeab.1966.9-631
PMCID: PMC1338256  PMID: 5970384
13.  The relations among measures of performance on fixed-interval schedules
Quantitative measures of the performances of seven rats and two pigeons under FI schedules of reinforcement were obtained. For the rats (under FI 2 and FI 100 sec) the mean response rate and two measures of the temporal distribution of responses within the interval (quarter-life and an Index of Curvature) were computed for individual intervals. The measures of curvature were highly correlated with each other, whereas the response rate was only moderately correlated with either of them. Similar results were found for comparisons of the same measures on a session-by-session basis. The performances of the pigeons (under FI 10) were analyzed to yield response rate, quarter-life and elapsed time to the first, fifth and tenth response. Response rate was only moderately correlated with quarter-life, whereas quarter-life and time to the fifth or tenth response were highly correlated. Measures of temporal distribution based on an average of the intervals of a daily session were highly similar to the means of those measures calculated from the individual intervals.
doi:10.1901/jeab.1964.7-337
PMCID: PMC1404340  PMID: 14218286
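Quarter-life is conventionally the elapsed time by which a quarter of the interval's responses have occurred, often expressed as a proportion of the interval; higher values indicate responding concentrated later in the interval. A minimal computational sketch (illustrative code, not taken from the paper):

    def quarter_life(response_times, interval_duration):
        """Proportion of the fixed interval elapsed when 25% of the
        interval's responses have been emitted."""
        times = sorted(response_times)
        if not times:
            return None                          # no responses: quarter-life undefined
        quarter_index = -(-len(times) // 4) - 1  # ceil(n / 4), zero-based
        return times[quarter_index] / interval_duration

    # Example: 12 responses concentrated late in a 100-s fixed interval
    times = [35, 48, 55, 61, 66, 70, 74, 78, 82, 87, 92, 97]
    print(quarter_life(times, 100.0))            # 0.55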
14.  A Review of Positive Conditioned Reinforcement
This review critically analyzes experimental data relevant to the concept of conditioned reinforcement. The review has five sections. Section I is a discussion of the relationship between primary and conditioned reinforcement in terms of chains of stimuli and responses. Section II is a detailed analysis of the conditions in which the component stimuli in chained schedules of reinforcement will become conditioned reinforcers; this section also analyzes studies of token reinforcement, observing responses, switching responses, implicit chained schedules, and higher-order conditioning. Section III analyzes experiments in which potential conditioned reinforcers are used either to prolong responding or to generate responding during experimental extinction. This section discusses hypotheses that have been offered as alternatives to the concept of conditioned reinforcement and hypotheses concerning the necessary and sufficient conditions for establishing a conditioned reinforcer. Section IV discusses other variables that act when a conditioned reinforcer is being established or that act when an established conditioned reinforcer is used to develop or maintain behavior. Section V is a general discussion of conditioned reinforcement.
The evidence indicates that the conditioned reinforcing effectiveness of a stimulus is directly related to the frequency of primary reinforcement occurring in its presence, but is independent of the response rate or response pattern occurring in its presence. Results from chained schedules comprised of several components indicate that a stimulus can be established as a conditioned reinforcer by pairing it with an already established conditioned reinforcer rather than a primary reinforcer; however, this type of higher-order conditioning has not been clearly demonstrated with respondent conditioning procedures.
Although discriminative stimuli are usually conditioned reinforcers, the available evidence indicates that establishing a stimulus as a discriminative stimulus is not necessary or sufficient for establishing it as a conditioned reinforcer. Discriminative stimuli in chained schedules with several components are not always conditioned reinforcers; stimuli that are simply paired with reinforcers can become conditioned reinforcers.
The hypotheses that have been offered as alternatives to the concept of conditioned reinforcement are too limited to integrate the data that exist. The concepts of conditioned reinforcement and chained schedule, however, can be used to integrate the data obtained with diverse techniques. Recent experiments have revealed several techniques for the development of effective conditioned reinforcers. These techniques provide a powerful tool for advancing understanding of conditioned reinforcement and for extending control over behavior.
doi:10.1901/jeab.1962.5-s543
PMCID: PMC1404082  PMID: 14031747
15.  A simple pulse shaper
doi:10.1901/jeab.1958.1-122
PMCID: PMC1403917  PMID: 16811207
