Pigeons were studied in a two-component multiple schedule. In the first phase of the experiment, key pecks were reinforced on a variable-interval 2-min schedule in both components and free food was delivered additionally during one component. When components alternated every 8 sec, all pigeons pecked at a much higher rate during the component with free food than during the other component. At a component duration of 16 min, the reverse was true: all pigeons pecked at a higher rate during the component without free food. In the second phase, the additional food during one component was made contingent on pecking. Responding during the component without the extra food remained essentially unchanged, as expected, since the rate of reinforcement remained identical to that in the previous phase. However, the rate of responding during the component with the extra food (now contingent on pecking) was elevated relative to the rate in the first phase and did not show a marked decline as component duration increased.
Three pigeons received training on multiple variable-interval schedules with brief alternating components, concurrently with a fixed-interval schedule of food reinforcement on a second key. Fixed-interval performance exhibited typical increases in rate within the interval, and was independent of multiple-schedule responding. Responding on the multiple-schedule key decreased as a function of proximity to reinforcement on the fixed-interval key. The overall relative rate of responding in one component of the multiple schedule roughly matched the overall relative rate of reinforcement. Within the fixed interval, response rate during one multiple-schedule component was a monotonic, negatively accelerated function of response rate during the other component. To a first approximation, the data were described by a power function, where the exponent depended on the relative rate of reinforcement obtained in the two components. The relative rate of responding in one component of the multiple schedule increased as a function of proximity to fixed-interval reinforcement, and often exceeded the overall obtained relative rate of reinforcement. The form of the function relating response rates is discussed in relation to findings on rate-dependent effects of drugs, chaining, and the relation between response rate and reinforcement rate in single-schedule conditions.
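The power-function relation described above can be sketched numerically. The paired response rates below are invented for illustration (they are not the pigeons' data), and the log-log regression shown is one standard way to estimate the exponent of a power function:

```python
# Hypothetical sketch: fitting a power function b1 = c * b2**a to paired
# local response rates from the two multiple-schedule components.
# All data values are invented for illustration.
import math

# Paired local response rates (pecks/min) in the two components at
# successive points within the fixed interval (hypothetical numbers,
# chosen so that b1 is a negatively accelerated function of b2).
b1 = [10.0, 14.0, 18.0, 21.0]
b2 = [10.0, 20.0, 40.0, 80.0]

# A power function is linear in log coordinates:
#   log b1 = a * log b2 + log c
x = [math.log(v) for v in b2]
y = [math.log(v) for v in b1]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
    (xi - mx) ** 2 for xi in x
)
log_c = my - a * mx

# An exponent between 0 and 1 corresponds to a monotonic, negatively
# accelerated function, as reported in the abstract.
print(f"exponent a = {a:.2f}, multiplier c = {math.exp(log_c):.2f}")
```

In the abstract's terms, the exponent `a` is the quantity said to depend on the relative rate of reinforcement obtained in the two components.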
Pigeons were studied on a two-component multiple schedule in which the required operant was, in different conditions, biologically relevant (i.e., key pecking) or nonbiologically relevant (i.e., treadle pressing). Responding was reinforced on a variable-interval (VI) 2-min schedule in both components. In separate phases, additional food was delivered on a variable-time (VT) 15-s schedule (response independent) or a VI 15-s schedule (response dependent) in one of the components. The addition of response-independent food had different effects on responding depending on the operant response and on the frequency with which the components alternated. When components alternated frequently (every 10 s), all pigeons key pecked at a much higher rate during the component with the additional food deliveries, whether response dependent or independent. In comparison, treadle pressing was elevated only when the additional food was response dependent; rate of treadling was lower when the additional food was response independent. When components alternated infrequently (every 20 min), pigeons key pecked at high rates at points of transition into the component with the additional food deliveries. Rate of key pecking decreased with time spent in the 20-min component when the additional food was response independent, whereas rate of pecking remained elevated in that component when the additional food was response dependent. Under otherwise identical test conditions, rate of treadle pressing varied only as a function of its relative rate of response-dependent reinforcement. Delivery of response-independent food thus had different, but predictable, effects on responding depending on which operant was being studied, suggesting that animal-learning procedures can be integrated with biological considerations without the need to propose constraints that limit general laws of learning.
The duration and frequency of food presentation were varied in concurrent variable-interval variable-interval schedules of reinforcement. In the first experiment, in which pigeons were exposed to a succession of eight different schedules, neither relative duration nor relative frequency of reinforcement had as great an effect on response distribution as they have when they are manipulated separately. These results supported those previously reported by Todorov (1973) and Schneider (1973). In a second experiment, each of seven pigeons was exposed to only one concurrent schedule in which the frequency and/or duration of reinforcement differed on the two keys. Under these conditions, each pigeon's relative rate of response closely matched the relative total access to food that each schedule provided. This result suggests that previous failures to obtain matching may be due to factors such as an insufficient length of exposure to each schedule or to the pigeons' repeated exposure to different concurrent schedules.
concurrent schedules; amount of reinforcement; rate of reinforcement; choice; pigeons
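The "relative total access to food" that the second experiment's matching result refers to combines reinforcement frequency and reinforcer duration. A minimal worked example, using hypothetical schedule values rather than the experiment's own parameters:

```python
# Sketch of the total-access matching computation described above.
# The schedule values are hypothetical, chosen only for illustration.

# Key 1: 60 reinforcers/hr, each giving 4 s of hopper access.
# Key 2: 30 reinforcers/hr, each giving 2 s of hopper access.
freq = {"key1": 60.0, "key2": 30.0}   # reinforcers per hour
dur = {"key1": 4.0, "key2": 2.0}      # seconds of food access per reinforcer

# Total access = frequency x duration (seconds of food per hour).
access = {k: freq[k] * dur[k] for k in freq}
rel_access = access["key1"] / (access["key1"] + access["key2"])

# Matching to relative total access predicts the relative response
# rate on key 1:
print(f"predicted relative response rate on key 1 = {rel_access:.2f}")
# 240 / (240 + 60) = 0.80
```

Note that matching to frequency alone would predict 60/90 ≈ 0.67 here, so the two accounts are empirically distinguishable whenever durations are unequal.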
During one component of a multiple schedule, pigeons were trained on a discrete-trial concurrent variable-interval variable-interval schedule in which one alternative had a high scheduled rate of reinforcement and the other a low scheduled rate of reinforcement. When the choice proportion between the alternatives matched their respective relative reinforcement frequencies, the obtained probabilities of reinforcement (reinforcer per peck) were approximately equal. In alternate components of the multiple schedule, a single response alternative was presented with an intermediate scheduled rate of reinforcement. During probe trials, each alternative of the concurrent schedule was paired with the constant alternative. The stimulus correlated with the high reinforcement rate was preferred over that with the intermediate rate, whereas the stimulus correlated with the intermediate rate of reinforcement was preferred over that correlated with the low rate of reinforcement. Preference on probe tests was thus determined by the scheduled rate of reinforcement. Other subjects were presented all three alternatives individually, but with a distribution of trial frequency and reinforcement probability similar to that produced by the choice patterns of the original subjects. Here, preferences on probe tests were determined by the obtained probabilities of reinforcement. Comparison of the two sets of results indicates that the availability of a choice alternative, even when not responded to, affects the preference for that alternative. The results imply that models of choice that invoke only obtained probability of reinforcement as the controlling variable (e.g., melioration) are inadequate.
Three pigeons responded for food reinforcement on multiple variable-interval schedules in which the total consumption of food was entirely determined by the subjects' interaction with the schedules (a closed economy). The finding of overmatching, where response allocation between components is more extreme than the distribution of reinforcers, was reconfirmed. Generalized-matching sensitivity decreased from overmatching to undermatching values typical of conventional multiple schedules when food deprivation was increased by decreasing session duration, but not when deprivation was increased by decreasing overall reinforcer rate. Sensitivity also increased from undermatching to overmatching as session duration increased from 100 min to 24 hr, while deprivation was held constant by decreasing overall reinforcer rate. These results can be understood in terms of increases in the value of extraneous reinforcers relative to food reinforcers as deprivation decreases or as the economy for extraneous reinforcers becomes more closed. However, no published quantitative expression of the effects of extraneous reinforcers is entirely consistent with the results.
multiple schedules; overmatching; closed economy; deprivation; extraneous reinforcement; session duration; Herrnstein's equation; key peck; pigeons
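The overmatching and undermatching terminology above comes from the sensitivity parameter of the generalized matching law, log(B1/B2) = a·log(R1/R2) + log c. A minimal sketch with an invented condition (not data from the experiment), assuming no bias (c = 1):

```python
# Hedged sketch of generalized-matching sensitivity. With no bias (c = 1),
# the response ratio B1/B2 equals (R1/R2)**a, so a single condition gives
#   a = log(B1/B2) / log(R1/R2).
# The ratios below are hypothetical, chosen to illustrate overmatching.
import math

reinforcer_ratio = 4.0   # R1/R2: component 1 earns 4x the reinforcers
response_ratio = 6.5     # B1/B2: responding is more extreme than 4x

a = math.log(response_ratio) / math.log(reinforcer_ratio)

# a > 1: overmatching (allocation more extreme than the reinforcer ratio)
# a < 1: undermatching (allocation less extreme)
label = "overmatching" if a > 1 else "undermatching" if a < 1 else "strict matching"
print(f"sensitivity a = {a:.2f} ({label})")
```

In practice, sensitivity is estimated by regressing log response ratios on log reinforcer ratios across several conditions, which also yields the bias term.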
Pigeons responded for food on a multiple schedule in which periods of green-key illumination alternated with periods of red-key illumination. When behavior had stabilized with a variable-interval 2-min schedule of reinforcement operating during both stimuli, low rates of responding (interresponse times greater than 2 sec) were differentially reinforced during the green component. Conditions during the red stimulus were unchanged. Response rates during the green component fell without changing the frequency of reinforcement but there were no unequivocal contrast effects during the red stimulus. The frequency of reinforcement during the green component was then reduced by changing to a variable-interval 8-min schedule without reducing the response rates in that component, which were held at a low level by the spacing requirement. Again, the conditions during the red stimulus were unchanged but response rates during that stimulus increased. These results show that reductions in reinforcement frequency, independently of response rate, can produce interactions in multiple schedules.
Three pigeons performed on two-component multiple variable-interval variable-interval schedules of reinforcement. There were two independent variables: component duration and the relative frequency of reinforcement in a component. The component duration, which was always the same in both components, was varied over experimental conditions from 2 to 180 sec. Over these conditions, the relative frequency of reinforcement in a component was either 0.2 or 0.8 (±0.03). As the component duration was shortened, the relative frequency of responding in a component approached a value equal to the relative frequency of reinforcement in that component. When the relative frequency of reinforcement was varied over conditions in which the component duration was fixed at 5 sec, the relative frequency of responding in a component closely approximated the relative frequency of reinforcement in that component. That is, the familiar matching relationship, obtained previously only with concurrent schedules, was obtained in multiple schedules with a short component duration.
Four pigeons were exposed to a series of multiple schedules of variable-interval reinforcement in which pecks were required on one key (operant key) and components were signalled on a second key (signal key). Four additional pigeons experienced identical conditions, except that a yoking procedure delivered food on variable-time schedules, with no key pecks required. One of the components of the multiple schedule was constant throughout the experiment as a variable-interval (or variable-time) 30-sec schedule. Operant-key responding during the constant component was uniform throughout the component, uninfluenced by changes in the duration of the variable component, and only slightly influenced by changes in reinforcement frequency correlated with the variable component. By comparison, signal-key response rate during the constant component was highest at the onset of the component, was higher when the variable component was 60 sec long than when it was 1 sec long, and was higher when no reinforcement occurred in the variable component than when reinforcement was scheduled in the variable component. These characteristics of signal-key pecking matched characteristics of local positive behavioral contrast. These data are taken to support the “additivity theory” of behavioral contrast and to suggest that Pavlovian stimulus-reinforcer relations contribute primarily to the phenomenon of local positive contrast.
behavioral contrast; local contrast; additivity theory; stimulus-reinforcer relations; multiple schedules; key pecking; pigeons
Pigeons were trained on a multiple schedule of reinforcement in which separate concurrent schedules occurred in each of two components. Key pecking was reinforced with milo. During one component, a variable-interval 40-s schedule was concurrent with a variable-interval 20-s schedule; during the other component, a variable-interval 40-s schedule was concurrent with a variable-interval 80-s schedule. During probe tests, the stimuli correlated with the two variable-interval 40-s schedules were presented simultaneously to assess preference, measured by the relative response rates to the two stimuli. In Experiment 1, the concurrently available variable-interval 20-s schedule operated normally; that is, reinforcer availability was not signaled. Following this baseline training, relative response rate during the probes favored the variable-interval 40-s alternative that had been paired with the lower valued schedule (i.e., with the variable-interval 80-s schedule). In Experiment 2, a signal for reinforcer availability was added to the high-value alternative (i.e., to the variable-interval 20-s schedule), thus reducing the rate of key pecking maintained by that schedule but leaving the reinforcement rate unchanged. Following that baseline training, relative response rates during probes favored the variable-interval 40-s alternative that had been paired with the higher valued schedule. The reversal in the pattern of preference implies that the pattern of changeover behavior established during training, and not reinforcement rate, determined the preference patterns obtained on the probe tests.
matching law; concurrent schedules; changeover behavior; probability of reinforcement; melioration theory; key peck; pigeons
Pigeons' pecks on two keys were maintained, without changeover delays, by independent variable-interval schedules of food reinforcement. Four regularly cycling 2-min components scheduled reinforcement respectively for both keys, left key only, both keys, and right key only. Initially, reinforcement scheduled for one key alone produced more responding on that key than reinforcement scheduled concurrently for both keys. Continued sessions reduced this difference; response rate on a given key approached constancy, or invariance with respect to the performance on and schedule for the other key. When extinction replaced the reinforcement schedule on either key, responding on that key decreased more during components that scheduled reinforcement for the other key than during those that did not. This demonstration that responses on one key were not supported by reinforcers on the other key suggested that the alternation of concurrent responding and either-key-alone responding prevented concurrent superstitions from developing.
concurrent schedules; multiple schedules; variable-interval schedules; changeover delays; rate constancy; concurrent superstition; response independence; key peck; pigeon
Two multiple-schedule experiments with pigeons examined the effect of adding food reinforcement from an alternative source on the resistance of the reinforced response (target response) to the decremental effects of satiation and extinction. In Experiment 1, key pecks were reinforced by food in two components according to variable-interval schedules and, in some conditions, food was delivered according to variable-time schedules in one of the components. The rate of key pecking in a component was negatively related to the proportion of reinforcers from the alternative (variable-time) source. Resistance to satiation and extinction, in contrast, was positively related to the overall rate of reinforcement in the component. Experiment 2 was conceptually similar except that the alternative reinforcers were contingent on a specific concurrent response. Again, the rate of the target response varied as a function of its relative reinforcement, but its resistance to satiation and extinction varied directly with the overall rate of reinforcement in the component stimulus regardless of its relative reinforcement. Together the results of the two experiments suggest that the relative reinforcement of a response (the operant contingency) determines its rate, whereas the stimulus-reinforcement contingency (a Pavlovian contingency) determines its resistance to change.
Two variable-interval 3-min schedules functioned concurrently to arrange reinforcement of a pigeon's pecks on a single key, the main key. Each schedule was associated with a distinct color of the main key; a response on a second key alternated the color and schedule assignment of the main key. A changeover delay, a period of time following schedule and key-color alternation during which reinforcement of responding on the main key could not occur, was arranged with equal or with unequal durations for the two directions of alternation. Durations were varied from 0.33 sec to 27 sec, in addition to no delay. With equal delays for the two directions of alternation, the pigeon alternated the schedules less often the larger the delay duration. When the delays in the two directions of alternation were unequal, it could be shown that alternation of the schedules was reduced both by a delay just incurred by the last alternation and by a delay to be incurred by the next. The latter delay was more potent in reducing the frequency of alternations.
Local patterns of responding were studied when pigeons pecked for food in concurrent variable-interval schedules (Experiment I) and in multiple variable-interval schedules (Experiment II). In Experiment I, similarities in the distribution of interresponse times on the two keys provided further evidence that responding on concurrent schedules is determined more by allocation of time than by changes in local pattern of responding. Relative responding in local intervals since a preceding reinforcement showed consistent deviations from matching between relative responding and relative reinforcement in various postreinforcement intervals. Response rates in local intervals since a preceding changeover showed that rate of responding is not the same on both keys in all postchangeover intervals. The relative amount of time consumed by interchangeover times of a given duration approximately matched relative frequency of reinforced interchangeover times of that duration. However, computer simulation showed that this matching was probably a necessary artifact of concurrent schedules. In Experiment II, when component durations were 180 sec, the relationship between distribution of interresponse times and rate of reinforcement in the component showed that responding was determined by local pattern of responding in the components. Since responding on concurrent schedules appears to be determined by time allocation, this result would establish a behavioral difference between multiple and concurrent schedules. However, when component durations were 5 sec, local pattern of responding in a component (defined by interresponse times) was less important in determining responding than was amount of time spent responding in a component (defined by latencies). In fact, with 5-sec component durations, the relative amount of time spent responding in a component approximately matched relative frequency of reinforcement in the component. Thus, as component durations in multiple schedules decrease, multiple schedules become more like concurrent schedules, in the sense that responding is affected by allocation of time rather than by local pattern of responding.
Pigeons were trained on a multiple variable-interval 30-sec, variable-interval 90-sec schedule with each component presented alternately for an equal (on the average) duration. This average duration of exposure to each component was varied from 5 to 300 sec. The main concern was with rate of response in the variable-interval 30-sec component relative to rate of response in the variable-interval 90-sec component. In all cases, rate of response was higher in the variable-interval 30-sec component, but the discrepancy in the rate produced by the two schedules tended to be greatest when the duration of component presentation was brief. The mean proportion of responses emitted during the variable-interval 30-sec component (responses in the variable-interval 30-sec component divided by total responses) varied from about 0.60 to 0.71, where 0.75 would be expected on the basis of a matching rule, and 0.59 was the value obtained by Lander and Irwin (1968). These results are in agreement with data reported by Shimp and Wheatley (1971) from a similar experiment.
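The 0.75 figure cited above follows directly from the matching rule applied to the two components' programmed reinforcement rates:

```python
# Worked arithmetic for the matching prediction cited in the abstract:
# with VI 30-s and VI 90-s components, matching predicts the proportion
# of responses in the VI 30-s component from the relative rates of
# reinforcement.
r_vi30 = 1 / 30   # reinforcers per second, VI 30-s schedule
r_vi90 = 1 / 90   # reinforcers per second, VI 90-s schedule

predicted = r_vi30 / (r_vi30 + r_vi90)
print(f"predicted proportion in VI 30-s component = {predicted:.2f}")
# (1/30) / (1/30 + 1/90) = (3/90) / (4/90) = 3/4 = 0.75
```

The obtained proportions of 0.60 to 0.71 thus fall short of this prediction, i.e., the pigeons undermatched.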
Pigeons were exposed to a procedure under which five pecks on one response key (the observing key) changed the schedule on a second key (the food key) from a mixed schedule to a multiple schedule for 25 sec. In Experiment I, a random-ratio 50 schedule alternated with extinction. The duration of the random-ratio 50 schedule component was varied between 1.25 and 320 sec, and extinction was scheduled for a varying time, ranging from the duration of the random-ratio 50 to four times that value. Each set of values was scheduled for a block of sessions. Before observing-key pecks were allowed at each set of parameter values, the pigeons were exposed to a condition where the mixed and multiple schedule alternated every 10 min, and observing-key pecks were not permitted. Rates of pecking on the observing key were high for all values of random-ratio component durations except 1.25 sec. Experiment II was conducted with the random-ratio component duration equal to 40 sec, and the random-ratio schedule was varied from random-ratio 50 to 100, 200, and 400. Observing-key pecking rates were high for all values of the random-ratio schedule except random-ratio 400. In both experiments, observing response rates were relatively little affected, suggesting that neither schedule component duration nor schedule value is a strong determinant of observing responses.
Behavioral momentum theory relates resistance to change of responding in a multiple-schedule component to the total reinforcement obtained in that component, regardless of how the reinforcers are produced. Four pigeons responded in a series of multiple-schedule conditions in which a variable-interval 40-s schedule arranged reinforcers for pecking in one component and a variable-interval 360-s schedule arranged them in the other. In addition, responses on a second key were reinforced according to variable-interval schedules that were equal in the two components. In different parts of the experiment, responding was disrupted by changing the rate of reinforcement on the second key or by delivering response-independent food during a blackout separating the two components. Consistent with momentum theory, responding on the first key in Part 1 changed more in the component with the lower reinforcement total when it was disrupted by changes in the rate of reinforcement on the second key. However, responding on the second key changed more in the component with the higher reinforcement total. In Parts 2 and 3, responding was disrupted with free food presented during intercomponent blackouts, with extinction (Part 2) or variable-interval 80-s reinforcement (Part 3) arranged on the second key. Here, resistance to change was greater for the component with greater overall reinforcement. Failures of momentum theory to predict short-term differences in resistance to change occurred with disruptors that caused greater change between steady states for the richer component. Consistency of effects across disruptors may yet be found if short-term effects of disruptors are assessed relative to the extent of change observed after prolonged exposure.
behavioral momentum theory; resistance to change; multiple schedules; concurrent schedules; alternative reinforcement; key peck; pigeons
The joint control of rate of key pecking in pigeons by stimulus-reinforcer and response-reinforcer relationships was studied in the context of a two-component multiple schedule of reinforcement. Food presentation was always associated with one component and extinction with the other. The stimulus-reinforcer relationship was manipulated by varying the relative durations of the two components. In the food-presentation component, a fixed rate of reinforcement, independent of rate of responding, was generated by a schedule referred to as “T*”. One aspect of the response-reinforcer relationship, contiguity, was manipulated by varying the percentage of delayed reinforcers. With the multiple T* extinction schedule, stimulus-reinforcer and response-reinforcer relationships could be varied independently of one another. Rate of key pecking was sensitive to manipulations of both relationships. However, significant differential effects due to either the stimulus-reinforcer or response-reinforcer relationship were obtained only when the other relationship was weak: stimulus-reinforcer and response-reinforcer relationships interacted in the joint control of responding.
stimulus-reinforcer relationship; response-reinforcer relationship; relative component duration; percentage delayed reinforcement; multiple schedule; T* schedule; key peck; pigeon
To determine the effect of a negative discriminative stimulus on the response producing it, two pigeons were each studied in a three-key conditioning chamber. During alternating periods of unpredictable duration, pecking the center (food) key either was reinforced with grain on a variable-interval schedule or was never reinforced. On equal but independent variable-interval schedules, pecking either of the side (observing) keys changed the color of all keys for 30 sec from yellow to either green or red. When the schedule on the center key was variable-interval reinforcement, the color was green (positive discriminative stimulus); when no reinforcements were scheduled, the color was red (negative discriminative stimulus). Since pecking the side keys did not affect grain deliveries, changes in the rate of pecking could not be ascribed to changes in the frequency of primary reinforcement. In subsequent sessions, red was withheld as one of the possible consequences of pecking a given side key. When red was omitted, the rate on that key increased, and when red was restored, the rate decreased. It was concluded that red illumination of the keys, the negative discriminative stimulus, had a suppressive effect on the response that produced it.
Pigeons responded on multiple variable-interval variable-interval schedules of reinforcement in an open and a closed economy. Equal duration components were increased in duration while the component rates of reinforcement were held constant, the component schedules were reversed, and component duration was decreased. In the open economy, daily sessions were limited to 1 hr, and subjects were maintained at 80% of their free-feeding weights through supplemental feeding when necessary in their home cages. In the closed economy, subjects were housed in their experimental chambers and no deprivation regimen was enforced. Relative response rate decreased as components were lengthened in the open economy, whereas in the closed economy relative rate increased as components were lengthened. Response proportions overmatched reinforcer proportions to a greater extent at long component durations in the closed economy, but there was no systematic effect of component duration on responding in the open economy.
In Experiment 1, matching of relative response rates to relative rates of reinforcement was obtained in concurrent variable-interval schedules when the absolute values of the two concurrent variable-interval schedules varied from 6 sec and 12 sec to 600 sec and 1200 sec. Increases in the duration of the changeover delay, however, produced decreases in the relative response rates and, consequently, some deviation from matching. In Experiment 2, matching of relative response rates to the relative duration of the reinforcer failed to occur when the equal variable-interval schedules arranging access to the two different reinforcer durations (1.5 and 6 sec) were varied in size from concurrent variable-interval 10-sec schedules to concurrent variable-interval 600-sec schedules.
In three experiments, pigeons' responses were reinforced on two keys in each component of a series of multiple-schedule conditions. In each series, concurrent variable-interval schedules were constant in one component and were varied over conditions in the other component. In the first experiment both components arranged the same, constant total number of reinforcers, in the second the two components arranged constant but different totals, and in the third experiment the total was varied in one component and remained constant in the other. Relative reinforcer rate during the varied component was manipulated over conditions in all three experiments. In all these experiments, response and time allocation in the constant component were invariant when reinforcer ratios varied in the other component, demonstrating independence of behavior allocation in a multiple-schedule component from the relative reinforcer rate for the same alternatives in another component. In the two experiments which maintained constant reinforcer totals in components, sensitivity to reinforcement in the multiple schedules was the same as that in the concurrent schedules arranged during the varied component, with multiple-schedule bias in the experiment in which the totals were unequal.
multiple schedules; concurrent schedules; sensitivity; bias; extraneous reinforcers; key peck; pigeons
Reinforcements were arranged independently of the pigeon's behavior by concurrent variable-interval schedules. The reinforcements arranged by one of the schedules occurred when the chamber was illuminated with amber light, and the reinforcements arranged by the other schedule occurred when the chamber was illuminated with blue light. Both schedules functioned concurrently, but reinforcers were delivered by each only in the presence of the appropriate stimulus condition. A response on a white key, the only key in the chamber, alternated the stimulus condition and the effective schedule. The results of this procedure were similar to those obtained with concurrent response-dependent variable-interval schedules of reinforcement. The proportion of the total session time spent in the presence of a schedule component approximated the proportion of the total number of reinforcements in the component. Changeover rate was a decreasing function of the changeover delay and of the difference between the relative rates of reinforcement for each pair of concurrent schedules.
Two experimental chambers were electrically connected so that the component selected by a pigeon confronting concurrent variable-interval schedules in one chamber could be successively presented as a multiple schedule to a second pigeon in the other chamber. Component duration was regulated by the use of a changeover delay, the value of which was systematically varied between 0 and 30 sec. It was found that the relative local response rates on the preferred key (absolute response rate during that component divided by the sum of the absolute response rates during both components) tended to increase with increasing component durations for the birds in the concurrent chamber, but decreased for the birds in the multiple chamber. These data support the interpretation that there are fundamental differences in the mode of responding to multiple and concurrent schedules. Based on these findings, it was concluded that previous demonstrations of matching on multiple schedules do not establish that response allocation is controlled by a process equivalent to that found in choice paradigms. It now appears that matching on multiple (but not concurrent) schedules is a consequence of selecting short component durations. The implications of these data for Herrnstein's (1970) and Rachlin's (1973) formulations of the relationship between multiple and concurrent schedules are examined.
Key pecking in pigeons was examined under concurrent and parallel arrangements of two independent and simultaneously available variable-interval schedules. Pecks on the changeover key alternated the schedule of reinforcement for responses on the main key. Under concurrent schedules, discriminative stimuli were paired with the reinforcement schedule arranged in each component and changeover responses also alternated these stimuli. Under parallel schedules, changeover responses alternated the effective reinforcement schedule, but did not change the discriminative stimulus. On concurrent procedures, changeover response rate was inversely related to the difference in reinforcement rate between the two components, whereas on parallel schedules no consistent relationship was found. With both schedules, absolute response and reinforcement rates were positively related, although for a given set of reinforcement frequencies, rates were often higher on the concurrent schedules. On concurrent schedules, relative response rates and relative times were equal to relative reinforcement rates. On parallel schedules these ratios were positively related, but response and time ratios were much smaller than were obtained with comparable concurrent schedules. This inequality was most pronounced when absolute reinforcement frequencies were lowest.
concurrent schedules; parallel schedules; matching law; stimulus control; variable-interval schedules; key pecking; pigeons