Local patterns of responding were studied when pigeons pecked for food in concurrent variable-interval schedules (Experiment I) and in multiple variable-interval schedules (Experiment II). In Experiment I, similarities in the distribution of interresponse times on the two keys provided further evidence that responding on concurrent schedules is determined more by allocation of time than by changes in local pattern of responding. Relative responding in local intervals since a preceding reinforcement showed consistent deviations from matching between relative responding and relative reinforcement in various postreinforcement intervals. Response rates in local intervals since a preceding changeover showed that rate of responding is not the same on both keys in all postchangeover intervals. The relative amount of time consumed by interchangeover times of a given duration approximately matched relative frequency of reinforced interchangeover times of that duration. However, computer simulation showed that this matching was probably a necessary artifact of concurrent schedules. In Experiment II, when component durations were 180 sec, the relationship between distribution of interresponse times and rate of reinforcement in the component showed that responding was determined by local pattern of responding in the components. Since responding on concurrent schedules appears to be determined by time allocation, this result would establish a behavioral difference between multiple and concurrent schedules. However, when component durations were 5 sec, local pattern of responding in a component (defined by interresponse times) was less important in determining responding than was amount of time spent responding in a component (defined by latencies). In fact, with 5-sec component durations, the relative amount of time spent responding in a component approximately matched relative frequency of reinforcement in the component. 
Thus, as component durations in multiple schedules decrease, multiple schedules become more like concurrent schedules, in the sense that responding is affected by allocation of time rather than by local pattern of responding.
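The simulation argument above can be illustrated with a minimal sketch (all parameter values here are hypothetical, not those of the experiment): even a subject that changes over at random on a concurrent variable-interval schedule produces approximate matching between the relative time consumed by interchangeover times of a given duration and the relative frequency of reinforced interchangeover times of that duration, because both quantities grow in near proportion to interchangeover-time duration.

```python
import math
import random

random.seed(1)

VI_MEAN = 30.0       # hypothetical VI 30-s schedule (exponential intervals)
SWITCH_MEAN = 5.0    # subject changes keys at random, mean 5 s between changeovers
SESSION = 200_000.0  # simulated session time, seconds

# Draw interchangeover times (ICTs) for a subject that switches at random.
t, icts = 0.0, []
while t < SESSION:
    d = random.expovariate(1.0 / SWITCH_MEAN)
    icts.append(d)
    t += d

# With an exponential VI timer, an ICT of duration d collects a set-up
# reinforcer with probability about 1 - exp(-d / VI_MEAN).
bins = {}  # 2-s duration bins -> (time consumed, expected reinforced ICTs)
for d in icts:
    b = min(int(d // 2), 9)  # last bin collects all ICTs of 18 s or longer
    time_in, reinf = bins.get(b, (0.0, 0.0))
    bins[b] = (time_in + d, reinf + (1.0 - math.exp(-d / VI_MEAN)))

total_time = sum(v[0] for v in bins.values())
total_reinf = sum(v[1] for v in bins.values())
for b in sorted(bins):
    print(f"bin {b}: relative time {bins[b][0] / total_time:.3f}  "
          f"relative reinforced ICTs {bins[b][1] / total_reinf:.3f}")
```

Across duration bins the two relative measures agree to within a few hundredths even though no matching process was built into the simulated subject, consistent with the conclusion that this form of matching is a necessary artifact of concurrent schedules.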
Pigeons were studied on a two-component multiple schedule in which the required operant was, in different conditions, biologically relevant (i.e., key pecking) or nonbiologically relevant (i.e., treadle pressing). Responding was reinforced on a variable-interval (VI) 2-min schedule in both components. In separate phases, additional food was delivered on a variable-time (VT) 15-s schedule (response independent) or a VI 15-s schedule (response dependent) in one of the components. The addition of response-independent food had different effects on responding depending on the operant response and on the frequency with which the components alternated. When components alternated frequently (every 10 s), all pigeons key pecked at a much higher rate during the component with the additional food deliveries, whether response dependent or independent. In comparison, treadle pressing was elevated only when the additional food was response dependent; rate of treadling was lower when the additional food was response independent. When components alternated infrequently (every 20 min), pigeons key pecked at high rates at points of transition into the component with the additional food deliveries. Rate of key pecking decreased with time spent in the 20-min component when the additional food was response independent, whereas rate of pecking remained elevated in that component when the additional food was response dependent. Under otherwise identical test conditions, rate of treadle pressing varied only as a function of its relative rate of response-dependent reinforcement. Delivery of response-independent food thus had different, but predictable, effects on responding depending on which operant was being studied, suggesting that animal-learning procedures can be integrated with biological considerations without the need to propose constraints that limit general laws of learning.
Pigeons were exposed to alternative pairs of variable-interval schedules correlated with red and green lights on one key (the food key). In one experimental chamber, responses on a white key (the changeover key) changed the color of the food key and initiated a 2-sec changeover delay. Pigeons in a second chamber obtained food by pecking on a colored key whenever the pigeons in the first (concurrent) chamber had obtained food for a peck on that key color. There was no changeover key in the second (multiple) chamber: changeover responses in the first chamber alternated the schedules and colors in both chambers. The pigeons in both chambers emitted the same proportion of responses on each of the variable-interval schedules, and mastered discrimination reversals at the same rate. The pigeons differed only in their absolute response rates, which were greater under the concurrent schedules. In a second experiment, changes in key color occurred automatically, with different proportions of time allocated to the two variable-interval schedules. Matching of relative response frequency to relative reinforcement frequency was affected by the relative amounts of time in each component, by rate of changeovers, and by manipulations of the variable-interval scheduling.
Each of three pigeons was studied first under a standard fixed-interval schedule. With the fixed interval held constant, the schedule was changed to a second-order schedule in which the response unit was the behavior on a small fixed-ratio schedule (first a fixed-ratio 10 and then a fixed-ratio 20 schedule). That is, every completion of the fixed-ratio schedule produced a 0.7-sec darkening of the key and reset the response count to zero for the next ratio. The first fixed-ratio completed after the fixed-interval schedule elapsed produced the 0.7-sec blackout followed immediately by food. These manipulations were carried out under two different fixed-interval durations for each bird ranging from 3 min to 12 min. The standard fixed-interval schedules produced the typical pause after reinforcement followed by responding at a moderate rate until the next reinforcement. The second-order schedules also engendered a pause after reinforcement, but responding occurred in bursts separated by brief pauses after each blackout. For a particular fixed-interval duration, post-reinforcement pauses increased slightly as the number of pecks in the response unit increased despite large differences in the rate and pattern of key pecking. Post-reinforcement pause increased with the fixed-interval duration under all response units. These data confirm that the allocation of time between pausing and responding is relatively independent of the rate and topography of responding after the pause.
Four pigeons were exposed to a series of multiple schedules of variable-interval reinforcement in which pecks were required on one key (operant key) and components were signalled on a second key (signal key). Four additional pigeons experienced identical conditions, except that a yoking procedure delivered food on variable-time schedules, with no key pecks required. One of the components of the multiple schedule was constant throughout the experiment as a variable-interval (or variable-time) 30-second schedule. Operant-key responding during the constant component was uniform throughout the component, uninfluenced by changes in the duration of the variable component, and only slightly influenced by changes in reinforcement frequency correlated with the variable component. By comparison, signal-key response rate during the constant component was highest at the onset of the component, was higher when the variable component was 60-sec long than when it was 1-sec long, and was higher when no reinforcement occurred in the variable component than when reinforcement was scheduled in the variable component. These characteristics of signal-key pecking matched characteristics of local positive behavioral contrast. These data are taken to support the “additivity theory” of behavioral contrast and to suggest that Pavlovian stimulus-reinforcer relations contribute primarily to the phenomenon of local positive contrast.
behavioral contrast; local contrast; additivity theory; stimulus-reinforcer relations; multiple schedules; key pecking; pigeons
Four rats obtained food pellets by poking a key and 5-s presentations of the discriminative stimuli by pressing a lever. Every 1 or 2 min, the prevailing schedule of reinforcement for key poking alternated between rich (either variable-interval [VI] 30 s or VI 60 s) and lean (either VI 240 s, VI 480 s, or extinction) components. While the key was dark (mixed-schedule stimulus), no exteroceptive stimulus indicated the prevailing schedule. A lever press (i.e., an observing response), however, illuminated the key for 5 s with either a steady light (S+), signaling the rich reinforcement schedule, or a blinking light (S-), signaling the lean reinforcement schedule. One goal was to determine whether rats would engage in selective observing (i.e., a pattern of responding that maintains contact with S+ and decreases contact with S-). Such a pattern was found, in that a 5-s presentation of S+ was followed relatively quickly by another observing response (which likely produced another 5-s period of S+), whereas exposure to S- resulted in extended breaks from observing. Additional conditions demonstrated that the rate of observing remained high when lever presses were effective only when the rich reinforcement schedule was in effect (S+ only condition), but decreased to a low level when lever presses were effective only during the lean reinforcement component (S- only condition) or when lever presses had no effect (in removing the mixed stimulus or presenting the multiple-schedule stimuli). These findings are consistent with relativistic conceptualizations of conditioned reinforcement and extend the generality of selective observing to procedures in which the experimenter controls the duration of stimulus presentations, the schedule components both offer intermittent food reinforcement, and rats serve as subjects.
Pigeons were studied in a two-component multiple schedule. In the first phase of the experiment, key pecks were reinforced on a variable-interval 2-min schedule in both components and free food was delivered additionally during one component. When components alternated every 8 sec, all pigeons pecked at a much higher rate during the component with free food than during the other component. At a component duration of 16 min, the reverse was true: all pigeons pecked at a higher rate during the component without free food. In the second phase, the additional food during one component was made contingent on pecking. Responding during the component without the extra food remained essentially unchanged, as expected, since rate of reinforcement remained identical to that in the previous phase. However, rate of responding during the component with the extra food (now contingent on pecking) was elevated, compared to the rate in the first phase, and did not show the marked decline as component duration was increased.
Three pigeons received training on multiple variable-interval schedules with brief alternating components, concurrently with a fixed-interval schedule of food reinforcement on a second key. Fixed-interval performance exhibited typical increases in rate within the interval, and was independent of multiple-schedule responding. Responding on the multiple-schedule key decreased as a function of proximity to reinforcement on the fixed-interval key. The overall relative rate of responding in one component of the multiple schedule roughly matched the overall relative rate of reinforcement. Within the fixed interval, response rate during one multiple-schedule component was a monotonic, negatively accelerated function of response rate during the other component. To a first approximation, the data were described by a power function, where the exponent depended on the relative rate of reinforcement obtained in the two components. The relative rate of responding in one component of the multiple schedule increased as a function of proximity to fixed-interval reinforcement, and often exceeded the overall obtained relative rate of reinforcement. The form of the function relating response rates is discussed in relation to findings on rate-dependent effects of drugs, chaining, and the relation between response rate and reinforcement rate in single-schedule conditions.
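The power-function description in the preceding abstract can be sketched numerically (the exponent and scale factor below are hypothetical, not fitted values from the study): with an exponent below 1, the function relating response rates in the two components is monotonic and negatively accelerated, and the exponent can be recovered as the slope of the relation in log-log coordinates.

```python
import math

# Hypothetical power relation between response rates in the two
# multiple-schedule components: B2 = k * B1 ** a, with the exponent a
# assumed to depend on relative reinforcement rate (values illustrative only).
k, a = 2.0, 0.6
b1 = [5.0, 10.0, 20.0, 40.0, 80.0]     # pecks/min in one component
b2 = [k * r ** a for r in b1]          # pecks/min in the other component

# In log-log coordinates a power function is a straight line, so the
# exponent is the slope of log(b2) regressed on log(b1).
xs = [math.log(r) for r in b1]
ys = [math.log(r) for r in b2]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
print(f"recovered exponent: {slope:.3f}")  # 0.600
```

Because successive increments in b2 shrink as b1 grows, an exponent below 1 captures the "monotonic, negatively accelerated" form reported in the abstract.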
Pigeons were exposed to a procedure under which five pecks on one response key (the observing key) changed the schedule on a second key (the food key) from a mixed schedule to a multiple schedule for 25 sec. In Experiment I, a random-ratio 50 schedule alternated with extinction. The duration of the random-ratio 50 schedule component was varied between 1.25 and 320 sec, and extinction was scheduled for a varying time, ranging from the duration of the random-ratio 50 to four times that value. Each set of values was scheduled for a block of sessions. Before observing-key pecks were allowed at each set of parameter values, the pigeons were exposed to a condition where the mixed and multiple schedule alternated every 10 min, and observing-key pecks were not permitted. Rates of pecking on the observing key were high for all values of random-ratio component durations except 1.25 sec. Experiment II was conducted with the random-ratio component duration equal to 40 sec, and the random-ratio schedule was varied from random-ratio 50 to 100, 200, and 400. Observing-key pecking rates were high for all values of the random-ratio schedule except random-ratio 400. In both experiments, observing response rates were relatively unaffected over most of the range of values tested, suggesting that neither schedule component duration nor schedule value is a strong determinant of observing responses.
An earlier experiment scheduled variable-interval reinforcement for pigeons' pecks on one key, and variable-interval reinforcement alternating with extinction, in a multiple schedule, for pecks on a second key. During the second key's extinction component, first-key pecking was relatively slow and continuous, rarely interrupted by second-key pecking; during the variable-interval component, first-key pecking was frequently interrupted by second-key pecking. When changeover delays operated, so that reinforced pecks on one key could not follow closely upon changeovers from the other key, rapid first-key pecking between interruptions compensated sufficiently for the time lost in second-key pecking that the overall rate of first-key pecking remained roughly constant across the alternating multiple-schedule components. The present experiments duplicated, on a single key, the temporal pattern of first-key pecking generated in the earlier experiments: components of continuous key availability were alternated with components of interrupted key availability. Approximately constant overall rates of responding were observed with a single-key equivalent of a changeover delay scheduled after interruptions and with manipulations of the on-off durations of the interruption cycle. Rate constancies in the original concurrent situation presumably depended on analogous contingencies that operated upon the concurrent responses, rather than on any constant “reserve” of responses.
The duration and frequency of food presentation were varied in concurrent variable-interval variable-interval schedules of reinforcement. In the first experiment, in which pigeons were exposed to a succession of eight different schedules, neither relative duration nor relative frequency of reinforcement had as great an effect on response distribution as each has when manipulated separately. These results supported those previously reported by Todorov (1973) and Schneider (1973). In a second experiment, each of seven pigeons was exposed to only one concurrent schedule in which the frequency and/or duration of reinforcement differed on the two keys. Under these conditions, each pigeon's relative rate of response closely matched the relative total access to food that each schedule provided. This result suggests that previous failures to obtain matching may be due to factors such as an insufficient length of exposure to each schedule or to the pigeons' repeated exposure to different concurrent schedules.
concurrent schedules; amount of reinforcement; rate of reinforcement; choice; pigeons
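The matching result in the second experiment above can be stated compactly (the schedule values below are hypothetical, chosen only for illustration): relative response rate is predicted to match relative total access to food, that is, the product of reinforcement frequency and reinforcement duration on each key, normalized across keys.

```python
# Hypothetical concurrent-schedule values in which both the frequency
# and the duration of reinforcement differ on the two keys.
freq = {"left": 60.0, "right": 30.0}   # reinforcers per hour
dur = {"left": 3.0, "right": 4.5}      # seconds of food access per reinforcer

# Total access to food per hour on each key, and the matching prediction.
access = {k: freq[k] * dur[k] for k in freq}
rel_left = access["left"] / (access["left"] + access["right"])
print(f"predicted relative response rate on the left key: {rel_left:.3f}")
```

Here the left key provides 180 s of food access per hour against 135 s on the right, so matching to total access predicts roughly 57% of responses on the left key, even though neither frequency nor duration alone would yield that prediction.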
Three pigeons were exposed to a two-component multiple schedule in which a variable-interval 3-min schedule was always in effect in one component. The schedule in the other component was either variable-interval 3-min or extinction in alternate blocks of sessions. When the schedule was changed from multiple variable-interval 3-min variable-interval 3-min to multiple variable-interval 3-min extinction in the second and fourth phases of the experiment, overall response rates in the unchanged variable-interval 3-min component increased in two pigeons. Response rate declined when the schedule was changed to multiple variable-interval 3-min variable-interval 3-min again. Correlated with increases in overall response rate in the unchanged component were increases in local response rates at the beginning of the unchanged component and immediately after food presentation. Local rates 40 sec after food presentation did not increase greatly in the presence of the multiple variable-interval 3-min extinction schedule. An interresponse time analysis of three local rate samples showed small increases in the relative frequency of short-duration interresponse times at the beginning of the unchanged component and immediately after food presentation. Neither the postreinforcement pause nor the latency to the first response in the unchanged component changed systematically.
Pigeons were trained on a multiple schedule of reinforcement in which separate concurrent schedules occurred in each of two components. Key pecking was reinforced with milo. During one component, a variable-interval 40-s schedule was concurrent with a variable-interval 20-s schedule; during the other component, a variable-interval 40-s schedule was concurrent with a variable-interval 80-s schedule. During probe tests, the stimuli correlated with the two variable-interval 40-s schedules were presented simultaneously to assess preference, measured by the relative response rates to the two stimuli. In Experiment 1, the concurrently available variable-interval 20-s schedule operated normally; that is, reinforcer availability was not signaled. Following this baseline training, relative response rate during the probes favored the variable-interval 40-s alternative that had been paired with the lower valued schedule (i.e., with the variable-interval 80-s schedule). In Experiment 2, a signal for reinforcer availability was added to the high-value alternative (i.e., to the variable-interval 20-s schedule), thus reducing the rate of key pecking maintained by that schedule but leaving the reinforcement rate unchanged. Following that baseline training, relative response rates during probes favored the variable-interval 40-s alternative that had been paired with the higher valued schedule. The reversal in the pattern of preference implies that the pattern of changeover behavior established during training, and not reinforcement rate, determined the preference patterns obtained on the probe tests.
matching law; concurrent schedules; changeover behavior; probability of reinforcement; melioration theory; key peck; pigeons
During one component of a multiple schedule, pigeons were trained on a discrete-trial concurrent variable-interval variable-interval schedule in which one alternative had a high scheduled rate of reinforcement and the other a low scheduled rate of reinforcement. When the choice proportion between the alternatives matched their respective relative reinforcement frequencies, the obtained probabilities of reinforcement (reinforcer per peck) were approximately equal. In alternate components of the multiple schedule, a single response alternative was presented with an intermediate scheduled rate of reinforcement. During probe trials, each alternative of the concurrent schedule was paired with the constant alternative. The stimulus correlated with the high reinforcement rate was preferred over that with the intermediate rate, whereas the stimulus correlated with the intermediate rate of reinforcement was preferred over that correlated with the low rate of reinforcement. Preference on probe tests was thus determined by the scheduled rate of reinforcement. Other subjects were presented all three alternatives individually, but with a distribution of trial frequency and reinforcement probability similar to that produced by the choice patterns of the original subjects. Here, preferences on probe tests were determined by the obtained probabilities of reinforcement. Comparison of the two sets of results indicates that the availability of a choice alternative, even when not responded to, affects the preference for that alternative. The results imply that models of choice that invoke only obtained probability of reinforcement as the controlling variable (e.g., melioration) are inadequate.
The effects of pentobarbital and d-amphetamine were assessed on key pecking by pigeons under conventional single-key multiple schedules and under two-key multiple schedules in which discriminative stimuli appeared on one key (stimulus key) while pecks on a second key (constant key) produced food. Pecks on the stimulus key had no scheduled consequences. A 60-second variable-interval schedule operated in one component of each multiple schedule; either extinction or a 60-second variable-time schedule operated in the alternate component. When the alternate-component schedule was extinction, a high rate of responding was maintained in the variable-interval component of the single-key schedule; responding on both keys was maintained in the variable-interval component of the two-key schedule. Pentobarbital increased responding in the variable-interval component of the single-key schedule and increased stimulus-key, but not constant-key, responding in that component of the two-key schedule. When the alternate-component schedule was changed to variable time, responding declined in the variable-interval component of the single-key schedule; stimulus-key responding was no longer maintained under the two-key schedule. Pentobarbital decreased responding in the variable-interval component of both schedules. With one exception, d-amphetamine only decreased responding in the variable-interval component of the single- and two-key schedules, both when the alternate-component schedule was extinction and when it was variable time. The results suggest that the effects of pentobarbital, but not d-amphetamine, depend on the nature of the contingency (stimulus-reinforcer, response-reinforcer) that maintains responding.
Behavioral momentum theory relates resistance to change of responding in a multiple-schedule component to the total reinforcement obtained in that component, regardless of how the reinforcers are produced. Four pigeons responded in a series of multiple-schedule conditions in which a variable-interval 40-s schedule arranged reinforcers for pecking in one component and a variable-interval 360-s schedule arranged them in the other. In addition, responses on a second key were reinforced according to variable-interval schedules that were equal in the two components. In different parts of the experiment, responding was disrupted by changing the rate of reinforcement on the second key or by delivering response-independent food during a blackout separating the two components. Consistent with momentum theory, responding on the first key in Part 1 changed more in the component with the lower reinforcement total when it was disrupted by changes in the rate of reinforcement on the second key. However, responding on the second key changed more in the component with the higher reinforcement total. In Parts 2 and 3, responding was disrupted with free food presented during intercomponent blackouts, with extinction (Part 2) or variable-interval 80-s reinforcement (Part 3) arranged on the second key. Here, resistance to change was greater for the component with greater overall reinforcement. Failures of momentum theory to predict short-term differences in resistance to change occurred with disruptors that caused greater change between steady states for the richer component. Consistency of effects across disruptors may yet be found if short-term effects of disruptors are assessed relative to the extent of change observed after prolonged exposure.
behavioral momentum theory; resistance to change; multiple schedules; concurrent schedules; alternative reinforcement; key peck; pigeons
Four experiments examined the relationship between rate of reinforcement and resistance to change in rats' and pigeons' responses under simple and multiple schedules of reinforcement. In Experiment 1, 28 rats responded under either simple fixed-ratio, variable-ratio, fixed-interval, or variable-interval schedules; in Experiment 2, 3 pigeons responded under simple fixed-ratio schedules. Under each schedule, rate of reinforcement varied across four successive conditions. In Experiment 3, 14 rats responded under either a multiple fixed-ratio schedule or a multiple fixed-interval schedule, each with two components that differed in rate of reinforcement. In Experiment 4, 7 pigeons responded under either a multiple fixed-ratio or a multiple fixed-interval schedule, each with three components that also differed in rate of reinforcement. Under each condition of each experiment, resistance to change was studied by measuring schedule-controlled performance under conditions with prefeeding, with response-independent food delivered during the schedule or during timeouts that separated components of the multiple schedules, and by measuring behavior under extinction. There were no consistent differences between rats and pigeons. There was no direct relationship between rates of reinforcement and resistance to change when rates of reinforcement varied across successive conditions in the simple schedules. By comparison, in the multiple schedules there was a direct relationship between rates of reinforcement and resistance to change during most tests of resistance to change. The major exception was delivering response-independent food during the schedule; this disrupted responding, but there was no direct relationship between rates of reinforcement and resistance to change in simple- or multiple-schedule contexts. The data suggest that rate of reinforcement determines resistance to change in multiple schedules, but that this relationship does not hold under simple schedules.
In three experiments, pigeons' responses were reinforced on two keys in each component of a series of multiple-schedule conditions. In each series, concurrent variable-interval schedules were constant in one component and were varied over conditions in the other component. In the first experiment both components arranged the same, constant total number of reinforcers, in the second the two components arranged constant but different totals, and in the third experiment the total was varied in one component and remained constant in the other. Relative reinforcer rate during the varied component was manipulated over conditions in all three experiments. In all these experiments, response and time allocation in the constant component were invariant when reinforcer ratios varied in the other component, demonstrating independence of behavior allocation in a multiple-schedule component from the relative reinforcer rate for the same alternatives in another component. In the two experiments which maintained constant reinforcer totals in components, sensitivity to reinforcement in the multiple schedules was the same as that in the concurrent schedules arranged during the varied component, with multiple-schedule bias in the experiment in which the totals were unequal.
multiple schedules; concurrent schedules; sensitivity; bias; extraneous reinforcers; key peck; pigeons
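The sensitivity and bias terms in the abstract above are conventionally defined by the generalized matching law (Baum, 1974), in which B1 and B2 are the responses (or time) allocated to the two alternatives, R1 and R2 are the obtained reinforcer rates, a is sensitivity, and b is bias:

```latex
\log \frac{B_1}{B_2} = a \, \log \frac{R_1}{R_2} + \log b
```

With a = 1 and b = 1 this reduces to strict matching; the abstract's finding of equal sensitivity in the multiple and concurrent schedules refers to the fitted value of a, and "multiple-schedule bias" refers to a fitted b differing from 1 when component reinforcer totals were unequal.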
Responses on one key (the main key) of a two-key chamber produced food according to a second-order variable-interval schedule with fixed-interval schedule components. A response on a second key (the changeover key) alternated colors on the main key and provided a second independent second-order variable-interval schedule with fixed-interval components. The fixed-interval component on one variable-interval schedule was held constant at 8 sec, while the fixed interval on the other variable-interval schedule was varied from 0 to 32 sec. Under some conditions, a brief stimulus terminated each fixed interval and generated fixed-interval patterns; in other conditions, the brief stimulus was omitted. Relative response rate and relative time deviated substantially from scheduled relative reinforcement rate and, to a lesser extent, from obtained relative reinforcement rate under both brief-stimulus and no-stimulus conditions. Matching was observed with equal components on both schedules; with unequal components, increasingly greater proportions of time and responses than the matching relation would predict were spent on the variable-interval schedule containing the shorter component. Preference for the shorter fixed interval was typically more extreme under brief-stimulus than under no-stimulus schedules. The results limit the extension of the matching relation typically observed under simple concurrent variable-interval schedules to concurrent second-order variable-interval schedules.
concurrent schedules; second-order schedule; brief-stimulus schedule; changeover delay; fixed interval; variable interval; key peck; pigeons
Pigeons responded for food on a multiple schedule in which periods of green-key illumination alternated with periods of red-key illumination. When behavior had stabilized with a variable-interval 2-min schedule of reinforcement operating during both stimuli, low rates of responding (interresponse times greater than 2 sec) were differentially reinforced during the green component. Conditions during the red stimulus were unchanged. Response rates during the green component fell without changing the frequency of reinforcement, but there were no unequivocal contrast effects during the red stimulus. The frequency of reinforcement during the green component was then reduced by changing to a variable-interval 8-min schedule without reducing the response rates in that component, which were held at a low level by the spacing requirement. Again, the conditions during the red stimulus were unchanged, but response rates during that stimulus increased. These results show that reductions in reinforcement frequency, independently of response rate, can produce interactions in multiple schedules.
Key pecking of 4 pigeons was maintained under a multiple variable-interval 20-s variable-interval 120-s schedule of food reinforcement. When rates of key pecking were stable, a 5-s unsignaled, nonresetting delay to reinforcement separated the first peck after an interval elapsed from reinforcement in both components. Rates of pecking decreased substantially in both components. When rates were stable, the situation was changed such that the peck that began the 5-s delay also changed the color of the keylight for 0.5 s (i.e., the delay was briefly signaled). Rates increased to near-immediate reinforcement levels. In subsequent conditions, delays of 10 and 20 s, still briefly signaled, were tested. Although rates of key pecking during the component with the variable-interval 120-s schedule did not change appreciably across conditions, rates during the variable-interval 20-s component decreased greatly in 1 pigeon at the 10-s delay and decreased in all pigeons at the 20-s delay. In a control condition, the variable-interval 20-s schedule with 20-s delays was changed to a variable-interval 35-s schedule with 5-s delays, thus equating nominal rates of reinforcement. Rates of pecking increased to baseline levels. Rates of pecking, then, depended on the value of the briefly signaled delay relative to the programmed interfood times, rather than on the absolute delay value. These results are discussed in terms of similar findings in the literature on conditioned reinforcement, delayed matching to sample, and classical conditioning.
Four pigeons responded on multiple schedules arranged on a “main” key in a two-key experimental chamber. A constant schedule component was alternated with another component that was varied over conditions. On an extra response key, conjoint schedules of reinforcement that operated in both components were arranged concurrently with the multiple schedule on the main key. On the main key, changes in reinforcement rate in the varied component were inversely related to changes in response rates in the constant component (behavioral contrast). On the extra key, some reinforcers were reallocated between components, depending on the schedules in effect on the main key in the varied component. In the varied component, the obtained rates of reinforcement on the extra key were inversely related to main-key reinforcement rate. In the constant component, extra-key reinforcer rates were positively related to main-key reinforcer rates obtained in the varied component, and were not a function of response rates on the extra key. In two comparisons, the rate at which components alternated and the value of the main-key schedule in the constant component were varied. Consistent with earlier work, long components reduced the extent of contrast. Reductions in contrast as a function of component duration were accompanied by similar reductions in the extent of reinforcer reallocation on the extra key. In the second comparison, lowering the rate of reinforcement in the constant component increased the rate at which extra-key reinforcers were obtained, reduced the extent of reinforcer reallocation, and reduced contrast. Overall, the results are consistent with the suggestion that some contrast effects are due to the changes in extraneous reinforcement during the constant component, and that manipulations of component duration, and manipulations of the rate of reinforcement in the constant component, affect contrast because they influence the extent of extraneous reinforcer reallocation.
behavioral contrast; reallocation hypothesis; extraneous reinforcers; component duration; component value; multiple schedules; concurrent schedules; key peck; pigeons
Twelve pigeons responded on two keys under concurrent variable-interval (VI) schedules. Over several series of conditions, relative and absolute magnitudes of reinforcement were varied. Within each series, relative rate of reinforcement was varied and sensitivity of behavior ratios to reinforcer-rate ratios was assessed. When responding at both alternatives was maintained by equal-sized small reinforcers, sensitivity to variation in reinforcer-rate ratios was the same as when large reinforcers were used. This result was observed when the overall rate of reinforcement was constant over conditions, and also in another series of concurrent schedules in which one schedule was kept constant at VI 120 s. Similarly, reinforcer magnitude did not affect the rate at which response allocation approached asymptote within a condition. When reinforcer magnitudes differed between the two responses and reinforcer-rate ratios were varied, sensitivity of behavior allocation was unaffected, although response bias favored the schedule that arranged the larger reinforcers. Analysis of absolute response rates on the two keys showed that this invariance of sensitivity to reinforcer-rate ratios occurred despite the reinforcement interactions that were observed in absolute response rates on the constant VI 120-s schedule. Response rate on the constant VI 120-s schedule was inversely related to reinforcer rate on the varied key, and the strength of this relation depended on the relative magnitude of the reinforcers arranged on the varied key. Independence of sensitivity to reinforcer-rate ratios from relative and absolute reinforcer magnitude is consistent with the relativity and independence assumptions of the matching law.
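The sensitivity and bias terms discussed in this abstract are conventionally formalized by the generalized matching law (Baum, 1974); as a point of reference (not the authors' own equation), with B1 and B2 the response rates and r1 and r2 the reinforcer rates on the two keys:

```latex
\log\frac{B_1}{B_2} = a\,\log\frac{r_1}{r_2} + \log b
```

Here a is sensitivity of behavior ratios to reinforcer-rate ratios and log b is bias. In these terms, the finding is that a was invariant across relative and absolute reinforcer magnitudes, whereas b favored the alternative arranging the larger reinforcers.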
Two multiple-schedule experiments with pigeons examined the effect of adding food reinforcement from an alternative source on the resistance of the reinforced response (target response) to the decremental effects of satiation and extinction. In Experiment 1, key pecks were reinforced by food in two components according to variable-interval schedules and, in some conditions, food was delivered according to variable-time schedules in one of the components. The rate of key pecking in a component was negatively related to the proportion of reinforcers from the alternative (variable-time) source. Resistance to satiation and extinction, in contrast, was positively related to the overall rate of reinforcement in the component. Experiment 2 was conceptually similar except that the alternative reinforcers were contingent on a specific concurrent response. Again, the rate of the target response varied as a function of its relative reinforcement, but its resistance to satiation and extinction varied directly with the overall rate of reinforcement in the component stimulus regardless of its relative reinforcement. Together the results of the two experiments suggest that the relative reinforcement of a response (the operant contingency) determines its rate, whereas the stimulus-reinforcement contingency (a Pavlovian contingency) determines its resistance to change.
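The dissociation between response rate and resistance to change described here is often summarized in behavioral momentum terms (e.g., Nevin and colleagues). One common form of that model, given here as a point of reference rather than as the authors' own analysis, relates responding under disruption to the reinforcement rate correlated with the component stimulus:

```latex
\log\frac{B_x}{B_0} = \frac{-x}{r^{b}}
```

Here B_x is response rate under a disrupter of magnitude x (e.g., satiation or extinction), B_0 is baseline response rate, r is the overall reinforcement rate in the presence of the component stimulus, and b scales the effect of r. The model captures the abstracts' conclusion: r (the Pavlovian stimulus-reinforcer relation) governs resistance to change even when relative operant reinforcement governs response rate.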
In two experiments the conditioned reinforcing and delayed discriminative stimulus functions of stimuli that signal delays to reinforcement were studied. Pigeons' pecks to a center key produced delayed-matching-to-sample trials according to a variable-interval 60-s (or 30-s in 1 pigeon) schedule (Experiment 1) or a multiple variable-interval 20-s variable-interval 120-s schedule (Experiment 2). The trials consisted of a 2-s illumination of one of two sample key colors followed by delays ranging across phases from 0.1 to 27.0 s followed in turn by the presentation of matching and nonmatching comparison stimuli on the side keys. Pecks to the key color that matched the sample were reinforced with 4-s access to grain. Under some conditions of Experiment 1, pecks to nonmatching comparison stimuli produced a 4-s blackout and the start of the next interval. Under other conditions of Experiment 1 and each condition of Experiment 2, pecks to nonmatching stimuli had no effect and trials ended only when pigeons pecked the other, matching stimulus and received food. The functions relating pretrial response rates to delays differed markedly from those relating matching-to-sample accuracy to delays. Specifically, response rates remained relatively high until the longest delays (15.0 to 27.0 s) were arranged, at which point they fell to low levels. Matching accuracy was high at short delays, but fell to chance at delays between 3.0 and 9.0 s. In Experiment 2, both matching accuracy and response rates remained high over a wider range of delays in the variable-interval 120-s component relative to the variable-interval 20-s component. The difference in matching accuracy between the components was not due to an increased tendency in the variable-interval 20-s component toward proactive interference following short intervals. 
Thus, under these experimental conditions the conditioned reinforcing and the delayed discriminative functions of the sample stimulus depended on the same variables (delay and variable-interval value), but were nevertheless dissociated.