Animal Learning and Behavior Research Paper

This sample Animal Learning and Behavior Research Paper is published for educational and informational purposes only.

Why do psychologists train rats or monkeys to press a bar for food, or present a buzzer prior to food for cats or dogs, when these situations bear little resemblance to the real world? (In natural settings, rats and monkeys do not have to bar press for food, and cats and dogs do not usually hear a buzzer before they eat.) The answer lies in the belief that there are general laws of learning. These laws reveal themselves in the study of any behavior, even behaviors not exhibited in natural settings.

Psychologists investigating operant conditioning use the bar press response because many different species acquire it easily. Actually, the unnaturalness of bar pressing is thought to be desirable because the animal comes into the conditioning situation without any past experience that might affect its behavior. The following statement by Skinner (1938) illustrates the belief that the study of any behavior reveals the laws governing operant conditioning: “The general topography of operant behavior is not important, because most if not all specific operants are conditioned. I suggest that the dynamic properties of operant behavior may be studied with a single reflex” (pp. 45-46).

Although Skinner studied operant conditioning using the bar press response, the rules Skinner detailed governing the acquisition and extinction of the bar pressing response control the operant conditioning process with many different behaviors and in many species. Thus, using a maze to study the instrumental conditioning process would demonstrate the same rules governing the acquisition or extinction of a bar press response (operant response). Operant conditioning research also demonstrates that different types of reinforcers increase the rate of bar pressing and that the operant conditioning principles identified by Skinner govern behavior in both laboratory and real-world settings. It is not surprising that psychologists have felt confident that training rats and primates to bar press for food reveals the general laws of operant conditioning.

Similarly, psychologists who present a buzzer prior to food assume that any rules they uncover governing the acquisition or extinction of a conditioned salivation response will represent the general laws of classical conditioning. The choice of a buzzer and food is arbitrary: Cats or dogs could be conditioned to salivate as readily to a wide variety of visual, auditory, or tactile stimuli. The following statement by Pavlov (1928) illustrates the view that all stimuli are capable of becoming conditioned stimuli (CSs): “Any natural phenomenon chosen at will may be converted into a conditioned stimulus . . . any visual stimulus, any desired sound, any odor, and the stimulation of any part of the skin” (p. 86).

The specific unconditioned stimulus (UCS) used is also arbitrary: Any event that can elicit an unconditioned response can become associated with the environmental events that precede it. Thus, Pavlov’s buzzer could as easily have been conditioned to elicit fear by being paired with shock as it was conditioned to elicit salivation by being presented prior to food. Pavlov (1928) described the equivalent associability of events in the following statement: “It is obvious that the reflex activity of any effector organ can be chosen for the purpose of investigation, since signaling stimuli can get linked up with any of the inborn reflexes” (p. 17). Pavlov found that many different stimuli can become associated with the UCS of food. Other psychologists documented the conditioning of varied stimuli with a multitude of UCSs. The literature also points out that different CSs and UCSs can become associated in both laboratory and natural situations. The idea that any environmental stimulus can become associated with any unconditioned stimulus seemed a reasonable conclusion, based on the research conducted on classical conditioning.

A Behavior Systems Approach

All organisms . . . possess the basic behavioral patterns that enable them to survive in their niches, but learning provides the fine tuning necessary for successful adaptation. (Garcia & Garcia y Robertson, 1985)

The “general laws of learning” view just described assumes that learning is the primary determinant of how an animal acts. According to this approach, learning functions to organize reflexes and random responses so that an animal can effectively interact with the environment. The quote presented above provides a different perspective on the impact of learning on behavior. Rather than assuming that learning organizes behavior, Garcia and Garcia y Robertson (1985) suggested that the organization of behavior already exists within the animal. The function of learning is to enhance already existing organization rather than to create a new organization.

Timberlake’s (2001; Timberlake & Lucas, 1989) behavior systems approach suggests that learning modifies preexisting instinctive systems rather than constructing a new behavioral organization. According to Timberlake, an animal possesses a set of instinctive behavior systems such as feeding, mating, social bonding, care of young, and defense. These instinctive behavior systems are independent and serve a specific function or need within the animal.

The predatory subsystem of the feeding system in rats illustrates this view. The predatory sequence begins with the rat in a general search mode. The search mode causes the rat to show enhanced general searching, which leads to greater locomotion and increased sensitivity to spatial and social stimuli that are likely to bring the rat closer to food. The increased locomotion and greater sensitivity to the environment lead the rat to notice a small moving object in the distance as its prey. Once the prey is close, the rat shifts from a general search mode to a focal search mode. The focal search mode causes the rat to engage in a set of perceptual-motor modules related to capturing and subduing its prey. Once the prey is captured, the rat shifts to a handle/consume mode, which elicits the biting, manipulating, chewing, and swallowing involved in the consumption of the prey. This complex instinctive predatory behavior system allows the animal to find and consume the nutrients it needs to survive, with the search modes motivating the perceptual-motor modules that allow the rat to locate, approach, and consume its prey.

In Timberlake’s behavior systems approach, learning changes the integration, tuning, instigation, or linkages within a particular behavior system. For example, a new environmental stimulus could become able to release an instinctive motor-response module as a result of a Pavlovian conditioning experience. Learning can also alter the intensity of a simple motor response due to repetition, or improve the efficiency of a complex behavior pattern as a result of the contingent delivery of a reinforcer.

Cues that signal the receipt of reinforcers can be conditioned to activate a mode. For example, cues associated with the prey can be conditioned to the search modes that bring the rat into contact with its prey. However, the conditioning of modes is different from the conditioning of perceptual-motor modules (Timberlake, 2001). Specific motor responses are conditioned to specific stimuli as a result of the conditioning of perceptual-motor modules. By contrast, the conditioning of a specific mode produces a general motivational state that sensitizes all of the perceptual-motor modules in that mode. For example, activation of the general search mode of the predatory subsystem sensitizes the rat to engage in the travel, socialize, investigate, chase, and lie-in-wait perceptual-motor modules.

Timberlake’s behavior systems approach assumes that different stimuli can be conditioned to different modes. Distal cues are relevant to a general search mode—the rat must locate distant prey. By contrast, proximal cues are relevant to a focal search mode—the rat must capture and subdue its prey. Silva, Timberlake, and Gont (1998) evaluated the view that distal cues are associated with a general search mode, whereas proximal cues are associated with a focal search mode. In their study, two levers were positioned at different distances from a food tray—one lever “far” from the tray, the other “near” it. The levers were presented in succession, each for four seconds, followed by food. In the F-N condition, the far (F) lever was presented first, followed by the near lever; in the N-F condition, the near (N) lever was presented first, followed by the far lever. Silva et al. found that animals in the F-N condition first attended to the far lever, then transferred their attention to the near lever, and finally nosed the food tray just prior to the presentation of food. In Timberlake’s view, the F-N condition resembles the predatory subsystem: The general search mode activated attention to the distal far lever, then the focal search mode activated attention to the proximal near lever. But what about the N-F condition? Silva et al. found that the N-F animals attended first to the near lever and then to the food tray, but showed no response to the far lever. The second (far) lever led the rat away from food; because activation of the focal search mode by the first near lever focused attention toward, not away from, food, no conditioning occurred to the distal far lever. In fact, rats in the N-F condition spent more time nosing the first near lever than did rats in the F-N condition.

One important feature of the behavior systems approach is that it allows for variations in learning between species (Timberlake, 2001). In Timberlake’s view, different species of animals learn a particular behavior at different rates. Different species also learn different ways of responding to a particular situation. Considerable variation also occurs within a species. Timberlake suggested that some behaviors are learned more rapidly than others within a given species. Further, there may be different rates of learning of a particular behavior between different members of a species.

What causes these variations between and within animal species? Timberlake (2001) proposed that they are due to either predispositions or constraints on what an animal or a person can learn. A predisposition refers to instances in which an animal learns more rapidly, or in a different form, than expected. Predispositions occur when environmental circumstances easily modify the instinctive behavior system that the animal brings into the situation. Variations in learning can also reflect the impact of a constraint on learning; a constraint occurs when an animal learns less rapidly or less completely than expected. Constraints on learning occur when environmental circumstances are not suited to the animal’s instinctive behavior system. In the next five sections, we will examine examples of predispositions and constraints on learning.

Animal Misbehavior

Keller Breland and Marian Breland (1961) wanted to see if operant procedures could be used to teach exotic behaviors to animals. They trained 38 species, including reindeer, cockatoos, raccoons, porpoises, and whales, at Animal Behavior Enterprises in Hot Springs, Arkansas. In fact, they trained over 6,000 animals to emit a wide range of behaviors, including teaching hens to play a five-note tune on a piano and perform a “tap dance,” pigs to turn on a radio and eat breakfast at a table, chicks to run up an inclined platform and slide off, a calf to answer questions in a quiz show by lighting either a “yes” or “no” sign, and two turkeys to play hockey. These exotic behaviors have been on display at municipal zoos and museums of natural history, in department store displays, at fairs and trade convention exhibits, at tourist attractions, and on television. These demonstrations have not only provided entertainment for millions of people but have also documented the power and generality of the operant conditioning procedures Skinner described.

Although K. Breland and M. Breland (1961, 1966) were able to condition a wide variety of exotic behaviors using operant conditioning, they noted that some operant responses, although initially performed effectively, deteriorated with continued training despite repeated food reinforcements. According to K. Breland and M. Breland, the elicitation of instinctive food-foraging and food-handling behaviors by the presentation of food caused the decline in the effectiveness of an operant response reinforced by food. These instinctive behaviors, strengthened by food reinforcement, eventually dominated the operant behavior. They called the deterioration of an operant behavior with continued reinforcement instinctive drift, and the instinctive behavior that prevented the continued effectiveness of the operant response animal misbehavior. One example of animal misbehavior is described next.

K. Breland and M. Breland (1961, 1966) tried to get pigs to pick up a large wooden coin and deposit it in a piggy bank several feet away. Depositing four or five coins was required to receive one food reinforcement. According to the Brelands, “Pigs condition very rapidly, they have no trouble taking ratios, they have ravenous appetites (naturally), and in many ways are the most trainable animals we have worked with.” However, each pig exhibited an interesting pattern of behavior following conditioning. At first, the pigs picked up a coin, carried it rapidly to the bank, deposited it, and readily returned for another coin. However, over a period of weeks, the pigs’ operant behavior became slower and slower. Each pig still rapidly approached the coin, but rather than carry it immediately over to the bank, the pigs “would repeatedly drop it, root it, drop it again, root it along the way, pick it up, toss it up in the air, drop it, root it some more, and so on.”

Why did the pigs’ operant behavior deteriorate? According to K. Breland and M. Breland (1961, 1966), the pigs merely exhibited the instinctive behaviors associated with eating. The presentation of food not only reinforces the operant response, but it also elicits instinctive food-related behaviors. The reinforcement of these instinctive food-gathering and food-handling behaviors strengthens the instinctive behaviors, which results in the deterioration of the pigs’ operant responses (depositing the coin in the bank). The more dominant the instinctive food-related behaviors become, the longer it takes for the operant response to occur. The slow deterioration of the operant depositing response provides support for their instinctive drift view of animal misbehavior.

K. Breland and M. Breland (1961, 1966) observed many other instances of animal misbehavior. They found hamsters that stopped responding in a glass case, porpoises and whales that swallowed balls or inner tubes instead of playing with them to receive reinforcement, cats that refused to leave the area around the food dispenser, and rabbits that refused to approach their feeder. They also reported extreme difficulty in conditioning many bird species to vocalize to obtain food reinforcement. In each case of animal misbehavior, they suggested that instinctive food-seeking behavior prevented the continued high performance level of the operant response required to receive reinforcement. These findings suggest that the effectiveness of food reinforcement in establishing an operant behavior is limited.

Boakes, Poli, Lockwood, and Goodall (1978) established a procedure for producing animal misbehavior in a laboratory. These researchers trained rats to press a flap to obtain a ball bearing and to deposit it in a chute to obtain food reinforcement. They reported that although all the rats initially released the ball bearing readily, the majority of the animals became reluctant to let go of the ball bearing after several training sessions. These rats repeatedly mouthed, pawed, and retrieved the ball bearing before finally depositing it in the chute.

K. Breland and M. Breland (1961, 1966) suggested that the elicitation and strengthening of instinctive food-related behaviors during operant conditioning is responsible for animal misbehavior. Boakes et al. (1978) proposed another explanation. In their view, animal misbehavior is produced by Pavlovian conditioning rather than by operant conditioning. The association of environmental events with food during conditioning causes these environmental events to elicit species-typical foraging and food-handling behaviors; these behaviors then compete with the operant behavior. Consider the misbehavior of the pig detailed earlier. According to K. Breland and M. Breland, the pigs rooted the tokens because the reinforcement presented during operant conditioning produced and strengthened the instinctive food-related behavior; in contrast, Boakes et al. suggested that the association of the token with food caused the token to elicit the rooting behavior.

Timberlake, Wahl, and King (1982) conducted a series of studies to evaluate the validity of each view of animal misbehavior. The results of the experiments Timberlake et al. conducted show that both operant and Pavlovian conditioning contribute to producing animal misbehavior. In their appetitive structure view, pairing food with the natural cues controlling food-gathering activities produces species-typical foraging and food-handling behaviors. The instinctive food-gathering behaviors must be reinforced if misbehavior is to dominate the operant behavior. Animal misbehavior does not occur in most operant conditioning situations because (a) the cues present during conditioning do not resemble the natural cues eliciting instinctive foraging and food-handling behaviors, and (b) these instinctive behaviors are not reinforced.

Timberlake et al. (1982) used the ball bearing procedure that Boakes et al. (1978) had developed to validate their appetitive structure view of animal misbehavior. Recall that rats in this situation repeatedly mouth, paw, and retrieve the ball bearing before releasing it down the chute to obtain food reinforcement. In Experiment 1, they assessed the contribution of Pavlovian conditioning to animal misbehavior by pairing the ball bearing with food in experimental subjects. Experimental treatment animals received food after the ball bearing had rolled out of the chamber. This study also used two control conditions to evaluate the importance of pairing the ball bearing and food: Animals in one control condition were given random pairings of the ball bearing and food (random group); subjects in the second control condition received only the ball bearing (CS-only group). They reported that experimental group animals exhibited a significant amount of misbehavior toward the ball bearing: they touched it, carried it about the cage, placed it in their mouths, and bit it while holding it in their forepaws. By contrast, infrequent misbehavior occurred in animals in the two control groups. These observations indicate that the ball bearing and food must be presented together for a high level of misbehavior to occur.

The pairing of the ball bearing with food is necessary but not sufficient for the development of misbehavior. Timberlake et al. (1982) asserted that food presentation must also reinforce the misbehavior for misbehavior to dominate operant responding. Experiments 3 and 4 evaluated the importance of operant conditioning to the establishment of animal misbehavior. In Experiment 3, contact with the ball bearing caused food to be omitted. If reinforcement of contact with the ball bearing is necessary for the dominance of animal misbehavior, a contingency in which contact with the ball bearing prevents reinforcement should lead to an absence of animal misbehavior. The results of Experiment 3 showed that when contact with the ball bearing prevented reinforcement, the animals indeed came to exhibit no contact with the ball bearing. In Experiment 4, contact with the ball bearing was reinforced: If the animal did not touch the ball bearing, it received no food on that trial. Timberlake et al. reported that reinforcement of contact with the ball bearing produced a rapid increase in the level of animal misbehavior. These studies suggest that for misbehavior to develop, stimuli (e.g., ball bearings) resembling the natural cues controlling food-gathering activities must be consistently paired with food (Pavlovian conditioning), and the presentation of food must reinforce the species-typical foraging and food-handling behaviors elicited by those cues (operant conditioning).

Schedule-Induced Behavior

B. F. Skinner (1948) described an interesting pattern of behavior that pigeons exhibited when food was delivered at regular intervals, independent of the birds’ behavior. When food reinforcement was delivered to the pigeons every 15 seconds, they developed a “ritualistic,” stereotyped pattern of behavior during the interval. The pattern of behavior differed from bird to bird—some walked in circles between food presentations; others scratched the floor; still others moved their heads back and forth. Once a particular pattern of behavior emerged, the pigeons repeatedly exhibited it, with the frequency of the behavior increasing as the birds received more reinforcement. Skinner referred to the behaviors of his pigeons on the interval schedule as examples of superstitious behavior.
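Skinner attributed these rituals to accidental (adventitious) reinforcement: whatever a bird happened to be doing when food arrived was strengthened, making that same behavior more likely to precede the next delivery. The following is a minimal sketch of that hypothesized feedback loop, not a model of Skinner’s actual procedure; the behavior labels and the strengthening increment are arbitrary assumptions chosen for illustration.

```python
import random

def simulate_adventitious_reinforcement(n_deliveries=200, seed=1):
    """Toy sketch of the accidental-reinforcement account of superstition.

    Food arrives on a response-independent schedule; whichever behavior
    happens to precede a delivery has its selection weight increased,
    so one arbitrary behavior tends to come to dominate over time.
    """
    rng = random.Random(seed)
    behaviors = ["circle", "scratch_floor", "head_bob"]  # arbitrary labels
    weights = {b: 1.0 for b in behaviors}  # equal starting tendencies

    for _ in range(n_deliveries):
        # Behavior emitted just before food, sampled by current weights
        emitted = rng.choices(behaviors, [weights[b] for b in behaviors])[0]
        # Response-independent food "reinforces" whatever just occurred
        weights[emitted] += 0.5  # arbitrary strengthening increment

    return weights

final_weights = simulate_adventitious_reinforcement()
```

Because the update is self-reinforcing, repeated runs typically end with one arbitrary behavior holding most of the weight, echoing the way each pigeon settled on its own idiosyncratic ritual. Note that this sketch illustrates only the operant account, which, as discussed next, Staddon and Simmelhag later questioned.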

Why do animals exhibit superstitious behavior? One reasonable explanation suggests that animals associate the superstitious behavior with reinforcement, and this association causes them to exhibit high levels of the behavior. Staddon and Simmelhag’s (1971) analysis of superstitious behavior, however, indicated that it is not an example of operant behavior. They identified two types of behavior produced when reinforcement (for example, food) is programmed to occur on a regular basis: terminal behavior and interim behavior. Terminal behavior occurs during the last few seconds of the interval between reinforcer presentations, and it is reinforcer oriented; pigeons pecking on or near the food hopper that delivers food is an example of terminal behavior. Interim behavior, by contrast, is not reinforcer oriented. Although contiguity influences the development of terminal behavior, interim behavior does not occur contiguously with reinforcement: Terminal behavior intervenes between interim behavior and reinforcement, although it does not interfere with the exhibition of interim behavior.

Staddon and Simmelhag (1971) suggested that terminal behavior occurs in stimulus situations that are highly predictive of the occurrence of reinforcement—that is, terminal behavior is typically emitted just prior to reinforcement on a fixed-interval schedule. By contrast, interim behavior occurs during stimulus conditions that have a low probability of the occurrence of reinforcement—that is, interim behavior is observed most frequently in the period following reinforcement.

The strange superstitious behavior that Skinner initially described is only one example of interim behavior. Animals exhibit a wide variety of other behaviors (e.g., drinking, running, grooming, nest building, aggression) when reinforcement occurs regularly. When interval schedules of reinforcement elicit high levels of interim behavior, the behavior is referred to as schedule-induced behavior.

Schedule-Induced Polydipsia

The most extensively studied form of schedule-induced behavior is the excessive intake of water (polydipsia) when animals are reinforced with food on a fixed-interval schedule. John Falk (1961) conducted a study to observe schedule-induced polydipsia. He deprived rats of food until their body weight was approximately 70 to 80 percent of their initial weight and then trained them to bar press for food reinforcement. When water was available in the operant chamber, Falk found that the rats consumed excessive amounts of water. Even though the rats were not water deprived, they drank large amounts of water; in fact, under certain conditions, animals provided food reinforcement on an interval schedule will consume as much as one half of their body weight in water in a few hours. Neither water deprivation, heat stress, nor the consumption of a comparable amount of food in a single meal produces this level of excessive drinking. Apparently, some important aspect of providing food on an interval schedule elicits excessive drinking.

Is schedule-induced polydipsia an example of interim behavior? Recall Staddon and Simmelhag’s (1971) definition—interim behavior occurs in stimulus situations that have a low probability of reinforcement occurrence. Schedule-induced drinking does fit their definition: Animals reinforced on an interval schedule typically drink during the period following food consumption. In contrast, drinking usually does not occur in the period that precedes the availability of food reinforcement.

Schedule-induced polydipsia has been consistently observed in rats given food on an interval schedule (Wetherington, 1982). A variety of different interval schedules of reinforcement have been found to produce polydipsia. Falk (1961) observed polydipsia in rats on a fixed-interval schedule, and Jacquet (1972) observed polydipsia on a variety of compound schedules of reinforcement. Schedule-induced polydipsia also is found in species other than rats. Shanab and Peterson (1969) reported schedule-induced polydipsia in pigeons, and Schuster and Woods (1966) observed it in primates.

Other Schedule-Induced Behaviors

Several other instinctive behaviors are observed in animals on interval schedules. A number of psychologists (King, 1974; Staddon & Ayres, 1975) have reported that interval schedules of reinforcement produce high levels of wheel running. Schedule-induced wheel running was observed using both food and water reinforcement. Levitsky and G. Collier (1968) found that the highest rate of wheel running occurs in the time immediately following reinforcement and then decreases as the time for the next reinforcement nears, while Staddon and Ayres reported that as the interreinforcement interval increases, the intensity of wheel running initially increases and then declines.

Animals receiving reinforcement on an interval schedule will attack an appropriate target of aggressive behavior. Cohen and Looney (1973) reported that pigeons will attack another bird or a stuffed model of a bird present during key pecking for reinforcement on an interval schedule. Similar schedule-induced aggression appears in squirrel monkeys (Hutchinson, Azrin, & Hunt, 1968) and rats (Knutson & Kleinknecht, 1970). Knutson and Kleinknecht reported that the greatest intensity of aggressive behavior occurred in the immediate postreinforcement period.

Flavor-Aversion Learning

Contiguity plays a critical role in the acquisition of a conditioned response: Little conditioning occurs if the CS precedes the UCS by several seconds or minutes. However, animals will develop aversions to taste cues even when the taste stimulus precedes illness by several hours (see also Chapter 35). This indicates that, unlike other conditioned responses, a flavor aversion does not depend on contiguity. The association of a flavor with illness is often referred to as long-delay learning; this term suggests a difference between flavor-aversion learning and other examples of classical conditioning.

Some stimuli are more likely than others to become associated with a particular UCS. Garcia and Koelling’s (1966) classic study shows that a taste is more salient when preceding illness than when preceding shock, whereas a light or tone is more salient when preceding shock than when preceding illness. In Garcia and Koelling’s study, rats were exposed to either a saccharin taste cue or a light-and-tone compound stimulus. Following exposure to one of these cues, animals received either an electric shock or irradiation-induced illness. Animals exhibited an aversion to saccharin when it was paired with illness but not when it was paired with shock. In addition, they developed a fear of the light-and-tone stimulus when it was paired with shock but not when paired with illness.

On the basis of the Garcia and Koelling study, Seligman (1970) proposed that rats have an evolutionary preparedness to associate tastes with illness. Further support for this view is the observation that adult rats acquire an intense aversion to a flavor after a single taste-illness pairing. Young animals also acquire a strong aversion after one pairing (Klein, Domato, Hallstead, Stephens, & Mikulka, 1975). Apparently, taste cues are very salient in terms of their associability with illness.

Seligman also suggested that rats are contraprepared to become afraid of a light or tone paired with illness. However, other research (Klein, Freda, & Mikulka, 1985) indicates that rats can associate an environmental cue with illness. Klein et al. found that rats avoided a distinctive black compartment previously paired with an illness-inducing apomorphine or lithium chloride injection, and Revusky and Parker (1976) observed that rats would not eat out of a container that had been paired with illness induced by lithium chloride. Although animals can acquire environmental aversions, more trials and more careful training procedures are necessary to establish an environmental aversion than a flavor aversion (Riccio & Haroutunian, 1977).

Although rats form flavor aversions more readily than environmental aversions, other species do not show this pattern of stimulus salience. Unlike rats, birds acquire visual aversions more rapidly than taste aversions. Wilcoxon, Dragoin, and Kral (1971) induced illness in quail that had consumed sour blue water. They reported that an aversion formed to the blue color, but not to the sour taste. In the same vein, Capretta (1961) found greater salience of visual cues than taste cues in chickens.

Why are visual cues more salient than taste stimuli in birds? According to Garcia, Hankins, and Rusiniak (1974), this salience hierarchy is adaptive. Because the seeds birds eat are covered by a hard, flavorless shell, birds must use visual cues to assess whether food is poisonous; thus, visual cues enable birds to avoid consuming poisonous seeds. Although this view seems reasonable, it does not appear to be completely accurate. Braveman (1974, 1975) suggested that the characteristic feeding time of a particular species determines the relative salience of stimuli becoming associated with illness. Rats, which are nocturnal animals, locate their food at night and therefore rely less on visual information than on gustatory information to identify poisoned food. By contrast, birds search for their food during the day, and visual information plays an important role in controlling their food intake. Braveman evaluated this view by examining the salience hierarchies of guinea pigs, which, like birds, seek their food during the day. He found visual cues to be more salient than taste stimuli for guinea pigs.

Imprinting

Infant Love

You have undoubtedly seen young ducks swimming behind their mother in a lake. What process is responsible for the young birds’ attachment to their mother? Lorenz (1952/1957) investigated this social attachment process, calling it imprinting. Lorenz found that a newly hatched bird approaches, follows, and forms a social attachment to the first moving object it encounters. Although typically the first object that the young bird sees is its mother, birds imprint to many different and sometimes peculiar objects. In a classic demonstration of imprinting, newly hatched goslings imprinted to Lorenz and thereafter followed him everywhere. Birds have imprinted to colored boxes and other inanimate objects as well as to animals of different species. After imprinting, the young animal prefers the imprinted object to its real mother; this shows the strength of imprinting.

Although animals have imprinted to a wide variety of objects, certain characteristics of the object affect the likelihood of imprinting. P. H. Klopfer (1971) found that ducklings imprinted more readily to a moving object than to a stationary object. Also, ducks are more likely to imprint to an object that (a) makes “lifelike” rather than “gliding” movements (Fabricius, 1951); (b) vocalizes rather than remains silent (N. E. Collias & E. C. Collias, 1956); (c) emits short rhythmic sounds rather than long high-pitched sounds (Weidman, 1956); and (d) measures about 10 cm in diameter (Schulman, Hale, & Graves, 1970).

Harry Harlow (1971) observed that baby primates readily became attached to a soft terry cloth surrogate mother but developed no attachment to a wire mother. Harlow and Suomi (1970) found that infant monkeys preferred a terry cloth mother to a rayon, vinyl, or sandpaper surrogate; liked clinging to a rocking mother rather than to a stationary one; and chose a physically warm mother over a cold one. Mary Ainsworth and her associates (Ainsworth, 1982; Blehar, Lieberman, & Ainsworth, 1977) reported that human infants also need a warm, responsive mother for social attachment: Infants formed strong attachments to mothers who were responsive and sensitive to their needs, whereas infants showed little attachment to anxious or indifferent mothers.

Age plays an important role in the imprinting process. Not only does imprinting occur readily during certain sensitive periods, but it is also less likely to occur once the sensitive period has passed. Jaynes (1956) exposed newly hatched New Hampshire chicks to cardboard cubes at different times. He reported that five-sixths of the chicks imprinted when exposed 1 to 6 hours after hatching. However, only five-sevenths of the chicks met the criterion for imprinting when exposed to the cardboard cube 6 to 12 hours after hatching. The proportion declined to three-fifths at 24 to 30 hours, two-fifths at 30 to 36 hours, and only one-fifth at 48 to 54 hours. The sensitive period for social attachment differs between species: In sheep and goats, it is 2 to 3 hours after birth (P. H. Klopfer, Adams, & M. S. Klopfer, 1964); in primates, 3 to 6 months; and in humans, 6 to 12 months (Harlow, 1971).

However, the sensitive period merely reflects a lesser degree of difficulty in forming an attachment; given sufficient experience, imprinting will occur even after the sensitive period has lapsed. Brown (1975) trained ducklings ranging in age from 20 to 120 hours to follow an object. He found that although the older the duckling, the longer the training required for imprinting to occur, all ducklings given sufficient training showed an equal degree of attachment to the imprinted object.

Imprinting differs from other forms of associative learning (Davey, 1989). An animal's response to an imprinting object is less susceptible to change than its reaction to events acquired through conventional associative learning. Conditioned stimuli that elicit salivation quickly extinguish when food is discontinued; similarly, the absence of shock produces rapid extinction of fear. By contrast, eliminating reinforcement does not typically lead to a loss of reaction to an imprinting object. Hess (1962, 1964) observed that once a three- to four-day-old chick developed a food preference for a less-preferred object, the preference remained despite the subsequent lack of food reinforcement when the chick pecked at the object.

Although punishment quickly alters an animal's response to a conditioned stimulus, animals seem insensitive to punishment from an imprinting object. Kovach and Hess (1963) found that chicks approached the imprinting object despite receiving electric shock from it. Harlow's (1971) research shows how powerful the infant primate's social attachment to its surrogate mother is. Harlow constructed four abusive "monster mothers": One rocked violently from time to time; a second blasted air in the infant's face. Infant primates clung to these mothers even as they were abused. The other two monster mothers were even more abusive: One tossed the infant off her body, and the other extended brass spikes as the infant approached. Although the infants could not cling to these mothers continuously, they resumed clinging as soon as the abuse stopped. Harlow's observations are consistent with reports of many abused children, who typically wish to return to their abusive parents.

Sexual Preference

Lorenz (1957) reported an interesting behavior in one of his male jackdaws. The bird attempted courtship feeding with him: It finely minced worms, mixed them with saliva, and attempted to place the worms in Lorenz’s mouth. When Lorenz did not open his mouth, he got an earful of worm pulp. Lorenz suggested that the male jackdaw had sexually imprinted to him.

The sexual preference of many birds is established during a sensitive period (Eibl-Eibesfeldt, 1970; Lorenz, 1957). Moreover, the preference does not have to be for the bird's own species; a sexual preference for another species can be established if exposure occurs during the sensitive period. Because sexual preference develops in immature birds, when copulation is impossible, its establishment does not depend on sexual reinforcement. Further, an imprinted bird's sexual preference is not modified even by subsequent sexual experience with birds of another species. Perhaps this type of sexual imprinting contributes to the development and persistence of human sexual preferences.

Food Preference

Hess (1962, 1964) suggested that an animal's experience with food during a sensitive period of development results in the establishment of a food preference. This preference can develop for a typically nonpreferred food and, once established, is permanent. Chicks innately prefer to peck at a white circle on a blue background rather than at a white triangle on a green background. Hess gave different groups of chicks experience with the less-preferred stimulus at various ages. The chicks developed a strong preference for the white-triangle-on-green-background stimulus if they experienced it on days three and four after hatching; the preference did not develop if the experience occurred on days one, two, seven, or nine after hatching. These observations indicate that the sensitive period for the establishment of food preference in chicks is three to four days after hatching. Hess suggested that this period is critical because three-day-old chicks no longer draw nutrients from the yolk sac and can peck with maximum accuracy.

Humans differ considerably in their food preferences (Rozin & Zellner, 1985). These preferences may to some degree reflect experience with a specific food during the sensitive period of development. People typically prefer familiar foods, which suggests an imprinting influence in food preference. Food aversions in humans may also be sensitive to a developmental stage; Garb and Stunkard’s (1974) observation that people are most apt to develop food aversions between the ages of 6 and 12 years provides additional evidence that imprinting affects the establishment of food preferences and aversions.

Avoidance Of Aversive Events

Species-Specific Defense Reactions

Bolles (1970, 1978) suggested that animals have species-specific defense reactions (SSDR) that allow them to avoid dangerous events. According to Bolles, animals have little opportunity to learn to avoid danger: They either possess an instinctive means of keeping out of trouble or they perish. For example, a deer does not have time to learn to avoid its predator. Unless the deer possesses an instinct for avoiding predators, it will probably wind up as a predator’s meal.

The instinctive responses that enable animals to avoid aversive events differ across species. An animal's evolutionary history determines which behaviors become SSDRs: Responses that enable animals to avoid aversive events remain in the genetic program, whereas nonadaptive responses are not passed on to future generations. Bolles (1970, 1978) proposed that animals experiencing danger narrow their response repertoire to those behaviors they expect will eliminate the danger. Because evolution has proved SSDRs effective, and other behaviors are likely to fail, responses other than SSDRs would probably be nonadaptive. Thus, animals limit their reactions to SSDRs as they attempt to avoid danger.

Rats employ three species-specific defense reactions: running, freezing, and fighting. They attempt to run from a distant danger; a close danger motivates freezing. When these two responses fail, rats fight to avoid the aversive event. Other animals employ different instinctive responses to avoid danger: The mouse, as Bolles (1970, 1978) suggested in a quote from Robert Burns's "To a Mouse," is "a wee timorous beastie" when experiencing danger, because behaving timorously is the only way this small and relatively defenseless animal can avoid danger. A bird, by contrast, simply flies away.

Bolles and A. C. Collier (1976) demonstrated that the cues that predict danger not only motivate defensive behavior but also determine which response rats will exhibit when they expect danger. Rats received shock in a square or a rectangular box. After they experienced the shock, the rats either remained in the dangerous environment or were placed in another box where no shocks were given. Defensive behavior occurred only when the rats remained in the compartment where they had previously been shocked. Bolles and A. C. Collier also found that a dangerous square compartment produced a freezing response, while a dangerous rectangular box caused the rats to run. These results suggest that the particular SSDR produced depends on the nature of the dangerous environment.

Animals easily learn to avoid an aversive event when they can use an SSDR. Rats readily learn to run to avoid being shocked; similarly, pigeons easily learn to avoid shock by flying from perch to perch. By contrast, animals have difficulty learning to avoid an aversive event when they must emit a behavior other than an SSDR. D'Amato and Schiff (1964) attempted to train rats to bar press to avoid electric shock; they reported that over half of their rats failed to learn the avoidance response even after participating in more than 7,000 trials over a four-month period.

Bolles (1969) provided additional evidence of the importance of instinct in avoidance learning. He reported that rats quickly learned to run in an activity wheel to avoid electric shock but showed no evidence of learning when required to stand on their hind legs to avoid shock. Although the rats stood on their hind legs in an attempt to escape from the compartment where they were shocked, they did not learn the same behavior as an avoidance response. Bolles suggested that a rat's natural response in a small compartment is to freeze, and this innate SSDR prevented the rats from learning a non-species-specific defense reaction as the avoidance behavior.

Predispositions and Avoidance Learning

Bolles (1978) suggested that predispositions are responsible for the development of avoidance learning. In his view, aversive events elicit instinctive species-specific defense reactions, and the environment present during aversive events becomes able to produce these instinctive defensive reactions as conditioned responses (CRs). Whereas instinctive CRs to cues associated with reinforcement elicit the approach and contact behavior that enables an animal to obtain reinforcement, stimuli associated with aversive events produce the instinctive defensive responses that allow the animal to avoid those events.

Bolles (1978) suggested that Pavlovian conditioning, rather than operant conditioning, determines whether avoidance learning occurs: Avoidance behavior develops because environmental stimuli become associated with aversive events, not because the behavior is reinforced. Bolles and Riley's (1973) study shows that reinforcement is not responsible for the rapid acquisition of avoidance behavior. In their study, some animals could avoid being shocked by freezing; after only a few minutes of training, these animals were freezing most of the time. Two additional groups were included: One group was punished for freezing and could avoid shock by not freezing; the other was shocked regardless of its behavior. Bolles and Riley observed that the rats punished for freezing still froze much of the time; indeed, they froze as much as the rats shocked regardless of their behavior. Bolles suggested that when an animal in a small confined area anticipates an aversive event, it freezes. The animals punished for freezing would have frozen all the time had frequent shocks not disrupted their freezing; as soon as each shock ended, anticipation of the next elicited the instinctive freezing response again. Thus, apart from the shock-induced disruption of freezing in the punished animals, animals either reinforced or punished for freezing showed equivalent levels of freezing. These results suggest that the contingency between the freezing response and the aversive event did not affect the animals' behavior.

References:

  1. Ainsworth, M. D. S. (1982). Attachment: Retrospect and prospect. In C. M. Parkes & J. Stevenson-Hinde (Eds.), The place of attachment in human behavior. New York: Basic Books.
  2. Blehar, M. C., Lieberman, A. F., & Ainsworth, M. D. S. (1977). Early face-to-face interaction and its relation to later infant-mother attachment. Child Development, 48, 182-194.
  3. Boakes, R. A., Poli, M., Lockwood, M. J., & Goodall, G. (1978). A study of misbehavior: Token reinforcement in the rat. Journal of the Experimental Analysis of Behavior, 29, 115-134.
  4. Bolles, R. C. (1970). Species-specific defense reactions and avoidance learning. Psychological Review, 77, 32-48.
  5. Bolles, R. C. (1978). The role of stimulus learning in defensive behavior. In S. H. Hulse, H. Fowler, & W. K. Honig (Eds.), Cognitive processes in animal behavior (pp. 89-108). Hillsdale, NJ: Erlbaum.
  6. Bolles, R. C., & Collier, A. C. (1976). The effect of predictive cues on freezing in rats. Animal Learning and Behavior, 4, 6-8.
  7. Bolles, R. C., & Riley, A. (1973). Freezing as an avoidance response: Another look at the operant-respondent distinction. Learning and Motivation, 4, 268-275.
  8. Braveman, N. S. (1974). Poison-based avoidance learning with flavored or colored water in guinea pigs. Learning and Motivation, 5, 182-194.
  9. Braveman, N. S. (1975). Formation of taste aversions in rats following prior exposure to sickness. Learning and Motivation, 6, 512-534.
  10. Breland, K., & Breland, M. (1961). The misbehavior of organisms. American Psychologist, 61, 681-684.
  11. Breland, K., & Breland, M. (1966). Animal behavior. New York: Macmillan.
  12. Brown, J. L. (1975). The evolution of behavior. New York: Norton.
  13. Capretta, P. J. (1961). An experimental modification of food preference in chickens. Journal of Comparative and Physiological Psychology, 54, 238-242.
  14. Cohen, P. S., & Looney, T. A. (1973). Schedule-induced mirror responding in the pigeon. Journal of the Experimental Analysis of Behavior, 19, 395-408.
  15. Collias, N. E., & Collias, E. C. (1956). Some mechanisms of family integration in ducks. Auk, 73, 378-400.
  16. D’Amato, M. R., & Schiff, E. (1964). Further studies of overlearning and position reversal learning. Psychological Reports, 14, 380-382.
  17. Davey, G. (1989). Ecological learning theory. Florence, KY: Taylor & Francis/Routledge.
  18. Eibl-Eibesfeldt, I. (1970). Ethology: The biology of behavior. New York: Holt.
  19. Fabricius, E. (1951). Zur Ethologie Junger Anatiden. Acta Zoologica Fennica, 68, 1-175.
  20. Falk, J. L. (1961). Production of polydipsia in normal rats by an intermittent food schedule. Science, 133, 195-196.
  21. Garcia, J., & Garcia y Robertson, R. (1985). Evolution of learning mechanisms. In B. L. Hammonds (Ed.), Psychology and learning. Washington, DC: American Psychological Association.
  22. Garcia, J., Hankins, W. G., & Rusiniak, K. W. (1974). Behavioral regulation of the milieu interne in man and rat. Science, 185, 824-831.
  23. Garcia, J., Kimeldorf, D. J., & Koelling, R. A. (1955). Conditioned aversion to saccharin resulting from exposure to gamma radiation. Science, 122, 157-158.
  24. Garcia, J., & Koelling, R. A. (1966). The relation of cue to consequence in avoidance learning. Psychonomic Science, 4, 123-124.
  25. Harlow, H. F. (1971). Learning to love. San Francisco: Albion.
  26. Harlow, H. F., & Suomi, S. J. (1970). Nature of love—Simplified. American Psychologist, 25, 161-168.
  27. Hess, E. H. (1962). Ethology: An approach toward the complete analysis of behavior. In R. Brown, E. Galanter, E. H. Hess, & G. Mandler (Eds.), New directions in psychology (pp. 157-266). New York: Holt.
  28. Hess, E. H. (1964). Imprinting in birds. Science, 146, 1128-1139.
  29. Hutchinson, R. R., Azrin, N. H., & Hunt, G. M. (1968). Attack produced by intermittent reinforcement of a concurrent operant response. Journal of the Experimental Analysis of Behavior, 11, 489-495.
  30. Jacquet, Y. F. (1972). Schedule-induced licking during multiple schedules. Journal of the Experimental Analysis of Behavior, 17, 413-423.
  31. Jaynes, J. (1956). Imprinting: The interaction of learned and innate behavior: I. Development and generalization. Journal of Comparative and Physiological Psychology, 49, 201-206.
  32. King, G. D. (1974). Wheel running in the rat induced by a fixed-time presentation of water. Animal Learning and Behavior, 2, 325-328.
  33. Klein, S. B., Domato, G. C., Hallstead, C., Stephens, I., & Mikulka, P. J. (1975). Acquisition of a conditioned aversion as a function of age and measurement technique. Physiological Psychology, 3, 379-384.
  34. Klein, S. B., Freda, J. S., & Mikulka, P. J. (1985). The influence of a taste cue on an environmental aversion: Potentiation or overshadowing? Psychological Record, 35, 101-112.
  35. Klopfer, P. H. (1971). Imprinting: Determining its perceptual basis in ducklings. Journal of Comparative and Physiological Psychology, 75, 378-385.
  36. Klopfer, P. H., Adams, D. K., & Klopfer, M. S. (1964). Maternal "imprinting" in goats. Proceedings of the National Academy of Sciences, 52, 911-914.
  37. Knutson, J. F., & Kleinknecht, R. A. (1970). Attack during differential reinforcement of low rate of responding. Psychonomic Science, 19, 289-290.
  38. Kovach, J. K., & Hess, E. H. (1963). Imprinting: Effects of painful stimulation upon the following response. Journal of Comparative and Physiological Psychology, 56, 461-464.
  39. Levitsky, D., & Collier, G. (1968). Schedule-induced wheel running. Physiology and Behavior, 3, 571-573.
  40. Lorenz, K. (1957). The past twelve years in the comparative study of behavior. In C. H. Schiller (Ed.), Instinctive behavior (pp. 288-317). New York: International Universities Press. (Original work published 1952)
  41. Pavlov, I. (1928). Lectures on conditioned reflexes: Vol. 1: The higher nervous activity of animals (H. Gantt, Trans.). London: Lawrence and Wishart.
  42. Revusky, S., & Parker, L. A. (1976). Aversions to drinking out of a cup and to unflavored water produced by delayed sickness. Journal of Experimental Psychology: Animal Behavior Processes, 2, 342-353.
  43. Riccio, D. C., & Haroutunian, V. (1977). Failure to learn in a taste-aversion paradigm: Associative or performance deficit? Bulletin of the Psychonomic Society, 10, 219-222.
  44. Schulman, A. H., Hale, E. B., & Graves, H. B. (1970). Visual stimulus characteristics of initial approach response in chicks (Gallus domesticus). Animal Behavior, 18, 461-466.
  45. Seligman, M. E. P. (1970). On the generality of laws of learning. Psychological Review, 77, 406-418.
  46. Shanab, M. E., & Peterson, J. L. (1969). Polydipsia in the pigeon. Psychonomic Science, 15, 51-52.
  47. Silva, F. J., Timberlake, W., & Gont, R. S. (1998). Spatiotemporal characteristics of serial CSs and their relation to search modes and response form. Animal Learning and Behavior, 26, 299-312.
  48. Skinner, B. F. (1938). The behavior of organisms: An experimental analysis. New York: Appleton-Century-Crofts.
  49. Skinner, B. F. (1948). Superstition in the pigeon. Journal of Experimental Psychology, 38, 168-172.
  50. Staddon, J. E. R., & Ayres, S. L. (1975). Sequential and temporal properties of behavior induced by a schedule of periodic food delivery. Behaviour, 54, 26-49.
  51. Staddon, J. E. R., & Simmelhag, V. L. (1971). The “Superstition” experiment: A reexamination of its implications for the principles of adaptive behavior. Psychological Review, 78, 3-43.
  52. Timberlake, W. (2001). Motivated modes in behavior systems. In R. R. Mowrer & S. B. Klein (Eds.), Handbook of contemporary learning theories (pp. 155-209). Mahwah, NJ: Erlbaum.
  53. Timberlake, W., & Lucas, G. A. (1989). Behavior systems and learning: From misbehavior to general principles. In S. B. Klein & R. R. Mowrer (Eds.), Contemporary learning theory: Instrumental conditioning theory and the impact of biological constraints on learning (pp. 237-275). Hillsdale, NJ: Erlbaum.
  54. Timberlake, W., Wahl, G., & King, D. (1982). Stimulus and response contingencies in the misbehavior of rats. Journal of Experimental Psychology: Animal Behavior Processes, 8, 62-85.
  55. Weidman, U. (1956). Some experiments on the following and the flocking reaction of mallard ducklings. British Journal of Animal Behaviour, 4, 78-79.
  56. Wetherington, C. L. (1982). Is adjunctive behavior a third class of behavior? Neuroscience and Biobehavioral Reviews, 6, 329-350.
  57. Wilcoxon, H. C., Dragoin, W. B., & Kral, P. A. (1971). Illness-induced aversions in rats and quail: Relative salience of visual and gustatory cues. Science, 171, 826-828.
