Social Cognition


Social cognition is the branch of social psychology that studies how people think about themselves and other people. It focuses on the steps people take and the conclusions they reach as they strive to make sense of their social environment. The field tends to view people as information processors, something like a computer, who take in information from the outside world, sort that information out and interpret it, calculate a judgment, and then choose a behavior in response. In doing so, the field examines what information people pay attention to, how they analyze it, how they reach judgments based on that information, how those judgments guide their behaviors, and then which parts of the information they remember.

Work in social cognition is wide-ranging and explores a breathtaking diversity of topics. For example, some social cognitive researchers investigate how people develop opinions and attitudes about social issues. Others examine whether people’s judgments of others are distorted by stereotypes. Others look at whether people reach decisions that are wise and rational versus faulty and costly. Others study how people reach impressions of themselves that lead to high versus low self-esteem.

Social cognition principles carry a wide variety of implications for real-world pursuits. They explain, for example, why people make right versus wrong decisions about their health. They suggest the best ways to teach students to remember school material. They explain the pitfalls that prevent people in negotiations from reaching harmonious settlements. They explain how people can discriminate against members of other ethnic or social groups without even knowing it. They describe why and when people make poor decisions about their money.

One can claim that social cognition has always been a featured part of social psychology, even when the rest of psychology has neglected, or even denied, the importance of people’s internal thought processes. In particular, in the early to mid-twentieth century, the bulk of psychology was dominated by the behaviorist tradition, led by B. F. Skinner (1904-1990) and others, which emphasized how organisms reacted to rewards and punishments while studiously avoiding any talk of that organism’s internal psychological world. During this era, many social psychologists squarely examined that internal life, exploring how people developed their attitudes toward social issues, as well as how they formed stereotypes about social groups, or made attributions about the causes of other people’s behavior.

However, in the 1960s, with the advent of the cognitive revolution, things changed dramatically. The mainstream of psychology became fascinated with the organism’s internal life—how that organism perceived, thought about, and remembered the world around it. Cognitive psychologists, in particular, generated many sophisticated and powerful theories describing thought and memory. Social psychologists quickly adopted these theories and methods to explore in finer detail how people strive to comprehend events in their social world. Today, work on social cognition remains a vigorous and prominent branch of social psychology.

Although work in social cognition is too diverse to be captured in a simple catalogue, one can point to dominant ideas and themes that social cognitive research has repeatedly demonstrated.

Schemata: The Building Blocks Of Social Cognition

One prominent theme focuses on the building blocks of people’s thoughts. People carry with them information about individuals, social groups, objects, and events arranged in schemata (singular: schema). A schema is a knowledge structure containing the features and examples associated with a person, group, object, or event. For example, people’s schema of bird usually contains such characteristics as wings, feathers, a beak, and flight, as well as some common examples of birds, such as robin and duck. Usually schemata are described as associative networks, that is, as a web of linked associations. Thus, when the concept of bird comes to mind, these associative links activate the relevant features and examples (e.g., wings, duck) connected to the concept, thus also bringing those notions to mind.
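The associative-network description above can be sketched as a simple data structure. The following is a minimal illustration, not a model from the research literature; the concepts, link weights, and the `activate` helper are all invented for the example.

```python
# A schema sketched as an associative network: concepts are nodes,
# and weighted links record the strength of association between them.
# (All concepts and weights here are made up for illustration.)
schema = {
    "bird":  {"wings": 0.9, "feathers": 0.9, "beak": 0.8,
              "flight": 0.7, "robin": 0.8, "duck": 0.6},
    "robin": {"bird": 0.8, "red breast": 0.6},
}

def activate(network, concept, strength=1.0):
    """Activating a concept spreads activation along its links,
    bringing the associated features and examples to mind too."""
    activation = {concept: strength}
    for neighbor, weight in network.get(concept, {}).items():
        activation[neighbor] = strength * weight
    return activation

# Bringing "bird" to mind also activates wings, feathers, robin, duck.
print(activate(schema, "bird"))
```

The dictionary-of-dictionaries form mirrors the "web of linked associations" in the text: looking up one node immediately exposes everything connected to it.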

Schemata are tremendously helpful for social life. If a friend tells you, for example, that he or she went to a restaurant last night, you can easily surmise, because you possess a special type of schema for an event called a script, that the person looked at a menu, ordered food, ate it, paid the bill, and left a tip. Your friend does not need to specify these details; you already know.

That said, schemata can also be misleading or harmful, especially when people try to recall the past. For example, if someone asks George to remember the words drowsy, bed, pillow, snoring, and nighttime, he will probably remember most of these words. But he will also probably mistakenly recall the word sleep because all those terms listed above are linked to sleep through the associative network (the schema) of this concept. Memory errors prompted by schemata can be quite profound. For example, people witnessing a crime may misremember that the culprit had a gun, a disguise, or unkempt hair if it fits their schema of the event they witnessed. Schemata also explain how stereotypes can distort memory. If, for example, a friend describes a professor as distant, smart, and assertive, one might also mistakenly recall that the friend said the professor was arrogant—if that attribute fits one’s schema of a professor.

The impact of schemata on social judgment, interpretation, and memory has been shown to be profound in a wide array of studies, and there has been a good deal of discussion about the specific form that schemata take. According to the prototype view, schemata consist of the features associated with an object or event (such as wings and feathers to a bird). According to an exemplar view, schemata consist of typical examples of a concept (such as a robin being a typical example of a bird). Research ultimately suggests, however, that schemata tend to be a blend of both features and examples.

Errors And Biases In Social Judgment

One additional prominent theme in social cognitive research scrutinizes the cognitive habits that produce errors and biases in judgment. For example, one such consequential habit is confirmation bias. Research on this bias shows that when people ask a question (e.g., “Is Jerry outgoing?”), they tend to look for information that would confirm the question in the positive (e.g., “He does go to parties”) and not for information that would disconfirm it (e.g., “He said last week he hated talking in front of large groups”). When the opposite question is asked (e.g., “Is Jerry shy?”), people instead search for information that would confirm that reverse hypothesis, leading to very different conclusions.

Confirmation bias can lead to several problems in judgment. For example, it can lead people to be overconfident in their predictions about themselves and others. When people consider a question soliciting a prediction (e.g., “Will I get a good grade in this class?”), they tend to consider information that suggests that the answer is “yes.” This can lead them to overconfidence about the chance that their prediction will prove to be accurate. That is, they may say that they are 90 percent sure their prediction will be correct even when the real chance is closer to 70 percent. Indeed, when people say they are 100 percent certain of their prediction, they tend to be wrong roughly one time out of five. Some researchers have suggested that a valuable habit for avoiding overconfidence is to also ask how an event might go in the opposite direction from that posed in the question (e.g., ask the reasons why one might get a poor grade). This consider-the-opposite strategy has been shown to reduce overconfidence in people’s predictions.

People also suffer from illusory correlations, seeing relationships between variables even when they do not exist. Some illusory correlations are inspired by schemata and stereotypes. For example, in an experiment described in a 1967 article, Loren J. Chapman and Jean P. Chapman showed participants a series of drawings, some of which were purportedly drawn by people suffering from paranoia. Participants in the study tended to conclude that the drawings of paranoid individuals more often than not included people with larger eyes—even when there was no relationship between eye size and mental illness in the drawings they looked over.

Other illusory correlations are inspired by what people find easier to remember. For example, people tend to remember unusual behaviors (e.g., riding a unicycle) performed by rare groups (e.g., Alaskans), as reported by David L. Hamilton and Robert K. Gifford in a 1976 article. Thus, when asked if there is any relation between a rare behavior and a rare group (e.g., do Alaskans participate in odd sports?), people report that such a relationship exists, even if the evidence they have reviewed fails to support this conclusion. Because rare-rare combinations are memorable, they lead to illusory correlations.

People also fall prey to the fundamental attribution error (also known as the correspondence bias), which means that they give too much weight to a person’s personality in explaining, evaluating, and predicting social behavior and too little weight to situational forces. That is, people look primarily to a person’s internal character to explain his or her actions, and not to factors outside the person that could have produced the behavior. This bias most commonly arises when people make attributions for another person’s behavior; that is, they try to identify the causes for why the behavior occurred. For example, if you say that “John stumbled while learning the dance,” people tend to leap to the conclusion that John is clumsy (i.e., something about his internal personality) rather than that the dance was difficult (i.e., something about the outside situation).

Several studies have provided powerful demonstrations of the fundamental attribution error. Consider the classic Milgram experiment, completed in the 1960s, in which Stanley Milgram (1933-1984) demonstrated that a majority of participants, if asked, would continue to shock another participant if an authority figure asked them to—even if the other participant suffered heart problems and had stopped answering, and for all practical purposes might be dead. Almost everyone who hears about the study denies that they would “go all the way,” complying with the experimenter until the session is curtailed. However, up to two-thirds of people in this situation do go all the way. The situation is extremely powerful even though people do not see it, and there are few indicators from a person’s personality that reliably predict whether that person will comply or defy the command to shock another person who has stopped answering.

People also make errors because they rely on quick heuristics to reach their judgments, according to the work of Amos Tversky (1937-1996) and Daniel Kahneman. One such example is the availability heuristic, in which people judge the odds, frequency, or truthfulness of an event based on how quickly examples of it spring to mind. For example, if asked how many seven-letter English words have n as their sixth letter (the form _ _ _ _ _ n _), people tend to say that there are not many. However, if asked how many seven-letter words end in -ing (the form _ _ _ _ i n g), people say quite a few, mostly because such words are easily brought to mind. Of course, every word ending in -ing also has n as its sixth letter, so the first type of word, paradoxically, must be at least as frequent.

People also rely on the representativeness heuristic, in which people judge the odds, frequency, or truthfulness of an event based on how well it matches a schema in their head. For example, suppose you were told that Linda is politically liberal and a philosophy major. Which of the following descriptions do you think is the most likely to be true and which the least: that Linda is a feminist, that she is a bank teller, or that she is both a feminist and a bank teller? Most people rate “feminist” as most likely and “bank teller” as least, although that is necessarily an error. Mathematically, the least likely event must be that Linda is both a feminist and a bank teller. Why? Every case in which Linda is both a feminist and a bank teller is also a case in which she is a bank teller, and there is an added chance that she is a bank teller without being a feminist. Thus, the single description “bank teller” must be at least as probable as the conjunction of the two.

This conjunction fallacy (i.e., rating a combination of two events as more likely than one of its two individual component events) is caused by the representativeness heuristic. People form a schema of Linda and then quickly compare the various events (e.g., is a bank teller) to this stereotype. If the event matches the schema (e.g., is a feminist), it is seen as probable. If it does not (e.g., is a bank teller), it is seen as improbable. However, in using this heuristic, people commonly violate the simple mathematics inherent in the situation, and thus reach conclusions that cannot be right.
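The arithmetic behind the conjunction fallacy can be made concrete. The probabilities below are invented for illustration (they are not data from the Linda study); the point is only that a conjunction can never be more probable than either of its components.

```python
# Hypothetical probabilities for the Linda problem (illustrative only).
p_feminist = 0.80              # P(Linda is a feminist), given her description
p_teller_if_feminist = 0.05    # P(bank teller | feminist)
p_teller_if_not = 0.10         # P(bank teller | not a feminist)

# P(bank teller) by the law of total probability.
p_teller = (p_feminist * p_teller_if_feminist
            + (1 - p_feminist) * p_teller_if_not)   # 0.04 + 0.02 = 0.06

# P(bank teller AND feminist) is just one branch of that sum.
p_both = p_feminist * p_teller_if_feminist          # 0.04

# The conjunction cannot exceed its component event.
assert p_both <= p_teller
print(f"P(teller) = {p_teller:.2f}, P(teller and feminist) = {p_both:.2f}")
```

Whatever numbers are chosen, the conjunction is one term of the sum that makes up the single event, so the inequality always holds; the representativeness heuristic leads people to violate exactly this constraint.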

In relying on these heuristics, people also ignore other valuable information that would lead them to more accurate predictions. For example, people tend to neglect the base rates of events, even though these rates have a large impact on what will happen. Base rates refer to how common an event is. A high base rate means an event is common (e.g., people tend to have ten toes); a low base rate means that an event is rare (e.g., people tend not to have more than ten toes, although some do). The overall base rate of an event is a valuable indicator of whether it will occur in the future, but people, relying on availability and representativeness heuristics, tend not to factor base rates into their judgments and predictions. For example, suppose you know someone who is over seven feet tall and athletic. Is he more likely to be an NBA basketball player or an accountant? Most people quickly predict that this person is an NBA player, because that fits their schema of a professional basketball player (the representativeness heuristic at work), but they should actually predict that he is an accountant, because accountants far outnumber NBA players. That is, the base rate of being an accountant is many times higher than the base rate of being an NBA player. Because being an accountant is the much more common event, it is the event one should predict.
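Bayes' rule makes the base-rate argument concrete. The counts and likelihoods below are rough, invented figures for illustration; the point is that even a likelihood strongly favoring "NBA player" can be swamped by a large enough difference in base rates.

```python
# Illustrative base rates (assumed, order-of-magnitude figures only).
n_nba = 450                      # assumed number of active NBA players
n_accountants = 1_300_000        # assumed number of accountants

# Assumed likelihoods: P(over seven feet and athletic | occupation).
p_tall_given_nba = 0.10          # very tall athletes are common in the NBA
p_tall_given_accountant = 0.0001 # and very rare among accountants

# Bayes' rule, unnormalized: P(occupation | tall) is proportional to
# P(tall | occupation) * P(occupation).
w_nba = n_nba * p_tall_given_nba                        # 45
w_accountant = n_accountants * p_tall_given_accountant  # 130

# Despite a likelihood 1,000 times larger for the NBA, the base rate wins.
assert w_accountant > w_nba
print(f"weight(NBA) = {w_nba:.0f}, weight(accountant) = {w_accountant:.0f}")
```

Representativeness amounts to comparing only the likelihood terms; the base-rate neglect described in the text is the omission of the prior counts from the product.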

Dual Systems In Social Thought

In another predominant theme, social cognitive work has also increasingly recognized that people possess two very different modes of thought. System 1 is a rapid mode of thought, in which people reach their judgments quickly through simple associations and heuristics, like the availability heuristic. System 2 is slower, conscious, deliberate, effortful, rule-based, and analytical.

Anyone who has solved a complex math problem is familiar with system 2. This is the system in which people consciously apply rules to compute some sort of judgment. People may not be as familiar with the operation of system 1. Indeed, at its extreme, system 1 may work so rapidly that a person is not even aware of its operation.

System 1 thinking is often associated with being automatic. There are many senses in which thought can be automatic. First, automatic thought can be quick. For example, people recognize the faces of their friends and family in an instant, without conscious deliberation. Second, automatic thought can be efficient, in that it does not detract from other tasks that people apply themselves to. For example, people can drive a car along a familiar route while fully engaged in other tasks, such as listening to the car stereo or talking to a passenger. Third, automatic thought can be completed without monitoring. People can form perfectly grammatical sentences, for example, without consciously monitoring the construction of each single phrase. Fourth, automatic can mean that the thought is outside of the control of the individual—that it just happens. Indeed, it often requires no conscious goal to set itself in motion. For example, few Americans can hear the date “September 11” without reflexively thinking of terrorism.

Finally, and perhaps most importantly, automatic thought can occur without one being aware of it. When this occurs, the thought is usually described as nonconscious (i.e., below awareness) or preconscious (i.e., occurring before any thought reaches consciousness). Ultimately, this means that the conclusions people reach can be shaped by influences they are not aware of.

These influences are most directly shown in studies of priming, in which people are exposed to incidental material that later shapes their conclusions about some seemingly irrelevant situation. For example, if people complete a sentence-completion task that contains such words as hostile, mean, and unfriendly, they will judge a person they encounter soon afterward as more unpleasant and aggressive than they would if exposed to the words kind, generous, and sociable. The influence of priming can occur even when people are not aware of the prime. For example, John A. Bargh, Mark Chen, and Lara Burrows (1996) exposed college students to words associated with elderly people (e.g., wisdom, Florida) so quickly that the students were not aware that they had been shown any words at all. They thought they were merely seeing flashes on a computer screen. Despite this fact, exposure to these primes caused the students to walk more slowly (a stereotypical attribute of the elderly) to the elevator as they left the experiment.

System 1, and the automatic thoughts that come with it, produces wide-ranging consequences. For example, the accuracy of people’s judgments, as described above, is heavily influenced by rapid use of availability and representativeness heuristics. Some forms of system 1 thinking can also be shown to trump system 2 thought. Norbert Schwarz and colleagues in a 1991 article showed how system 1 elements can have more influence than the actual content of conscious thoughts. In one study, they asked college students to write down six examples of their own assertive behavior. Students found this task easy. Another group was asked to write down twelve examples and found this task difficult. Later, when asked to rate their assertiveness, the first group saw themselves as more assertive than the second group—even though the second group had generated a greater number of examples indicating that they were assertive. Schwarz and colleagues argued that the first group had perceived themselves as more assertive because they were relying on the availability heuristic. Generating six examples was so easy and available that it tended to convince students that they were assertive. System 1 (the availability heuristic) in this case was a more powerful influence than system 2 (the actual number of examples in conscious thought).

The impact of system 1 thought is also evident in social attribution. People appear to reach attributions about others quickly and spontaneously, through system 1, even without a conscious goal of trying to understand those people. For example, if you mention that Janice helped the elderly woman carry her groceries to the car, many people will rapidly and unknowingly classify the behavior as helpful. (There is an ongoing debate about whether people think of the behavior or the person, Janice, as helpful.) Indeed, if cued with the word helpful later, people will be more likely to remember the sentence that inspired the thought.

Such spontaneous system 1 attributions may explain the fundamental attribution error. Daniel T. Gilbert and colleagues (1988) have proposed that people make rapid attributions to another person’s personality. Once made, people correct these quick personal attributions by considering the impact of the situation in a more effortful, conscious, system 2 way. In support of this idea, Gilbert and colleagues have shown that people make greater attributions to someone’s personality if they are distracted by some other task, because they are deprived of the cognitive capacity necessary to correct for the quick personal attributions produced by system 1.

System 1 also carries consequences for stereotyping. Patricia G. Devine (1989) has suggested that people apply stereotypes in their judgments of others in a quick, system 1 way. Importantly, these stereotype-inspired thoughts even occur to those who wish not to be influenced by them. People who consciously deny stereotypes based on gender, race, or age know that those stereotypes exist and what they are—and these stereotypes will produce automatic, system 1 associations even among these people. In response, those who wish to avoid using stereotypes must apply more effortful system 2 thought to correct for the impact of those stereotypes. However, when people do not have the cognitive capacity to perform this system 2 correction, they will commit stereotypical thinking even though they wish to prevent it. This may happen when they are tired or distracted by some other task.

System 1 also influences attitudes and persuasion. As Shelly Chaiken and Yaacov Trope (1999) have pointed out, people can be persuaded to hold an attitude via two different routes. Through a heuristic route, people can be persuaded in a system 1 way through rapid associations and rules of thumb. For example, people can be persuaded of a viewpoint if the person trying to persuade them is physically attractive, or has an impressive title, or just rattles off a large number of arguments. This type of persuasion occurs when people are not motivated to think deeply about what they are being told. However, when people are motivated, they more effortfully and consciously deliberate over what they are told. This is the systematic route to persuasion, and depends on whether people find the arguments they are given to be strong. (John T. Cacioppo and Richard Petty’s elaboration likelihood model offers a similar treatment of system 1 and 2 routes to persuasion).

Implicit Versus Explicit Attitudes

The presence of system 1 also means that people may hold multiple, and sometimes contradictory, attitudes about social groups and issues. For example, at a conscious, explicit level, people may harbor no negative attitudes toward people from other racial groups, or the elderly, or the political party opposite their own. However, at an implicit level, below conscious awareness or control, people may hold such prejudices. That is, they may hold automatic negative associations to those groups that they are not aware they have.

Much research has shown how attitudes at the implicit level may differ from those at an explicit level. For example, people tend to deny explicitly having any negative opinions of racial groups different from their own. However, if placed in a performance task that assesses their automatic, implicit associations, such negative links are often found. For example, in one version of the Implicit Association Test (IAT), people are asked to complete two tasks simultaneously. In one task, they are asked whether a face is of a European American or an African American, pressing a button with their left hand if the face is European American and with their right if the face is African American. Intermixed with this task, they are also shown words (e.g., puppy, disease) and asked if each is positive or negative in nature, using their left hand to indicate the former and their right the latter. They then perform these two intermixed tasks again, but this time the hands indicating positive and negative words are switched.

Most European Americans find this second version of the task to be more difficult, in that they are using the same hand to indicate European and negative (an association they may not have at the automatic level) and the other hand to indicate African and positive (again, an association they may not possess at the automatic level). African American participants find the second version of the task to be easier than the first, presumably because it matches associations (e.g., African and positive) that they possess at an implicit level.

New Directions

Work in social cognition has moved vigorously in several directions since 1990. For example, all the work described so far paints social perceivers as cold, machinelike calculators, calmly using systems 1 and 2 to determine some judgment about their social worlds. More recent work has recognized that cognition need not only be “cold”; it can also be “hot,” involving vivid and full-blooded emotions. Thus, recent work in social cognition increasingly focuses on the role of emotions in social thought. For example, research has shown that emotional arousal prompts people to pay more attention to the evaluative charge of information in their environment—that is, to whether it is positive or negative—than to other aspects of that information. Fear and anxiety also narrow attention to the central and salient aspects of a situation, at the expense of more peripheral features.

Emotions also lead people to make different assumptions about a situation. When people become fearful, for example, they perceive themselves to lack control over a situation. Thus, they become more pessimistic and reluctant to take on risks. However, when people are angry, they perceive themselves as more in control and more likely to seek risks out. This was directly shown in a survey taken after the terrorist attacks in the United States on September 11, 2001. Those asked first to describe how the attacks made them fearful perceived the United States to be more at risk for future attacks than did those asked to describe how the attacks made them angry.

Work in social cognition since 1990 has also begun to explore the role played by culture, taking pains to study how social cognition operates around the world. In doing so, researchers have found that culture has a profound impact on the ways people think about their social world and the conclusions they then reach. Differences in culture also appear to extend even to perception of the physical world. For example, if people are shown a scene, North Americans are more likely to describe and remember central components of the scene. East Asian respondents, relative to their North American counterparts, are more likely to describe and remember the context surrounding those central components, better recalling peripheral details of a scene.

This different degree of attention paid to the center versus the context is echoed in social judgment. People from East Asia also tend to avoid the fundamental attribution error, more frequently emphasizing situational factors that may have produced a person’s behavior, in contrast to what is emphasized by their North American counterparts. In essence, people from East Asia tend to emphasize the surrounding situational context in their explanations for the behavior of other people, whereas North Americans tend to emphasize the central actor.

Finally, social cognition research since 2000 has increasingly delved into the neurophysiology of social thought, examining the brain structures that support judgments and decision making. By using such techniques as fMRI (functional magnetic resonance imaging) or ERP (event-related brain potentials), psychologists can determine which neural structures in the brain are active as people reach decisions. For example, people who possess negative implicit attitudes toward ethnic groups different from their own (as measured by the implicit association test described above) show more activation of the amygdala, a part of the brain associated with emotional learning and evaluation.

Studies of this type have also begun to map different neural routes that people take to reach conclusions about their social world. In a study on moral judgment by Joshua D. Greene and colleagues (2001), participants were asked how they would respond to the following two moral dilemmas. One dilemma concerned whether participants would switch the track that a train was traveling on to keep it from hitting and killing five people, knowing that the train on its new track would unfortunately kill one other individual. The second dilemma concerned whether participants would push a person in front of a train, killing him, in order to stop the train from killing five people further down the tracks. Although the two scenarios share the same overall structure (i.e., sacrificing one person to save five), people tend to reach different decisions about how to act, being more likely to switch the train track in the first scenario than to push the person onto the tracks in the second.

Participants also tend to reach these decisions via different neural routes. Participants in this study appeared to solve the first dilemma in a calculated system 2 way, analyzing the benefit of switching the train track. In support of this observation, fMRI measurements suggest that as people considered the scenario their parietal lobes, as well as the right middle frontal gyrus, were active—areas associated with the “working memory” people use as they think through a decision. In contrast, people appeared to solve the second dilemma by going with their initial emotional reaction, with brain areas associated with emotion (e.g., the right and left angular gyrus, the bilateral posterior cingulate gyrus, and the bilateral medial frontal gyrus) being most active.

Concluding Remarks

Social cognition is a vibrant area of research. Its influence is also increasingly felt in other scientific and professional disciplines, as scholars in medicine, law, business, education, and philosophy comb its insights to provide knowledge to address questions in those areas of endeavor. In a sense, this vibrancy should not come as a surprise. Every day, people expend a great deal of effort in social cognition, trying to make sense of themselves and the people around them. It is a safe bet that they will never find a point in their lifetime when they can stop doing this task. If this is true of people in everyday life, then it must also be true of the social cognition researcher, who, after all, is also just trying to make sense of what other people do.


  1. Bargh, John A., Mark Chen, and Lara Burrows. 1996. Automaticity of Social Behavior: Direct Effects of Trait Construct and Stereotype Priming on Action. Journal of Personality and Social Psychology 71: 230–244.
  2. Chaiken, Shelly, and Yaacov Trope, eds. 1999. Dual-Process Theories in Social Psychology. New York: Guilford.
  3. Chapman, Loren J., and Jean P. Chapman. 1967. Illusory Correlation as an Obstacle to the Use of Valid Psychodiagnostic Signs. Journal of Abnormal Psychology 72: 193–204.
  4. Devine, Patricia G. 1989. Stereotypes and Prejudice: Their Automatic and Controlled Components. Journal of Personality and Social Psychology 56: 5–18.
  5. Gilbert, Daniel T., Brett W. Pelham, and Douglas S. Krull. 1988. On Cognitive Busyness: When Person Perceivers Meet Persons Perceived. Journal of Personality and Social Psychology 54: 733–740.
  6. Gilovich, Thomas, Dale Griffin, and Daniel Kahneman, eds. 2002. Heuristics and Biases: The Psychology of Intuitive Judgment. New York: Cambridge University Press.
  7. Greene, Joshua D., R. Brian Sommerville, Leigh E. Nystrom, et al. 2001. An fMRI Investigation of Emotional Engagement in Moral Judgment. Science 293: 2105–2108.
  8. Greenwald, Anthony G., David E. McGhee, and Jordan L. K. Schwartz. 1998. Measuring Individual Differences in Implicit Cognition: The Implicit Association Test. Journal of Personality and Social Psychology 74: 1464–1480.
  9. Hamilton, David L., and Robert K. Gifford. 1976. Illusory Correlation in Interpersonal Perception: A Cognitive Basis of Stereotypic Judgments. Journal of Experimental Social Psychology 12: 392–407.
  10. Kunda, Ziva. 1999. Social Cognition: Making Sense of People. Cambridge, MA: MIT Press.
  11. Lerner, Jennifer S., Roxanne M. Gonzalez, Deborah A. Small, and Baruch Fischhoff. 2003. Effects of Fear and Anger on Perceived Risks of Terrorism: A National Field Experiment. Psychological Science 14: 144–150.
  12. Moskowitz, Gordon B. 2005. Social Cognition: Understanding Self and Others. New York: Guilford.
  13. Nisbett, Richard E. 2004. The Geography of Thought: How Asians and Westerners Think Differently … and Why. New York: Free Press.
  14. Phelps, Elizabeth A., Kevin J. O’Connor, William A. Cunningham, et al. 2000. Performance on Indirect Measures of Race Evaluation Predicts Amygdala Activation. Journal of Cognitive Neuroscience 12: 729–738.
  15. Schwarz, Norbert, Herbert Bless, Fritz Strack, et al. 1991. Ease of Retrieval as Information: Another Look at the Availability Heuristic. Journal of Personality and Social Psychology 61: 195–202.
