Experimental Psychology – PSY402 VU LESSON 13
REASONING AND PROBLEM SOLVING
Reasoning: Making Up Your Mind
Professors deciding when students' assignments are due
An employer determining whom to hire out of a pool of job applicants
The president concluding it is necessary to send troops to a foreign nation
The common thread among these three circumstances: each requires reasoning, the process by which information is used to draw conclusions and make decisions. Although philosophers and logicians have considered the foundations of reasoning for centuries, it is only relatively recently that cognitive psychologists have begun to investigate how people reason and make decisions. Together, their efforts have contributed to our understanding of formal reasoning processes as well as the mental shortcuts we routinely use, shortcuts that may sometimes lead our reasoning capabilities astray (Evans, Newstead, & Byrne, 1994; Johnson-Laird & Shafir, 1994; Corrigan, 1996).
Deductive and Inductive Reasoning
One approach taken by cognitive psychologists in their efforts to understand decision making is to examine how people use formal reasoning procedures. Two major forms exist: deductive reasoning and inductive reasoning (Rips, 1990, 1994a, 1995). In deductive reasoning, we draw inferences and implications from a set of assumptions and apply them to specific cases. Deductive reasoning begins with a series of assumptions, or premises, that are thought to be true, and then derives the implications of these assumptions. If the assumptions are true, then the conclusions must also be true.
A major technique for studying deductive reasoning involves asking people to evaluate syllogisms. A syllogism presents a series of two assumptions, or premises, that are used to derive a conclusion. By definition, the conclusion must be true if the assumptions or premises are true. For example, consider the following syllogism:
All men are mortal. [premise] Socrates is a man. [premise] Therefore, Socrates is mortal. [conclusion]
In this case both premises are true, and so, then, is the conclusion. More abstractly, we can state the syllogism as follows:
All As are B. [premise]
C is an A. [premise]
Therefore, C is a B. [conclusion]
On the other hand, if either or both of the premises in a syllogism are not accurate, then there is insufficient support for the accuracy of the conclusion. A conclusion can also fail even when both premises are true, if it does not follow logically from them. Suppose, for example, you saw the following syllogism:
All men are mortal. [premise]
Socrates is a man. [premise]
Therefore, all men are Socrates. [conclusion]
Obviously, such a conclusion makes no sense, even though both premises are true. We can more easily see why it's unreasonable by restating the syllogism in the abstract and coming up with an obviously false conclusion:
All As are B. [premise]
C is an A. [premise]
Therefore, all As are C. [conclusion]
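The difference between the valid and the invalid conclusion can be illustrated with a small set-membership sketch in Python. This is our illustration, not part of the lesson; the sets are hypothetical stand-ins for "all men" and "all mortals."

```python
# Model "All As are B" as a subset relation between two sets.
mortals = {"Socrates", "Plato", "a horse"}  # the Bs: everything mortal
men = {"Socrates", "Plato"}                 # the As: all men

assert men <= mortals        # premise: All As are B (men is a subset)
assert "Socrates" in men     # premise: C is an A

# Valid conclusion: C is a B. Guaranteed whenever both premises hold.
assert "Socrates" in mortals

# Invalid conclusion: "all As are C" would mean men == {"Socrates"},
# which the premises do not guarantee -- and here it is false.
assert men != {"Socrates"}
```

Both premises hold in this model, yet only the first conclusion follows from them; the second fails, just as the abstract restatement shows.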
©Copyright Virtual University of Pakistan 69
The conceptual complement of deductive reasoning is inductive reasoning. In inductive reasoning, we infer a general rule from specific cases. Using our observations, knowledge, experiences, and beliefs about the world, we develop a summary conclusion. (You can recall the distinction between deductive and inductive reasoning in this way: in deductive reasoning, the conclusion is derived through the use of general rules, whereas in inductive reasoning, a conclusion is inferred from specific examples.) Sherlock Holmes used inductive reasoning in his quest to solve mysteries. By amassing clues, he was ultimately able to determine the identity of the criminal. Similarly, we all use inductive reasoning, although typically in more ordinary situations. For instance, if the person in the apartment below you constantly plays Spice Girls music, you may begin to form an impression of what that individual is like, based on the sample of evidence available to you. Like Sherlock Holmes, you use pieces of evidence to draw a general conclusion. The limitation of inductive reasoning is that conclusions may be biased if insufficient or invalid evidence is used. Psychologists know this well: the various scientific methods that they may employ in the collection of data to support their hypotheses are prone to several sorts of biases, such as using an inappropriate sample of subjects. Similarly, you may fail to draw appropriate conclusions about your neighbor if your impression is based only on the music he or she plays and not on a broader sample of behavior.
Inductive reasoning: a reasoning process whereby a general rule is inferred from specific cases, using observation, knowledge, experience, and beliefs.
Algorithms and Heuristics
When faced with a decision, we often turn to various kinds of mental shortcuts, known as algorithms and heuristics, to help us.
An algorithm is a rule that, if followed, guarantees a solution to a problem. We can use an algorithm even if we cannot understand why it works. For example, you may know that the length of the third side of a right triangle can be found using the formula a² + b² = c². You may not have the foggiest notion of the mathematical principles behind the formula, but this algorithm is always accurate and therefore provides a solution to a particular problem.
For many problems and decisions, however, no algorithm is available. In those instances, we may be able to use heuristics to help us. A heuristic is a rule of thumb or mental shortcut that may lead to a solution. Heuristics enhance the likelihood of success in coming to a solution but, unlike algorithms, they cannot ensure it. For example, some tic-tac-toe players follow the heuristic of placing an "X" in the center square at the start of the game. This tactic doesn't guarantee that they will win, but it does increase their chances of success. Similarly, some students follow the heuristic of preparing for a test by ignoring the assigned textbook reading and only studying their lecture notes, a strategy that may or may not pay off (Nisbett et al., 1993).
The chess computer Deep Blue was able to consider as many as 200 million possible chess positions a second. In and of itself, such sheer calculating speed does not constitute thinking. On the other hand, Herbert Simon, a cognitive psychologist at Carnegie Mellon University and Nobel prize winner, argues that the computer did show rudiments of humanlike thinking because of its selectivity, its knowledge of where to look, and where not to look, for an answer to a problem. Simon suggests that Deep Blue's capacity to evaluate potential moves and to ignore unimportant possibilities gives it thinking ability (Weber, 1996; Wright, 1996). Some critics, however, suggest that Deep Blue's ability to consider billions of moves is qualitatively no different from what a simple calculator can do.
Perhaps Deep Blue can do more calculations than a pocket calculator, but neither machine is capable of worrying about what comes next, strategizing about how to take account of an opponent's emotions, or dreaming about what one might do with the prize money for winning the tournament. In the view of critics, because such feelings and expectations are not part of the abilities of the computer, it is not engaged in true thinking (Weber, 1996). Obviously, many of the questions surrounding the ability of computers to think are more philosophical than psychological and are not readily answered. Still, it is clear that computers are becoming increasingly sophisticated, ever more closely approximating human thought processes. For example, psychologists Jack Gelfand and Susan Epstein are developing a computer that can demonstrate expertise in a variety of tasks, rather than being a master at only a single task, such as chess. To do so, they are seeking to design a
machine that can learn to play games using experience, memory, and heuristics. The computer also has a visual recognition component, permitting it to perceive patterns in board games. The ability to "see" patterns on the board permits the computer to draw conclusions about whether a specific pattern of game pieces is more or less likely to produce success (Epstein, 1995; Azar, 1997). One of the computer's programs, called Hoyle, is designed to become an expert game player. Starting off simply, it plays as a novice would, practicing with "expert" computers designed to play just one game. For instance, it might initially play tic-tac-toe. As it gains experience, it learns strategies, balancing and weighing the different strategies to see which will be most successful. If a strategy is particularly effective, the computer increases its weight, so this strategy counts more in the future; if a strategy is not very useful, its weight decreases. Hoyle has so far mastered almost twenty different board games. Because, like humans, it learns some games more quickly than others, it appears to be employing strategies effectively to aid learning. Furthermore, research on how children learn games suggests that they and Hoyle use similar strategies (Rattermann, 1992). Is Hoyle thinking like a human? The question remains unanswerable. Machines like Hoyle and Deep Blue are clearly making strides toward imitating the moves of expert game players. In fact, it is possible that one day the world chess champion will be a machine. But even if this forecast becomes a reality, the real champions will remain the people who program the computers, the humans behind the machines.
Although heuristics often help people solve problems and make decisions, certain kinds of heuristics may backfire.
For example, we sometimes use the representativeness heuristic, a rule we apply when we judge people by the degree to which they represent a certain category or group of people. Suppose, for instance, you are the owner of a fast-food store that has been robbed many times by teenagers. The representativeness heuristic would lead you to raise your guard each time someone of this age group enters your store (even though, statistically, it is unlikely that any given teenager will rob the store).
The availability heuristic involves judging the probability of an event by how easily the event can be recalled from memory (Tversky & Kahneman, 1974, 1990). According to this heuristic, we assume that events we remember easily are likely to have occurred more frequently in the past than those that are harder to remember. Furthermore, we assume that the same sort of event is more likely to occur in the future. For example, we are more apt to worry about being murdered than dying of diabetes, despite the fact that it is twice as likely that we will die of the disease. We err because of the ease with which we remember dramatic, highly publicized events like murder; this leads us to overestimate the likelihood of their occurrence. Similarly, many people are more afraid of dying in a plane crash than in an auto accident, despite statistics clearly showing that airplane travel is much safer than auto travel. The reason is that plane crashes receive far more publicity than car crashes, and are therefore more easily remembered. It is the availability heuristic that leads people to conclude that they are in greater jeopardy in an airplane than in a car (Slovic, Fischhoff, & Lichtenstein, 1976; Schwarz et al., 1991).
Are algorithms and heuristics confined to human thinking, or can computers be programmed to use them? As we discussed in the Applying Psychology in the 21st Century box, new research suggests that a future in which machines think is not altogether far-fetched.
At the same time, such work raises some fundamental questions about the nature of thought and the mind.
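Before moving on, the algorithm example from earlier, the Pythagorean rule, can be made concrete. The sketch below is ours; the function name is illustrative. It makes the lesson's point directly: following the rule mechanically always yields the answer, whether or not one understands why it works.

```python
import math

def hypotenuse(a: float, b: float) -> float:
    """Apply the rule a^2 + b^2 = c^2 to find the third side of a
    right triangle. The algorithm guarantees the answer even if the
    user has no idea of the mathematics behind it."""
    return math.sqrt(a ** 2 + b ** 2)

print(hypotenuse(3, 4))  # 5.0, the classic 3-4-5 right triangle
```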
Problem Solving
According to an old legend, a group of monks in Vietnam devote much of their time to the effort of solving a problem called the Tower of Hanoi puzzle. The monks believe that, if they succeed, the world as we know it will come to an end (Raphael, 1976). (Should you prefer that the world remain in its present state, there's no need for immediate concern: according to one estimate, the puzzle is so complex that it will take about a trillion years to reach a solution.)
Why are cognitive psychologists interested in the Tower of Hanoi problem? The answer is that the way people go about solving this puzzle and simpler ones like it helps illuminate the processes by which people solve complex problems that they encounter in school and at work. For example, psychologists have found that problem solving typically involves three major steps: preparation for the creation of solutions, production of solutions, and evaluation of solutions that have been generated (Sternberg & Frensch, 1991).
Preparation: Understanding and Diagnosing Problems
When approaching a problem like the Tower of Hanoi, most people begin by trying to ensure that they thoroughly understand the problem. If the problem is a novel one, they are likely to pay particular attention to any restrictions placed on coming up with a solution, as well as the initial status of the components of the problem. If, on the other hand, the problem is a familiar one, they are apt to spend considerably less time in this stage.
Problems vary from well-defined to ill-defined (Reitman, 1965; Arlin, 1989). In a well-defined problem, such as a mathematical equation or the solution to a jigsaw puzzle, both the nature of the problem itself and the information needed to solve it are available and clear. Thus, straightforward judgments can be made about whether a potential solution is appropriate. With an ill-defined problem, such as how to increase morale on an assembly line or bring peace to the Middle East, not only may the specific nature of the problem be unclear, but the information required to solve the problem may be even less obvious. For example, consider the following problem, first devised by Karl Duncker (1945):
Suppose you are a doctor faced with a patient who has a malignant tumor in his stomach. To operate on the patient is impossible, but unless the tumor is destroyed, the patient will die. A kind of ray, at a sufficiently high intensity, can destroy the tumor. Unfortunately, at this intensity the healthy tissue that the rays pass through on the way to the tumor will also be destroyed. At lower intensities the rays are harmless to healthy tissue but will not affect the tumor, either. How can the rays be used to destroy the tumor without injuring the healthy tissue?
Most people have a great deal of difficulty in thinking of even one solution to this problem.
The major barrier is the ill-defined nature of the problem, which involves some vague sort of "rays" and doesn't suggest any immediate solutions. However, there is an ingenious solution: aiming weak rays at the tumor from several different entry points. In this way, no one portion of healthy tissue is damaged, while the tumor receives a full dosage.
Kinds of Problems
Problems typically fall into one of three categories: arrangement, inducing structure, and transformation (Greeno, 1978; Spitz, 1987). Solving each type requires somewhat different kinds of psychological skills and knowledge.
A. Arrangement problems
1. Anagrams: Rearrange the letters in each set to make an English word:
2. Two strings hang from a ceiling but are too far apart to allow a person to hold one and walk to the other. On the floor are a book of matches, a screwdriver, and a few pieces of cotton. How could the strings be tied together?
B. Inducing structure
1. What number comes next in the series? 1 4 2 4 3 4 4 4 5 4 6 4
2. Complete these analogies: Baseball is to bat as tennis is to _____. Merchant is to sell as customer is to _____.
Arrangement problems require that a group of elements be rearranged or recombined in a way that will satisfy a certain criterion. There are usually several different possible arrangements that can be made, but only one or a few of the arrangements will produce a solution. Anagram problems and jigsaw puzzles represent arrangement problems. In problems of inducing structure, a person must identify the relationships that exist among the elements presented and construct a new relationship among them. In such a problem, it is necessary to determine not only the relationships among the elements, but also the structure and size of the elements involved.
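An arrangement problem like an anagram can even be solved by the brute-force algorithm of trying every possible ordering. The sketch below is our illustration; the tiny word set stands in for a real dictionary.

```python
from itertools import permutations

def solve_anagram(letters, dictionary):
    """Try every arrangement of the letters until one forms a word in
    the dictionary -- an exhaustive algorithm, guaranteed to succeed
    when a solution exists, but practical only for short anagrams."""
    for perm in permutations(letters):
        candidate = "".join(perm)
        if candidate in dictionary:
            return candidate
    return None  # no arrangement forms a known word

words = {"anagram", "problem", "listen"}   # stand-in dictionary
print(solve_anagram("silent", words))      # listen
```

As the lesson notes, only one or a few of the many possible arrangements produce a solution; the loop above simply tests them all.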
The Tower of Hanoi puzzle represents a third kind of problem. Transformation problems consist of an initial state, a goal state, and a series of methods for changing the initial state into the goal state. In the Tower of Hanoi problem, the initial state is the original configuration; the goal state consists of the three disks on the third peg; and the method consists of the rules for moving the disks.
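The sequence of legal moves that transforms the initial state into the goal state can be generated by the classic recursive procedure. This is a sketch of ours (the function and peg names are arbitrary), not part of the lesson.

```python
def hanoi(n, source, spare, target, moves=None):
    """List the moves that shift n disks from source to target,
    one disk at a time, never placing a larger disk on a smaller one."""
    if moves is None:
        moves = []
    if n == 1:
        moves.append((source, target))
    else:
        hanoi(n - 1, source, target, spare, moves)  # clear the way
        moves.append((source, target))              # move the largest disk
        hanoi(n - 1, spare, source, target, moves)  # rebuild on top of it
    return moves

print(len(hanoi(3, "A", "B", "C")))  # 7 moves: 2**3 - 1
```

The move count doubles (plus one) with each added disk, which is why the monks' many-disk version is estimated to take so astronomically long.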
Whether the problem is one of arrangement, inducing structure, or transformation, the initial stage of understanding and diagnosing is critical in problem solving because it allows us to develop our own cognitive representation of the problem and to place it within a personal framework. The problem may be divided into subparts or some information may be ignored as we try to simplify the task. Winnowing out nonessential information (such as the matches and cotton in the hanging strings problem) is often a critical step in problem solving.
Representing and Organizing the Problem
A crucial aspect of the initial encounter with a problem is the way in which we represent it to ourselves and organize the information presented to us (Brown & Walter, 1990, 1993; Davidson, Deuser, & Sternberg, 1994). Consider the following problem:
A man climbs a mountain on Saturday, leaving at daybreak and arriving at the top near sundown. He spends the night at the top. The next day, Sunday, he leaves at daybreak and heads down the mountain, following the same path that he climbed the day before. The question is this: Will there be any time during the second day when he will be at exactly the same point on the mountain as he was at exactly that time on the first day?
Production: Generating Solutions
If a problem is relatively simple, a direct solution may already be stored in long-term memory, and all that is necessary is to retrieve the appropriate information. If the solution cannot be retrieved or is not known, we must instigate a process by which possible solutions can be generated and compared with information in long- and short-term memory.
Trial and Error
At the most primitive level, solutions to problems can be obtained through trial and error. Thomas Edison was able to invent the light bulb only because he tried thousands of different kinds of materials for a filament before he found one that worked (carbon). The difficulty with trial and error, of course, is that some problems are so complicated it would take a lifetime to try out every possibility.
Means-Ends Analysis
In place of trial and error, complex problem solving often involves the use of heuristics, which, as we discussed earlier, are rules of thumb that can lead the way to solutions. Probably the most frequently applied heuristic in problem solving is a means-ends analysis. In a means-ends analysis, people repeatedly test for differences between the desired outcome and what currently exists. For example, people using a means-ends analysis to search for the correct sequence of roads to get to a city that they can see in the distance would analyze their solutions in terms of how much closer each individual choice of roadway brings them to the ultimate goal of arriving at the city. Such a strategy is only effective, though, if there is a direct solution to the problem. If the problem is such that indirect steps have to be taken, steps that appear to increase the discrepancy between the current state and the solution, a means-ends analysis can be counterproductive.
For some problems, the converse of a means-ends analysis is the most effective approach: working backward by beginning with the goal and moving toward the starting state. Instead of starting with the current situation and moving closer and closer to the solution, people can work in the opposite direction, starting with the goal and aiming to reach the beginning point (Malin, 1979; Bourne et al., 1986; Hunt, 1994).
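The road-to-the-city example can be caricatured as greedy difference reduction: at each step, take whichever move most shrinks the gap between the current state and the goal. The toy sketch below is our illustration, not a model from the literature.

```python
def means_ends(start, goal, neighbors, distance, max_steps=100):
    """Repeatedly move to the available state closest to the goal,
    i.e. the state that most reduces the remaining difference."""
    state, path = start, [start]
    for _ in range(max_steps):
        if state == goal:
            return path
        # pick the neighboring state nearest the goal
        state = min(neighbors(state), key=lambda s: distance(s, goal))
        path.append(state)
    return path

# Toy example: walk along a number line from 0 toward 5.
path = means_ends(0, 5, neighbors=lambda s: [s - 1, s + 1],
                  distance=lambda s, g: abs(s - g))
print(path)  # [0, 1, 2, 3, 4, 5]
```

Because this version always moves toward the goal, it stalls exactly when an indirect, distance-increasing step is required, which is the limitation of means-ends analysis described above.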
Another commonly used heuristic is to divide a problem into intermediate steps, or subgoals, and to solve each of those steps. For instance, in our modified Tower of Hanoi problem, there are several obvious subgoals that could be chosen, such as moving the largest disk to the third post. If solving a subgoal is a step toward the ultimate solution to a problem, then identifying subgoals is an appropriate strategy. There are cases, however, in which the formation of subgoals is not all that helpful and may actually increase the time needed to find a solution (Hayes, 1966; Reed, 1996). For example, some problems cannot be subdivided. Others are so difficult to subdivide that it takes longer to identify the appropriate subdivisions than to solve the problem by other means. Finally, even when a problem is divided into subgoals, it may be unclear what to do after a given subgoal has been reached.
Insight
Some approaches to problem solving focus less on step-by-step processes than on the sudden bursts of comprehension that one may experience during efforts to solve a problem. Just after World War I, German psychologist Wolfgang Kohler examined learning and problem-solving processes in chimps (Kohler, 1927). In his studies, Kohler exposed chimps to challenging situations in which the elements of the solution were all present; all that was needed was for the chimps to put them together. For example, in one series of studies, chimps were kept in a cage in which boxes and sticks were strewn about, with a bunch of tantalizing bananas hanging out of reach. Initially, the chimps engaged in a variety of trial-and-error attempts at getting to the bananas: they would throw the stick at the bananas, jump from one of the boxes, or leap wildly from the ground. Frequently, they would seem to give up in frustration, leaving the bananas dangling temptingly overhead. But then, in what seemed like a sudden revelation, they would abandon whatever activity they were involved in and stand on a box in order to reach the bananas with a stick. Kohler called the cognitive process underlying the chimps' behavior insight: a sudden awareness of the relationships among various elements that had previously appeared to be independent of one another.
Although Kohler emphasized the apparent suddenness with which solutions were revealed, subsequent research has shown that prior experience and initial trial-and-error practice in problem solving are prerequisites for "insight" (Metcalfe, 1986). One study demonstrated that only chimps who had experience in playing with sticks could successfully solve the problem; inexperienced chimps never made the connection between standing on the box and reaching the bananas (Birch, 1945). Some researchers have suggested that the behavior of the chimps represented little more than the chaining together of previously learned responses, no different from the way a pigeon learns, by trial and error, to peck a key (Epstein et al., 1984; Epstein, 1987, 1996). It is clear that insight depends on previous experience with the elements involved in a problem.
Judgment: Evaluating the Solutions
The final step in problem solving is judging the adequacy of a solution. Often, this is a simple matter: if there is a clear solution, as in the Tower of Hanoi problem, we will know immediately whether we have been successful. On the other hand, if the solution is less concrete, or if there is no single correct solution, evaluating solutions becomes more difficult. In such instances, we must decide which solution alternative is best. Unfortunately, we are often quite inaccurate in estimating the quality of our own ideas (Johnson, Parrott, & Stratton, 1968). For instance, a team of drug researchers working for a particular company may feel that their remedy for an illness is superior to all others, overestimating the likelihood of its success and belittling the approaches of competing drug companies. Theoretically, if the heuristics and information we rely on to make decisions are appropriate and valid, we can make accurate choices among problem solutions.
However, as we see next, there are several kinds of obstacles to and biases in problem solving that affect the quality of the decisions and judgments we make.
Impediments to Problem Solving
Consider the following problem-solving test (Duncker, 1945): you are presented with a set of tacks, candles, and matches in small boxes, and told your goal is to place three candles at eye level on a nearby door, so that wax will not drip on the floor as the candles burn. How would you approach this challenge?
If you have difficulty solving the problem, you are not alone. Most people are unable to solve it when it is presented in the manner illustrated in the figure, in which the objects are located inside the boxes. On the other hand, if the objects were presented beside the boxes, just resting on the table, chances are you would solve the problem much more readily. The solution, in case you are wondering, requires tacking the boxes to the door and then placing the candles inside them. The difficulty you probably encountered in solving the problem stems from its presentation, and relates to the fact that you were misled at the initial preparation stage. Actually, significant obstacles to problem solving exist at each of the three major stages. Although cognitive approaches to problem solving suggest that thinking proceeds along fairly rational, logical lines as a person confronts a problem and considers various solutions, a number of factors hinder the development of creative, appropriate, and accurate solutions.
The problem in the figure is to place three candles at eye level on a nearby door so that the wax will not drip on the floor as the candles burn, using only the tacks, candles, and matches in the small boxes.
Functional Fixedness and Mental Set
The reason that most people experience difficulty with the candle problem is a phenomenon known as functional fixedness, the tendency to think of an object only in terms of its typical use. For instance, functional fixedness probably leads you to think of the book you are holding in your hands as something to read, as opposed to its value as a doorstop or as kindling for a fire. In the candle problem, functional fixedness occurs because the objects are first presented inside the boxes, which are then seen simply as containers for the objects they hold rather than as a potential part of the solution.
Functional fixedness is an example of a broader phenomenon known as mental set, the tendency for old patterns of problem solving to persist. This phenomenon was demonstrated in a classic experiment carried out by Abraham Luchins (1946). As you can see in Figure 8-8, the object of the task is to use the jars in each row to measure out the designated amount of liquid. (Try it yourself to get a sense of the power of mental set before moving on.) If you have tried to solve the problem, you know that the first five parts are all solved in the same way: fill the largest jar (B) and from it fill the middle-size jar (A) once and the smallest jar (C) two times. What is left in B is the designated amount. (Stated as a formula, it is B - A - 2C.) The demonstration of mental set comes with the sixth part of the problem, a point at which you probably encountered some difficulty. If you are like most people, you tried the formula and were perplexed when it failed. Chances are, in fact, that you missed the simple (but different) solution to the problem, which merely involves subtracting C from A. Interestingly, those people who were given problem 6 first had no difficulty with it at all.
Mental set can also affect perceptions. It can prevent you from seeing your way beyond the apparent constraints of a problem.
For example, try to draw four straight lines so that they pass through all nine dots in the grid below, without lifting your pencil from the page. If you had difficulty with the problem, it was probably because you felt compelled to keep your lines within the grid. If you had gone outside the boundaries, however, you would have succeeded with the solution.
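Returning to the water jar task: the B - A - 2C arithmetic can be checked with concrete numbers. Figure 8-8 is not reproduced here, so the jar capacities below are taken from Luchins' classic series and are our assumption, not values from this lesson.

```python
def set_formula(a, b, c):
    """Luchins' set-inducing recipe: fill jar B, pour off jar A once
    and jar C twice; what remains in B is the target amount."""
    return b - a - 2 * c

# An early problem in the series: jars of 21, 127, and 3 units, goal 100.
print(set_formula(21, 127, 3))   # 100

# A later, "different" problem (capacities assumed: 28, 76, and 3 units,
# goal 25): the practiced formula fails, while simply subtracting C
# from A succeeds -- the signature of mental set.
print(set_formula(28, 76, 3))    # 42, not the goal of 25
print(28 - 3)                    # 25
```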
Inaccurate Evaluation of Solutions
When the nuclear power plant at Three Mile Island in Pennsylvania suffered its initial malfunction in 1979, a disaster that almost led to a nuclear meltdown, the plant operators were faced immediately with solving a problem of the most serious kind. Several monitors indicated contradictory information about the source of the problem: one suggested that the pressure was too high, leading to the danger of an explosion; others indicated that the pressure was too low, which could lead to a meltdown. Although the pressure was in fact too low, the supervisors on duty relied on the one faulty monitor that suggested the pressure was too high. Once they had made their decision and acted upon it, they ignored the contradictory evidence from the other monitors (Wickens, 1984).
One reason for the operators' mistake is the confirmation bias, in which initial hypotheses are favored and contradictory information supporting alternative hypotheses or solutions is ignored. Even when we find evidence that contradicts a solution we have chosen, we are apt to stick with our original hypothesis. There are several reasons for the confirmation bias. One is that it takes cognitive effort to rethink a problem that appears to be solved already, so we are apt to stick with our first solution. Another is that evidence contradicting an initial solution may present something of a threat to our self-esteem, leading us to hold to the solutions that we came up with first (Fischhoff, 1977; Rasmussen, 1981).
Creativity and Problem Solving
Despite obstacles to problem solving, many people are adept at coming up with creative solutions to problems. One of the enduring questions that cognitive psychologists have sought to answer is what factors underlie creativity, the combining of responses or ideas in novel ways (Isaksen & Murdock, 1993; Boden, 1994, 1996; Smith, Ward, & Finke, 1995). Although identifying the stages of problem solving helps us to understand how people approach and solve problems, it does little to explain why some people, such as Jacob Rabinow (whose inventions were described in the Prologue), come up with better solutions than others. For instance, the possible solutions to even the simplest of problems often show wide discrepancies. Consider, for example, how you might respond to the question, "How many uses can you think of for a newspaper?" Now compare your own solution with this one proposed by a 10-year-old boy:
You can read it, write on it, lay it down and paint a picture on it. . . . You could put it in your door for decoration, put it in the garbage can, put it on a chair if the chair is messy. If you have a puppy, you put newspaper in its box or put it in your backyard for the dog to play with.
When you build something and you don't want anyone to see it, put newspaper around it. Put newspaper on the floor if you have no mattress, use it to pick up something hot, use it to stop bleeding, or to catch the drips from drying clothes. You can use a newspaper for curtains, put it in your shoe to cover what is hurting your foot, make a kite out of it, and shade a light that is too bright. You can wrap fish in it, wipe windows, or wrap money in it. . . . You put washed shoes in newspaper, wipe eyeglasses with it, put it under a dripping sink, put a plant on it, make a paper bowl out of it, use it for a hat if it is raining, tie it on your feet for slippers. You can put it on the sand if you had no towel, use it for bases in baseball, make paper airplanes with it, use it as a dustpan when you sweep, ball it up for the cat to play with, wrap your hands in it if it is cold (Ward, Kogan, & Pankove, 1972). It is obvious that this list shows extraordinary creativity. Unfortunately, it has proved to be considerably easier to identify examples of creativity than to determine its causes. Several factors, however, seem to be associated with creativity (Swede, 1993; Csikszentmihalyi, 1997; Ward, Smith, & Vaid, 1997).
One of these factors is divergent thinking. Divergent thinking refers to the ability to generate unusual, yet nonetheless appropriate, responses to problems or questions. This type of thinking contrasts with convergent thinking, which produces responses that are based primarily on knowledge and logic. For instance, someone relying on convergent thinking answers "You read it" to the newspaper question, whereas divergent thinking produces the long list of unconventional uses given above.
Psychologists Robert Sternberg and Todd Lubart suggest that one important ingredient of creativity is the willingness to take risks that may result in potentially high payoffs (Sternberg & Lubart, 1992, 1995, 1996; Lubart & Sternberg, 1995). In their view, creative people are similar to successful stock market investors, who follow the rule of "buying low and selling high." In an analogous fashion, creative individuals formulate and promote ideas that are, at least for the moment, out of sync with prevailing wisdom ("buying low"). Ultimately, though, highly creative people expect that their ideas will rise in value and that others will ultimately find them of value and adopt them ("selling high").
Another ingredient of creativity is cognitive complexity, the use of and preference for elaborate, intricate, and complex stimuli and thinking patterns. Similarly, creative people often have a wider range of interests and are more independent and more interested in philosophical or abstract problems than are less creative individuals (Barron, 1990).
One factor that is not closely related to creativity is intelligence. Most items on intelligence tests, which are well defined and have only one acceptable answer, focus on convergent thinking skills. Creative people who are divergent thinkers may therefore find themselves at a disadvantage when taking such tests. This may explain why researchers consistently find that creativity is only slightly related to intelligence or school
grades, particularly when intelligence is measured using typical intelligence tests (Barron & Harrington, 1981; Sternberg, 1988; Albert, 1992; Simonton, 1994; Hong, Milgram, & Gorsky, 1995).