A Quick Puzzle to Test Your Problem Solving (via NYT)

This fantastic test was posted by the New York Times over a year ago:

Here’s how it works:

We’ve chosen a rule that some sequences of three numbers obey — and some do not. Your job is to guess what the rule is. We’ll start by telling you that the sequence 2, 4, 8 obeys the rule:

Now it’s your turn. Enter a number sequence in the boxes below, and we’ll tell you whether it satisfies the rule or not. You can test as many sequences as you want.

[click here to go to the NYT link and test your answer]

*SPOILER BELOW*

The answer was extremely basic. The rule was simply: Each number must be larger than the one before it. 5, 10, 20 satisfies the rule, as do 1, 2, 3 and -17, 14.6, 845. Children in kindergarten can understand this rule.
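For concreteness, the rule fits in a single line of code. Here is a minimal Python sketch of it (the function name obeys_rule and the sample loop are my own illustration, not anything from the Times's page):

def obeys_rule(seq):
    # True when each number is strictly larger than the one before it
    return all(a < b for a, b in zip(seq, seq[1:]))

# The sequences mentioned above all satisfy the rule
for seq in [(2, 4, 8), (5, 10, 20), (1, 2, 3), (-17, 14.6, 845)]:
    print(seq, obeys_rule(seq))  # prints True for each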

But most people start off with the incorrect assumption that if we’re asking them to solve a problem, it must be a somewhat tricky problem. They come up with a theory for what the answer is, like: Each number is double the previous number. And then they make a classic psychological mistake.

They don’t want to hear the answer “no.” In fact, it may not occur to them to ask a question that may yield a no.

Remarkably, 77 percent of people who have played this game so far have guessed the answer without first hearing a single no. A mere 9 percent heard at least three nos — even though there is no penalty or cost for being told no, save the small disappointment that every human being feels when hearing “no.”

It’s a lot more pleasant to hear “yes.” That, in a nutshell, is why so many people struggle with this problem.
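To see why confirming tests get you nowhere here, consider a short sketch under the same assumptions (true_rule stands in for the Times's answer checker and doubling_theory for the popular guess; both names are mine). Every doubling sequence of positive numbers is also strictly increasing, so tests drawn from the theory always come back "yes" and never expose the error; only a sequence the theory predicts should fail can do that.

def true_rule(seq):
    # the actual rule: strictly increasing
    return all(a < b for a, b in zip(seq, seq[1:]))

def doubling_theory(seq):
    # the popular (wrong) theory: each number is double the previous one
    return all(b == 2 * a for a, b in zip(seq, seq[1:]))

# Confirming tests: sequences the theory says should get a "yes"
for seq in [(3, 6, 12), (5, 10, 20), (10, 20, 40)]:
    print(seq, true_rule(seq))  # "yes" every time, so the theory survives

# A disconfirming test: a sequence the theory says should get a "no"
seq = (1, 2, 3)
print(doubling_theory(seq), true_rule(seq))  # theory says no, the rule says yes: theory falsified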

Confirmation Bias

This preference for “yes” over “no” is a version of what psychologists and economists call confirmation bias. Not only are people more likely to believe information that fits their pre-existing beliefs, but they’re also more likely to go looking for such information. This experiment is a version of one that the English psychologist Peter Cathcart Wason used in a seminal 1960 paper on confirmation bias. (He used the even simpler 2, 4 and 6, rather than our 2, 4 and 8.)

Most of us can quickly come up with other forms of confirmation bias — and yet the examples we prefer tend to be, themselves, examples of confirmation bias. If you’re politically liberal, maybe you’re thinking of the way that many conservatives ignore strong evidence of global warming and its consequences and instead glom onto weaker contrary evidence. Liberals are less likely to recall the many incorrect predictions over the decades, often strident and often from the left, that population growth would create widespread food shortages. It hasn’t.

This puzzle exposes a particular kind of confirmation bias that bedevils companies, governments and people every day: the internal yes-man (and yes-woman) tendency. We’re much more likely to think about positive situations than negative ones, about why something might go right than wrong and about questions to which the answer is yes, not no.

Sometimes, the reluctance to think negatively has nothing to do with political views or with a conscious fear of being told no. Often, people never even think about asking questions that would produce a negative answer when trying to solve a problem — like this one. They instead restrict the universe of possible questions to those that might potentially yield a “yes.”

Government Policy

In this exercise, the overwhelming majority of readers gravitated toward confirming their theory rather than trying to disprove it. A version of this same problem compromised the Obama administration’s and Federal Reserve’s (mostly successful) response to the financial crisis. They were too eager to find “green shoots” of economic recovery that would suggest that the answer to the big question in their minds was, just as they hoped and believed: “Yes, the crisis response is aggressive enough, and it’s working.” More damaging was the approach that President George W. Bush’s administration, and others, took toward trying to determine whether Iraq had weapons of mass destruction a decade ago — and how the Iraqi people would react to an invasion. Vice President Dick Cheney predicted in 2003, “We will, in fact, be greeted as liberators.”

Corporate America

Corporate America is full of more examples. Executives of Detroit’s Big Three didn’t spend enough time brainstorming in the 1970s and 1980s about how their theory of the car market might be wrong. Wall Street and the Fed made the same mistake during the dot-com and housing bubbles. To pick an example close to home, newspapers didn’t spend enough time challenging the assumption that classified advertisements would remain plentiful for decades.

One of the best-selling business books in history — about negotiation strategy — is “Getting to Yes.” But the more important advice for us may instead be to go out of our way to get to no. When you want to test a theory, don’t just look for examples that prove it. When you’re considering a plan, think in detail about how it might go wrong.

Some businesses have made this approach a formal part of their decision-making: Imagine our strategy has failed; what are the most likely reasons it did? As Jason Zweig has written in The Wall Street Journal, “Gary Klein, a psychologist at Applied Research Associates, of Albuquerque, N.M., recommends imagining that you have looked into a crystal ball and have seen that your investment has gone bust.”

When you seek to disprove your idea, you sometimes end up proving it — and other times you can save yourself from making a big mistake. But you need to start by being willing to hear no. And even if you think that you are right, you need to make sure you’re asking questions that might actually produce an answer of no. If you still need to work on this trait, don’t worry: You’re only human.
