Writing software is hard, particularly when schedules keep programmers “nose to the grindstone”; every so often, it's important to take a breather, look around at the world, and discover what we can find - ironically, what we find can often help us write software better.

Consider yourself an educated developer? Think you've got a good grip on analytical processes? Or, as a popular game show in the U.S. puts it, “Are You Smarter than a Fifth Grader”? Let's find out.

Imagine a farmer owns a rectangular plot of land, whose perimeter is 110 meters in length, and whose area is 700 square meters in size. What are the dimensions of the land?

Take a moment and see if you can work out the answer before continuing on. No points are awarded, by the way, for brute-forcing your way through to an answer.

Here's another one: imagine that a well 3 meters in diameter looks like a perfect cylinder, open at the top. For reasons as yet unknown to the world, a little boy throws two sticks, four and five meters in length, respectively, into the well, and they happen to land such that they touch each other somewhere in the middle. Find the height of the point at which they cross.
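For readers who want to check their eventual answer, the geometry can be sketched numerically. This is a hedged sketch that assumes the classic “crossed ladders” setup: each stick leans from the base of one wall to a point on the opposite wall, the heights reached follow from the Pythagorean theorem, and the crossing height works out to `a*b / (a + b)`:

```python
import math

# Hedged sketch of the well problem: a 3 m wide cylinder,
# with sticks of length 4 m and 5 m leaning across it.
width = 3.0                            # diameter of the well
stick_a, stick_b = 4.0, 5.0

# Height each stick reaches on the far wall (Pythagoras).
a = math.sqrt(stick_a**2 - width**2)   # sqrt(7), about 2.646
b = math.sqrt(stick_b**2 - width**2)   # sqrt(16) = 4.0

# The sticks are the lines y = (a/w)x and y = (b/w)(w - x);
# setting them equal gives a crossing height of a*b / (a + b).
h = a * b / (a + b)
print(round(h, 3))                     # about 1.592 meters up
```

The point of the exercise, of course, is deriving that `a*b / (a + b)` relationship yourself, not running the arithmetic.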

Both of these problems come from mathematics textbooks schoolchildren are using - the first from a textbook for junior-high-schoolers (ages 13 - 15), and the second from one aimed at upper elementary school kids (ages 10 - 13).

What's disturbing about these two problems is that they are actually much harder to solve than one might hope: “If you solve [the well problem] within an hour (!), you'll belong to the elite one percent of the people we tested who managed to get the right answer within that time. What's more, everyone we tested had at least an undergraduate degree either in mathematics, engineering, or computer science.” (**How to Solve It: Modern Heuristics**, 2nd Ed, p. 5)

So why was it so easy in school?

### When Schooling Fails

For most of those who fail to solve the problem, the failure usually lies in the act of “getting started”: once somebody suggests the right algorithm or formula to use to solve it, it's a relatively simple process to plug values into the formula, do the appropriate add/subtract/multiply/divide thing, and arrive at the result. But getting to the point of understanding which of the seemingly-infinite number of formulae to use is often the problem. Is this a problem of slope (y = mx + b)? Is this a calculation of the area under a curve? By the time most of us graduated university (even those of us who graduated with liberal arts degrees), we'd been exposed to literally hundreds of formulae and concepts that might work here.

And yet, somehow, in the fifth, sixth, or seventh grades, these problems were not only approachable, but we did literally dozens (if not hundreds) of them on a weekly basis as part of homework. What's different between then, and now?

Context.

Back in school, whenever a problem was presented to us, it was always within the context of the skill necessary to solve it: after learning about the quadratic equation, we were presented with problems that demanded its use. After being taught the slope equation, we were presented with problems that demanded its use. And so on. As a matter of fact, remember how final exams that covered the entirety of the semester or quarter seemed incredibly unfair? It meant that we had to go back and re-study everything we'd learned over the past 10 or 15 weeks - criminal!

And yet… Once we get out into “the real world”, few, if any, of the problems we face will come wrapped in the context of the tools necessary to solve them. Learning how to approach problems “without context” is a skill that quickly earns respect, yet rarely shows up in a technical interview.

Fortunately, in 1944, somebody started thinking about the process of approaching problems without the solution already in mind: “The author remembers the time when he was a student himself, a somewhat ambitious student, eager to understand a little mathematics and physics. He listened to lectures, read books, tried to take in the solutions and facts presented, but there was a question that disturbed him again and again: ‘Yes, the solution seems to work, it appears to be correct; but how is it possible to invent such a solution? Yes, this experiment seems to work, this appears to be a fact; but how can people discover such facts? And how could I invent or discover such things by myself?'” (**How to Solve It**, Preface to the First Printing.)

George Polya, a Hungarian born in 1887, wrote those words in the first printing of his book, **How to Solve It**, a book aimed specifically at the process by which we discover the solutions to problems. In it, Polya describes a core method by which people should approach problems of a mathematical nature, and it's a heuristic that applies equally well to working out software architectures, designs, or algorithms:

**Understand the problem.** What is the unknown? What are the data? What is the condition?

**Devise a plan.** Have you seen a problem like this before? Have you seen the same problem in a slightly different form? Do you know a related problem?

**Carry out the plan.** Check each step. Can you see clearly that each step is correct? Can you prove that it is correct?

**Review the results.** Can you check the result? Can you check the argument? Can you derive the result differently? Can you see it at a glance? Can you use the result, or the method, for some other problem?

To many, listing these out explicitly feels redundant. And for those for whom the process is already so deeply intuitive as to eschew formal analysis, boo-yah! For the rest of us, for whom a problem like the above doesn't intuitively, immediately, and explosively light up a solution in our heads, having an explicit process to follow serves as a helpful framework.

### Testing it out

As an example, consider the above farmer's field problem. *Understand the problem* leads us to state, explicitly, that the goal is to identify the width and height of the farmer's rectangular field, which we'll label “x” and “y” for short. We have two pieces of data: the perimeter of the farm, which we know to be 110 meters, and the area of the farm, which we know to be 700 square meters.

*Devising a plan* suggests that even though we might not have seen a problem of this form before, we do have two closely-related ideas that are clearly recognizable from basic geometry way back in school: the perimeter of a rectangle is obtained by adding the height and the width values twice (2x + 2y), and that the area of a rectangle is obtained from the product of width and height (x * y). Given that we also know that both of these values describe the same farm, then we have the realization that we have two equations with two unknowns - a situation which then suggests that we can solve for one in terms of the other.

*Carrying out the plan* then says we do the math thing and work to isolate an unknown by itself on one side of the equals sign:

```
2x+2y = 110 -> x + y = 55 -> y = 55 - x
```

And then plug that value in to the other equation:

```
x * y = 700 -> x * (55 - x) = 700 ->
55x - x^2 = 700
```

From here, it's relatively simple math to figure out a value for x (assuming you remember the quadratic formula - if not, it's an easy Google/Bing search away), and arrive at the answer: x is either 20 or 35 (leaving y to be the other value) for the dimensions of the farmer's field.

Although it may seem like the work is done, *checking the result* is critical to make sure that there wasn't a mistake somewhere along the line, or that other possible answers are out there. In this case, if we plug in 20 and 35 for the perimeter and area formulae, we get `(20 + 20 + 35 + 35) = 110` and `(20 * 35) = 700`, meaning the answer is correct.
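As a sanity check, the same derivation can be carried out in a few lines of code - a sketch applying the quadratic formula to 55x - x^2 = 700, rearranged into standard form as x^2 - 55x + 700 = 0:

```python
import math

# Solve x^2 - 55x + 700 = 0 via the quadratic formula
# (a = 1, b = -55, c = 700, from rearranging 55x - x^2 = 700).
a, b, c = 1, -55, 700
disc = b**2 - 4*a*c                      # 3025 - 2800 = 225
x1 = (-b + math.sqrt(disc)) / (2*a)      # 35.0
x2 = (-b - math.sqrt(disc)) / (2*a)      # 20.0

# Review the results: both roots must satisfy the original data.
for x in (x1, x2):
    y = 55 - x
    assert 2*x + 2*y == 110              # perimeter
    assert x * y == 700                  # area
print(sorted([x1, x2]))                  # [20.0, 35.0]
```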

### Outside the Math

“All of which is well and good if the developer reading this article is trying to undertake some kind of mathematical puzzle,” says the skeptic, “But I fail to see how this can solve a problem in my completely non-mathematical code.” As it turns out, this particular process works in an almost identical fashion when attempting to debug a misbehaving program. Consider:

**Understand the problem.** What is the exception being thrown? Is that exception occurring because of an earlier exception being thrown? What are the data elements in place when that exception is being thrown? In the case where no exception is occurring, how do we know there is a bug? What is being seen? What isn't being seen?

**Devise a plan.** Have you seen a problem like this before? Have you seen the same problem in a slightly different form? Do you know a related problem? Is this an exception that has occurred elsewhere in the codebase before? Does this exception bear any resemblance to other exceptions that have been thrown before?

To be precise, a developer should establish a clear mental picture of what the problem is. Create a hypothesis, and state it out loud: “I believe the problem is that the `Person` object has a `null` `FirstName` property, which is leading to the `NullReferenceException` being thrown in the data-access layer.” By establishing a hypothesis, the developer can now create the conditions (either by unit test or by running the program with certain data values) by which the code will either fail, meaning the hypothesis holds true so far, or not fail, meaning the hypothesis, as stated, is flawed and needs to be revised. No code should be changed until the hypothesis has been proven correct! Or, if code needs to be changed to more easily prove the hypothesis, then once that hypothesis proves false, the code changes made should immediately be backed out to the original form, and a new hypothesis formed.
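Sketched in code (Python standing in for the article's .NET example; `Person` and `save_person` are hypothetical stand-ins for the data-access layer), the hypothesis becomes an executable test:

```python
# A minimal sketch of hypothesis-driven debugging; Person and
# save_person are hypothetical, illustrative names.
class Person:
    def __init__(self, first_name):
        self.first_name = first_name

def save_person(person):
    # Hypothetical data-access code that assumes first_name is present.
    return "INSERT INTO people VALUES ('" + person.first_name + "')"

def test_hypothesis_null_first_name_fails():
    # Hypothesis: a Person with a missing FirstName makes the
    # data-access layer blow up.
    try:
        save_person(Person(first_name=None))
        return False   # no failure: the hypothesis is flawed; revise it
    except TypeError:  # Python's analogue of a NullReferenceException
        return True    # failure reproduced: the hypothesis holds

print(test_hypothesis_null_first_name_fails())   # True
```

Only once this test fails for the predicted reason does the developer move on to changing code.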

**Carry out the plan.** Once the hypothesis holds true, the developer can establish a plan to fix the code - creating an invariant in the `Person` class to ensure that `FirstName` is never `null`, perhaps, or establishing a default value for `null` `FirstName`s, or any other solution that ensures `FirstName` is never `null`, or even changing the data access code and/or the database schema to handle the possibility of a `null` `FirstName`. And so on.
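One of those fixes might look like the following - a sketch, again with a hypothetical `Person` class, enforcing the invariant in the constructor:

```python
class Person:
    """Sketch of one fix: first_name can never be None."""
    def __init__(self, first_name=None):
        if first_name is None:
            first_name = ""      # default value; the business rule
                                 # might instead demand raising an error
        self.first_name = first_name

# The invariant holds no matter how the object is constructed.
print(Person(None).first_name == "")   # True
print(Person("Ada").first_name)        # Ada
```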

In some cases, “carrying out the plan” will require the developer to meet with other folks (customers, developers, project champions, etc.) to determine what the right fix should be - can a Person in the system have an empty FirstName? Often these are business decisions, not technical ones, and therefore outside of the developer's realm of responsibility.

**Review the results.** Lastly, once the fix is in place, the developer needs to verify that the fix works. This is where the benefit of having written the unit test back in “devising a plan” pays off - by simply running the unit test suite a second time, the developer will demonstrate that the fix either works or doesn't. Having a full suite of unit tests will also ensure that if the fix breaks something else elsewhere in the program, the developer will know about it quickly and can either back out the fix-that-isn't, or dig more deeply and discover the root cause behind an even deeper defect.

### Polya the Designer

This particular process also works well not just for debugging scenarios, but also for creative ones: developers and/or architects seeking to create an object model or architecture for a new project.

Consider the steps and the corresponding activities involved in creating a new architecture or class design:

**Understand the problem**. What are we trying to build? It seems obvious that we have to know what the project is about before we can build something to do what needs to be done, but numerous software projects began with a predetermination to “build a Web app” or “use SharePoint” long before the project requirements were understood.

**Devise a plan.** The most important exercise the software architect or designer engages in is the “What-if” game. “We have to build a web app allowing users to post and retrieve content, but it has to be massively scalable; what if we use CouchDB as the back end?” The same questions from the debugging story apply here as well: have you seen a problem like this before? Have you seen the problem in a slightly different form? Can you take what you know about those other, known, scenarios, and apply it here?

Be warned, however, not to take the similarity approach too strongly-just because your last project used a rules engine doesn't mean every project you approach from then on needs one. Familiarity creates a sense of comfort, leading the unwary to try to “fight the last war” all over again.

**Carry out the plan.** Obviously, in the case of an architecture, carrying out the plan means building the system. While we could just state that and move on, the wary architect, always aware that architectural decisions sometimes don't reveal their consequences until very late in the project lifecycle, will look for ways to verify the plan's validity long before then, usually through some kind of research spike or prototype/proof-of-concept. (Just make sure the boss doesn't know about it, or you'll find your prototype being shipped off to the data center!)

**Review the results.** Does the prototype work? Is the system doing what it was intended to do, as often, as fast, and for as many simultaneous users, as it was supposed to? Software development, in many ways, is about managing pain-when it hurts, something has to change. But without constant feedback, without continuous review of the results, the architect will never feel that pain, and the software will just eventually wither away and die of neglect.

### Summary

Polya's work dates back to a time when computers were more fiction than fact - while he was around as the then-nascent computer industry took off, his work never really touched on computers in any meaningful way. Despite that, his four-part recipe remains a powerful way to think about approaching problems that might at one point have seemed unsolvable.

Some readers will point out that this same four-step approach can be used for a variety of different things; I heartily agree. In fact, one of my most recent book acquisitions, *Cooking for Geeks*, happens to suggest advice that follows quite closely.

But we'll talk more about books later. For now, jot Polya's four steps down somewhere handy, and the next time you're stumped by a problem, give them a try.