Home | Older Articles |     Share This Page
Creative Problem-Solving

P. Lutus Message Page

"God is subtle, but he is not malicious" — Albert Einstein.

I want to blame television, but I suspect that's too easy. I've been hearing about the decline of standards in American education, the absence of creative thinking skills in young people, but until recently I had no direct evidence. Now that I receive daily e-mails from people asking questions about the Internet, all doubt has evaporated — today's Americans cannot think for themselves.



In the largest sense, American society is breaking into two classes: a small class of people who produce ideas, and a much larger class of people who merely consume them.
The world view of these two classes is so fundamentally different that a chasm is opening between them that will soon swallow even the illusion of American democracy.
We are entering an era in history in which, if things go very badly, a small number of people trained to create ideas will completely dominate the lives of a vast sea of idea consumers, people whose lives are ruled by facts.

The belief in the authority of idea producers is the modern replacement for religion. Just like religion, this belief closes off a vast area of human experience, streamlines the equations of life and makes one's existence pathetically simple — all one need do is find out what views it is acceptable to hold.

My purpose in this article is to undermine that belief. I assert that there are no authorities in the realm of human ideas — each idea must be weighed against all other ideas, and ideas should be evaluated solely on their intrinsic merit, without regard to their source.

The above position will not be a fast-breaking story to someone trained in original thought, and neither will this: all these statements are one person's ideas and are most useful when included in a much larger collection of ideas, including the reader's own.

Here are some common misconceptions about ideas and idea production:

  Thesis: A holder of a college degree has an intellectual superiority that others lack.

Does anyone still believe this? Yes. And the fastest way to lose this belief is to talk to college graduates, as I regularly do. At its best, college will train you to think creatively, and prepare you for the reality that education never ends. At its worst, college will reinforce an inbred intellectual smugness, dress it in facts, and provide you with a document asserting your immunity to all future intellectual experience.

If this thesis held any intrinsic truth, then going to college would make you smart, a position for which there is no evidence. Going to a college doesn't make you smart any more than going to a bank makes you rich — unless, of course, you bring your own money to the bank. By the same token, if you arrive at college with your own intellectual resources — your own riches — you may well become "smarter," whatever that means in real human terms.

In the absence of that personal investment, as you leave the bank or college, you will end up with symbolic wealth instead of real wealth, and the trip will have been wasted. The expression, "I must be smart — I went to college" makes as much sense as "I must be rich — I have dozens of credit cards."

  Thesis: Consulting an expert is a more efficient way to solve a problem than simply working it out for myself.

This is only true if you are a lifelong idea consumer, an unenviable position. After the most basic facts have been taken into account, one is better off creating personal solutions to problems than searching for existing solutions. The reasons are almost too numerous to list; the most important is that solving problems yourself is how an idea consumer becomes an idea producer.

  Thesis: A computer program with a lot of predefined features is superior to one that allows me to create my own features.

I know this is a sudden shift from the general to the specific, but it serves to show how a bias created by television can influence other behaviors. The basic message of television is "Let me do it for you — you cannot do it alone, I am essential. Your experience is shaped by my ideas, your own ideas have no value at all. You cannot possibly detect humor, that is beyond the scope of your small brain, so I will signal the presence of humor by laughing for you. Laugh with me, think with me, become me."

The inevitable result is that computer programs are esteemed to the degree that they resemble television. For example, a really awful program like Microsoft's FrontPage receives high ratings, even though the program actively suppresses individual creativity and reduces the user to following a single behavioral pattern built into the program by its designers. It is instructive to submit a pre-existing HTML document to FrontPage, which then proceeds to erase any tags it doesn't recognize as its own. No error messages, no dialog: every tag it doesn't know about is simply eaten. Most people I have talked to eventually realize it is easier to stop using FrontPage than to try to exercise any personal control over its behavior.
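The tag-eating behavior just described can be sketched in a few lines of Python. This is of course not FrontPage's actual code, only a toy illustration of the failure pattern: keep what you recognize, silently discard the rest.

```python
# Toy sketch (an assumption, not FrontPage's real logic) of the failure
# mode described above: an editor with a fixed list of "known" tags that
# silently deletes everything else. No warnings, no dialog.
import re

KNOWN_TAGS = {"html", "head", "body", "p", "b", "i"}

def strip_unknown_tags(html: str) -> str:
    """Remove any tag whose name is not in KNOWN_TAGS, keeping the text."""
    def keep_or_eat(match):
        name = match.group(1).lower()
        # A recognized tag survives; an unrecognized one is simply eaten.
        return match.group(0) if name in KNOWN_TAGS else ""
    return re.sub(r"</?([a-zA-Z][a-zA-Z0-9]*)[^>]*>", keep_or_eat, html)

page = "<body><p>Hello</p><marquee>my banner</marquee></body>"
print(strip_unknown_tags(page))  # the <marquee> tags vanish without a word
```

The point of the sketch is the design choice, not the regex: the program's vocabulary is closed, so the user's work is trimmed to fit the designer's imagination.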

By contrast, consider Arachnophilia, my Web page workshop. I am using Arachnophilia only to make a point, and I am not trying to sell you my program, as you will discover if you visit the page. Arachnophilia is a good example of open software design: a design the user can change in the field to meet requirements the programmer (me) cannot possibly anticipate, including requirements that do not exist at the time of development. By the way, Arachnophilia is a descendant of my classic best-seller "Apple Writer," an early word processor for the Apple II that was also field-extendable by way of a macro language (in an era when macro languages were virtually unheard of).

But, and as usual, I digress. My goal in writing Arachnophilia was to replace its immediate predecessor WebThing, which could be programmed by the user only to a small extent (one special toolbar, 32 custom commands). But while distributing WebThing I noticed people were asking for features and HTML tags I couldn't possibly fit into the framework of any finite-sized computer program. So I made sure the new design included a way for users to add any number of additional commands and tags. The user can even create new toolbars, then fill them with commands. This feature is especially important in HTML development, because HTML is itself rapidly changing, and it is easier to add tags to an existing program than to create a new version of that program every time a raft of new tags appears.
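The open design described above can be sketched as a data-driven command table. This is an assumed illustration, not Arachnophilia's source code: commands live in a table the user can edit in the field, so a new tag never requires a new version of the program.

```python
# Minimal sketch (assumed design, not Arachnophilia's actual code) of a
# field-extendable editor: commands are data, not compiled-in features.
user_commands = {
    # command name -> (opening tag, closing tag); users add rows freely
    "Bold":       ("<b>", "</b>"),
    "Blockquote": ("<blockquote>", "</blockquote>"),
}

def add_command(name: str, open_tag: str, close_tag: str) -> None:
    """Register a tag the designer never anticipated."""
    user_commands[name] = (open_tag, close_tag)

def apply_command(name: str, selection: str) -> str:
    """Wrap the selected text in the tags bound to a command."""
    open_tag, close_tag = user_commands[name]
    return f"{open_tag}{selection}{close_tag}"

# A tag invented after the program shipped works immediately:
add_command("Marquee", "<marquee>", "</marquee>")
print(apply_command("Marquee", "breaking news"))  # <marquee>breaking news</marquee>
```

Because the command table is ordinary data, "any number of additional commands and tags" costs nothing: the program's vocabulary is open where FrontPage's is closed.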

After I finished Arachnophilia I expected an end to the flood of e-mail asking for new tags. After all, it would now take less time for a user to add a new tag than to write and ask me to do it. But, to my surprise, the volume of e-mail increased!

After a while I realized what the problems were. The first problem was that people didn't expect to be able to change the behavior of the program, because virtually no Windows programs let you do this. The second problem was that no one bothered to read the help screens. Basically, if my program could be made to produce a result immediately, without any study or insight, then it was useful, and any capabilities behind that external appearance might as well not exist. Yes, just like television.

I personally think computer programs should be evaluated on criteria that are not presently thought important (because the average user is happy thinking of his computer as though it were a toaster — press the button, wait for smoke). The most important missing question is: how easy is it to add a behavior to the program the designer didn't think of? If this question were to be regarded as important, most Windows programs would fail. The next question would be: how much does the program learn from the user?

  A summary — key elements of problem-solving

Here is some common-sense advice for solving problems. If you are in the minority who are trained to think creatively and you think these points are too obvious, that is only because you haven't read my e-mail.

Thesis: If the problem you are trying to solve has already been solved by others, by all means learn that solution first, even if you intend to improve on it.
Everyday expression: If your home page doesn't work, do not assume that (1) the HTML editor you have chosen is defective, (2) nature is out to get you, or (3) you are too stupid to learn HTML. Instead, browse the Web, where in a few short minutes you will find millions of pages you can download and examine. Find a page with the appearance you want for your page. Download it. Look at its HTML code. Notice that the tags are not the same as yours. Change your tags.

Thesis: Begin by solving the simplest version of the problem.
Everyday expression: A page that says "Hi, this is my page!" — and works perfectly — is morally superior to one that has a faint chance of proving Fermat's Last Theorem in real time, but crashes every browser it encounters.

Thesis: Build your solution incrementally.
Everyday expression: Add to your page in small parts. Test the new page as you work. Otherwise you will have no chance to discover the tiny flaw, entered three hours ago, that now makes smoke come out of your computer.

Thesis: Avoid focusing on a single solution. Try to look at problems in fresh ways.
Everyday expression: Deliberately abandon your pet theory about the nature of the problem, and adopt instead a theory that may seem silly at first glance (like the ridiculous idea that the continents are drifting around. Or that birds are descended from dinosaurs. Imagine!).

Thesis: Avoid hidden assumptions.
Everyday expression: Sometimes when your page crashes the browser, it is the fault of the browser, not the page.

Thesis: Be patient and persevere.
Everyday expression: If you are patient enough and look in enough places, nature will reveal her secrets. Every great inventor discovered this first.

Thesis: Don't expect to find permanent answers.
Everyday expression: Like great poems, great ideas are never finished, only replaced by others.
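The "build incrementally, test as you work" advice above can even be mechanized. Here is a hedged sketch, a deliberately crude balanced-tag check one might run after each small addition to a page; it ignores void elements such as <br> for simplicity and is no substitute for a real validator.

```python
# Crude incremental sanity check (an illustration, not a real HTML
# validator): verify that open and close tags nest properly, so the tiny
# flaw entered three hours ago is caught three hours sooner.
import re

VOID = {"br", "hr", "img", "meta", "link", "input"}  # tags with no close

def tags_balanced(html: str) -> bool:
    """Return True if every non-void tag is properly opened and closed."""
    stack = []
    for m in re.finditer(r"<(/?)([a-zA-Z][a-zA-Z0-9]*)[^>]*>", html):
        closing, name = m.group(1) == "/", m.group(2).lower()
        if name in VOID:
            continue
        if not closing:
            stack.append(name)
        elif not stack or stack.pop() != name:
            return False          # close tag with no matching open
    return not stack              # leftover opens are also a failure

print(tags_balanced("<p>Hi, this is my page!</p>"))  # True
print(tags_balanced("<p><b>oops</p></b>"))           # False
```

Run after each small edit, a check like this turns "my page crashes the browser" into "line I just typed is wrong," which is the whole point of incremental construction.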

Further reading: How we confuse symbols and things | Why are computers so hard to use?

Download this article in Word 7.0 format | Download this article in RTF format


These Pages Created and Maintained using  Arachnophilia.

