There’s a book on my shelf at my office that usually elicits a chuckle from visitors: Why Smart People Can Be So Stupid. I don't remember how I came across this collection of 15 scholarly essays on the subject of stupidity--perhaps some “smart” friend believed I needed help--but I keep it prominent on my shelves to remind me that even the brightest minds are capable of blunders of breathtaking scope and magnitude.
Most of us in the world of science and engineering esteem “brains.” Indeed, many of us tend to judge others mostly on the basis of intelligence and technical skill. Some people in the outside world have similar tendencies: A person with a Ph.D. from a top university is assumed to be brilliant. We know better, of course.
We know, for one, that there's a downside to all that gray matter. Although being smart and working hard to become an expert are valuable and laudable, the pursuit of those skills does not guarantee that you'll acquire the resources you need to be effective. An overreliance on brains can even leave you vulnerable to commonsense mistakes and stunt your ability to innovate. Thinking too much can lead to paralysis or obsession with marginal details. I call this the “Curse of Brains.”
One manifestation of the curse of brains is the tendency to overanalyze. A big CPU allows you to map out the consequences of your personal or professional choices for years into the future. Should I take postdoc A or fellowship B? How will this affect my chances of getting professorship X? Funding? Tenure? Oh, and what about a family?
It’s good to think through the consequences of your choices, but intellectually smart people sometimes analyze so much that they delay taking any action while creating for themselves a great deal of mental work and anxiety. Most important professional choices involve a paucity of information, which means that rational analysis alone is either inadequate or really, really hard. Decisions require a combination of analysis, judgment, and “gut” (whatever that is).
At the same time, overanalysis can lead “smart” people to obsess about details that are, in the grand scheme of things, minor. Our ability to analyze can cause us to focus on the portions of the problem that are analyzable--and we analyze and analyze and analyze. This is especially a problem in the profit-seeking world, where time is of the essence. I encountered this issue in one start-up I was involved in. The CEO, a talented Ph.D., spent hours developing detailed spreadsheets projecting monthly expenditures down to tens of dollars. But portions of the business experienced unpredictable monthly swings on the order of $10,000. All the effort to chase down those small (but quantifiable) expenses was irrelevant in the face of the larger uncertainties.
You can reduce the potential for analysis paralysis and “minutiae obsession” by taking several steps while weighing a choice or analyzing a course of action. First, indulge in as much rigorous analysis as you want, but for a limited time: Give yourself a hard deadline for finishing your analysis, and make sure that deadline falls well in advance of when you actually have to decide. Structuring your choice in two-column lists of pros and cons, and journaling about the decision and its possible outcomes, can help your brain and your gut hook up. One person I met used a coin toss for critical decisions--but in an unusual way. As she tossed the coin and watched it flipping over and over in the air, she allowed herself to feel what her heart “wanted” the outcome to be. If she found herself disappointed when the coin landed on one side, she took that emotional reaction as a serious indicator of what she felt about the choice. And often, she allowed that to guide her.
One interesting aspect of expertise is that, once it is acquired, it is impossible not to know what you know. In approaching any problem, the expert tends to rely on knowledge and experience rather than consult others or explore nontraditional pathways. (The obsession with minutiae described previously might be viewed as a particular example of this: Intellectuals are analysis experts, so they analyze even when a different approach might be better.) Examining a problem through the lens of your expertise can stifle creativity and innovation; often, it is the thoroughly indoctrinated expert who is the last to convert to a new mode of thinking. This can be observed repeatedly in the history of science: Each time a major new concept such as quantum mechanics, natural selection, or the asteroid extinction of the dinosaurs emerged, opposition was often fiercest from the “experts” in the field.
So what can one do to break out of the expert’s prison? Innovation, by definition, comes from reexamining a problem from a new perspective--and the more stuck you are in your intellectual ways, the harder that is to do. Innovation often requires us to hold on to some key facts and ideas while setting much of what we think we know aside.
One way to avoid this trap is to move (physically as well as intellectually) to a new field in which you lack expertise and must rely on your broader problem-solving and analytical skills. Keep it fresh. Or breaking out may simply mean learning something new that complements what you already know. Sadly, much of how we do science these days seems to work against this approach; often, professors are hired for their narrow field of expertise rather than for the breadth of their intellectual contribution.
This is why interdisciplinary research can be so fruitful. When people from outside the discipline examine technical problems, they often explore solutions from a nontraditional approach.
Teaching and mentoring can be another avenue for breaking out of the expert’s prison--by proxy. Working with bright, uninitiated colleagues forces you to explain and justify “facts” that you may never have had the opportunity to examine and question.
The final curse of brains is, I think, the most insidious: fear of failure. Because intellectually smart people esteem intelligence so highly, they can find it difficult to risk intellectual failure. Rather than take a risk and explore a new problem, or propose an unconventional solution, they may take an incremental approach to a familiar problem. Science's review processes--grant, publication, and tenure review alike--tend to reinforce and reward that incremental approach.
The dismal levels of R&D funding these days may even make it seem like the right approach. But if you ask the department chairs at leading research universities (or successful entrepreneurs in the private sector), they probably will tell you that they prefer to hire people who swing for the fences. Whether they follow through at tenure time is another question.
This column has focused on a cluster of related disadvantages often faced by big-brain types such as you and me. When it comes to making good decisions, your big, beautiful brain is a huge resource--but it's not sufficient. You need to supplement your abundant analytical horsepower with something else. That “something” can be any number of things, but one common form it takes, I believe, is passion. True, passion can lead you astray--sometimes the safe road is the right road--but that's a chance we have to take.
Peter Fiske is a Ph.D. scientist and co-founder of RAPT Industries, a technology company in Fremont, California. He is the author of Put Your Science to Work and co-author, with Dr. Geoff Davis, of a blog (at phds.org) on science policy, economics, and educational initiatives that affect science employment. Fiske lives with his wife and two daughters in Oakland, California, and is a frequent lecturer on the subject of career development for scientists.