Robert Bunsen was a renowned chemist, the kind of serious 19th-century German academic whose photograph makes you glad you didn't attend graduate school in an era of three-piece suits and puffy neck beards. During his illustrious career, he found an antidote for arsenic poisoning, co-discovered two elements, and improved galvanic batteries.
He also, at some point, made a few changes to a laboratory burner. And that's why you've heard of him.
Bunsen shares this fate with his contemporary Emil Erlenmeyer, another bearded 19th-century German chemist. How many scientists remember that Erlenmeyer discovered isobutyric acid or proposed the structure of naphthalene? Not me. I had to learn this from Wikipedia. But everyone recognizes the classic conical shape of a glass Erlenmeyer flask as an icon of laboratory science. It even appears in Super Mario Bros. 2.
You've heard of these men. But you probably haven't heard of William Robert Grove, whose electrochemical cell Bunsen improved upon, or Friedrich Kekulé, who mentored Erlenmeyer, or Hund der Opersingt, because I just made him up, and his name means "dog that sings opera."
What would Bunsen think if he knew that the ubiquity of his namesake burner had inspired the first name of the Muppet Dr. Bunsen Honeydew? How would Erlenmeyer feel about the fact that his flask transports Mario to a subspace where harvested turnips yield giant coins? And what would bacteriologist Julius Richard Petri make of all of this?
All three being 19th-century Germans, they probably wouldn't find it very funny.
Still, these men have accomplished what you and I probably haven't: Albeit through quirky, unsatisfying means, they have achieved scientific immortality. You may not remember exactly what they did (even though you just read it a few paragraphs ago), but if you were playing Outburst, and the category was "scientists," you might just say their names.
If all goes well, our science careers will sustain us over the course of our lives. But then what? How can we ensure that future students will read our names when they open their science textbooks on the iPad 15 (featuring a 500-terabyte hard drive and a 2-megapixel camera)? Here are some sure-fire ways to stay out of the ash heap of scientific history.
We're celebrating April Fools' Day with this article and another, "Slipping Humor Into Scientific Presentations," in which Elisabeth Pain, our contributing editor for Europe, interviews experts about how to deploy humor to get your science across.
Students worldwide will assume you were a great thinker when they're forced to memorize your name and associate it with a formula or postulate that they'll forget as soon as the exam is over. Your decades of work may be reduced to a single algebraic expression -- the formula "S = k. log W" is actually engraved atop Ludwig Boltzmann's tombstone in Vienna -- but at least you'll be remembered. And if you're a unit of measure, like Newton, Watt, Joule, Ångström, or Coulomb, you'll be remembered each time a teacher deducts points for omitting the unit of measure.
Planck, Avogadro, and Rydberg may already have their own constants locked up, but there are literally an infinite number of constants remaining in the universe to be owned. So 6.02 × 10²³ is already taken; has anyone claimed 6.02 × 10²⁴? Grab it now before it's gone. I call dibs on 5.
Many important terms end in the suffix "-ian" besides Armenian surnames. They're adjectives named after scientists, and they're science's way of saying, "That thing you just did? That is so this guy." If you don't believe me, remember that a Gaussian distribution of Mendelian genetics shows that Darwinian evolution, plotted using a Hessian matrix of Cartesian coordinates in Euclidean space during the Copernican revolution, uses a Laplacian filter on Newtonian fluids experiencing Brownian motion to demonstrate that Aristotelian physics is Freudian.
People love when scientists cure diseases, but seriously, how often do we do that? Not nearly as often as we discover new ones to which to append our names. From Asperger syndrome to Creutzfeldt-Jakob disease to Huntington's chorea, if you're lucky enough to plant your flag in a novel pathogen, you can guarantee that thousands, or even millions, of people will think of your name while suffering. Just make sure you're renowned for discovering the disease, not for contracting it. (Sorry, Lou Gehrig.)
A parable is a quirky anecdote, usually false, that future generations will love to retell regardless of its veracity, such as "Sir Isaac Newton discovered gravity when an apple fell on his head" or "Archimedes ran naked through the streets shouting 'Eureka!'" or "Einstein disproved cold fusion with the help of Tim Robbins and Meg Ryan." Start thinking now about the myths you want to perpetuate about your own research, because no one is likely to say, "Did you know that so-and-so discovered this while surfing YouTube in an Incognito Google Chrome browser window?"
The beauty of this method is that you don't actually need to discover or solve something in order to be famous. You merely need to ask a difficult question and walk away, leaving others to debate the answer. Ideally, you should also die at this point, but only after writing something tantalizing in the margin of your lab notebook, such as: "I have discovered an elegant solution to this problem, and I'll totally write it down, just as soon as I eat a spoonful of this yogurt. Is this yogurt still good? The expiration date is all smudged. Let's find out."
The person whose grand hypothesis is ultimately supported by evidence is the person who ends up revered. But to learn about the field, future scientists will study not only the correct theory but also the wacky flops -- and that's where you come in. Remember Jean-Baptiste Lamarck, who went down in history as "the man who was wrong about what Darwin was right about"? Or the highly inaccurate models of the atom named after Bohr, Lewis, and the otherwise revered Dr. Plum Pudding? This may sound like an ignoble path to veneration, but it's your choice: Would you rather be remembered as "the person who got it wrong" or as "who?"
Come on! You're a scientist! You play with drugs and things, right? Surely some combination of those drugs could make you impervious to death; you just haven't tried hard enough. Didn't I read that some scientist made himself immortal? I could have sworn I saw that somewhere. Maybe it was in People. Or am I thinking about vampires?
* * *
It turns out that Bunsen and Erlenmeyer actually collaborated at one point to study fertilizers. I can picture the two of them, hard at work in their lab, trying in vain to heat a fluid in a beaker:
Bunsen: Man, Erlenmeyer, these beeswax candles suck.
Erlenmeyer: Yeah, and this cylindrical flask isn't exactly doing me any favors.
Bunsen: Huh. You know what would be great?
And history was born.
Scientists all want to be remembered. But ask yourself: If they were to name something after you based on the work you're doing right now, what would it be? Bench paper? A well-maintained EndNote file? A coffee pot? Will my grandchildren learn in school about The Ruben Method of Applying Hundreds of Labels to Microcentrifuge Tubes While Listening to NPR Podcasts?
Or maybe we just want our work to be remembered, eponym or no eponym. But each result we publish is merely a Post-it note on the vast and slippery wall of geologic time -- will anyone really remember the years you spent in the lab purifying your starting materials or trying to figure out whether your instrument was working properly? Surely 99% of what we do is so mundane as to be fame-proof.
Maybe there's something more important, even, than wanting our work to be remembered: We all want our work to be useful. That's the heart of our drive for intellectual immortality and the reason we revere the luminaries in the first place: Petri's dish is such a great idea that my lab buys thousands of them, none of which we would be able to examine without the work of Robert Hooke, Antonie van Leeuwenhoek, Louis Pasteur, or Robert Koch. Their discoveries have proved so useful that our work depends on them.
So when pondering your legacy, let utility itself be your guiding principle. Remember that your job, besides making discoveries, is to give the next century's scientists the tools with which to make discoveries. Truly that is the Rubenian way of thinking.
Adam Ruben, Ph.D., is a practicing scientist and the author of Surviving Your Stupid, Stupid Decision to Go to Grad School.