Last December, a massive earthquake deep beneath the Indian Ocean generated a tsunami that killed nearly 300,000 people. In 1700, the Cascadia Subduction Zone in the U.S. Pacific Northwest unleashed a quake of similar magnitude that also generated tsunamis. Recent global positioning system (GPS) data show that the New Madrid fault zone near St. Louis--which shook the region with earthquakes equivalent to magnitude 8.0 on the Richter scale in 1811 and 1812--is gradually accumulating strain comparable to that which drove those catastrophic quakes. And then there is California's San Andreas Fault, famous as an ever-looming threat.

The specter of such enormous destruction would, you might think, create good job prospects for scientists interested in that sort of thing. But just like the rumored 'big one' that the field aims to predict, the boom in earthquake-science careers hasn't happened yet. Earthquake science is a profession still dominated by academic study, with a few jobs available in government and a few others in the construction industry, where earthquake experts dispense advice on the design and siting of buildings and other structures. Predicting earthquakes is still very much a niche science.

Indeed, many high-minded scientists would be surprised to learn that career opportunities for earthquake scientists--especially seismologists--are affected more by the movement of oil prices than by the movement of the earth. Current high oil prices have the traditionally cyclical U.S. oil industry on an upswing, because they make it worthwhile to exploit smaller reserves. To find those reserves, oil companies need seismologists and geologists. "Many of the techniques and fundamental science [of studying earthquakes] can be very similar to those used in exploration. You're just tuning to a different frequency," says Steve Malone, professor of geophysics in the department of earth and space sciences at the University of Washington and director of the Pacific Northwest Seismograph Network.

But even if the Earth hasn't moved, there has been some change in earthquake science in recent years, as scientists have realized they need more data. This, in turn, has led to a few new career opportunities. Recent years have seen the establishment of data-collection initiatives like the National Science Foundation's EarthScope and the U.S. Geological Survey's Advanced National Seismic System. Some of the increased interest can be traced to the early 1990s, when steady improvements in raw computing power made clear that earthquake science was limited by a lack of data. "People realized that we had to have a quantum leap in data collection to get around those limitations," says Arthur Lerner-Lam, associate director for seismology, geology, and tectonic physics at Columbia University's Lamont-Doherty Earth Observatory.


Building a GPS station on Mount St. Helens in Washington. Photo courtesy of EarthScope

Institutes Get Ready to Rumble: State-of-the-Art Data Generation

The push for more data started as a grassroots movement within the earthquake-science community. The project that became EarthScope began that way and was later adopted by the National Science Foundation (NSF), which officially launched it in 2003. A unique effort to study the structure and evolution of the North American continent, EarthScope consists of observatories and networked instruments, including seismic arrays that produce three-dimensional images of the continental crust and the underlying mantle. GPS receivers and satellite radar imagery map small movements across faults. In late July, EarthScope drilled across the San Andreas Fault at a depth of two miles, providing a literal window into a location that actively generates earthquakes.

Another data-collection effort, USGS's Advanced National Seismic System (ANSS), is more applied. When complete, it will consist of more than 7000 instruments measuring shaking, both on the ground and in buildings. The data will give engineers information about how buildings and specific sites respond to earthquakes, and give earthquake scientists material for studying earthquake processes, ground structure, and ground dynamics.

On an international scale, the Incorporated Research Institutions for Seismology (IRIS) is a university consortium that maintains a global network of seismic instruments to measure shaking from earthquakes, volcanic eruptions, and other events. It distributes the network's data to scientists and maintains a stable of portable seismographs that researchers can borrow for short-term projects.

These networks and others like them are creating their own small job market, because the ongoing rollout requires people experienced in installing seismographs and other field instruments. That is putting the squeeze on some academic groups, says Malone, because the new positions pay better than academia does.

The influx of raw data created by these programs and others is now putting pressure on the market for computer modelers, says David Applegate, senior science advisor for earthquake and geologic hazards at the U.S. Geological Survey. "We're really in an exciting period of new data streams becoming available, both in terms of seismic networks and geodetic (ground-deformation) monitoring and remote sensing capabilities. We want to develop physics-based models of how the crust operates and how faults operate. We want to take that information and turn it into hazard estimates," such as the probability of an earthquake in a given area and the likely extent of damage.
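
To give a flavor of the simplest such estimate: under a time-independent (Poisson) model--a baseline that real hazard models refine considerably--the probability of at least one earthquake in the next t years on a fault with mean recurrence interval T is 1 - exp(-t/T). Here is a minimal sketch in Python, with purely illustrative numbers:

```python
import math

def poisson_probability(t_years: float, recurrence_interval_years: float) -> float:
    """Probability of at least one event in t_years, assuming earthquakes
    arrive as a Poisson process with the given mean recurrence interval."""
    return 1.0 - math.exp(-t_years / recurrence_interval_years)

# Illustrative only: a fault that ruptures every ~200 years on average
# has roughly a 14% chance of producing an event in the next 30 years.
print(f"{poisson_probability(30, 200):.0%}")  # -> 14%
```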


Drilling to install a braced GPS station near Parkfield, California. Photo courtesy of EarthScope

Computer models can also address fundamental questions that weren't approachable just a few years ago. One example is the influence faults can have on one another. Faults don't exist in isolation--a major rupture on one fault can trigger consequences on others. Current hazard estimates are based on the recurrence intervals of earthquakes at individual faults, but models that account for a whole network of faults could offer better predictive power. "How movement in one fault affects the probability of earthquakes in other areas ... these kinds of interactions have been very hard to get at," says Applegate.
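
As a cartoon of the interaction idea--an illustration, not a description of any particular research model--a rupture on one fault imposes a small stress change on a neighbor, and dividing that change by the neighbor's steady tectonic loading rate gives a rough "clock advance" (or delay) for the neighbor's next expected event:

```python
def clock_change_years(stress_step_bars: float, loading_rate_bars_per_year: float) -> float:
    """Toy stress-transfer calculation: years by which a neighboring fault's
    next expected earthquake is advanced (positive stress step) or delayed
    (negative step), assuming steady linear loading between events."""
    return stress_step_bars / loading_rate_bars_per_year

# Illustrative values only: a 0.5-bar stress increase on a fault loading
# at 0.05 bars per year moves its clock forward by about a decade.
print(clock_change_years(0.5, 0.05))  # -> 10.0
```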

A Gap in Knowledge

These new data streams are rich veins to mine, but to take advantage of them, geoscientists need to be familiar with modeling software and computer programming. A surprising number of graduate students aren't, says Malone. Ten years ago, software and operating systems were far less polished, so many budding scientists gained direct experience programming their own machines.

"You could make the assumption that virtually anyone entering graduate school had some experience doing computer programming," says Malone. With today's more sophisticated programs, however, many students now enter his earth and space sciences program at the University of Washington with little more than a passing acquaintance with Microsoft Excel, "and that's generating a huge disadvantage for them."

Correcting this lack of preparedness is a relatively simple matter for the aspiring scientist: Take a class. Which programming language it teaches doesn't much matter; what's important is learning the underlying logic of programming. "Just the experience of being able to follow through the logic of writing a program [is important]. Once you learn the process, you can learn a new programming language fairly easily," says Malone.
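
The scale of task Malone has in mind is often modest. As an illustrative exercise--not drawn from any particular course--estimating roughly how far away an earthquake struck, from the lag between its fast P waves and slower S waves, takes only a few lines of Python once the logic is clear:

```python
# Assumed average crustal wave speeds in km/s; real values vary with geology.
VP = 6.0  # P (primary) waves arrive first
VS = 3.5  # S (secondary) waves arrive later

def distance_km(sp_lag_seconds: float) -> float:
    """Distance at which P leads S by the given lag.
    lag = d/VS - d/VP, so d = lag * VP * VS / (VP - VS)."""
    return sp_lag_seconds * VP * VS / (VP - VS)

# A 10-second S-P lag puts the quake roughly 84 km away.
print(round(distance_km(10.0)))  # -> 84
```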

For those ready to delve into modeling software, the Computational Infrastructure for Geodynamics (CIG) is a good resource; the organization develops software for computational geophysics and related fields.

In Academic Research, Not Much Is Shaking

The job market is more sluggish in academia. Although NSF and other agencies have poured money into data-collection infrastructure projects over the past several years, the funds available for research grants have remained nearly stagnant, says Kaye Shedlock, program director of NSF's EarthScope--even after disasters like the recent tsunami.

A Swiftly Changing Landscape

In the early 1990s, the field's bottleneck was a lack of data. That bottleneck--and with it, perhaps, the job market--may now be shifting. "Possibly that will [change] because a lot of new data will become available, and clever people will find ways to use it in new and intriguing ways," says Malone. But that assumes resources will be available; in the near future, there may be a glut of data and not enough research dollars to train and support the scientists to analyze it.

As in many areas of science, there's lots of work to be done, but the resources available to fund that work--and the opportunities available to scientists entering the field--are limited. So how do you, an aspiring earthquake scientist, separate yourself from the competition? Two words: "instrumentation" and "modeling." Familiarize yourself with the instruments' capabilities, and think of innovative ways to use the data those instruments collect, advises Lerner-Lam. "Funding is always competitive; whether you succeed in getting it will depend on if you are doing something new with the data."

Researchers able to use and model these abundant new data sets will likely be part of a swiftly changing landscape in earthquake science. Real-time data make it possible to design predictive experiments and build iterative models that should greatly enhance our understanding of the mechanics and hazards of earthquakes in the years to come, despite the improbability of an eruption in job opportunities. "It is an exciting time," says John Taber, education outreach program manager for IRIS.

Jim Kling is a freelance science and medical writer based in Bellingham, Washington.