Ed Lazowska holds the Bill & Melinda Gates Chair in Computer Science & Engineering at the University of Washington (UW); he joined the faculty there in 1977. Lazowska served as co-chair (with David E. Shaw) of the Working Group of the President's Council of Advisors on Science and Technology (PCAST) that reviewed the Federal Networking and Information Technology Research and Development Program in 2010. He has also served as chair of the Computing Research Association (1997 to 2001), chair of the National Science Foundation CISE Advisory Committee (1998 to 1999), chair of the DARPA Information Science and Technology Study Group (2004 to 2006), and chair of the Computing Community Consortium (2007 to present), as well as serving on numerous National Academies study committees. He has received the Computing Research Association Distinguished Service Award, the Association for Computing Machinery (ACM) Presidential Award, and the ACM Distinguished Service Award. He is a member of the Microsoft Research (MSR) Technical Advisory Board and serves as a board member or technical adviser to a number of high-tech companies and venture firms.

Science Careers interviewed Lazowska by e-mail, asking him about the future of computer science careers, especially at the Ph.D. level. This interview has been edited for brevity and clarity.


Q: Please tell us something about yourself—your expertise, research, training, and so on.

E.L.: I've been a faculty member at the University of Washington for 35 years. As an undergraduate, I tried out four majors before I managed to graduate: electrical engineering, physics, applied mathematics, and computer science. I was lucky enough to stumble into a programming course (in assembly language), became an undergraduate teaching assistant, then an undergraduate research assistant, and I was hooked. It's impossible to overstate the impact of my undergraduate mentors at Brown University (Charlie Strauss, David Lewis, and, most importantly, Andy van Dam), my graduate mentor at the University of Toronto (Ken Sevcik), and my first few department chairs at the University of Washington (Jerre Noe and Bob Ritchie). I've truly been blessed, and I try in some small way to pass it along.

My research concerns the analysis, design, and implementation of high-performance computing and communication systems, and, more recently, the techniques and technologies of data-driven discovery. I've also been active in federal policy issues related to research and innovation in computer science.

Q: Eight-plus years ago, we asked Greg Andrews, "How are career opportunities in computer science today, compared with how they've been in the recent past? Are they better or worse?" Let me repeat the question now: Eight years later, are things better or worse for people working in the field and seeking careers in the field?

E.L.: These things always have their ups and downs. When you spoke with Greg, we were in a "down." It's important to look at the long-term trends, though. The Bureau of Labor Statistics projects that two-thirds of all available jobs in all fields of science and engineering during this decade—in the mathematical sciences, the physical sciences, the life sciences, engineering, and the social sciences—will be in computer science. At the University of Washington, there are literally dozens of programs targeting this wide variety of fields. My program, Computer Science & Engineering, is just one of them. And two-thirds of the jobs are in our field. There is absolutely no reason to believe that this will change. There will be ups and downs, but the role of computer science is ever-expanding. Alfred Spector, VP for Research at Google, describes computer science as "an ever-expanding sphere." That's a great description: "If you're not us, you soon will be."

Q: At the time of the earlier interview, production of Ph.D.s in computer science had been trending down, though not dramatically. Graduate enrollment was up some, but undergraduate enrollment was down. What are the trends now, and how do they relate to employment opportunities?

E.L.: There was an undergraduate enrollment downturn in the past decade. This was a side effect of the tech downturn. Simultaneously, there was a graduate enrollment upturn—great undergraduates chose graduate school due to diminished employment opportunities. The field is producing 50% more Ph.D.s than a decade ago.

Today, undergraduate enrollments are through the roof at all major programs across the nation. At UW, our introductory course has twice the enrollment of a decade ago.

Because of the extraordinary employment opportunities, I wouldn't be surprised if graduate enrollment declines somewhat. And I wouldn't be surprised if the news media were to report this as "a decrease of interest in the field." That would, of course, be nonsense.

In terms of employment opportunities, and in terms of intellectual opportunities, what field would you be better advised to major in? Journalism?

Q: Is the production of Ph.D.-level computer scientists excessive (relative to economic demand)? Insufficient? About right?

E.L.: I would have to say "about right." Ph.D. production in computer science is far lower than in fields with far fewer employment opportunities. And Ph.D.s in computer science have a broad range of employment opportunities that take full advantage of their training. In most other STEM [science, technology, engineering, and mathematics] fields, the vast majority of graduates at all levels take jobs unrelated to their field of study. In computer science, the opposite is true: The vast majority of graduates at all levels take jobs that are in their "sweet spot." Google hires roughly the same number of graduate students as undergraduate students from the University of Washington. Microsoft also hires a large number of our best Ph.D. students, both for Microsoft Research [MSR] and for the development organization.

I do think we need to be cautious. We need to avoid the overproduction—and, honestly, exploitation—that characterizes other fields. Hopefully we'll be smart enough to learn from their behavior.

Q: The big trend 8 years ago was the reduction in jobs in industry. Historically, industry had employed about half of all Ph.D. computer science graduates, but after the dot-com boom ended, that share started trending down. Andrews attributed this to the closure of industrial research labs. Has this trend continued?

E.L.: Here's the bad news: Very few IT companies today have research organizations that look out more than one or two product cycles ahead. And even at companies that do—such as Microsoft—this constitutes a tiny fraction of overall R&D expenditures. (Microsoft Research is about 5% of Microsoft's R&D; at most companies, this sort of work is 0%.) This illustrates the critical role of the federal government's investment in computing research.

And here's the good news: First, while Microsoft Research is a small proportion of Microsoft's overall R&D investment, it's a lot of money. Microsoft spends about as much on MSR as the National Science Foundation spends on its Computer and Information Science and Engineering Directorate. Ditto for IBM. Second, the things you learn in a graduate program are just as relevant in the development organizations of leading companies as they are in research labs. At Google, a group of extraordinary Ph.D.s, led by Jeff Dean (UW) and Sanjay Ghemawat (Massachusetts Institute of Technology), has truly changed the world through innovations in scalable systems such as MapReduce, the Google File System, BigTable, and Spanner. Twenty-five percent of Google's hires from UW are Ph.D.s, all of them into the development organization.
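
For readers who haven't worked with MapReduce, the sketch below is a minimal, single-process illustration of its programming model, counting words across a few documents. The function names (map_fn, reduce_fn, mapreduce) and the sample data are invented for illustration rather than taken from Google's actual API; a real deployment distributes the map, shuffle, and reduce phases across thousands of machines with fault tolerance.

```python
# A minimal, single-process sketch of the MapReduce programming model:
# the "map" step emits (key, value) pairs, a shuffle groups them by key,
# and the "reduce" step aggregates each group.
from collections import defaultdict


def map_fn(document):
    """Emit (word, 1) for every word in a document."""
    for word in document.split():
        yield word.lower(), 1


def reduce_fn(word, counts):
    """Sum all counts emitted for a single word."""
    return word, sum(counts)


def mapreduce(documents):
    grouped = defaultdict(list)  # the "shuffle" phase: group values by key
    for doc in documents:
        for key, value in map_fn(doc):
            grouped[key].append(value)
    return dict(reduce_fn(key, values) for key, values in grouped.items())


if __name__ == "__main__":
    docs = ["the quick brown fox", "the lazy dog", "the fox"]
    print(mapreduce(docs))  # {'the': 3, 'quick': 1, 'brown': 1, 'fox': 2, ...}
```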

Q: Are there any other important new employment trends? More entrepreneurship? Higher unemployment?

E.L.: I think the major trend is the "infiltration" of all other fields by computer scientists. Every year, UW sends students to law school, to business school, to medical school, to biotech companies. You read today about the extraordinary demand that's looming for "data scientists." Who are these people going to be? They are going to be people educated in computer science departments in scalable machine learning, data visualization, and related areas. As I noted earlier, there will always be ups and downs, but the long-term trends are clear: We are the world.

Q: In our earlier interview, Andrews noted that computer science does not have a postdoc tradition, but at that time, the Taulbee Survey showed the postdoc beginning to emerge, with a 25% jump in the most recent survey. But the number of postdocs was still small—about 400 total, I think, in 2006. Has this trend continued?

E.L.: There has indeed been a considerable increase in the number of postdocs. And the Taulbee Survey doesn't capture all of them; it doesn't capture postdocs in industry (e.g., Microsoft Research). The important thing, in my mind, is that we are thoughtful about this. Postdocs are great if they are advancing the careers of the individuals—a natural development as a field matures and it takes longer to reach its frontier. Postdocs are bad if they are a holding pen for people whose permanent job prospects are questionable—if these people are treated as disposable cheap labor for senior investigators. The Computing Research Association has recently adopted a postdoc best practices manifesto, an effort led by Anita Jones.

I'm a huge fan of "truth in advertising" at all educational levels. What are your future prospects when you choose a particular undergraduate program? What are your future prospects when you choose a particular graduate program? What are your prospects when you choose a particular postdoctoral position? This "truth in advertising" should apply across fields, across subfields within a field, and across institutions. 

Q: How is the faculty job market? Has retirement of the field's first generation opened up more academic positions? Are more positions being added?

E.L.: Many factors weigh into this. Several years ago, private institutions saw their endowments plummet, and public institutions saw their state funding plummet. This caused belt-tightening across the board. Today, private institutions' endowments have largely recovered, and the cuts in state funding are being addressed by raising tuition. (This worries me greatly: The mission of the nation's great public universities is to provide socioeconomic upward mobility; rising tuition is at odds with this mission.) What's important, though, is that investments in faculty are driven by student demand and intellectual opportunity. No field has more of either of these than computer science. This year, every institution that I'm aware of is hiring, and the quality of Ph.D. graduates is truly extraordinary.

Q: What about cybersecurity? It seems to be big these days. Is there a major role for Ph.D.-level computer scientists in that field?

E.L.: Certainly, there's a huge role in cybersecurity, both in research and in practice. Our nation is far behind in this area, though we have made great progress in recent years. Our progress in software engineering—in tools to create secure software—has been remarkable. Despite this, we are still outgunned: We are surviving through heroic "band-aid" efforts, scurrying to patch vulnerabilities. It's important not to think of cybersecurity merely as Web browser vulnerabilities. Every aspect of our nation's critical infrastructure—the air traffic control system, the electric power grid, the financial system, etc.—relies today on information technology. Hardening these systems is a full-employment act for well-educated cybersecurity professionals.

Q: At the time of our interview, Andrews called diversity in the field "absolutely woeful." Is this still true? Have there been gains in degrees conferred on and employment of women and underrepresented minorities?


[Photo: Ed Lazowska at Brown University, 1971. Credit: Henri Bulterman]

E.L.: As a field, we are focused on this, and we are making progress. Across the nation, enrollments are up, and enrollments of women in particular are up even more. But we still have a long way to go.

I like to focus on the "why." Bill Wulf put it extremely well a decade ago: Sure, we need to be more inclusive for reasons of social equity: Groups that are currently underrepresented should have the opportunity to participate in this field. And sure, we need to be more inclusive for reasons of workforce: There is enormous employer demand, and we're failing to fully tap potential participants. But most importantly, there is the selfish reason. Each of us brings a unique perspective to the systems that we design. If we fail to include certain groups in our field, we limit the perspectives that will be brought to bear on the solutions that we create. A more diverse workforce yields a better-engineered end product. So even if you're a heartless capitalist, with no interest in equity or workforce, you should focus on diversity because the result will be a superior product and one that meets the needs of a broader swath of the population.

Q: What's new and emerging in computer science? If you were in training today, about to choose a thesis area, what subfields would you look at?

E.L.: Computer science is a field of limitless opportunity, and limitless impact. We are terrible at predicting the future: We overestimate what can be achieved in 10 years, and we underestimate what can be achieved in 50. Look back 10 or 12 years. Did we foresee the revolutions in search, Web-scale systems, digital media, mobility, e-commerce, the cloud, social networking, and crowdsourcing? No way! These were barely on the horizon in 2000, and they are part of our everyday lives today.

Here's one thing that's certain in the next 10 years: We will put "the smarts" in everything: smart homes, smart cars, smart health, smart robots, smart science (confronting the data deluge), smart crowds and human-computer systems, smart interaction (virtual and augmented reality).

And here's another thing that's certain: Every field of discovery will become an "information" field. That's the "big data" story: Data-driven discovery will become the norm, driven by advances in computer science. Think about biology. [James] Watson and [Francis] Crick discovered the structure of DNA. But what they really discovered is that the human genome is a digital code, which can be read, deciphered, and rewritten. Over several decades, this transformed biology into an information science. Today, if you're a biologist who is not deeply rooted in "computational thinking," you're collecting tadpoles in some swamp. The same is true of an increasing number of fields.
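
To make the "digital code" point concrete, here is a toy sketch, assuming nothing more than Python's standard string operations, that treats a DNA sequence as a string over {A, C, G, T} and computes its reverse complement and RNA transcript. The sequence and function names are invented for illustration; real genomic analysis relies on far more sophisticated tools.

```python
# A toy illustration of "the genome is a digital code": a DNA sequence is
# simply a string over the alphabet {A, C, G, T} that a program can read,
# transform, and rewrite.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}


def reverse_complement(seq):
    """Return the reverse complement of a DNA sequence."""
    return "".join(COMPLEMENT[base] for base in reversed(seq))


def transcribe(coding_strand):
    """Return the RNA transcript of a coding strand (thymine -> uracil)."""
    return coding_strand.replace("T", "U")


if __name__ == "__main__":
    dna = "ATGGCCATTGTAATGGGCCGC"
    print(reverse_complement(dna))  # GCGGCCCATTACAATGGCCAT
    print(transcribe(dna))          # AUGGCCAUUGUAAUGGGCCGC
```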

These advances draw upon all of computer science. Today, machine learning is hot. Tomorrow, it will be something else. The only thing for sure is that it will be computer science.

Q: Because we're often not that well informed, we journalists often ask the wrong questions and, hence, overlook important things. Help me avoid doing that. Beyond the specific questions I've asked (and you've answered), what should aspiring computer scientists know about?

E.L.: I hesitate to say this to a AAAS guy, but here goes: Science policy and STEM education in this nation are in the iron grip of chemists, physicists, astronomers, and biologists. They don't want any interlopers. But increasingly, advances in these fields are being driven by computer science. There is no field that is more important to the future of the nation and the world.

All of our national and global challenges—education, health care, transportation, energy, national security, scientific discovery, you name it—rely on advances in computer science.

Let's recognize this, and act accordingly.

Jim Austin is the editor of Science Careers. @SciCareerEditor on Twitter

10.1126/science.caredit.a1300057