Scientists as Customers?

Would Karl Marx smile and nod sagely if he observed how scientists do their work today? What would a business efficiency expert say to a scientist today? I had these thoughts while recently thumbing through a new issue of the pop-science magazine Nautilus. Because, right on page 3, there’s this:

[Image: ESO advertisement on page 3 of Nautilus]

The text at the bottom is hard to read so here’s a detail:

[Image: detail of the text at the bottom of the ESO advertisement]

At first I thought nothing of it and just kept reading. But this announcement kept coming back to me, raising all sorts of questions. For example – At whom is this message aimed? Presumably not many readers of Nautilus will be jetting off to Chile to use the Very Large Telescope or any of the other science facilities the European Southern Observatory operates.


OK then, so this isn’t an advertisement to drum up visitors to Cerro Paranal or solicit proposals for telescope time.

No, something else is going on here. ESO’s advertisement must be read as a boast – it’s trumpeting the efficiency and effectiveness of its scientific facilities. Its observatories are, ESO claims, the “most productive” in the world. This is not the same as proclaiming that they produce the “best science” which is a much harder claim to make.

This focus on productivity, and its close cousin efficiency, got me thinking about Frederick Winslow Taylor. In 1911, Taylor published his book The Principles of Scientific Management.

[Image: Taylor’s The Principles of Scientific Management]

Although little remembered today, it’s one of the 20th century’s most influential books. In it, Taylor laid out a philosophy of managing workers and work flow with the aim of solving some of that era’s labor problems (and making business more profitable). In short, he wanted to get manual laborers to do more work in the same amount of time. Workers, to put it mildly, objected to Taylor’s intrusion into their workplace. Moreover, in some cases, they proved that Taylor’s methods were anything but scientific. When you read today about managers monitoring the workplace, keeping track of keystrokes, and recording service calls – thank Taylor.

Now shift from scientific management to managing science. Until the 1990s, telescopes were most often operated in what’s called “classical mode.” You can picture the scene – astronomer at the telescope, late at night, alone, cold, heroically working to unravel the mysteries of the Universe. Something like this, although maybe without the coat and tie:


1936 image by Russell Porter of an astronomer using the 200-inch telescope at Palomar.

Fast forward 40 years…astronomers’ nightly work now looked very much like this. As I’ve written, computers changed everything about how astronomy was done.


Astronomer Caty Pilachowski, c. 1988, using the 4-meter telescope at Kitt Peak.

Along with computers came the introduction in the 1990s of what’s known as queue observing. In fact, computers and computer models made this possible. We might think of this new way of doing science as an application of Taylor’s general goal of maximizing efficiency to science. Successful proposals for telescope time are put into an observatory’s queue and executed by staff astronomers when observing conditions are suitable. ESO operates its big facilities in Chile in this fashion, as do many other major observatories.1

Advocates of queue observing stress that it enables science facilities to be used more efficiently. This isn’t trivial when a night of observing time can cost upwards of $1 per second. Opponents of queue scheduling argued that this mode of doing science might produce a generation of researchers who were, as Karl Marx might have said, alienated from the means of production. As one scientist remarked in 1996, “I am really worried about the Nintendo mentality in astronomy.”

Decades earlier, physicists accepted arguments about cost-effective use. At a 1966 meeting at the Stanford Linear Accelerator, for example, Berkeley’s Luis Alvarez encouraged colleagues to think in terms of the number of interesting “events per dollar” produced by ever-more expensive Big Science machines.

By the late 1990s, queue scheduling had prevailed at places like the Very Large Telescope and the international Gemini Observatory. Coincident with this was a shift in language about the effective use of science facilities. Look at the questions posed at a meeting in the mid-1990s to discuss telescope use:

The choice of language here is striking. Astronomers are referred to as “customers” seeking a product. So, what’s the product? As Matt Mountain, currently director of the Space Telescope Science Institute, told me in an interview several years ago, “We produce high quality, corrected beams of light pointed at the right direction at good instruments and detectors and collect the data.”

Queue scheduling allows on-site observers to select observing programs that are best suited to prevailing weather conditions. Moreover, telescopes have been designed to increase the rapidity with which this “high quality” stream of photons can be switched from one instrument to another. (Observatories typically have several highly complex instruments clustered underneath or near the actual telescope.)
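The matching of programs to conditions can be caricatured in a few lines of code. This is purely an illustrative toy – every program name, rank, and seeing threshold below is invented – but it captures the basic logic: each night, staff execute the best-ranked queued program whose requirements tolerate the actual conditions.

```python
# Toy sketch of queue scheduling. Approved proposals wait in a ranked
# queue; the on-site staff pick whichever high-ranked program can be
# done in tonight's conditions. All names and numbers are invented.

programs = [
    # (scientific rank, worst acceptable seeing in arcseconds, description)
    (1, 0.6, "high-resolution imaging of a distant quasar"),
    (2, 1.5, "spectroscopy of bright nearby stars"),
    (3, 0.8, "deep imaging of a faint galaxy cluster"),
]

def pick_program(seeing_tonight):
    """Return the best-ranked program observable in tonight's seeing."""
    candidates = [p for p in programs if p[1] >= seeing_tonight]
    return min(candidates, default=None)  # lowest rank number = best

print(pick_program(1.2))  # poor seeing: only the spectroscopy program fits
print(pick_program(0.5))  # excellent seeing: the top-ranked program runs
```

The point of the exercise: under classical scheduling, a night of bad seeing could be a total loss for the visiting astronomer; under queue scheduling, some program in the queue can almost always use whatever the sky provides.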

Queue scheduling at places like Gemini and the VLT was set up to maximize efficiency and productivity. We might think of this emphasis on flexibility, efficiency, and productivity as resembling the famous “just in time” manufacturing techniques pushed by Japanese car makers in the 1950s (and widely admired by executives in the U.S.).

It’s this shift in telescope use – where efficiency is paramount – that is reflected in the advertisement ESO placed in Nautilus.


Did the quest for better science drive the shift toward emphasizing productivity and efficiency? Yes, but that’s only part of the story. In the United States, these concerns followed larger trends. In 1993, for example, Congress passed the Government Performance and Results Act requiring each federal agency, including the NSF, to devise yardsticks to measure performance and progress. This was not just an American trend. European astronomers did similar studies evaluating telescope productivity. As ESO’s advertisement indicates, this way of thinking is still very much alive.

The need to demonstrate greater efficiency and productivity encouraged scientists to accept models and metaphors from the business world to describe observatory management and telescope operation. Astronomy in the 1990s, like particle physics in the 1950s and ’60s, became a “big business” or, at the least, a very expensive one. Astronomers started describing observatories as “data factories,” and the next generation of giant telescopes will drive this trend forward even more. So, perhaps it’s not a surprise that some observatory directors and their staff started to see the researchers who came to their facilities as customers.

None of this addresses the question of what one means by “productive,” though. Is the proper metric of productivity the number of times a publication is cited? Perhaps the number of scientific problems “solved”? Or prizes won by papers published using data from a particular facility?


Could a time come when observatories and other science facilities take a cue from the Golden Arches and simply tout the number of customers served? Let’s hope not.


  1. To be fair, I’m talking here largely about ground-based optical astronomers. Radio astronomers had long been accustomed to receiving data collected by others. And, of course, all space-based observations are done in queue mode. If you’re unclear why, watch this.

To Have and/or To Have Not


The 4-meter Mayall telescope at Kitt Peak. Dedicated in 1973…slated to be closed in 2017.

“They don’t give a flying fuck about the rest of us. They’d just as soon take it all.” Such was the explosion of words – part angry, part resigned – I heard. But the speaker wasn’t some anti-Wall Streeter in an Occupy camp. No, the words came instead from a senior astronomer at a major Midwest research university. The time was 1999 and the context was a series of interviews I was doing as research for what became my book Giant Telescopes.1

Some translation is in order: “Us” is the community of astronomers who rely on the national system of publicly funded and accessible telescopes to do their research. “They” refers to those researchers with access to privately funded telescopes whose access is much more tightly guarded.

The topic stirs strong emotions. All scientists need resources – equipment, funding, time – to carry out their research. For astronomers, two of the most important assets are access to telescopes and sufficient time allocated on them to make observations and collect data. The community makes a fundamental and long-standing distinction between those who have access to telescopes through their institutional affiliation and those who do not and must instead compete for time at one of the federally-funded national centers. A former observatory director explained the situation this way: “There are the independent observatories which some people call the ‘haves.’ And there are the ‘have nots!’…The people who are the ‘have nots’ still have to rely on the National Observatory to get time.”

Optical astronomy – the subfield of astronomy I know best – has a long tradition of private and philanthropic support. This history stretches back to the 19th century, and the continued generosity of deep-pocketed donors for astronomy separates it from most other sciences in the United States. Historically, private institutions or state-supported universities have funded and managed most large American telescope facilities. The largest and best telescopes have been available to only a small fraction of the entire astronomical community. (A disclaimer: I work at the University of California and recently spent a year as a visiting professor at Caltech. UC and Caltech astronomers have access to the two privately-owned 10-meter telescopes in Hawai’i which were built in the 1980s and 1990s with funds from the W.M. Keck Foundation.)

The consequences of this history were made clear in a recent Science article. Over the next three years, five publicly-accessible telescopes are slated to be closed. The root cause is a funding crunch at the National Science Foundation. Closing the five telescopes could save about $20 million annually at a time when the agency is also trying to build two new telescope facilities.


‘Scopes on the chopping block; graphic from the 20 December 2013 article in Science by Yudhijit Bhattacharjee

If the proposed shut-downs occurred, there would be no publicly-accessible telescopes available to optical astronomers in the continental U.S. The Gemini North telescope, a publicly funded 8-meter facility on Mauna Kea, would still be open, as would its southern twin in Chile. But jetting off to Chile is a lot harder and more expensive than traveling to Kitt Peak in southern Arizona. ((A follow-up note: Point taken…yes, Gemini N/S is primarily queue-based but it can also still be used in observer mode, yes? In any case, I think my general point about not having access to telescopes in the CONUS and having instead to fly to Chile or Hawai’i makes sense. One might also make the case that smaller telescopes – the kinds that KPNO operates – are ideal for training students. I know a lot of this can be done remotely but I wonder if this is a substitute for actually being on-site.)) Moreover, in the NSF’s plan, only one major radio telescope in the U.S. – the Very Large Array in New Mexico – would be left.

Once again, astronomers are referring to the have vs. have-not dynamic that defines their community. Science quoted, for example, Angela Speck, an astronomer at the University of Missouri, which doesn’t have access to its own telescope facilities, as noting that “more than half” of the optical/infrared observing community is in the same boat as her. Without access to publicly-available telescopes, recruiting and training graduate students and postdocs becomes problematic. As another scientist noted, the proposed shutdowns could signal the “loss of a generation of astronomers.”

This isn’t the first time that the nationally-operated system of telescopes has come under threat. As I detailed in my book, the astronomy community in the 1980s and 1990s was regularly rocked by fears that it would lose access to its research instruments. Again, there is no single root cause. Budgetary woes at the NSF are compounded by ambitious plans to build a new generation of cutting-edge telescope facilities. For instance, the NSF is currently funding construction of the Advanced Technology Solar Telescope (~$300 million), the ALMA array in Chile (an international project that prices out at around $1.5 billion), and the Large Synoptic Survey Telescope (~$665 million). So, it’s not as if the U.S. isn’t investing in observational astronomy. But these mega-projects are crowding out the smaller ‘scopes that a large fraction of the community uses. Is this a good thing? Is it inevitable?

Compounding the problem is the fact that, once built, telescopes stay open for a long time. The venerable 200-inch telescope operated by Caltech on Palomar Mountain – another private facility – was dedicated in 1948 and is still used today to make observations (although light pollution has dimmed its usefulness), train students, and test new instruments such as the innovative ARCONS camera.


Still going…

At the same time, the annual operations costs needed to keep an observatory going are generally assumed to be about 5-10% of construction costs…so something like the LSST may cost on the order of $50 million each year to keep surveying the sky.
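To make the arithmetic explicit, here’s a quick back-of-the-envelope sketch. The 5-10% rule of thumb and the ~$665 million LSST construction figure are the ones cited above; everything else is just multiplication.

```python
# Back-of-the-envelope check: annual operations are often assumed to
# run roughly 5-10% of a facility's construction cost.

def annual_ops_range(construction_cost, low=0.05, high=0.10):
    """Return (low, high) estimates of yearly operating cost."""
    return construction_cost * low, construction_cost * high

# ~$665 million construction cost for LSST, per the figure cited above.
lo, hi = annual_ops_range(665e6)
print(f"LSST operations: ${lo / 1e6:.1f}M to ${hi / 1e6:.1f}M per year")
# ...a range that brackets the ~$50M/year figure mentioned in the text.
```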

The problem of old ‘scopes being sacrificed to pay for new ones isn’t confined just to the publicly-funded system, though. Science also reported that the Lick Observatory – a 125-year-old facility operated by the University of California on Mt. Hamilton east of San Jose – also faced the threat of closure. The expected savings from not operating Lick would be plowed back into funding the Keck Observatory as well as UC’s portion of its next mega-project, a 30-meter optical telescope (the TMT) underway in conjunction with Caltech as well as organizations in Canada, China, India, and Japan. Expected price tag? Over $1 billion.

A few years ago, I co-authored a paper about the ways in which NASA made plans to build the successor to the Hubble Space Telescope. What was remarkable was that the long lead time necessary to secure political and financial support meant that the plans for a new space-based instrument had to be made while the existing one was still chugging along. Similar dynamics are very much at work in ground-based astronomy today, regardless of whether one is a have or a have-not.

  1. If you want a short intro to the topic and some more discussion, see this 2000 paper.

Sharing the Sky

Leaping Robot has been leaping a little less often. It’s summer and I have a big new research project – more on this some other time – that I’m getting off the ground. Also, I’ve been busy getting ready for an upcoming trip to Manchester to attend the 24th International Congress of History of Science, Technology, and Medicine. With meetings held every four years, it’s not quite the World Cup (or Comic-Con) but it’s still the largest gathering for historians of science. One of the things I’ll be doing in Manchester – besides visiting Jodrell Bank and having a few pints of good beer with friends – is giving a paper called “Learning to Share.” Subtitled “Astronomers, Data, and Networks,” the paper gives an overview of a project I’m just wrapping up; the final results will be published next year in an issue of Technology and Culture.

My work on this topic started in 2011-12 when I spent a year at Caltech as a visiting professor. The main research project I did while there (besides finishing The Visioneers) was to explore the digitization of astronomy. Starting in the 1960s, astronomers’ view of the sky shifted from an analog perspective, in which data was recorded using photographic plates and strip charts, to one wholly mediated by digital technologies. Basically, I’m interested in how astronomy went from this…


Astronomer Roger Lynds doing some “old school observing”

to this:


Inside a telescope control room at Kitt Peak, a decade later.

By the early 1980s, astronomers expressed growing concern about having to deal with a deluge of data that was increasingly “born digital.” Data management became one of the modern astronomer’s necessary tasks as astronomy itself transformed into a particular form of “information science.” This transition presaged today’s debates about Big Data and the archiving of massive data sets that researchers mine in astronomy and other sciences. This process still has implications for the ways in which scientists share their data today. For example…

In February 2013, the Obama administration announced a new policy designed to increase public access to scientific research funded by the federal government. In a memo that accompanied the White House announcement, John P. Holdren, director of the White House Office of Science and Technology Policy, directed U.S. science agencies to develop “clear and coordinated policies” so results from research they support will be publicly available within a year of publication. The new policy was motivated in part by the belief that shared science creates tangible practical and economic benefits. Another driver was the often-stated complaint by scientists from astronomy to zoology that they were simply drowning in data.

An overabundance of data, in fact, has long presented scientific communities with tremendous challenges. Today, astronomy is the scientific discipline that often appears in journalists’ accounts of “Big Data,” “data deluges,” and “information explosions.” However, it was during the 1970s that astronomers began commenting on an especially significant discontinuity in the amount of data they found at their disposal. Within a decade, astronomers routinely spoke with both trepidation and excitement about onrushing “floods” of data, such that one might dare refer to research before this as “antediluvian astronomy.”

The key catalyst for this sense of crisis was the relatively sudden proliferation of new electronic and digital means for recording astronomical observations. This was not simply a matter of astronomers adding electronic computers to their toolkit. The operation of a competitive modern observatory also required a new workforce whose members possessed a different set of skills. Instead of (or in addition to) traditional expertise in astronomy and astrophysics, the digitization of astronomy required knowledge about solid-state detectors, digital circuits, and computer programming. Moreover, the digitization of astronomy helped reshape traditional norms and behaviors – what we can call a “moral economy” – in the astronomy community including those associated with sharing data and research tools.

Imagine it is 1976 and you are an observational astronomer. Regardless of what kind of telescope you use – optical or radio, public or private, orbiting in space or sitting on a mountaintop – if you wanted to share data you collected, could you? In the older analog tradition, astronomers loaned photographic plates to colleagues while observatories maintained physical libraries of the same. But, as more data was born digital, how to share it became an increasingly problematic issue.

In order for scientists to readily share their growing collections of digital data within the same observatory or – even more difficult – between institutions located in different countries, the data needed to exist in a common format. And, once digital data was in a common format, questions arose about how and when it, along with the digital tools to process and analyze it, might be shared.

At the Manchester meeting, I’ll be focusing on two episodes in this historical process using astronomy as an example. Both center around scientists’ wish to share data and data processing tools with their colleagues. The first episode concerns the emergence of the Flexible Image Transport System or FITS.


1981 paper describing FITS

This is a common data format developed by scientists at national observatories in the U.S. and the Netherlands and accepted as an international standard in 1982. Recently, the Vatican Library adopted FITS as its standard for storing tens of thousands of paper documents and manuscripts, some dating back more than 1,800 years, after they had been converted into digital images.
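For the curious, the basic layout that made FITS so portable is simple enough to sketch in a few lines. A FITS header is plain ASCII, built from fixed 80-character “card images” (keyword, equals sign, value) padded out to 2880-byte blocks, so machines of any make could parse it. This is only an illustrative fragment, not a complete writer – a real one also handles the data records, value types, comments, and much else:

```python
# Minimal sketch of a FITS primary header: 80-character ASCII "cards"
# padded into a 2880-byte block. Illustrative only, not a full writer.

def card(keyword, value):
    """Format one header card: keyword in cols 1-8, '= ', then the value."""
    return f"{keyword:<8}= {value:>20}".ljust(80)

def minimal_header(bitpix=16, naxis=0):
    cards = [
        card("SIMPLE", "T"),     # file conforms to the FITS standard
        card("BITPIX", bitpix),  # bits per data value
        card("NAXIS", naxis),    # number of data axes
        "END".ljust(80),         # marks the end of the header
    ]
    # Headers are written in 2880-byte blocks, space-padded to the end.
    return "".join(cards).ljust(2880)

hdr = minimal_header()
print(len(hdr))   # 2880
print(hdr[:30])
```

The rigidity is the point: because every card and every block has a fixed size, a program on any computer can walk the header without knowing anything about the machine that wrote it.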

As critical as FITS was to calming scientists’ concerns about being overwhelmed by a rising flood of digital data, it only resolved part of the problem. How to interact with digital data remained a pressing issue for astronomers. Even as FITS was accepted as a common data format, astronomers at institutions around the world still faced a bewildering assortment of image processing programs. When a scientist finished an observing run, they would often write their own “homegrown” software code to process their data. Consequently, there was little in the way of consistency when it came to image processing programs, as astronomers came up with solutions that were local, disorganized, and ad hoc.

This leads to the second example I’ll be discussing in Manchester. STARLINK was a sophisticated computer network for sharing and manipulating digital astronomical images that debuted in the United Kingdom in 1980. STARLINK was centered around a central node – this was at the Rutherford Laboratory near Oxford – to which several other sites were linked.


Schematic of the STARLINK system.

Leased telephone lines (this is all pre-World Wide Web!) from Britain’s Post Office connected the STARLINK sites. Their initial data capacity was sufficient to allow the transmission of image processing programs but too slow to transmit large amounts of data. At each of STARLINK’s sites, astronomers could access two image-display systems as well as peripherals like printers which allowed them to interact with their data in real time. But implementing STARLINK as a tool for sharing software proved more difficult than getting astronomers to share the data. As a result, STARLINK, while ambitious, never achieved all of its creators’ idealistic goals.


The Starlink VAX11/780 in the Atlas Centre, at the Rutherford Laboratory, August 1980

Astronomers’ development of new tools like FITS and STARLINK produced a new space in which different ideas, practices, and behaviors about data and its ownership could emerge. The digitization of astronomy and the question of “sharing science” was not limited to just one country or subfield of the discipline. Rather, it was a process that all researchers – observers and theoreticians alike – experienced in some way. For astronomers, the question of how to share their science was, in both senses of the phrase, a universal concern.