Astronomy’s History Trap

Last week, I wrote a blog post about the National Science Foundation’s plan to close several optical and radio telescopes as a cost-cutting measure. It clearly hit a nerve. Thanks largely to a repost from Physics Today, some 2,200 people read it – more than any other post I’ve written. (Only my chiding of Michio Kaku drew a similar number of readers.) In response, several people wrote to me and noted the current plans in the United States to build a new optical telescope facility with a light-collecting area equivalent to that of a 30-meter mirror. Here’s my take on this…

There is a quote typically attributed to Mark Twain: “History doesn’t repeat itself, but it does rhyme.” Regardless of whether Mr. Clemens actually said this, the fact is that astronomers today should be hearing all sorts of rhyming. But many aren’t, and therein lies the problem.


The more the merrier? Telescopes at Kitt Peak National Observatory (credit: NOAO/AURA/NSF)

There are two contenders for ground-based astronomy’s next big machine. The “Thirty Meter Telescope” project is spearheaded by scientists from Caltech and my own school, the University of California; institutions in India, Japan, China, and Canada have also pledged funds to build it on Mauna Kea in Hawai’i. The heart of the telescope’s design is 492 mirror segments, each 1.45 meters in size, that would create a mosaic-like light-collecting surface. Cost? Somewhere between $970 million and $1.2 billion. The Moore Foundation (started by Intel co-founder Gordon Moore) has so far pledged $250 million toward the TMT.


Schematic of the Thirty Meter Telescope

Going head-to-head with the TMT is the Giant Magellan Telescope. Planned for a mountain site in Chile, the GMT’s cost is about the same as the low-end estimates for the TMT. But the GMT’s design is radically different: it will instead use seven massive 8.4-meter mirrors to create the equivalent of a 30-meter telescope.1 The GMT consortium includes the Carnegie Institution of Science, Harvard, the Smithsonian, the University of Arizona, and several other institutions.


Artist’s rendering of the Giant Magellan Telescope
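
How literally should “equivalent” be taken here? A quick back-of-envelope check of light-collecting area helps. Below is a minimal sketch in Python – my own illustration, not anything from either project’s documentation – that idealizes each GMT mirror as an unobscured 8.4-meter circle and the TMT as a filled 30-meter aperture (both simplifications, since the real designs have gaps and obscured regions):

```python
import math

def mirror_area(diameter_m: float) -> float:
    """Collecting area of a circular mirror of the given diameter, in m^2."""
    return math.pi * (diameter_m / 2.0) ** 2

# GMT: seven 8.4-meter mirrors, idealized here as unobscured circles
gmt_area = 7 * mirror_area(8.4)          # ~388 m^2
# Diameter of the single circular mirror with the same total area
gmt_equiv_diameter = 8.4 * math.sqrt(7)  # ~22.2 m

# TMT: idealized here as a filled 30-meter circular aperture
tmt_area = mirror_area(30.0)             # ~707 m^2

print(f"GMT collecting area: {gmt_area:.0f} m^2 "
      f"(single-mirror equivalent: {gmt_equiv_diameter:.1f} m)")
print(f"TMT collecting area (filled 30 m): {tmt_area:.0f} m^2")
```

In raw collecting area, then, the seven-mirror GMT behaves more like a 22-meter telescope; the “30-meter” framing leans on the wider baseline the mirror array spans, which is what sets its resolving power. For the purposes of this post, the point is simply that both projects belong to the same “30-meter class” of machine.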

It’s at this point that we should start to hear the rhyming sounds of history. Because thirty years ago, American astronomers were in exactly the same spot. And, from what I can tell, they didn’t learn as much as they could have from the experience. 

To wit: In the mid-1970s, the American astronomy community was in crisis. The traditional design model for large telescopes, based on the 200-inch Hale Telescope on California’s Palomar Mountain, could no longer satisfy either the financial constraints on or the research expectations of the U.S. astronomy community. At the same time, the nation’s large telescopes were increasingly over-subscribed; simply observing faint objects for longer times was not logistically feasible.

Several ways forward were proposed. Kitt Peak National Observatory even developed initial designs for a 25-meter Next Generation Telescope.


Credit: Rick Showalter/NOAO/AURA/NSF

When the 25-meter proved too ambitious, the national observatory scaled plans back to 15 meters. The problem? There were two competing designs for what was then called the National New Technology Telescope…is this starting to sound familiar?

Plan #1 – Build a telescope with a 15-meter light-collecting surface formed from 60 individual hexagonal glass segments. This design was championed by, yes, astronomers from Caltech and the University of California.

Plan #2 – Or…put four 7.5-meter mirrors on a common mount to create the total light-gathering ability of a 15-meter telescope. This effort was pushed by researchers at – wait for it – the University of Arizona.

Are you hearing those echoes of the past yet?


Segmented-mirror and multiple-mirror designs for the NNTT with a scale model of the Kitt Peak 4-meter telescope.

So, what happened to the NNTT? At a meeting in July 1984, a blue-ribbon panel of astronomers and engineers picked Arizona’s multiple-mirror design. But it was a Pyrrhic victory. Less than a year later, the W.M. Keck Foundation gave Caltech $70 million to build a 10-meter telescope in Hawai’i; funding to build a second followed.2

Meanwhile, NSF funding for the “victorious” NNTT was nowhere near as generous and, in 1987, the 15-meter national telescope project was killed. What arose from the ashes of the national 15-meter project was an international partnership to build two 8-meter telescopes. The first Gemini telescope in Hawai’i saw first light in 1999; its twin in Chile reached the same milestone in 2000. The result of all this astro-politicking: two privately operated 10-meter telescopes and two publicly accessible 8-meter telescopes. (The whole story is way more complicated than I’ve summarized here. For example, Arizona’s mirror technology wasn’t used in either Keck or Gemini. Rather, it was used for the twin Magellan telescopes in Chile, and the NNTT design was used, sort of, to build the Large Binocular Telescope in southern Arizona. Meanwhile, the Gemini telescopes were built using what are called “thin meniscus mirrors,” a third technological path that emerged in the 1980s.)

Today’s impasse over whether to build the TMT, the GMT, both, et cetera closely resembles the debates in the early 1980s about the National New Technology Telescope. To be sure, the past isn’t exactly repeating. Neither TMT nor GMT is envisioned as a publicly accessible facility. But the history does rhyme. Many of the actors (individuals as well as institutions) who were so embroiled in that controversy/competition over how best to build today’s giant telescopes are implicated in today’s debates about which design and which partnership model is best for tomorrow’s bigger (gianter?) telescopes. (One quickly runs out of superlatives…large, overwhelmingly large, monster, etc. Meanwhile, the European Southern Observatory is pursuing its 40-meter mega-project with the anodyne name of the Extremely Large Telescope.)

Why should this ancient history – water under the bridge, one might say – matter to astronomers today? I can think of at least three reasons:

#1 – A billion dollars to build a new telescope – whether it comes from private donors or governments – is obviously a lot of money. This is thrice true when we’re talking about possibly building three 30-meter-class facilities. If this means shuttering smaller ‘scopes, as the NSF is planning, then one has to consider the impact this could have on astronomy’s “have-nots,” i.e. those people without access to privately operated facilities.3 Will ever-larger research facilities affect how science is done? How many grad students or postdocs will have access to a 30-meter facility? Would they simply become folded into a much larger research program? The current generation of 8/10-meter-class facilities clearly changed the practice of doing science…there’s every reason to expect 30-meter telescopes will do likewise.


#2 – History seems poised to repeat itself…In the 1980s, while the U.S. community bickered over which design was best (and tried to raise the capital necessary to start building), the European science community slowly and methodically built an innovative series of telescopes. The culmination of this was the Very Large Telescope (again, the names…). This suite of four 8-meter telescopes in Chile helped put European astronomy on an equal (some would say better) footing with its U.S. counterparts. Now, I’m not trying to make some nationalistic argument here. Astronomical research in the 21st century is certainly more international than it was three decades ago, and new players (China, India, et al.) have entered the telescope game. But astronomers with whom I have spoken warn of a similar dynamic at work now…while the U.S. community dithers over whether to build TMT, GMT, or whatever, the European community is gradually making progress towards its 40-meter goal. Americans’ fear is that their European competitors will be able to pick the low-hanging “astronomy fruit” that a new giant facility will put in reach.

#3 – Perhaps the most critical reason for thinking about all of this “history” is how today’s debates over which telescope to build affect the morale and spirit of the American astronomy community. How does this infighting reflect the community’s moral economy, i.e. those unstated yet accepted rules that define and structure community interactions? The principal actors driving the TMT and GMT projects forward are leaders in the astrophysics community. How much energy and effort is being spent sparring with one another, touting the benefits of one’s own design (and disparaging the neighbor’s)? To a naif, this battling can seem downright ridiculous. Caltech and Carnegie are a few miles apart, yet there might as well be a shark-filled canyon between them given the vitriolic statements I’ve heard from the two camps. The only point of agreement seems to be how the NSF has failed to provide the necessary and adequate leadership to help the U.S. get its act together.

Now, one could, of course, argue that a lesson to take away from all of this history is that it all turned out fine in the end. Keck was built, Gemini was built, et cetera…maybe the “market” for telescopes worked and things just naturally sorted themselves out. Maybe competition was a good thing…

But I am inclined to think it all worked out in spite of the infighting, not because of it. More to the point, I think astronomers need to recognize how their history is rhyming and consider how not to repeat past mistakes. As I wrote this blog post, I kept thinking of how closely today’s circumstances resemble the situations I described in my book, which is now already a decade old…but, as André Gide noted, “Everything that needs to be said has already been said. But since no one was listening, everything must be said again.”

  1. Three of these have so far been cast at the Mirror Lab at the University of Arizona.
  2. All of this history is detailed in my 2004 book Giant Telescopes; some of it is captured in this article.
  3. Obviously, my splitting an entire scientific community into two camps is an oversimplification. There are researchers from Caltech who use the National Radio Astronomy Observatory’s ‘scopes just as “have-nots” can sometimes compete for time at the privately operated Keck telescopes. The NSF even operated the “Telescope System Instrumentation Program,” which would help fund “the development of instruments…for the private observatories, in exchange for which telescope time on those facilities will be made available to the community.” It’s unclear to me whether TSIP is still operating…the National Optical Astronomy Observatory’s page for the program doesn’t seem to have been updated recently…another NOAO page suggests that time is available in 2014, though. This page, however, has some interesting stats on the number of nights made available and the estimated costs.

Regulating Nanotechnology Via Analogy, Pt. 2

[Blogger’s note: This post – a continuation of the previous one – is adapted from a talk I gave in March 2012 at the annual Business History Conference. Like the last post, it draws on research done by Roger Eardley-Pryor, an almost-finished graduate student I’m advising at UCSB, and me. I’m posting it here with his permission. Some of the images come from slides we put together for the talk.]


Can historical analogies help us approach technological risks better?

In my last post, I discussed the ways in which policy makers could use historical analogies as tools when considering how nanotechnologies might be regulated. At the end, however, I suggested that the multiple definitions for nanotechnology posed a challenge for finding the one best analogy. So – what analogies have been drawn between nanotech and other technologies, and what do they have to say about possible regulatory paths? Consider the following examples:

Example #1 – Genetically Modified Organisms


Engineered nanomaterials bear some relation to GMOs…but it’s not necessarily a strong one.

In April 2003, Prof. Vicki Colvin testified before Congress. A chemist at Rice University, Colvin also directed that school’s Center for Biological and Environmental Nanotechnology. This “emerging technology,” Colvin said, had a considerable “wow index.”1 However, Colvin warned, every promising new technology came with concerns that could drive it from “wow into yuck and ultimately into bankrupt.” To make her point, Colvin compared nanotech to researchers’ and industry’s recent experiences with genetically modified organisms. Colvin’s analogy – “wow to yuck” – made an effective sound bite. But it also conflated the very different histories of two specific emerging technologies.

While some lessons from GMOs are appropriate for controlling the development of nanotechnology, the analogy doesn’t prove watertight. Unlike GMOs, nanotechnology does not always involve biological materials. And genetic engineering, in general, never enjoyed any sort of unalloyed “wow” period; there was “yuck” from the outset, and criticism accompanied GMOs from the very start. Furthermore, giant agribusiness firms prospered handsomely even after the public’s widespread negative reactions to their products. Lastly, living organisms – especially those associated with food – designed for broad release into the environment were almost guaranteed to generate concerns and protests.2 Rhetorically, the GMO analogy was powerful…but a deeper analysis clearly suggests there were more differences than similarities.

Example #2 – Asbestos


Are engineered nanoparticles analogous to asbestos fibers?

A different definition of nanotech treats it like a new form of matter…stuff requiring special oversight, particularly in the workplace. Such a material definition of nanotechnology suggests a ready analogy to asbestos. Given decades of enormous and expensive asbestos litigation, the analogies between asbestos and nanotechnology have prompted substantial toxicological analysis of new materials.3 Carbon nanotubes (CNTs) are the best known of these new nano-materials. Numerous toxicological studies indicate that nanotubes, whose long, thin structure resembles that of asbestos fibers, share a similar toxicity. These similarities, together with the historical circumstances of attempts to regulate asbestos in the United States, offer suggestions for how to proceed toward the regulation of certain nanotechnologies.

Given the known threats of asbestos, the U.S. EPA attempted an all-out ban on its manufacture and use. However, in 1991, the U.S. Fifth Circuit Court of Appeals ruled that the EPA had not met the requirement to impose the “least burdensome” controls. The court promptly lifted the ban for all but the most dangerous existing asbestos products. The inability of the EPA to ban asbestos, despite decades of evidence confirming its hazards, indicates the need for serious reform of the Toxic Substances Control Act (TSCA), the United States’ existing law for chemical regulation.4 While this need for reform applies to existing substances like asbestos, it applies even more so to novel and analogous nanotechnologies like CNTs.

Example #3 – Fallout


Per capita thyroid doses in the continental United States resulting from atmospheric nuclear tests at the Nevada Test Site, 1951-1962.

With planetary fears about grey goo and self-replicating nanobots, figures like Michael Crichton, Bill Joy, Prince Charles, and, at times, even K. Eric Drexler seemed to define nanotechnology as something so broad, diverse, and nebulous that it became a questionable, minute, and invisible unknown. This line of thinking suggested nanotechnology might be analogous to another existential and invisible, yet life-threatening, technological byproduct – radioactive fallout.

Each of the hundreds of open-air nuclear devices exploded between 1945 and 1980 released minute, invisible, radioactive debris that circulated around the planet’s stratosphere before falling back to earth, exposing humans and the environment to its harmful radioactivity.5 The global spread of these materials throughout ecosystems and into human bodies occurred without full public or private consideration of their risks by policy-makers, by scientists, or by unknowingly exposed publics. In WWII and during the Cold War, the dictates of national security instigated the development and open-air testing of nuclear weapons.  However, by the end of the Cold War, national security came to be defined increasingly in terms of economic security.  Along those lines, American scientists and policy-makers in the late 1990s and early 2000s framed the need for the federal development of nanotechnology in the rhetoric of economic national security.

The nanotechnology enterprise has also yielded novel engineered particles that exist only at invisible scales; new particles that have found wide commercial distribution around the world before full public or private consideration of their potential risks to human health or their threats to our environmental security. In 2003, Oregon Congressman David Wu hinted at the analogy between nanotechnology and nuclear fallout by citing a historic example of regulating fallout’s novel and invisible threat: the Partial Nuclear Test Ban Treaty.6 Though Representative Wu celebrated the Test Ban Treaty for its international cooperation and control of hazardous fallout, he noted that “In many respects, the Nuclear Test Ban Treaty is nothing but a ban on experimentation.”7 At the time, organizations like ETC, Greenpeace, and Friends of the Earth-Australia had also called for a ban on nanotechnology production until researchers clearly understood all of nanotechnology’s environmental, health, and safety (EHS) risks. As with the other examples, one’s definition of nanotechnology – here as an invisible, existential, and global threat – determined the appropriate analogy to prior technologies. That definition, in turn, indicated to various nano-stakeholders particular forms of precaution, regulation, and control. If nanotechnology was analogous to fallout, maybe the analogous regulation would be an outright ban that would forestall all future risks?

Example #4 – Recombinant DNA


A fourth definition for nanotechnology moves us beyond consideration of novel forms of matter and instead identifies nanotechnology as a suite of technological practices for manipulating nature – techniques that render the natural world unnatural. This identification of nanotechnology with particular lab practices yields an analogy to the debates over recombinant DNA (rDNA) techniques in the 1970s.

In the mid-1970s, scientists agreed to a moratorium on rDNA practices until they better understood the technology and until the U.S. National Institutes of Health (NIH) could establish proper guidelines. After the famous 1975 Asilomar Conference, the NIH’s Recombinant DNA Advisory Committee produced its research guidelines. These guidelines clearly defined specific biological techniques and instituted multiple layers of control, including requirements for biological containment. This ensemble of lab practices helped stimulate the rapid commercialization of modern biotech research and, one could argue, consumer acceptance.

Nanotechnology stakeholders have identified a similar goal of early anticipation and mutually agreeable control through their framework of anticipatory governance. For some nanotech stakeholders – particularly entrepreneurs affiliated with commercial industry – the NIH’s decision to institute guidelines for rDNA technology, rather than push for legally binding regulations, offers possible paths for the eventual oversight of nanotechnology. Government guidelines consist of procedures that people are expected to follow when receiving federal dollars, whereas regulations are substantive rules for all actors that carry the authority of law. However, drawing lessons from rDNA and applying them to nano comes with drawbacks – for example, guidelines similar to those from the NIH might only apply to federally funded research. This would leave privately funded research in a different regulatory regime, subject not to guidelines but only to the hard law of regulation.

Some concluding thoughts…

“Nanotechnology” is a socially constructed collection of techno-scientific ideas, practices, and materials. But with such a broad and sometimes vague set of definitions for nanotechnology used by scientists, policy-makers, activists, and businesses, how can nano-stakeholders know what to regulate?

Some scholars, including Andrew Maynard, a leading expert on risk science, suggest that regulators’ wish for strict definitions is misplaced. Maynard, for instance, believes that a precise definition for nanotechnology would actually impede proper regulation.8 Instead of a categorical definition, Maynard now argues that regulation must focus on various “trigger points” – empirical points at which a material passes from conventional to risky. Here, one could imagine officials looking to historical examples to find other such ‘tipping points’ that catalyzed regulatory reform.

But policy makers have moved in the opposite direction. In late 2011, Health Canada and the European Commission each announced a specific set of politically designed definitions for nanomaterials to be used explicitly “for all regulatory purposes.” Similarly, the United States’ most recent research strategy for environmental, health, and safety emphasized the need for federal agencies to establish agreed-upon definitions for nanomaterials. But even as regulators moved toward a “one size fits all” model, analogies with other materials, techniques, and industries still prove useful. The U.S. EPA, for instance, has considered whether certain materials should be regulated under rules that apply to insecticides. So, perhaps we can look forward to the drawing of new analogies, not to GMOs and asbestos and fallout but to DDT…


Advert for DDT, c. 1947

So – if historical analogies can teach us anything about the potential regulation of nano and other emerging technologies, they indicate the need to take a little risk in forming socially and politically constructed definitions of nano. These definitions should be based not just on science but should also mirror the complex and messy realm of research, policy, and application. No single analogy fits all cases, but an ensemble of several (properly chosen, of course) can suggest possible regulatory options.

  1. House of Representatives Committee on Science, “The Societal Implications of Nanotechnology,” Hearing before the Committee on Science, House of Representatives (108th Congress, 1st Session), April 9, 2003, p. 49. A quick web search on “Vicki Colvin + wow to yuck” yields some 360,000 hits, including several presentations and papers she and her Rice colleagues gave that use the phrase.
  2. Ronald Sandler, “The GMO-Nanotech (Dis)Analogy?,” Bulletin of Science, Technology, and Society 26:1 (2006): 57-62; Arie Rip, “Folk Theories of Nanotechnologists,” Science as Culture 15:4 (2006): 349-65.
  3. A good article on this is Geoffrey Tweedale and Jock McCulloch, “Chrysophiles versus Chrysophobes: The White Asbestos Controversy, 1950s–2004,” Isis 95:2 (June 2004): 239-259.
  4. Marc Landry, “EPA and Nanotechnology: The Need for a Grand Bargain?,” in Governing Uncertainty: Environmental Regulation in the Age of Nanotechnology, edited by Christopher Bosso (Washington, DC: RFF Press, 2010), p. 87.
  5. Harold L. Beck and Burton G. Bennett, “Historical Overview of Atmospheric Nuclear Weapons Testing and Estimates of Fallout in the Continental United States,” Health Physics 82:5 (May 2002): 591-608.
  6. This treaty, signed in 1963 by the United States, the Soviet Union, and Great Britain after years of international negotiation, banned nuclear test explosions in the air, above the atmosphere, or at sea.
  7. House of Representatives Committee on Science, “The Societal Implications of Nanotechnology,” Hearing before the Committee on Science, House of Representatives (108th Congress, 1st Session), April 9, 2003: Wu, p. 91.
  8. Andrew Maynard, “Don’t Define Nanomaterials,” Nature 475 (7 July 2011): 31.

Regulating Nanotechnology Via Analogy, Pt. 1

[Blogger’s note: This post is adapted from a talk I gave in March 2012 at the annual Business History Conference; it draws on research done by Roger Eardley-Pryor, an almost-finished graduate student I’m advising at UCSB, and me. I’m posting it here with his permission. This is the first of a two-part essay…some of the images come from slides we put together for the talk.]


Take a Little Risk?

Over the last decade, a range of actors – scientists, policy makers, and activists – have used historical analogies to suggest different ways that risks associated with nanotechnology – especially those concerned with potential environmental implications – might be minimized. Some of these analogies make sense…others, while perhaps effective, are based on a less than ideal reading of history.

Analogies have been used before as tools to evaluate new technologies. In 1965, NASA requested comparisons between the American railroad of the 19th century and the space program. In response, MIT historian Bruce Mazlish wrote a classic article that analyzed the utility and limitations of historical analogies.1 Analogies, he explained, function as both model and myth. Mythically, they offer meaning and emotional security through an original archetype of familiar knowledge. Analogies also furnish models for understanding by construing either a structural or a functional relationship. As such, analogies function as devices of anticipation within what is today fashionably called “anticipatory governance.”2 They can also serve as a useful tool for risk experts.3


Analogies may serve as a tool for crafting better governance of new technologies

Policy makers recognize the importance of analogies. In 2003, participants at an NSF-sponsored workshop on nanotechnology’s societal implications warned that “one of the more disturbing possibilities is that policy makers and leaders of social movements may respond to nanotechnology not as it actually is, but in terms of false analogies.”4 At the time, policy makers and scientists were especially concerned about public perceptions of nano.

When the U.S. government first launched its National Nanotechnology Initiative in 2000, few if any policy makers expressed concerns about its environmental implications.5 But by 2003, it was impossible for the people charged with managing nano to ignore its environmental, health, and safety (EHS) issues. So how did EHS issues get on the radar screen of policy makers and journalists? There are several causal factors; their common feature is that they all originated not in the laboratory but in the realms of popular culture, celebrity, and social activism.

An early shot across the bow came from an unexpected source. Bill Joy was a Berkeley-trained computer researcher and dot-com millionaire. His incendiary article – published by Wired in April 2000 – was titled “Why the Future Doesn’t Need Us.” It highlighted the perils he saw in several emerging technologies. Motivated partly by controversies over the corporate development of genetically modified crops, Joy identified the self-replication of newly emerging nanotechnologies as a clear and future danger. The solution? Joy proposed “relinquishment” and limiting the development of “technologies that are too dangerous.” Accompanied by a flurry of international publicity, Joy’s article came at an inconvenient time for nano-boosters, as Congress was preparing to vote on Clinton’s proposed national nano initiative in 2000. Controversy stirred by articles like Joy’s threatened this initiative.

Nano-anxieties were fanned anew in late 2002 when HarperCollins published Prey by blockbuster novelist Michael Crichton. Central to its plot was the deliberate release of autonomous, self-replicating nanobots. Created by an amoral corporation working under contract to the Pentagon, the predatory swarm of millions of nanobots attacked people until it was destroyed. Crichton’s book hit every button that might stoke public alarm about nanotechnology: a greedy, high-tech firm; lack of government regulation; new technologies turned into military applications.

Non-governmental organizations helped keep controversies over nanotechnology in front of North American and European citizens. In January 2003, the Action Group on Erosion, Technology, and Concentration (ETC), a small Canadian NGO, released a report called The Big Down.

Screen Shot 2013-02-10 at 3.16.46 PM

ETC had previously led campaigns against genetically modified foods. Not surprisingly, their report savaged the idea of nanotechnology. ETC’s report reflected the group’s larger agenda, which was less about so-called emerging technologies per se and more about restricting corporate power and maintaining cultural diversity and human rights.

But none of these examples concerned a specific existing technology. Instead, these spurs to regulation referred to hypothetical technologies and the creation of planet-threatening dangers. Soon, however, concerns about nano’s regulation transcended vague existential threats and moved to specific and potentially troubling techniques and materials.

But what exactly was to be regulated? Was nanotechnology something with the capacity to spread across wide swaths of land and wreak tremendous environmental damage, the fear amplified in part by its minute size? Or was nanotechnology less an existential threat and instead a suite of scientific techniques and tools that required regulation? If not a particular technique, was nanotechnology a particular product, a specific category of material, a hazardous form of matter that should be controlled for the health and safety of workers and consumers? Or did nanotechnology represent an entirely new industry in need of care and control in order to reap its economic benefits?


How you define something helps determine how you regulate it…

So…in order to draw fitting analogies that might suggest an ideal path toward the appropriate oversight or regulation of nanotechnology, stakeholders first had to agree on its definition. And depending on which definition one chose, a different historical analogy could be found, suggesting a different approach to regulation…more on this next time. But, to give a hint, the frequent comparisons between nano and genetically modified organisms were not necessarily the best foundation for a regulatory policy.6

To be continued…

  1. Bruce Mazlish, “Historical Analogy: The Railroad and the Space Program and Their Impact on Society,” in The Railroad and the Space Program: An Exploration in Historical Analogy, edited by Bruce Mazlish (Cambridge, MA: M.I.T. Press, 1965), pp. 1-52.
  2. Daniel Barben, Erik Fisher, Cynthia Selin, and David H. Guston, “Anticipatory Governance of Nanotechnology,” in The Handbook of Science and Technology Studies, Third Edition, edited by Edward J. Hackett, Olga Amsterdamska, Michael Lynch, and Judy Wajcman (Cambridge, MA: MIT Press, 2008), 979-1000.
  3. Scott Knowles at Drexel University has a great book out on the role of “disaster experts” in modern America that is worth looking at.
  4. “Workshop Breakout Session Reports,” in Nanotechnology: Societal Implications I, Maximizing Benefits for Humanity, edited by Mihail C. Roco and William Sims Bainbridge (Dordrecht, The Netherlands: Springer, 2007), 73.
  5. When the National Academies of Science reviewed the NNI in 2002, its report showed that, of the $464 million allocated for nano in FY2001, less than 1% went to the Environmental Protection Agency. NAS, Small Wonders, Endless Frontiers report, 2002.
  6. A rhetorically powerful example of this came from 2003 Congressional testimony given by Rice University chemist Vicki Colvin. Colvin, director of Rice’s Center for Biological and Environmental Nanotechnology, spoke about the societal implications of nanotechnology. This “emerging technology,” Colvin said, had a considerable “wow index.” Nanotech offered “potential benefits to the economy, human health, and quality of life.” However, Colvin warned, every new such emerging technology came with its own particular set of concerns. If improperly handled, these concerns “can turn wow into yuck and ultimately into bankrupt.” To drive her point home, Colvin shrewdly drew an analogy between a future in which nano might go bankrupt and an example that would resonate with policy makers – the “genetically modified foods industry.” A quick web search on “Vicki Colvin + wow to yuck” yields some 360,000 hits, including several presentations and papers she and her Rice colleagues gave that use the phrase. Her original testimony appears in House of Representatives Committee on Science, “The Societal Implications of Nanotechnology,” Hearing before the Committee on Science, House of Representatives (108th Congress, 1st Session), April 9, 2003.