Getting Medieval On Your Asteroid

Prefatory note: On December 25, 2012, The New York Times published an article about a company’s plans to mine asteroids for precious metals. This resulted in some entertaining and thought-provoking discussions with Nicole Archambeau, the resident medieval historian at my house. Nicole offered to convert her thoughts into a guest blog post for Leaping Robot. Here it is…

The Bayeux Tapestry depicting Halley’s Comet in the 11th century; today some people propose mining asteroids or spent comets.

It may seem odd for a medieval historian, but I love a good article on space research and exploration. So when I saw the headline “A Start-Up Sees a Gold Rush Among the Stars” on Christmas day, I looked forward to the article, like an unexpected present. But this was a present obscured by its wrapping paper. And as I read the article, I realized there was an unlikely similarity between space exploration and medieval history – news coverage that subtly mocks and misleads in favor of the easy joke.

Inaccurate comparisons with fiction bedevil both fields. As I read the article on asteroid mining, I wondered why the author highlighted the movie Avatar, which primarily focuses on the interaction of living species. Yes, James Cameron is one of Planetary Resources’ advisers, but asteroid mining would hardly encounter the problems seen in the movie, and there were many other advisers to choose from. Worse yet was the logical fallacy used to end the article on a titillating note. Nothing about the company or its website suggests they work on Armageddon-style asteroids on a collision course with Earth, but the author forced it in there. The article brought up fictional planets and Earth’s destruction at the expense of interesting questions about the commercial and scientific nature of asteroid mining. Planetary Resources and their backers (many from Google) want to mine asteroids for metals in the platinum family – why are these metals so important that the expense and difficulty of off-planet mining makes sense and will pay off? What could these metals be used for? As a reader with little background in science but lots of curiosity, spare me the movie references and tell me what’s at stake.

I see this technique frequently when authors bring up Game of Thrones, Braveheart, Monty Python and the Holy Grail, and The Da Vinci Code to discuss artifacts and issues in the Middle Ages. While this pattern is starting to change (another NYT article, by Karen Jones, is refreshing proof that not every article needs a movie reference), that change is slow. Journalists recently did some good work with a papyrus discovery suggesting Jesus had a wife, but even the NYT article ended with – no surprise – Dan Brown and barely addressed the impact such a text could have on the Catholic hierarchy.

Witch-burning “medieval” peasants shown in the 1975 film Monty Python and the Holy Grail

More unfair, however, is a subtle undermining of topics like medieval history and space exploration by mixing any serious discussion with puns and kitschy references. In Johnson’s article, even the critic he spoke to didn’t think Planetary Resources’ project would fail, just that it would take longer and be more expensive than estimated. Hardly a reason to indirectly question the seriousness of the project with a tired Star Trek reference.

Medievalists experience this undermining all the time, though it’s usually much darker. While the adjective ‘medieval’ often appears alongside ‘fairy-tale’, which undermines its historical reality, terms like ‘medieval’ and ‘dark ages’ are also used to suggest something is dirty, ugly, violent, and ignorant. For example, an article about the Taliban described them as ruling Afghanistan with “medieval brutality, denying women education or health care.” But women in medieval Europe and the Mediterranean were not denied health care as if they all lived under a fundamentalist sect. And some prominent American schools (including West Point and Caltech) denied women access until within my lifetime. Labeling the Taliban’s brutality “medieval” isn’t just wrong; it undermines the validity of medieval history in popular culture. I feel the impact of this when I tell people I’m a medieval historian and almost always hear about how interested they are in torture and witches.

So while the easy joke or movie reference may seem to make an article more fun for the audience, it more often oversimplifies and leads to shallow questioning. Planetary Resources is a for-profit company that does not have the same idealistic goals as a 1960s television program. The Taliban is very much a 21st-century community, not a medieval one. By replacing thoughtful analysis with puns and pop culture, we belittle exactly what we as authors want our audiences to engage with and instead allow them to dismiss it as fiction.

Science vs. Technology, Redux and Again

Last week, Jacob Darwin Hamblin posted a great blog entry that struck at least two chords with me. Titled “Science vs. Technology Smackdown: Have We Survived the 1950s?”, it revisited the professional divide (of sorts) that is familiar to some historians of technology and science. Hamblin’s point of departure was a brief conversation with a European historian of science, who asked whether “there was still discord between historians of science and historians of technology.” Briefly stated – historians of science and technology are thought not to play well with one another.

The root of this goes back to 1957 when, at a meeting of the American Society for Engineering Education at Cornell, proponents of including the history of technology under the umbrella of the History of Science Society (HSS) were spurned.1 Consequently, in the shadow of Sputnik, the Society for the History of Technology (SHOT) was started. Since then, it has been standard lore for graduate students starting in either field to hear tales – often over pints of beer – about how the two communities remain “largely separate and mutually wary.”2 I have no intention of revisiting the consequences of this divide here or reexamining the many ways in which science and technology are both different and complementary ways of knowing about the world.

However, I appreciated Hamblin’s take on this divide, with which I agree completely – “Why on earth would I ignore one by choosing the other?” I understand that some people feel greater loyalty to HSS or SHOT…but to imagine that the discord of 1957 should somehow be continued or maintained is self-defeating. Both fields are small archipelagos compared with the larger field of history. Just as seriously, both need to find as many allies and members as they can within the humanities and beyond.

At recent SHOT and HSS meetings there were signs of progress in this area. In 2009, HSS featured a session that explored the historically (and historiographically) contested nature of “applied science.” Papers from it were published in the September 2012 “Focus” section of Isis. And in 2011, HSS and SHOT jointly sponsored a session called “Beyond the Science-Technology Relationship.”

I was the commentator for the papers presented at this latter venue. As I put my thoughts together for it, I recalled how the relationship between science and technology has occupied many excellent scholars over several decades. Each time a new cohort re-examines it, we get new insights into how different historical actors, from different national traditions, have expressed the interactions between science and technology. As a result, we now have many ideas and metaphors that are required points of discussion for new graduate students: Ed Layton’s mirror-image twins, John Pickstone’s “ways of knowing,” Ron Kline’s focus on the rhetoric of American engineers, and Paul Forman’s consideration of technology’s primacy as a sign of post-modernity.

This took me back to a classic paper on the science-technology divide: Otto Mayr’s “The Science-Technology Relationship as a Historiographic Problem.”3 Mayr was one of the first scholars to advocate a focus on discourse as the site for analyzing the distinctions and similarities between science and technology. In the 1976 article, Mayr – then a curator at the Smithsonian – suggested we treat science and technology as historically variable entities whose connection with one another was far from fixed. Today this seems almost axiomatic. But it wasn’t when Mayr’s article appeared.

Mayr first presented his ideas at a 1973 workshop, convened by Bern Dibner, called “The Interaction of Science and Technology in the Industrial Age.” Historians attending it included Tom Hughes, Nathan Reingold, and Arnold Thackray. Their essays focused on the relative contributions of science and technology – variously defined – to one another. Recall that this conference occurred on the heels of Project Hindsight, the controversial study that refuted claims that basic research was the font of new applications.4 In 1973, debating the science-technology relationship had importance beyond the academy. We’re in a similar situation now, I believe, as university administrators and policy wonks talk about post-academic and Mode 2 science.

Attendees at the ’73 meeting greeted Mayr’s essay with some skepticism. For example, one person asked why “we are hung up with the meaning of science and technology” given that “political historians have given up arguing about the meaning of politics?” But Mayr was less interested in definitions than in discourse. He argued that the historian’s job is to understand the changing relationship between science and technology not as an absolute but as it was understood at the time. Which brings us to the papers I commented on back in 2011…

One of the papers from the joint HSS-SHOT session that I really liked was by Paul Lucier.5 Paul’s talk and paper take us back to Henry Rowland’s classic 1883 “Plea for Pure Science.” Now standard reading in many history of science/technology courses, Rowland’s essay is both “eminently quotable” and wonderfully versatile as a tool to get undergrad students thinking about the status and nature of science and technology c. 1880. Rowland may have been pleading when he gave this address but, based on Lucier’s new reading of it, I’m less convinced that his appeal was entirely pure.

In Lucier’s telling, the very categories that actors like Rowland used – inventor, pure science, applied science, and so forth – had already been contested for decades. The idea of pure science as “science for its own sake” goes back to people like Benjamin Silliman and Alexander Dallas Bache in the early 19th century – in other words, Rowland was continuing a long-standing conversation. Definitions of pure science, Lucier notes, changed over time – for Simon Newcomb it bespoke membership in a select group with common goals. Later, during the Gilded Age, “pure” was re-interpreted to mean not membership but motivation. As Rowland saw it, pure science was an activity safely disconnected from the era’s political and financial corruption. (There is some irony here, as Rowland himself achieved considerable renown for developing tools to make better gratings for astronomical spectroscopy.)

The distinction between pure and applied was, for actors like Rowland, more a matter of distance than of kind. Pure science might indeed prove useful one day but, to paraphrase St. Augustine, “yes, but please…not just yet.” Discourse in the 1880s around “applied science” suggested a third and more desirable option. As explained in the journal Science – re-launched in 1883, the same year as Rowland’s plea – so-called applied science could be an even more noble pursuit than pure science; such activities gave us the “investigator-utilizer,” a hybrid creature compared to the pure scientist or pure inventor. As the union of science and art (that is, technology), applied science offered its practitioners the potential to advance both knowledge and the public welfare.

Whereas people like Rowland were pessimistic about the effects of corruption, proponents of applied science, playing John Locke to Rowland’s dour Hobbes, imagined a world where patents and research advanced the public interest. Pure and applied were not separate and opposed but conjoined. Pure science, then as now, reflects prevailing political and economic realities. This matters not least for our own work as historians as we examine, for example, conceits such as the linear model of innovation.

National contexts are vitally important in unpacking discourse around the science-technology relation. As we think about other ways to examine this relationship, we might start to ask: How do formulations of it cross national borders and languages? How do they move and change between time periods? I wonder what this landscape will look like when we begin to examine other languages and contexts – a palette that includes Mexican, Russian, Brazilian, or Chinese examples is exciting to contemplate.

Looking more broadly, recent work on the histories of so-called emerging technologies – nanotechnology, biotech, genetics research, synthetic biology, green tech – shows that the relationship between science and technology remains both fluid and a productive site for future inquiry. Take one example. More than 20 years ago, Eric Drexler, arguably the most influential popularizer of nanotechnology, set himself the goal of designing nanoscale devices that were theoretically possible but could not yet be built. Drexler referred to his activities as “exploratory engineering” and, even more enigmatically, “theoretical applied science.” Consider the semantic challenges of parsing that phrase…

I suspect similar blurry boundaries can be found in other fields of techno-scientific research that have emerged in the post-Bayh-Dole period. These contemporary examples – when researchers, with a nudge and a wink, call “science” “technology,” and when visionary engineers blur already indistinct boundaries between science and engineering – suggest that new graduate students, less beholden to either the SHOT or HSS camp, will have plenty of new terrain to explore.

Note: I’ll be taking a break from Leaping Robot for a few weeks…I should be back at it right after the New Year starts. Best wishes for the holiday season! 

  1. This is recounted in Ch. 1 of John M. Staudenmaier, Technology’s Storytellers (Cambridge: MIT Press, 1985).
  2. James McClellan III, “What’s Problematic About ‘Applied Science’,” in The Applied Science Problem, ed. McClellan (Jersey City: Jensen/Daniels, 2008).
  3. This appeared in Technology and Culture, 1976, 17, 4: 663-73.
  4. C.W. Sherwin and R.S. Isenson, “Project Hindsight. A Defense Department Study of the Utility of Research,” Science, 1967, 156, June 23: 1571-7.
  5. A version of this appears in the October 2012 Isis as “The Origins of Pure and Applied Science in Gilded Age America,” pp. 527-536.

Psycho-Historicizing the Future

One of my favorite Internet mini-memes circulating this past week has been Paul Krugman’s confession that his decision to become an economist was shaped in part by his reading of Isaac Asimov’s Foundation trilogy. For Krugman, Asimov’s depiction of “psychohistorian” Hari Seldon’s ability to chart the future of the galaxy was both compelling and inspiring. So it must have been gratifying for the Nobel laureate to write an introduction to a new edition of Asimov’s classic sci-fi novels.

Cover of Asimov’s Foundation, first published as a book in 1951

In Asimov’s stories, “psychohistory” is a rigorous field of study in which mathematicians, sociologists, historians, and other scholars can predict how society changes and the ways in which it can be encouraged to move in certain desired directions. As Krugman says, Seldon and company “discover that a carefully designed nudge can change that path. The empire can’t be saved, but the length of the coming dark age can be reduced to a mere millennium” – sort of what Krugman and the Federal Reserve have been trying to do for the world economy.

I was reminded of Krugman’s reflections when I read a New York Times article about a recently completed study, Global Trends 2030: Alternative Worlds, undertaken by the U.S. National Intelligence Council. As the report’s web page says, the exercise did “not seek to predict the future—which would be an impossible feat—but instead provide a framework for thinking about possible futures and their implications.” Among the study’s many conclusions: China will supplant the U.S. as the leading global economic power before 2030; terrorists may be able to launch a computer-network attack that disrupts the lives of millions; and the developing world will grow in economic importance. More interesting to me was the identification of several core “disruptive technologies.” These included a certain form of transhumanism in the guise of the “enhancement of human mental and physical capabilities and anti-aging.” I’ll come back to this topic in a future post. But, all in all, not terribly shocking or surprising stuff.

Cover of the new Global Trends report

Two aspects of the report caught my eye, albeit in different ways. One was the report’s conclusion that there would be a “more fragmented international system” in which there is less cooperation and more competition. One result of this might be “increased resource competition, spread of lethal technologies and spillover from regional conflicts” which “increase the potential for interstate conflicts.”

This prediction reminded me of the results of similar studies done by the Global Business Network in the late 1980s and early 1990s. Now part of the Monitor Group, GBN was a small but influential Bay Area-based consulting firm started in 1987. GBN offered what it called “survival insurance” to an expanding roster of “companies who wanted to be smarter.” To do this, GBN used “adaptive scenarios,” which it described as “creative tools for ordering one’s perception about possible alternative future environments.”1 Scenario planning was a legacy of the Cold War: researchers such as Herman Kahn had prepared scenarios to game nuclear war simulations, and the oil giant Royal Dutch Shell had used similar stories of “possible futures” to survive the energy crises of the 1970s. Futurist Peter Schwartz had worked at Shell and had known Stewart Brand – famed for starting The Whole Earth Catalog in 1968 – since the early 1970s. After they joined with Jay Ogilvy, a research manager at the Stanford Research Institute, the three men launched GBN.2

GBN started its work at a propitious time. With the end of the Cold War looming – indeed, the “end of history” if one believed Francis Fukuyama – there was a market for organizations that could provide business leaders and policy makers with some sense of the future. Corporate clients like AT&T and Volvo, as well as numerous energy-related companies, paid upwards of $7,000 a day for GBN’s consulting services. GBN boasted an eclectic roster of business executives, artists, academics, and technologists who mingled with each other and with GBN’s corporate clients at well-orchestrated events. For an annual fee of $25,000, clients could formally join the GBN network and get access to these experts and celebrities. Within three years, GBN was profitable.

To guide discussions with corporate clients, GBN prepared three “mental maps of the future” which covered a wide range of possibilities. The first of these, dubbed “Market World,” envisioned that expanding free market forces and neoliberal economics would create “a virtuous circle of technological innovation in an increasingly interactive and prosperous economy.” At the other end of the spectrum, “New Empires” predicted protectionist nation-states banding together to create competitive regional clusters and trade barriers. Finally, if “turbulence and volatility seem to be the only constant,” then a scenario described as “Global Incoherence” would dominate.3

Cover page of (draft) GBN scenario book

GBN and its clients clearly favored the triumph of “Market World,” with its enhanced trade opportunities and favorable geopolitics. But in all the forecasts GBN made during its early years of operation, technology figured as one of, if not the, most prominent forces for social, economic, and political change. Technology is not quite as prominent in Global Trends 2030, though it “will continue to be the great leveler.” The report lists “four technology arenas” that will shape “global economic, social, and military developments” – information technologies (no surprise); new manufacturing technologies, including 3-D printing (?!); resource-related technologies, i.e., those that allow leaders to meet the food and energy needs of their populations; and, finally, advances in healthcare.

How does this stack up with what GBN saw roughly two decades ago? GBN’s list isn’t that surprising – co-founder Stewart Brand had been spending a lot of time at MIT while doing the research for his 1987 book The Media Lab: Inventing the Future at MIT, and all of the technologies had ties to MIT in some fashion. The first draft of its “1990 Scenario Book” listed “massive parallel computing” (Danny Hillis, an MIT alum, had started Thinking Machines, which made parallel supercomputers), “interactive mass media” (Nicholas Negroponte ran MIT’s Media Lab), and “nanotechnology” (Eric Drexler, another MIT alum, popularized nano) as the areas to watch for the future. Who prepared the technology sections for the Global Trends 2030 report isn’t entirely clear.4 But it would be interesting to know to what degree personal and professional experiences shaped its visions of the technological future.

The second aspect of the Global Trends 2030 report that stirred my thoughts was its claim that the future is “malleable.” This idea resonates with some recent scholarship – including my own – that makes the case that the technological future is not a neutral space but a contested one.5 In this “predictive space,” the future exists as an unstable entity that different groups vie to claim and construct through their texts, their artifacts, and their activities while marginalizing alternative futures.

I am, of course, fascinated by the unrealized visions of the technological future that litter the past. In the late 1960s and early 1970s, a proliferation of futuristic visions about technology coincided with a larger wave of concern, even obsession, about the future. In a golden age of research and writing about technological tomorrows, professional “futurologists” became well-paid celebrities sought out for their glib advice. Hugely popular books such as Future Shock, Alvin Toffler’s 1970 bestseller, advised readers to brace for wrenching social changes as an old economy based on heavy industry gave way to a new one founded on information. The future also became an object of serious scholarly inquiry as economists, computer scientists, and sociologists attempted to understand – much like Asimov’s Hari Seldon – the future more “scientifically” and proposed ways in which society might navigate toward alternate, more desirable futures.

Then, as now, people didn’t just look toward the future – they looked at it. Global Trends 2030 is a continuation and a legacy of this Cold War “future thinking.” As we move into 2013, one can only hope that the future will be as cheery as Krugman’s reminiscences about how he once wanted to think about tomorrow.

Followup: Since I first posted this, a few colleagues have directed me to some other related things on the web. One is a take on how Asimov’s Foundation series is “historical materialism distorted into cyclical psycho-history.” Another is a link to a PDF of Krugman’s introduction to the new edition. An on-line article picked up the thread about the “transhuman future” the Global Trends 2030 report describes. Finally, here is a 2008 post from Krugman with an early acknowledgment of his Foundation fetish.

  1. January 6, 1988 memo from Peter Schwartz; Folder 1, Box 68, Stewart Brand papers, Stanford Archives (hereafter SB/SA).
  2. There is a nice section on GBN in Fred Turner, From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism (Chicago: The University of Chicago Press, 2006) as well as in W. Patrick McCray, The Visioneers: How a Group of Elite Scientists Pursued Space Colonies, Nanotechnologies, and a Limitless Future (Princeton: Princeton University Press, 2013).
  3. Described in the December 1988 issue of the GBN publication The Deeper News as well as in the “1990 Scenario Book”; Folder 1, Box 66 and Folder 7, Box 75, both SB/SA.
  4. The report credits input from two people at Strategic Business Insights.
  5. Ideas presented in several of the essays in Nik Brown, et al., eds., Contested Futures: A Sociology of Prospective Techno-Science (Aldershot, 2000) and Marita Sturken, Douglas Thomas, and Sandra J. Ball-Rokeach, eds., Technological Visions: The Hopes and Fears That Shape New Technologies (Philadelphia, 2004).