Where Did That iPod Come From?


Apple’s 1st Generation iPod, 2001

Few topics are more controversial among people who study the histories of science and technology than the categories of basic versus applied research. In part, that’s because the border between basic and applied research shifts over time. What was “basic” in 1880 might be applied (or simply black-boxed knowledge) a few decades later. Moreover, even within one particular time period, the labels “basic” and “applied” are actors’ categories. They’re contingent on the circumstances in which researchers deploy them and the rhetorical work they (and we) want them to do.

I was reminded of these distinctions this week when the Technology Academy Finland announced the recipient of its prestigious Millennium Technology Prize (and a cool 1 million euros). The winner is Stuart S.P. Parkin, an IBM Fellow based at the Almaden Research Center in San Jose as well as the director of the Max Planck Institute of Microstructure Physics.

2014 Millennium Technology Prize

1 million reasons to smile.

Parkin is being honored for discoveries that have “enabled a thousand-fold increase in the storage capacity of magnetic disk drives.” These innovations, according to the Finns, have “underpinned the evolution of large data centers and cloud services, social networks, music and film distribution online.”

Parkin’s work is especially illuminating for the “what is science and what is technology?” question given its historical background. Its origins go back to basic physics research carried out in the late 1980s in France and Germany. In 1988, Albert Fert and Peter Grünberg independently discovered that tiny changes in magnetism can produce unexpectedly strong electrical signals. Because the response was so much greater than either team expected, they named the phenomenon giant magnetoresistance (GMR).1 In mid-1988, both the French and the German research teams presented their results at a conference in France and submitted their studies for publication in the Physical Review. Aware now of each other’s work, the two scientists agreed to share credit for the discovery. Their work also helped initiate a new field of interdisciplinary research called spintronics, which blends novel solid-state physics and device engineering with well-established areas of research such as magnetics and materials science.


Happy days…Fert and Grünberg, 2007

In 2007, the two European scientists shared the Nobel Prize in physics. Yay! But what happened between 1988 and 2007 is the key here…and this is where Parkin entered the picture.

The first commercial application of the GMR phenomenon was in sensors that detect slight magnetic fields, used for things like landmine detection and traffic control systems. While such sensors were fine for niche applications, other companies were eager to apply the GMR effect to more lucrative markets. Among them was IBM and the researchers it employed at its Almaden lab, a place with a long tradition of work on magnetic information storage technologies.

Parkin was one of the IBM researchers who undertook this work. He earned his doctorate in physics in 1980 while working at the Cavendish Laboratory in Cambridge. Still in his twenties, he joined IBM’s Almaden laboratory in 1982 and spent several years on topics such as high-temperature superconductivity. After learning about the Europeans’ GMR research, he began to explore the magnetic properties of multilayer thin films with an eye toward improving the capabilities of the company’s hard disk drives.

In 1991, Parkin and his colleagues described what they called a “spin valve,” a device that makes use of the GMR phenomenon.2
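For readers who like to see the arithmetic: the spin-valve behavior described in footnote 2 can be caricatured with the textbook two-current model, in which each spin orientation conducts through its own channel. This is a toy illustration, not Parkin’s actual device physics, and the resistance values below are invented.

```python
# Toy two-current model of a GMR spin valve. Each spin channel ("up"
# and "down") conducts independently through the two magnetic layers;
# the two channels combine in parallel. Resistance values are made up
# purely for illustration.

def series(r1, r2):
    """Resistance of two layers traversed in sequence."""
    return r1 + r2

def parallel(r1, r2):
    """Two independent spin channels conducting side by side."""
    return r1 * r2 / (r1 + r2)

R_LOW, R_HIGH = 2.0, 10.0  # hypothetical per-layer channel resistances

# Magnetizations aligned: one channel is low-resistance in both layers.
r_aligned = parallel(series(R_LOW, R_LOW), series(R_HIGH, R_HIGH))

# Magnetizations opposed: every channel crosses one high-resistance layer.
r_opposed = parallel(series(R_LOW, R_HIGH), series(R_HIGH, R_LOW))

gmr = (r_opposed - r_aligned) / r_aligned
print(f"aligned: {r_aligned:.2f}  opposed: {r_opposed:.2f}  GMR: {gmr:.0%}")
```

The point of the sketch is the asymmetry: when the layers align, at least one spin channel sails through and resistance drops; when they oppose, both channels are impeded. A read head detects that resistance swing as a stored bit.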

Unlike Fert and Grünberg, who built their samples using molecular beam epitaxy, a more precise but slower and more expensive tool, Parkin’s group tried sputter deposition equipment. This fit the goals of the Almaden group, whose focus wasn’t basic science per se but making devices that could be readily manufactured. Parkin’s use of sputtering held appeal for a company like IBM, which had extensive experience in fabricating sputter-deposited magnetic storage media on an industrial scale. As one observer of Parkin’s research later recalled, the British scientist and his colleagues “simply engineered the shit” out of the underlying GMR discovery as they made and characterized over 30,000 different multilayer combinations.

IBM eventually used the spintronics research Parkin and his colleagues had done to redesign and improve a basic element – the read head – in the company’s hard disk drives.


In November 1997, The Wall Street Journal carried a front-page story about IBM’s unveiling of a new innovation for the personal computer industry. Based on the Almaden group’s exploitation of the GMR phenomenon, the new drives featured exquisitely sensitive magnetic read heads. They could store eight times as much data as competitors’ equipment while being physically smaller.3 This helped set the stage for the subsequent explosion in computer storage.


The result? Tiny hard drives that Apple incorporated into its early iPods. Fert and Grünberg’s work – channeled through Parkin and his colleagues at Almaden – found its way into the hands of millions of earbud-wearing teenagers and subway commuters.

Historians recognize that “pure science” is very much a social construction, and one that, on closer scrutiny, often turns out to be not quite so uncomplicated. Fert and Grünberg originally discovered GMR in the tradition of small-scale basic physics research. Parkin helped translate it into practical applications. Businesses, large and small, swiftly patented those applications and integrated them into products worth billions of dollars in annual sales. In this week’s Millennium Prize announcement, we can discern connections between contemporary scientific research and engineering applications and also see the shifting boundaries between science and technology.

  1. Peter Grünberg, then at the Jülich Research Center in western Germany, and his team made their discovery using Fe/Cr/Fe tri-layers. At the same time, Albert Fert and his group at Laboratoire de Physique des Solides at the University of Paris-Sud were examining more complex Fe/Cr multilayers. Fert and Grünberg’s work represented nanotechnology research long before the word itself fully entered the popular lexicon. For example, Fert’s team prepared its alternating iron and chromium layers – each less than ten nanometers thick – with molecular beam epitaxy, a key proto-nano research tool, before observing the GMR effect. []
  2. As described in its basic form, a spin valve is composed of two magnetic layers separated by a nonmagnetic layer. When the magnetic moments of the two magnetic layers are aligned, electrons move more easily and the sample shows low resistance. If the magnetic layers are not aligned, the spin-dependent movement of electrons is impeded and resistance goes up. In this way, the device acts as a valve, affecting the passage of electrons depending on whether the valve is “open” or “closed.” At about the same time, Parkin and four other colleagues filed for a patent, which was awarded in October 1992. []
  3. Raju Narisetti. “IBM Unveils Powerful PC Disk Drive, Confirms Plans to Join Two Divisions.” The Wall Street Journal, November 10, 1997: 1. IBM’s device held about 17 gigabytes of data (double what the company had previously offered) and was 3.25” in size; the best products from other firms had about thirty percent less storage capacity and were two inches bigger. []

Leaping Robot Heads Back to the Classroom

Today is the first day of the spring quarter at UCSB.  This naturally translates into less time for blog writing. But – on the plus side – I’m teaching two classes which should generate some interesting ideas for future posts.

One of them is an upper-division undergraduate course framed around the idea of “Technology, Power, and the American Century.” The goal is to look at the myriad ways technology and power (of all kinds) intersected during the long American century. (I’m defining this as roughly 1870 to the end of the Cold War…from when the U.S. became a key industrial power and a major figure on the global stage to when America emerged as the sole hyper-power after the collapse of the USSR). If you’re interested, a copy of the syllabus is here.

I am also co-teaching a graduate level class with a colleague from the Communications Department – it’s the “gateway seminar” for our Technology and Society doctoral emphasis (sort of like a graduate minor). We decided to focus on questions – historical as well as contemporary – about data for a class we’re calling “Data: Big & Small, Raw & Cooked.” Besides thinking about the changing historical context of data – what is it? How have people collected, managed, and used it over time? – we will also address contemporary issues associated with data, especially those related to Big Data.

As always, thanks for your interest in Leaping Robot :)  More anon…

It’s the End of the World and I Feel…Conflicted.

One of the more bizarre science-related stories last week had nothing to do with telescopes at the South Pole or the inflationary expansion of the Universe. Rather than focusing on the Beginning of Time, a flurry of activity was spurred by an allegedly “NASA-sponsored” study that addressed the End of Times.

The story got its start when a draft circulated of a peer-reviewed article soon to be published in the journal Ecological Economics. Its authors were two researchers from the University of Maryland – Safa Motesharrei (a graduate research assistant in Applied Math and Public Policy) and Eugenia Kalnay (faculty) – and Jorge Rivas of the University of Minnesota (although his web site is out of service). Motesharrei, the lead author, is affiliated with the National Socio-Environmental Synthesis Center at UMD, which in turn is funded by the National Science Foundation.

The title of their article gives some clue to the ensuing controversy: “Human and Nature Dynamics (HANDY): Modeling Inequality and Use of Resources in the Collapse or Sustainability of Societies.” (abstract below for what I’ll call the “HANDY paper”).

Abstract of the HANDY paper

Their study used a particular model – called H(uman) A(nd) N(ature) DY(namics), or HANDY – to test mathematically whether contemporary society is susceptible to the sort of environmental or economic meltdowns that doomed earlier civilizations. The paper opens with a Jared Diamond-style summary of how earlier civilizations – Roman, Mesopotamian, Chinese, etc. – had all succumbed. The authors go on to detail their use of “four equations describ[ing] the evolution of Elites, Commoners, Nature, and Wealth.” Their resulting model “shows Economic Stratification or Ecological Strain can independently lead to collapse.” Such a fate “can be avoided…if the rate of depletion of nature is reduced to a sustainable level and if resources are distributed equitably.”
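To make the mechanics of this kind of modeling concrete, here is a deliberately crude toy in the same spirit. To be clear: this is my own three-variable cartoon, not the authors’ four-equation HANDY model, and every coefficient is arbitrary. A population harvests a regenerating natural stock, accumulates wealth from the harvest, and goes into decline once the stock is run down.

```python
# A cartoon of HANDY-style collapse dynamics (my own simplified
# three-variable version, NOT the authors' model; all coefficients
# are arbitrary). Integrated with a simple forward-Euler loop.

def simulate(steps=2000, dt=0.05):
    x, y, w = 1.0, 10.0, 0.0  # population, nature, wealth (arbitrary units)
    history = []
    for _ in range(steps):
        harvest = 0.1 * x * y                      # extraction scales with both
        dx = 0.03 * x * min(w, 1.0) - 0.02 * x     # growth requires wealth to consume
        dy = 0.05 * y * (1 - y / 10.0) - harvest   # logistic regrowth minus harvest
        dw = harvest - 0.05 * x                    # production minus consumption
        x = max(x + dx * dt, 0.0)
        y = max(y + dy * dt, 0.0)
        w = max(w + dw * dt, 0.0)
        history.append((x, y, w))
    return history

traj = simulate()
```

Even this cartoon reproduces the qualitative story the HANDY authors tell: extraction can outrun the regeneration of the natural stock, wealth temporarily masks the decline, and the population peaks and falls afterward. The interesting modeling choices are all in the coupling terms, which is exactly where the real paper puts its inequality variables.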

Normally, the appearance of a (draft) article like this would have been a non-event. But those words – inequality, collapse, equitably – somehow caught the attention of Nafeez Ahmed whose blog Earth Insight is hosted by The Guardian. Ahmed also helped create a 2011 documentary called The Crisis of Civilization which is about how global crises – economic as well as environmental – are the “converging symptoms of a single, failed, global system.”


Poster for Ahmed’s 2011 documentary

On March 14, Ahmed wrote about the HANDY paper in an eye-catching piece titled “NASA-funded study: industrial civilization headed for ‘irreversible collapse’?” Facebook shares (120,000+ so far) and tweets (8,000+) got the HANDY paper (and Ahmed) lots of attention. Stories about it appeared on NPR’s web page and in Popular Science, as well as in major national newspapers with hyperbolic headlines like “NASA Predicts the End of Western Civilization.”


Who wouldn’t click on the link to read this?

What struck me as most interesting was how media reports focused not so much on the content of the HANDY paper but on the fact that it was “NASA-sponsored.” At a time when the space agency is increasingly beleaguered in some quarters, it seemed almost quaint that journalists would see the NASA imprimatur – which was not as firmly stamped as Ahmed suggested – as proof of quality. This seemed to harken back to the glory days of the Space Race when “NASA Made This” really meant something.

As one might expect, there was a backlash. For example, Keith Kloor, writing online for Discover.com, critiqued not so much the HANDY paper’s conclusions as the manner in which Ahmed was publicizing them.1 Ahmed counterattacked via blog in the social media version of wash, rinse, repeat…and here we are.

Besides showing how social media and blogs can amplify a non-event into international handwringing, the hullabaloo around the HANDY paper is important for two reasons.

One is historical. The HANDY paper – its methods, its conclusions, and the reaction to them – bears a striking resemblance to the 1972 Limits to Growth report.

Sponsored by the Club of Rome, Limits was announced with a media blitz aimed at policy makers and ambassadors. Its “doomsday timetable” predicted an inevitable collapse of societies all around the planet unless politicians and business leaders had the courage to restrict the growth of populations, industrialization, and resource use. Instead of continued expansion, it called for economic and ecological equilibrium commensurate with a species wholly dependent on limited planetary resources. Extensive computer-based calculations by researchers from MIT provided the Club of Rome with the evidence needed to support its bleak assessment of the future.

Scientists and economists savaged the methodology that produced Limits, but the Club of Rome’s report sent a powerful message about a possible future. Its troubling conclusions compelled more than eight million people to buy copies of Limits, and it was translated into some thirty languages. Limits fit the pattern of other eco-doom books that filled shelves in the late 1960s and throughout the 1970s. The 1970s were a very doomy time, in fact. Fears about the future were not just the province of activists, campus intellectuals, and ecologists. In the early 1970s, the Christian fundamentalist revival in the U.S. coincided with apocalyptic excitation about the future. In The Late Great Planet Earth, former tugboat captain Hal Lindsey used his own interpretation of the Book of Revelation to write what became the best-selling non-fiction book of the 1970s (and was later made into a movie).


Critics attacked Limits throughout the 1970s. Economists who assumed growth was a fundamental tenet of modernity proved particularly hostile. Demand for resources, critics said, was historically contingent: factory owners did not clamor for coal in the 16th century, nor was uranium a desired international commodity until after 1945. As a result, Limits appeared to some people as just a reprise of old Malthusian ideas. That the report was released without peer review, and with a public relations extravaganza, did not help assuage skeptics. Moreover, the data underlying Limits weren’t available for inspection, which aroused scientists’ suspicions further. Even the analytical tools the MIT group used provoked ire. The computer cliché “Garbage In, Garbage Out,” although still relatively novel in 1972, typified many experts’ responses.

Similar critiques of the 2014 HANDY paper have already appeared (here and here and here, for example) although the attacks seemed more aimed at Ahmed’s hyperbolic approach rather than the research itself. (One of the points argued in my book The Visioneers was that Limits served to motivate a cohort of dreamers and visionaries who applied their engineering and science skills to try to circumvent its pessimistic predictions. I’d be curious to hear if any new efforts are catalyzed by articles such as the forthcoming HANDY paper.)


Cover of the 7″ version of R.E.M.’s 1987 hit

The second reason why I think the HANDY article is important has to do with how we imagine doomsday scenarios. Like good science fiction, these tend to reflect not the future but rather the time in which they are put forth. In his 1992 book When Time Shall Be No More, the late historian Paul Boyer surveys the history of apocalyptic thought in American culture. During the Cold War, the End would come via the flash of a thermonuclear explosion. Détente and the environmental movement brought eco-catastrophism to the fore (as Jacob Hamblin so nicely writes about in Arming Mother Nature) as overuse of resources and overpopulation became the mechanisms of our demise.

Today, the collapse of society, as the HANDY article details it, is predicated on the growing divide between the haves and have-nots, as destabilizing inequities lead to overconsumption of resources and eventual conflict. That’s not surprising, given the rise of attention to the 1% vs. the 99%, disparities in CEO/employee pay, the collapse of the middle class, ad infinitum. Studies like the HANDY article may not be guides to the future, but they offer a penetrating glimpse into the present. Every era gets the doomsday mechanism it deserves.

  1. He also dug into the NASA funding issue. Eventually, NASA issued a statement denying that it had sponsored the study or endorsed its conclusions. This was largely a non-issue, though. The draft article notes some support from NASA, while a statement from the National Socio-Environmental Synthesis Center explained: “Motesharrei received minor support from NASA (Award No. NNX12AD03A), through UMD’s Earth System Science Interdisciplinary Center, to develop a coupled earth system model. Some of this funding was spent on the mathematical development of the HANDY model. The research paper was not solicited, directed, or reviewed by NASA. It is an independent study by the researchers utilizing research tools developed for a separate NASA activity.” []