Checks and Stripes

[Image]

What do you see here? Stripes? Or feedback?

Leaping Robot note: Historians, like most of us, like controversies. They provide a valuable lens to look at a range of issues. This post was inspired by an article that appeared recently in Science. It concerns a simmering feud between two groups of researchers over how to interpret some images of gold nanoparticles made with a scanning tunneling microscope (STM). My colleague, Cyrus Mody, a historian of science at Rice University, wrote a prize-winning book called Instrumental Community that tells the story of the invention and spread of scanning probe microscopy.1 So Cyrus is an excellent person to offer a more nuanced reading of this controversy. One might think that the nano-feud is, as one person notes, just a “minor storm in a nano teapot.” Mody’s guest post goes deeper than that and shows how such controversies, besides being rather common, also tell us something important about how scientists (and science journalists) communicate with each other and the public. Here’s Cyrus…

***

The January 24 issue of Science has a news item by Robert Service with the juicy title “Nano-Imaging Feud Sets Online Sites Sizzling,” describing a multiyear tussle over a decade’s worth of science based on some scanning tunneling microscope (STM) images of gold nanoparticles. The STM images are from Francesco Stellacci’s group, first at MIT and then at the Swiss Federal Institute of Technology in Lausanne. They purport to show “stripes” of organic molecules attached to particles with diameters of 20 nanometers or less, in an arrangement resembling lines of latitude. However, a number of critics have insisted – in blogs, Twitter feeds, and elsewhere – that Stellacci’s stripes are actually instrumental artifacts.

I’m not going to weigh in on whether Stellacci’s stripes are real or not. I know some very smart STMers on both sides, so I doubt consensus will be reached soon. Instead, let me do what historians of science usually do: put controversial research in perspective, followed by a point for the defense and a point for the prosecution.

Controversies like this are pretty common. The point of doing forefront research is to push our ability to make and measure stuff right to the limits. Robert Service has made a career out of reporting such disputes, for which historians of science should be grateful. For instance, I’ve made frequent use of his article on a related dispute from 2003, “Molecular Electronics – Next Generation Technology Hits an Early Mid-Life Crisis.” If you look at the history of science you’ll find lots of disputes, sometimes extending over decades, over matters of fact that you would think could be easily resolved. One of my favorite examples is the period of more than thirty years, from the early 1920s to the late 1950s, in which cytogeneticists nearly uniformly agreed that a normal human somatic cell contains 48 chromosomes (rather than the now accepted 46).2 How hard can it be to count chromosomes that are almost a thousand times larger than Stellacci’s nanoparticles? And yet, even something as simple as counting can remain indeterminate (or determined but incorrect) for a very long time.

The instrument Stellacci used to image his nanoparticles, the scanning tunneling microscope, has a particularly rich history of disputes. STM images are made by bringing a sharp metal probe very close to the surface being imaged while maintaining a voltage difference between the probe and the sample. This encourages some electrons to “tunnel” from the sample to the probe and vice versa. Generally, the closer the probe is to the surface, the higher the probability of tunneling, so the strength of the tunneling current is a reasonable proxy for the z-height of the sample at any given x-y position of the probe. As you move the probe around in x and y, you build up a matrix of tunnel-current values, which you can then convert into a three-dimensional image of the sample.
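
To make the raster-scan idea concrete, here is a minimal, purely illustrative Python sketch (not any instrument’s real control code): a simulated tip is stepped over an x-y grid, the tunnel current is taken to fall off exponentially with the tip-sample gap, and the resulting matrix of currents is converted back into an apparent height map. The constants and the surface_height function are made up for the sake of the example.

```python
import numpy as np

# Illustrative constants, not taken from any real instrument
KAPPA = 10.0       # decay constant in 1/nm; tunneling falls off exponentially with the gap
TIP_HEIGHT = 1.0   # nominal tip height above the mean surface, in nm


def surface_height(x, y):
    """A made-up, smooth sample topography (in nm)."""
    return 0.1 * np.sin(2 * np.pi * x) * np.cos(2 * np.pi * y)


def scan(nx=64, ny=64):
    """Raster the tip over an x-y grid and record a matrix of tunnel currents."""
    xs = np.linspace(0.0, 1.0, nx)
    ys = np.linspace(0.0, 1.0, ny)
    current = np.zeros((ny, nx))
    for j, y in enumerate(ys):
        for i, x in enumerate(xs):
            gap = TIP_HEIGHT - surface_height(x, y)      # tip-sample separation
            current[j, i] = np.exp(-2.0 * KAPPA * gap)   # current ~ exp(-2*kappa*gap)
    # Invert the current matrix into an apparent z-height map (the "image")
    return TIP_HEIGHT + np.log(current) / (2.0 * KAPPA)


image = scan()
print(image.shape, float(image.min()), float(image.max()))
```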

[Image: Schematic of how an STM works]

Roughly, the computer to which the STM is hooked up registers a single value for the tunnel current per increment of probe position.  That is, the STM periodically samples the sample, and each sampled value is fed both to the imaging output and to the circuit controlling the height of the probe at the next increment.  Too strong a feedback can make the probe constantly overshoot and then play catch-up, so even a smooth surface will seem to have undulations.  Unfortunately, the phenomena researchers are looking for on a surface are also often periodic in nature – as, for instance, are Stellacci’s stripes.  If you think you’ve made a sample with stripy features, you want to get an STM image with alternating patches of light and dark, up and down.  A nice analogy, suggested to me by my Rice colleague Kevin Kelly, is of a strobe light illuminating water coming out of a tap.  If the strobe samples the water at the right rate, we see a series of drops accelerating under the force of gravity.  If the strobe frequency and duration are varied, though, we might see a smooth flow of water or we might see water drops that appear to move upward into the tap.
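
To see how feedback alone can paint undulations onto a smooth surface, here is a toy Python sketch of a one-dimensional line scan with a crude proportional feedback loop. The feedback_scan helper and the gain values are hypothetical, not taken from any actual STM controller. With a gentle gain the tip settles quietly after a single genuine step in the surface; with an overly strong gain it overshoots and then plays catch-up, and that ringing would show up in the image as ripples that are not really there.

```python
import numpy as np


def feedback_scan(surface, gain, setpoint_gap=1.0):
    """Trace a 1-D line scan with a crude proportional feedback loop.

    At each x increment the controller nudges the tip toward the gap setpoint;
    the recorded "topography" is simply the tip height at each step.
    """
    tip = surface[0] + setpoint_gap
    trace = []
    for z in surface:
        error = (tip - z) - setpoint_gap   # how far the gap is from the setpoint
        tip -= gain * error                # one feedback correction per increment
        trace.append(tip)
    return np.array(trace)


# A perfectly smooth surface with one genuine step in the middle
surface = np.zeros(200)
surface[100:] = 0.3

gentle = feedback_scan(surface, gain=0.5)   # settles smoothly after the step
twitchy = feedback_scan(surface, gain=1.8)  # overshoots, then plays catch-up

print("gentle :", np.round(gentle[100:108], 3))   # monotone approach to 1.3
print("twitchy:", np.round(twitchy[100:108], 3))  # ringing above and below 1.3
```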

So it’s not hard to find people who are skeptical of claims that a particular STM image indicates the existence of some periodic nanostructure, particularly if the distance between periodic features is near the limits of the instrument’s resolution.  Perhaps the most famous such case involved STM images of DNA made in the late 1980s and early 1990s.

[Image: Cover of Nature showing a false-colour STM image with a DNA molecule running from bottom left to top right]

At the time, many STMers harbored hopes that their instrument could resolve the base pairs in a strand of DNA and might therefore be used in genetic sequencing.  Lots of STMers tried to image DNA; several groups published such images; one Caltech group even managed to get an atomic resolution image of DNA on the cover of Nature that appeared to show a helix with the right pitch distance (the linear distance between two turns of the helix).  And yet, there were lots of good reasons to think that an organic molecule like DNA should be difficult to image via electron tunneling.  Whispers began to circulate that some of the best images of “DNA” were actually images of complex defects in a graphite substrate that might or might not have any DNA deposited on it.  In the end, images such as the 1990 Nature cover were never definitively disproved, but the uncertainties surrounding their validity became so insurmountable that almost everyone moved away from trying to image DNA with an STM.  In the late ‘90s and early 2000s, the three or four remaining groups showed that DNA could be imaged, but only under very specific and difficult conditions completely unlike those used in the early ‘90s, conditions that, so far, have barred using an STM for genetic sequencing.

The STM of DNA case, then, leads to my two points about Stellacci’s stripes.  My point for the defense is that even a result that turns out, after vigorous debate, to be wrong can be extraordinarily productive.  Lots of people got into STM on the basis of those images of “DNA”, even if it’s still uncertain whether there was actually any DNA there.  The debate about those images led to a general tightening of standards for STM image production and interpretation, and a better understanding of which applications STM was and was not good for.  The DNA boom provided an early market for commercial STMs that gave manufacturers the revenue to build better versions that could be used in more appropriate ways.  Since the vast majority of scientific findings are ignored, a finding that turns out to be questionable but which is actually taken up for productive debate is doing pretty well.  I don’t have the expertise to say whether Stellacci is correct or not, but even if the consensus emerges that he is wrong, it looks to me like he and his skeptics will still have managed to move the field forward, not back.  (Conversely, if his skeptics turn out to be wrong, they also will still have done the field – and Stellacci himself – a great service).

My point for the prosecution is that some of the criticism of Stellacci’s skeptics’ methods is misplaced. Service’s article offers a number of quotes from Stellacci and his allies complaining that the skeptics have used extrascientific means to carry out the debate – that they have resorted to blog posts and non-peer-reviewed articles instead of remaining within the arena of peer-reviewed journals. That’s a rather ahistorical view of how scientific controversies proceed. Peer-reviewed journals are, of course, an important mechanism for vetting the validity of scientific contributions, but we all know that peer review is slow and hardly error-free. Often, peer-reviewed articles only make sense within some ecology of other forms of communication.3 In the “STM of DNA” controversy, the thread of argument left quite a light footprint in peer-reviewed articles – it’s difficult to piece together, just from published texts, who said what when, much less when and why various actors changed their minds about STM of DNA. Most of the influential voices used conference presentations and post-presentation conversations to persuade themselves, each other, and the rest of the community that there were serious problems with STM of DNA. Some of the forms of communication used by Stellacci’s skeptics today (blogs and Twitter) didn’t exist back then, but if they had you can bet they would’ve been used too. If history is anything to go by (and I hope it is!) there’s nothing inherently unscientific about using any mode of communication you can to get your point across.


If you liked the analysis and narrative here, you’ll probably like Cyrus’s book (which won the 2013 Cushing Memorial Prize).

  1. By forming a community, he argues, these researchers were able to innovate rapidly, share the microscopes with a wide range of users, and generate prestige (including the 1986 Nobel Prize in Physics) and profit (as the technology found applications in industry). Mody shows that both the technology of probe microscopy and the community model offered by the probe microscopists contributed to the development of political and scientific support for nanotechnology and the global funding initiatives that followed.
  2. Described in Aryn Martin, “Can’t Any Body Count? Counting as an Epistemic Theme in the History of Human Chromosomes,” Social Studies of Science (2004).
  3. For a particularly engaging article on this topic with a self-explanatory title, see Bruce Lewenstein, “From Fax to Facts: Communication in the Cold Fusion Saga,” Social Studies of Science (1995).

Atoms to Art

About three weeks ago, IBM announced it had made the world’s smallest movie. Scientists at its research facility in San Jose used a scanning tunneling microscope (STM) to drag several score carbon monoxide molecules around on a super-cooled copper surface.1 They recorded 250 frames of stop-motion action to create a short film called “A Boy and His Atom.”2 According to IBM’s press release, the movie – set to a playful soundtrack – “depicts a character named Atom who befriends a single atom and goes on a playful journey that includes dancing, playing catch and bouncing on a trampoline.”

[Image: Still from IBM’s atomic film “A Boy and His Atom”]

Given the movie’s low resolution, I think the film could (and should) just as easily have been called “A Girl and Her Atom,” which would have been a lot cooler…what better way to say that women are welcome in science and that this isn’t all just about boys and their toys (and their atoms)?

Anyway, this was not the first time that researchers from IBM made international headlines with their STM skills. In 1979, Heinrich Rohrer (who passed away last week) and Gerd Binnig, two scientists at an IBM lab in Zurich, began work on developing the STM.3 After Binnig and Rohrer announced their results in 1982, a flood of publications about the new instrument’s capabilities appeared in specialist journals. Meanwhile, continued improvements provided the basis for hundreds of patents as entrepreneurial researchers commercialized the STM. Within a few short years, the STM and its variants became ubiquitous instruments in labs and factories around the world.4 In 1986, Binnig and Rohrer shared the Nobel Prize in Physics for their work.

During the 1990s, the STM became a poster child for nanotechnology. This was partly due to the fact that researchers soon realized they could use the instrument not just to image atoms but also to move them around. In late 1989, at the same IBM lab where “A Boy and His Atom” was later made, Donald Eigler and Erhard Schweizer sprayed xenon vapor onto a carefully prepared nickel substrate chilled by liquid helium. By bringing the STM tip close to the sample, they learned how to slide and drop individual xenon atoms. In their demonstration of the ability to “fabricate rudimentary structures of our own design,” they precisely placed 35 xenon atoms on a small piece of nickel, cooled to almost absolute zero and held under an ultra-high vacuum, to make an IBM logo just three billionths of a meter long.5

[Image: Xenon atoms used to spell IBM; from Eigler and Schweizer’s 1990 Nature paper]

Moving atoms around and putting them precisely where one wanted was exciting and newsworthy. Engineering at the nano-scale, as The New York Times described it, was not only potentially technologically ground-breaking but a downright nano-adventure.6 IBM’s nanoscale logo became one of the most iconic scientific images of the 1990s, the original Nature article announcing the feat was cited hundreds of times, and STM-generated images took a place at the intersection of scientific experimentation and artistic expression.

Eigler and Schweizer’s work coincided with the media’s growing interest in nanotechnology, and their technical tour de force brought widespread attention to lab-based nanotechnology. The timing was key. “Nanotechnology” was trying to emerge from the shadow of Eric Drexler’s more radical (and biologically based) version of nanotech associated with self-replicating nanoscale “assemblers” that would build things molecule by molecule with atomic precision. The STM would become one of the innovations touted to Congress when researchers and science managers began to lobby the Clinton administration for what became the multi-billion dollar National Nanotechnology Initiative.

IBM’s latest nanoscale accomplishment connects back to this earlier history in two notable ways. First, the original impetus for developing the STM was the search for improved computer technologies.7 “A Boy and His Atom” was made as part of IBM’s larger effort to explore the limits of data storage. In 2012, the company successfully demonstrated the ability to store information in as few as 12 magnetic atoms.8 The image below shows a magnetic byte imaged five times in different magnetic states to store the ASCII code for each letter of the word THINK, IBM’s corporate mantra.

[Image: A magnetic byte imaged in different magnetic states, spelling out THINK]
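
As an aside, the encoding behind the THINK demo is easy to mimic on paper: each of the five states was simply the ASCII code of one letter, written into the same eight magnetic bits. The snippet below is a purely illustrative Python sketch of that bookkeeping, not anything from IBM’s own software.

```python
# Purely illustrative: the ASCII value of each letter of "THINK", written out
# as the eight bit values the magnetic byte was switched through.
for letter in "THINK":
    code = ord(letter)              # ASCII code point of the letter
    bits = format(code, "08b")      # the eight magnetic bits for that state
    print(f"{letter}: {code:3d} -> {bits}")
```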

But the second and far more interesting parallel is the use of the STM to create images in the first place. Here art of a sort serves as evidence of technological virtuosity. Eigler and Schweizer’s image was reproduced scores of times in the 1990s to make the point that researchers could manipulate and precisely position atoms. In some interpretations, this was seen as the first step toward being able to fabricate things from the atomic scale upwards.9 Researchers at other labs, meanwhile, used the STM to draw all sorts of other images. More than two decades later, IBM’s researchers again chose to showcase their acumen via carefully constructed images – in this case about 250 of them strung together to make a short movie. The very last frames of the film again show “IBM” spelled out, perhaps in homage to Eigler and Schweizer’s earlier work.

Doing art with atoms became one way of showing what a nano-future might be like (or at least trying to make it more comprehensible to non-experts). In 2003-2004, the Los Angeles County Museum of Art sponsored an exhibit called nano. According to the book that came out in conjunction with LACMA’s show, the exhibit’s focus was the “idea of scale intrinsic to nanotechnology,” with exhibits designed to give visitors “experiences suggestive of what it would be like to be a nanoparticle” and so forth.10 The LACMA show – with its focus on scale and the relationship of size to ordinary human experience – calls to mind the famous 1977 film Powers of Ten by Ray and Charles Eames.

[Image: Scene from the 1977 film Powers of Ten]

It’s easy to see how IBM’s short film – amazing but also a little gimmicky – links different attempts by Big Blue to build improved computer and data storage techniques. But I think we can also see this short film sitting on the fringe of a bigger and more important project. In the past year, the “STEM to STEAM” movement has gained some momentum in the academic community and in conjunction with the NSF. Like the LACMA exhibit, the STEAM movement hopes to foster collaborations between scientists, artists, and humanists. Besides highlighting the difficulties and rewards of interdisciplinary collaboration, cooperative efforts like STEAM can help show nanotechnology (or other areas of research) as cultural and social products, not just technoscience done for academic credit or corporate reward. I’m not saying IBM’s stunt itself heralds some bold new era of interaction between different communities…but continued, deeper, and more thoughtful efforts like this could yield valuable dividends for scientists and humanists. Maybe playing around with atoms could be a start toward something bigger.

  1. The copper 111 crystal plane was used; IBM chose copper because that element, in combination with carbon monoxide, provides a stable materials combination.
  2. It already has a Wikipedia entry.
  3. Gerd Binnig and Heinrich Rohrer, “Scanning Tunneling Microscopy: From Birth to Adolescence,” Reviews of Modern Physics, 1987, 59, 3: 615-25. Rather than the lenses and mirrors a traditional optical microscope uses to produce an image, the new microscope used a sharp tip to probe the surface of a metal or semiconductor sample. By creating a voltage difference between the probe tip and the sample and then bringing the tip very close to the sample, some electrons would “tunnel” between the two. If Binnig and Rohrer moved their probe tip back and forth over the sample’s surface, they could measure the changing strength of the tunneling current. Then, by keeping the current constant and continuing to scan with the probe tip, they could capture and convert the electrical signal to produce an atomic-scale image of a sample’s surface.
  4. My colleague Cyrus Mody at Rice University has written an excellent – and prize-winning – history of the STM called Instrumental Community: Probe Microscopy and the Path to Nanotechnology (Cambridge: The MIT Press, 2011)…highly recommended!
  5. D. M. Eigler and E. K. Schweizer, “Positioning Single Atoms with a Scanning Tunnelling Microscope,” Nature, 1990, 344, 6266: 524-26; Malcolm W. Browne, “2 Researchers Spell ‘IBM,’ Atom by Atom,” The New York Times, April 5, 1990, B11.
  6. Eigler’s own lab notebook entry for one experiment in February 1990 reads: “Success at pick up…Success at put down…I am really having fun!” Others could learn how to do this feat as well. One reporter described how, with Eigler coaching him, he managed to nudge a few atoms about. Charles Siebert, “The Next Frontier: Invisible,” The New York Times Magazine, September 29, 1996, 137.
  7. Binnig and Rohrer were looking to find a way to better characterize thin superconducting films as part of IBM’s long-running program to build a supercomputer using Josephson junctions. As Cyrus Mody relates, by the early 1980s, IBM was spending about $20 million a year on the program and had about 150 people working on it.
  8. From the press release: this involved using an STM and a “grouping of twelve antiferromagnetically coupled atoms that stored a bit of data for hours at low temperatures. Taking advantage of their inherent alternating magnetic spin directions, they demonstrated the ability to pack adjacent magnetic bits much closer together than was previously possible. This greatly increased the magnetic storage density without disrupting the state of neighboring bits.”
  9. To be fair, as Eigler himself noted, atom manipulation via STM remained a “laboratory tool,” not a “manufacturing tool.” The carefully prepared crystals on which xenon and other atoms were so delicately placed had to be cooled close to absolute zero, hardly the conditions for assembly-line production.
  10. This is described in N. Katherine Hayles, ed., Nanoculture: Implications of the New Technoscience (Portland, OR: Intellect Books, 2004). One of the most remarkable parts of the LACMA show was “nano-mandala,” an art/science project by Victoria Vesna, a media artist, and James Gimzewski, a nanoscientist; both are professors at UCLA.