The Once & Future Nano King*

K. Eric Drexler is back…and with a new set of bold ideas about the technological future, all detailed in his new book, Radical Abundance

[Image: Drexler, shown in 2013, reading from his new book.]

But there’s a twist! The man once christened by magazines like The Economist as the “father” of nanotechnology is now imagining possible tomorrows made better without the nano-word. More on this semantic shift below. But first – whatever happened to him?

Drexler’s story resembles those of once-prominent Communist Party members who fell into disfavor, vanished for a period and then, after some political rehabilitation, emerged back on the public stage.

In brief – Drexler first became interested in what he initially called “molecular engineering” in the late 1970s. A 1981 paper published in the Proceedings of the National Academy of Sciences laid out his general plan for “molecular manipulation.” (The word “nanotechnology” appeared nowhere in the manuscript. In fact, it would still be a few more years before Drexler began using the nano-word in talks and papers.) His plan rested, he predicted, on the design of de novo protein-based machines which would be able to move and position other molecules so as to build up structures and devices atom by atom. A 1986 book called Engines of Creation brought Drexler’s ideas and the concept of nanotechnology to a wide audience. Tech enthusiasts, business leaders, mainstream scientists, and policy makers took note.

[Image: Drexler’s 1986 book and various translations.]

In Engines and other writings, Drexler took pains to describe his work not as scientific research but as “exploratory engineering” – designing things today that obey the laws of physics, chemistry, and so on, but which we cannot yet build. Konstantin Tsiolkovskii and other early space (as well as computer) pioneers worked in a similar fashion.

When it came to nanotechnology, Drexler literally defined it with his entry in Encyclopedia Britannica’s Yearbook of Science and the Future 1990. Throughout much of the 1990s, when journalists and the general public considered nanotechnology, it was shot through with Drexlerian ideas (the nanobot meme, something Drexler himself eschewed, is a classic example of how some of his own imaginings were co-opted and adopted by others). Given the emerging differences between Drexler’s supporters and those in the mainstream research community who found his ideas too fanciful – a rift which widened into a chasm over the next decade – one senses that Drexler’s visioneering was succeeding perhaps too well. In the late 1990s and into the early 21st century, mainstream scientists gradually marginalized Drexler and his ideas (my book The Visioneers describes all of this in great detail).

So, in 2003, when President George W. Bush signed the 21st Century Nanotechnology Research and Development Act, a number of people stood behind him in the Oval Office. We see a Silicon Valley venture capitalist, a nano-business advocate, a Republican senator, a Nobel Prize-winning scientist. But Drexler was nowhere in sight.

[Image: Where’s Drexler?]

And the bill Bush signed bore scant resemblance to the type of nanotechnology Drexler had long promoted. In fact, by the time the National Nanotechnology Initiative was proceeding full-bore, Drexler was “the name that can’t be spoken in polite society,” or at least among many mainstream scientists and policy makers.1

But that is changing. Since 2011, Drexler has been in residence at the University of Oxford. He is currently listed as an “Academic Visitor” in the Oxford Martin Programme on the Impacts of Future Technology. (A video of him giving an address there is here…it’s worth watching if you have the time.) Drexler’s move to the U.K. is part of this rehabilitation. What could be more Establishment than Oxford? It also fits his overall career pattern of finding affiliations with elite schools (MIT, Stanford) while avoiding the traditional professorial career path. Drexler has also been giving more public talks and writing for the mainstream press, such as essays for The Guardian. Accompanying this are appearances at venues like TEDx, signs that Drexler is placing himself back in the role of technology intellectual and public figure.

[Image: Drexler at a 2013 TEDx event in Lisbon.]

But the signature event of Drexler’s rehabilitation back into public life (if not into the mainstream of scientific or engineering research) is the 2013 publication of his new book. In the same spirit as Engines, Radical Abundance is aimed at a popular audience. It’s much different from his highly technical 1992 tome Nanosystems, for example.

I’ll be saying more about the book itself in a later post. But without giving too much away, Radical Abundance is Drexler’s pivot away from nanotechnology. In his telling, that word has become much too politicized (thanks, in no small part, to his own writings) and vaguely applied. I mean, what exactly IS nanotechnology anyway? Is it passive nanoscale particles? Active nanoscale devices? A novel approach to building new materials? Or a massive government-run program? Well, it’s all of these and more. I tend to agree with Richard Jones’ interpretation and see nanotechnology more as a sociological phenomenon, a way of organizing and bridging research across disparate scientific fields. (The same can be said for the Obama administration’s current attempts to fashion a neuroscience initiative around brain mapping.)2

So, in Radical Abundance, Drexler has done away with nanotechnology and replaced it with — wait for it — “atomically precise manufacturing” (APM). This means two things in his telling – “manufacturing using machinery based on nanoscale devices” and “products built with atomic precision.”3 Drexler’s new term is, well, much more precise.

[Image: A computer simulation of a nanoscale planetary gear. (Illustration: K. Eric Drexler/Nanorex Inc.)]

It’s about building things, not researching, for instance, the toxicological effects of nanoparticles on fish. And “manufacturing” connotes industry, which suggests jobs. This is a good strategy. But, at its core, APM harks back to his earlier program of building things atom by atom. APM evokes a world in which mechanical engineering combines with chemistry. And, just as in Engines, Radical Abundance offers plenty of examples of the promise and peril APM would bring if adopted (which Drexler sees as pretty much inevitable).

I’m looking forward to reading Radical Abundance more closely. I’m very intrigued by the first skim I’ve made, especially those sections in which Drexler offers his view of the history of nanotechnology (and his removal from “official” narratives of same). The demarcation Drexler is trying to make between nanotechnology (old, confused) and “atomically precise manufacturing” (new, specific) is fascinating, as are the borders he draws between science and engineering.

[Image: Mick in 1978…a long way from Exile on Main St.]

But what intrigues me most is the larger process of rehabilitation that I see taking place. We see attempts to do this quite frequently on the part of celebrities, musicians, and politicians. Sometimes this is done to craft a new public image (like the Rolling Stones going disco) or to try to atone for past sins (Newt Gingrich, ad infinitum). How common is this among scientists or technologists?

Sometimes such attempts at reinvention work. But many times – often? –  those attempting such feats end up singing the same songs and sinning as before.

* The joke here – probably not a good one – is that Drexler was also once part of a commercial venture called NanoRex.

 

  1. 2006 remark by physicist Richard A.L. Jones, quoted in Arie Rip and Marloes Van Ameron, “Emerging De Facto Agendas Surrounding Nanotechnology: Two Cases Full of Contingencies, Lock-Outs, and Lock-Ins,” in Governing Future Technologies, edited by Mario Kaiser, et al. (New York: Springer, 2010), 131-55.
  2. A recent article in Nature about the BRAIN initiative described recent efforts as “a large-scale sociological experiment, as the sprawling neuroscience community struggles to coalesce around a common research plan under intense public scrutiny and tough financial constraints.”
  3. Radical Abundance, x.

Atoms to Art

About three weeks ago, IBM announced it had made the world’s smallest movie. Scientists at its research facility in San Jose used a scanning tunneling microscope (STM) to drag several score of carbon monoxide molecules around on a super-cooled copper surface.1 They recorded about 250 frames of stop-motion action to create a short film called “A Boy and His Atom.”2 According to IBM’s press release, the movie – set to a playful soundtrack – “depicts a character named Atom who befriends a single atom and goes on a playful journey that includes dancing, playing catch and bouncing on a trampoline.”

[Image: A still from IBM’s atomic film.]

Given the movie’s low resolution, I think the film could (and should) just as easily have been called “A Girl and Her Atom,” which would have been a lot cooler…what better way to say that women are welcome in science and that this all isn’t just about boys and their toys (and their atoms)?

Anyway – this was not the first time that researchers from IBM made international headlines with their STM skills. In 1979, Heinrich Rohrer (who passed away last week) and Gerd Binnig, two scientists at an IBM lab in Zurich, began work on developing the STM.3 After Binnig and Rohrer announced their results in 1982, a flood of publications about the new instrument’s capabilities appeared in specialist journals. Meanwhile, continued improvements provided the basis for hundreds of patents as entrepreneurial researchers commercialized the STM. Within a few short years, the STM and its variants became ubiquitous instruments in labs and factories around the world.4 In 1986, Binnig and Rohrer shared the Nobel Prize in Physics for their work.

During the 1990s, the STM became a poster child for nanotechnology. This was partly due to the fact that researchers soon realized they could use the instrument not just to image atoms but also to move them around. In late 1989, at the same IBM lab where “A Boy and His Atom” was later made, Donald Eigler and Erhard Schweizer sprayed a carefully prepared nickel substrate, chilled by liquid helium, with xenon vapor. By bringing the STM tip close to the sample, they learned how to slide and drop individual xenon atoms. In their demonstration of the ability to “fabricate rudimentary structures of our own design,” they precisely placed 35 xenon atoms on a small piece of nickel, cooled to almost absolute zero and held under an ultra-high vacuum, to make an IBM logo just three billionths of a meter long.5

[Image: Xenon atoms used to spell IBM; from the 1990 Nature paper by Eigler and Schweizer.]

Moving atoms around and putting them precisely where one wanted was exciting and newsworthy. Engineering at the nano-scale, as The New York Times described it, was not only potentially ground-breaking technologically but also a downright nano-adventure.6 IBM’s nanoscale logo became one of the most iconic scientific images of the 1990s, the original Nature article announcing the feat was cited hundreds of times, and STM-generated images took a place at the intersection of scientific experimentation and artistic expression.

Eigler and Schweizer’s work coincided with the media’s growing interest in nanotechnology, and their technical tour de force brought widespread attention to the lab-based variety. The timing was key. “Nanotechnology” was trying to emerge from the shadow of Eric Drexler’s more radical (and biologically based) version of nanotech, associated with self-replicating nanoscale “assemblers” that would build things molecule by molecule with atomic precision. The STM would become one of the innovations touted to Congress when researchers and science managers began to lobby the Clinton administration for what became the multi-billion dollar National Nanotechnology Initiative.

IBM’s latest nanoscale accomplishment connects back to this earlier history in two notable ways. First, the original impetus for developing the STM was the search for improved computer technologies.7 “A Boy and His Atom” was made as part of IBM’s larger effort to explore the limits of data storage. In 2012, the company successfully demonstrated the ability to store a bit of information in as few as 12 magnetic atoms.8 The image below shows a magnetic byte imaged 5 times in different magnetic states to store the ASCII code for each letter of the word THINK, IBM’s corporate mantra.

[Image: IBM’s 12-atom-per-bit magnetic byte, rewritten to spell out THINK.]
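
As a back-of-the-envelope illustration of what that demo implies – my own sketch in Python, not IBM’s code or data – here are the 8-bit ASCII patterns written into that byte, using the 12-atoms-per-bit figure from IBM’s announcement:

```python
# Toy illustration (not IBM's code): the 8-bit ASCII pattern for each letter
# of THINK, written one letter at a time into a single magnetic "byte" made
# of 12 antiferromagnetically coupled atoms per bit (per IBM's 2012 demo).
ATOMS_PER_BIT = 12
BITS_PER_BYTE = 8

for letter in "THINK":
    bits = format(ord(letter), "08b")   # 8-bit ASCII code for the letter
    print(f"{letter}: {bits}")

print(f"One atomic byte = {ATOMS_PER_BIT * BITS_PER_BYTE} atoms, "
      f"rewritten {len('THINK')} times to spell the word")
```

The point is simply that the entire byte occupies fewer than a hundred atoms.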

But the second and far more interesting parallel is the use of the STM to create images in the first place. Here art of a sort serves as evidence of technological virtuosity. Eigler and Schweizer’s image was reproduced scores of times in the 1990s to make the point that researchers could manipulate and precisely position atoms. In some interpretations, this was seen as the first step toward being able to fabricate things from the atomic scale upwards.9 Researchers at other labs, meanwhile, used the STM to draw all sorts of other images. More than two decades later, IBM’s researchers again chose to showcase their acumen via carefully constructed images – in this case about 250 of them strung together to make a short movie. The very last frames of the film again show “IBM” spelled out, perhaps in homage to Eigler and Schweizer’s earlier work.

Doing art with atoms became one way of showing what a nano-future might be like (or at least of trying to make it more comprehensible to non-experts). In 2003-2004, the Los Angeles County Museum of Art sponsored an exhibit called nano. According to the book that came out in conjunction with LACMA’s show, the exhibit’s focus was the “idea of scale intrinsic to nanotechnology,” with exhibits designed to give visitors “experiences suggestive of what it would be like to be a nanoparticle” and so forth.10 The LACMA show – with its focus on scale and the relationship of size to ordinary human experience – calls to mind the famous 1977 film Powers of Ten by Ray and Charles Eames.

[Image: Scene from the 1977 film Powers of Ten.]

It’s easy to see how IBM’s short film – amazing but also a little gimmicky – links different attempts by Big Blue to develop improved computing and data storage technologies. But I think we can also see this short film sitting on the fringe of a bigger and more important project. In the past year, the “STEM to STEAM” movement has gained some momentum in the academic community and in conjunction with the NSF. Like the LACMA exhibit, the STEAM movement hopes to foster collaborations between scientists, artists, and humanists. Besides highlighting the difficulties and rewards of interdisciplinary collaboration, cooperative efforts like STEAM can help show nanotechnology (or other areas of research) as cultural and social products as well as technoscience done solely for academic credit or corporate rewards. I’m not saying IBM’s stunt itself heralds some bold new era of interaction between different communities…but continued, deeper, and more thoughtful efforts like this could yield valuable dividends for scientists and humanists. Maybe playing around with atoms could be a start toward something bigger.

  1. The copper (111) crystal plane was used; IBM chose copper because that element, in combination with carbon monoxide, provides a stable materials combination.
  2. It already has a Wikipedia entry.
  3. Gerd Binnig and Heinrich Rohrer, “Scanning Tunneling Microscopy: From Birth to Adolescence,” Reviews of Modern Physics, 1987, 59, 3: 615-25. Rather than the lenses and mirrors a traditional optical microscope uses to produce an image, the new microscope used a sharp tip to probe the surface of a metal or semiconductor sample. By creating a voltage difference between the probe tip and the sample and then bringing the tip very close to it, Binnig and Rohrer could get some electrons to “tunnel” between the two. If they moved their probe tip back and forth over the sample’s surface, they could measure the changing strength of the tunneling current. Then, by keeping the current constant and continuing to scan with the probe tip, they could capture and convert the electrical signal to produce an atomic-scale image of a sample’s surface. (A toy sketch of this constant-current feedback loop appears after these notes.)
  4. My colleague Cyrus Mody at Rice University has written an excellent – and prize-winning – history of the STM called Instrumental Community: Probe Microscopy and the Path to Nanotechnology (Cambridge: The MIT Press, 2011)…highly recommended!
  5. D. M. Eigler and E. K. Schweizer, “Positioning Single Atoms with a Scanning Tunnelling Microscope,” Nature, 1990, 344, 6266: 524-26; Malcolm W. Browne, “2 Researchers Spell ‘IBM,’ Atom by Atom,” The New York Times, April 5, 1990, B11.
  6. Eigler’s own lab notebook entry for one experiment in February 1990 reads, “Success at pick up…Success at put down…I am really having fun!” Others could learn to do this as well. One reporter described how, with Eigler coaching him, he managed to nudge a few atoms about. Charles Siebert, “The Next Frontier: Invisible,” The New York Times Magazine, September 29, 1996, 137.
  7. Binnig and Rohrer were looking to find a way to better characterize thin superconducting films as part of IBM’s long-running program to build a supercomputer using Josephson junctions. As Cyrus Mody relates, by the early 1980s, IBM was spending about $20 million a year on the program and had about 150 people working on it.
  8. From the press release; this involved using an STM and a “grouping of twelve antiferromagnetically coupled atoms that stored a bit of data for hours at low temperatures. Taking advantage of their inherent alternating magnetic spin directions, they demonstrated the ability to pack adjacent magnetic bits much closer together than was previously possible. This greatly increased the magnetic storage density without disrupting the state of neighboring bits.”
  9. To be fair, as Eigler himself noted, atom manipulation via the STM remained a “laboratory tool,” not a “manufacturing tool.” The carefully prepared crystals on which xenon and other atoms were so delicately placed had to be cooled close to absolute zero, hardly the conditions for assembly-line production.
  10. This is described in N. Katherine Hayles, ed., Nanoculture: Implications of the New Technoscience (Portland, OR: Intellect Books, 2004). One of the most remarkable parts of the LACMA show was “nano-mandala,” an art/science project by Victoria Vesna, a media artist, and James Gimzewski, a nanoscientist; both are professors at UCLA.
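
Footnote 3 above describes how constant-current STM imaging works: the tunneling current falls off roughly exponentially with the tip-sample gap, so a feedback loop raises and lowers the tip to hold the current at a setpoint, and the recorded tip height traces the surface. Here is a minimal toy model of that loop in Python – my own sketch with made-up numbers for the decay constant, gap, and gain, not anything taken from Binnig and Rohrer’s papers:

```python
# A minimal toy model of constant-current STM imaging (illustration only;
# all numbers below are assumed, order-of-magnitude placeholders).
import math

KAPPA = 10.0          # tunneling decay constant, 1/nm (assumed)
GAP_SETPOINT = 0.5    # desired tip-sample gap in nm (assumed)
CURRENT_SETPOINT = math.exp(-2 * KAPPA * GAP_SETPOINT)
GAIN = 0.02           # feedback gain on the log-current error (assumed)

def surface_height(x):
    """Pretend atomic corrugation along the scan line, in nm."""
    return 0.05 * math.cos(2 * math.pi * x / 0.3)

def tunneling_current(gap):
    """Toy tunneling law: current decays exponentially with the gap."""
    return math.exp(-2 * KAPPA * gap)

tip_z = GAP_SETPOINT + surface_height(0.0)   # start at the setpoint gap
trace = []
for step in range(400):
    x = step * 0.005                          # scan position in nm
    gap = tip_z - surface_height(x)
    current = tunneling_current(gap)
    # Feedback: nudge the tip up or down so the current returns to its setpoint.
    tip_z += GAIN * math.log(current / CURRENT_SETPOINT)
    trace.append((round(x, 3), round(tip_z, 4)))

# The recorded tip heights are the line-scan "image" of the surface.
print(trace[::80])
```

The toy loop corrects on the logarithm of the current, which turns the exponential dependence on the gap into a simple linear error signal; the printed tip heights rise and fall with the pretend atomic corrugation, which is the whole trick behind those STM images.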

Brain Mapping and the *-omics of Everything

“It is not down in any map; true places never are.”

-Herman Melville


I recently gave a talk at the Tech Museum of Innovation in San Jose. Afterwards, John Markoff from The New York Times came up and introduced himself. I’m a fan of his work – I enjoyed his 2005 book What the Dormouse Said on the interplay between the counterculture and the computer industry and have also used it as source material for my own writing.1 I asked John what he was working on and he just told me to keep an eye on tomorrow’s paper.

Markoff’s story – front-page, above the fold – revealed how the Obama administration is planning a major new initiative to map the active human brain. (Note: On February 25, a day after I first posted this, a follow-up article by Markoff appeared on-line.) As Markoff described it, the Brain Activity Map (BAM) project “would do for the brain what the Human Genome Project did for genetics.” The BAM project – which would be much more challenging than mapping the human genome – even got a shout out in Obama’s 2013 State of the Union address when the President noted that “our scientists are mapping the human brain to unlock the answers to Alzheimer’s” and hinted at the good things to follow if this work continued.

BAM’s basic goal is to create a dynamic map of the brain – it’s not clear whose grey matter will be studied, but I can only imagine the field day that Institutional Review Boards will have with this one day – rather than a static one. A June 2012 paper published in Neuron explains BAM’s goals.2 (The paper’s 6 co-authors are listed alphabetically. Together they are an odd ensemble of star scientists, including Berkeley chemist Paul Alivisatos and Harvard geneticist George Church, whom I wrote about in an earlier post). According to their paper, the BAM project will “transcend the ‘structural connectome’” studied (and popularized) by people such as MIT’s Sebastian Seung.

[Image: Title and abstract of the 2012 Neuron article.]

What intrigued me even more than BAM’s ambitious goals were the proposed methods. These would draw on some of the radical ideas for nanotechnology that have circulated since the mid-1980s in pop-science works like Eric Drexler’s Engines of Creation. For example, as Markoff described it, a “fleet of molecule-sized machines” – called “nanoprobes” in the Neuron paper – could noninvasively sense and record spikes of activity from every neuron.3 This massive amount of data would be wirelessly transmitted and recorded, an activity that would require formidable information processing capabilities. One approach might be “novel techniques” enabled by advances in synthetic biology. For example, DNA polymerases could be used as “spike sensors” while “prechosen DNA molecules could be synthesized” to record data patterns.

BAM’s advocates make an economic as well as a scientific argument for why the government should fund this decade-long endeavor at about $300 million per year. As Markoff points out, a recent Battelle study states that the $3.8 billion the feds spent on the Human Genome Project “drove $796 billion in economic impact” and “created 310,000 jobs.” Obama referenced these figures in his SOTU address. The Neuron paper predicted the BAM project might lead to similar “technological breakthroughs” at the interface of biotechnology and nanotechnology. These could include “intelligent nanosystems for fundamental investigations in the life sciences, medicine, engineering, and environmental applications; capabilities for storage and manipulation of massive data sets; and development of biologically inspired, computational devices.” In short – few tech sectors would be untouched by a major brain mapping project.

Several aspects of the BAM project intrigue me. As a historian, I’m fascinated by the invocation of the Human Genome Project (HGP) as an analogy for why BAM should be done. George Church explained that the “genome project arguably began in 1984, where there were a dozen of us who were kind of independently moving in that direction but didn’t really realize there were other people who were as weird as we were.” Church clearly hopes to replicate (in a good way) some of that weird chemistry with BAM. I’d be really interested to hear from my history of biology colleagues as to ways in which the HGP has been harnessed to make the case for other big science projects. Also – can/should we think of ourselves as being products of our genome or our connectome?4

[Image: Are you your connectome or your genome?]

As I’ve explored in previous posts here and here, historical analogies have great power, and their use should also come with some scrutiny. For example, neither Markoff’s article nor the Neuron piece commented on the fact that the HGP was spurred on by competition with Craig Venter and Celera Corporation. Might BAM eventually become a public-private hybrid or a competition between federally funded researchers and those operating with private monies?5 Moreover, comparisons between BAM and the HGP reminded me that, when scientists were lobbying for a National Nanotechnology Initiative (NNI), there would be the occasional allusion to the glory years of the Apollo program.6 Obama’s 2013 SOTU also drew on the 1960s-era push into space: “Now is not the time to gut these job-creating investments in science and innovation,” the President said. “Now is the time to reach a level of research and development not seen since the height of the Space Race.” Likewise, advocates of the NNI made economic, environmental, and health arguments as to why nano should be the next tech frontier Uncle Sam funded.7

I am also intrigued by how advocates of BAM echo ideas from the early years of nanotechnology’s popularization. Although scientists like Richard Smalley successfully marginalized images of bio-inspired nanobots – my Visioneers book has a chapter devoted to this – these ideas clearly had a powerful hold on the imagination of the public and policy-makers. So it’s intriguing to see references to “nanoprobes” and “biologically inspired, computational devices.” The self-replicating nanobots that Drexler popularized, which could build things “atom by atom,” are ancestors of a sort.

Finally, the field of brain mapping – like the nanotech of the early 1990s – has a small population of its own strange and/or fringy characters. In July 2012, The Chronicle Review ran an article called “The Strange Neuroscience of Immortality.”

[Image: Fabulous illustration that accompanied the Chronicle piece.]

It focused on the work of Kenneth Hayworth, a postdoctoral researcher in Jeff Lichtman’s lab at Harvard University. A “self-described futuristic thinker,” Hayworth has developed a series of new machines that can help map the brain’s neural circuitry (the “connectome”). Hayworth’s automated “ultramicrotome,” for example, uses a tiny diamond saw to cut ultra-thin slices of brain tissue. Imaging these slices with an electron microscope can produce high-res maps of them – analogous to a circuit diagram. What makes Hayworth’s research controversial is his belief that mapping the brain in exquisite detail is one path to brain preservation and eventual immortality. As The Chronicle described it: “Then one day, not too long from now, his consciousness will be revived on a computer. By 2110, Hayworth predicts, mind uploading—the transfer of a biological brain to a silicon-based operating system—will be as common as laser eye surgery is today.” In other words, connectomics is, at least for Hayworth, one path to fulfilling the posthumanist dream of transcending the limits of the human body. “We’ve had a lot of breakthroughs—genomics, space flight—but those are trivial in comparison to mind uploading,” Hayworth said. “This will be earth-shattering because it will open up possibilities we’ve never dreamed of.”

To help realize his dream, Hayworth started the Brain Preservation Foundation. Currently, the Foundation offers a cash prize “for the first individual or team to rigorously demonstrate a surgical technique capable of inexpensively and completely preserving an entire human brain for long-term (>100 years) storage with such fidelity that the structure of every neuronal process and every synaptic connection remains intact…” Evan Goldstein interviewed me for his Chronicle piece. My response was that people like Hayworth have “ideas that stand out there as something to be looked at, maybe shot down, proven or disproven, but they are part of the process of staking out where the frontier of science is.”

Parallels between Hayworth’s activities and the early years of nano are striking. The Foresight Institute, a nanotech advocacy group that Eric Drexler helped start, offers large prizes to people who make significant advances in molecular engineering. Moreover, Hayworth’s goal and his approach (a non-profit funded by donations) resonate with those of the early nanoists. Like Drexler, Hayworth has turned to deep-pocketed entrepreneurs – Peter Diamandis, Peter Thiel, et al. – for support. Finally, some of the first communities to embrace radical ideas for nanotechnology in the 1980s were those with a keen interest in life extension technologies such as cryonics and mind uploading. As MIT’s Sebastian Seung – an advisor for the Brain Preservation Foundation – put it, “Mind uploading is part of the zeitgeist…People have become believers in virtual worlds because of their experience with computers. That makes them more willing to consider far-out ideas.” Similar affinities brought people from the computer science and software communities to the radical flavors of nanotech c. 1990.

So, bringing this all back to BAM – I am looking forward to seeing whether and how a major national initiative to do dynamic brain mapping takes shape. But already I’m fascinated by the ways in which BAM’s advocates have connected it to previous major national science initiatives, as well as by the dark matter of unusual ideas and individuals that forms a diffuse halo around the project.

  1. Markoff’s book preceded Fred Turner’s more scholarly treatment From Counterculture to Cyberculture by a few years.
  2. An earlier version of this paper from 2011 is available here. The report originated at a September 2011 conference at the Kavli Royal Society International Center and was organized by the Kavli Foundation, the Gatsby Charitable Foundation, and the Allen Institute for Brain Science.
  3. I discussed the “molecule-sized machines” idea at length with a colleague of mine at Caltech. He noted that these are passive sensors, not active nanobots – i.e. millions of silicon-based nanoprobes spaced out at the micron scale. Caltech’s Michael Roukes, who is one of the BAM report authors, is a specialist in this line of research.
  4. It seems as if everything has its own -omics these days: proteomics, epigenomics, and now connectomics. This meme has spread far beyond the life sciences – in the humanities we have “culturomics” while, in 2011, the Obama administration announced a “materials genome initiative” to “double the speed with which we discover, develop, and manufacture new materials.” I’m surprised there’s no nano-omics yet, but I guess that lexicographical ship already sailed.
  5. The 2011 Kavli workshop that led to the BAM report was funded by some private money, including the Allen Institute for Brain Science…so, there is private money that has skin in the game.
  6. As Nobel laureate Richard Smalley advised Congress in 1999, “Somebody has to go out and put a flag in the ground and say: Nanotechnology, this is where we are going to go and we are going to have a serious national initiative in this area.”
  7. Debates among scientists as to which is more important – your connectome or your genome – should be interesting topics for historians to explore. Perhaps these mirror nature vs. nurture debates which are, of course, not an either/or proposition. Also of interest would be the development of technologies to actually do brain mapping exercises…how is the field of neuroscience driven by technological and instrumental developments? How have these shaped research topics, and so on – all the sorts of questions that STS scholars are well-equipped to study!