Friday, August 24, 2007

My contribution to the synthetic biology debate

You may recall that the Royal Society is soliciting opinions on various aspects of the field of synthetic biology. What follows is a lightly edited version of my own submission, which I sent off today.

In what follows, I highlight some concerns and dangers, speaking as someone who has a definite interest in the field flourishing (and who would therefore wish to see these concerns addressed).

1. Terminology

The first concern is over the term “synthetic biology” itself. The two main issues are “what does it mean?” and “what does it cover?” As pointed out at the BBSRC workshop, clinicians have used the term for a while to refer to prosthetic devices. In attempting to offer a fixed definition of the term, the community runs the risk of becoming overly exclusive at a premature stage. However, there is also a risk that “synthetic biology” will become a “catch-all” term that is too loosely applied. The emphasis on the word “biology” may also serve to alienate mathematicians, physicists, computer scientists and others, who may (wrongly) feel that they have no expertise to offer a “biological” discipline. As a counter-example, witness the success of the field of bioinformatics, whose very name reflects the mix of disciplinary expertise in the field (in the way its components are combined, rather than in their relative lengths). As a very crude experiment, I searched in Google for both “computational biology” and “bioinformatics”; the first term returned around 1,530,000 hits, the second around 14,000,000.

This leads on to the issue of “language barriers”. This is always an issue in any new field that involves the collision of two or more (often very dissimilar) disciplines. Being seen to publicly ask “stupid questions” is a daunting prospect for most young scientists, and yet many major breakthroughs have occurred through just that. This opens up the wider debate on interdisciplinarity in 21st century science, and how we might best prepare its practitioners. Do we give students a broad, shallow curriculum that allows them to make connections, without necessarily having the background to “drill deeper” if required, or do we stick to the “old model” of a “first degree” followed by subsequent training? My own intuition is that it is far better to train intensively in a single field at the outset, and then offer the opportunity to “cherry pick” topics from a different discipline at a later stage. This educational debate is, however, not one that should be the sole preserve of synthetic biology!

2. Expectation Management

Even when biologists and (say) computer scientists can agree on a suitable shared terminology, there is still a risk of mismatched expectations about what might be achieved. For example, the notion of “scalability” might mean very different things to a computer scientist and a microbiologist. To the former, it means being able to increase by several orders of magnitude the number of data items processed by an algorithm, or to double the (already vast) number of transistors we can place on the surface of a computer chip. To a biologist, the idea of scalability might currently be very different:

“What's needed to make synthetic biology successful, Rabaey said, are the same three elements that made microelectronics successful. These are a scalable, reliable manufacturing process; a scalable design methodology; and a clear understanding of a computational model. "This is not biology, this is not physics, this is hard core engineering," Rabaey said.

“In electronics, photolithography provides a scalable, reliable manufacturing process for designs involving millions of elements. Biology has a long way to go. What's needed, Rabaey said, is a way to generate thousands of genes reliably in a very short time period with very few errors. The difference between what's available and what's needed is about a trillion to one.”
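To make the contrast concrete, here is a crude sketch (in Python; the input sizes and cost models are arbitrary illustrations, not measurements of any real system) of what “scalability” typically means to a computer scientist: the same algorithm coping with inputs that grow by orders of magnitude, at a cost that grows in a well-understood way.

import math

# A crude illustration of algorithmic scalability: operation counts
# (not timings) as the input size n grows by orders of magnitude.
for n in (1_000, 1_000_000, 1_000_000_000):
    linear = n                    # a single pass over the data
    loglinear = n * math.log2(n)  # e.g. sorting
    quadratic = n ** 2            # e.g. an all-pairs comparison
    print(f"n = {n:>13,}:  linear {linear:.1e}   "
          f"n log n {loglinear:.1e}   n^2 {quadratic:.1e}")

Nothing in the genetic toolkit currently offers that kind of predictable, order-of-magnitude headroom, which is precisely the “trillion to one” gap Rabaey describes.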

3. Conceptual Issues

As the leading nanotechnologist (and FRS) Richard Jones has pointed out, his field was dominated from an early stage by often inappropriate analogies with mechanical engineering (e.g., cogs). It may well be the case that we are in danger of the same thing happening with synthetic biology, where computer scientists impose rigid circuit/software design principles on “softer”, more “fuzzy” substrates. Jones quotes, on his blog, an article in the New York Times:

“Most people in synthetic biology are engineers who have invaded genetics. They have brought with them a vocabulary derived from circuit design and software development that they seek to impose on the softer substance of biology. They talk of modules — meaning networks of genes assembled to perform some standard function — and of “booting up” a cell with new DNA-based instructions, much the way someone gets a computer going.”


4. Complexity


The issue of “grey goo” has persistently dogged the field of nanotechnology, and it would be tempting to dismiss similar criticisms of synthetic biology as well-intentioned but ultimately uninformed. However, if synthetic biologists are to avoid the mistake made by researchers in GM (that is, appearing arrogant and dismissive, leading to mass public protest and restrictive legislation), then we should acknowledge and address the very real possibility of the biological systems under study behaving in very unpredictable ways. Anyone who has spent any time studying biosystems will understand the notion of complexity; components that are connected in an unknown fashion behave in unpredictable ways, which may include evasion of any control mechanisms that have been put in place. As Douglas Kell and his colleagues have observed, it is perfectly possible to alter the parameters of a system one at a time and see no effect, only to observe wild variations in behaviour when exactly the same tweak is applied to two or more parameters at the same time. Working in an interdisciplinary fashion may address this issue, at least in part, if modellers work closely with bench scientists in a cycle of cooperation. Once again invoking the issue of scalability, studying the behaviour of complex biosystems through modelling alone will quickly become infeasible, due to the combinatorial explosion in the size of the search space (of parameter values). By actually making or modifying the systems under study in the lab, the problem may be reduced to manageable proportions.
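To give a flavour of both points, the silent parameter interactions and the combinatorial explosion, the following toy sketch (in Python) may help; the response function and the numbers in it are invented for illustration, and are not taken from any real biosystem.

def toy_response(a, b):
    # Invented response: perturbing 'a' or 'b' alone (away from the
    # baseline value of 1.0) has no visible effect, but perturbing
    # both together produces a wild change.
    perturbed_a = a != 1.0
    perturbed_b = b != 1.0
    return 100.0 if (perturbed_a and perturbed_b) else 1.0

print(toy_response(2.0, 1.0))  # 1.0   -- tweak 'a' alone: no effect
print(toy_response(1.0, 2.0))  # 1.0   -- tweak 'b' alone: no effect
print(toy_response(2.0, 2.0))  # 100.0 -- the same tweak to both at once

# The combinatorial explosion: with just 10 candidate values per
# parameter, an exhaustive in silico sweep needs 10 ** p evaluations.
for p in (5, 10, 20, 50):
    print(p, "parameters ->", 10 ** p, "combinations")

Even this caricature makes the point: single-parameter sweeps can miss behaviour that only emerges when parameters move together, and exhaustively exploring those combinations by simulation alone quickly becomes hopeless, which is why the modelling needs to be anchored to experiments at the bench.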

5. Hype

In my own book, Genesis Machines (Atlantic Books, 2006), I illustrate the risk of promising too much at an early stage by describing the story of the “AI winter”. In the 1960s, researchers in artificial intelligence (AI) had promised human-level intelligence “in a box” within twenty years. By issuing such wild predictions, AI researchers set themselves up for a monumental fall, and, when the promised benefits failed to accrue, funding was slashed and interest dwindled. This AI winter (by analogy with “nuclear winter”) affected the field for over 15 years, and it would be disappointing (to say the least) if the same thing were to happen to synthetic biology.

Hubristic claims for synthetic biology should be avoided wherever possible; without singling out particular groups, I have already seen several predictions (which are often, in reality, ambitions) that have absolutely no realistic chance of coming to fruition on any meaningful time-scale (if at all). In this more “media savvy” age, perhaps practitioners in synthetic biology might benefit, as their AI counterparts did not, from media training. I personally benefited from the course provided by the Royal Society in June 2004, and perhaps the Society might consider a “mass participation” version for new entrants to the field.
