I've been asked to comment on this week's news that Craig Venter's team have succeeded in building a "synthetic living cell" (you can read the full paper, for free, here), so I thought it might be useful to write a short post to explain just what they've achieved.
Cells may be thought of as biological "wetware", in the same way that the physical components of a personal computer (hard drive, processor, memory, etc.) form the "hardware". A computer can't work without an operating system: the central controller program that runs in the background, coordinating the various activities of the machine. Most people use Windows as their operating system, although there are others, such as Ubuntu Linux and MacOS. Similarly, a cell cannot survive without a working genome: the collection of genes that control and influence an organism's internal operation.
The core kernel (i.e. the central "brain") of the Ubuntu Linux operating system running on my netbook is (roughly) 4 Megabytes in size, which is about four times the size of the genome of Mycoplasma mycoides. This is a bacterial parasite found in cattle and goats, and it was selected by Venter and his team because (a) it has a relatively small genome that has been fully sequenced, and (b) it grows more quickly than bacteria they've used in the past.
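As a rough sanity check on that comparison, here's the back-of-the-envelope arithmetic in code (the ~1.1 million base-pair genome size and the one-byte-per-base encoding are my own ballpark assumptions, not figures from the paper):

```python
# Rough size comparison: a plain-text genome (one byte per base) versus a
# ~4 MB kernel image. Both figures are approximate.
genome_bases = 1_100_000                     # M. mycoides genome, roughly 1.1 million bases
genome_megabytes = genome_bases / 1_000_000  # ~1.1 MB stored as plain text
kernel_megabytes = 4                         # Ubuntu kernel image, roughly

print(kernel_megabytes / genome_megabytes)   # ~3.6, i.e. "about four times" bigger
```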
Venter and his team have created an entirely synthetic copy of the genome of M. mycoides, which they then inserted into a related bacterium, M. capricolum. This new genome was "booted up" by the recipient, which then started "running" the new genetic program.
Importantly, the synthetic genome was completely pristine, in the sense that it had not been physically derived in any way from existing genetic material. Standard genetic engineering splices short synthetic sequences into existing, "natural" DNA sequences, but Venter's "synthia" genome was created from scratch. It's the equivalent of taking the known binary sequence of a small operating system kernel, typing it into a text editor in small chunks, combining the chunks together into one big file, and then using it to boot up a PC. At no stage was the "new" kernel physically derived (copied) from a version stored on CD, DVD, or downloaded from the 'net.
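To stretch the analogy a little further, here's a toy sketch of that "type it in and reassemble it" idea in Python. Everything in it (the chunks, the checksum) is made up for illustration; it isn't how the genome was actually built or verified:

```python
# Small chunks are "typed in" from a published listing, joined into one
# contiguous image, and the result is checked against a checksum of the
# original. All data here is placeholder.
import hashlib

# Chunks as they might be transcribed, piece by piece, from a printout.
typed_chunks = [
    b"\x7fELF\x02\x01\x01\x00",   # opening bytes of a made-up kernel image
    b"\x00\x00\x00\x02\x00\x3e",  # ...and so on, chunk after chunk
]

# Checksum of the "genuine" image, as it might appear on a download page
# (computed here from the same chunks purely so the example runs).
published_sha256 = hashlib.sha256(b"".join(typed_chunks)).hexdigest()

# Combine the transcribed chunks into one big file.
rebuilt_image = b"".join(typed_chunks)

# The rebuilt image carries exactly the same information as the original,
# even though no bytes were physically copied from a CD, DVD or download.
assert hashlib.sha256(rebuilt_image).hexdigest() == published_sha256
```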
Venter's team used a DNA synthesizer to piece together the A, G, C and T bases to form brand-new building blocks, which were then stitched together into a single sequence. This is the key technical achievement of the paper: a strategy for assembling an entire genome, from scratch, using synthetic components, and getting it "running" in a host cell. It's important to note that it was only the genome that was synthetic; the recipient cell was a pre-existing, "natural" bacterium.
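The stitching-together was done in stages, combining short synthetic cassettes into progressively larger pieces until the whole genome was complete. Here's a very loose sketch of that staged idea in code; the real laboratory assembly is of course far more involved (it relies on overlapping fragments and living host cells to do the joining), and the sequence below is just a placeholder:

```python
# Staged ("hierarchical") assembly: short fragments are joined into
# medium-sized pieces, which are joined into large pieces, which are
# joined into the complete sequence.

def assemble_in_stages(fragments, group_size):
    """Join consecutive fragments in groups of `group_size`."""
    return ["".join(fragments[i:i + group_size])
            for i in range(0, len(fragments), group_size)]

# Placeholder "genome" of 1,080,000 bases, pre-cut into ~1 kb fragments.
target = ("ACGT" * 270) * 1000
fragments = [target[i:i + 1080] for i in range(0, len(target), 1080)]

stage1 = assemble_in_stages(fragments, 10)   # ~10 kb assemblies
stage2 = assemble_in_stages(stage1, 10)      # ~100 kb assemblies
genome = assemble_in_stages(stage2, 10)[0]   # one complete sequence

# The assembled sequence matches the intended design exactly.
assert genome == target
```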
This breakthrough is significant in that it demonstrates the feasibility of synthesising a whole genome and transplanting it into a living cell, which will be an important component of the emerging field of synthetic biology. However, the real challenge lies in gaining a systems-level understanding of how even simple genomes operate, so that they may be fundamentally (re-)engineered.
Science has opened up a forum for posting questions, which will be answered later today by news writer Elizabeth Pennisi and philosopher and scientist Mark Bedau.
Update, 21/5/10, 11:13: Corrected kernel size assertions; Windows kernel is much larger than previously thought.