I'll be returning to my home city of Newcastle-upon-Tyne next Wednesday (December 6th) to take part in an event organised by The Great Debate. I'll be discussing the topic of Reprogramming Life with Prof. John Burn of the Institute of Human Genetics and Caspar Hewett, the organiser of TGD. Audience participation is welcomed (and, indeed, necessary) at such events, so please come down and take part. If you need an extra incentive, Toby Mundy, my publisher at Atlantic, has very kindly stumped up some cash for a drinks reception afterwards!
The event starts at 7pm, and further details are available here.
Wednesday, November 29, 2006
Monday, November 27, 2006
I am a commodity
I only just found out that, as of February last year, this blog has been listed on BlogShares, the "fantasy blog stock market". You can keep track of its performance here; I'm not sure what I've been doing to raise the valuation (I think it's mainly down to incoming links), and it only seems to take notice of blogspot pages, but it's an interesting idea nonetheless. I guess it's a natural extension of Google's PageRank algorithm, in which pages with a relatively high number of incoming links are considered more authoritative.
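For anyone curious how that sort of link-based scoring works, here's a minimal sketch of PageRank-style power iteration in Python; the link graph, damping factor and iteration count below are all invented purely for illustration, not anything BlogShares or Google actually uses.

```python
# Toy PageRank-style power iteration: pages that attract links from
# other (well-scored) pages end up with higher scores themselves.
# The link graph and parameters here are made up for the example.

links = {
    "blog": ["review", "publisher"],
    "review": ["blog"],
    "publisher": ["blog", "review"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

print(pagerank(links))  # "blog", with the most incoming weight, scores highest
```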
Wednesday, November 22, 2006
On the shelf
I finally feel like a proper author after seeing my book nestling on the shelf of the Manchester Deansgate branch of Waterstone's, near to Isaac Asimov's classic New Guide to Science.
I did look a bit mental taking pictures of the shelf with my camera phone, but there you go...
Tuesday, November 21, 2006
The New Scientist looks forward
The magazine New Scientist recently celebrated its 50th anniversary. While some might question its scientific rigour, there's no doubt that it generally does a decent job of bringing science to the public in an accessible fashion, whilst also flagging up to professional scientists the occasional article that might otherwise have gone unseen.
In the most recent issue, the editors asked a collection of the "smartest brains on the planet" to predict the most significant scientific advance of the next 50 years. While some are understandably reluctant to set themselves up for a fall, most play along with the game.
It's interesting to note how many of the predictions seem to come at the intersection of information/computer science and biology. Lewis Wolpert talks of "computable embryos", Eric Horvitz gives a wide-ranging description of computation as the "fire in our modern-day caves", Paul Nurse anticipates an understanding of the cell as a "chemical and computational machine", Jaron Lanier argues for a restructuring of computer architecture(s) along "bio-mimetic" principles, while Peter Atkins believes that computer technology will allow us to observe and eventually control natural processes to construct "synthetic life".
Friday, November 17, 2006
THES article from last week
The Times Higher Education Supplement published a feature article of mine last week. As we're on a new edition as of today, I can reproduce it below.
If you wish to cite it, please do so as follows (my suggested title was "Synthetic Biology: Where Top-Down Meets Bottom-Up"...):
Martyn Amos, A chip off Mother Nature's own hard-drive, Times Higher Education Supplement, November 9, 2006, pp. 16-17.
Possibly the most unusual reviewing assignment I have ever accepted came in 2002, when Guinness World Records asked me to help validate a claim made by a group of Israeli scientists to have built the "world's smallest computer". What made this machine radically different was not just its incredibly miniaturised state but its basic construction material. Rather than piecing together transistors on a silicon surface, Ehud Shapiro and his team at the Weizmann Institute had fabricated their device out of the very stuff of life itself - DNA.
Three trillion copies of their machine could fit into a single tear drop. This miracle of miniaturisation was achieved not through traditional technology but through a breakthrough in the emerging field of molecular computing. The team used strands of DNA to fuel these nanomachines, their latent energy freed by enzymatic "spark plugs". These were not computers in any traditional sense. Their computational capabilities were rudimentary and, rather than using the familiar zeroes and ones of binary code, their "software" was written in the vocabulary of the genes - strings of As, Gs, Cs and Ts.
One of the main motivations for shrinking traditional computer chips is to extract the maximum amount of computational power from a limited space. By placing ever smaller features on the silicon real estate of modern processors, chip-makers such as Intel continually try to keep in step with Moore's Law - the famous observation that computer power roughly doubles every 18 months.
Shapiro's computer was never going to win any prizes for mathematical muscle. All it could do was analyse a sequence of letters and determine whether or not it contained an even number of a specific character. Nevertheless, it represented the state of the art in a scientific field that had been in practical existence for less than a decade. In 1994, Len Adleman (previously better known as one of the co-inventors of the main Internet encryption scheme, and the man who gave a name to what we now know as computer viruses) stunned the computing world by demonstrating the feasibility of performing computations using molecules of DNA.
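For comparison, the entire computation Shapiro's machine performed fits into a few lines of ordinary code. Here is a rough Python equivalent of the two-state parity check, purely my own illustration; the symbol choice and test strings are invented, and have nothing to do with the team's actual molecular encoding.

```python
# Software analogue of the parity check carried out by Shapiro's
# two-state molecular automaton: does the input contain an even
# number of a chosen symbol? Symbol and inputs are illustrative.

def has_even_count(sequence, symbol="a"):
    state = "even"  # the machine only ever needs two states
    for letter in sequence:
        if letter == symbol:
            state = "odd" if state == "even" else "even"
    return state == "even"

print(has_even_count("abba"))  # True  - two 'a's
print(has_even_count("abb"))   # False - one 'a'
```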
Rather than representing information as electronic bits inside a silicon chip, Adleman showed how to solve a problem using data encoded as sequences of bases on DNA molecules. One of his motivations lay in the storage capacity of DNA; nature has data compression down to a fine art. Every living cell in your body contains a copy of your unique 3Gb genome, the data equivalent of 200 copies of the Manhattan telephone directory. Adleman wanted to use the nature of chemical reactions to perform massively parallel computations on this molecular memory.
Each tube could contain trillions of individual DNA strands, and each molecule could encode a possible answer to a particular problem. The idea was to exploit the fact that enzymes and other biological tools act on every strand in a tube at the same time, quickly weeding out bad solutions and giving the potential for parallel processing on a previously unimagined scale.
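As a software caricature of this generate-and-filter style of search, here is a toy Python sketch that builds a pool of candidate answers and then discards the invalid ones. The path-finding problem is in the spirit of Adleman's original experiment, but the graph itself is my own invented example, and real DNA protocols work nothing like a Python loop; the point is only the shape of the computation.

```python
# Caricature of "generate and filter" DNA-style search: build a huge
# pool of candidate answers up front, then discard every candidate
# that fails a constraint. In the test tube the pool is trillions of
# strands, and each filtering step acts on all of them simultaneously.
from itertools import permutations

# A small invented directed graph. Question: is there a path that
# starts at "start", ends at "end", and visits every node exactly once?
edges = {("start", "a"), ("a", "b"), ("b", "c"), ("c", "end"),
         ("a", "c"), ("b", "end")}
inner_nodes = ["a", "b", "c"]

# Step 1: generate every candidate ordering of the inner nodes.
pool = [("start",) + p + ("end",) for p in permutations(inner_nodes)]

# Step 2: filter out candidates that take a step which is not an edge.
def is_valid_path(candidate):
    return all(step in edges for step in zip(candidate, candidate[1:]))

survivors = [c for c in pool if is_valid_path(c)]
print(survivors)  # whatever remains in the "tube" encodes a valid answer
```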
Adleman's initial paper led to the emergence of a fully fledged field. A rash of papers appeared, describing proposals to use DNA to crack government encryption schemes or build real, "wet" memories more capacious than the human brain. After this flurry of untamed optimism - when some seriously thought that molecular machines could give traditional computers a run for their money - DNA computing matured into a more thoughtful discipline. Scientists no longer talk seriously about taking on silicon machines and are instead seeking out niche markets for their molecular machines, areas such as medical diagnostics and drug delivery, where traditional devices and methods are too large, invasive or prone to error.
Shapiro's simple computer was one example of such an application; a small step towards eventual "on-site" diagnosis and treatment of diseases such as cancer. A later version of his machine was capable (in a test tube, at least) of identifying the molecules that signal the presence of prostate cancer and then releasing a therapeutic molecule to kill the malevolent cells. Shapiro and his team have spoken about their aim of creating a "doctor in a cell", a reprogrammed human cell that could roam around the body, sniffing out and destroying disease. As physicist Richard Jones explains in his book Soft Machines, the Fantastic Voyage scenario of humans in a miniaturised submarine is "quite preposterous", but that doesn't rule out serious work into trying to engineer existing living systems to act as "medibots" able to detect and control disease at its source.
A growing band of experts is slowly coming together to form a whole new vanguard at the frontiers of science, where boundaries between biology, chemistry, engineering and computing become fluid and ever-changing. This is the new world of synthetic biology. "We want to do for biology what Intel does for electronics," states George Church, professor of genetics at Harvard University. The Massachusetts Institute of Technology's Tom Knight is even more blunt: "Biology is the nanotechnology that works."
DNA is so much more than an incredibly compact data storage medium. As physicist Richard Feynman explained: "Biology is not simply writing information; it is doing something about it." Floating inside its natural environment - the cell - DNA carries meaning, used to generate signals, make decisions, switch things on and off, like a program that controls its own execution. DNA, and the cellular machinery that operates on it, is the original reprogrammable computer, pre-dating our efforts by billions of years. By re-engineering the code of life, we may finally be able to take full advantage of the biological "wetware" that has evolved over millennia. We are dismantling living organisms and rebuilding them - this time according to a pre-planned design. It is the ultimate scrap-heap challenge.
As pioneers such as Alan Turing and John von Neumann discovered, there are direct parallels between the operation of computers and the gurglings of living "stuff" - molecules and cells. Of course, the operation of organic, bio-logic is more noisy, messy and complex than the relatively clear-cut execution of computer instructions. But rather than shying away from the complexity of living systems, a new generation of synthetic biologists is seeking to harness the diversity of behaviour that nature offers, rather than trying to control or eliminate it. By building devices that use this richness of behaviour at their very core, we are ushering in a new era in terms of practical devices and applications and of how we view the very notion of computation and of life itself.
The questions that drive this research include the following: Does nature "compute" and, if so, how? What does it mean to say that a bacterium is "computing"? Can we rewrite the genetic programs of living cells to make them do our bidding? How can mankind benefit from this potentially revolutionary new technology? What are the dangers? Could building computers with living components put us at risk from our own creations? What are the ethical implications of tinkering with nature's circuits? How do we (indeed, should we) reprogramme the logic of life?
The dominant science of the new millennium may well prove to be at the intersection of biology and computing. As biologist Roger Brent argues: "I think that synthetic biology will be as important to the 21st century as [the] ability to manipulate bits was to the 20th." This isn't tinkering around the edges, it's blue-skies research - the sort of high-risk work that could change the world or crash and burn. I took a huge risk in the 1990s when I gambled on DNA computing as the topic of my PhD research - a field with a literature base, at the time, of a single article.
It is exhilarating stuff, and it has the potential to change forever our definition of a "computer". But most researchers are wary of promising too much, preferring to combine quiet optimism with grounded realism. As researcher Drew Endy explains: "It'll be cool if we can pull it off. We might fail completely. But at least we're trying."
Wednesday, November 15, 2006
Last night
Thank you to everyone who helped make last night's book launch an extremely enjoyable event. I must especially thank my fellow panellists, Oliver Morton, Stephen Emmott, and Johnjoe McFadden, my publicist Annabel Huxley for arranging it in the first place, and the ICA and Royal Institution for hosting it. Thanks also to my publisher, Toby Mundy, at Atlantic Books.
(This is beginning to sound like a speech at the Oscars...)
Most of all, though, thank you to the 100+ people who turned up to find out more about the strange and exciting new world of the Genesis Machines - at 18:45 I was worried that the guest list would outnumber the paying attendees, but it was standing room only by 19:00. I hope you enjoyed it as much as we did.
Monday, November 13, 2006
Another radio appearance
I'm delighted to say that I've been invited, along with Richard Jones (who blogs here), to appear on BBC Radio 4's well-respected science show The Material World. We'll be on between 16:30 and 17:00 this Thursday (Nov. 16th), talking about biological computing and nanotechnology. Join us by tuning in on either 92-95FM or 198LW, or by listening online at the programme website.
Sunday, November 12, 2006
Sunday Times article
Steve Farrar interviewed me last week while I was in London, resulting in this feature article in today's Sunday Times.
Friday, November 03, 2006
Radio appearance
Advance notice for anyone who might be interested: I've been pencilled in to appear on Simon Mayo's afternoon radio show on BBC Radio Five Live. The date is next Thursday (Nov. 9), and I'll be on between 14:00 and 14:45 (with breaks for news and sport updates, thankfully!) to talk about the book.
You can tune in on MW909 or 693, or listen online via the Daily Mayo programme website.
Gulp.
Notes for Genesis Machines
I've put the bibliography and notes for Genesis Machines online here. The idea is that this page will serve as a useful resource for readers of the book, but it will also give prospective readers a flavour of what's contained within.
Thursday, November 02, 2006
Jonoska review
Further to my post a few days ago, Natasha Jonoska has very kindly agreed to my hosting a copy of the full review of my book Theoretical and Experimental DNA Computation.
As I already mentioned, it serves as an excellent historical overview of the early days of DNA computation, and I'm grateful to Natasha for allowing me to make it available.