Monday, July 26, 2010

Weeknote #11 (w/e 25/7/10)

This week we finally submitted our paper on engineered oscillations in bacterial populations. This is something I've been working on with a colleague in Madrid, Angel Goni-Moreno, since he visited us in Manchester last year (in truth, he's been doing most of the work, although any delays have been entirely due to me).

In physics, an oscillator is a system that produces a regular, periodic "output". Familiar examples include a pendulum or a vibrating string. Linking several oscillators together in some way gives rise to synchrony -- for example, heart cells repeatedly firing in unison, or millions of fireflies blinking on and off, seemingly as one.

Oscillators are fundamental to biology, but they are also of interest to engineers, since they form the basis for counting (and synchronisation). Synthetic biology combines both disciplines, so the construction of oscillators within living cells is one of the main topics of interest in the field right now. However, until recently, most work has been restricted to single cells. In our paper, we have shown, in theory, how to engineer oscillations within populations of cells, using the "client-server" model familiar to computer scientists.

Update: the preprint version of the paper is here.

While writing the final draft, I was reminded of my brief contact with one of the founders of the field of theoretical biology. I first met Brian Goodwin in 2004, when I was still at the University of Exeter. He, along with Susan Blackmore, very kindly agreed to speak at the launch of a network I'd set up to encourage the study of complexity theory within the University. Best known in the broader community for his work on the evolution of complexity, Goodwin laid the foundations for recent research in synthetic biology with his seminal 1965 work on negative feedback. His later work focussed on the notion of a science of qualities (on which he spoke at our meeting), and when I first met him he was already formally retired, although still very active at Schumacher College, just down the road in Dartington. We also spent time chatting a year later, while we were both giving lectures at a summer school in Montpellier. I was struck most of all by his gentle nature and generosity of spirit, and we had the chance to discuss in greater depth the topics he'd touched on in his lecture.

Brian died just over a year ago; I first found out about his death while looking up references to give to my current Ph.D. student, who is now applying some of his ideas to the field of architecture. He had a great effect on me, and will continue to influence generations of students to come.

Sunday, July 18, 2010

Weeknote #10 (w/e 18/7/10)

Into weeknote double figures, but nothing much to report, as we've been on holiday at the Suffolk coast.

Normal service will resume next Monday.

Monday, July 12, 2010

Weeknote #9 (w/e 11/7/10)

Only one thing of significance to report since my last weeknote: the acceptance of a fun little conference paper on solving a puzzle game that has, so far, escaped the attention of the algorithms community.

The Zen Puzzle Garden is a one-player puzzle game, involving a monk raking a traditional Japanese rock garden. The aim is to find a series of moves that allow the monk to rake all of the available sand, whilst negotiating rocks, pushing statues and collecting leaves - all without getting stuck in a dead-end.

While the problem is easy to describe, it's related to puzzles like Sokoban, which are actually very difficult to solve automatically (i.e. with a computer program) in the general case. Jack Coldridge, who graduated from MMU a year ago, worked on this problem with me for his final-year dissertation, and we then developed it further into a full paper. The title, Genetic algorithms and the art of Zen, is a play on David Goldberg's 1989 paper Zen and the art of genetic algorithms, which itself references Robert Pirsig's famous book.

Problems such as Sokoban are difficult because there are, potentially, a vast number of possible solutions to consider (where a solution is a path through the garden, in this example). Most solutions will be incorrect or "illegal", and the problem is to find the "needles in the haystack" (that is, the correct solutions). These so-called NP-hard problems are the most "interesting" problems in combinatorial mathematics, because they're the most challenging. The practical significance of such problems lies in the fact that they are related to "real world" problems of great importance, such as scheduling, packing and routing. For a nice review of hard puzzle games, see this paper (PDF).

Several methods have been applied to the solution of such problems, including "traditional" algorithms, which use a "tree-based" approach to searching the space of possible solutions, as well as biologically-inspired algorithms. In the paper, we used a genetic algorithm to "evolve" paths through the garden. We start with a set of random paths, and see how well they solve the problem. Some will be "less bad" than others, so we keep them and use them to "breed" the next generation of solutions. Gradually, the power of natural selection (combined with a sprinkling of mutation) forces the population towards better and better solutions.
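The loop described above can be sketched in a few lines of Python. To be clear, this is a toy stand-in, not the paper's actual method: the grid dimensions, move encoding and coverage-based fitness function are all invented for illustration, and the real Zen Puzzle Garden encoding must also handle rocks, statues, leaves and the raking rules.

```python
import random

MOVES = "UDLR"  # up, down, left, right
GRID_W, GRID_H = 5, 4
TARGET = GRID_W * GRID_H  # cells we'd like a path to cover

def fitness(path):
    """Count distinct cells visited, ignoring moves that leave the grid."""
    x, y = 0, 0
    visited = {(x, y)}
    for m in path:
        dx, dy = {"U": (0, -1), "D": (0, 1), "L": (-1, 0), "R": (1, 0)}[m]
        nx, ny = x + dx, y + dy
        if 0 <= nx < GRID_W and 0 <= ny < GRID_H:
            x, y = nx, ny
            visited.add((x, y))
    return len(visited)

def evolve(pop_size=60, path_len=40, generations=200, mut_rate=0.05):
    # Start with a set of random paths...
    pop = ["".join(random.choice(MOVES) for _ in range(path_len))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == TARGET:
            break
        parents = pop[: pop_size // 2]  # keep the "less bad" half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(path_len)   # one-point crossover ("breeding")
            child = list(a[:cut] + b[cut:])
            for i in range(path_len):          # a sprinkling of mutation
                if random.random() < mut_rate:
                    child[i] = random.choice(MOVES)
            children.append("".join(child))
        pop = parents + children
    best = max(pop, key=fitness)
    return best, fitness(best)

best, score = evolve()
print(score, "of", TARGET, "cells covered")
```

Selection pressure (keeping the better half each generation) plus crossover and mutation gradually pushes the population towards paths that cover more of the grid, which is the essence of the "evolve paths through the garden" idea.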

We found that our method was capable of finding the optimal (i.e. shortest) solutions in the vast majority of cases, and it required far less processing power than another standard algorithm. Importantly, we have highlighted a new problem for the AI/puzzle community to get its teeth into.

The paper has been accepted for presentation at the IEEE Fifth International Conference on Bio-Inspired Computing: Theories and Applications (BIC-TA), to be held in Liverpool, on 8-10 September 2010.

Thursday, July 08, 2010

Weeknote #8 (w/e 4/7/10)


To Paris, for the regular board meeting of our European Union BACTOCOM project. We launched the project with a workshop in Manchester, and partners take turns to organize subsequent meetings. We'll be in Santander in October, and then Berlin next year. Whilst browsing in the Abbey Bookshop in St. Michel, I noticed a copy of Genesis Machines, and had Justine record the fact that it was still on sale, in a proper shop. The subsequent scene ensured that I was brought to the attention of Brian, the proprietor, who kindly asked me to sign the last remaining copy in stock (i.e., the single copy they ordered three years ago).

Prior to leaving for Paris, we had quite a busy week; in addition to finishing off and submitting a research council proposal, we're now heavily into the preparations for our contributions to the Manchester Science Festival. So far, we have a couple of workshops lined up (I don't want to spoil the surprise until the details are confirmed), plus a public debate on the scientific, technological and ethical implications of synthetic biology and so-called artificial life. Watch this space for more details nearer the time.

On a personal note, I was delighted to receive confirmation of my promotion to a Readership. Most of my family members were quite baffled by this antiquated term, until I explained that it's the academic rank below Professor (in the UK), and is awarded on the basis of research.

Monday, June 28, 2010

Weeknote #7 (w/e 27/6/10)

While flicking through the June issue of the BBC's Focus magazine, I noticed that one of my research collaborators had received a nice mention from Ian Stewart at Warwick. He was asked to select three books on puzzles and games; Martin Gardner was the obvious first-choice author, and Winning Ways for your Mathematical Plays is a minor classic. Stewart's final choice was a book written by my collaborator at New York University, Dennis Shasha. In the column, Stewart describes Dr Ecco's Cyberpuzzles as "...a fantastic book if you want to spend some serious time solving puzzles and giving your brain a work-out."

Ian Stewart has been a significant influence on my career; as a popular science author, I've always been impressed by his writing, but he had a rather more direct effect on me back in the mid-1990s, when I was a graduate student at the University of Warwick. Ian very kindly wrote me a reference to attend the prestigious Complex Systems Summer School at the Santa Fe Institute, and the month I spent there was incredibly important in terms of shaping my personal ambitions and outlook on research.

Now, I'm fortunate in being able to collaborate with people of Dennis' calibre (see the previous note, below), and last week he very kindly sent me a copy of his latest book. Co-written with Cathy Lazere, Natural Computing is a profile of the frontiers of computer science, told through the stories of fourteen pioneers, such as Rodney Brooks, Ned Seeman and Paul Rothemund. I'll post a full review once I've finished it.

Monday, June 21, 2010

Weeknote #6 (w/e 20/6/10)

We (three colleagues and myself) were recently successful in obtaining funding from the NanoInfoBio project to test an idea that's been rattling around for a while. DNA hash pooling is a technique that Dennis Shasha developed, with some assistance from me, while I was visiting him. Dennis is an incredibly sharp and prolific Professor of Computer Science at the Courant Institute of New York University. He was the Series Editor for my first book, and we've kept in touch since its publication. Justine, the little one and I visited Dennis while he was in Paris on sabbatical with his family, in the summer of 2007. While Tyler, Dennis and Karen's son, played American football, we walked round and round an athletics track on the edge of the city, knocking around our own particular problem.

The task of analysing large populations of mixed DNA strands is of particular relevance to the emerging field of metagenomics, which is concerned with understanding, in genetic terms, the vast complexity of the planet's biosphere. Methods for looking at environmental samples often require a lot of genetic sequencing; although new ways of doing this are constantly driving down the cost, it can still be expensive to sequence large populations, as well as time-consuming. Dennis and I developed a technique that combines computational analysis with simple rounds of laboratory steps, based on the computer science idea of hashing. The idea is to associate "labels" with individual sub-populations of genetic sequences, such that the number of different genomes with the same label is relatively low. In this way, each genome (or genomic fragment) is associated with its own "fingerprint", which we can then use to confirm its presence (or otherwise) in a sample. Our hope was that this technique would offer a cheap, quick and simple pre-processing step before any sequencing was required, thus reducing the cost and complexity of analysing a sample.
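As a purely software analogy of the labelling idea above (the actual technique uses rounds of laboratory steps, not cryptographic hashes), the sketch below assigns each sequence a short hash-derived label, so that each label picks out a small pool of sequences. The sequences and parameters are invented for illustration.

```python
import hashlib

def label(seq, bits=4):
    """Map a sequence to one of 2**bits pool labels via a hash."""
    digest = hashlib.sha256(seq.encode()).digest()
    return digest[0] % (2 ** bits)

# A toy "environmental sample" of sequences (invented for illustration)
sample = ["ATGCGT", "TTACGA", "ATGCGT", "GGCTAA", "CCATGC"]

# Partition the sample into labelled sub-populations ("pools")
pools = {}
for seq in sample:
    pools.setdefault(label(seq), set()).add(seq)

# A sequence's "fingerprint" is its label: to check whether it is
# present, we need only examine the (small) pool with that label,
# rather than the whole sample.
query = "GGCTAA"
present = query in pools.get(label(query), set())
print(present)
```

The pay-off is the same as with hash tables in computer science: membership checks touch only one small bucket, which is the sense in which the labels act as a cheap pre-processing step before any full sequencing.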

We finally published the theoretical paper last year, but have only just obtained the funding to actually test the idea in the lab. I floated the concept at one of the NIB brain-storming meetings, and it was picked up by a talented team of biologists (Trish Linton, Mike Dempsey and Robin Sen). We put together a proposal to NIB for a small amount of support (£25K), and we were fortunate enough to be one of three projects funded in the last round. The nine-month post-doctoral position is currently going through the MMU approval process, so watch this space if you're interested.

Monday, June 14, 2010

Weeknote #5 (w/e 13/6/10)


The focus of the past week has been on Getting Things Done. After what's been probably my busiest academic year so far, I finally decided that my workload was such that I required a rigorous approach to task management. I trawled around for methodologies that would allow me to organize a multitude of different jobs, whilst maximizing the time I could spend with my family. After reading about Getting Things Done (GTD) on Merlin Mann's well-respected 43 Folders blog, I decided to give it a go. There's a nice "getting started with GTD" article on 43 Folders, which summarises the approach thus:

  1. identify all the stuff in your life that isn’t in the right place (close all open loops)
  2. get rid of the stuff that isn’t yours or you don’t need right now
  3. create a right place that you trust and that supports your working style and values
  4. put your stuff in the right place, consistently
  5. do your stuff in a way that honors your time, your energy, and the context of any given moment
  6. iterate and refactor mercilessly

And that's it, really. Most of the week was spent on the first three steps (the creator of GTD recommends at least a couple of solid days), but the effort was well worth it. I started by taking the various slush piles, to-do lists and marked-up journals and papers in my home office, and merging them into one big "in" pile. I then had to do the same with my work and Gmail inboxes, extracting only the "open loops" (i.e., unfinished projects).

I had over 4,000 emails in my Gmail inbox, and working through the whole lot, deleting as I went, quickly lost its appeal. I therefore adopted a "tagging" approach; I created an "@action" tag in red, and then skimmed through my inbox, tagging anything that required an action on my part. Everything was then selected and archived (just "select all", answer "yes" when it asks you if you want to apply this to all conversations, and then hit "Archive"), leaving nothing in my inbox (for the first time in many years). I could then select only the tagged messages, which was much more manageable.

The end result of this physical and electronic clear-out was a car-full of paper to go to the recycling centre, a clean workspace (shown above) devoid of distracting piles of paper, and a fresh outlook on work. I'm already feeling the mental benefit, as I've been relieved of the self-inflicted stress brought on by my subconscious constantly asking "what am I currently not doing?" I've always been quite cynical in the past about "snake oil", management-driven "productivity" schemes, but I can honestly say that GTD is an eye-opener, and it actually seems to work.

I've managed to condense everything down to a list of just over forty "projects" (ranging from "Fix external hard drive" to "Write next book"), most of which have a discrete "next action" attached to them.

I'll be writing more about GTD in the coming weeks and months, as I learn more about the system and (hopefully) realise its potential.

Monday, June 07, 2010

Weeknote #4 (w/e 6/6/10)

I've spent the past week in Madrid, at the Universidad Politécnica. I was a Visiting Professor in the Faculty of Informatics, delivering a series of lectures on "molecular and cellular computing" to their Masters-level students.

In the past, some people have expressed an interest in the material, so I thought I'd make it available here. A lot of it is based on my book Theoretical and Experimental DNA Computation (Springer, 2005), although there's a lot of new material in the second half of the series.

The lectures are as follows (links to PDF versions of the slides):

Day 1: Molecular Computing

1. Introduction and historical motivation.

2. The first experiment.

3. Subsequent work.

Day 2: From in vitro to in vivo

1. Models, lab work, and the transition.

2. Laboratory implementations.

Day 3: Biological Engineering

1. Biological background.

2. Synthetic biology.

3. Synthetic biology II.

Creative Commons License
Molecular and Cellular Computing course material by Martyn Amos is licensed under a Creative Commons Attribution-Non-Commercial-No Derivative Works 2.0 UK: England & Wales License.

Monday, May 31, 2010

Weeknote #3 (w/e 30/5/10)

Not a great deal to report this week, as I've been suffering from a particularly painful seasonal disorder (i.e. marking). The delay to our Madrid trip due to Icelandic intervention was a blessing in disguise, I think, as it allowed me to clear the decks of a load of scripts before jetting off to give three afternoons of lectures at the Universidad Politécnica de Madrid. If we'd gone when we'd originally planned to, then the scripts would have been sitting there in my study at home, a distant yet malign cloud hanging over the trip.

Arrived in Madrid yesterday, after a relatively painless flight from Liverpool with EasyJet. It was all going too well, however; on arrival at the hotel, our daughter ran towards a display of flowers in the lobby, caught her foot on a rug and went face-down onto a table. She cut her eye quite badly, but she's a hardy little thing, and was back on top form today.

I gave my first set of lectures this afternoon/evening, as the guest of Alfonso Rodriguez-Paton. He's the "Madrid node" of our BACTOCOM project, and kindly invited me to teach some of their postgraduates (others involved this year include Christof Teuscher, Milan Stojanovic and Friedrich Simmel, who's also involved with BACTOCOM). I'm here to talk about "molecular and cellular computing"; today was motivation and historical background, a bit of biology and an overview of Adleman's experiment. Tomorrow is formal models of DNA computation followed by self-assembly and DNA origami. The final set of lectures on Wednesday will deal mainly with synthetic biology, so I hope Fritz has left me something to talk about.

Monday, May 24, 2010

Weeknote #2 (w/e 23/5/10)



It's been a big week for synthetic biology, with the announcement by Craig Venter that he'd succeeded in creating a "synthetic cell". My previous post describes my take on the technical aspects of his achievement; it's not entirely accurate to call it a "synthetic cell", since they used existing cells as the recipients (that is, it was only the genome that was synthetic). It's more like "genomic transplantation" with de novo sequences. Technically challenging, but not the earth-shattering breakthrough that it's being sold/hyped as. They certainly didn't turn "inanimate chemicals into a living organism".

My own little piece of press coverage looked pretty low-key by comparison. I was interviewed ages ago by Louise Tickle for the Education section of the Guardian, and the story finally appeared last week.

This week, members of my group (specifically, Pete and Naomi) contributed to an event hosted by MMU. I'm a Director of ArcSpace Manchester, a Community Interest Company to support creative and ethical exchange, and on May 19th we held a video conference with collaborators in São Paulo, Brazil, to discuss "eco-techno" and public engagement. Unfortunately, other commitments meant that I was unable to attend either in person or in the form of an avatar, but my co-director, Vicky Sinclair, wrote up the event.

On the work front, I've been busy marking projects and exam scripts, although I did also submit this conference paper.

Friday, May 21, 2010

Team Venter's synthetic cell, explained

I've been asked to comment on this week's news that Craig Venter's team have succeeded in building a "synthetic living cell" (you can read the full paper, for free, here), so I thought it might be useful to write a short post to explain just what they've achieved.

Cells may be thought of as biological "wetware", in the same way that the physical components of a personal computer (hard drive, processor, memory, etc.) form the "hardware". A computer can't work without an operating system: the central controller program that runs in the background, coordinating the various activities of the machine. Most people use Windows as their operating system, although there are others, such as Ubuntu Linux and MacOS. Similarly, a cell cannot survive without a working genome: the collection of genes that control and influence an organism's internal operation.

The core kernel (ie. the central "brain") of the Ubuntu Linux operating system running on my netbook is (roughly) 4 Megabytes in size, which is about four times the size of the genome of Mycoplasma mycoides. This is a bacterial parasite found in cattle and goats, and it was selected by Venter and his team because (a) it has a relatively small genome that has been fully-sequenced, and (b) it grows more quickly than bacteria they've used in the past.

Venter and his team have created an entirely synthetic copy of the genome of M. mycoides, which they then inserted into a related bacterium, M. capricolum. This new genome was "booted up" by the recipient, which then started "running" the new genetic program.

Importantly, the synthetic genome was completely pristine, in the sense that it had not been physically derived in any way from existing genetic material. Standard genetic engineering splices short synthetic sequences in to existing, "natural" DNA sequences, but Venter's "synthia" genome was created from scratch. It's the equivalent of taking the known binary sequence of a small operating system kernel, typing it into a text editor in small chunks, combining the chunks together into one big file, and then using it to boot up a PC. At no stage was the "new" kernel physically derived (copied) from a version stored on CD, DVD, or downloaded from the 'net.

Venter's team used a DNA synthesizer to piece together the A, G, C and T bases to form brand-new building blocks, which were then stitched together into a single sequence. This is the key technical achievement of the paper - a strategy for assembling an entire genome, from scratch, using synthetic components, and getting it "running" in a host cell. It's important to note that it was only the genome that was synthetic; the recipient cell was a pre-existing, "natural" bacterium.
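A toy software analogy for that assembly strategy: build one long sequence from short fragments that overlap at their ends, merging them pairwise at the shared junctions. The sequence, fragment length and overlap below are invented for illustration; the real procedure is a multi-stage laboratory protocol, not string concatenation.

```python
def merge(a, b, overlap):
    """Join two fragments that share `overlap` bases at the junction."""
    assert a[-overlap:] == b[:overlap], "fragments do not overlap"
    return a + b[overlap:]

# A made-up "genome" and some overlapping synthetic "fragments" of it
genome = "ATGCGTACGTTAGCATCGGATT"
k, ov = 8, 3  # fragment length and overlap (illustrative values)
fragments = [genome[i:i + k] for i in range(0, len(genome) - ov, k - ov)]

# Stitch the fragments back into a single sequence, left to right
assembled = fragments[0]
for frag in fragments[1:]:
    assembled = merge(assembled, frag, ov)

print(assembled == genome)
```

The overlaps are what make unambiguous joining possible: each fragment "knows" where it belongs because its ends match its neighbours, which is (very loosely) the logic behind hierarchical whole-genome assembly.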

This breakthrough is significant in that it demonstrates the feasibility of large-scale whole-genome transplantation, which will be an important component of the emerging field of synthetic biology. However, the real challenge lies in gaining a systems-level understanding of how even simple genomes operate, so that they may be fundamentally (re-)engineered.

Science has opened up a forum for posting questions, which will be answered later today by news writer Elizabeth Pennisi and philosopher and scientist Mark Bedau.

Update, 21/5/10, 11:13: Corrected kernel size assertions; Windows kernel is much larger than previously thought.

Monday, May 17, 2010

Weeknote #1 (w/e 16/5/10)

In an effort to blog more regularly, I've decided to adopt the Weeknote model of short seven-day updates on what's been going on.

The weekend was dominated by my inability to leave the country; I was due to fly to Madrid to give a series of lectures on molecular and cellular computing to Masters and Doctoral students at the Universidad Politécnica de Madrid. It was also an opportunity to take a couple of days of much-needed time with my wife and daughter, who'd be travelling with me. As the airspace in Northern Ireland had already been closed, we checked the status of the flight before we set off for Liverpool Airport. Everything was ok, but by the time we got there a couple of hours later, they'd shut down. A maudlin hen party, wearing mandatory pink fluffy stetsons, were told that the next available flight was on Thursday; we just returned home, where I quickly rescheduled the lectures for two weeks' time. My host, Alfonso Rodríguez-Patón, was incredibly understanding and helpful, managing to book a new hotel for us, despite the fact that my new schedule coincides with a major festival on the Thursday (making hotel rooms extremely rare).

Another significant event this week was the Future Everything festival, which was (if you read the various reviews and tweets) wildly successful. I contributed to a panel discussion on New Creativity, which also featured Anab Jain, a TED Fellow who talked about her Power of 8 project, Kerenza McClarnan of Buddleia, who's facilitating artist-led enquiry into urban spaces, and Adrian Hon of award-winning games company Six to Start, who talked about the purpose of play. It was a fascinating session, with a lot of dynamic connections made between the panelists (none of whom really knew anything in advance about what the others would say). The session was recorded, so I'll post a link if and when the video is made available.

In mid-week we had our latest brain-storming "away"-day for our Bridging the Gaps: NanoInfoBio (NIB) project. This is a two-year initiative, supported by the EPSRC, to encourage cross-disciplinary research within MMU (with specific focus on the life sciences/engineering/computing/maths/nanotechnology interface(s)). We're almost ten months into the project now, and are beginning to develop a coherent set of themes around which we can coalesce. We're giving out a few project grants of £25K in order to boot-strap small feasibility studies, so we arranged an afternoon at a Manchester hotel to generate some ideas. Experience has shown that it's best to get everyone away from the distractions of email, and the temptation to "just pop back to the office", and I think everyone was happy with how it went. Rather than dividing everyone into groups, as might seem natural, we first performed a general "audit" of possible project ideas (this first pass generated 12), and then "drilled down" as a whole group to examine each idea in turn. Once a page or so of flip-chart paper had been filled for each project, only then did we split up in order to go over the fine details of costings and so on. The group-level discussion led to some surprising contributions, which would have been lost if we'd split up too quickly. I think it worked.

Tuesday, May 11, 2010

The need for hacking

The following post is a lightly-edited version of an article I've just had published in the Spring 2010 issue of MMU's Success magazine:

The word "hacker" has, in recent years, acquired an unfortunate and pejorative meaning. The media portrayal is of a pale-faced teenage boy (for they are invariably male) crouched over a keyboard in a fetid room, determined to make their mark on the world through cyber-vandalism or malware scams. My teenage years were partly shaped by the movie WarGames, in which an inquisitive youth accidentally triggers the countdown to armageddon by wandering into a US military computer, while the recent case of the "UFO hacker" Gary McKinnon has merely reinforced the "misfit" stereotype.

They are almost universally despised by mainstream commentators, and yet the infrastructure on which all of us rely (mobile phones, computers and the internet) would not even exist in its current form were it not for the hacker.

The original hackers were the pioneers of the electronic age, when the term simply meant "one who hacks". A hack, back then, was just a clever or "pretty" solution to a difficult problem, rather than an attempt to gain unauthorised access to a system. These early hobbyists and developers created the first microcomputers, as well as the foundations of the global information network.

One of the key principles of the hacker ethic (as described in Steven Levy's book Hackers: Heroes of the Computer Revolution) is that the best computer system is one that may be inspected, dissected and improved upon. When I started programming back in the 1980s, games were often distributed as listings printed in magazines, which had to be typed in before playing. By messing around with this code, I picked up various tricks and learned important new techniques. As my programs became more sophisticated, I had to get "under the bonnet" of the machine and interact with the computer at a fundamental level. The so-called "hard skills" that I learned in those early years have stayed with me ever since.

Modern teaching increasingly promotes the "soft skills" agenda, such as the need for team-working, communication and negotiation. Whilst these abilities are undoubtedly important, we need to protect and promote technical content. I wouldn't want a mechanic delving under the bonnet of my car if all he or she had ever done was change a tyre or top up the screen-wash, even if they did describe themselves as a personable, motivated team-player...

Computers now take many forms (consoles, phones and PCs, for example) and they're increasingly viewed as sealed appliances, intended for gaming, chatting or browsing. Members of tomorrow's workforce are immersed in social networking, app downloads and file sharing, but they often lack the fundamental knowledge that can only come by (either physically or metaphorically) opening up the box and tinkering with its insides. By that, I mean the acquisition of technical insights and skills required in order for a person to become a software producer, rather than simply a consumer of apps. New innovations such as mobile and cloud computing mean that hard skills are more important than ever, as the digital infrastructure becomes ever more firmly rooted in our day-to-day lives.

The beauty of the situation is that these skills are no longer the sole domain of computing professionals. The availability of modern computers means that we are ideally-placed to develop the next hacker generation, capable of creating ingenious applications and web-based systems. We need to return to the playful principles of the original hackers, by promoting programming as a recreational activity. Modern software packages such as Alice allow us to teach complex concepts almost by stealth, through the medium of computer animation. Open-source operating systems encourage tinkering, and mobile app development is now a legitimate career path. The new generation of twenty-first century hackers may well be digital natives, but they first need to learn to speak the language.

Friday, April 02, 2010

"It's alive! ALIVE!"

I've decided to revive the blog, as several new projects have started recently, and I think it's useful to pass on news through informal channels such as this, as well as via the "official" websites. I'll be posting regular updates on our BACTOCOM project, funded by the European Commission, as well as news of Bridging the Gaps: NanoInfoBio, and any other snippets that I think might be of interest.

Sunday, June 01, 2008

Synthetic biology and Howard Hughes

The Howard Hughes Medical Institute has announced its latest set of investigator appointments. Awards are made to individuals, as opposed to the usual mode of funding, where money is assigned to a project, and the field of synthetic biology is represented by two of its leading figures in the current crop. Jim Collins at Boston and Michael Elowitz at Caltech both had papers in the important 2000 issue of Nature, which reported some of the first experimental results in the area (specific papers are here and here.)

Thursday, May 29, 2008

Genesis Machines in the USA

I'm pleased to report that Genesis Machines has just been published in the USA by The Overlook Press. The book is available via Amazon, and I'm delighted to be associated with another independent award-winning publisher (after Toby Mundy's 2005 triumph with Atlantic at the 2005 British Book Awards).

Sunday, February 24, 2008

Engineering biology, with Drew Endy



There's a fascinating essay by/interview with Drew Endy on the Edge website, which appears to be the latest in a series to have emerged from an event they organised last August. I've written about Endy in the past, and he features prominently in the final chapters of Genesis Machines; indeed, I wish I'd had such an illuminating transcript available when I wrote the book.

Endy is an Assistant Professor of Biological Engineering at MIT, and one of the leading figures in synthetic biology. In one particular paragraph, he captures the excitement of this emerging new discipline:

"Programming DNA is more cool, it's more appealing, it's more powerful than silicon. You have an actual living, reproducing machine; it's nanotechnology that works. It's not some Drexlarian (Eric Drexler) fantasy. And we get to program it. And it's actually a pretty cheap technology. You don't need a FAB Lab like you need for silicon wafers. You grow some stuff up in sugar water with a little bit of nutrients. My read on the world is that there is tremendous pressure that's just started to be revealed around what heretofore has been extraordinarily limited access to biotechnology."

Friday, February 15, 2008

Insect lab

I've spent all week running simulation experiments for our ongoing work on ant-based computing, so when I came across the Insect Lab it seemed strangely appropriate.

The artist takes real (dead) insects and customizes their bodies with parts taken from watches and other mechanical devices, to create "cybernetic sculptures".

I'd like to see him do an ant, though... a train of thought that led me, circuitously, to Bill Bailey performing his wonderful song Insect Nation (if you just want the lyrics, they're here).

Friday, February 08, 2008

Dr Who

A wonderful present arrived in today's post, courtesy of our equally wonderful friend Eventhia; a signed photograph of Tom Baker! He is, of course, best known for playing the fourth Dr Who, but is probably most familiar to a younger generation as the narrator of Little Britain (and even the delightfully barmy Stagecoach adverts).

Most people of sound mind would name Baker as the best ever Dr Who, despite ludicrous polls to the contrary. A case can be made that one's favourite is simply the Doctor one grew up with, and since Baker's tenure ran from 1974 to 1981, I would certainly agree.

Anyway, he recently did a signing in Norwich, attended by our friends Kris and Eventhia. They very kindly got Tom to sign the photo "For Martyn," (eventually; at first he had it down as "Martin", and you can see where he's corrected it at E's prompting) "Genetically yours, Tom Baker".

Sigh!

Tuesday, February 05, 2008

Biological complexity: from molecules to systems


I'm delighted to have been invited to speak at an event titled "Biological complexity: from molecules to systems", to be held at University College London on 12-13 June this year. The meeting is sponsored by both UCL and the Weizmann Institute of Science in Israel, and will feature speakers from the fields of immunology, computer science, mathematics, biological chemistry, molecular genetics and bioinformatics. I'll try my best to summarize below the research interests of the other invited speakers (with apologies to anyone whose work I misrepresent!)

Stephen Emmott from Microsoft Research in Cambridge will give the keynote address. Stephen is the founder and Director of Microsoft's European Science Programme, and was the driving force behind the influential Towards 2020 Science project and report.

Representing Israeli activity, Nir Friedman works in computational biology, and recently published a paper arguing that gene duplication may drive the "modularisation" of functional genetic networks (that is, genetic networks that are relatively self-contained, and which perform a specific task).

David Harel is a celebrated computer scientist, having carried out important work in logic, software engineering and computability theory. As a student, I often referred to his award-winning book Algorithmics: The Spirit of Computing, and he is currently working on topics that include the modelling and analysis of biological systems (e.g. the nematode worm) and the synthesis and communication of smell.

Shmuel Pietrokovski works in bioinformatics, with particular interest in inteins (protein introns): "selfish" DNA elements that are converted into proteins together with their hosts.

Yitzhak Pilpel's lab takes a systems-level approach to how genes are regulated: "By applying genome wide computational approaches, backed-up by in house laboratory experiments, [the lab] devotes itself to both establishing an in-depth understanding of the different processes controlling gene expression, and to understand[ing] how these processes are orchestrated to establish robustness of the regulatory code."

Gideon Schreiber studies the precise nature of protein-protein interactions and the implications these have for complex biological processes.

Eran Segal is a computer scientist (predominantly) working in computational biology, who has recently reported some fascinating work on a "higher level" genetic code, as well as research on predicting gene expression patterns from regulatory sequences in fruit flies.

I've already written at some length about Ehud Shapiro (also here); his recent work has centred on the construction of biological computing devices (known as automata) using DNA molecules and enzymes.

Yoav Soen's group is "using embryonic stem cells models to study how different layers of regulation interact to specify morphogenetic decisions, how these decisions are shaped by interactions between emerging precursors and how they are coordinated across a developing embryonic tissue." He has also worked with a colleague of mine, Netta Cohen at Leeds.

Representing activities in the UK, we have Cyrus Chothia from the Laboratory of Molecular Biology at Cambridge, who studies the "nature of the protein repertoires in different organisms and the molecular mechanisms that have produced these differences."

Jasmin Fisher is leading the new Executable Biology Group at Microsoft Research, and is primarily interested in systems/computational biology.

Mike Hoffman and Ewan Birney are at the European Bioinformatics Institute (EBI) in Cambridge, where Birney leads the EBI contribution to Ensembl. There's a transcript of an interview with him here.

Jaroslav Stark is the Director of the Centre for Integrative Systems Biology at Imperial College. He was recently interviewed for a piece on systems biology on BBC Radio 4's The Material World.

Michael Sternberg heads the Structural Bioinformatics Group and the Imperial College Centre for Bioinformatics. He was previously the head of biomolecular modelling at the Imperial Cancer Research Fund, now part of Cancer Research UK.

Perdita Stevens is at Edinburgh, where she works on software engineering and theoretical computer science (with a growing interest in modelling viral infection).

The meeting organisers are particularly keen to encourage the participation of young researchers, and the registration fee for this two-day event is a very reasonable £50 (£30 for students). To register and for further information, please contact Michelle Jacobs at Weizmann UK at post@weizmann.org.uk or on 020 7424 6860. Attendance will be limited to 180 delegates.