Monday, May 31, 2010
Not a great deal to report this week, as I've been suffering from a particularly painful seasonal disorder (i.e. marking). The delay to our Madrid trip due to Icelandic intervention was a blessing in disguise, I think, as it allowed me to clear the decks of a load of scripts before jetting off to give three afternoons of lectures at the Universidad Politécnica de Madrid. If we'd gone when we'd originally planned, the scripts would have been sitting there in my study at home, a distant yet malign cloud hanging over the trip.
Arrived in Madrid yesterday, after a relatively painless flight from Liverpool with EasyJet. It was all going too well, however; on arrival at the hotel, our daughter ran towards a display of flowers in the lobby, caught her foot on a rug and went face-down onto a table. She cut her eye quite badly, but she's a hardy little thing, and was back on top form today.
I gave my first set of lectures this afternoon/evening, as the guest of Alfonso Rodríguez-Patón. He's the "Madrid node" of our BACTOCOM project, and kindly invited me to teach some of their postgraduates (others involved this year include Christof Teuscher, Milan Stojanovic and Friedrich Simmel, who's also involved with BACTOCOM). I'm here to talk about "molecular and cellular computing"; today was motivation and historical background, a bit of biology and an overview of Adleman's experiment. Tomorrow is formal models of DNA computation followed by self-assembly and DNA origami. The final set of lectures on Wednesday will deal mainly with synthetic biology, so I hope Fritz has left me something to talk about.
Monday, May 24, 2010
Weeknote #2 (w/e 23/5/10)
It's been a big week for synthetic biology, with the announcement by Craig Venter that he'd succeeded in creating a "synthetic cell". My previous post describes my take on the technical aspects of his achievement; it's not entirely accurate to call it a "synthetic cell", since they used existing cells as the recipients (that is, it was only the genome that was synthetic). It's more like "genomic transplantation" with de novo sequences. Technically challenging, but not the earth-shattering breakthrough that it's being sold/hyped as. They certainly didn't turn "inanimate chemicals into a living organism".
My own little piece of press coverage looked pretty low-key by comparison. I was interviewed ages ago by Louise Tickle for the Education section of the Guardian, and the story finally appeared last week.
This week, members of my group (specifically, Pete and Naomi) contributed to an event hosted by MMU. I'm a Director of ArcSpace Manchester, a Community Interest Company set up to support creative and ethical exchange, and on May 19th we held a video conference with collaborators in São Paulo, Brazil, to discuss "eco-techno" and public engagement. Unfortunately, other commitments meant that I was unable to attend either in person or in the form of an avatar, but my co-director, Vicky Sinclair, wrote up the event.
On the work front, I've been busy marking projects and exam scripts, although I did also submit this conference paper.
Friday, May 21, 2010
Team Venter's synthetic cell, explained
I've been asked to comment on this week's news that Craig Venter's team have succeeded in building a "synthetic living cell" (you can read the full paper, for free, here), so I thought it might be useful to write a short post to explain just what they've achieved.
Cells may be thought of as biological "wetware", in the same way that the physical components of a personal computer (hard drive, processor, memory, etc.) form the "hardware". A computer can't work without an operating system: the central controller program that runs in the background, coordinating the various activities of the machine. Most people use Windows as their operating system, although there are others, such as Ubuntu Linux and Mac OS X. Similarly, a cell cannot survive without a working genome: the collection of genes that control and influence an organism's internal operation.
The core kernel (i.e. the central "brain") of the Ubuntu Linux operating system running on my netbook is roughly 4 megabytes in size, which is about four times the size of the genome of Mycoplasma mycoides. This is a bacterial parasite found in cattle and goats, and it was selected by Venter and his team because (a) it has a relatively small genome that has been fully sequenced, and (b) it grows more quickly than bacteria they've used in the past.
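As a quick sanity check on that comparison, here's the back-of-the-envelope arithmetic (a rough sketch in Python; the one-byte-per-base encoding and the ~1.08 million base pair genome length are my approximations for illustration):

```python
# Back-of-the-envelope comparison of a small OS kernel and the
# M. mycoides genome, treating the genome as plain text with one
# ASCII character (byte) per base. Both figures are approximate.

KERNEL_BYTES = 4 * 1024 * 1024   # ~4 MB kernel
GENOME_BASES = 1080000           # ~1.08 million base pairs

genome_bytes = GENOME_BASES      # one byte per A/C/G/T character
ratio = KERNEL_BYTES / genome_bytes

print(f"Genome as text: {genome_bytes / 1e6:.2f} MB")   # ~1.08 MB
print(f"Kernel is roughly {ratio:.1f}x the genome")     # ~3.9x
```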
Venter and his team have created an entirely synthetic copy of the genome of M. mycoides, which they then inserted into a related bacterium, M. capricolum. This new genome was "booted up" by the recipient, which then started "running" the new genetic program.
Importantly, the synthetic genome was completely pristine, in the sense that it had not been physically derived in any way from existing genetic material. Standard genetic engineering splices short synthetic sequences into existing, "natural" DNA sequences, but Venter's "synthia" genome was created from scratch. It's the equivalent of taking the known binary sequence of a small operating system kernel, typing it into a text editor in small chunks, combining the chunks together into one big file, and then using it to boot up a PC. At no stage was the "new" kernel physically derived (copied) from a version stored on CD or DVD, or downloaded from the 'net.
Venter's team used a DNA synthesizer to piece together the A, G, C and T bases to form brand-new building blocks, which were then stitched together into a single sequence. This is the key technical achievement of the paper - a strategy for assembling an entire genome, from scratch, using synthetic components, and getting it "running" in a host cell. It's important to note that it was only the genome that was synthetic; the recipient cell was a pre-existing, "natural" bacterium.
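To see the shape of that strategy in software terms (an analogy only; the assemble function, fragment sizes and group-of-ten staging below are invented for illustration, not the actual laboratory protocol):

```python
# Illustrative sketch only: hierarchical assembly of many short
# synthetic fragments into one long sequence, joining groups of
# ~10 pieces at each stage. The real work joins overlapping DNA
# fragments biochemically (largely by recombination in yeast);
# string concatenation just mirrors the divide-and-conquer shape.

def assemble(pieces, group=10):
    """Repeatedly join groups of pieces until one sequence remains."""
    while len(pieces) > 1:
        pieces = ["".join(pieces[i:i + group])
                  for i in range(0, len(pieces), group)]
    return pieces[0]

# 1,000 toy "cassettes" stand in for chemically synthesized fragments.
cassettes = ["ACGTACGTAC"] * 1000
genome = assemble(cassettes)   # stages: 1000 -> 100 -> 10 -> 1
print(len(genome))             # 10,000 bases in the toy genome
```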
This breakthrough is significant in that it demonstrates the feasibility of large-scale whole-genome transplantation, which will be an important component of the emerging field of synthetic biology. However, the real challenge lies in gaining a systems-level understanding of how even simple genomes operate, so that they may be fundamentally (re-)engineered.
Science has opened up a forum for posting questions, which will be answered later today by news writer Elizabeth Pennisi and philosopher and scientist Mark Bedau.
Update, 21/5/10, 11:13: Corrected kernel size assertions; Windows kernel is much larger than previously thought.
Monday, May 17, 2010
Weeknote #1 (w/e 16/5/10)
In an effort to blog more regularly, I've decided to adopt the Weeknote model of short seven-day updates on what's been going on.
The weekend was dominated by my inability to leave the country; I was due to fly to Madrid to give a series of lectures on molecular and cellular computing to Masters and Doctoral students at the Universidad Politécnica de Madrid. It was also an opportunity to take a couple of days of much-needed time with my wife and daughter, who'd be travelling with me. As the airspace in Northern Ireland had already been closed, we checked the status of the flight before we set off for Liverpool Airport. Everything was OK, but by the time we got there a couple of hours later, they'd shut down. A maudlin hen party, wearing mandatory pink fluffy stetsons, were told that the next available flight was on Thursday; we just returned home, where I quickly rescheduled the lectures for two weeks' time. My host, Alfonso Rodríguez-Patón, was incredibly understanding and helpful, managing to book a new hotel for us, despite the fact that my new schedule coincides with a major festival on the Thursday (making hotel rooms extremely scarce).
Another significant event this week was the Future Everything festival, which was (if you read the various reviews and tweets) wildly successful. I contributed to a panel discussion on New Creativity, which also featured Anab Jain, a TED Fellow who talked about her Power of 8 project, Kerenza McClarnan of Buddleia, who's facilitating artist-led enquiry into urban spaces, and Adrian Hon of award-winning games company Six to Start, who talked about the purpose of play. It was a fascinating session, with a lot of dynamic connections made between the panelists (none of whom really knew anything in advance about what the others would say). The session was recorded, so I'll post a link if and when the video is made available.
In mid-week we had our latest brainstorming away-day for our Bridging the Gaps: NanoInfoBio (NIB) project. This is a two-year initiative, supported by the EPSRC, to encourage cross-disciplinary research within MMU (with specific focus on the life sciences/engineering/computing/maths/nanotechnology interface(s)). We're almost ten months into the project now, and are beginning to develop a coherent set of themes around which we can coalesce. We're giving out a few project grants of £25K in order to bootstrap small feasibility studies, so we arranged an afternoon at a Manchester hotel to generate some ideas. Experience has shown that it's best to get everyone away from the distractions of email, and the temptation to "just pop back to the office", and I think everyone was happy with how it went. Rather than dividing everyone into groups, as might seem natural, we first performed a general "audit" of possible project ideas (this first pass generated 12), and then "drilled down" as a whole group to examine each idea in turn. Only once a page or so of flip-chart paper had been filled for each project did we split up to go over the fine details of costings and so on. The group-level discussion led to some surprising contributions, which would have been lost if we'd split up too quickly. I think it worked.
Tuesday, May 11, 2010
The need for hacking
The following post is a lightly-edited version of an article I've just had published in the Spring 2010 issue of MMU's Success magazine:
The word "hacker" has, in recent years, acquired an unfortunate and perjorative meaning. The media portrayal is of a pale-faced teenage boy (for they are invariably male) crouched over a keyboard in a fetid room, determined to make their mark on the world through cyber-vandalism or malware scams. My teenage years were partly shaped by the movie WarGames, in which an inquisitive youth accidentally triggers the countdown to armageddon by wandering into a US military computer, while the recent case of the "UFO hacker" Gary McKinnon has merely reinforced the "misfit" stereotype.
Hackers are almost universally despised by mainstream commentators, and yet the infrastructure on which all of us rely (mobile phones, computers and the internet) would not even exist in its current form were it not for them.
The original hackers were the pioneers of the electronic age, when the term simply meant "one who hacks". A hack, back then, was just a clever or "pretty" solution to a difficult problem, rather than an attempt to gain unauthorised access to a system. These early hobbyists and developers created the first microcomputers, as well as the foundations of the global information network.
One of the key principles of the hacker ethic (as described in Steven Levy's book Hackers: Heroes of the Computer Revolution) is that the best computer system is one that may be inspected, dissected and improved upon. When I started programming back in the 1980s, games were often distributed as listings printed in magazines, which had to be typed in before playing. By messing around with this code, I picked up various tricks and learned important new techniques. As my programs became more sophisticated, I had to get "under the bonnet" of the machine and interact with the computer at a fundamental level. The so-called "hard skills" that I learned in those early years have stayed with me ever since.
Modern teaching increasingly promotes the "soft skills" agenda, such as the need for team-working, communication and negotiation. Whilst these abilities are undoubtedly important, we also need to protect and promote technical content. I wouldn't want a mechanic delving under the bonnet of my car if all he or she had ever done was change a tyre or top up the screen-wash, even if they did describe themselves as a personable, motivated team-player...
Computers now take many forms (consoles, phones and PCs, for example) and they're increasingly viewed as sealed appliances, intended for gaming, chatting or browsing. Members of tomorrow's workforce are immersed in social networking, app downloads and file sharing, but they often lack the fundamental knowledge that can only come from (either physically or metaphorically) opening up the box and tinkering with its insides. By that, I mean the acquisition of the technical insights and skills required for a person to become a software producer, rather than simply a consumer of apps. Innovations such as mobile and cloud computing mean that hard skills are more important than ever, as the digital infrastructure becomes ever more firmly rooted in our day-to-day lives.
The beauty of the situation is that these skills are no longer the sole domain of computing professionals. The availability of modern computers means that we are ideally placed to develop the next hacker generation, capable of creating ingenious applications and web-based systems. We need to return to the playful principles of the original hackers, by promoting programming as a recreational activity. Modern software packages such as Alice allow us to teach complex concepts almost by stealth, through the medium of computer animation. Open-source operating systems encourage tinkering, and mobile app development is now a legitimate career path. The new generation of twenty-first-century hackers may well be digital natives, but they first need to learn to speak the language.
The word "hacker" has, in recent years, acquired an unfortunate and perjorative meaning. The media portrayal is of a pale-faced teenage boy (for they are invariably male) crouched over a keyboard in a fetid room, determined to make their mark on the world through cyber-vandalism or malware scams. My teenage years were partly shaped by the movie WarGames, in which an inquisitive youth accidentally triggers the countdown to armageddon by wandering into a US military computer, while the recent case of the "UFO hacker" Gary McKinnon has merely reinforced the "misfit" stereotype.
They are almost universally despised by mainstream commentators, and yet the infrastructure on which all of us rely (mobile phones, computers and the internet) would not even exist in its current form were it not for the hacker.
The original hackers were the pioneers of the electronic age, when the term simply meant "one who hacks". A hack, back then, was just a clever or "pretty" solution to a difficult problem, rather than an attempt to gain unauthorised access to a system. These early hobbyists and developers created the first microcomputers, as well as the foundations of the global information network.
One of the key principles of the hacker ethic (as described in Steven Levy's book Hackers: Heroes of the Computer Revolution) is that the best computer system is one that may be inspected, dissected and improved upon. When I started programming back in the 1980s, games were often distributed as listings printed in magazines, which had to be typed in before playing. By messing around with this code, I picked up various tricks and learned important new techniques. As my programs became more sophisticated, I had to get "under the bonnet" of the machine and interact with the computer at a fundamental level. The so-called "hard skills" that I learned in those early years have stayed with me ever since.
Modern teaching increasingly promotes the "soft skills" agenda, such as the need for team-working, communication and negotiation. Whilst these abilities are undoubtedly important, we need to protect and promote technical content. I wouldn't want a mechanic delving under the bonnet of my car if all he or she had ever done was change a tyre or top up the screen-wash, even if they did describe themself as a personable, motivated team-player...
Computers now take many forms (consoles, phones and PCs, for example) and they're increasingly viewed as sealed appliances, intended for gaming, chatting or browsing. Members of tomorrow's workforce are immersed in social networking, app downloads and file sharing, but they often lack the fundamental knowledge that can only come by (either physically or metaphorically) opening up the box and tinkering with its insides. By that, I mean the acquisition of technical insights and skills required in order for a person to become a software producer, rather than simply a consumer of apps. New innovations such mobile and cloud computing mean that hard skills are more important than ever, as the digital infrastructure becomes ever more firmly rooted in our day-to-day lives.
The beauty of the situation is that these skills are no longer the sole domain of computing professionals. The availability of modern computers means that we are ideally-placed to develop the next hacker generation, capable of creating ingenious applications and web-based systems. We need to return to the playful principles of the original hackers, by promoting programming as a recreational activity. Modern software packages such as Alice allow us to teach complex concepts almost by stealth, through the medium of computer animation. Open-source operating systems encourage tinkering, and mobile app development is now a legitimate career path. The new generation of twenty-first century hackers may well be digital natives, but they first need to learn to speak the language.