According to Steve Furber, "what is taught at school is at a fairly basic level, and those who already have an interest in computing are already way ahead of that in what they’ve done at home. What schools are presenting as ICT as an academic subject is very mundane compared with what students know they can do."
This, I think, is the central problem, and I believe it is caused by the coupling of computer science and ICT.
I have a long-standing collaboration with Professor Dave Hodgson, a very well-respected microbiologist. When another (nameless) collaborator from computer science used to refer to Dave as "our chemist", Dave would get his own back by referring to us as "the IT guys". As the renowned theoretician Edsger Dijkstra once said, "computer science is no more about computers than astronomy is about telescopes."
The point is that computer science is about abstraction, the study and application of algorithms, problem-solving, and deepening our fundamental understanding of information processing (in all its forms). ICT, on the other hand, is concerned with the use and application of pre-existing software to solve well-defined tasks (e.g. build a database, plan a budget, design a poster).
With ICT skills now almost mandatory for many spheres of work, it's clear that this subject should be part of the core curriculum at secondary level. However, if we are to encourage the next generation of computer scientists, they need to be able to develop their own particular skills and interests, which (as Furber indicates) are often far beyond the current subject matter.
My proposal is this: we need to separate ICT from computer science, and offer them as different subjects.
I recently blogged on this subject, and my post was followed by a nice related article in the Times Higher.