Why we need a new kind of CS for the 21st century,
and why it's the same as the old kind.
1. The business model of the entire computing industry
is based on a fundamentally unsustainable premise:
a form of pyramid selling,
quintessentially late 20th century.
The top of the pyramid is the ever faster,
ever bigger infrastructure:
"big iron" servers and routers and data centres,
and the associated large oligopolistic service providers.
These encourage profligate power consumption,
require absurd heat dissipation, and are centralised in a risky way
(risky in terms of safety, sustainability, security and sanity).
They also require massive investments in ever faster
switches and processors, disks, memory and software bloat.
The investment cycles do not follow a smooth curve,
but come in ever larger step functions -
e.g. building a new plant for a new fab for
a new chip now looks close in cost to
getting a new drug to market -
only how much less useful?
Ditto operating systems (viz. Vista).
Ditto middleware. Ditto any research programme in CS
likely to have an impact on practice.
2. The consumer end of the pyramid (the base)
requires ever increasing numbers of tiny devices running on batteries.
It started with the switchover from PCs to laptops;
now it is PDAs and cell phones, and eventually
thousands of ubiquitous sensors and mote processors
per person and per car, many battery powered,
and often toxic to the environment,
with no model of recovery/recycling - or even,
actually, a good cost/benefit requirements analysis!
3. Fundamentally, Moore's law is the wrong model, and
systems like the Internet, peer-to-peer, and cheap disks encourage
the uptake of Metcalfe's law with a vengeance.
Of course, neither Moore's nor Metcalfe's law
is actually a true invariant/universal;
they are just observations of the value of
a particular combination of
scale through integration at the technical level
and a growth model of the economy requiring
sustenance through ever newer replacement tech.
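As a sketch of why the two "laws" compound so differently, here is a toy comparison, assuming their usual textbook formulations (transistor counts doubling every ~2 years; network value growing as n^2) - neither of which, as noted, is a true invariant:

```python
# Toy formulations of the two "laws" (assumed textbook versions,
# not invariants: Moore = doubling every ~2 years,
# Metcalfe = network value proportional to n^2).

def moore(transistors0, years, doubling_period=2.0):
    """Transistor count after `years`, doubling every `doubling_period` years."""
    return transistors0 * 2 ** (years / doubling_period)

def metcalfe(n):
    """Relative 'value' of a network of n nodes (proportional to n^2)."""
    return n * n

# A hypothetical 10-million-transistor chip, projected forward a decade:
print(f"Moore: {moore(10e6, 10):.0f} transistors")  # 2^5 = 32x the count

# Doubling a network's user base quadruples its notional value:
print(metcalfe(2_000_000) / metcalfe(1_000_000))  # prints 4.0
```

The point of the sketch is only that one curve describes supply (chips get denser) and the other demand (connected things get more valuable), so the two feed each other - which is exactly the replacement-driven growth model described above.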
The embedding of these "laws" leads to many
examples of bad practice in terms of sustainability...
Some modest ones: the Web consists of
many multiple Libraries of Congress' worth of storage,
and the Internet's worth of computers amounts to
multiple millions of Cray-2s' worth of CPU,
when actually we haven't added much to the global pool of knowledge post-Web,
and 99.999999% of those 1 billion CPUs are idle at any time!
Ditto the 2.5 billion cell phones,
which have now reached a mean lifetime of 18 months - sure, this allows
deployment of new tech in phones without due regard for legacy support,
for the moment. But it also means that by 2008 we will be disposing of
10^9 phones per year. That doesn't sound like something that can carry
on very long. Intel (and AMD etc.), ARM, Nokia, Sony Ericsson
and the rest are all going to be in very, very bad trouble in short order.
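The 10^9 figure is just steady-state replacement arithmetic - a back-of-the-envelope check, taking the installed base and mean lifetime quoted above:

```python
# Back-of-the-envelope check: if an installed base of handsets is
# replaced on average every 18 months, the annual disposal rate at
# steady state is roughly base / mean lifetime.

handsets = 2.5e9        # installed base of cell phones (figure from the text)
lifetime_years = 1.5    # 18-month mean lifetime (figure from the text)

disposed_per_year = handsets / lifetime_years
print(f"~{disposed_per_year:.2e} phones disposed of per year")  # order 10^9
```

So even without any growth in the installed base, the 18-month churn alone puts disposal in the region of a billion and a half handsets a year.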
What will this mean for the rest of the pyramid?
Well, it also obviously means that Microsoft
(and Symbian and the like) will be in bad trouble,
as the next-but-one deployment of PCs might not happen in any
significant way at all; then the post-Vista OS
(and anything else that represents bloat and an inability to
shift to some new steady state without runaway expansion of
memory/storage/power needs/battery replacement duty cycle, etc. etc.)
will be very, very dead in the water (the analogy works well:)
What does this then mean for computer science research?
I think it's a mixed bag.
1. Fundamentally, an industry that moves to a steady state moves to
process optimisation rather than innovation - so industry investment in
serious research might drop, although in some sense process
optimisation is potentially something that fundamental computer science
can do a lot with (correct software would be an example).
2. On the other hand, the resources available at that steady-state
point will be pretty staggering, so exploiting them well will be an
interesting systems task.
3. What about alternatives to the high-end power/heat problem
and the low-end smart dust/pollution problems? A good topic for the DTG...
Of course, there are those who believe technology
can work around the problems and continue "exponential" growth.
In Snow Crash, there's an image of the difference between the
Sumerians and the Egyptians, both of whom sustained large-scale
civilisations that lasted considerably longer than the Euro/US hi-tech
one has done to date, and I don't know why, but the image seems
informative in some way. The Sumerians built out of clay, and wrote on clay
which was then baked to preserve it, and which was abundant in the
Tigris/Euphrates flood plains. The buildings
washed away, but the baked tablets lasted for ever. The Egyptians built
out of stone, but wrote on papyrus. The buildings are still there, but
the papyrus rotted. Don't ask me what the analogy for
Mac OS X on a PowerPC versus Vista on an x86 is, as I haven't got a clue.