[geeks] fwd: IBM supercomputer dual-boots Windows and Linux

Jonathan C. Patschke jp at celestrion.net
Sat Jun 21 18:27:44 CDT 2008


On Sat, 21 Jun 2008, Mark wrote:

>> By your definition, my Mac Pro is not "a computer", it's several
>> combined together.
>
> Modern computers haven't been singular computing devices for over a
> decade.  3D cards, Sound processors, Power Managers, smart Disk
> Controllers, all have been around for ages and all are their own
> computers within a computer.

This is splitting hairs.  If your definition of a "computer" is limited to
a device containing exactly one component capable of executing a stored
program of any sort, such a thing hasn't existed since the 1950s.  My
toaster oven likely has some tiny microcontroller in it just because ADCs
and sensors are cheaper and more reliable than mechanical thermostats, but
we wouldn't call it a computer.

Modern computers have attached processors.  Older computers also had
attached processors.  Even the programmer's console on many PDPs had an
i8008 to control it.  There even existed microcoded -terminals- as far
back as 30 years ago.  Nearly any peripheral device has to have some sort
of embedded control program to be useful.

The fact is that a power manager or a disk controller is no more visible
to the end-user (and, in many cases, the programmer) as an independently
programmable entity than the embedded i8008 in the front-panel of an
11/34.

This industry tends to oscillate between concentrating computing power
around a specific resource (memory bandwidth, CPU power, thermal concerns)
and redistributing it again.  With every tremendous leap forward in
CPU power, we try to centralize operations because there's "no way" the
little peripheral devices will ever be able to catch up.  With every huge
leap forward in memory bandwidth, we put more things on the processor-
memory interconnect because there's "no way" the CPU would ever be able to
need that fat a pipe to RAM. Between those leaps, the power bleeds
outwards into attached processors as the technological advances made in
microprocessor technology trickle down to other sectors of the industry.

Possibly the best example of this is RISC design, where the organization
of the CPU and massive memory-mapped I/O were all built on the assumption
of a clock ceiling of 200MHz or thereabouts, where memory accesses would be
as cheap as clock ticks.  Thankfully, process improved, PLLs got better,
differential signalling hit the processor-to-memory interconnects, and
someone got the bright idea of triggering on both edges of the clock
pulse.  Now we're clocking CPUs at 5GHz and buses at nearly a quarter
that, and we're back to designing networks of tightly-coupled attached
processors (northbridges, 3D accelerators) that share ports to memory and
fan out to slower attached processors (physics processors, codec
accelerators, network controllers, storage controllers).

If nothing else, all this serves to illustrate that the rapid pace of
change in information technology makes useless any sort of concrete
terminology describing the arrangement of any particular architecture.
Trying to etch
the term "computer" into stone doesn't make any more sense in this age of
processors all over the place than it did in the age of processors so
crude that their registers might be in physically separate relay racks.

-- 
Jonathan Patschke | "There is more to life than increasing its speed."
Elgin, TX         |                                   --Mahatma Gandhi
USA               |


