
We’ve come full circle—again


Robert Dow
Jon wasn’t the only one to play with time-sharing computers; Bill Gates cut his teeth on them, too.

When I was a pup first playing with computers, the machines were big things with vacuum tubes and a single console, and you loaded programs using 3 × 7-inch punched cards.

The next phases were transistors and terminals. Then you could sit at a CRT-based terminal with a keyboard, enter code, and, in a big computer somewhere else, the work would get done. You could use punched tape or mag tape for program storage.

But you didn’t have that computer to yourself; you shared it with a dozen or more other users. It was called time-sharing, and it was the main method of computing until the late 1970s, when the minicomputer bubble occurred.

The minicomputer bubble, BTW, had exactly the same characteristics as the later Internet bubble: an explosion of suppliers (about 50 or 60), a consolidation, and a collapse of the industry. After the minicomputer, the PC came to life in the late 1970s.

Again, history repeated itself: the new, smaller machines were no match for the bigger minicomputers, so they weren’t taken seriously until IBM introduced “The PC” in 1981. And, even then, they were mostly used to try to run the bigger programs that were already running on minicomputers, so a PC software industry had to be invented.

All the while, Intel had gotten into and then out of the memory business in favor of processors; it got into the CPU biz by developing the 4004 for Busicom (and the 8008 for Datapoint). Gordon Moore had observed that transistor counts on a chip doubled on a steady schedule, an observation that became his law, and with it came the steady improvement in CPUs for PCs (and, of course, other things).

Now, flash forward 36 years, and we have dual-, triple-, and quad-core CPUs; super-fast, small, cheap, large-capacity disk drives; super high-speed LANs; and all kinds of wireless stuff. We have supercomputers in our laptops, for crying out loud.

And we still have the poor, the underserved and underrepresented, and the digital divide.

So it’s with a combination of delight and some amusement that I read about NComputing helping underprivileged (economically, that is, not potential-wise) countries and maybe U.S. communities cutting costs by using one of today’s amazingly powerful PCs in a time-sharing mode.

NComputing users plug a keyboard, a mouse, and a monitor into a little box that maintains a connection to one hub PC or server. Wires are necessary for now, but wireless links should be possible soon. NComputing doesn’t call this a terminal, but rather a thin client or, quite incorrectly, a workstation.

Software in the NComputing boxes gives each of the users an individual computing session—with different desktop appearances and different programs—even though all of them are sharing the central processor and hard drive in the hub PC. As a result, NComputing says each terminal can cost as little as $150 to $175, including installation, technical support, and the requisite hardware.
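Conceptually, the hub PC is just multiplexing independent user sessions over one shared processor and disk, which is exactly the classic time-sharing idea. Here's a toy sketch of that bookkeeping (hypothetical names throughout; this is an illustration of the concept, not NComputing's actual software):

```python
# Toy illustration of session multiplexing on one shared host:
# each terminal gets its own session state (desktop, programs),
# while CPU time is handed out round-robin among the sessions.
# Hypothetical sketch, not NComputing's implementation.
from dataclasses import dataclass, field
from itertools import cycle, islice

@dataclass
class Session:
    user: str
    desktop: str                                  # per-user appearance
    running: list = field(default_factory=list)   # per-user programs

class HubPC:
    def __init__(self):
        self.sessions = []

    def connect(self, user, desktop):
        # Each plugged-in terminal gets its own independent session.
        s = Session(user, desktop)
        self.sessions.append(s)
        return s

    def schedule(self, slots):
        # Hand out CPU time slices round-robin, the essence
        # of time-sharing: everyone shares one processor.
        return [s.user for s in islice(cycle(self.sessions), slots)]

hub = HubPC()
hub.connect("alice", "blue")
hub.connect("bob", "green")
hub.connect("carol", "plain")
print(hub.schedule(6))  # each of the three users gets two slices in turn
```

Each session keeps its own desktop and program list, while the scheduler shares out the one real CPU, which is why a single modern PC can comfortably serve a roomful of terminals.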

So we applaud NComputing and all of their customers who take advantage of this logical and economical solution (although we doubt AMD, Intel, or VIA will think it’s such a great idea), and we forgive them their misuse of terminology—these are time-sharing systems with remote terminals.
