Progress must be one of the toughest things to sell, because in some very important respects, we're almost back to square one in technology.
For decades, the mantra from the developers of computers, chips and software has been to empower the user with compute power and free up the back-end server rooms. Now, because users want to be able to do everything on some sort of mobile Internet device–whether it's a BlackBerry, an iPhone or a netbook–the battery has to last at least through the day. That means more processing has to return to the back-end servers, or users have to plug in their devices.
It's true that handheld devices have more power and storage than mainframes did several decades ago. But with everything now digitized, including photos and video, there's more processing to be done and more data to store. So, in effect, you now have a slick, dumb terminal that can also do unusual things, like help you find your car in a parking lot or entertain you with some eyestrain-inducing games.
There's nothing wrong with dumb terminals, of course. It's just that they're, well, dumb. We've been told for decades that owning a dumb terminal is stupid. A notebook PC allows you to do really smart things locally; a dumb terminal requires a connection to the server to do anything except play games and perform minimal local processing. Even phone calls are done through a remote server somewhere.
On the server side, technology seems to have regressed even more. For all the talk about new technology like qubits, which can represent one and zero at the same time–something that's genuinely complicated and, so far, workable only at temperatures close to absolute zero–we're still using the regular old ones and zeroes. To put that in perspective, the concept was already in widespread use by 1890, in the punch cards of the U.S. Census.
Moreover, when electronic computers first became available, they were so expensive that people had to book time on them. Batch processing was a scheduled function on a computer for that reason. Virtualization and multicore chips are the modern-day ways of scheduling computing on processors when they're available. The scheduling is all done internally, but basically, it's the same thing.
The power to run these servers has gotten so expensive that it makes sense to use all the capacity, and the only way to do that is to run multiple applications on a single machine and schedule them to make optimal use of the processing resources. You don't have to fill out a chart for computer time anymore; there are plenty of computers these days. In fact, there are too many. So the goal of any IT manager is to raise utilization by scheduling more jobs on each machine.
That is not the end of the march backward. The loudest warning among IT managers two decades ago was that desktop computing would create enormous security problems. They were right, of course. The centralization of data–or at least important data–makes it easier to manage, even though the distribution of much of the data that used to be centralized makes a company more responsive and competitive.
But perhaps the biggest step backward is in how computing is arranged and prioritized when it is done centrally. The concept of a computing architecture was born when military experts discovered that if they arranged the desks in a certain way for people doing computations, those people could be far more efficient. That design was inspired by World War I battlefields.
In some respects, we've come a very long way, but in others, all we're doing is driving around in a circle–much, much faster. How are the marketers going to spin that?