

DATAMATION 40TH ANNIVERSARY

40 Years of IT History

The first issue of Datamation was published 40 years ago. In the past four decades, we've seen computing go from an obscure technical discipline to the heart and soul of business profitability.

By Paul A. Strassmann


Editor's Note: In 1957, 40 years ago this month, the first issue of Datamation rolled off the presses. In those four decades--nearly two generations!--this magazine has chronicled the development of our unique, complex, maddening, and fascinating industry.

This month, we take a quick look back over the last 40 years. To that end, we could think of no better observer than Paul A. Strassmann, who has analyzed, written about, and participated in IT since 1953. We have, therefore, arranged to reprint excerpts from Chapter 25, "A Historical Perspective," of his newest book, The Squandered Computer (The Information Economics Press, 1997, New Canaan, Conn., http://www.strassmann.com). To illustrate Mr. Strassmann's points, we have taken headlines, stories, and quotes from past issues of Datamation.


Mr. Strassmann begins his overview almost 70 years ago.

1930-1957: The machine accounting era

Much of this earliest cycle originated in increased government tax regulations that required the documentation of financial transactions. Centralization of control over logistics, the result of World War II controls, also required detailed accountability for physical goods. During this era, even senior executives with a lifetime of experience in marketing punched-card equipment were unable to understand the explosive demand for calculating devices.

Tabulating installations made possible the direct displacement of clerical labor, since government and corporate bureaucracies were unable to hire sufficient staffs to perform boring, low-paid work with high reliability. Actual savings were subject to after-the-fact audits. However, there was a limit to how much of the routine low-grade clerical labor could be displaced by machine accounting methods. Costs rose as organizations hired more administrators, analysts, and accountants.

October 1957

Research and Engineering Magazine repositions itself with the title Datamation. The new publication is widely seen as doomed to failure on the grounds that, with the possible exception of a few UNIVAC operators, no one would want to read about computers.


1957-1963: The mechanization cycle

The headquarters staffs of increasingly bureaucratized global corporations now expected to receive large amounts of detailed data from local operations. They also required the computing capacity to process the punched cards and paper forms that flooded into their mail rooms. The demand for data-processing devices accelerated. Even the head of the world's largest tabulating-equipment manufacturer underestimated the demand for stored-program computers by a wide margin. It was a qualitative change in the relationships between centralized and decentralized operations that motivated corporations to create datacenters at corporate headquarters.

This investment cycle took place almost exclusively under the auspices of financial executives who discovered that control over mechanization of financial records expanded their power enormously. Cost reduction was not the purpose of this cycle. It catered to the needs of the controllership function that used computers to enlarge financial executives' influence into every detail of production and marketing. Questions about the financial justification of information technology investments did not arise, because the financial executives were trusted to make the right decisions. Toward the end of this cycle, large numbers of labor-intensive and inefficient local computers started to feed data to the central machines. This offered opportunities for new computing manufacturers, especially where national origin made it politically expedient to set up firms in protected markets.

The tabulating installations were then renamed, becoming the data-processing departments. They fell under the supervision of a data-processing manager, who usually belonged to the Data Processing Management Association.

July 1962

In "A Survey of Computer Facility Management," p. 29, Charles M. Lawson, of System Development Corp. in Santa Monica, Calif., reports on the results of a survey of 30 IT officers in Southern California. He writes that centralization, as well as its opposite, is a major issue: "As for the issue of 'centralization vs. decentralization' of computers within a company, two to one predicted a trend toward consolidation of hardware. Several respondents predicted the growth of the satellite concept, wherein small-scale computers communicated with a larger centralized system."


1963-1969: The datacenter cycle

In this stage, the demand for computers expanded faster than the manufacturing capacity that produced them. Complex priority schemes assigned waiting-list positions for equipment orders. Corporate information technology budgets were rising at a rate of over 30% per annum. Consolidation of local marginal datacenters and massive reprogramming of applications inherited from the tabulating cycle now took place, the primary objective being the extension of the influence and control of the financial executives over every routine business transaction. Mainframe technology fit this approach perfectly, and a symbiotic relationship between IBM and the financial executives was formed. During this era, IBM maintained a price protection umbrella over all equipment costs, which simplified the investment decisions. One negotiated delivery schedules and technical support, not costs or budgets.

From the standpoint of corporate financial planning, the expenses for information technologies did not require much justification, as long as the data-processing function remained under the tutelage of the chief financial officer. As reflected in national economic statistics, investments in computers became noticeable, but not significant, for the first time.

December 1968

In his article, "Computers in the World of Real People," on p. 47, George Glaser, a principal of McKinsey & Co., writes about the lack of mutual understanding between managerial and other nontechnical corporate employees (real people) and the emerging IT professional: "[IT tells itself] plaintively that if management only understood us, it would love us.... Starting from that fuzzy assumption, various remedies are proposed and attempted. One popular approach is to urge managers to go to 'computer school.' Having seen executives without previous experience or interest in the computer get highly excited about their ability to write FORTRAN programs after a two-day course, we know computer programming can be a seductive task; so we resort to seduction. These courses are in fact an effective way to stimulate interest--but interest alone is not enough."


1969-1975: The time-sharing cycle

The near-totalitarian reign of central mainframes and finance-department dominance gave rise to intra-organizational conflict. Finance's control over all computers eroded as other departments, such as production, engineering, and marketing, discovered that the possession of information was synonymous with organizational power. The central information systems organization responded by instituting a new round of equipment acquisitions to preserve its hold on new technologies. It offered the nonfinancial organizations access to centralized computing through time-sharing of mainframe computers over slow telephone circuits. This was an attempt to preserve an untenable position by extending the reach of information technology through inappropriate technologies.

Some scientific, engineering, and, particularly, production departments responded to these costly and nonresponsive services by taking their business to independent information-service suppliers. The concept of outsourcing information services took hold for the first time.

The defectors from the clutches of the centralized financial establishment also started purchasing small-scale computers that offered on-line access to computing power through high-speed terminals over local cabling. This avoided dependency on expensive telephone circuits. The contest between centralized and decentralized computing stimulated the expansion of relatively sophisticated computing capacity at local levels. It also generated budget growth where the expansion of computing capacity escaped attention from corporate staffs. During this cycle, the need for integration between computing and telecommunication became apparent.

Toward the end of this era, questions about computer investments appeared on the agendas of executive committees. The usual reason was to settle disputes between competing groups, each of which claimed it could save money by gaining computing independence. Business executives came to recognize that the lifecycle support costs of information processing were much larger than the acquisition costs of computers as soon as each local site acquired its own staff, consultants, and suppliers.

With rising budgets and increasing equipment acquisitions, a few corporations started to promote their directors of data processing to vice presidents of information systems. In isolated instances, the shape of the future relationships became apparent when companies created captive information services divisions to serve internal needs, as if the company's computer users were customers. This was the beginning of the breakdown of one-time monopolies into service-oriented units.

January 1969

Robert V. Head, an author and software consultant, writes on the new order of computing in "Obsolescence in Business Organization and Management," p. 29. He warns, "...once information is collated in a common database and the processing of this information is approached from a company-wide standpoint, much of the rationale underlying presently defined organization boundaries can be seriously questioned."


1975-1981: The minicomputer cycle

At this point, the purchasing power for computing and information services started to migrate from centralized administration by the financial executives to the realm of the consumers of computing, where it disappeared into the cost of goods. This liberated enormous amounts of new discretionary spending. Operating executives now found it attractive to trade off expenditures for labor in favor of computerized automation. Local operations also acquired their own systems-development staffs or created relationships with consulting firms. This crystallization of enclaves of local competence acted as the catalyst for a chain reaction that propelled the expanding demand for computing.

The minicomputer investment cycle inaugurated the proliferation of computing capacity. It initiated the shift from disciplined mainframe computing to improvised computing whenever people could afford to purchase their own equipment. Prices of computing dropped rapidly, as competition became more intense and the costs of information processing shifted from the central processing unit, where there was little competition, to computer peripherals and to software, where new and hungry entrants made prices drop precipitously. In this phase of development, it became increasingly difficult to account for computer spending. Few directors of information systems could tell how much their firms spent on information technologies. Computing started to become embedded into a firm's goods-creation processes and therefore disappeared into accounts, such as manufacturing machinery or research.

Toward the end of this cycle it became possible to observe a phenomenon that would fully emerge only in the future: computer technologies would cease acting as the means for exercising control over individuals and their work. The image of the omnipresent "big brother" watching employees was mitigated by the discovery that the possession of computing power made local management feel more empowered. For the first time, computer technologies became subservient to the capacity of local managers and professional employees to satisfy their rising information-processing needs.

The emphasis on clever software ceased to be a priority concern. Increasingly wasteful uses of cheap computing offered affordable processing for even poorly designed applications. During this stage, the traditional formality of systems-development methodologies lost importance as experimental and interactive improvisations offered faster completion schedules. As a testimony to the attractiveness of these approaches, a flood of new orders for computing capacity was placed with computer vendors. It set the conditions for the next investment cycle, as employees had acquired the taste for the instant availability and personal ownership of the coveted computing resources.

October 1976

None other than Paul Strassmann, then at Xerox, writes about the development of IT in "Stages of Growth," p. 46. He notes: "When one views the extent of office automation, the rate of growth of the 'white collar' section in the economy, the increased complexity of information handling demanded by our society, and the high rate of inflation in labor rates while the cost of technology is dropping radically, it is hard to accept the idea that we have reached maturity in growth of office automation."


1981-1988: The microcomputer cycle

The overwhelming acceptance of the microcomputer by office workers is a phenomenon that hardly anyone foresaw. A business systems analyst, a financial controller, or a senior executive in the 1970s would have found it inconceivable that, within 15 years, approximately three-quarters of U.S. office workers would operate their own computers at their own desks.

The driving force behind the microcomputer investment cycle was the anxiety of office workers that they might become obsolete. Another strong motivation was the desire to get rid of the burdensome dependency on the monopolistic computer department. The microcomputer investment cycle reflects the shift from the dominance of computer experts to the defensive enthusiasm of the office workers. It was the office workers who wished to add computer skills to their resumes when large corporations began downsizing. In many respects, it was the professional workers who led the charge to acquire total computing independence. This was also the beginning of a transition from people acquiring computer expertise to computer software acquiring a better match with the capabilities of people. At this stage, the introduction of microcomputers took place without any pretense of financial justification, offering no possibility of even the most elementary verification of the claimed benefits.

The explosive nature of the microcomputer investment cycle introduced an element of discontinuity and surprise that is unprecedented in the history of technology. The firms that had dominated the earlier investment cycles, such as Burroughs, Digital Equipment Corporation, Honeywell-Bull, IBM, National Cash Register, and UNIVAC, showed enormous financial losses toward the end of this stage and dismissed a large share of their work forces.

The corporation that embodied the attributes of computing independence and local processing capacity better than any other, Digital Equipment Corporation, was unable to stretch its success from the minicomputer era into the microcomputer cycle. Its failure reflected a bias that could not conceive of the acquisition of a personal computer as being more a matter of psychological effectiveness than of engineering efficiency. Students will study the DEC case for years as an example of engineering ideology overtaking socioeconomic comprehension, and of how those misjudgments led an exceptionally successful organization down a path toward near-disastrous demise.

The personally operated computer offered an enormous expansion in accessibility to calculating power. However, from the standpoint of improving cooperation and coordination among office workers, the microcomputer inhibited communication. Increasingly, it came to be recognized by management as a source of spreading chaos. Nobody could determine the payoff from the microcomputer investments made during this cycle since corporate overhead expenses rose faster than revenues and profits. The generous spending on personal computers did not produce convincing economic evidence, other than glowing anecdotes, that organizations became more productive.

Those that gained from the increasing confusion were the computer professionals, consultants, and sellers of prepackaged solutions. The revenues of the consulting firms rose four times faster than sales of computer equipment. The revenues of computer services firms grew twice as fast as those of the consultants. The sales of shrink-wrapped software topped the growth rates of everything else in the computer industry while realizing gross margins exceeding those earned from luxury goods.

To cope with the proliferation of choice and the huge budget increases, corporate management now elevated its principal computer executive to corporate rank, with the title of chief information officer.

February 1, 1988

As the minicomputer market changed radically with the move into business computing, major minicomputer vendors had to make enormous changes in direction and technical outlook. Some could and some could not. Gary McWilliams, in "Can Digital Stay On Track?" p. 52, writes of DEC: "Can this 30-year-old powerhouse, so well accepted in the scientific and engineering communities...become the principal engine driving corporate computing?" The answer, alas, would be a resounding no.


1988-1995: The client/server investment cycle

The client/server concept of organizing information services arrived as a reaction against the unmanageable proliferation of stand-alone computers. It did not arise, as was claimed, from the need to replace expensive mainframe computing with a more distributed computing architecture. The evidence is now overwhelming that mainframe computing is, in fact, more economical than its highly acclaimed server substitute.

In terms of intracorporate politics, it was not prudent to promote this investment cycle as a way of curbing microcomputer independence. Employees enjoyed their newly gained computing independence too much to sanction such an attack. However, something had to be done to restrain mushrooming local initiatives. Excessive local autonomy inhibited attempts to interconnect individual microcomputers and did not remedy the burdens of incompatible, noninteroperable, insecure, and costly local area networks.

The generous funding for the client/server investment cycle, without much economic justification, was a counterrevolutionary response to the microcomputer uprisings against the central computer establishment. Unfortunately, instead of delivering lower costs of computing through consolidation of local networks under a network administrative discipline, the client/server architecture turned out to be more expensive and less reliable.

For example, the vendors of client/server architectures, who have claimed for the last five years to offer a low-cost alternative to mainframe computing, will have to recant. In a recent survey of 225 key IT managers, 53% of respondents reported that their client/server projects were over budget, late, and had fewer features than specified. The survey also found that 31% of all client/server projects were canceled outright. Nobody would ever fly an airline where 31% of flights crashed and 53% did not arrive at the scheduled destination.

A major contributor to the rising expenses for client/server computing was the propensity of its supporters to specify the latest technologies for all new applications. This had, and still has, the unfortunate result that obsolescence sets in soon after the initial installation takes place. The proponents of the best and latest equipment choices rarely, if ever, included the costs of upgrades in their project lifecycle projections. Yet the costs of upgrades continue to rise steadily as the technological life of equipment and software shrinks. For instance, a usually reliable source of financial data estimates the cost of upgrading a client/server configuration to a Windows 95 environment at as much as $18,000 per server. The average cost of upgrading a single local PC from Windows 3.1 to Windows 95 ranges from $360 to $960. The estimated cost to the U.S. economy of proceeding with the Windows 95 upgrades would then be at least $20 billion.
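
These upgrade figures invite a quick sanity check. The sketch below, in Python, recomputes an economy-wide bill from the per-unit costs quoted above. The per-PC and per-server prices come from the text; the installed base of 40 million business PCs and the ratio of 50 clients per server are purely illustrative assumptions, not numbers from the article.

    # Back-of-the-envelope check of the Windows 95 upgrade estimate.
    # Per-unit costs come from the text; the installed-base figures
    # below are illustrative assumptions, not from the article.

    PC_UPGRADE_LOW = 360      # $ per desktop, Windows 3.1 to 95 (from text)
    PC_UPGRADE_HIGH = 960     # $ per desktop, upper bound (from text)
    SERVER_UPGRADE = 18_000   # $ per server (from text)

    def upgrade_bill(num_pcs: int, num_servers: int, per_pc: int) -> int:
        """Total upgrade cost for a given installed base."""
        return num_pcs * per_pc + num_servers * SERVER_UPGRADE

    pcs = 40_000_000          # hypothetical U.S. business PC base
    servers = pcs // 50       # hypothetical: 50 clients per server

    low = upgrade_bill(pcs, servers, PC_UPGRADE_LOW)
    high = upgrade_bill(pcs, servers, PC_UPGRADE_HIGH)
    print(f"${low / 1e9:.1f}B to ${high / 1e9:.1f}B")  # $28.8B to $52.8B

Under these assumptions, the article's "at least $20 billion" reads as a conservative floor.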

The personnel support costs for loosely organized distributed computing far exceed any conceivable reductions in the costs of processing equipment. In favor of the client/server architecture, however, is its capacity to improve response times to inquiries. The increased independence of microcomputer operators from the disciplines of the central regime also allows the quick development of local adaptations, modifications, and enhancements of standard applications, now readily available at low cost as off-the-shelf software packages. The greatest gain from the client/server environment, though, is its facility for permitting a great deal of local experimentation that leads to innovative computer solutions. With the increased attainment of computer literacy, microcomputer customers now demand instant network feedback and application flexibility: superior network performance, rapid database access times, and the instant response to keystrokes that they became accustomed to when they had stand-alone desktop machines under their complete control.

To obtain these conveniences, someone must pay a higher price for technology, support services, and training. For instance, the estimated first-year costs for a mid-size client/server network of 200 clients are $150,000 for training, $32,850 for customer education, and $87,000 for support-staff training. That adds up to $1,350 per client per year, or at least 50% more than the first-year depreciation of hardware and software.
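
The per-client arithmetic follows directly from the numbers just quoted; a quick check in Python, using only the dollar amounts from the text:

    # First-year costs for a 200-client network (all figures from text).
    training = 150_000               # client/server training
    customer_education = 32_850
    support_staff_training = 87_000
    clients = 200

    total = training + customer_education + support_staff_training
    print(total, total / clients)    # 269850 1349.25 -> about $1,350 per client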



Today and tomorrow

From here, Mr. Strassmann moves to the present, which he sees as a transition period moving toward Web and intranet-based computing, and looks to the future of computing. He provides us with a warning against those who believe that computing technology will inevitably lead to better, freer, and more humane societies:

What may be politically acceptable under the guise of efficiency and simplicity could easily be perverted into computer-perfected monitoring of what people eat, where they are at any moment, what their habits are, and with whom they communicate. Government-sanctioned elimination of all cash in favor of electronically monitored transactions may be convenient for the consumer and profitable for the credit-card companies. It may remove many of the obstacles to law enforcement in cases involving tax evasion, fraud, illegal payments, corruption, and information terrorism. However, all of these advantages may be overshadowed by the specter of intrusions on privacy and defenselessness against financial penalties imposed by a hostile government. I believe that finding a balance between efficiency and the protection of constitutional rights will become one of the most burning issues of the information age.

