
Situation wanted

| continued from "Of men and machines" |

Productivity declines

The big lie in technology is that it increases productivity. It doesn't: Productivity as measured by the U.S. Bureau of Labor Statistics has been, and is still, declining. Critics such as Paul Strassmann, author and CEO of the Software Testing Assurance Corp. in Stamford, Conn., have done the analysis and come away with the firm conclusion that computers have not increased productivity in any measurable way. In a book excerpt available at his web site, Strassmann looks at the banking industry because it has invested disproportionately more in information technology than any other segment of the economy and because IT is the largest cost center in a financial institution. All those automatic teller machines and online banking systems should make the average bank a lot more productive. Right?

Paul Strassmann: productivity prophet

Nope. Strassmann writes: "If information technology increased the productivity of the U.S. banks' personnel, then one would find a declining ratio of staff expense for each dollar of revenue. Such an improvement would come from an effective substitution of more productive computers for less productive administrative labor.... In 1989 one dollar's worth of staff expense supported $7.50 worth of revenue, whereas seven years later--after a period of sustained computerization--the same dollar supports only about $5.30 of revenue."

Strassmann goes on to examine the effect of computerization on noninterest expenses: buildings, payrolls, the sort of things that should be reduced when branch offices are closed and ATMs are brought in. The results? Strassmann concludes: "In 1989 one dollar's worth of non-interest expense supported $3.65 worth of revenue, whereas seven years later--after a period of sustained computerization--the same dollar supports only $2.80 of revenue."
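Strassmann's two ratios imply sizable declines, which are easy to work out from the figures he quotes. A quick sketch of the arithmetic (the dollar figures are Strassmann's; the percentage calculation is ours):

```python
# Revenue supported per dollar of expense, from Strassmann's figures.
staff_1989, staff_1996 = 7.50, 5.30        # per dollar of staff expense
nonint_1989, nonint_1996 = 3.65, 2.80      # per dollar of non-interest expense

def pct_decline(before, after):
    """Percentage drop in revenue supported per expense dollar."""
    return (before - after) / before * 100

print(f"Staff-expense productivity fell {pct_decline(staff_1989, staff_1996):.0f}%")
print(f"Non-interest productivity fell {pct_decline(nonint_1989, nonint_1996):.0f}%")
```

By this measure, each dollar of bank staff expense supported about 29% less revenue in 1996 than in 1989, and each non-interest dollar about 23% less.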

If banks aren't more productive because of the miracle of computers, then what about the computer industry? Have computers made for better programmers? Have they reduced the need for support personnel? Is it a case of physician, heal thyself, but in this case an industry that needs to rethink its personnel policies?

"It's a myth that computers have measurably increased overall U.S. productivity of information."

An interesting dichotomy exists between the hardware and software sectors. From 1960 to 1984 employment in the manufacturing of computers and computer equipment rose 259%, compared with an increase of 74% in nonfarm employment. However, from 1984 to 1995, computer manufacturing lost 32% of its workforce, according to the Bureau of Labor Statistics. In a 1996 report published in the Monthly Labor Review, Jacqueline Warnke, an economist at the BLS, wrote: "As the technologies of manufacturing computers become more routine and cheaply available, jobs are eliminated from computer manufacturing. On the other hand, industries that support computers have shown a remarkable increase in employment."

"It's a myth that computers have measurably increased overall U.S. productivity of information. Whatever productivity gains may have happened to increase profits took place in the factories and the warehouses," writes Strassmann. And that's where automating technologies such as robotics and machine vision have had the most impact. Not in the executive offices--where solitaire and extracurricular web surfing are the order of the day--but on the factory floor.

The most white-collar of the productivity-enhancing tools is a technology known as case-based reasoning. CBR collects information, say complaints from irate owners of home PCs that won't work, and retains it in a database so future operators can type in a few strategic keywords, such as model, software title, and installed peripherals, and find an answer from a past case. This is the 1990s version of the expert system, which was the 1980s incarnation of artificial intelligence. Distill the wisdom of the chief cookie baker, code it into a program, and you no longer need the baker.
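The retrieval step behind CBR can be sketched in a few lines. This is a minimal illustration, not any vendor's product: the support cases, keywords, and fixes below are invented, and real systems use far richer similarity measures than keyword overlap.

```python
# Minimal case-based retrieval: past support cases are stored with keyword
# sets, and a new problem is matched to the case sharing the most keywords.
# All case data here is hypothetical, for illustration only.
cases = [
    {"keywords": {"modem", "win95", "no dialtone"}, "fix": "Check phone cord"},
    {"keywords": {"printer", "win95", "garbled output"}, "fix": "Reinstall driver"},
    {"keywords": {"monitor", "flicker"}, "fix": "Raise refresh rate"},
]

def retrieve(query_keywords):
    """Return the fix from the stored case with the largest keyword overlap."""
    best = max(cases, key=lambda c: len(c["keywords"] & query_keywords))
    return best["fix"]

print(retrieve({"win95", "printer", "garbled output"}))  # matches the printer case
```

The operator's few "strategic keywords" become the query set; the database does the rest, which is precisely why the chief cookie baker becomes optional.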

Now, with unemployment at its lowest level since 1970, with birth rates declining and Silicon Valley companies so desperate that they're recruiting programming talent out of high schools, the big threat confronting the technology industry is not the next product cycle, software patents, or the breakup of the Microsoft/Intel oligopoly, but the lack of qualified personnel. Next in line in the threat department: the growing movement that challenges the notion that computers are making us more productive.

Not since Sputnik kicked off a round of national hand-wringing over the strategic importance of engineering and science has there been so much attention paid to the issue of where America's brains are. Ask any CEO what the biggest problem is, and the answer isn't the government but stupid, untrained, illiterate employees. No wonder the private sector spends more money on computer-based training than the public school system does. But, as David Gelernter has so eloquently opined, perhaps what we need is more of the three fundamentals of reading, writing, and arithmetic, and not a PC for every pupil.

External links:
American Engineering Association
Information Technology Association of America
U.S. Bureau of Labor Statistics
Paul Strassmann



See also:

Productivity by the numbers
Statistical proof.

Learning by example (Forbes, June 8, 1992)
The case method is being rediscovered by the artificial intelligence crowd.


© 1998 Forbes Inc. Terms, Conditions and Notices