Trends in Computing

Taking Computers to Task

Coming generations of computers will be more fun and engaging to use.
But will they earn their keep in the workplace?

by W. Wayt Gibbs, staff writer


SUBTOPICS:

  • The Productivity Puzzle
  • PC Pathologies
  • An NYSE Model
  • Toward Useful Tools

ILLUSTRATION:

  • Creeping Featurism

SIDEBARS:

  • Beyond the Desktop
  • Four Innovations
  • Investment vs. Productivity
  • Productivity Gain?

    At a grand Silicon Valley expo convened in March by the Association for Computing Machinery, a handful of chief technologists from industry and academia rose before a rapt audience of 2,000 to forecast how computers will evolve over the next 50 years. The exercise, as most of the gurus admitted, was specious--even the clearest crystal ball clouds hopelessly beyond a decade or so. But in the process of extrapolating a distant future, they tipped their hands to reveal what wonders they believe lie just around the bend.

    Computers exchanging video calls as commonly as e-mail. Three-dimensional windows that open into virtual worlds instead of virtual scrolls. Machines that speak and respond to human languages as well as their own. Personal "agent" programs that haggle for concert tickets, arrange blind dates and winnow useful information from the chaff of daily news. And everything, from our medical records to our office files to the contents of our refrigerators, hypertextually linked via the great global network.

    These transformations in the way we interact with software--its so-called user interface--have begun to graduate from idle speculation to working prototypes and even a few shipped products. It is widely expected that before long they will replace the flat windows, icons, menus and pointers that for 12 years have dominated personal computer interfaces. In demos, the new technologies are inarguably cool, and as Nathan Myhrvold, Microsoft's vice president of applications and content, observed during his turn at the dais, "'Cool' is a powerful reason to spend money."

    But in the computer industry and in the media that cover it, it has become common to tout with almost millennial fervor that the changing face of computers will make them not just more enjoyable but also dramatically more useful. Historian (and Scientific American columnist) James Burke spoke for many at the conference when he asserted that "we stand today on the threshold of an explosion in information technology, the social and economic consequences of which will make everything that came before look like slow motion."

    In fact, the explosion is well under way, and its economic blessings so far appear decidedly mixed. For all the useful things computers do, they do not seem, on balance, to have made us much richer by enabling us to do more work, of increasing value, in less time. Compared with the big economic bangs delivered by water-, steam- and electricity-powered machines, productivity growth in the information age has been a mere whimper.

    Anyone who has whiled away an afternoon upgrading a word processor, taken a break at work to download box scores from ESPN.com or watched in horror as a system crash obliterated several hours' work can attest to part of the problem. Recent studies of computer use in offices reveal that much of the time saved by automation is frittered away by software that is unnecessarily difficult, unpredictable and inefficient. Design experts warn that current industry trends toward increasingly complex programs and new, untested ways of presenting information could do more harm than good--and will almost certainly do less good than advertised. The road to improved productivity, they argue, heads in a very different direction.

    The Productivity Puzzle

    Which direction businesses follow is important because productivity growth is "at the crux of economic success," says Stephen S. Roach, chief economist at Morgan Stanley. "It is the only way a nation can increasingly generate higher lifestyles for its households and separate itself competitively from its peers." That much economists agree on. But the past decade has seen vigorous debate over the seemingly poor payoff from industrial nations' 25-year bet on information technology (IT) as an engine of economic growth.

    The stakes continue to mount. Despite a doubling every 18 months in the processing power a dollar buys, corporations have been pouring more and more dollars into computers. In the U.S., Roach reports, companies last year spent 43 percent of their capital budgets--$213 billion--on hardware alone. That is more than they invested in factories, vehicles or any other type of durable equipment. Adding in software, networks and the people needed for computer support and training brings the total IT bill for 1996 to about $500 billion in the U.S. and more than $1 trillion worldwide, according to Paul A. Strassmann, chairman of Method Software and a former chief information officer for Xerox and the Pentagon. Polls indicate that executives intend to spend even more next year.
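
    A quick back-of-the-envelope check (a minimal Python sketch; only the figures quoted above are used, and nothing beyond them is asserted): if $213 billion is 43 percent of corporate capital budgets, those budgets come to roughly half a trillion dollars, and hardware alone accounts for a bit over two fifths of Strassmann's estimated $500-billion U.S. IT bill.

        # Sketch of the arithmetic implied by Roach's and Strassmann's figures above.
        hardware_spend = 213e9            # U.S. corporate spending on computer hardware, 1996
        share_of_capital_budgets = 0.43   # hardware's share of capital budgets
        total_it_bill_us = 500e9          # Strassmann's estimate of the full U.S. IT bill

        implied_capital_budgets = hardware_spend / share_of_capital_budgets
        print(f"Implied total capital budgets: ${implied_capital_budgets / 1e9:,.0f} billion")  # ~$495 billion
        print(f"Hardware share of the IT bill: {hardware_spend / total_it_bill_us:.0%}")        # ~43%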

    Businesses buy computers for many reasons, but most ultimately aim for two goals: lowering the labor and overhead needed to make their product, and raising the number and price of products they sell. In both cases, IT investments should boost national productivity, corporate profits and standards of living. What puzzles economists is that productivity growth measured in the seven richest nations has instead fallen precipitously in the past 30 years, from an average of 4.5 percent a year during the 1960s to a rate of 1.5 percent in recent years. The slowdown has hit the biggest IT spenders--service-sector industries, especially in the U.S.--hardest. Most of the economic growth of the 1990s can be explained by increased employment, trade and production capacity. Computers' contributions, in contrast, nearly vanish in the noise.

    There have been a few notable exceptions. Telecommunications companies extracted almost 7 percent more work per hour from their employees each year between 1973 and 1983, for example. "They had many highly routine tasks that were relatively easy to automate," observes Tom Landauer, a former Bellcore cognitive scientist now at the University of Colorado. That, Roach says, is typical. "What IT payback we've seen has been confined largely to low-value, transaction-processing functions: moving trades, clearing checks, processing orders." In the larger occupations where most economic activity happens--sales, management, professional work--"productivity gains have been limited and disappointing," he says.

    Economists have proposed four plausible explanations for the puzzle. One is that the slowdown is a mirage created by outdated measuring tools. Computers allow companies to offer speedier, more personal services and a wider variety of goods, advocates argue, and those benefits often escape the standard economic statistics. In education, finance and a few other IT-intensive industries, output is inherently hard to measure. Perhaps the payoff is real, just hard to quantify.

    "There may be some legitimate measurement problems of output," Roach concedes. "But I would argue that there are more compelling biases in the labor input because of the enormous volume of unreported work time in the U.S." Cellular phones, laptops and networks, he adds, "all allow knowledge workers to work longer than ever before." Unlike their productivity, however, employees' time has definite limits.

    In his 1995 book, The Trouble with Computers, Landauer points out that if mismeasurement is the answer, it must be mismeasurement on an implausibly colossal scale. For if productivity growth was in fact just 1.25 percentage points higher than the economists have measured since the 1960s, then by 1995 official statistics understated the U.S. gross national product by roughly $1 trillion, an error of about $10,000 per household per year.
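
    The household figure is simple division; here is a minimal sketch in Python, with the count of U.S. households (roughly 100 million in the mid-1990s) supplied as an assumption rather than taken from the article.

        # Per-household size of the alleged $1-trillion annual understatement of GNP.
        understated_gnp = 1.0e12    # dollars per year, Landauer's figure from the text
        us_households = 100e6       # assumed: approximate number of U.S. households, mid-1990s
        print(f"${understated_gnp / us_households:,.0f} per household per year")  # -> $10,000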

    A second explanation is that U.S. industries have so far purchased only enough computer hardware to account for 2 to 5 percent of their capital stock. Roach, though critical of IT's contributions so far, claims that "you just can't expect such a small slice to transform the performance of the corporate sector." But computers are mere doorstops without their software. Adding the cost of programs, telecommunications and other office equipment brings the total to almost 12 percent, according to Daniel E. Sichel of the Federal Reserve Board. That still excludes the cost of support staff and maintenance. Railroads energized the economy when they accounted, at their peak, for just 12 percent of capital stock, editor Pam Woodall pointed out in The Economist last September. Why not computers?

    Paul A. David, a Stanford University economist, suggested a third possibility in 1990. Electric motors, he observed, did not boost productivity growth appreciably until more than 40 years after Edison installed the first dynamo in 1881. It wasn't until 1919 that half of American plants were wired for power. And it was later still before factories reorganized their production lines to exploit the new technology fully. By analogy, now that about half of American jobs involve some form of computer and companies are deploying IT more strategically, perhaps the big productivity gains are just beginning.

    "An information revolution that transforms work overnight is just not likely," argues Erik Brynjolfsson of the Massachusetts Institute of Technology's Sloan School of Management. "But I think productivity may grow disproportionately in recent years as companies reorganize." IT advocates have widely touted Brynjolfsson's 1993 analysis of 367 large firms, which calculated an average gross return of 81 percent on computer purchases--more than 13 times the gross return on other kinds of capital investments.

    That sounds impressive, but Roach complains that the study attributed all the success of firms to the variations in their computer investments, ignoring other important factors. Brynjolfsson disputes that assertion but acknowledges that computers' rapid depreciation and many hidden costs undoubtedly cut net returns considerably. Indeed, in a study last year using similar data, Sanjeev Dewan of the University of California at Irvine confirmed high gross payoffs but found little evidence that IT performs significantly better than other kinds of equipment it tends to displace. Sichel, in a book published in June, concludes that "the contribution of computing services to growth is unlikely to rise substantially in coming years, as long as [IT] earns the same net return as other capital."

    Strassmann and others have suggested a fourth explanation. Perhaps computers, despite the enthusiastic claims made for them, are still mediocre tools for improving the efficiency and quality of most information work. Maybe the productivity boom of the 1950s and 1960s, and the rise in living standards it purchased, was a fluke, the result of an unsustainable postwar boom. After all, the languid productivity growth of recent years is right in line with that of the first half of this century.

    Substantial evidence supports the unpleasant conclusion that much of the $1-trillion annual IT bet is poorly wagered. According to the Standish Group on Cape Cod, Mass., 31 percent of the computer systems that corporations build for their employees are either canceled or rejected as unfit for duty. When they are installed, Strassmann says, "computers often create pathologies and increased costs." Consider U.S. hospitals, he suggests in his new book, The Squandered Computer. In 1968 they employed 435,000 administrative staff and served 1.4 million patients at a time. Although the average daily patient population dropped to 853,000 by 1992, administrative employment actually rose to 1.2 million--in large part because information processing consumed an increasing amount of staff time.

    Many industries that made strategic investments in technology to become more flexible and responsive to changing markets, Roach says, have in fact accomplished quite the reverse. "Here's the rub," he explains. "About 85 percent of these outlays over the years have gone into banks, securities firms, insurance companies, airlines, retail and the like. It used to be that these companies' main assets were people." During recessions, they could lay off workers and remain competitive. "But now they have this massive infrastructure of installed IT" whose expenses are fixed, he points out, adding that in the next recession, "there could be an extraordinary crunch on their bottom line. So there is a real downside here to the information age."

    Still, there is clear upside potential. If companies can find more effective ways, if not to automate, then at least to augment their higher functions, computers and networks might produce as strong a productivity kick as the generations of machines that preceded them. To do that, firms will need to pay closer attention to what IT actually costs and truly delivers.

    PC Pathologies

    A typical desktop PC carries a price tag of about $3,000 in the U.S.--or about $1,000 a year over the average life span of an office machine. But research by the Gartner Group in Stamford, Conn., reveals that in corporate practice, the average annual bill is more like $13,000. Most corporate PCs are linked to a network and contain a few standard programs: that adds $1,730 a year. Because computers are often hard to use, companies have to provide about $3,510 worth of technical support for every user. Then there are the technicians needed to keep the network humming, who add $1,170 to the check.

    The largest notch in Gartner's tally, however, is for the time that employees waste "futzing" with their computers rather than working on them. That costs employers another $5,590 per computer each year, the group estimates. Its guess may be low. SBT Accounting Systems in San Rafael, Calif., found in a 6,000-person survey that office workers futz with their machines an average of 5.1 hours--more than half a workday--each week. One fifth of that time was wasted waiting for programs to run or for help to arrive. Double-checking printouts for accuracy and format ran a close second. Lots of time goes into rearranging disk files. And then there are games; Microsoft Windows comes with four preinstalled. All told, SBT estimates, futzing costs American businesses on the order of $100 billion a year in lost productivity.
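
    Gartner's number is just the sum of those line items. The Python sketch below tallies them (all figures are the ones quoted above) and shows that futzing alone accounts for more than two fifths of the total.

        # Annual cost of one corporate PC, using the Gartner Group line items cited in the text.
        annual_cost = {
            "hardware (about $3,000 spread over roughly three years)": 1_000,
            "network connection and standard programs": 1_730,
            "technical support per user": 3_510,
            "network technicians": 1_170,
            "employee futzing time": 5_590,
        }
        total = sum(annual_cost.values())
        futzing = annual_cost["employee futzing time"]
        print(f"Total per PC per year: ${total:,}")                   # -> $13,000
        print(f"Futzing share of the total: {futzing / total:.0%}")   # -> 43%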

    There may be little that companies can do to reduce hardware costs, but some have tried to curb futzing. Boeing removed Windows's solitaire game from all its machines, Landauer notes. Sun Microsystems reportedly banned its managers from using presentation software to create fancy slides for meetings.


    Of course, businesses could lower their IT budgets considerably by holding on to their computers and software for more than three or four years. But few do. "My guess is that 80 to 90 percent of that $213 billion [U.S. investment] goes to replace obsolete IT each year," Roach says.

    The software industry frustrates long-term investments by producing ever larger, slower programs that require ever larger, faster machines. At the March conference, Myhrvold modestly proposed "Nathan's First Law": "Software is a gas," he said. "It expands to fill its container." In fact, that is more of a policy than a necessity. "After all," he observed later with a laugh, "if we hadn't brought your processor to its knees, why else would you get a new one?"

    But when it comes to software, new is not necessarily improved. Behavioral studies have shown that "creeping featurism" is often counterproductive. Software developers know this. "The biggest single problem we see users having is mapping their goal to the function in the program that will perform it," says Ken Dye, Microsoft's usability manager for desktop applications. "Adding features that have less and less utility has made it more difficult for people to do that, because there is a larger set to choose from." So why do this year's Microsoft programs have hundreds more features than the previous versions? Dye blames it on trade press reviewers who evaluate products with checklists.

    If increasingly feature-laden interfaces are a business necessity for software companies, it may be because their customers underestimate the price of hard-to-use software. Four years of surveys by Margaret Hurley, director of research for the Nolan Norton Institute in Melbourne, Australia, show that nontechnical employees spend 4 to 10 percent of their time helping co-workers solve computer problems. That huge reservoir of hidden support, Hurley calculates, lofts the total annual cost for a PC from $13,000 to about $23,500. "The factor most closely linked to support costs," she says, "was the extent to which the user interface matched the way users thought and worked."
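
    Hurley's adjustment amounts to an extra, invisible line item on top of the Gartner tally. A minimal Python sketch using only the figures quoted above: the hidden peer support comes to roughly $10,500 per PC per year, more than twice what Gartner counts for formal support.

        # Hidden peer-support cost implied by Hurley's surveys, relative to Gartner's figures.
        gartner_total = 13_000           # per PC per year, from the Gartner tally above
        hurley_total = 23_500            # per PC per year once informal peer support is included
        formal_support = 3_510 + 1_170   # Gartner's tech-support and network-technician lines

        hidden_support = hurley_total - gartner_total
        print(f"Hidden peer support: ${hidden_support:,} per PC per year")               # -> $10,500
        print(f"Ratio to formal support costs: {hidden_support / formal_support:.1f}x")  # -> 2.2x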

    Designing software that is both efficient and easy to use is hard, says Jakob Nielsen, an interface expert at Sun Microsystems. "User interfaces only work if every detail works. If you get one button wrong, people can easily waste half an hour recovering from a mistake." Because most interfaces are designed by technically savvy programmers and are rarely tested on typical users, he says, more often than not they contain dozens of significant flaws.

    The Microsoft team in charge of designing the user interface for Windows 95 studied people using the previous version, which was the most common PC operating system for three years. Almost all users, the team reported at the CHI '96 conference on computer-human interaction, had trouble managing the software's overlapping windows. Nearly half avoided running more than one program at a time because doing so confused them. A large fraction were bewildered by the filing system, which places folders within folders within folders, unlike the real world. Many had trouble double-clicking the mouse. The average beginner took 10 minutes to open a program if it was not already visible. Many of these basic problems also bedevil Apple Macintosh and UNIX users; custom-built software is often worse.

    "It's time to get angry about the quality of user interfaces," exclaims Ben Schneiderman, head of the Human-Computer Interaction Laboratory at the University of Maryland. "The public doesn't understand what they could have had by now," agrees Bruce Tognazzini, a designer who helped to develop the original Macintosh interface. They and others argue that applying human-factors research to existing software technology could make workplace computer systems dramatically more productive, easier to use and cheaper to support.

    An NYSE Model

    The New York Stock Exchange (NYSE) took this approach four years ago when it hired Mauro/Mauro/Design to help upgrade the four primary computer systems used on its trading floor. Stock specialists, each trading just a few companies, were using vintage 1987 software and millions of paper cards to record quotes and sales. But the market's growth was crowding more traders into very limited space. "In two weeks we now process as many shares as we handled each year in the late 1970s," explains William A. Bautz, the exchange's senior vice president for technology. The volume threatened to choke the system.

    The NYSE needed its upgrade soon, but instead of jumping into programming, "we spent the first six months observing traders on the floor, modeling their cognitive work flow," recalls Charles Mauro, president of the design firm. "Then we would develop a prototype, test a piece, expand the functions, test again--we went through 30 iterations of testing." The software engineers hated this approach, he says, because the specifications kept changing.

    But the results were impressive. The upgrade took just two years to complete; previous revisions had taken six. Wireless handheld computers replaced the paper cards and readers, saving $1 million a year, according to Mauro. As the system has been rolled out, Bautz reports, specialists' productivity has risen dramatically. "We think we can now get over two billion shares a day through the system," he says. Error rates have fallen by a factor of 10 since 1987 even as workloads more than doubled.

    Bautz figures that despite the fees of two design firms, the upgrade cost no more than it would have using a traditional development process. "I'm firmly convinced that within 20 years, this methodology will be the prime factor in increasing productivity in the information age," Mauro testifies.

    He is not alone in his enthusiasm. Sarah A. Bloomer, director of the Hiser Group, reports that when her firm reformatted a grant administration system to reflect the way workers at Australia's health and family services agency approach their jobs, employees were able to shave five minutes off each task, on average--equivalent to an annual savings of $3.5 million. After American Express redesigned its computer interface for bank authorizations, training times reportedly fell from 12 hours to two, and task times dropped from 17 minutes to four--a 325 percent efficiency improvement.
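
    The American Express numbers translate directly into throughput, which is where the 325 percent figure comes from: cutting a task from 17 minutes to 4 lets an authorizer handle 4.25 times as many requests in the same time. A minimal Python sketch, using only the times given above:

        # Efficiency arithmetic for the American Express interface redesign.
        task_before_min, task_after_min = 17, 4
        training_before_h, training_after_h = 12, 2

        speedup = task_before_min / task_after_min                  # 4.25x as many tasks per hour
        print(f"Tasks per hour: {60 / task_before_min:.1f} before, {60 / task_after_min:.1f} after")
        print(f"Efficiency improvement: {speedup - 1:.0%}")         # -> 325%
        print(f"Training time reduced by {1 - training_after_h / training_before_h:.0%}")  # -> 83%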

    Unfortunately, few of the futuristic technologies that so excite academic and industrial researchers have demonstrated such utility. "There is no evidence yet that any of these putative replacements for the standard graphical user interfaces are any better," Landauer says. "It's people's pipe dreams at this point."

    Indeed, researchers presented at least 83 novel interfaces at the CHI '97 conference. Many do truly nifty things. One program takes dictation with 97 percent accuracy from radiologists. Another allows users to walk through virtual-reality supermarkets. A "HyperMirror" both reflects users' images and projects video of distant collaborators. A Web-surfing agent automatically races ahead on the Net and brings back pages that it calculates will interest its owner.

    But only nine of those 83 projects compared workers' performance on real tasks using the new interface with their current way of doing things. Four offered no gains at all. Radiologists completed their reports faster without the computer. Video offered no improvement over audio for collaborative writing or design. Only three new interfaces--an interactive blueprint program, the combination of a keyboard joystick with a mouse for two-handed input, and a "wearable" computer--sped work significantly.

    That last device is worth noting, for it exemplifies what some argue is the direction most likely to lead to strong productivity growth. Developed at Carnegie Mellon University, the VuMan3 computer straps onto a belt; a wire connects it to a small display worn over one eye. Although meant to be used for three-dimensional tasks where both hands must be free, the device does not use 3-D graphics or speech recognition. It has simple controls: a large dial and three oversized buttons. Yet in field tests at a military maintenance depot, the device cut the time needed to inspect amphibious vehicles by 40 percent and trimmed another 30 percent off paperwork. Like the stock traders' handheld pads and the Australian bureaucrats' grant forms, its interface is specialized for a single task.

    Toward Useful Tools

    Specialization, suggests Donald Norman, former vice president of research for Apple, can help provide a "better match between people's software and their jobs." He proposes abandoning giant, general-purpose application suites for simpler tools designed to do just one function well. Some, he suggests, should be separate "information appliances" outside the PC but linked to it by a network. The much heralded convergence of PCs, televisions and telephones "is totally wrong," concurs William Buxton, chief scientist for Silicon Graphics. "We need to diverge."

    Apple tried last year to introduce interchangeable software components with its OpenDoc system. OpenDoc found few takers, but the industry appears ready to try again. Several groups have proposed competing standards that would allow tools to work together, regardless of who made them.

    Once compromise is reached, the Internet will allow anyone to sell the software equivalent of a better mousetrap; the Web has lowered software distribution costs to virtually nil. Many companies hope the Net will similarly slash their distribution, transaction and marketing expenses. The path toward that grail is filled with dragons, however. As firms push their employees and internal communications onto the network, futzing opportunities multiply. Software that Netscape plans to release later this year will enable a company to create Web "channels" that serve as the default interface on its employees' computers. Workers can subscribe to other channels (such as ABC and ESPN) as well: "We have the most powerful media companies in the world working to take over the desktop," grins Mike McCue, Netscape's director of advanced technology.

    Given such software and the proliferation of corporate-wide networks, "every company suddenly has its own user-interface design project," Nielsen points out. "If they don't do it right, they are going to face immense productivity loss." On the other hand, if businesses seize the opportunity to study what their workers do, to test software carefully and to demand provable payoffs from their information technology, Landauer's rough calculations suggest that productivity growth in the U.S. could rise back above the 4.5 percent mark of the 1960s.

    Perhaps, but nations might do well to hedge their bets. Even if user-centered design becomes the rule rather than the exception, Roach of Morgan Stanley warns against expecting macroeconomic miracles. "Productivity is a trend that moves glacially," he says. "You cannot expect instant payback from any input factor, whether it is increased employment, better training, new technology or more capital. It's just not going to happen overnight."