At the same time that these kinds of budget cuts are being introduced, organisations are discovering that the assets in place no longer meet the strategic needs of the enterprise, because business processes are being rendered obsolete at such a rapid rate. Consequently, the fundamental issue that IT managements have to face is the balancing of two forces--reducing total costs while preserving the ability to continue at a high rate of application enhancement and development.
What is clear is that if you introduce new systems that have high maintenance and operating costs, you are going to use a large portion of the cash that should otherwise be allocated to the development budget. If you want to preserve the level of investment in enhancement and development capacity, savings have to come from operating costs.
Consequently, one of the prime parameters in the implementation of new systems is the velocity at which they can be injected into your environment because you need them to generate the operational savings that support further investments. The driving parameter in this kind of model is time. If you make the time, you make the money; and so timing becomes a critical element in the realisation of business value.
However, you need to ensure that the investments that are undertaken not only generate quick returns, but also create a renewable and permanent asset. What you have to do is to replace existing systems with systems that have much longer life.
To be economically advantageous, new systems must have substantially lower maintenance costs than earlier implementations, and they must have a very high degree of code portability. My minimum target for portability is that after 6 years 60% of the code is portable to the next generation of hardware and becomes part of the next generation of applications.
The 6-year cycle is dictated not by hardware technology, which is now on a 2-year cycle, but by the rate of innovation in business. It is crucial to understand that it is the internal organisational structure that generates the kind of information system requirements that lead to investment in new applications. Therefore, the rate of change of internal procedures and the internal organisational structure is what governs the rate of replacement of applications systems -- and any corporation that believes that their procedures and organisational structure will be able to survive more than 6 years is working under highly questionable assumptions.
                 1986   1987   1988   1989   1990   1991
Annual Benefits   0.0    0.0   83.4   72.3   82.2   91.4
Annual Costs     63.3   31.5   14.5   15.0   15.0   17.2

Net present value of cash benefits @ 25% = 123
Net present value of cash costs @ 25% = 94
Benefit/Cost Ratio = (123 - 94)/94 = 31%

Figure A shows what an investment would look like without residual value--the way these analyses are usually done. The important number is the ratio of net cash benefit to cash expenditure: 31%. This figure already allows for the discounting of future gains, even at a discount rate as high as the 25% used in this example.
Figure A: Project Benefit/Cost Ratio
                 1986   1987   1988   1989   1990   1991   Residual Value
Annual Benefits   0.0    0.0   83.4   72.3   82.2   91.4            340.5
Annual Costs     63.3   31.5   14.5   15.0   15.0   17.2             64.1

Net present value of cash benefits @ 25% with residual value = 195
Net present value of cash costs @ 25% with residual value = 107
Cash+Residual Benefit/Cost Ratio = (195 - 107)/107 = 81%

Figure B provides the identical example but with a residual value. The payback has now gone up to 81%. What this demonstrates is that if the useful life of the application can be extended beyond the immediate planning period, and thus the software asset can continue to contribute even in a different organisational environment, then the return on the investment is almost trebled.
Figure B: Project Benefit/Cost Ratio with a 12-year Residual Value
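The published totals can be reproduced with a short calculation. The sketch below assumes end-of-year cash flows beginning in 1986 (so the first flow is discounted one period) and the residual value realised in the seventh period; these conventions are my inference from the totals, not stated in the figures.

```python
# Sketch: reproduce the benefit/cost ratios of Figures A and B.
# Discounting conventions (first flow one period out, residual value
# in period 7) are inferred from the published totals.

def npv(flows, rate):
    """Net present value with the first flow one period in the future."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows, start=1))

rate = 0.25
benefits = [0.0, 0.0, 83.4, 72.3, 82.2, 91.4]
costs = [63.3, 31.5, 14.5, 15.0, 15.0, 17.2]

# Figure A: cash flows only.
b, c = npv(benefits, rate), npv(costs, rate)
print(round(b), round(c), round(100 * (b - c) / c))      # 123 94 31

# Figure B: append the residual value as a period-7 flow.
b2, c2 = npv(benefits + [340.5], rate), npv(costs + [64.1], rate)
print(round(b2), round(c2), round(100 * (b2 - c2) / c2))  # 195 107 81
```

Under these assumptions the rounded results match both figures, which suggests the published analysis treats the residual value simply as one further discounted cash flow.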
This method of determining payback is embodied in a piece of software called the Functional Economic Analysis, which was developed by the DoD on the basis of my book The Business Value of Computers, and is generally available as public domain software. It is clearly in the public interest to stimulate a standard approach to determining economic payback using the same kind of reusable routines. Needless to say, the DoD wants to increase the residual value of its software developments.
I suggest that when you look at the total functional cost of an organisation, the most valuable property is user training. Consequently, rather than looking at information technology assets as merely particular pieces of hardware that have to be used until they die, it should be remembered that the fundamental organisational cost is on the user side of the equation. In the defence environment, the user cost is particularly severe because the operators have to use information technology under conditions of extreme stress, and therefore dissimilar protocols, graphic interfaces and commands are not only costly but extremely dangerous.
The other crucial long-term information assets are data and software--software representing the collective intelligence of how the enterprise is put together, and data representing the facts about the environment of the enterprise. Everything in between--equipment and operating software--is commodity product that is being rendered obsolete at a prodigious rate. The current measure of obsolescence of microcomputers, for instance, is the monthly decline in the prices of Intel 486-based microcomputers, now averaging 6.5% per month. Nothing in the history of mankind has ever depreciated or been deflated at the rate of 6.5% per month. What is more, it is considered that over the next decade, the rate at which new technology will arrive will mean that real obsolescence will increase to about 11% per month. At this point, technology will have a half-life of less than 18 months and will clearly be a junkable commodity.
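The half-life claim follows directly from compounding the monthly decline. A minimal check, assuming simple geometric depreciation (the text gives only the monthly rates, so the model is my assumption):

```python
import math

# Half-life of hardware value under a constant monthly price decline,
# assuming simple geometric depreciation: value(n) = (1 - d) ** n.
def half_life_months(monthly_decline):
    return math.log(0.5) / math.log(1 - monthly_decline)

print(round(half_life_months(0.065), 1))  # ~10.3 months at 6.5%/month
print(round(half_life_months(0.11), 1))   # ~5.9 months at 11%/month
```

Both figures fall comfortably inside the "less than 18 months" bound quoted above.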
Consequently, the goal of the residual value approach is the preservation of long-term assets, such as the behaviour and training contents of your information system and its supporting intelligence, while ensuring that what is in between--the hardware and operating system software--can easily be removed. In order to protect the long-term assets and to be able to detach obsolescent commodity assets, you have to create systems engineering interfaces and tools that keep the long-term and short-term assets separated. This is a snap-in, snap-out, disposable type of economy that has to be highly standardised. In fact, one way of looking at the value of open systems is that they make it easier for customers to obsolete the equipment that lies between their permanent assets.
Achieving this snap-in, snap-out world requires an integrated computer-aided system engineering environment, I-CASE. I have come to the conclusion that open systems without open systems tools do not work. You must have a way of bolting "wheels" onto open systems hardware, so that the old superstructure can be removed and replaced with a new superstructure with very little pain and a very large amount of reusability of information.
The underlying concept is to have a development environment which has a life of 25 to 30 years. This development environment is not based upon a particular procedural code or target hardware, but upon the functional processes which really define the operational rules and requirements. This development approach employs fourth and fifth-generation machine independent languages that are specifically targeted at the open systems environment.
Once you have a development and testing environment against which you can continually validate the logic of any extensions, the retention of any particular implementation is not important. The executable code is not maintained in the object environment, but at the requirements level. This means that, for instance, if you wish to move the application to a pocket computer with a special operating system and a unique chip, you can still achieve the portability of the application to that particular machine, provided, of course, that it supports open systems interfaces which are in compliance with the X/Open CAE Specifications.
I want to emphasise the increasing importance of software tools as the key enabler for bolting together long-term assets, such as knowledge about the human interface and user training, with changeable assets of hardware and system software. Open systems toolsets are the key enabler necessary to obtain the economic value of the open systems environment.
Are these tools alone sufficient to preserve the long-term residual value of software investment? I would suggest that in the same way that operating systems standards, interoperability standards and communications standards are necessary but are not sufficient to guarantee the preservation of software investments, tools are necessary but not sufficient. We have to raise our sights to ensure that organisational knowledge is preserved within our organisations.
This leads to the startling conclusion that what we call software expense is, in fact, purely a way of recording expenses that cover meetings, head scratching and shouting matches. Therefore, if you want real open systems, the human dimensions of software creation, maintenance and development have to be open, transparent, easier and more graceful than has traditionally been the case.
How do you then conserve software assets? You should start to look at software as a way of codifying how the enterprise does its business. It represents the collective experience and the collected, accumulated memory of every person who has ever participated in the conception of that software. You can look at software development as a collective process, a form of recording organisational memory and of encapsulating how members of the organisation have negotiated how they will cooperate. Software should be seen as part of a continual, evolutionary process rather than something that is designed and then thrown away.
Organisations have deep roots. This is seen in the accumulation of what we call organisational culture, that provides a stability which is independent of the individuals making up the organisation at any particular time.
An organisation, particularly for residual value, must be able to carry its culture like a genetic code, with only very small mutations from generation to generation. Software is a form of wealth. In fact, a large number of organisations today have software assets which are worth more than the tangible assets on their balance sheet. Thus, the primary purpose of open systems is to manage that wealth and manage it so that as little as possible is destroyed; so that it accumulates rather than is replaced.
Every new system and enhancement should be conceived of as a way to exploit and increase this legacy value with as little as possible discarded. When you analyse the structure of information systems you discover it is not the elements of logic that change, but the way they are put together. In a typical business application, more than 85% of basic routines deal with information retrieval, information management and information display. That applies to accounting, medical, material or inventory systems. We continually throw out systems even though the fundamental underlying genetic attributes of that system are the same from application to application. We can no longer afford to do this, which is why the implementation of an open systems environment really calls not just for hardware independence, but for a symbiosis between software preservation at a component level and the hardware. By this means, you can move as your environment evolves.
My conclusion is that open systems are there to provide the infrastructure that allows graceful accommodation by the organisation to changing administrative processes and to changing relationships with vendors and customers. Therefore, the underlying reason why you want to have open systems is not because you want to buy cheap hardware, though you may want to do that too, but ultimately because you want to be able gracefully to evolve your business processes on 6-monthly or 4-monthly cycles, rather than the years it takes today. Ultimately, the winning ticket that will show in the residual value of functional economic analysis is business process redesign.
Given the organisational premise outlined above--that software is a form of organisational memory for complex organisations--I would suggest that a model that is neither centralised nor fully decentralised is the appropriate one. In this model, used in the DoD, the various software assets have to be put into the right level of a federal structure.
What I am suggesting is that if you really preserve the collective knowledge of the enterprise, there is a large residual value from software development. You cannot simply look at interoperability standards; you have to look at what I consider the rules of governance, or the way the organisation is put together. You need to ensure that you put as much of the investment as possible in long-lasting assets like data, the telecommunications structure, and configuration management. However, at the same time it is important that innovation, which is fleeting, local and absolutely essential as a way of preserving a sense of freedom and independence, is not stifled. The beauty of this kind of approach is that when you develop local applications that turn out to have permanent value because they are widely imitated, they can be quickly moved up in the hierarchy. It is extremely important that the whole idea of the dynamics of innovation is preserved in the layered structure.
Thus, the underlying issue of open systems--the preservation of assets-- is bigger than just standards. It really deals with the governance and constitutionality of the structure of the information society. It is concerned with how this is organised within the enterprise and then how enterprises are organised to cooperate as a national and then global society.