Numbers Add Up to a Bigger Year 2000 Disaster
by Paul Strassmann
Uploaded to the CPSR-Y2K Mailing List, 1997-08-09
Economist Paul Strassmann is a leading student of the impact of technology on business productivity. His view of the economic impact of Y2K, below, is reprinted from Computerworld.
The year 2000 disaster is worse than claimed. The frequently quoted $600 billion estimate for fixing the problem worldwide (far more than the combined costs of three of this decade's natural disasters: the Kobe and Los Angeles earthquakes and Hurricane Andrew) doesn't go far enough. The actual cost will likely be much larger.
The reason for the shortfall: Most estimates leave out work that will have to be done and the cost of doing it. I have examined internal estimates by large corporations and government agencies and reports from the most widely quoted IT advisory services. All woefully misstate the work required to fix the problem and the financial consequences.
MIS-ESTIMATES
Here's where companies and consultancies go wrong in making their year 2000 calculations:
- Underestimating the scope: The year 2000 preparedness exercises concentrate on systems created by the IS organization: financial, accounting, billing and customer-related systems. Yet most public mischief will be caused by failing embedded systems, which are rarely under IS control: global positioning satellites, building security systems, logistics tracking and so forth.
- Neglecting test programs: Is the test software year 2000-compliant? Testing programs can account for as much as 30% of code inventory, and it costs more to validate and upgrade test code than code used in general applications.
- Misusing lines-of-code estimates: The number of lines of code bears no fixed relationship to an application's complexity as measured by "function points." For instance, it may take between 200 and 450 lines of assembly code to implement a single function point, whereas Smalltalk may need only 15 to 40 lines for the identical task (see the sketch after this list).
- Depending on cost-per-line estimates: These estimates assume that remedial and diagnostic tools are available to fix popular languages such as Cobol and C. Yet those two languages account for only 45% of the inventory. The balance consists of 60 languages, including Pascal, PL/1, Ada, Jovial and supplier-specific assembly languages. The cost of fixing year 2000 problems will depend on what tools and expertise you have available.
- Omitting database rectification tasks: Everyone is concentrating on fixing code logic, but ensuring that database records remain usable may take at least as much effort.
- Overlooking litigation expenses: Willful disregard of a known danger can be construed as an act of negligence. When the inevitable epidemic of systems failures takes place, lawyers and litigants will seize the opportunity to collect big damages and exorbitant legal fees. The lawsuits can rapidly cascade into a series of damage claims, where Company A will sue Company B, which will then sue Company C, which in turn will sue Company A to recover costs.
- Neglecting warranties: The bids given by the firms offering cures for potential year 2000 malfunctions lack warranties and avoid independently verifiable safeguards. Budget estimates based on these price bids are worth little, because they don't cover the eventual litigation should these "cures" fail.
- Misjudging interoperability testing: Everyone is concentrating on testing individual programs and applications at the expense of how they integrate with one another. This is a particular problem with applications that depend on receiving transaction data from other companies.
- Forgetting about consequential costs: In the rush to meet year 2000 deadlines, IS executives will make many imprudent concessions that will cost money in the long term, such as deferring essential maintenance, compromising information security through unmanageable outsourcing and upsetting salary structures by paying ransom rates for year 2000-related positions.
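To see why lines-of-code budgets mislead, here is a minimal Python sketch built on the lines-per-function-point ratios quoted above. The inventory sizes and the flat per-line rate are hypothetical placeholders; the $600-per-function-point rate echoes the figure used later in this article.

    # A minimal sketch contrasting a flat cost-per-line budget with a
    # function-point budget. The lines-per-function-point ranges come from
    # the article; the inventory sizes and the flat per-line rate are
    # hypothetical placeholders used only for illustration.

    LOC_PER_FP = {
        "assembly":  (200 + 450) / 2,   # 200-450 lines per function point
        "smalltalk": (15 + 40) / 2,     # 15-40 lines per function point
    }

    COST_PER_LINE = 1.10    # hypothetical flat rate, dollars per line
    COST_PER_FP = 600       # dollars per function point, as quoted below

    def compare(inventory):
        """Print per-line vs. per-function-point repair estimates."""
        for language, loc in inventory.items():
            fp = loc / LOC_PER_FP[language]
            print(f"{language:10s} {loc:>10,} LOC  "
                  f"per-line budget: ${loc * COST_PER_LINE:>12,.0f}  "
                  f"per-FP budget: ${fp * COST_PER_FP:>12,.0f}")

    # Same line count, very different functional size:
    compare({"assembly": 1_000_000, "smalltalk": 1_000_000})

The flat per-line budget prices both inventories identically, while function-point costing separates them by more than a factor of ten. That gap is why cost-per-line estimates for mixed-language portfolios drift so badly.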
WHAT IS THE EXPECTED COST?
Are there any credible sources of year 2000 costs? So far, I have found only one: Capers Jones, the president of Software Productivity Research, a consultancy in Burlington, Mass. He fully discloses the assumptions on which he bases his projections. Following are my conclusions, which are based on his latest report:
- All year 2000 estimates so far exclude the home-brewed code that has been placed into workstations and local servers by casual programmers. That now accounts for almost 25% of all U.S. function points. With about 40 million function points in this category that may need fixing, and a cost of something like $600 to fix a function point, that adds up to $24 billion in the U.S.
- The total U.S. inventory of professionally managed code that requires fixing is about 100 million function points. That would consume about 6 million person-months of effort. The cost of identifying, fixing and testing that software by the year 2000 deadline comes to more than $70 billion. Add to that as much as $60 billion for database authentication and repairs, $10 billion for test library development and repairs and $10 billion for post-year 2000 remedial work to correct errors from hastily executed patches.
- Chalk up another $20 billion for hardware, either to be bought for testing and parallel running of applications or to be upgraded to make poorly repaired applications run faster.
- Litigation over negligence is the largest unknown expense for the year 2000 disaster. Capers Jones estimates the cost at $100 billion but cautions that the figure could be much larger. Altogether, this amounts to $294 billion to fix year 2000 problems in the U.S. alone (the tally is sketched below). That's nearly half of the $600 billion worldwide figure. Because the estimated U.S.-based code makes up only 16% of all function points on the planet, it's safe to say that the widely quoted worldwide estimate of $600 billion is low: Fixing the other 84% of the world's function points will cost far more than another $300 billion.
Count on it.
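As a check on the arithmetic, the U.S. component figures above can be tallied directly. A minimal Python sketch, using only the numbers quoted in this article and assuming, as the article implies, that repair cost scales with a country's share of the world's function points:

    # U.S. component estimates quoted above, in billions of dollars.
    # The professional-code figure is quoted as "more than $70 billion",
    # so the total is a floor rather than a point estimate.
    us_components = {
        "home-brewed workstation and server code": 24,
        "professionally managed code repair":      70,
        "database authentication and repair":      60,
        "test library development and repair":     10,
        "post-2000 remedial work":                 10,
        "hardware for testing and capacity":       20,
        "litigation over negligence":             100,
    }

    us_total = sum(us_components.values())
    print(f"U.S. total: ${us_total} billion")        # 294

    # The article puts the U.S. at roughly 16% of the world's function points.
    US_SHARE = 0.16
    rest_of_world = us_total / US_SHARE * (1 - US_SHARE)
    print(f"Rest of world at the same cost per function point: "
          f"~${rest_of_world:,.0f} billion")         # ~1,544

Scaling the $294 billion U.S. total by function-point share puts the rest of the world above $1.5 trillion, which is why the widely quoted $600 billion worldwide figure looks low.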
Paul Strassmann (paul@strassmann.com) has just published The Squandered Computer, which outlines how to remedy executives' disappointment with the trustworthiness of their information managers.
Copyright 1997 by Computerworld, Inc. All rights reserved.
@Computerworld is a service mark of International Data Group, Inc.