Year 2000: crisis or opportunity?

Dec. 7, 1997

The November 1997 issue of IEEE Computer, Object Technology department, contains an article by Christopher Creel (Hewlett-Packard), Bertrand Meyer (ISE) and Philippe Stephan (CALFP Bank) on the Year 2000 challenge and how object technology can help turn it into an opportunity. The article is entitled The Opportunity of a Millenium. Here are excerpts from a slightly different version.

Article excerpts

By now you have probably heard more than you really care about the "millennium crisis", also known as "Year 2000" and, license-plate style, "Y2K". Even Newsweek had it as its cover story, complete with a cartoon to explain the technical details. If you haven't brushed off the whole thing, you may have included in your New Year's resolution for January 1st, 2000, to wait a little before rushing to the airport, stepping into an elevator, or cashing in your mutual funds. And your company may be scurrying to meet the deadline, for once immovable.

Yet for all the books, conferences and articles, for all the doomsday predictions of conversion costs in the hundreds of billions and endless lawsuits bringing companies to their knees, for all the potential developments cancelled to make room for Y2K projects, for all the new industry spawned by these efforts and the attendant consulting opportunities, the software community, with a few exceptions, has neither heeded the lesson nor seized the opportunity. But it may not be too late.

For most companies, "doing something about Y2K" has usually involved sending stern letters to suppliers asking them to guarantee compliance of their products, and appointing a team, internal or external, to convert existing software, where conversion usually means going through old code, typically Cobol, looking for explicit uses of 2-digit date fields and extending them to 4 digits. If this describes your company's Year 2000 effort, it is missing the boat.

Engineering tradeoffs

How did the (expected) crisis happen in the first place? The conventional wisdom is that programmers are byte-hoarders who did not have the vision to plan far enough ahead, and foolishly chose instead to save tiny amounts of memory. A variant of this explanation puts the blame on the programmers' managers; this is consistent with the currently hip Dilbertist trend, whereby managers are all morons and the only ones who understand anything are the bottom-of-the-ladder employees. Either of these variants is off the mark.

It is very easy in 1997 to laugh at a programmer or his manager for saving two bytes per database field back in 1965. We are not the ones who had to make the tough technical decisions. Software construction, like other engineering efforts, is a constant search for the right tradeoffs between factors such as space efficiency, time efficiency, flexibility and maintainability of the architecture, time to market, compatibility with earlier products, ease of use for the customer, and ease of implementation for the programmer. One person's brilliant tradeoff may become, in different circumstances -- say if memory costs drop a thousandfold and a product meant to be used for two years lasts well into the next millennium -- a paragon of silliness and stuff for Newsweek cartoons. The two-digit date field may in many cases have been a silly choice, but it is just one choice among the millions that programmers have to make all the time, not because they or their managers are cretins with no long-term vision, but because that's what their job takes.
While you are reading this, some programmer somewhere is consciously, and perhaps competently, deciding to limit the size of a data field to what he feels is reasonable. Twenty years from now, someone is going to scream that this is unacceptably constraining. Both statements may be true: that given today's knowledge the size is enough, and that in 2017 it is ridiculously low. Take the example of IP addresses: when the size was set during the design of the Internet's predecessor -- a network mostly meant to enable computer scientists to communicate -- who could have foreseen that twenty-five years later we would need a new standard supporting five IP addresses for every square meter on the planet?

So the millennium problem is not to have chosen 2-digit date fields. The problem is much deeper, and it seems that the software industry as a whole has not understood it. The problem is abstraction, information hiding, modularity, reuse: in other words, the set of fundamental software engineering issues that object technology is precisely here to address.

The true scandal

2 digits, 3 digits, 4 digits, who cares? If you have a huge database, not that much fast-access inexpensive storage, and lots of date-dependent information in the database records, all concentrated on a 50-year period, you may be right to refuse to pay (in terms of money, but also of space lost to some other information) for digits that you don't need. What matters is something else: that if someone, down the line, decides to override your original choice because the context has changed and more options are available, he can do it with a small effort.

Prescribing a particular number of digits is a trivial design decision. That's not the scandal. The scandal is that a trivial decision, say at the very most a $1000 decision (if we have taken the trouble of paying a software professional to ponder the problem for a few hours), may have million-dollar or even billion-dollar consequences. Why are the consequences so outrageous? The reason is that with standard software technology the ramifications of the choice are spread all over the software. It's not that we chose a certain representation (such as two digits for the date): it's that after we made that choice, any one line in our entire software base may depend on it, forcing us to sift through all of our code, as if looking for a few needles -- we don't even know how many -- in a gigantic haystack.

Be it dates or any other information, the major problem with traditional ways of building software is the uncontrolled distribution of information. In an object-oriented architecture, "date" is an abstraction, managed by a module (a class) through which any other part of the system must go whenever it needs to access a date, modify a date or perform any other date operation. Then if something changes in the notion of date, the date class will -- obviously -- require updating; but if the change is to the concrete representation rather than to the abstract notion, the other modules will not be affected.
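As a minimal sketch of this idea (in Java, with hypothetical class and method names; the article itself prescribes no particular interface), a date abstraction whose representation is invisible to the rest of the system might look like this:

    // Hypothetical sketch: every other module must go through this class
    // for any date operation, so the stored representation can change
    // without touching the rest of the system.
    public final class CalendarDate {
        // Concrete representation -- private, hence replaceable. An older
        // version might have kept only a two-digit offset from 1900;
        // clients could not tell the difference.
        private final int year;
        private final int month;   // 1..12
        private final int day;     // 1..31

        public CalendarDate(int year, int month, int day) {
            if (month < 1 || month > 12 || day < 1 || day > 31) {
                throw new IllegalArgumentException("invalid date");
            }
            this.year = year;
            this.month = month;
            this.day = day;
        }

        // Abstract queries: the only way clients may read a date.
        public int year()  { return year; }
        public int month() { return month; }
        public int day()   { return day; }

        public boolean isBefore(CalendarDate other) {
            if (year != other.year)   { return year < other.year; }
            if (month != other.month) { return month < other.month; }
            return day < other.day;
        }
    }

Because no client can name the stored fields directly, widening or re-encoding the year is a change confined to this one class.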
Too often, software is not built like that. One of the authors has told elsewhere [1] the story of how, a few years ago, he attempted, for the purpose of setting up a class project, to remove any mention of the sender's name from a well-known mailing program. It took considerable effort to obtain the result, because information about the sender, instead of being concentrated in just one place, was known throughout the tens of thousands of lines of the program. Whenever we thought we had removed the last reference, another would still pop up from elsewhere! Although this example has nothing to do with dates and the millennium problem, it is typical of the same syndrome: lack of abstraction.

Object technology is entirely aimed at abstraction. Classes, inheritance, polymorphism, dynamic binding and Design by Contract all help us limit the flow of information in our systems and isolate concrete details from the bigger picture, yielding architectures that lend themselves more gracefully to change. True, this is not yet universally understood, as evidenced by the possibility in Java of writing instructions such as my_object.date = "97", which violate abstraction principles by allowing users of a concept to access and modify an object's field directly, bypassing the corresponding class interface. But true object technology is pitiless about information hiding, and will not let you access an object's properties except through the class interface. This is the only known technique for avoiding Y2K-like catastrophes.

The Year 2000 example is by far the most visible; anyone who reads newspapers or watches TV is aware of it, and it is one of the most expensive collective efforts in our history. But technically it is only one minute example of a far bigger problem. Regardless of how many sleepless nights and billions of dollars it will take to correct date fields in old Cobol programs, all that effort will not, unless we plan for it, move us one inch closer to solving the bigger problem.

From crisis to opportunity

"Unless we plan for it." Because the conversion effort is so huge and expensive, it is silly to make it just a Year 2000 conversion effort. This is where crisis becomes opportunity. The truly enlightened companies, which unfortunately appear to be only a minority so far, have understood the Y2K conversion for what it is: a once-in-a-lifetime (once-in-a-millennium?) chance to rip apart the mission-critical enterprise applications and prepare them for the future and its inevitable shocks. It is not too late! Since you are going to have to look into the entrails of your applications anyway, why not take advantage of this effort and of the available money to reengineer these applications? Reorganize the architecture; use the best technology available today for such purposes -- object technology; make sure that when the next crisis comes it will not be a crisis. We propose to call this process the millennium rip-apart.

Here are some of the principal components of the millennium rip-apart. One of the authors (Stephan) recently put these ideas into practice in making a large banking system Y10K-compliant.

It is essential in particular to preserve the work of the best designers. As Barry Boehm, Fred Brooks and others have pointed out, and as any software manager knows, a few individuals can, in our field, do ten or twenty times as much as the other developers, and they are often crucial in establishing the base architecture. But what commonly happens -- and undoubtedly explains some of the worst nightmares in Y2K conversion efforts -- is that their successors, little by little, destroy the original idea behind an architecture, because they don't understand it, and because there is no contract enforcement to preserve it. By being fanatic about contracts (attaching invariants to classes, and preconditions and postconditions to routines), you can ensure [2] that the best designers' legacy -- imagine using this word in a positive sense! -- will survive.
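Purely as an illustration (Java has no built-in contract notation, so the sketch below emulates preconditions, postconditions and the class invariant with explicit run-time checks; the class and its operations are hypothetical, not taken from the article), contract enforcement might look like this:

    // Hypothetical sketch: emulating Design by Contract with explicit checks.
    // A successor who violates the designer's assumptions is told at once,
    // instead of silently eroding the original architecture.
    public final class AccountingPeriod {
        private int startYear;       // full four-digit year
        private int lengthInYears;

        public AccountingPeriod(int startYear, int lengthInYears) {
            // Precondition: a plausible starting year and a positive length.
            if (startYear < 1900 || lengthInYears < 1) {
                throw new IllegalArgumentException("precondition violated");
            }
            this.startYear = startYear;
            this.lengthInYears = lengthInYears;
            checkInvariant();
        }

        public void extendBy(int years) {
            // Precondition: extensions must be positive.
            if (years < 1) {
                throw new IllegalArgumentException("precondition violated");
            }
            int oldLength = lengthInYears;
            lengthInYears += years;
            // Postcondition: the length grew by exactly the requested amount.
            if (lengthInYears != oldLength + years) {
                throw new IllegalStateException("postcondition violated");
            }
            checkInvariant();
        }

        public int endYear() {
            return startYear + lengthInYears;
        }

        // Class invariant: must hold after every public operation.
        private void checkInvariant() {
            if (startYear < 1900 || lengthInYears < 1) {
                throw new IllegalStateException("invariant violated");
            }
        }
    }

In a language with native contract support the same conditions would be declared rather than coded by hand; the point here is only that the original designer's assumptions become explicit and enforceable.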
Don't convert: rearchitect

Regardless of how far you already are in your conversion efforts, the costs and stakes are so high that a half-hearted effort makes little sense. This is the time to be bold. The Year 2000 crisis is an opportunity. By applying a full-fledged version of object technology, including abstraction, other O-O principles, hooks, reuse and Design by Contract, you can seize this opportunity to overhaul your company's software investment, making it incomparably better in terms of correctness, robustness, extendibility, reusability and ease of use -- and, if you do your job well, ready to withstand another millennium or two.

Bibliography

[1] Bertrand Meyer, Object-Oriented Software Construction, second edition, Prentice Hall, 1997.
[2] Christopher Creel interview at http://eiffel.com.

Authors

Christopher Creel is an architect for Hewlett-Packard's Color LaserJet product line. Bertrand Meyer, editor of the Object Technology department, is president of ISE. Philippe Stephan directed the design and implementation of CALFP Bank's Rainbow generic trading system. He has recently established a new software company in the San Francisco Bay Area and can be reached at <philippe@bluecanvas.com>.

Reference

The Opportunity of a Millenium, by Christopher Creel, Bertrand Meyer and Philippe Stephan, IEEE Computer, vol. 30, no. 11, November 1997, Object Technology department.