
1. Introduction

The software crisis was a term used to describe the impact of rapid increases in computer
power and the complexity of the problems which could be tackled. In essence, it refers to the
difficulty of writing correct, understandable, and verifiable computer programs. The roots of
the software crisis are complexity, expectations, and change. Various processes and
methodologies have been developed over the last few decades to “tame” the software crisis,
with varying degrees of success. However, it is widely agreed that there is no "silver bullet", that is, no single approach that will prevent project overruns and failures in all cases. In general, large, complicated, poorly specified software projects that involve unfamiliar aspects remain particularly vulnerable to large, unanticipated problems.

“The major cause of the software crisis is that the machines have become several orders of
magnitude more powerful! To put it quite bluntly: as long as there were no machines,
programming was no problem at all; when we had a few weak computers, programming
became a mild problem, and now we have gigantic computers, programming has become
an equally gigantic problem.”

– Edsger Dijkstra, The Humble Programmer (EWD340), Communications of the ACM

2. Overview

Software engineering was spurred by the so-called software crisis of the 1960s, 1970s, and
1980s, which identified many of the problems of software development. Many software
projects ran over budget and schedule. Some projects caused property damage. A few projects
caused loss of life. The software crisis was originally defined in terms of productivity, but
evolved to emphasize quality. Some used the term software crisis to refer to their inability to
hire enough qualified programmers.

• The most visible symptoms of the software crisis:
o Late delivery and budget overruns
o Products that do not meet their specified requirements
o Inadequate documentation

• Some observations on the software crisis:
o "A malady that has carried on this long must be called normal" (Booch, p. 8)
o Software system requirements are moving targets
o There may not be enough good developers around to create all the new software that users need
o A significant portion of developers' time must often be dedicated to the maintenance or preservation of geriatric software

3. Year 2000 problem

The Year 2000 problem (also known as the Y2K problem, the millennium bug, the Y2K bug, or simply Y2K) affected both digital (computer-related) and non-digital documentation and data storage, and resulted from the practice of abbreviating a four-digit year to two digits. From the 1960s to the late 1980s, it was widespread practice in computer software to represent a year with two digits rather than four. This was done to save disk and memory space, because those resources were relatively expensive at the time. As the 1990s approached, experts began to realize this major shortcoming in computer application software. In the year 2000, computer systems could interpret "00" as 1900, corrupting date calculations. For example, if a program computed the difference between two dates, it could return a negative number: the difference between 1 Jan 2000 and 31 Dec 1999 could come out as roughly -100 years (1 Jan 1900 minus 31 Dec 1999) rather than 1 day. This was a major bug for the whole finance industry. It existed not only in computer software but also in the firmware of computer hardware. In general, the bug threatened all major industries, including utilities, banking, manufacturing, telecommunications, and airlines.
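
To make the failure mode concrete, here is a minimal Python sketch (the function name and the hard-coded 19xx era are illustrative assumptions, not code from any real system) of how a two-digit year turns a one-day difference into roughly minus a century:

    from datetime import date

    def days_between_2digit(d1, m1, yy1, d2, m2, yy2):
        # Naive difference over dates stored with two-digit years,
        # with the era hard-coded as 19xx -- the classic Y2K mistake.
        first = date(1900 + yy1, m1, d1)     # "00" becomes 1900, not 2000
        second = date(1900 + yy2, m2, d2)
        return (second - first).days

    # 31 Dec 1999 to 1 Jan 2000 should be 1 day, but the two-digit
    # logic reads "00" as 1900 and reports about a century backwards:
    print(days_between_2digit(31, 12, 99, 1, 1, 0))   # -36523 days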

3.1 What caused the problem?

In the 1960s, computer memory and mass storage were scarce and expensive, and most data
processing was done on punched cards which represented text data in 80-column records.
Programming languages of the time, such as COBOL and RPG, processed numbers in their
character representations. They occasionally used an extra bit called a zone punch to save one
character for a minus sign on a negative number, or compressed two digits into one byte in a
form called binary-coded decimal, but otherwise processed numbers as straight text. Over
time the punched cards were converted to magnetic tape and then disk files and later to
simple databases, but the structure of the programs usually changed very little. Popular
software like dBase continued the practice of storing dates as text well into the 1980s and
1990s.
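
As a rough modern illustration of that packing trick, the following Python sketch (with hypothetical helper names, standing in for the machine-specific encodings of the era) shows how two decimal digits fit into a single byte as binary-coded decimal:

    def pack_bcd(n):
        # Pack a two-digit number (0-99) into one byte as binary-coded
        # decimal: tens digit in the high nibble, units in the low nibble.
        tens, units = divmod(n, 10)
        return (tens << 4) | units

    def unpack_bcd(b):
        return (b >> 4) * 10 + (b & 0x0F)

    year = 73                   # "1973" stored as just its last two digits
    packed = pack_bcd(year)     # 0x73: one byte instead of two characters
    assert unpack_bcd(packed) == 73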

Saving two digits for every date field was a significant gain in the 1960s. Since programs at that time were mostly short-lived, written to solve a specific problem or control a specific hardware setup, neither managers nor programmers expected their programs to remain in use for many decades. The realization that databases were a new kind of program with different characteristics had not yet come, so most people did not consider dropping two digits of the year a significant problem.

“I'm one of the culprits who created this problem. I used to write those programs back in the
1960s and 1970s, and was proud of the fact that I was able to squeeze a few elements of
space out of my program by not having to put a 19 before the year. Back then, it was very
important. We used to spend a lot of time running through various mathematical exercises
before we started to write our programs so that they could be very clearly delimited with
respect to space and the use of capacity. It never entered our minds that those programs
would have lasted for more than a few years. As a consequence, they are very poorly
documented. If I were to go back and look at some of the programs I wrote 30 years ago, I
would have one terribly difficult time working my way through step-by-step.”

– Alan Greenspan, then Chairman of the US Federal Reserve, 1998

3.2 Prevention

The first person known to publicly address this issue was Bob Bemer, who had noticed it in 1958 as a result of work on genealogical software. He spent the next twenty years trying to make programmers, IBM, the US government, and the ISO aware of the problem, with little result. This included the recommendation that the COBOL PICTURE clause be used to specify four-digit years for dates. Programmers could have done this at any time from the initial release of the first COBOL compiler in 1961 onwards. However, lack of foresight, the desire to save storage space, and overall complacency prevented this advice from being followed. Despite magazine articles on the subject from 1970 onwards, the majority of programmers only began recognizing Y2K as a looming problem in the mid-1990s, and even then inertia and complacency left it mostly unresolved until the last few years of the decade. In 1989, Erik Naggum was instrumental in ensuring that Internet mail used four-digit representations of years, by including a strong recommendation to this effect in the Internet host requirements document, RFC 1123.

The Y2K bug was a ticking time bomb for all major computer applications. Operating system and system software vendors released Year 2000 compliant versions of their products, and IT companies around the world spent billions of dollars combing through their entire application source code to find and fix the bug. Almost everyone raced to become Y2K compliant before the fast-approaching deadline. Finally, when the big day came, many utilities and other companies switched off their main computers and put their backup systems to work. When the clock ticked over to 1 January 2000, no major problems were reported: banks worked fine, no major power outages occurred, airplanes still flew, and the world went on with its normal life.
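
One class of fixes applied during that remediation effort, though not named above, was "windowing": rather than widening every stored date field to four digits, programs mapped the existing two-digit years onto a 100-year window around a pivot. A minimal Python sketch, with an arbitrarily chosen pivot of 50:

    PIVOT = 50   # hypothetical pivot: two-digit years below 50 mean 20xx

    def expand_year(yy, pivot=PIVOT):
        # Map a stored two-digit year onto the window 1950-2049, so old
        # data keeps working without widening every field to four digits.
        return 2000 + yy if yy < pivot else 1900 + yy

    assert expand_year(99) == 1999   # "99" still means 1999
    assert expand_year(0) == 2000    # "00" now means 2000, not 1900
    assert expand_year(49) == 2049   # the window closes at the pivot

Windowing was cheap to retrofit, but it only defers the problem: the same data breaks again once dates outside the chosen window appear.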

3.3 Responsibilities

Since saving space was so crucial in the early days of software development, developers tried to save as many bytes as they could when writing software. The IT companies, software development firms, and individual early developers who wrote two-digit years instead of four-digit ones share the responsibility for this crisis.

4. Conclusion

Various processes and methodologies have been developed over the last few decades to "tame" the software crisis, with varying degrees of success, yet it is widely agreed that there is no "silver bullet": no single approach will prevent project overruns and failures in all cases. The Year 2000 problem illustrates this vividly: a sensible space-saving decision, left unexamined for decades, grew into a defect that threatened entire industries and cost billions of dollars to repair. Large, complicated, poorly specified software projects that involve unfamiliar aspects remain particularly vulnerable to large, unanticipated problems.
