
A Brief History of Software Engineering

Niklaus Wirth
ETH Zurich

This personal perspective on the art of programming begins with a look at the state of programming from about 1960, and it follows programming's development through the present day. The article examines key contributions to the field of software engineering and identifies major obstacles, which persist even today.

This article originally appeared in L. Boszormenyi, ed., MEDICHI 2007: Methodic and Didactic Challenges of the History of Informatics, Austrian Computer Society (OCG), Klagenfurt, Austria, 178 pp. The Austrian Computer Society (OCG) has kindly granted permission for an edited version to be published by the Annals.

The term programming was commonly used through the mid-1960s, and referred essentially to the task of coding a computer. The term software engineering, referring to the highly disciplined, systematic approach to software development and maintenance, came into existence after a NATO-sponsored conference in 1968. At that conference, the difficulties and pitfalls of designing complex systems were explored in depth, and a search for solutions began that concentrated on better methodologies and tools. The most prominent of these tools were languages reflecting procedural, modular, and object-oriented styles of programming. Since 1968, the development of software engineering has been intimately tied to these tools' emergence and improvement, as well as to efforts for systematizing or automating program documentation and testing. Ultimately, analytic verification and correctness proofs were supposed to replace testing, but that has not happened.

As this article will explain, the rapid growth of computing power has made it possible in recent years to apply computing to tasks that are ever more complicated. This trend has dramatically increased the demands on software engineers. Programs and systems have become increasingly complex and almost impossible for a single individual to fully understand. The abundance of computing resources, coupled with a significant drop in their cost, inevitably reduced the attention given to good design. At the expense of quality, the pursuit of profit became paramount, but we should be concerned about the resulting deterioration in programming quality. Our limitations in designing complex systems are no longer determined by slow hardware, but by our own intellectual capability. From experience, we know that most programs could be significantly improved, made more reliable, more economical, and easier to use.

The 1960s and the origin of software engineering

It is unfortunate that people dealing with computers often have little interest in the history of their subject. As a result, many concepts and ideas are propagated and advertised as being new, when in fact they existed decades ago, perhaps under a different name. I believe it worthwhile to occasionally consider the past and to investigate how computing terms and concepts originated.

I regard the late 1950s as a period essential to the era of computing. At that time, large computers became available to research institutions and universities. Computers were then used primarily in engineering and the natural sciences, but they soon became indispensable in business, too. The time when they were accessible only to a few insiders in laboratories, when they frequently broke down whenever one wanted to use them, belonged to the past. Computers' emergence from the closed laboratory of electrical engineers into the public domain meant that computers' use, in particular their programming, became an activity of many. As a result, a new profession was born, and all kinds of companies began to hire programmers. The actual computers, however, remained hidden, enclosed within special rooms built to house them in those same companies. Programmers would write code and bring their programs to a counter, where a dispatcher would pick up the programs and queue them for processing.

The programs' results would be fetched from the counter hours or days later. There was no interactivity between man and computer: programming and computing were separate tasks.

Programming was understood to be a sophisticated task requiring painstaking attention to detail, along with a love for obscure codes and what might even be called tricks. To facilitate this coding, formal notations were created that we now call programming languages. The primary idea was to replace sequences of special instruction code by mathematical formulas. The first widely known language, Fortran, was developed by IBM in 1957, soon followed by the first Algol version in 1958 and its official successor in 1960. As computers were then used for computation rather than for data storage and communication, these early languages catered mainly to numerical mathematics. In 1962, the language Cobol was issued by the US Department of Defense specifically for business applications.

But as computing capacity grew, so did the demands on programs and on programmers: tasks became ever more intricate. It was slowly recognized that programming was a difficult task, and that mastering complex problems was nontrivial, even though, or perhaps because, computers were so powerful. Salvation was sought in better programming languages, in more tools, and in automation.

To be considered better, a language should be useful in a wider area of application; it should be more like a natural language and also offer more facilities. For example, PL/1, developed by IBM in the early 1960s, was designed to unify the scientific and commercial worlds. It was advertised under the slogan "Everybody can program thanks to PL/1." Programming languages and their compilers became a principal cornerstone of computing science, but they fitted into neither mathematics nor electronics, the two traditional sectors where computers were used. A new discipline soon emerged, which was called computer science in the US and informatics in Europe.

In 1963, the first time-sharing system appeared. It was designed by John McCarthy at MIT and implemented on a slightly extended DEC PDP-1 computer. This time-sharing system provided the interactivity that batch-processing systems lacked. Computer manufacturers seized on the idea and soon announced time-sharing systems for their large mainframes (for example, the IBM 360/67 and the General Electric GE-645).

It turned out that the transition from batch-processing to time-sharing systems was vastly more difficult than anyone had anticipated. Among other difficulties, systems were announced but then could not be delivered on time; the operational problems were too complex; research had to be conducted on the job. The topics of multiprocessing and concurrent programming, central ingredients of time-sharing systems, had not been encountered before and were insufficiently mastered. Consequently, systems were promised but could not be completed and delivered on time. The difficulties brought big companies to the brink of collapse.

In the midst of this situation, in 1968 NATO sponsored a conference dedicated to the topic of software engineering.1 Although critical comments had occasionally been voiced earlier about the programming profession,2,3 it was not until the 1968 conference that programmers' difficulties were openly discussed in a public forum, and with unusual frankness, and that the terms software engineering and software crisis were coined. It was recognized that current techniques of software development were inadequate and that new, more methodical ways had to be adopted.

Programming as a discipline

The software crisis in 1968 existed despite efforts made several years previously to specifically address the issue of programming's increasing complexity.


In the academic world, it was mainly Edsger W. Dijkstra and C.A.R. Hoare who had recognized the problems endemic to programming and who offered new ideas. In 1965, Dijkstra had written his famous Notes on Structured Programming4 and declared programming to be a discipline rather than a craft. Also in 1965, Hoare had published an important paper about data structuring.5 These ideas had a profound influence on new programming languages, in particular Pascal.6 New languages were the vehicles in which these new ideas were to be expressed. Structured programming became supported by a structured programming language.

In 1966, Dijkstra had written a seminal paper about harmoniously cooperating processes,7 postulating a discipline based on semaphores. Semaphores, primitives for the synchronization of concurrent processes, can be regarded as a data type with the values 0 and 1 and the atomic operators P and V. (P(s) delays the calling process until s has, by action of another process, obtained the value 1, and then sets the semaphore to 0. V(s) sets s to 1.) Hoare had followed in 1966 with his Communicating Sequential Processes (CSP),8 based on the concept of channels and conditions, and also two operators, which he denoted by ? (inquiring) and ! (asserting a condition). Hoare had fully recognized that in the future programmers would have to cope with the difficulties posed by concurrent processes. The necessity of dealing with concurrency would clearly make a structured and disciplined methodology even more compelling.
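
For readers less familiar with the notation, the following sketch shows how the P and V operations just described might be expressed today. It is a minimal illustration in C using C11 atomics; the struct, the function names, and the busy-waiting loop are assumptions made for exposition, not Dijkstra's original formulation (which predates C entirely), and a real implementation would block waiting processes rather than spin.

/* a binary semaphore holding only the values 0 and 1 */
#include <stdatomic.h>

typedef struct { atomic_int value; } semaphore;

void sem_init(semaphore *s, int v) { atomic_init(&s->value, v); }

/* P(s): delay the caller until s has obtained the value 1 (by action of
   another process), then atomically set it back to 0. */
void P(semaphore *s) {
    int expected = 1;
    while (!atomic_compare_exchange_weak(&s->value, &expected, 0)) {
        expected = 1;   /* s was 0 (or the exchange failed spuriously): retry */
    }
}

/* V(s): set s to 1, releasing one waiting process. */
void V(semaphore *s) {
    atomic_store(&s->value, 1);
}
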
All these efforts aside, the field of programming was still in a state of upheaval and even disarray by the time of the 1968 NATO conference. The important developments by Dijkstra, Hoare, and others did not, and could not, change the software situation nor dispel all difficulties overnight. Industry could change neither its policies nor its tools rapidly enough to be of use to programmers, who were restricted to working with available tools and languages, none of which incorporated these new ideas in the 1968 time frame. Nevertheless, intensive training courses on structured programming began to be organized, notably by Harlan D. Mills at IBM. Even the US Department of Defense realized that software problems were urgent and becoming more so. For its part, the DoD initiated a project that ultimately led to the programming language Ada, a highly structured language suitable for a wide variety of applications. Software development within the DoD would then be based exclusively on Ada, and still is to a significant extent.9

Unix and C

While the concepts of structured programming slowly gained acceptance, notably in academia, another movement started to invade the programmers' world. It was spawned by the spread of the Unix operating system, which Ken Thompson had developed at Bell Labs, and which in its simplicity contrasted markedly with the complexity of MIT's Multics. Unix was specifically designed for, and was small enough to fit, the rapidly emerging minicomputers. It was a highly welcome relief from the large operating systems established on mainframe computers.

In its wake came the language C,8 which had been explicitly designed by Dennis Ritchie, also at Bell Labs, to support the development of Unix. It was therefore at least attractive, if not mandatory, to use C for developing applications that ran under Unix, which thus acted like a Trojan horse for C.

But C did not carry the spirit of structured programming. It was rather like assembler code in the disguise of a remotely Algol-like syntax. Neither did it allow for strict checking of data types. From the point of view of software engineering, the rapid spread of C therefore represented a great leap backward (an ironic allusion to China's Great Leap Forward). It revealed that the community at large had hardly grasped the true meaning of the term high-level language, which became a poorly understood buzzword. What, if anything, was to be high level now? Because this issue lies at the very core of software engineering, we need to elaborate.

Abstraction

Computer systems involve machines of great complexity. This complexity can be mastered intellectually by one tool only: abstraction. A language represents an abstract computer whose objects and constructs lie closer to, and reflect more directly, the problem to be represented than the concrete machine. For example, in a high-level language we deal with numbers, indexed arrays, data types, and conditional and repetitive statements, rather than with bits and bytes, addressed words, jumps, and condition codes. However, these abstractions are beneficial only if they are consistently and completely defined in terms of their own properties. If this is not the case, if the abstractions can be understood only in terms of the facilities of an underlying computer, then the benefits are marginal, almost given away.



If debugging a program, undoubtedly the most pervasive activity in software engineering, requires a hexadecimal dump, then the use of a high-level language is hardly worth the trouble.

The widespread use of C effectively, if unintentionally, sabotaged the programming community's attempt to raise the level of software engineering. This was true because C offers abstractions which it does not in fact support: arrays that remain without index checking, data types without a consistency check, pointers that are merely addresses where addition and subtraction are applicable. One might have classified C as being somewhere on a scale between misleading and (possibly) dangerous. Regardless, people (particularly those in academia) found C to be intriguing and better than assembly code because it featured some syntax.
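
The following fragment, an invented example rather than one from the article's sources, illustrates these points concretely: each commented line passes the compiler without a type or bounds error, and the stray array write and pointer dereference are in fact undefined behavior at run time.

#include <stdio.h>

int main(void) {
    int a[4] = {1, 2, 3, 4};
    int i = 7;
    a[i] = 99;             /* no index check: writes beyond the array       */

    int *p = &a[0];
    p = p + 100;           /* a pointer is merely an address; arithmetic on */
    printf("%d\n", *p);    /* it is allowed, but the result is meaningless  */

    double d = 3.14;
    int n = d;             /* silent truncation: no consistency check       */
    printf("%d\n", n);
    return 0;
}
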
The trouble was that C's rules could easily be broken, exactly what many programmers valued. C made it possible for programmers to manage access to all of a computer's idiosyncrasies, even to those items that a high-level language would properly hide. C provided freedom, whereas high-level languages were considered straitjackets enforcing unwanted discipline. The nature of C was effectively an invitation for programmers to use tricks and loopholes that had been necessary to achieve efficiency in the early days of computers, but which now were pitfalls that made large systems error-prone, and costly to debug and maintain.

Languages appearing around 1985, such as Ada and C++, tried to remedy this defect and to address a much wider variety of foreseeable applications. As a consequence, these newer languages became large and their descriptions voluminous. Compilers and support tools became bulky and complex. Ultimately, instead of solving problems, these new languages became problems themselves. As Dijkstra frequently said: they belonged to the problem set rather than the solution set.

Consequently, progress in software engineering seemed to stagnate. The difficulties grew faster than new tools could resolve or restrain them. Pseudotools like software metrics, such as counting the number of go-to statements in a program or measuring the average length of identifiers, had revealed themselves as being of no help, but at least software engineers were no longer judged by the number of lines of code produced per hour.

Advent of the microcomputer

The propagation of software engineering and Pascal notably did not occur in industry, but on other fronts: in schools and the home, in no small way because of the emergence of microcomputers. Microcomputers first appeared on the market in 1975 (Commodore, Tandy, and Apple; much later, IBM entered the market). They were based on single-chip processors (such as the Intel 8080, Motorola 6800, and Rockwell 6502) with 8-bit data buses, 32 Kbytes or less of memory, and clock frequencies of less than 1 MHz. These advances made computers affordable for individuals, in contrast to large organizations such as companies and universities. But microcomputers at that time were toys, not useful computing engines.

The breakthrough for microcomputers, in the context of software development, came when it was shown that high-level languages could be used in conjunction with microcomputers. This happened in 1975, when the group that Ken Bowles led at the University of California, San Diego, built a text editor, a file system, and a debugger around the portable Pascal compiler (P-code) developed at ETH in Zurich, and distributed the package for $50. Soon, Borland Software came out with its version of a high-level compiler, Turbo Pascal. This was at a time when other compilers were very expensive, so the emergence of these comparatively cheap Pascal implementations involved nothing less than a turning point in commercializing software for development purposes. Suddenly, there was a mass market. Computing went mainstream.

Meanwhile, requirements on software systems continued to grow and, in turn, so did the complexity of programs. The craft of programming turned to hacking in many cases. Methods were sought to systematize, if not construction, then at least program testing and documentation.


Although this was helpful, the real problems of hectic programming under time pressure remained. In conversation, more than once, Dijkstra would pinpoint the difficulty by saying that testing may show the presence of errors, but it can never prove their absence. He also sneered that software engineering is "programming for those who can't."

Programming as a mathematical discipline

In 1967, Robert W. Floyd had suggested the idea of assertions of states, of truths always valid at given points in a program.10 It led to Hoare's seminal paper titled "An Axiomatic Basis for Computer Programming," postulating the so-called Hoare logic.11 A few years later, in 1975, Dijkstra deduced from it the calculus of predicate transformers.12 Programming was obtaining a mathematical basis. Programs were no longer just code for controlling computers, but static texts that could be subjected to mathematical reasoning.
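
As a small taste of that reasoning (a toy example, not one drawn from the cited papers), a Hoare triple {P} S {Q} asserts that if precondition P holds before statement S executes, then postcondition Q holds afterward. Hoare's assignment axiom and one instance of it can be written as:

% assignment axiom: substitute the expression E for x in the postcondition
\{\, Q[x := E] \,\}\;\; x := E \;\;\{\, Q \,\}

% instance: from x >= 0 before the assignment, x >= 1 follows after it
\{\, x \ge 0 \,\}\;\; x := x + 1 \;\;\{\, x \ge 1 \,\}

The conclusion is reached by substitution alone, without ever running the program, which is precisely the sense in which a program becomes a static text open to mathematical reasoning.
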
Although these developments were recognized at some universities, they passed virtually unnoticed in industry. Indeed, Hoare's logic and Dijkstra's predicate transformers were explained on interesting but small algorithms such as multiplication of integers, binary search, and greatest common divisor. But industry was plagued by large, even gargantuan, systems. It was not obvious that mathematical theories would ever solve real problems when the analysis of simple algorithms was demanding enough.

An eventual solution to the dilemma of mathematical rigor for small programs against the intractability of large programs as they existed in industry emerged in the form of a disciplined manner of programming, rather than from rigorous scientific theory. A major contribution to structured programming was made by David Parnas in 1972 with the concept of information hiding,13 and at the same time by Barbara Liskov with the concept of abstract data types.14 Both concepts embody the notion of breaking up large systems into parts called modules, and clearly defining their interfaces. If module A uses (or imports) module B, then A is called a client of B. The designer of A then need not know the details of B's functioning, but only the properties stated by its interface.
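
A minimal sketch of this client relationship follows. It is written in C only to keep all examples here in one language (Modula-2 or Ada expresses the same idea more directly, with the compiler checking interface compatibility), and the file and function names are invented for illustration.

/* b.h -- the published interface of module B: all that a client may rely on */
#ifndef B_H
#define B_H
int b_lookup(const char *key);   /* returns the value stored under key */
#endif

/* b.c -- the hidden implementation of module B; free to change at any time */
#include "b.h"
#include <string.h>
int b_lookup(const char *key) { return (int)strlen(key); }   /* placeholder */

/* a.c -- module A, a client of B, written and compiled against b.h alone */
#include "b.h"
int a_twice(const char *key) { return 2 * b_lookup(key); }

The two .c files can be compiled separately and linked afterward; module A never sees how B does its work.
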
This principle, modularization, probably constituted the most important contribution to software engineering, that is, to the construction of systems by large groups of people. The concept of modularization is greatly enhanced by the technique of separate compilation with automatic checking of interface compatibility.

Just as structured programming had been the guiding spirit behind Pascal, modularization was the principal idea behind the language Modula-2, Pascal's successor, published in 1979.15 In fact, the motivation for Modula-2 actually came from the language Mesa, an internal development of the Xerox Palo Alto Research Center (PARC) and itself a descendant of Pascal. The concept of modularization and separate compilation was also adopted by the language Ada (1984), which was likewise based largely on Pascal. In Ada, modules were called packages.

Era of the personal workstation

However, another development influenced the computing field more profoundly than all programming languages. It was the workstation, whose first incarnation, the Alto, was built in 1975 by the Xerox PARC lab.16 In contrast to the aforementioned microcomputers, the workstation was powerful enough to allow serious software development, complex computations, and the use of a compiler for an advanced programming language. Most important, the Alto pioneered the bit-mapped, high-resolution display and the pointing device called a mouse, which together brought about a revolutionary change in computer usage. Along with the Alto, the concept of a local area network was introduced, and that of central servers for (laser) printing, large-scale file storage, and email service.

It is no exaggeration to claim that the modern computing era started in 1975 with the Alto. The Alto caused nothing less than a revolution, albeit slowly, and as a result many people today have no idea how computing was done before 1975, without personal, highly interactive workstations. The influence of these developments on software engineering cannot be overestimated.

As the demand for ever more complex software persistently grew, and as the difficulties became more menacing, illustrated by some spectacular failures (of which the most conspicuous was the crash of a rocket that abruptly ended a space mission), the search for panaceas began. Many cures were offered, sold, and soon forgotten. One of them, however, proved fruitful and has survived: object-oriented programming (OO).

Up until 1980, the commonly accepted model of computing was the transformation of data from their given state to the desired result, gradually turning input into output.



In its simplest abstract form, this is the finite-state machine. This view of computing stemmed from the original task of computers: performing numerical computations. However, another model gained ground in the 1960s, originating from the simulation of discrete-event systems (such as supermarkets, assembly lines, and logistics). A discrete-event system consists of actors (processes) that come and go, that pass through phases in their lifetime, and that carry a set of private data representing their current state. It proved natural to think of such an actor together with its state as a unit, as an object.
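
In present-day terms, such an actor might be sketched as follows; this is an invented fragment, kept in C for consistency with the earlier examples, although an object-oriented language would bind the state and its operations together more directly.

#include <stdio.h>

typedef struct {            /* a customer in a supermarket simulation  */
    int    id;
    int    items;           /* private state carried by the actor      */
    double arrival_time;
} Customer;

/* operations that belong to the actor and act only on its own state */
void customer_pick(Customer *c, int n)    { c->items += n; }
int  customer_checkout(const Customer *c) { return c->items; }

int main(void) {
    Customer c = { .id = 1, .items = 0, .arrival_time = 9.5 };
    customer_pick(&c, 3);
    printf("customer %d checks out %d items\n", c.id, customer_checkout(&c));
    return 0;
}
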
Some programming languages were designed on the basis of this model, their ancestor being Ole-Johan Dahl and Kristen Nygaard's Simula in 1965. But such languages remained confined to the field of simulation of discrete-event systems. Only after the emergence of powerful personal computers did the OO model gain wider acceptance. Now, computing systems would feature windows, icons, menus, buttons, toolbars, and so on, all easily identifiable as visible objects with individual state and behavior. Languages appeared supporting this model, among them Smalltalk (developed by Adele Goldberg and Alan Kay in 1980), Object Pascal (Larry Tesler, 1985), C++ (Bjarne Stroustrup, 1985), Oberon (Niklaus Wirth, 1988), Java (Sun Microsystems, 1995), and C# (Microsoft, 2000).

Object orientation became both a trend and a buzzword. Indeed, choosing the right model for an application is important. Nevertheless, OO is not appropriate for all applications.

Abundance of computer power

The period since 1985 has, until a few years ago, chiefly been characterized by enormous advances in hardware technology. Today, even tiny computers, such as mobile telephones, have 100 times more power and capacity than the biggest of 20 years ago. It is fair to say that semiconductor and disk technologies have recently determined all advances. Who, for example, would have dreamed in 1990 of memory sticks holding several gigabytes of data, of tiny disks with dozens-of-gigabytes capacity, of processors with clock rates of several gigahertz?

Such speedy development has vastly widened the area of computer applications. This has happened particularly in connection with communications technology. It is now hard to believe that before 1975 computer and communications technologies were considered separate fields. Electronics has united them, and has made the Internet pervasive, featuring a bandwidth that is apparently unlimited. I am overwhelmed when I compare this development with the first, stand-alone minicomputer that I worked with in 1965, a DEC PDP-1. The PDP-1 had a clock rate of less than 1 MHz, memory of 8,000 18-bit words, and drum storage of some 200 Kbytes. It was time-shared by up to 16 users. It is a miracle that some people persisted in believing that one day computers would become powerful enough to be useful for more than just accounting and academic exercises.

In the 1990s, the open source phenomenon took hold and started to spread. The distrust with which programmers regarded huge systems designed in industrial secrecy became manifest. Although programmers had previously made a limited amount of software available for free, once the Internet became ubiquitous, an even wider community of programmers decided to build software and more systematically distribute their products for free through the Internet. Although it is difficult to recognize this as a sound business principle (it makes the idea of patents obsolete), the bandwagon turned out to have a rather successful following. The notions of quality, and of responsibility in case of failure, seemed irrelevant. Open source appeared as the welcome alternative to industrial hegemony and abrasive profit, and also against helpless dependence on commercial software.

It is often difficult in software engineering to distinguish between business strategies and scientific ideas. Concerning the latter, open source appears to be a last attempt to cover up failure. The writing of complicated code and the nasty deciphering of it by others is apparently considered easier or more economical than the careful design and description of clean module interfaces. The easy adaptability of modules when they are available in source form is also a poor argument. In whose interest would a wild growth of varieties of variants ever be? Not in that of anyone concerned with high-quality engineering and professionalism.

Wasteful software

Whereas the incredible increase in the power of hardware was very beneficial for a wide spectrum of applications (think of administration, banks, railways, airlines, guidance systems, engineering, science), the same cannot be claimed for software engineering. Surely, software engineering has profited too from the many sophisticated development tools. But the quality of its products hardly reflects signs of great progress.


No wonder: after all, the increase of power was itself the reason for the terrifying growth of complexity. Whatever progress was made in software methodology was quickly compensated for by still higher complexity of the software tasks. This is reflected by Martin Reiser's law: "Software is getting slower faster than hardware is getting faster."17 Indeed, new problems have been tackled that are so difficult that engineers often have to be admired more for their optimism and courage than for their success.

What has happened in software engineering was predictable, because it is inherent in a field of engineering where the demands rise, work is done under time pressure, and the cost of materials dramatically drops. The consequence is a true waste of cheap resources, of both computing power and storage capacity, resulting in inefficient code and bulky data. This waste has become ever present and represents a grave lack of concern for software quality. A program's inefficiency is easily covered up by the use of faster processors, and poor data design by the use of larger storage devices. But their side effect is a decrease of quality, of reliability, robustness, and ease of use. Good, careful design is time-consuming and costly. It is, however, still cheaper than unreliable, difficult software, when the cost of maintenance is factored in. The trend is disquieting, and so is the complacency of customers.

Personal reflections and conclusions

What can we do to release this logjam? There is little point in reading history unless we are willing to learn from it. Therefore, I dare to reflect on the past and will try to draw some conclusions. A primary effort must be education toward a sense of quality. Programmers must become engaged crusaders against homemade complexity. The cancerous growth of complexity is not a thing to be admired; it must be fought wherever possible.18 Programmers must be given time and afforded respect for work of high quality. This is crucial, and ultimately more effective than more and better tools and rules. Let us embark on a global effort to prevent software from becoming known as softwaste!

Recently I have become acquainted with a few projects in which large, commercial operating systems were discarded in favor of the Oberon system, whose principal design objective had been perspicuity and concentration on the essentials.19 The project leaders, being obliged to deliver reliable, economical software, had recognized that they were unable to do so as long as they (however carefully) built their own work on top of complex base software (a platform) that was neither fully described nor dependable. We know that any chain is only as strong as its weakest link. This holds also for module hierarchies. Systems can be designed with utmost care and professionalism, yet they remain error-prone if built on a complex and unreliable platform.

The crazy drive for more complexity, euphemistically called sophistication, long ago also afflicted the most essential tool of the software engineer: the programming language. Modern languages like Java and C# may well be superior to old ones like Fortran, PL/1, and C, but they are far from perfect, and they could be much better. Their manuals of several hundred pages are an unmistakable symptom of their inadequacy. Engineers in industry, however, are rarely free from constraints. Their work must be compatible with the rest of the world, and to deviate from established standards might prove fatal to the engineers' companies.

But this cannot be said about academia. It is therefore a sad fact that academia has remained inactive and complacent about the state of programming languages. Not only has research in languages and design methodology lost its attractiveness and glamour, but, worse, the tools common in industry have now quietly been adopted in the academic world, without debate and criticism. Current languages may be inevitable in industrial use, but for teaching, for an orderly, structured, systematic, well-founded introduction to programming and designing algorithms, they are wrong and obsolete.

This situation is notably, and sadly, in accord with other trends of the 21st century: we teach, learn, and perform only what is immediately profitable, what is requested by students. In plain words: we focus on what sells. Universities have traditionally been exempt from this commercial focus. Universities were places where people were expected to ponder what matters in the long run. Universities were spiritual and intellectual leaders, showing the path into the future. In our field of computing, I am afraid, they have simply become docile followers. Universities appear to have succumbed to the trendy yearning for continual innovation, and to have lost sight of the need for careful craftsmanship.

If we can learn anything from the past, it is that computer science is in essence a methodological subject.



It is supposed to develop (teachable) knowledge and techniques that are generally beneficial in a wide variety of applications. This does not mean that computer science should drift into all these diverse applications and lose its identity. Software engineering would be the primary beneficiary of a professional education in disciplined programming. Among its tools, languages figure in the forefront. A language with appropriate constructs and structure, resting on clean abstractions, is instrumental in building artifacts, and mandatory in education. Homemade, artificial complexity has no place in languages. And finally: it must be a pleasure to work with languages, because they enable us to create artifacts that we can show and be proud of.

References and notes

1. P. Naur and B. Randell, eds., Software Engineering: Report on a Conference Sponsored by the NATO Science Committee, Scientific Affairs Division, NATO, 1968.
2. E.W. Dijkstra, "Some Meditations on Advanced Programming," Proc. IFIP Congress, North-Holland, 1962, pp. 535-538.
3. R.S. Barton, "A Critical Review of the State of the Programming Art," Proc. Spring Joint Computer Conf., AFIPS Press, 1963, pp. 169-177.
4. E.W. Dijkstra, "Notes on Structured Programming," Structured Programming, O.-J. Dahl, E.W. Dijkstra, and C.A.R. Hoare, eds., Academic Press, 1972, pp. 1-82.
5. C.A.R. Hoare, "Notes on Data Structuring," Structured Programming, O.-J. Dahl, E.W. Dijkstra, and C.A.R. Hoare, eds., Academic Press, 1972, pp. 83-174.
6. N. Wirth, "The Programming Language Pascal," Acta Informatica, vol. 1, 1971, pp. 35-63.
7. E.W. Dijkstra, "Cooperating Sequential Processes," Sept. 1965; reprinted in Programming Languages, F. Genuys, ed., Academic Press, 1968, pp. 43-112.
8. C.A.R. Hoare, "Communicating Sequential Processes," Comm. ACM, vol. 21, no. 8, 1978, pp. 666-677.
9. J.G.P. Barnes, "An Overview of Ada," Software Practice and Experience, vol. 10, 1980, pp. 851-887.
10. R.W. Floyd, "Assigning Meanings to Programs," Proc. Symp. Applied Mathematics, Am. Mathematical Soc., vol. 19, 1967, pp. 19-32.
11. C.A.R. Hoare, "An Axiomatic Basis for Computer Programming," Comm. ACM, vol. 12, no. 10, 1969, pp. 576-580.
12. E.W. Dijkstra, "Guarded Commands, Nondeterminacy and Formal Derivation of Programs," Comm. ACM, vol. 18, no. 8, 1975, pp. 453-457.
13. D.L. Parnas, "Abstract Types Defined as Classes of Variables," ACM SIGPLAN Notices, vol. 11, no. 2, 1976, pp. 149-154.
14. B. Liskov and S. Zilles, "Programming with Abstract Data Types," Proc. ACM SIGPLAN Symp., ACM Press, 1974, pp. 50-59.
15. N. Wirth, Programming in Modula-2, Springer, 1974.
16. C.P. Thacker et al., Alto: A Personal Computer, tech. report CSL-79-11, Xerox PARC, Aug. 1979.
17. N. Wirth, "A Plea for Lean Software," Computer, Feb. 1995, p. 64.
18. Ibid., pp. 64-68.
19. M. Franz, "Oberon: The Overlooked Jewel," The School of Niklaus Wirth: The Art of Simplicity, L. Boszormenyi, J. Gutknecht, and G. Pomberger, eds., Morgan Kaufmann, 2000, pp. 41-54.

Niklaus Wirth is professor emeritus of the Swiss Federal Institute of Technology (ETH) in Zurich. He designed the programming languages Pascal (1970), Modula (1979), and Oberon (1988), and the workstations Lilith (1980) and Ceres (1986), as well as their operating software. Wirth received a PhD from the University of California at Berkeley in 1963. His honors include the IEEE Emanuel R. Piore Award (1983), the ACM A.M. Turing Award (1984), and the IEEE Computer Pioneer Award (1987). Wirth is a Foreign Associate of the National Academy of Engineering.

Readers may contact Niklaus Wirth about this article at wirth@inf.ethz.ch.

For further information on this or any other computing topic, please visit our Digital Library at http://computer.org/csdl.
