draft, march 2010

computer science must be naturalized... effectively!

introduction

when discussing the apparently most enduring problems of computer science, the prevailing positivist
intellectual framework supervising both the discipline itself and its immediate periphery seems
infected by talk of an alleged trichotomy of mathematics, science and technology; further, it is
claimed, this alleged trichotomy triggers the widely circulated but ultimately relatively innocuous
identity crisis, usually seen as something more than a suspected coming-of-age syndrome (surely,
some developmental stories do turn out unhealthy); a better explanation for such chronic problems,
however, is that, from the very beginning, the discipline has been hijacked by a rationalist ideology
which is, indeed, the very source of most of the harm; the meta-level analysis given below exposes this
more specifically: the trouble is, after all, either essentially a bloated rationalism (mathematicism), or the
persisting vestiges of rationalism (a primary emphasis on computation/programming), or else simply
the side effects of a skewed rationalist attitude (contempt for and suppression of empirical issues)...

as a consequence of this domination of computer science's life story by the rationalist ideology,
extending from the past up to the present (and already shaping the foreseeable future), not only
the inherently empirical aspects of computation per se, but, probably more importantly, the
other relevant parts of the discipline that have been perennially empirical (e.g. computer organization,
programming languages, information/knowledge management, human-computer interaction, etc.) did
not receive the "scientific" attention they deserved either... thus, alongside some other factors at work,
considerable damage has been inflicted on computer science; if this situation is not changed, the
cost will keep increasing in terms of lost and wasted resources; therefore, rationalism's dominance should be
ended as soon as possible in favor of an explicitly stated empiricism, that is, computer science should
be effectively naturalized...

historical backdrop

to be able to put the ensuing analysis into proper perspective, the historical backdrop should be given,
even if only in broad terms; two crosscutting issues are at work when examining the
developmental history (historiography) of computer science: first, the definition of the scope and
subject matter of the discipline is problematic; second, its analysis within the wider (social) context, i.e.
its practical/social functioning, is of great interest; in the course of events, things get more complicated
as these two issues strongly interact and reshape each other...

from the very beginning, the scope and subject matter of computer science has been changing
considerably, and it is surely not the first nascent "science" to undergo radical mutations on its
path toward maturity; interestingly, however, such developments did little to settle the hot debates
-which also started at the outset- over whether it is science, mathematics or engineering, definitely a
first for a discipline (it is also worth noting that talk about this issue prevails, particularly, in the positivist
literature); although the tension between the alternative views has been somewhat mitigated, it is still
there, and the lack of consensus among all parties has not been very productive...

on the other hand, the wider (social) context, i.e. the practical/social functioning as shaped by all kinds
of major emerging phenomena that define the nature, breadth and epicenter of the practice -which can
be aptly called the pragmatics of the practice at large, and should be distinguished from the
pragmatics of the constructs to be explained further below- provides the primary dynamics of the
interaction between these two crosscutting issues...

a historiographical analysis of this pragmatics at large, for example, would divide the past 70 years
into three periods: an inaccessible, jovian priesthood era governed by mathematicians/logicians and
a technocratic elite behind pressurized room doors from the early days on, the so-called "number
crunching" days; a more mundane commoners' era, heralded and propelled by the personal computer
revolution, which resulted in a computing sprawl characterized by two unexpected phenomena: access
to "production" by "lay" people and an accompanying mass consumption, both focusing on
"information processing" and not merely numbers; and, finally, the connected/social era, still in
progress -governing the present and the foreseeable future- and emphasizing the "communication"
aspect; thus, the discussion about the subject matter of computer science is shaped by these
three different historical periods, as each gave rise to a different conception of it...

metamodel

to assess both the historiographical and the current conceptions of computer science, a metamodel, i.e.
a more formal intellectual framework, should first be created to enable a comparative critique of the
analyses suggested hitherto; in broad terms, the classical triad of the operation/operator/operand
abstraction can serve such a purpose: here, the "operation" is computation, the "operator" is both the
computer and the user who interacts with the computer (the latter will specifically be called the
"operative"), and, finally, the "operand" is whatever is "computed" in the process

under the categorization "operation/computation", the concept "program" is also subsumed, both in the
sense of programming/program development, i.e. "production", and of program usage, i.e.
"consumption"; likewise, the categorization "operator/computer" also includes, if any, a
communication/interaction medium that facilitates the whole process; finally, the "operand" category
designates (numeric) data, (symbolic) information, representations, etc.
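
to make the triad concrete, a minimal sketch follows; it is purely illustrative -the class and field names
below are hypothetical, not part of any cited framework- and merely renders the three categories, with
their subsumed notions, as types:

```python
from dataclasses import dataclass
from typing import Any, Callable, Optional

@dataclass
class Operand:
    # whatever is "computed": (numeric) data, (symbolic) information, representations
    content: Any

@dataclass
class Operator:
    # the computer, plus the "operative" (the interacting user) and, if any, a medium
    machine: str
    operative: Optional[str] = None
    medium: Optional[str] = None

@dataclass
class Operation:
    # computation; subsumes the "program" in both its production and consumption senses
    program: Callable[[Operand], Operand]

    def perform(self, operator: Operator, operand: Operand) -> Operand:
        # the operator carries the operation out on the operand
        return self.program(operand)

# usage: a trivial "number crunching" computation
double = Operation(program=lambda o: Operand(o.content * 2))
result = double.perform(Operator(machine="pc", operative="user"), Operand(21))
assert result.content == 42
```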

although such a theoretical framework is relatively simple to construct, surprisingly most
studies on the subject fail to do so and fall victim to a narrow perspective, as shown in the table
below; to give just one example, the Stanford Encyclopedia of Philosophy entry on the "philosophy of
computer science" [Turner+2008] applies the basic stock subdisciplines of philosophy (ontology,
epistemology, ethics) to the subject, but ultimately fails by focusing merely on what is -nowadays- an
extremely narrow subject area, i.e. the "programming" component

but what is more interesting is the stubborn preservation of this narrow attitude -which was
understandable to a certain degree at the very beginning- during the second historical period and
thereafter, in spite of the new dynamics introduced then by the underlying socialization, as exemplified
by the phenomenal computing sprawl and the consequent, inevitable incorporation of ordinary human
factors and the social interaction dimension; furthermore, the fact that this stubbornness focuses
almost exclusively on the "computation/programming" component cannot be a mere
coincidence; all this is probably the reflection of an ideological bias...

comparison

reference         | scope  | model                          | concepts                    | focus                                | characterized by / remarks
[Turner+2008]     | narrow | n/a                            | operation                   | program, computation                 | entities, identity, semantics, correctness, computability, programming (languages)
[Eden2007]        | narrow | technocratic                   | operation                   | program                              | ?data, a posteriori knowledge, empirical testing
                  |        | rationalist                    | operation                   | program                              | correctness, deduction, a priori knowledge
                  |        | scientific                     | operation                   | program (AI)                         | AI, a priori, a posteriori, deduction, scientific experimentation
[Milner2007]      | narrow | "model"?                       | operation                   | software                             | "principled, organized, scientific" engineering activity
[Hoare2006]       | narrow | science/engineering            | operation                   | program                              | error elimination, program verification, scientifically sound engineering activity
[Colburn2000]     | narrow | mathematical paradigm (formal) | operation                   | program (Hoare)                      | p132 mathematical expression
                  |        | exact science (formal)         | operation                   | program (Hoare)                      | p154 formal specification/verification, deductive reasoning about programs
                  |        | empirical science (informal)   | operation                   | programming                          | hypothesis testing (of a "computer model of reality")
[Smith1996]       | wide   | ontology of computation        | operation                   | nature of computation                | realism, constructivism
                  |        | ontology of computation        | operand                     | representation                       | independence of representation and ontology, registration
                  |        | empirical                      | operation/operand/operative | ?                                    | what Silicon Valley does, computation in the wild
                  |        | conceptual                     | operation/operand           | theory of mind                       | cognitivism
[ACM/Denning1989] | narrow | theory/mathematics             | operation                   |                                      | paradigm, cultural style
                  |        | abstraction/science            | n/a                         |                                      |
                  |        | design/engineering             | n/a                         |                                      |
[Wegner1976]      |        |                                |                             |                                      |
[Wegner1970]      | wide   | computer technician            | operation                   |                                      | pragmatic, against "models"
                  |        | computer mathematician         | operation                   |                                      | "mathematical model"
                  |        | computer scientist             | operand                     | data (1960's), information (1970's) |
[Perlis1982]      | narrow | programming                    | operation                   | programming                          | programming at the root of CS; not mathematics, not algorithms, not recursive function theory, not mathematical linguistics, not EE
[Slamecka1968]    | narrow | information science            | operand                     | information                          |
                  |        | information engineering        | operand                     | information                          |
[Forsythe1967]    | wide   | information processing         | operand                     | information processing               | art and science of processing information, abstract medium (information)
                  |        | information processing         | operator                    | "logical" engine?                    | not engineering
[Gorn1963]        | widest | computer science               | operation                   |                                      |
                  |        | information science            | operand, operative          | pragmatics                           |

the anatomy of the trichotomy, a persisting idiosyncrasy

before analyzing the source of and motivation for this purported bias, however, the contaminated
intellectual "scene" must first be cleared of the misguiding perturbations fueled by the
science/mathematics/technology debate, which is peculiar to computer science alone and has
long been acting, in practice, as a cover for other, more concrete problems...

apart from the anachronistic but radically insightful views of Saul Gorn [Gorn1963] [Gorn1982a], the
"philosophically abstruse" contemplations of Brian C. Smith [Smith1996], and, perhaps, the more
liberal analyses of Gordana Dodig-Crnkovic [Dodig-Crnkovic2003c], a notable group of
researchers -[Eden2007], [Colburn2000] (although in a somewhat confusing manner),
[ACM/Denning1989] and [Wegner1970]- strikingly converge in approaching the development
of "computer science" from a trichotomic perspective, and almost always with the same triad of
categorical "tags" (science, mathematics, engineering); but the fact that these approaches offer, as
shown in the metamodel above, almost entirely different extensions for the said three categorical tags
implies an inconsistency that begs for a thorough explanation; to start with, no other scientific
discipline seems to "suffer" from such a trichotomic analysis as computer science does;
and, as one of the constituents already implies, the subject matter's human-madeness is offered as
the source of this trichotomy; but human-madeness cannot account for all three
phenomena; it explains, at best, the science/technology dichotomy; AI, another "human-made" subject,
for example, triggered no trichotomy (indeed, not even dichotomy) talk during its developmental
historiography, neither in its highly rationalist GOFAI genre nor in its highly empirical nouvelle AI genre:
GOFAI's inherent rationalism was challenged and coped with, within the discipline, without much
meta-level fanfare; in cognitive science, another neighboring discipline, rationalism (Putnam's
T(o)uring Machines [Putnam1988]) also had a stronghold from the very beginning and blocked and
deprived functionalism -the dominant practicing paradigm of the endeavor, which studies mind
empirically- of theoretical support; unfortunately, a substantial protest call against this
[Scheutz2002] remains unheeded today

furthermore, whether or not a discipline can be called "science" if its subject matter is "human-made",
which underlies the gist of this science/technology dichotomy as one of the axes of the trichotomy, is
just another debate that dominated a few decades' agenda toward the end of the last century; it
should not be pursued further here, because, in the overall discussion, it is not of primary relevance
compared with the other, more pressing issue, i.e. the rationality dimension (variously conceptualized
as "rationalist" in [Eden2007], "formal" in [Colburn2000], theory/mathematics in [ACM/Denning1989]
and "computer mathematician" in [Wegner1970]); likewise, the mathematics/technology dichotomy
axis, which, in terms of a formal semantics vs. practical testing debate, has occupied, philosophically,
both Fetzer [Fetzer1988] and Colburn [Colburn2000] so much, will not be examined here either, as
anyone would attest to its contemporary futility, miniaturized as it has become relative to the gigantic
web of problems at hand

finally, before analyzing rationalism in detail, another red herring should be eliminated, i.e. the
claim that industry's scholarly unchecked gold-rush adventurism, at a previously unheard-of scale,
drives things out of control; the correct diagnosis is to see that this whimsicality is, in a circular
fashion, only partly a cause of, but more than that, a consequence of the academia/industry chasm;
the gap exists, simply, because of academia's disinterest in studying empirical issues, itself a
consequence of its preference for rationalism over them

the proper juxtaposition: a dichotomy of rationalism vs. empiricism

the real explanatory mechanism is a simpler dichotomy: the struggle is between the neat, sterile,
idealist, perfectionist, fundamentalist rationalist camp and the scruffy, "realist", down-to-earth,
pragmatist empirical one; Wegner agrees:
The term "fundamental" as in "fundamental particles" or "foundations of mathematics", is a codeword for
rationalist models. The presence of this same codeword in terms like "religious fundamentalism" suggests that
social and scientific rationalism have the common roots in the deep-seated human desire for certainty.
Rationalism is an alluring philosophy with many guises (disguises). [Wegner1996c]

when not entangled by the trichotomic perspective, Wegner is explicit and very conscious of the
source of this tension: "... empiricism is not expressible by rationalism, validating arguments by Hume
and Kant that contingent facts cannot be inferred by 'pure reason'" [Wegner1999a]; he is also aware,
from the very beginning, of what the real target of his argumentation is:

The intuition that empiricism extends rationalism can be proved for computing. Irreducibility establishes the
intellectual legitimacy of empirical computer science by freeing researchers from the obligation of expressing
their models in algorithmic terms. It establishes computer science as a discipline distinct from mathematics and,
by clarifying the nature of empirical models of computation, provides a technical rationale for calling computer
science a science. [Wegner1998b]

beyond the empirical contrapositioning against rationalism carried out intradisciplinarily, within their
own fields, by a considerable number of researchers, some others have tried, philosophically, to do
justice to the empirical cause at large (against Tarski [Sloman1971], against the Church-Turing thesis
[Sloman2002] [Smith1996], to give a few examples)...

on the other hand, the rational elements' share in the mix is undeniable; but a more acceptable solution
on the way to maturation for computer science depends, if it can ever be achieved, on a correct
assignment of its rational and empirical constituents, and of their intriguing interplay...

rationalism: another idiosyncrasy

therefore, unless the source of this rationalistic approach (mathematicism), i.e. the third leg of the
trichotomy (or better, the exalted side of the rationalism/empiricism dichotomy), is exposed, the
problem cannot be analyzed properly: indeed, how does rationalism come into play? interestingly
again, rationalism is yet another idiosyncrasy of computer science; it is absent in other human-made
disciplines (technologies); sure, it is true, as most computer science researchers rightfully reflect in
their models, that part of the whole endeavor is subject to rationalist analysis (specifically, the
theories of computation and complexity, and formal semantics); but, again, only some "part" of it, and
not the whole discipline...

beyond this point, the rightful and dutiful core effort of rationalism transforms into a dominating
ideology in which empirical turfs are simply ignored, downplayed, deprioritized, eclipsed or
audaciously invaded; ideologically, the rationalists are either devout mathematicians/logicians or,
interestingly, technologists with a mathematical twist (having a rational touch, preferring verification,
pursuing proofs, enjoying the axiomatic/deductive method, i.e. the "neat" ones) or, put in negative
terms, ones with a distaste for science (lacking an empirical touch, not preferring falsification, not fond
of the "hypothesis" approach, not enjoying observations and the inductive method, i.e. abstaining from
the "scruffy" stuff)

but, factually, how did a climate so convenient for rationalists to prosper at the expense of empiricists
come to exist? although the inventors of the earlier analog differential analyzer -James Thomson
(EE/physicist), Caldwell (EE), Bush (EE), Hartree (physicist)- were empiricists (i.e. electrical engineers
and/or physicists), it was the rationalists who reigned during the dawn (i.e. the 50's) of the
nascent discipline; for example, the administrative Goldstine and the supreme supervisor of IBM's
weekly "Madison Court trials" in the late fifties, von Neumann (or other luminaries like Church, Turing,
McCarthy, Minsky, Davis, Smullyan, Kleene, Scott), were all mathematicians (and also, strikingly, all
from Princeton, the latter group having earned their PhDs there, too), whereas the rest of the pioneers
were all physicists or engineers (Atanasoff, Amdahl, Mauchly, Eckert (EE), Forrester (EE), Wilkes;
Zuse was not part of the "official" genealogy); obviously, the "mathematicians", who were busy with the
abstraction side ("software" was not an appropriate term yet) -which was much more rigid, offered
some needed rigor, and hence did not change course very often- had, because of this simple fact, an
academically superior and dominating position compared to the "physicists", who were rather involved
in the hardware, mostly a matter of practical experimentation without any theoretical basis and, as
such, of a more divergent, transient nature... therefore, the said idiosyncrasy can be said to have
emerged as a consequence of the different historical development lines of the two aspects (SW/HW)
of this dual-natured discipline, a characteristic that no other properly demarcated field seems to
possess (one could point to a similar tension in physics against ancillary mathematics, but there the
roles are clearly defined -except, perhaps, for the more recent emergence of rationalistic theories
-superstring theory, M-theory- which cannot be justified empirically)

as of today, however, the empirical domain (of "computer science") is undeniably there: with its own
turf, own problems, and own theories...

the empirical

perennially empirical issues

"When subjected to the empirical dem ands of practice and the [...] conceptual demands of cognitive science, all
seven primary construals [of computation] fail - for deep, overlapping, but distinct, reasons."
...
By my lights, an adequate theory must make a substantive empirical claim about what I call computation in the
wild: that eruptive body of practices, techniques, networks, machines, and behavior that has so palpably
revolutionized late twentieth century life. [Smith2002]

long before the "machine", as we know it, emerged in 1940's, at the moment the concept of
mechanical computation was born or whenever articulated, be it exemplified in concrete earthly
incarnations by Pascal, Lebnitz, Babbage et al. in the previous centuries, or in abstract machine
contemplations by Turing, Church, Post, Kleene and others in the 20th century, some aspects of all
these endeavors were, against all rationalist sanctioning and glorification, perennially empirical by their
birthright, and this fact has not changed a bit since then; the list is substantial: organization of the
system ("the computer"), communicative symbolism ("programming languages") to program it,
representational symbolism ("information/knowledge, and management thereof) of the input/output,
"computation in the wild" [Smith1996] [Forrest+2002], etc.; the biggest sin of the rationalist approach,
then, has been ignoring or, at best, downplaying the significance of all these subjects...

the latecomer

the practical manipulation of the whole process, i.e. "human-computer interaction", gained prominence
later on; Peter Wegner has been using its obvious empirical character as the cornerstone of his critique
against rationalism:
Then Peter Wegner, before his naked emperor, had the temerity to announce that algorithms are not what most
of computing is about. Instead, the management of interaction better describes in abstract the day-to-day
computing practice. It is as scandalous as the observation that scientists do not follow scientific method, or that
economic behaviors are neither Bayesian nor utilitarian, or that logic is about notation, not reasoning. There is
no disputing the observation. The fact is that there is more interaction than transformation on automata today.
All that remains is to understand clearly what the difference is and how the century-old logico-mathematical
legacy went wrong. [Loui1998]
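
for a concrete (and purely illustrative) rendering of the transformation-vs-interaction distinction -a
sketch of the idea, not Wegner's own formalism- compare a classical algorithm, which maps a fixed
input to an output and halts, with an interactive computation, whose responses depend on a stream
of stimuli not available in advance:

```python
from typing import Iterable, Iterator, List, Optional

def sort_once(data: List[int]) -> List[int]:
    # algorithmic (transformational) computation: the whole input is given
    # up front, and the computation halts with a final output
    return sorted(data)

def running_max(stimuli: Iterable[int]) -> Iterator[int]:
    # interactive computation: an ongoing exchange with an environment; each
    # response may depend on the entire history of stimuli received so far
    best: Optional[int] = None
    for x in stimuli:           # the environment supplies one stimulus at a time
        best = x if best is None else max(best, x)
        yield best              # respond before the next stimulus arrives

# the transformational view needs everything in advance...
assert sort_once([3, 1, 2]) == [1, 2, 3]
# ...while the interactive view answers as the dialogue unfolds
assert list(running_max([3, 1, 5, 2])) == [3, 3, 5, 5]
```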

lack of theories

and somehow, these "peripheral" (empirical) subjects in computer science, perennial or latecomer,
lacked the support of rigorous theoretical foundationing (ad hoc theories for their sub-parts, here or
there, do exist, but what is missing is foundational theories that cohere all parts at system level);
witness the explicit complaints by the software profession which is infamous of being entrapped in a
chronic software crisis for the last 30 years: its latest initiative (SEMAT- Software Engineering Method
and Theory) launched by its luminaries and major corporations in March 2010 reflects the problem
pretty clearly with their concise "call for action" statement:
Software engineering is gravely hampered today by immature practices. Specific problems include:

- The prevalence of fads more typical of fashion industry than of an engineering discipline.
- The lack of a sound, widely accepted theoretical basis.
- The huge number of methods and method variants, with differences little understood and artificially magnified.
- The lack of credible experimental evaluation and validation.
- The split between industry practice and academic research.

[Jacobson+2010]

indeed, academic critiques (e.g. [Coy1997], [Hirose1990]) have been reporting this dire situation from
the very beginning, but, as said, the lack of foundational theories for the empirical subdisciplines of
computer science somehow remains a big problem:

and despite many accomplishments there is still no visible science of programming. ... there is few research on
the theoretical foundations of the discipline besides its mathematical and logical constructs. Even the
development of new programming languages lacks solid theoretical foundations (again, besides mathematical
and logical calculi). It remains an art or craft - sometimes with convincing results. A theory that shows limits and
perspectives of informatics is still to be developed. [Coy1997]

There are many "theories" in computer science, but there is still little "foundation". For example, the theory of
compiling, which has been developed so far in terms of the formal theory of languages and automata, is
generally accepted by engineering people as a typical example of "successful" scientific works in computer
science. However, in the author's opinion, it is not acceptable as a fundam ental work because it is based only
on the syntactic theory of language and ignores the fundam ental studies of human activities in programming.
Note that the formal language theory originated by Chomsky, although a wonderful theory, does not serve as a
"foundation" for programming and programming languages in general.
We can take the computational complexity theories as another exam ple. The theory of NP-completeness is
formulated in terms of the worst-case time complexity. It is undoubtedly one of the most established of
mathematical theories, but it is almost useless as a "foundation" for efficient algorithm design. [Hirose1990]

consequences

as a consequence of this theoretical weakness, failures, messiness and incapacitation became
inevitable; unfulfilled promises have abounded, and the unpredictability of the future has been a
constant nuisance... to start with, the list of failures is a long one: AI, ubiquitous computing,
human-computer interaction, operating system reliability, etc. are all names that are so often
associated with the word "failure"; the software industry is infamous for its messiness due to
"unlearned" experimentation, conflicting/mismatching products (the compatibility problem), unsound
version management, security, and similar problems; and its incapacitation in terms of the HW/SW
progress mismatch is a result of all these factors plus complexity, an issue that still begs a widely
accepted theoretical handling

there are two phenomena which might be responsible for this situation: the academia/industry chasm
and/or, as already mentioned above, a prolonged (through the second and third periods) domination of
the rationalist attitude; the former is rather easy to pronounce, and several authors have done so;
the latter, however, is politically more explosive, and only a few voices are heard, as mentioned
previously: mainly, the "virtual" Vienna group (Smith, Agre, Copeland, Cussins, Harnad,
Haugeland, Scheutz) that gathered in Vienna during the NTCS'99 "Computationalism - The Next
Generation" conference [Scheutz2002], and Peter Wegner and company...

furthermore, it is pretty difficult to determine whether these two factors work independently or somehow
interact with each other, and if so, in which way and order...

naturalization

the naturalization process should start with an inner demarcation task, that is, the separation of the
rationalistic elements from the empirical ones; but it is the next step that is more challenging: namely,
articulating competent foundational theories that govern the empirical realm, and then offering a
coherent grand picture that binds all the loose ends (parts) together; the current federational structure
without a uniting backbone is not a classificatory preference but rather a symptom of a problematic
situation...

sterilization

an outer demarcation process is also inevitable; sure enough, the plethora of disciplinary items, as
given in [Denning2003] under the more generic term "computing" (instead of "computer science"),
should be radically reorganized such that related but essentially distinct subjects (i.e. artificial
intelligence, robotics, vision, etc.) are clearly left out...

an epilogue: the pragmatics of the constructs

as properly explicated and defended from the outset by Gorn [Gorn1963] [Gorn1982a], the shaping of
the very subject matter of "computer science" (i.e. notations, tactical methodologies, products, etc. of
the "constructs", that is, of both computation and the above given perennially empirical issues) in
social context underlines their pragmatic aspect, another empirical issue that must be taken into
consideration to achieve a more complete definition of the discipline (however, this should not be
confused with the pragmatics of the practice at large, as mentioned in the introduction above and
dictated by emerging social phenomena that define the nature, breadth and epicenter of the practice
and suggests a different discussion; also, ethical and social implications, which are beyond the realm
of "computer science" itself, should be studied under a sociology of computer science); furthermore, it
must be also noted that the lack of theories related to the perennially empirical issues poses a higher
priority problem, and unless these are discovered/constructed at least up to a satisfactory point, a
healthy study of pragmatic aspects -of the constructs- will be hampered considerably

bibliography

- [Jacobson+2010] - The First SEMAT Workshop: A Blueprint for SEMAT, Ivar Jacobson, Bertrand Meyer, Richard Soley, 2010
- [Turner+2008] - The Philosophy of Computer Science, Raymond Turner, Amnon Eden, Stanford Encyclopedia of Philosophy, 2008
- [Eden2007] - Three Paradigms of Computer Science, Amnon H. Eden, 2007, Minds and Machines, Special Issue on the Philosophy of Computer Science, Vol. 17, No. 2 (Jul. 2007), pp. 135-167
- [Milner2007] - Memories of Gilles Kahn, and the Informatic Future, Robin Milner, 2007, transcript of a speech before the Colloquium in memory of Gilles Kahn, INRIA Research Institute, January 2007
- [Hoare2006] - The Ideal of Program Correctness, Tony Hoare, May 2006, draft (http://www.bcs.org/upload/pdf/correctness.pdf)
- [Dodig-Crnkovic2003c] - Shifting the Paradigm of the Philosophy of Science: the Philosophy of Information and a New Renaissance, Gordana Dodig-Crnkovic, 2003, Minds and Machines, Special Issue on the Philosophy of Information, November 2003, Volume 13, Issue 4
- [Forrest+2002] - Computation in the Wild, Stephanie Forrest, Justin Balthrop, Matthew Glickman, David Ackley, 2002
- [Sloman2002] - The Irrelevance of Turing Machines to AI, Aaron Sloman, 2002, in [Scheutz2002]ed
- [Smith2002] - Foundations of Computing, Brian C. Smith, 2002, in [Scheutz2002]ed
- [Scheutz2002]ed - Computationalism - New Directions, Matthias Scheutz (ed.), 2002
- [Colburn2000] - Philosophy and Computer Science, Timothy R. Colburn, 2000
- [Loui1998] - Some Philosophical Reflections on The Foundations of Computing, Ronald P. Loui, 1998, Society for Exact Philosophy, Georgia Meeting, Athens, 1998
- [Wegner1998b] - Interactive Foundations of Computing, Peter Wegner, 1998 (final draft), Theoretical Computer Science, February 1998
- [Coy1997] - Defining Discipline, Wolfgang Coy, 1997, Lecture Notes in Computer Science, Volume 1337/1997, pp. 21-35
- [Wegner1996c] - The Paradigm Shift from Algorithms to Interaction, Peter Wegner, 1996
- [Wegner1996b] - Interactive Software Technology, Peter Wegner, 1996, Handbook of Computer Science and Engineering
- [Wegner1996a] - Coordination as Constrained Interaction, Peter Wegner, 1996, LNCS 1061, pp. 28-33
- [Smith1996] - On the Origin of Objects, Brian C. Smith, 1996
- [Hirose1990] - On Computer Science Curricula at Universities, K. Hirose, 1990, New Generation Computing, 8, pp. 183-184
- [ACM/Denning1989] - Computing as a Discipline, Peter J. Denning, Douglas E. Comer, David Gries, Michael C. Mulder, Allen Tucker, A. Joe Turner, Paul R. Young, 1989
- [Putnam1988] - Representation and Reality, Hilary Putnam, 1988, MIT Press
- [Fetzer1988] - Program Verification: The Very Idea, James H. Fetzer, 1988, Communications of the ACM, September 1988, Volume 31, Number 9
- [Gorn1982b] - A Pragmatist Replies, Saul Gorn, 1982, Science Communication 1982; 4; 244
- [Gorn1982a] - Informatics (Computer and Information Science): Its Ideology, Methodology, and Sociology, Saul Gorn, 1982, Science Communication 1982; 4; 173
- [Perlis1982] - The Role of Information in Computer Science, Alan J. Perlis, 1982, Science Communication 1982; 4: 208-210
- [Slamecka+1978] - Information Sciences at Georgia Institute of Technology: The Formative Years 1963-78, Vladimir Slamecka, John Gehl, eds., 1978, Information Processing and Management, vol. 14 (no. 5, 1978), pp. 320-361
- [Wegner1976] - Research Paradigms in Computer Science, Peter Wegner, 1976, Proc. 2nd Int. Conference on Software Engineering, San Francisco, California, 1976
- [Wegner1975] - Abstraction - A Tool for the Management of Complexity?, Peter Wegner, 1975
- [Sloman1971] - Tarski, Frege, and the Liar Paradox, Aaron Sloman, 1971, Philosophy, Vol. XLVI, pp. 133-147
- [Wegner1970] - Some Thoughts on Graduate Education in Computer Science, Peter Wegner, 1970
- [Wegner1968] - Programming Languages, Information Structures, and Machine Organization, Peter Wegner, 1968
- [Forsythe1967] - A University's Educational Program in Computer Science, George E. Forsythe, 1967, Comm. ACM, Vol. 10, No. 1, pp. 3-11
- [Gorn1963] - The Computer and Information Sciences: A New Basic Discipline, Saul Gorn, 1963, SIAM Review, Vol. 5, pp. 150-155
