
1945 AD

Von Neumann writes the "First Draft"


In June 1944, the Hungarian-American mathematician Johann (John) von Neumann
first became aware of ENIAC.
Von Neumann, who was a consultant on the Manhattan Project, immediately
recognized the role that a computer like ENIAC could play in solving the
vast arrays of complex equations involved in designing atomic weapons.

[Photo: John von Neumann. Copyright (c) 1997, Maxfield & Montrose Interactive Inc.]
A brilliant mathematician, von Neumann crossed mathematics with subjects such
as philosophy in ways that had never previously been conceived; for example, he
was a pioneer of game theory, which continues to find numerous and diverse
applications to this day.
Von Neumann was tremendously excited by ENIAC and quickly became a
consultant to both the ENIAC and EDVAC projects. In June 1945, he wrote a
paper entitled "First Draft of a Report on the EDVAC," in which he presented all of
the basic elements of a stored-program computer:

1. A memory containing both data and instructions, which allows both data and
instruction memory locations to be read from and written to in any desired
order.
2. A calculating unit capable of performing both arithmetic and logical operations
on the data.
3. A control unit, which could interpret an instruction retrieved from the memory
and select alternative courses of action based on the results of previous
operations.
The key point made by the paper was that the computer could modify its own
programs, in much the same way as was originally suggested by Charles Babbage
for his Analytical Engine. The computer structure resulting from the criteria
presented in the "First Draft" is popularly known as a von Neumann Machine, and
virtually all digital computers from that time forward have been based on this
architecture.
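
As a rough illustration of this idea (a sketch invented for these notes, not taken
from the original text or from EDVAC itself), the following Python program keeps a
tiny "machine language" program and its data in one shared memory of plain integers,
and one instruction overwrites another before it is reached, so the program rewrites
itself as it runs. The integer encoding and the opcodes are made up for the example.

    # A toy stored-program machine: instructions and data share one memory and
    # are all encoded as plain integers (opcode * 100 + operand). Because
    # instructions are just numbers in memory, the running program can overwrite them.
    #
    # Opcodes (invented for this sketch):
    #   1 = LOAD  addr   acc = memory[addr]
    #   2 = ADD   addr   acc = acc + memory[addr]
    #   3 = STORE addr   memory[addr] = acc
    #   9 = HALT

    def run(memory):
        pc, acc = 0, 0
        while True:
            instruction = memory[pc]          # fetched from the same memory as the data
            pc += 1
            opcode, operand = divmod(instruction, 100)
            if opcode == 1:
                acc = memory[operand]
            elif opcode == 2:
                acc += memory[operand]
            elif opcode == 3:
                memory[operand] = acc
            elif opcode == 9:
                return memory

    memory = [
        107,  # 0: LOAD 7   acc = memory[7] (an instruction word, read as data)
        303,  # 1: STORE 3  the program overwrites its own instruction at address 3
        106,  # 2: LOAD 6   acc = 5
        999,  # 3: placeholder, replaced at run time by 208 ("ADD 8")
        309,  # 4: STORE 9  memory[9] = result
        900,  # 5: HALT
        5,    # 6: data
        208,  # 7: data that happens to encode the instruction "ADD 8"
        7,    # 8: data
        0,    # 9: result slot (ends up holding 12)
    ]

    run(memory)
    print(memory[9])   # -> 12

The point is not the arithmetic but the mechanism: nothing distinguishes an
instruction from data except how the machine happens to use it.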
The "First Draft" was a brilliant summation of the concepts involved in stored-
program computing; indeed, many believe it to be one of the most important
documents in the history of computing. It is said that the paper was written in a way
that possibly only von Neumann could have achieved at that time.
However, although there is no doubt that von Neumann made major contributions
to the EDVAC design, the result of the "First Draft" was that he received almost all
of the credit for the concept of stored-program computing, while Mauchly and
Eckert received almost none. Yet Mauchly and Eckert had discussed stored-program
computers a year before von Neumann arrived on the scene, and Eckert had written a
memo on the subject six months before von Neumann had even heard about
ENIAC.
It has to be said that there is no evidence that von Neumann intended to take all
of the credit for EDVAC and the stored-program computing concept (not least
because his paper was titled "First Draft ..."), but it also cannot be denied that he
never went out of his way to correct matters later.
These notes are abstracted from the book Bebop BYTES Back
(An Unconventional Guide to Computers).
Harvard architecture
From Wikipedia, the free encyclopedia.
The term Harvard architecture originally referred to computer architectures
that used separate storage for their instructions and data (in contrast to the
von Neumann architecture). The term originated from the Harvard Mark I
relay-based computer, which stored instructions on punched tape and data in
relay latches.
The term Harvard architecture is usually used now to refer to a particular
computer architecture design philosophy where separate data paths exist for
the transfer of instructions and data.
All computers consist primarily of two parts: the CPU, which processes data,
and the memory, which holds the data. The memory in turn has two aspects to
it: the data itself, and the location where it is found, known as the address.
Both are important to the CPU, as many common instructions boil down to
something like "take the data at this address and add it to the data at that
address", without the instruction itself knowing what the data is.
In recent years the speed of the CPU has grown many times faster than the speed
of the memory it talks to, so care needs to be taken to reduce the number of
times memory is accessed in order to maintain performance. If, for instance, every
instruction run in the CPU requires an access to memory, the computer gains
nothing from increased CPU speed - a problem referred to as being memory
bound.
Memory can be made much faster, but only at high cost. The solution, then, is
to provide a small amount of very fast memory known as a cache. As long as
the data the CPU needs is in the cache, the performance hit is much
smaller than when the cache has to turn around and fetch the data from
main memory. Tuning the cache is an important aspect of computer design.
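
The following toy model (invented here for illustration; the sizes and the access
pattern are arbitrary) puts a small direct-mapped cache in front of a slow main
memory and counts hits and misses.

    # A toy direct-mapped cache: each address maps to exactly one cache slot
    # (address modulo the number of slots). A read is a "hit" if the slot already
    # holds that address, otherwise a "miss" that must go to slow main memory.

    main_memory = list(range(100, 164))        # 64 words of "slow" memory
    NUM_SLOTS = 8
    cache = {}                                  # slot -> (address, value)
    hits = misses = 0

    def read(address):
        global hits, misses
        slot = address % NUM_SLOTS
        if slot in cache and cache[slot][0] == address:
            hits += 1
            return cache[slot][1]               # served from the fast cache
        misses += 1
        value = main_memory[address]            # the expensive trip to main memory
        cache[slot] = (address, value)
        return value

    # A loop that reuses the same few addresses benefits from the cache.
    for _ in range(10):
        for address in (0, 1, 2, 3):
            read(address)

    print(hits, misses)                         # -> 36 4 (four cold misses, then hits)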
The Harvard architecture refers to one particular solution to this problem.
Instructions and data are stored in separate caches to improve performance.
However, this has the disadvantage of halving the amount of cache available to
either one, so it works best only if the CPU reads instructions and data at
about the same frequency.
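
Building on the toy cache above (again an invented illustration, not a real design),
a Harvard-style split simply keeps two such caches and sends instruction fetches to
one and data accesses to the other; with the same total budget, each side gets half
the slots, but the two streams no longer evict each other's entries.

    # Harvard-style split: one cache for instruction fetches, one for data accesses.
    # With the same total budget of 8 slots as before, each side now has only 4.

    class DirectMappedCache:
        def __init__(self, num_slots):
            self.num_slots = num_slots
            self.slots = {}                      # slot -> (address, value)
            self.hits = self.misses = 0

        def read(self, memory, address):
            slot = address % self.num_slots
            if slot in self.slots and self.slots[slot][0] == address:
                self.hits += 1
            else:
                self.misses += 1
                self.slots[slot] = (address, memory[address])
            return self.slots[slot][1]

    main_memory = list(range(100, 164))
    instruction_cache = DirectMappedCache(4)     # half the budget for instructions
    data_cache = DirectMappedCache(4)            # half the budget for data

    for _ in range(10):
        for pc in (0, 1, 2, 3):                  # instruction fetches
            instruction_cache.read(main_memory, pc)
        for address in (32, 33):                 # data accesses
            data_cache.read(main_memory, address)

    print(instruction_cache.hits, data_cache.hits)   # -> 36 18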
Von Neumann architecture
From Wikipedia, the free encyclopedia.
Von Neumann architecture refers to computer architectures that use the
same storage for their instructions and data (in contrast to the Harvard
architecture). The term originated from First Draft of a Report on the EDVAC
(1945), a paper written by the famous mathematician John von Neumann that
proposed the stored-program concept. The paper was written in connection
with plans for a successor machine to the ENIAC, and its concepts were
discussed by J. Presper Eckert, John Mauchly, Arthur Burks, and others over a
period of several months prior to von Neumann writing the draft report.
A von Neumann Architecture computer has five parts: an arithmetic-logic unit,
a control unit, a memory, some form of input/output and a bus that provides a
data path between these parts.
A von Neumann Architecture computer performs or emulates the following
sequence of steps:
1. Fetch the next instruction from memory at the address in the program
counter.
2. Add 1 to the program counter.
3. Decode the instruction using the control unit. The control unit
commands the rest of the computer to perform some operation. The
instruction may change the address in the program counter, permitting
repetitive operations. The instruction may also change the program
counter only if some arithmetic condition is true, giving the effect of a
decision, which can be calculated to any degree of complexity by the
preceding arithmetic and logic.
4. Go back to step 1.
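
A bare-bones Python sketch of that cycle might look as follows; the tiny instruction
set is invented for the example and is not meant to match any real machine.

    # Fetch, bump the program counter, decode, execute, repeat.
    # Opcodes (invented for this sketch):
    #   ("SET", n)     acc = n
    #   ("DEC",)       acc = acc - 1
    #   ("JNZ", addr)  if acc != 0, set the program counter to addr (a decision)
    #   ("OUT",)       print the accumulator
    #   ("HALT",)      stop

    memory = [
        ("SET", 3),    # 0
        ("OUT",),      # 1
        ("DEC",),      # 2
        ("JNZ", 1),    # 3: loop back while the accumulator is not zero
        ("HALT",),     # 4
    ]

    pc = 0             # program counter
    acc = 0            # a single accumulator stands in for the arithmetic-logic unit

    while True:
        instruction = memory[pc]     # 1. fetch the instruction at the program counter
        pc += 1                      # 2. add 1 to the program counter
        op = instruction[0]          # 3. decode and execute via the "control unit"
        if op == "SET":
            acc = instruction[1]
        elif op == "DEC":
            acc -= 1
        elif op == "JNZ":
            if acc != 0:             #    conditional change of the program counter
                pc = instruction[1]
        elif op == "OUT":
            print(acc)               # prints 3, then 2, then 1
        elif op == "HALT":
            break                    # 4. otherwise, go back to step 1

The JNZ instruction is what step 3 calls "changing the program counter only if some
arithmetic condition is true": it is the entire decision-making machinery of the model.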
Very few computers have a pure von Neumann architecture. Most computers
add another step to check for interrupts, electronic events that could occur at
any time. An interrupt resembles the ring of a telephone, calling a person away
from some lengthy task. Interrupts let a computer do other things while it
waits for events.
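
As a sketch of that extra step (invented here, written to bolt onto the loop above;
the handler address and queue are hypothetical), the check simply runs before each
fetch: if a request is pending, the machine saves the program counter and diverts
execution to a handler routine, so the lengthy task can be resumed afterwards.

    # Interrupt check added at the top of the cycle (sketch only).
    pending_interrupts = []          # devices append requests here, e.g. "disk"
    HANDLER_ADDRESS = 100            # where the handler routine is assumed to live
    saved_pc = None                  # so the interrupted task can be resumed later

    def check_interrupts(pc):
        # Run once per cycle, just before the fetch. If a request is pending,
        # remember where we were and divert execution to the handler.
        global saved_pc
        if pending_interrupts:
            pending_interrupts.pop(0)
            saved_pc = pc
            return HANDLER_ADDRESS   # the next fetch comes from the handler
        return pc                    # nothing pending: carry on as before

    pending_interrupts.append("disk")   # a device raises a request
    print(check_interrupts(7))          # -> 100 (divert; saved_pc is now 7)
    print(check_interrupts(7))          # -> 7   (queue empty: continue normally)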
Von Neumann computers spend a lot of time moving data to and from the
memory, and this slows the computer (this problem is called the von Neumann
bottleneck), so engineers often separate the bus into two or more buses,
usually one for instructions and the other for data.
