
HISTORY OF COMPUTERS

The first true digital computer, called the Colossus Mark I, was built in 1943 at
Bletchley Park in Britain and used to break encrypted German military messages. At the
same time, Bell Laboratories was working on the development of a computer, as were two
scientists at the University of Pennsylvania, J. Presper Eckert and John Mauchly, later
founders of the Eckert-Mauchly Computer Corporation.
The prototype World War II military computers were very different from today's
computers. First, they were big. A computer with much less power than an ordinary
desktop computer of the 1990s took up an entire room. Second, they could perform
relatively few operations compared with today's computers. Essentially, they were
giant and complex mathematical calculators. Third, they were difficult to program. In
fact, they were programmed by scientists going to the back of the computer and
changing the wires. This approach was slow, tedious, and impractical for a commercial
machine.
After the war, Eckert and Mauchly produced the first vacuum tube computer, the
Electronic Numerical Integrator and Computer, more commonly known as the ENIAC. In
1950, the Remington Rand Corporation bought Eckert and Mauchly's company (Unisys,
date unknown) and 1 year later began to market the first large-scale commercial computer
system, called the UNIVAC I. In 1955, the Sperry Corporation merged with Remington
Rand, forming the giant Sperry Rand Corporation. Around that time, the very first
commercial application was run when General Electric processed its payroll on a
UNIVAC computer, and the age of business computing was born. The American business
establishment recognized the value of this machine that could do thousands of repetitive
mathematical calculations. In response, companies such as Bell Labs, National Cash
Register (NCR), Burroughs, and IBM began to develop their business computer products.
Today, these early computers are called first-generation computers.
The UNIVAC and other first-generation computers used vacuum tubes in their
design. Those computers ran hot and thus required a great deal of cooling. Vacuum
tubes heated up quickly, and when they got hot, they failed regularly. Given the many
vacuum tubes those computers used and the high (and random) failure rate of the tubes,
the early computers were a real challenge to keep operational.
For the first generation of computers, the speed of the main processor was
measured in access speeds (how fast the CPU could access commands entered through
punched cards). Access speeds were measured in thousandths of a second (milliseconds).
First-generation computers were physically huge (one computer took up a large room),
but their power was much less than that of the average desktop computer of the 1990s.
Main memory was less than 10 K of storage.
Second-generation computers were introduced in the late 1950s. They included
the IBM 1401 and 1620. They used transistors instead of vacuum tubes. This meant less
heat, improved reliability, and much greater speed. Second-generation CPU access
speeds were measured in millionths rather than thousandths of a second (microseconds).

These computers were still quite large, but transistors were smaller and more durable
than vacuum tubes, and they allowed for the development of much more powerful computers.
Third-generation computers were introduced in the mid-1960s. These used
microminiature, solid-state components. Third-generation CPU access speeds were
measured in billionths of a second (nanoseconds). The IBM 360 and 370 were the classic
computers of this generation. They had about 110 K of main memory, and it was in this
generation that hard disk drives were introduced. These hard disks were not encased
in protective plastic cases, so they were very vulnerable to dust. Any magnetic medium is
vulnerable to dirt, even the diskettes used today; however, today's hard drives are
much better protected against dust than was the case in the 1960s. That is why pictures
of computer rooms taken during the third-generation era often show people in surgical-
type garb: they were trying to keep the failure rate down by keeping the computer room
as clean as possible.
In November 1971, Intel Corporation introduced the first commercial
microprocessor, the Intel 4004, following it in 1972 with the Intel 8008. This invention
made the PC, or microcomputer, possible. Shortly thereafter, two young men named
Steve Jobs and Steve Wozniak, who shared an intense interest in electronics, bought a
microprocessor for $25 and built a very simple computer they called the Apple. Like
Henry Ford's dream of bringing automobiles to everyone, Jobs had a passionate dream of
bringing computers to everybody. They failed to interest Wozniak's employer, the
Hewlett-Packard (HP) Corporation, in their idea to build a small computer that people
could have and use in their homes. At that time, according to legend, HP executives
could not imagine why anyone would want such a machine in the home. They were
focused on business computing, with its billing and payroll processing, and people's
home finances simply did not require such power. Not to be refused, the two Steves
decided to pursue their dream anyway. They began building the machines in Steve
Jobs's garage, and in May of 1976 they introduced their first computer at a meeting of the
Homebrew Computer Club, at which Paul Terrell, president of the Byte Shop chain,
ordered 50 computers. At the time, Steve Jobs was 21 years old and Wozniak was 25.
The Apple Computer Company and the first PC were born. In 1999, Steve Jobs was
chairman and CEO of Pixar, the computer animation studio that won an Academy Award
for its work on the motion picture Toy Story.
At the same time that Jobs and Wozniak were working in the garage, IBM
introduced the first fourth-generation mainframe, the IBM 370. This was the first
mainframe family that had printed circuits. This computer was so fast that the old
measurement of speed was deemed unsuitable. Since a CPU processes instructions
(which it fetches, decodes, executes, and stores), the new CPU's speed was
measured by the rate at which it could process instructions rather than by access time.
Fourth-generation CPU speeds were (and still are today) measured by the number of
instructions per second that they can process. The IBM 370's CPU speed was measured
in millions of instructions per second (MIPS). Today's mainframes are measured in
billions of instructions per second (BIPS), or giga-instructions per second (GIPS).
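The relationship between these units of speed can be shown with a short calculation. This is a minimal sketch; the figures are illustrative orders of magnitude, not measurements of any real machine:

```python
# Illustrative comparison of CPU speed units across computer generations.
# The numbers are hypothetical examples, not benchmarks of actual hardware.

def instructions_per_second(instruction_count, elapsed_seconds):
    """Raw instruction throughput: how many instructions ran per second."""
    return instruction_count / elapsed_seconds

# A machine that executes 5 million instructions in one second runs at 5 MIPS.
mips = instructions_per_second(5_000_000, 1.0) / 1_000_000
print(f"{mips} MIPS")  # 5.0 MIPS

# Access-time units used for the earlier generations:
millisecond = 1e-3  # first generation (thousandths of a second)
microsecond = 1e-6  # second generation (millionths of a second)
nanosecond = 1e-9   # third generation (billionths of a second)
print(millisecond / nanosecond)  # roughly one million: a third-generation
                                 # access was about a million times faster
```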

Three broad classes of computers exist: the analog computer, the digital
computer, and the hybrid computer.
Analog Computer
The analog computer operates on continuous physical or electrical magnitudes,
measuring ongoing continuous analog quantities such as voltage, current, temperature,
and pressure. Selected physiologic monitoring equipment, which accepts continuous
input/output signals, is in the analog class of computers. Examples of these machines
in the clinical setting include heart monitors and fetal monitors. An analog computer
handles data in continuously variable quantities rather than breaking the data down into
discrete digital representations.
Digital Computer
The digital computer, on the other hand, operates on discrete, discontinuous
numerical digits using the binary numbering system, representing all data with discrete
values. Its data take the form of numbers, letters, and symbols rather than waveforms
such as those on a heart monitor. Most of the computers used in the health care
industry for charting and decision support are digital computers.
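The idea that a digital computer reduces every symbol and measurement to discrete binary values can be sketched in a few lines (a minimal illustration; the values chosen are arbitrary examples, not tied to any charting system):

```python
# A digital computer stores every symbol as a discrete pattern of bits.
# Here a character and a numeric measurement both reduce to binary values.

char = "A"
code = ord(char)            # the character's numeric code (65 for "A")
bits = format(code, "08b")  # the same value written as eight binary digits
print(char, code, bits)     # A 65 01000001

# A measured value such as a body temperature is likewise stored as a
# discrete number, not as a continuous waveform.
temperature = 37  # degrees Celsius, an arbitrary example value
print(format(temperature, "08b"))  # 00100101
```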
Hybrid Computer
The hybrid computer, as its name implies, contains features of both the analog and
the digital computer. It is used for specific applications, such as complex signal
processing and other engineering-oriented applications. It is also found in some
monitoring equipment that converts analog signals to digital ones for data processing.
For example, physiologic monitors that are able to capture the heart waveform and also to
measure the core body temperature at specific times of the shift are actually hybrid
computers. Some physiologic research projects can make use of hybrid computers that
have the analog ability to capture waveforms from physiologic monitors (e.g., ECG, EEG,
and so forth) and convert them into digital format suitable for analysis.
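The analog-to-digital conversion that a hybrid device performs can be sketched as two steps, sampling and quantization. This is a simplified illustration that uses a synthetic sine wave in place of a real physiologic signal:

```python
import math

# Simplified analog-to-digital conversion: sample a continuous waveform at
# fixed intervals, then quantize each sample to a discrete integer level.
# The sine wave stands in for a physiologic signal such as an ECG trace.

def digitize(signal, sample_count, levels=256):
    """Sample `signal` (a function of time over [0, 1)) and quantize it."""
    samples = []
    for i in range(sample_count):
        t = i / sample_count
        value = signal(t)  # continuous analog value in [-1, 1]
        # Map the continuous range [-1, 1] onto discrete levels 0..levels-1.
        level = round((value + 1) / 2 * (levels - 1))
        samples.append(level)
    return samples

def analog_wave(t):
    return math.sin(2 * math.pi * t)  # one cycle of a continuous waveform

digital_samples = digitize(analog_wave, sample_count=8)
print(digital_samples)  # eight discrete levels approximating one cycle
```

Each printed value is a whole number between 0 and 255, which is the digital format a computer can store and analyze, whereas the original waveform was continuous.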

References:
Ball, Marion, et al. (2011). Nursing Informatics: Where Technology and Caring Meet.
London: Springer-Verlag.
Sewell, Jeanne (2015). Informatics and Nursing: Opportunities and Challenges. LWW.
History of computer science. (2016). In Wikipedia. Retrieved from
https://en.wikipedia.org/wiki/History_of_computer_science
Brief history of computer. Retrieved August 21, 2016, from
http://people.bu.edu/baws/brief%20computer%20history.html
History of computers. Retrieved August 21, 2016, from
http://homepage.cs.uri.edu/faculty/wolfe/book/Readings/Reading03.htm
