
The word computer is an old word that has changed its meaning several times over the last few centuries. Originating from the Latin, by the mid-17th century it meant someone who computes. The American Heritage Dictionary (1980) gives its first definition of computer as a person who computes. The word remained associated with human activity until about the middle of the 20th century, when it came to be applied to "a programmable electronic device that can store, retrieve, and process data," as Webster's Dictionary (1980) defines it. Today, the word computer refers to computing devices, whether or not they are electronic, programmable, or capable of storing and retrieving data.

In modern-day America, computers are an integral part of our daily lives. Even though computers are not embedded in our bodies like cybernetic devices, for all their impact they might as well be. We have more computers in our homes than we have TVs, and for many of us, regardless of our field of work, the computer is an indispensable part of our day.
Computers networked via Wi-Fi or broadband are the norm, and that gives us access to everything from banking to books to food, clothing and medicines. Indeed, if we wanted to live our entire lives in one room with just a computer and an internet connection, we could probably manage just fine, both professionally and personally.

And even those individuals who don't own a computer still come into contact with computer-operated systems on a daily basis through banking, healthcare, education, social media, transportation, shopping and entertainment.

In 1889, an American inventor, Herman Hollerith (1860-1929), also applied the Jacquard loom concept to computing. His first task was to find a faster way to compute the U.S. census. The previous census in 1880 had taken nearly seven years to count, and with an expanding population, the bureau feared it would take 10 years to count the latest census. Unlike Babbage's idea of using perforated cards to instruct the machine, Hollerith's method used cards to store data, which he fed into a machine that compiled the results mechanically. Each punch on a card represented one number, and combinations of two punches represented one letter. As many as 80 variables could be stored on a single card. Instead of ten years, census takers compiled their results in just six weeks with Hollerith's machine. In addition to their speed, the punch cards served as a storage method for data, and they helped reduce computational errors.
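
To make the punch-card idea concrete, here is a small illustrative sketch in Python. The row assignments follow the later standard 80-column card convention (a single punch encodes a digit, a zone punch plus a digit punch encodes a letter); Hollerith's original 1890 census cards used their own layouts, so treat this only as a sketch of the principle, not a reproduction of his code.

```python
# Illustrative column-per-character punched-card encoding. Row assignments
# follow the later IBM 80-column convention, shown only to make the idea
# concrete; Hollerith's 1890 census cards used different layouts.

DIGIT_ROWS = {str(d): (d,) for d in range(10)}      # '3' -> one punch in row 3

ZONES = {12: "ABCDEFGHI", 11: "JKLMNOPQR", 0: "STUVWXYZ"}
LETTER_ROWS = {
    ch: (zone, digit)                                # letter -> zone punch + digit punch
    for zone, letters in ZONES.items()
    for digit, ch in enumerate(letters, start=1 if zone != 0 else 2)
}

def encode_column(ch: str) -> tuple[int, ...]:
    """Return the punch rows for one character in one card column."""
    ch = ch.upper()
    if ch in DIGIT_ROWS:
        return DIGIT_ROWS[ch]        # a single punch encodes a digit
    return LETTER_ROWS[ch]           # two punches encode a letter

def encode_card(text: str) -> list[tuple[int, ...]]:
    """Encode up to 80 characters, one per column, as on an 80-column card."""
    assert len(text) <= 80, "a card holds at most 80 columns"
    return [encode_column(ch) for ch in text]

print(encode_card("CENSUS1890"))
```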
Hollerith brought his punch card reader into the business world, founding the Tabulating Machine Company in 1896, which later became International Business Machines (IBM) in 1924 after a series of mergers.
Today's generation of kids acts as if computers were invented a few years before they were born and their parents' computer knowledge revolves around playing Pong. The truth is that the oldest computer was a calculating device called an abacus, and it was created in Asia about 5,000 years ago! The abacus used sliding beads on rods in a rack to calculate trading transactions.
The next significant development happened in 1642, when Blaise Pascal, an 18-year-old whiz kid, invented a numerical wheel calculator to help his father, a tax collector, with his business! The machine was called the Pascaline, and it was only capable of addition. Not very useful if you wanted to pay less in taxes!

In 1694, the German mathematician and philosopher Gottfried Wilhelm von Leibniz (1646-1716) improved on the Pascaline by creating a machine that could also multiply. It wasn't until 1820, however, that mechanical calculators gained widespread use.

Charles Xavier Thomas de Colmar, another Frenchman, invented a machine that could perform the four basic arithmetic functions. Colmar's mechanical calculator, the Arithmometer, presented a more practical approach to computing because it could add, subtract, multiply and divide. With its enhanced versatility, the Arithmometer was widely used up until the First World War.

The real beginnings of computers as we know them today, however, lay with an English mathematics professor, Charles Babbage (1791-1871). Babbage noticed a natural harmony between machines and mathematics: machines were best at performing tasks repeatedly without mistake, while mathematics, particularly the production of mathematical tables, often required the simple repetition of steps. The problem centered on applying the abilities of machines to the needs of mathematics.

Babbage began work on the first general-purpose computer, which he called the Analytical Engine. Consisting of over 50,000 components, the basic design of the Analytical Engine included input devices in the form of perforated cards containing operating instructions, and a store serving as memory for 1,000 numbers of up to 50 decimal digits each. It also contained a mill with a control unit that allowed instructions to be processed in any sequence, and output devices to produce printed results.
Babbage borrowed the idea of punch cards to encode the machine's instructions from the Jacquard loom. The loom, produced in 1820 and named after its inventor, Joseph-Marie Jacquard, used punched boards that controlled the patterns to be woven.

First generation computers were made to order for the specific task for which the computer was to be used, and each operating system was created for just that task. In other words, each computer had a different binary-coded program, called a machine language, that told it how to operate. First generation computers were difficult to program and limited in what they could do. Other distinctive features of first generation computers were the use of vacuum tubes (responsible for their huge size) and magnetic drums for data storage.
The impact of first generation computers on society was limited to improvements in aviation and the military. The use of computers as code-breakers during WWII helped defeat the Germans and, by one estimate, shortened the war by up to two years, saving countless lives in the process.

The Second World War caused governments to increase funding for computers in order to exploit their potential strategic importance. In 1941, the German engineer Konrad Zuse developed a computer, the Z3, to design airplanes and missiles.

In 1943, the British completed a secret code-breaking computer called Colossus to decode German messages; it was later credited with shortening the war by a couple of years.

The Harvard-IBM Automatic Sequence Controlled Calculator (Mark I) was an electromechanical relay computer that used electromagnetic signals to move mechanical parts. It was used to create ballistic charts for the U.S. Navy, was about half as long as a football field, and contained about 500 miles of wiring. It took 2-3 seconds to perform one calculation.

The Electronic Numerical Integrator and Computer (ENIAC) consisted of 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints. The ENIAC was a general-purpose computer that computed at speeds 1,000 times faster than the Mark I.

In 1945, John von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC) with a memory to hold a stored program as well as data. The key element was the central processing unit (CPU), which allowed all of the computer's functions to be coordinated through a single source.

The UNIVAC I (Universal Automatic Computer), built in 1951 by Remington Rand, became one
of the first commercially available computers to take advantage of these advances.

The invention of the transistor in 1948 greatly changed the computer's development: by replacing the large, cumbersome vacuum tube in televisions, radios and computers, it turned large, relatively heavy devices into more compact versions. As a result, the size of electronic machinery has been shrinking ever since, and the power of a laptop is now almost equal to that of a supercomputer of yesteryear that occupied an entire room. By 1956, transistors were working in computers. Coupled with early advances in magnetic-core memory, transistors led to second generation computers that were smaller, faster, more reliable and more energy-efficient than their predecessors.

Initially, second generation computers' main impact on society was through the military and the scientific community, in the area of atomic development. But by the mid-1960s, companies like IBM were developing mainstream computers for use in diverse sectors including business, universities, and government.
The first large-scale machines to take advantage of this transistor technology were early supercomputers developed for atomic energy laboratories. Second generation computers replaced machine language with assembly language, which used abbreviated programming codes in place of long, difficult binary codes.

The use of transistors became known as solid state design, and second generation computers also contained the components we associate with the modern-day computer: printers, tape storage, disk storage, memory, operating systems, and stored programs.

The stored program concept meant that the instructions to run a computer for a specific function (known as a program) were held inside the computer's memory and could quickly be replaced by a different set of instructions for a different function.
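
As a rough illustration of the stored-program idea (purely pedagogical, not modeled on any particular historical machine), the sketch below keeps a "program" as ordinary data in memory; loading a different list of instructions makes the same machine perform a different job.

```python
# Minimal sketch of the stored-program idea: the program is just data held in
# memory, and loading different data makes the same machine do a different job.
# The instruction set here is invented for illustration.

def run(memory: dict, program: list[tuple]) -> dict:
    """Fetch and execute stored instructions one after another."""
    for op, *args in program:
        if op == "LOAD":                 # LOAD dest, value
            memory[args[0]] = args[1]
        elif op == "ADD":                # ADD dest, src1, src2
            memory[args[0]] = memory[args[1]] + memory[args[2]]
        elif op == "MUL":                # MUL dest, src1, src2
            memory[args[0]] = memory[args[1]] * memory[args[2]]
    return memory

# One set of stored instructions computes a product...
payroll = [("LOAD", "a", 40), ("LOAD", "b", 15), ("MUL", "pay", "a", "b")]
# ...replace it with another and the same machine does a different task.
inventory = [("LOAD", "x", 120), ("LOAD", "y", 35), ("ADD", "total", "x", "y")]

print(run({}, payroll)["pay"])        # 600
print(run({}, inventory)["total"])    # 155
```

Swapping one instruction list for the other changes what the machine does without changing the machine itself, which is the essence of the stored-program concept.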

More sophisticated high-level languages such as COBOL (Common Business-Oriented Language) and FORTRAN (Formula Translator) appeared during this time and continue to be used today. These languages replaced cryptic binary machine code with words, sentences, and mathematical formulas, making it much easier to program computers.
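
To make that contrast concrete, here is a small hedged illustration, written in Python rather than FORTRAN or COBOL: a compound-interest formula expressed as a single high-level statement, with the kind of step-by-step machine-level work it replaces sketched only in comments (actual instruction sets varied from machine to machine).

```python
# One high-level statement expresses the entire formula...
principal, rate, years = 1000.0, 0.05, 3
amount = principal * (1 + rate) ** years    # compound interest

# ...whereas a machine-language programmer had to spell out every step, e.g.:
#   load PRINCIPAL into an accumulator
#   compute (1 + RATE) and multiply it in, once per year, in a loop
#   store the accumulator back to memory
# (the steps above are a sketch only; real instruction sets differed by machine)
print(amount)    # roughly 1157.63
```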

Careers such as programmer, analyst, and computer systems expert, as well as the entire software industry, began with second generation computers.

Transistors were a major improvement over the vacuum tube, but they still generated a great deal of heat, which could damage a computer's internal parts. Jack Kilby, an engineer working for Texas Instruments, developed the integrated circuit (IC) in 1958, which combined three electronic components onto a small silicon disc made from quartz. Computers became smaller and smaller as more components were squeezed onto the chip. Third generation computers also employed an operating system that allowed machines to run many different programs at once, with a central program that monitored and coordinated the computer's memory.

The advances of third generation computers, namely the integrated circuit and the operating system, allowed the computer to move from the domain of big business, government agencies and universities and become more accessible to smaller-scale business operations. And because of the reductions in physical size and cost, computers became more widespread, which helped launch entire industries such as programming and hardware development.

During this time there were several innovations, including integrated circuits (ICs), semiconductor memories, microprogramming, parallel processing, and the introduction of operating systems and time-sharing.
Parallelism involved the use of multiple functional units, overlapping CPU and I/O operations, and internal parallelism in both the instruction and the data streams. Functional parallelism was first embodied in the CDC 6600, which contained 10 simultaneously operating functional units and 32 independent memory banks. Seymour Cray's supercomputer reached a computation rate of 1 million floating-point operations per second (1 MFLOPS). Five years later, the CDC 7600, an early pipelined design, reached a speed of 10 MFLOPS.
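
As a loose modern software analogy (not a model of how the CDC 6600's functional units actually worked), the sketch below splits one computation across several worker processes so that independent slices proceed at the same time; the problem size and worker count are arbitrary choices for the example.

```python
# Illustrative sketch of parallelism: several workers each handle an
# independent slice of a computation at the same time, instead of one unit
# doing everything serially.
from concurrent.futures import ProcessPoolExecutor
import math

def partial_sum(start: int, stop: int) -> float:
    """One worker sums sqrt(i) over its own slice of the range."""
    return sum(math.sqrt(i) for i in range(start, stop))

if __name__ == "__main__":
    n, workers = 4_000_000, 4                 # arbitrary sizes for the example
    step = n // workers
    starts = [i * step for i in range(workers)]
    stops = [s + step for s in starts]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # Each slice is computed concurrently, then the partial results merge.
        total = sum(pool.map(partial_sum, starts, stops))
    print(total)
```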

After the creation of the integrated circuit, the only place to go was down, and computers have been getting smaller and smaller ever since. Large-scale integration (LSI) fit hundreds of components onto one chip. Very-large-scale integration (VLSI) squeezed hundreds of thousands of components onto a chip by the 1980s. Ultra-large-scale integration (ULSI) increased that number into the millions!

This helped reduce the size and price of computers while increasing their power, efficiency and reliability. Developed in 1971, the Intel 4004 chip took the integrated circuit one step further by locating all the main computer components (central processing unit, memory, and input and output controls) on a chip the size of a half dollar. Previously, integrated circuits had been manufactured to fit a specific purpose; now one microprocessor could be programmed to meet any number of demands.

The microprocessor allowed computers to transition from big business and government agencies into the realm of the mass-market consumer. The computer revolutionized society unlike any other product in the history of the civilized world, even more so than the automobile or the airplane. Through increases in miniaturization, computers have continued to evolve and adapt, taking on more and more roles in industries as diverse as business, electronics, entertainment, finance, health care and transportation. Everyday household items such as microwave ovens, television sets and automobiles with electronic fuel injection incorporated microprocessors.

Computers were no longer developed exclusively for large business or government contracts. By the mid-1970s, computer manufacturers worked to bring computers to mass-market consumers. Commodore, Radio Shack and Apple Computer were pioneers in the field, and early computers came complete with user-friendly software packages including word processing and spreadsheet programs.

IBM introduced its personal computer (PC) in 1981 for use in the home, office and schools. The number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were in use.
Computers continued their trend of getting smaller, going from large desktops to laptops to palmtops. Apple's Macintosh line, introduced in 1984, boasted a user-friendly design that allowed people to move screen icons with a mouse instead of typing instructions.

As smaller computers became more powerful, they could be networked to communicate and share
memory space, software and information with each other. Using either direct wiring, called a Local
Area Network (LAN), or telephone lines, these networks could reach enormous proportions.

The Internet is a global web of computer circuitry that links computers worldwide into a single
network of information.

The fifth generation of computers is still in its infancy, although the concepts that define it have been around for a long time. Arthur C. Clarke's novel 2001: A Space Odyssey had its computer, HAL, performing all of the functions envisioned for fifth generation computers, including reasoning well enough to hold conversations with humans, using visual input, and learning from its own experiences.

The impact of fifth generation computers is yet to be known, but given the impact they have already had in their infancy, they are sure to be a game changer. Increased miniaturization, more brainpower and greater operating speed will allow computers to come ever closer to the holy grail of the computer age: duplicating the computational power, response rate and reasoning power of the human brain.

Many advances in the science of computer design and technology are coming together to enable the creation of fifth-generation computers, including parallel processing (using many CPUs to work as one) and superconductor technology (which allows the flow of electricity with little or no resistance, greatly improving the speed of information flow). AI systems already assist doctors in making diagnoses by applying the problem-solving steps a doctor might use in assessing a patient's needs.
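
As a rough illustration of that rule-based style of reasoning, here is a minimal sketch; the rules, symptom names and conclusions are invented for the example and bear no relation to any real medical system.

```python
# Minimal rule-based sketch of expert-system-style reasoning: each rule pairs
# a set of observed findings with a conclusion, and the system reports every
# conclusion whose required findings are all present. All rules are invented
# purely for illustration.

RULES = [
    ({"fever", "cough", "fatigue"}, "possible flu"),
    ({"sneezing", "runny nose"}, "possible common cold"),
    ({"fever", "stiff neck", "headache"}, "refer for urgent evaluation"),
]

def diagnose(findings: set[str]) -> list[str]:
    """Apply each rule whose conditions are satisfied by the findings."""
    return [conclusion for conditions, conclusion in RULES
            if conditions <= findings]

print(diagnose({"fever", "cough", "fatigue", "headache"}))
# ['possible flu']
```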
