
What is a Computer?

A computer is a machine that runs different kinds of programs to perform useful tasks for its users.
It responds to users by executing sets of instructions arranged in the proper order, and it combines
storage capacity with the ability to execute those instructions. Computers can perform both simple and
complex operations, and they are designed in several forms with different hardware and software
specifications. Physically, a computer is built from wires, transistors, circuits and other hardware parts;
logically, it is a combination of hardware and software, where software is the set of programs prepared
from instructions and data. A general-purpose computer is made up of the following hardware components
(a short sketch after the list shows how they work together).

1. CPU - The Central Processing Unit carries out the activities of storing data and processing it with
several kinds of operations. The CPU is the major part of the computer; it determines which task the
computer performs at any given time.

2. Memory - Memory is the part that stores data, programs and other information. It is categorized into
several types, each suited to a particular purpose.

3. I/P Devices - Input devices provide input data to the computer; that data is then processed so that
results can be produced on other devices.

4. O/P Devices - Output devices deliver results to the user. After the input data has been processed, the
proper output is released to the user through these devices.
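
A minimal sketch in Python, not tied to any particular machine, of how these four component roles cooperate:
the keyboard acts as the input device, a variable stands in for memory, the processor performs the arithmetic,
and the screen acts as the output device.

    # Hypothetical illustration of the input -> CPU -> memory -> output flow.
    text = input("Enter a number: ")   # input device: keyboard
    number = int(text)                 # CPU: decode the input data
    result = number * number           # CPU: perform an arithmetic operation
    stored = result                     # memory: keep the result for later use
    print("The square is", stored)     # output device: display the result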

Computer types:

1. Personal Computer
2. Mini Computer
3. Mainframe Computer
4. Super Computer
5. Workstation

Personal Computer:

It contains the hardware and software components listed above. It can be defined as a small computer,
ranging up to only a limited number of pounds in weight. Personal computers appeared in the 1970s, built
around a small CPU, RAM and memory chips. They are useful for word processing, accounting, desktop
publishing, database management and similar applications. Many home users also rely on them to play games
and to learn from the Internet.

Personal computers come in several forms, including the following.

1. Notebook
2. Tower computer
3. Laptop
4. Subnotebook
5. Handheld
6. Palmtop
7. PDA

Mini Computer:

It is a midsize computer used as a shared machine that can support up to 200 users simultaneously.

Workstation:

It is designed for engineering applications, SDLC work and other applications that require moderate
computing power and good graphics capabilities. It generally has high-capacity storage media along with a
large amount of RAM. Workstations commonly run UNIX or Linux operating systems. They are built with
several storage configurations, covering both diskless and disk-drive workstations.

Supercomputer and Mainframe:

A supercomputer is the fastest type of computer in the world and is very expensive. It is built around
intensive mathematical calculation, so anything that can be expressed as a straightforward numerical
procedure runs very well on it. Supercomputers are used for scientific simulations, animated graphics,
electrical design, dynamic calculations and similar workloads.

COMPUTER - a programmable electronic device designed to accept data, perform prescribed mathematical and
logical operations at high speed, and display the results of these operations. Mainframes, desktop and laptop
computers, tablets, and smartphones are some of the different types of computers.

- a person who computes; computist.

Technically, a computer is a programmable machine. This means it can execute a programmed list of
instructions and respond to new instructions that it is given. Today, however, the term is most often used to
refer to the desktop and laptop computers that most people use. When referring to a desktop model, the term
"computer" technically only refers to the computer itself -- not the monitor, keyboard, and mouse. Still, it is
acceptable to refer to everything together as the computer. If you want to be really technical, the box that holds
the computer is called the "system unit."

Some of the major parts of a personal computer (or PC) include the motherboard, CPU, memory (or
RAM), hard drive, and video card. While personal computers are by far the most common type of computers
today, there are several other types of computers. For example, a "minicomputer" is a powerful computer that
can support many users at once. A "mainframe" is a large, high-powered computer that can perform billions of
calculations from multiple sources at one time. Finally, a "supercomputer" is a machine that can process billions
of instructions a second and is used to calculate extremely complex calculations.

First Generation: Vacuum Tubes (1940-1956)

The first computer systems used vacuum tubes for circuitry and magnetic
drums for memory, and were often enormous, taking up entire rooms.
These computers were very expensive to operate and in addition to using a
great deal of electricity, the first computers generated a lot of heat, which
was often the cause of malfunctions.

First generation computers relied on machine language, the lowest-level programming language understood by
computers, to perform operations, and they could only solve one problem at a time. It would take operators
days or even weeks to set up a new problem. Input was based on punched cards and paper tape, and output was
displayed on printouts.

The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first
commercial computer delivered to a client, the U.S. Census Bureau, in 1951.

Second Generation: Transistors (1956-1963)

The world would see transistors replace vacuum tubes in the second generation of computers. The transistor
was invented at Bell Labs in 1947 but did not see widespread use in computers until the late 1950s. 

The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more
energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated
a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube.
Second-generation computers still relied on punched cards for input and printouts for output.

From Binary to Assembly

Second-generation computers moved from cryptic binary machine language to symbolic, or assembly,
languages, which allowed programmers to specify instructions in words. High-level programming languages
were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the
first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic
core technology.
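
To illustrate the difference between symbolic assembly and raw binary machine code, here is a small sketch
of a toy assembler in Python. The 8-bit instruction format, the opcode values and the register names are all
invented for this example; real second-generation machines each had their own encodings.

    # Toy assembler: translates symbolic instructions such as "ADD R1, R2"
    # into a binary machine word. The format (4-bit opcode, two 2-bit
    # register fields) is hypothetical and used only to illustrate the idea.
    OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}
    REGISTERS = {"R0": 0b00, "R1": 0b01, "R2": 0b10, "R3": 0b11}

    def assemble(line):
        mnemonic, operands = line.split(None, 1)
        dst, src = (REGISTERS[r.strip()] for r in operands.split(","))
        return (OPCODES[mnemonic] << 4) | (dst << 2) | src

    for instruction in ["LOAD R1, R0", "ADD R1, R2", "STORE R1, R3"]:
        print(f"{instruction:<12} -> {assemble(instruction):08b}")

A programmer of this period would write the symbolic column on the left; the assembler produced the binary
column on the right, which is what the machine actually executed.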

The first computers of this generation were developed for the atomic energy industry.

Third Generation: Integrated Circuits (1964-1971)

The development of the integrated circuit was the hallmark of the third generation of computers. Transistors
were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and
efficiency of computers.

Instead of punched cards and printouts, users interacted with third generation computers through keyboards and
monitors and interfaced with an operating system, which allowed the device to run many different applications
at one time with a central program that monitored the memory. Computers for the first time became accessible
to a mass audience because they were smaller and cheaper than their predecessors.

An integrated circuit (IC) is a small electronic device made out of a semiconductor material. The first
integrated circuit was developed in the 1950s by Jack Kilby of Texas Instruments and Robert Noyce of
Fairchild Semiconductor.

Fourth Generation: Microprocessors (1971-Present)

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were
built onto a single silicon chip. What in the first generation filled an entire room could now fit in the
palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer—from
the central processing unit and memory to input/output controls—on a single chip.

In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh.
Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and
more everyday products began to use microprocessors.

As these small computers became more powerful, they could be linked together to form networks, which
eventually led to the development of the Internet. Fourth generation computers also saw the development of
GUIs, the mouse and handheld devices.

Fifth Generation: Artificial Intelligence (Present and Beyond)

Fifth generation computing devices, based on artificial intelligence, are still in development, though there are
some applications, such as voice recognition, that are being used today. The use of parallel processing and
superconductors is helping to make artificial intelligence a reality.

Quantum computation and molecular and nanotechnology will radically change the face of computers in years
to come. The goal of fifth-generation computing is to develop devices that respond to natural language input
and are capable of learning and self-organization.

Charles Babbage created the concept of a programmable computer.

We could argue that the first computer was the abacus or its descendant, the slide rule, invented by
William Oughtred in 1622. But the first computer resembling today's modern machines was the Analytical
Engine, a device conceived and designed by British mathematician Charles Babbage between 1833 and 1871.
Before Babbage came along, a "computer" was a person, someone who literally sat around all day, adding and
subtracting numbers and entering the results into tables. The tables then appeared in books, so other people
could use them to complete tasks, such as launching artillery shells accurately or calculating taxes.

It was, in fact, a mammoth number-crunching project that inspired Babbage in the first place [source:
Campbell-Kelly]. Napoleon Bonaparte initiated the project in 1790, when he ordered a switch from the old
imperial system of measurements to the new metric system. For 10 years, scores of human computers made the
necessary conversions and completed the tables. Bonaparte was never able to publish the tables, however, and
they sat collecting dust in the Académie des sciences in Paris.

In 1819, Babbage visited the City of Light and viewed the unpublished manuscript with page after page
of tables. If only, he wondered, there was a way to produce such tables faster, with less manpower and fewer
mistakes. He thought of the many marvels generated by the Industrial Revolution. If creative and hardworking
inventors could develop the cotton gin and the steam locomotive, then why not a machine to make calculations?

Babbage returned to England and decided to build just such a machine. His first vision was something
he dubbed the Difference Engine, which worked on the principle of finite differences, or making complex
mathematical calculations by repeated addition without using multiplication or division. He secured government
funding in 1824 and spent eight years perfecting his idea. In 1832, he produced a functioning prototype of his
table-making machine, only to find his funding had run out.
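
The principle of finite differences that the Difference Engine mechanized can be shown in a few lines of
Python. The polynomial used here, p(x) = 2x^2 + 3x + 1, is an arbitrary illustration rather than one of
Babbage's actual tables; the point is that every new table entry is produced by addition alone.

    # Tabulating p(x) = 2x^2 + 3x + 1 using only repeated addition.
    # For a degree-2 polynomial the second difference is constant (here 4).
    value = 1          # p(0)
    first_diff = 5     # p(1) - p(0)
    second_diff = 4    # constant second difference

    for x in range(8):
        print(x, value)            # one row of the finished table
        value += first_diff        # next entry, by addition only
        first_diff += second_diff  # update the running first difference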

Some people might have been discouraged, but not Babbage. Instead of simplifying his design to make
the Difference Engine easier to build, he turned his attention to an even grander idea -- the Analytical Engine, a
new kind of mechanical computer that could make even more complex calculations, including multiplication
and division.

The basic parts of the Analytical Engine resemble the components of any computer sold on the market
today. It featured two hallmarks of any modern machine: a central processing unit, or CPU, and memory.
Babbage, of course, didn't use those terms. He called the CPU the "mill." Memory was known as the "store." He
also had a device -- the "reader" -- to input instructions, as well as a way to record, on paper, results generated
by the machine. Babbage called this output device a printer, the precursor of inkjet and laser printers so
common today.

Babbage's new invention existed almost entirely on paper. He kept voluminous notes and sketches about
his computers -- nearly 5,000 pages' worth -- and although he never built a single production model of the
Analytical Engine, he had a clear vision about how the machine would look and work. Borrowing the same
technology used by the Jacquard loom, a weaving machine developed in 1804-05 that made it possible to
create a variety of cloth patterns automatically, data would be entered on punched cards. Up to 1,000 50-digit
numbers could be held in the computer's store. Punched cards would also carry the instructions, which the
machine could execute out of sequential order. A single attendant would oversee the whole operation, but steam
would power it, turning cranks, moving cams and rods, and spinning gearwheels.

Unfortunately, the technology of the day couldn't deliver on Babbage's ambitious design. It wasn't until
1991 that his particular ideas were finally translated into a functioning computer. That's when the Science
Museum in London built, to Babbage's exact specifications, his Difference Engine. It stands 11 feet long and 7
feet tall (more than 3 meters long and 2 meters tall), contains 8,000 moving parts and weighs 15 tons (13.6
metric tons). A copy of the machine was built and shipped to the Computer History Museum in Mountain View,
Calif., where it remained on display until December 2010. Neither device would function on a desktop, but they
are no doubt the first computers and precursors to the modern PC. And those computers influenced the
development of the World Wide Web.

The Programmer and the Prophet

If Charles Babbage was the genius behind the Analytical Engine, then Augusta Ada Byron, or Ada
Lovelace, was the publicist (and, arguably, the very first computer programmer). She met Babbage at a party
when she was 17 and became fascinated by the mathematician's computer engine. From that chance meeting
grew a strong, dynamic relationship. Ada discussed Babbage's ideas with him and, because she was gifted in
mathematics, offered her own insights. In 1843, she published an influential set of notes describing Babbage's
Analytical Engine. Ada also added in some sage predictions, speculating that Babbage's mechanical computers
might one day "act upon other things besides numbers" and "compose elaborate and scientific pieces of music
of any degree of complexity …"

A computer is a machine that can be instructed to carry out sequences of arithmetic or logical operations
automatically via computer programming. Modern computers have the ability to follow generalized sets of
operations, called programs. These programs enable computers to perform an extremely wide range of tasks. A
"complete" computer including the hardware, the operating system (main software), and peripheral equipment
required and used for "full" operation can be referred to as a computer system. This term may as well be used
for a group of computers that are connected and work together, in particular a computer network or computer
cluster.

Computers are used as control systems for a wide variety of industrial and consumer devices. This includes
simple special purpose devices like microwave ovens and remote controls, factory devices such as industrial
robots and computer-aided design, and also general purpose devices like personal computers and mobile
devices such as smartphones. The Internet is run on computers and it connects hundreds of millions of other
computers and their users.

Early computers were only conceived as calculating devices. Since ancient times, simple manual devices like
the abacus aided people in doing calculations. Early in the Industrial Revolution, some mechanical devices were
built to automate long tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines
did specialized analog calculations in the early 20th century. The first digital electronic calculating machines
were developed during World War II. The first semiconductor transistors in the late 1940s were followed by the
silicon-based MOSFET (MOS transistor) and monolithic integrated circuit (IC) chip technologies in the late
1950s, leading to the microprocessor and the microcomputer revolution in the 1970s. The speed, power and
versatility of computers have been increasing dramatically ever since then, with MOS transistor counts
increasing at a rapid pace (as predicted by Moore's law), leading to the Digital Revolution during the late 20th
to early 21st centuries.

Conventionally, a modern computer consists of at least one processing element, typically a central processing
unit (CPU) in the form of a metal-oxide-semiconductor (MOS) microprocessor, along with some type of
computer memory, typically MOS semiconductor memory chips. The processing element carries out arithmetic
and logical operations, and a sequencing and control unit can change the order of operations in response to
stored information. Peripheral devices include input devices (keyboards, mice, joystick, etc.), output devices
(monitor screens, printers, etc.), and input/output devices that perform both functions (e.g., the 2000s-era
touchscreen). Peripheral devices allow information to be retrieved from an external source and they enable the
result of operations to be saved and retrieved.
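
The stored-program arrangement described above, a processing element stepping through instructions held in
memory while a control unit can change the order of operations, can be sketched as a toy fetch-decode-execute
loop. The instruction set below is invented purely for illustration and does not correspond to any real
processor.

    # A toy stored-program machine: instructions and data share one memory,
    # and a conditional jump shows the control unit altering the sequence of
    # operations in response to stored information.
    memory = [
        ("LOAD", 100),    # 0: accumulator = memory[100]
        ("ADD", 101),     # 1: accumulator += memory[101]
        ("STORE", 102),   # 2: memory[102] = accumulator
        ("JUMPZ", 5),     # 3: if accumulator == 0, jump to address 5
        ("PRINT", 102),   # 4: send the stored result to the output device
        ("HALT", 0),      # 5: stop
    ] + [0] * 94 + [2, 3, 0]   # data: memory[100] = 2, memory[101] = 3

    accumulator, pc = 0, 0                 # processing element state
    while True:
        opcode, operand = memory[pc]       # fetch and decode
        pc += 1
        if opcode == "LOAD":
            accumulator = memory[operand]
        elif opcode == "ADD":
            accumulator += memory[operand]
        elif opcode == "STORE":
            memory[operand] = accumulator
        elif opcode == "JUMPZ" and accumulator == 0:
            pc = operand                   # control unit changes the flow
        elif opcode == "PRINT":
            print(memory[operand])         # prints 5 for this program
        elif opcode == "HALT":
            break

Changing the contents of memory changes the program, which is exactly what distinguishes a general-purpose
computer from a fixed-function calculating device.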
