
WHAT IS A COMPUTER?

A computer is an electronic device: a programmable machine designed to carry out a sequence of arithmetic or logical operations automatically.

History of computing
The first recorded use of the word "computer" was in 1613, referring to a person who carried out calculations, or computations, and the word kept that meaning until the middle of the 20th century. A few early devices are worth mentioning, though. Some mechanical aids to computing were very successful and survived for centuries until the advent of the electronic calculator, such as the Sumerian abacus, designed around 2500 BC, a descendant of which won a speed competition against a modern desk calculating machine in Japan in 1946.

A portrait of Jacquard, woven in silk on a Jacquard loom, required 24,000 punched cards to create (1839) and was only produced to order. Charles Babbage owned one of these portraits; it inspired him to use perforated cards in his analytical engine. Henry Babbage, Charles Babbage's son, completed a simplified version of the analytical engine's computing unit (the mill) in 1888 and gave a successful demonstration of its use in computing tables in 1906. This machine was given to the Science Museum in South Kensington in 1910.

1. Limited-function early computers
In 1801, Joseph Marie Jacquard made an improvement to the textile loom by introducing a series of punched paper cards as a template, which allowed his loom to weave intricate patterns automatically. The resulting Jacquard loom was an important step in the development of computers because the use of punched cards to define woven patterns can be viewed as an early, albeit limited, form of programmability.

2. First general-purpose computers
George Stibitz is internationally recognized as a father of the modern digital computer. While working at Bell Labs in November 1937, Stibitz invented and built a relay-based calculator he dubbed the "Model K" (for "kitchen table", on which he had assembled it), which was the first to use binary circuits to perform an arithmetic operation. Later models added greater sophistication, including complex arithmetic and programmability.

3. Stored-program architecture
Nearly all modern computers implement some form of the stored-program architecture, making it the single trait by which the word "computer" is now defined. While the technologies used in computers have changed dramatically since the first electronic, general-purpose computers of the 1940s, most still use the von Neumann architecture.

4. Semiconductors and microprocessors
Computers using vacuum tubes as their electronic elements were in use throughout the 1950s, but by the 1960s they had been largely replaced by transistor-based machines, which were smaller, faster, cheaper to produce, required less power, and were more reliable. The first transistorised computer was demonstrated at the University of Manchester in 1953.[31] In the 1970s, integrated circuit technology and the subsequent creation of microprocessors, such as the Intel 4004, further decreased the size and cost of computers and further increased their speed and reliability.
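The stored-program idea in item 3 above can be made concrete with a minimal sketch. The following Python snippet is only an illustration: the three-opcode instruction set (LOAD, ADD, HALT), the memory layout, and the accumulator/program-counter names are assumptions for the example, not any historical machine. It shows a fetch-decode-execute loop in which instructions and data share a single memory:

# Minimal stored-program machine sketch: instructions and data live in the
# same memory, and a fetch-decode-execute loop walks through them.
# The instruction set (LOAD, ADD, HALT) is made up for illustration.
memory = [
    ("LOAD", 5),    # address 0: acc = memory[5]
    ("ADD", 6),     # address 1: acc = acc + memory[6]
    ("HALT", None), # address 2: stop
    None, None,     # addresses 3-4: unused
    2,              # address 5: data
    3,              # address 6: data
]

acc = 0                              # accumulator register
pc = 0                               # program counter
while True:
    opcode, operand = memory[pc]     # fetch and decode
    pc += 1
    if opcode == "LOAD":             # execute
        acc = memory[operand]
    elif opcode == "ADD":
        acc += memory[operand]
    elif opcode == "HALT":
        break

print(acc)   # prints 5 (2 + 3)

Because the program is itself just data in memory, it can be read, copied, or rewritten like any other data, which is the essence of the stored-program architecture.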


Hardware:- The term hardware covers all of those parts of a computer that are tangible objects. Circuits, displays, power supplies, cables, keyboards, printers and mice are all hardware.
Generations of computing hardware:

First Generation (Mechanical/Electromechanical)
Calculators: Antikythera mechanism, Difference engine, Norden bombsight
Programmable Devices: Jacquard loom, Analytical engine, Harvard Mark I, Z3

Second Generation (Vacuum Tubes)
Calculators: Atanasoff-Berry Computer, IBM 604, UNIVAC 60, UNIVAC 120
Programmable Devices: Colossus, ENIAC, Manchester Small-Scale Experimental Machine, EDSAC, Manchester Mark 1, Ferranti Pegasus, Ferranti Mercury, CSIRAC, EDVAC, UNIVAC I, IBM 701, IBM 702, IBM 650, Z22

Third Generation (Discrete transistors and SSI, MSI, LSI integrated circuits)
Mainframes: IBM 7090, IBM 7080, IBM System/360, BUNCH
Minicomputers: PDP-8, PDP-11, IBM System/32, IBM System/36

Fourth Generation (VLSI integrated circuits)
32-bit microcomputers: Intel 80386, Pentium, Motorola 68000, ARM architecture
64-bit microcomputers: Alpha, MIPS, PA-RISC, PowerPC, SPARC, x86-64
Embedded computers: Intel 8048, Intel 8051
ANALOG AND DIGITAL


Analog:- Describes a device or system that represents changing values as continuously variable physical quantities. A typical analog device is a clock in which the hands move continuously around the face.

When used in reference to data storage and transmission, analog format is one in which information is transmitted by modulating a continuous transmission signal, such as amplifying a signal's strength or varying its frequency to add or take away data. Computers, which handle data in digital form, require modems to turn signals from digital to analog before transmitting those signals over communication lines, such as telephone lines, that carry only analog signals. The signals are turned back into digital form (demodulated) at the receiving end so that the computer can process the data in its digital format.
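As a rough illustration of that modulate/demodulate round trip, the Python sketch below encodes bits as high- or low-amplitude bursts of a sine-wave carrier and recovers them by measuring each burst's average magnitude. It is a toy form of amplitude modulation under assumed parameters (the carrier frequency, samples per bit, and decision threshold are arbitrary), not how any real modem standard works:

import math

CARRIER_HZ = 4          # cycles of carrier per bit period (arbitrary)
SAMPLES_PER_BIT = 64    # how finely each burst is sampled (arbitrary)

def modulate(bits):
    """Turn each bit into a burst of sine-wave samples: full amplitude
    for 1, low amplitude for 0 (a crude amplitude modulation)."""
    signal = []
    for bit in bits:
        amplitude = 1.0 if bit else 0.2
        for n in range(SAMPLES_PER_BIT):
            t = n / SAMPLES_PER_BIT
            signal.append(amplitude * math.sin(2 * math.pi * CARRIER_HZ * t))
    return signal

def demodulate(signal):
    """Recover bits by averaging the magnitude of each burst and comparing
    it against a threshold halfway between the two amplitude levels."""
    bits = []
    for i in range(0, len(signal), SAMPLES_PER_BIT):
        burst = signal[i:i + SAMPLES_PER_BIT]
        level = sum(abs(s) for s in burst) / len(burst)
        bits.append(1 if level > 0.4 else 0)
    return bits

original = [1, 0, 1, 1, 0, 0, 1]
assert demodulate(modulate(original)) == original   # round trip succeeds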

Digital:- Describes any system based on discontinuous data or events.


Computers are digital machines because at their most basic level they can distinguish between just two values, 0 and 1, or off and on. There is no simple way to represent all the values in between, such as 0.25. All data that a computer processes must be encoded digitally, as a series of zeroes and ones.

The opposite of digital is analog. A typical analog device is a clock in which the hands move continuously around the face. Such a clock is capable of indicating every possible time of day. In contrast, a digital clock is capable of representing only a finite number of times (every tenth of a second, for example).

Most analog events, however, can be simulated digitally. Photographs in newspapers, for instance, consist of an array of dots that are either black or white. From afar, the viewer does not see the dots (the digital form), but only lines and shading, which appear to be continuous. Although digital representations are approximations of analog events, they are useful because they are relatively easy to store and manipulate electronically. The trick is in converting from analog to digital, and back again.
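A tiny Python sketch of that analog-to-digital approximation (the grayscale sample values are made up for illustration): a row of continuous intensities between 0.0 and 1.0 is reduced to black-or-white "dots", mirroring the newspaper example above:

# Continuous (analog-like) grayscale intensities between 0.0 and 1.0;
# the specific values are made up for illustration.
row = [0.05, 0.20, 0.45, 0.60, 0.75, 0.90]

# Digitize each sample to one of exactly two values: 0 (black) or 1 (white).
# Everything between the two levels is lost -- the digital form is only an
# approximation of the analog original.
dots = [1 if intensity >= 0.5 else 0 for intensity in row]

print(dots)   # [0, 0, 0, 1, 1, 1]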

ASCII:- The American Standard Code for Information Interchange is a character-encoding scheme originally based on the English alphabet. ASCII codes represent text in computers, communications equipment, and other devices that use text. Most modern character-encoding schemes are based on ASCII, though they support many more characters than ASCII does. US-ASCII is the Internet Assigned Numbers Authority (IANA) preferred charset name for ASCII.
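To see the ASCII codes directly, here is a short Python sketch (the example string is arbitrary; Python's built-in ord() and str.encode() do the conversion):

text = "Hi!"                          # arbitrary example string

codes = [ord(ch) for ch in text]      # ASCII code of each character
print(codes)                          # [72, 105, 33]

print(text.encode("ascii"))           # the same text as ASCII-encoded bytes
print(bytes(codes).decode("ascii"))   # and back again: 'Hi!'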

ASCII includes definitions for 128 characters: 33 are non-printing control characters (now mostly obsolete) that affect how text and white space are processed, and 95 are printable characters, including the space, which is considered an invisible graphic. The most commonly used character encoding on the World Wide Web was US-ASCII until December 2007, when it was surpassed by UTF-8.

IP ADDRESS:- An Internet Protocol address (IP address) is a numerical label assigned to each device (e.g., computer, printer) participating in a computer network that uses the Internet Protocol for communication.[1] An IP address serves two principal functions: host or network interface identification and location addressing. Its role has been characterized as follows: "A name indicates what we seek. An address indicates where it is. A route indicates how to get there."

PROTOCOL:- An agreed-upon format for transmitting data between two devices. The protocol determines the following: the type of error checking to be used; the data compression method, if any; and how the sending device will indicate that it has finished sending a message. There are a variety of standard protocols from which programmers can choose. Each has particular advantages and disadvantages; for example, some are simpler than others, some are more reliable, and some are faster. From a user's point of view, the only interesting aspect of protocols is that your computer or device must support the right ones if you want to communicate with other computers. The protocol can be implemented either in hardware or in software.
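As a toy illustration of two of the protocol responsibilities listed above (indicating where a message ends and checking it for errors; compression is omitted), the Python sketch below defines a made-up frame format, not any standard protocol: a 2-byte length prefix, the payload bytes, and a 1-byte checksum:

def frame(payload: bytes) -> bytes:
    """Wrap a payload in a made-up frame: 2-byte length, payload, 1-byte
    checksum. The length prefix tells the receiver where the message ends,
    and the checksum is the error check."""
    checksum = sum(payload) % 256
    return len(payload).to_bytes(2, "big") + payload + bytes([checksum])

def unframe(frame_bytes: bytes) -> bytes:
    """Parse a frame produced by frame() and verify its checksum."""
    length = int.from_bytes(frame_bytes[:2], "big")
    payload = frame_bytes[2:2 + length]
    checksum = frame_bytes[2 + length]
    if sum(payload) % 256 != checksum:
        raise ValueError("checksum mismatch: data was corrupted in transit")
    return payload

message = "hello".encode("ascii")
assert unframe(frame(message)) == message   # round trip succeeds

A real protocol adds much more (addressing, retransmission, negotiation), but the shape is the same: both sides must agree on the format in advance, whether it is implemented in hardware or in software.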
