History of computing

The history of computing is longer than the history of computing hardware and
modern computing technology; it includes the history of methods intended for pen
and paper or for chalk and slate, with or without the aid of tables. The timeline of
computing presents a summary list of major developments in computing by date.

History of computing
A computer might be described with deceptive simplicity as an apparatus that performs routine
calculations automatically. Such a definition would owe its deceptiveness to a naive and narrow
view of calculation as a strictly mathematical process. In fact, calculation underlies many
activities that are not normally thought of as mathematical. Walking across a room, for instance,
requires many complex, albeit subconscious, calculations. Computers, too, have proved capable
of solving a vast array of problems, from balancing a checkbook to even (in the form of
guidance systems for robots) walking across a room.
Before the true power of computing could be realized, therefore, the naive view of calculation
had to be overcome. The inventors who laboured to bring the computer into the world had to
learn that the thing they were inventing was not just a number cruncher, not merely a calculator.
For example, they had to learn that it was not necessary to invent a new computer for every new
calculation and that a computer could be designed to solve numerous problems, even problems
not yet imagined when the computer was built. They also had to learn how to tell such a general
problem-solving computer what problem to solve. In other words, they had to invent
programming.
They had to solve all the heady problems of developing such a device, of implementing the
design, of actually building the thing. The history of solving these problems is the history
of the computer. That history is covered in this section.

Nanocomputing
Nanocomputing describes computing that uses extremely small, or nanoscale, devices (one
nanometer [nm] is one billionth of a meter). In 2001, state-of-the-art electronic devices could be
as small as about 100 nm, which is about the same size as a virus. The integrated circuit (IC)
industry, however, looks to the future to determine the smallest electronic devices possible within
the limits of computing technology.
Until the mid-1990s, the term "nanoscale" generally denoted circuit features smaller than 100
nm. Since the IC industry began building commercial devices at such size scales in the early
2000s, the term "nanocomputing" has come to be reserved for device features well below 50 nm,
down to the size of individual molecules, which measure only a few nanometers. Scientists and
engineers are only beginning to conceive new ways to approach computing using extremely
small devices and individual molecules.
All computers must operate by basic physical processes. Contemporary digital computers use
currents and voltages in tens of millions of complementary metal oxide semiconductor (CMOS)
transistors covering a few square centimeters of silicon. If device dimensions could be scaled
down by a factor of 10 or even 100, then circuit functionality would increase 100 to 10,000
times.
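The arithmetic behind these figures is simple area scaling: devices tile a chip in two dimensions,
so a linear shrink by a factor of k packs roughly k squared more devices into the same area. A
minimal sketch in Python (illustrative only, not from the source):

    # Rough density arithmetic: devices tile a chip in two dimensions, so a
    # linear shrink by a factor k packs roughly k**2 more devices per unit area.
    def density_gain(linear_shrink_factor):
        return linear_shrink_factor ** 2

    for k in (10, 100):
        print(f"{k}x linear shrink -> ~{density_gain(k):,}x more devices per area")
    # 10x linear shrink -> ~100x more devices per area
    # 100x linear shrink -> ~10,000x more devices per area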
Furthermore, if such a new device or computer architecture were to be developed, this might lead
to millionfold increases in computing power. Such circuits would consume far less power per
function, increasing battery life and shrinking boxes and fans necessary to cool circuits. Also,
they would be remarkably fast and able to perform calculations that are not yet possible on any
computer. Benefits of significantly faster computers include greater accuracy in predicting weather
patterns, recognizing complex figures in images, and developing artificial intelligence (AI).
Potentially, single-chip memories containing thousands of gigabytes of data will be developed,
capable of holding entire libraries of books, music, or movies.
Modern transistors are engineering marvels, requiring hundreds of careful processing steps
performed in ultraclean environments. Today's transistors operate with microampere currents
and only a few thousand electrons generating the signals, but as they are scaled down, fewer
electrons are available to create the large voltage swings required of them. This compels
scientists and engineers to seek new physical phenomena that will allow information processing
to occur using mechanisms other than those currently employed for transistor action.
Future nanocomputers could be evolutionary, scaled-down versions of today's computers,
working in essentially the same ways and with similar but nanoscale devices. Or they may be
revolutionary, being based on some new device or molecular structure not yet developed.
Research on nano-devices is aimed at learning the physical properties of very small structures
and then determining how these can be used to perform some kind of computing functions.
Current nanocomputing research involves the study of very small electronic devices and
molecules, their fabrication, and architectures that can benefit from their inherent electrical
properties. Nanostructures that have been studied include semiconductor quantum dots,
single-electron structures, and various molecules. Very small particles of material confine electrons in
ways that large ones do not, so that the quantum mechanical nature of the electrons becomes
important.
Quantum dots behave like artificial atoms and molecules in that the electrons inside them can
have only certain values of energy, which can be used to represent logic information robustly.
Another area is that of "single-electron devices," which, as the name implies, represent
information by the behavior of a single electron. The ultimate scaled-down electronic
devices are individual molecules on the size scale of a nanometer.
Chemists can synthesize molecules easily and in large quantities; these can be made to act as
switches or charge containers of almost any desirable shape and size. One molecule that has
attracted considerable interest is deoxyribonucleic acid (DNA), best known
from biology. Ideas for attaching smaller molecules, called "functional groups," to the molecules
and creating larger arrays of DNA for computing are under investigation. These are but a few of
the many approaches being considered.
In addition to discovering new devices on the nanoscale, it is critically important to devise new
ways to interconnect these devices for useful applications. One potential architecture is that of
cellular neural networks (CNN), in which each device is connected to its neighbors; as inputs are
provided at one edge, the interconnects cause changes in the devices to sweep like a wave across
the array, providing an output at the opposite edge.
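To make the picture concrete, here is a toy sketch in Python (a deliberate simplification, not a
model of a real CNN cell): a one-dimensional array in which each cell simply follows its left-hand
neighbor, so a value applied at one edge sweeps across to the other.

    # Toy picture of the CNN idea: each cell couples only to its neighbors,
    # and an input applied at one edge sweeps across the array like a wave.
    # The copy-from-left update rule is a simplification, not a real CNN model.
    def sweep(n_cells, edge_input):
        cells = [0] * n_cells
        cells[0] = edge_input                # drive the left edge
        for _ in range(n_cells - 1):         # repeat until the wave crosses
            for i in range(n_cells - 1, 0, -1):
                cells[i] = cells[i - 1]      # each cell follows its left neighbor
        return cells

    print(sweep(8, 1))   # [1, 1, 1, 1, 1, 1, 1, 1] -- the input reached the far edge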
An extension of the CNN concept is that of quantum-dot cellular automata (QCA). This
architecture uses arrangements of single electrons that communicate with each other by Coulomb
repulsion over large arrays. The arrangement of electrons at the edges provides the
computational output. The electron arrangements of QCA are controlled by an external clock and
operate according to the rules of Boolean logic.
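Since QCA computes Boolean logic, a small sketch may help: the logic primitive usually
associated with QCA arrays is the three-input majority gate, which reduces to AND or OR when
one input is pinned. The Python below illustrates the logic only, not the underlying electron
physics.

    # The logic primitive usually associated with QCA is the three-input
    # majority gate: the output is the bit that appears on at least two inputs.
    def majority(a, b, c):
        return 1 if a + b + c >= 2 else 0

    def and_gate(a, b):
        return majority(a, b, 0)   # pin one input to 0 -> AND

    def or_gate(a, b):
        return majority(a, b, 1)   # pin one input to 1 -> OR

    assert and_gate(1, 1) == 1 and and_gate(1, 0) == 0
    assert or_gate(0, 0) == 0 and or_gate(1, 0) == 1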
Another potential architecture is that of "crossbar switching," in which molecules are placed at
the intersections of nanometer-scale wires. These molecules couple the wires together and
provide the computing functionality.
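One way to picture the crossbar idea (an illustrative model, not a device simulation) is as a
boolean matrix: a molecule at crossing (i, j) couples horizontal wire i to vertical wire j, and
driving one horizontal wire while sensing the vertical wires reads out a row of bits.

    # Crossbar as a boolean matrix: junction[i][j] is True when a molecule at
    # crossing (i, j) couples horizontal wire i to vertical wire j.  Driving one
    # horizontal wire and sensing the vertical wires reads out one row of bits.
    junction = [
        [True,  False, True ],
        [False, True,  False],
    ]

    def read_row(row):
        return [junction[row][col] for col in range(len(junction[row]))]

    print(read_row(0))   # [True, False, True]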
The fabrication of these nanoscale systems is also a critical area of investigation. Current ICs are
manufactured in a parallel process in which short wavelength light exposes an entire IC in one
flash, taking only a fraction of a second. Serial processes, in which each device is exposed
separately, are too slow as of early 2002 to expose billions of devices in a reasonable amount of
time. Serial processes that are capable of attaining nanometer, but not molecular, resolution
include using beams of electrons or ions to write patterns on an IC. Atomic resolution can be
achieved by using currents from very sharp tips, a process called scanning probe lithography, to
write on surfaces one atom at a time, but this technique is too slow for manufacturing unless
thousands of tips can be used in parallel.
It is reasonable to search for nanoscale particles, such as molecules, that do not require difficult
fabrication steps. An alternative to the direct patterning of nanoscale system components is that
of self-assembly, a process in which small particles or molecules arrange themselves. Regardless
of the method used to create arrays of nanostructures, organizing the nanodevices into useful
architectures, getting data in and out, and performing computing are problems that have not yet
been solved.
In summary, nanocomputing technology has the potential for revolutionizing the way that
computers are used. However, in order to achieve this goal, major progress in device technology,
computer architectures, and IC processing must first be accomplished. It may take decades
before revolutionary nanocomputing technology becomes commercially feasible.
Quantum computing

Quantum computing studies theoretical computation systems (quantum computers) that make
direct use of quantum-mechanical phenomena, such as superposition and entanglement, to
perform operations on data.[1] Quantum computers are different from binary digital electronic
computers based on transistors. Whereas common digital computing requires that the data be
encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1),
quantum computation uses quantum bits (qubits), which can be in a continuous range of
superpositions of those states. A quantum Turing machine is a theoretical model of such a computer,
and is also known as the universal quantum computer. Quantum computers share theoretical
similarities with non-deterministic and probabilistic computers. The field of quantum computing
was initiated by the work of Paul Benioff[2] and Yuri Manin in 1980,[3] Richard Feynman in 1982,[4]
and David Deutsch in 1985.[5] A quantum computer with spins as quantum bits was also
formulated for use as a quantum spacetime in 1968.[6]
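To illustrate what a superposition of states means for a single qubit, here is a minimal sketch
(assuming Python with NumPy; illustrative only): the qubit is a normalized vector of two complex
amplitudes, and measurement yields each basis state with probability equal to the squared
magnitude of its amplitude.

    import numpy as np

    # A qubit as a normalized vector of two complex amplitudes over the basis
    # states |0> and |1>; measuring yields each basis state with probability
    # equal to the squared magnitude of its amplitude.
    ket0 = np.array([1, 0], dtype=complex)
    ket1 = np.array([0, 1], dtype=complex)

    psi = (ket0 + ket1) / np.sqrt(2)   # an equal superposition of |0> and |1>
    probs = np.abs(psi) ** 2
    print(probs)                       # [0.5 0.5] -- a 50/50 measurement outcome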

Developments
There are a number of quantum computing models, distinguished by the basic elements in which
the computation is decomposed. The four main models of practical importance are:

- Quantum gate array (computation decomposed into a sequence of few-qubit quantum gates)

- One-way quantum computer (computation decomposed into a sequence of one-qubit
measurements applied to a highly entangled initial state, or cluster state)

- Adiabatic quantum computer, based on quantum annealing (computation decomposed into a
slow continuous transformation of an initial Hamiltonian into a final Hamiltonian whose ground
states contain the solution)[36]

- Topological quantum computer[37] (computation decomposed into the braiding of anyons in a
2D lattice)

The quantum Turing machine is theoretically important, but direct implementation of this model
is not pursued. All four models of computation have been shown to be equivalent; each can
simulate the others with no more than polynomial overhead.
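As a concrete illustration of the gate-array model (a minimal sketch, not a description of any
physical machine), the following applies a Hadamard gate and then a CNOT to two qubits
represented as a state vector, producing the entangled Bell state (|00> + |11>)/sqrt(2):

    import numpy as np

    # Gate-array model in miniature: a computation is a sequence of small
    # unitary gates applied to a state vector.  A Hadamard on the first qubit
    # followed by a CNOT produces the Bell state (|00> + |11>)/sqrt(2).
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.array([1, 0, 0, 0], dtype=complex)   # the two-qubit state |00>
    state = np.kron(H, I) @ state                   # Hadamard on the first qubit
    state = CNOT @ state                            # entangle the pair
    print(np.round(state.real, 3))  # [0.707 0.    0.    0.707] -- a Bell state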
For physically implementing a quantum computer, many different candidates are being pursued,
among them (distinguished by the physical system used to realize the qubits):

- Superconductor-based quantum computers (including SQUID-based quantum
computers)[38][39] (qubit implemented by the state of small superconducting circuits, i.e.,
Josephson junctions)

- Trapped-ion quantum computer (qubit implemented by the internal state of trapped ions)

- Optical lattices (qubit implemented by internal states of neutral atoms trapped in an optical
lattice)

- Quantum dot computer, spin-based (e.g., the Loss-DiVincenzo quantum computer[40]) (qubit
given by the spin states of trapped electrons)

- Quantum dot computer, spatial-based (qubit given by electron position in a double quantum
dot)[41]

- Nuclear magnetic resonance on molecules in solution (liquid-state NMR) (qubit provided by
nuclear spins within the dissolved molecule)

- Solid-state NMR Kane quantum computers (qubit realized by the nuclear spin state of
phosphorus donors in silicon)

- Electrons-on-helium quantum computers (qubit is the electron spin)

- Cavity quantum electrodynamics (CQED) (qubit provided by the internal state of trapped
atoms coupled to high-finesse cavities)

- Molecular magnet[42] (qubit given by spin states)

- Fullerene-based ESR quantum computer (qubit based on the electronic spin of atoms or
molecules encased in fullerenes)

- Linear optical quantum computer (qubits realized by processing states of different modes of
light through linear elements, e.g., mirrors, beam splitters, and phase shifters)[43]

- Diamond-based quantum computer[44][45][46] (qubit realized by the electronic or nuclear spin
of nitrogen-vacancy centers in diamond)

- Bose-Einstein condensate-based quantum computer[47]

- Transistor-based quantum computer (string quantum computers with entrainment of positive
holes using an electrostatic trap)

- Rare-earth-metal-ion-doped inorganic crystal based quantum computers[48][49] (qubit realized
by the internal electronic state of dopants in optical fibers)

- Metallic-like carbon nanospheres based quantum computers[50]

The large number of candidates demonstrates that the topic, in spite of rapid progress, is still in
its infancy. There is also a vast amount of flexibility.
Nanocomputing Technologies

Nanocomputing
A nanocomputer is similar in many respects to the modern personal computer, but on a very
much smaller scale. Access to several thousand (or even millions) of nanocomputers, depending
on your needs or requirements, gives a whole new meaning to the expression "unlimited
computing": you may be able to gain a lot more power for less money.
Nanocomputing is evolving along two distinct paths:

- New nanoproducts, techniques, and enhancements will be integrated into current
technology such as the PC, the mainframe, and servers of all types. Mass storage will
change significantly as thousands of cheap storage devices become available.
Storage need never be a problem or a major cost again.

- Research and development are working toward entirely new nanocomputers that
run software similar to that on today's PC.

We can make a number of statements about nanocomputing that put it into perspective with a
healthy dose of reality, and less hype:

- Nanocomputing is an emerging technology that is at an early stage of its development.

- Worldwide initiatives are in progress to develop the technology. Japan, Europe, and the
United States are in a race to the finish line.

- Breakthroughs and announcements are coming with increasing rapidity.

- Governments are beginning to see the potential and are investing heavily in research and
development programs. This interest and investment will accelerate progress.

- Nanocomputing shows great potential, but there are significant technical barriers and
obstacles to overcome.

- The true technology of nanocomputing will not be available for some time. Much
development work has to be completed before success can be claimed.

- Nanocomputing will come from two sources:
  - It will be integrated into existing products and technology (disk drives, for example).
  - Fundamentally new products, software, and architectures will be developed.

- More hype will follow due to media exploitation.

Major corporations such as IBM, Intel, Motorola, and HP are investing significant amounts of
money in research to develop nanocomputers. The market for such devices far surpasses the
market for the everyday PC. The new technology may even become known as the personal
nanocomputer (PN).
