
Intel Company History

Intel Corporation is the largest semiconductor manufacturer in the world, with major facilities in the United States, Europe, and Asia. Intel has changed the world dramatically since it was founded in 1968; the company invented the microprocessor, the 'computer on a chip' that made possible the first handheld calculators and personal computers (PCs). By the early 21st century, Intel's microprocessors were found in more than 80 percent of PCs worldwide. The company's product line also includes chipsets and motherboards; flash memory used in wireless communications and other applications; hubs, switches, routers, and other products for Ethernet networks; and embedded control chips used in networking products, laser printers, imaging devices, storage media, and other applications. Intel remained competitive through a combination of clever marketing, well-supported research and development, superior manufacturing proficiency, a vital corporate culture, legal proficiency, and an ongoing alliance with software giant Microsoft Corporation often referred to as 'Wintel.'

1968-79: From DRAM to the 8086

Intel's founders, Robert Noyce and Gordon Moore, were among the eight founders of Fairchild Semiconductor, established in 1957. While at Fairchild, Noyce co-invented the integrated circuit, and in 1968 he and Moore decided to form their own company. They were soon joined by Andrew Grove, a Hungarian refugee who had arrived in the United States in 1956 and joined Fairchild in 1963. Grove would remain president and CEO of Intel into the 1990s.

To obtain start-up capital, Noyce and Moore approached Arthur Rock, a venture capitalist, with a one-page business plan simply stating their intention of developing large-scale integrated circuits. Rock, who had helped start Fairchild Semiconductor as well as Teledyne and Scientific Data Systems, had confidence in Noyce and Moore and provided $3 million in capital. The company was incorporated on July 18, 1968, as N M Electronics (the letters standing for Noyce and Moore) but quickly changed its name to Intel, formed from the first syllables of 'integrated electronics.' Intel gathered another $2 million in capital before going public in 1971.

Noyce and Moore's scanty business proposal belied a clear plan to produce large-scale integrated (LSI) semiconductor memories. At that time, semiconductor memories were ten times more expensive than standard magnetic core memories. Costs were falling, however, and Intel's founders felt that with the greater speed and efficiency of LSI technology, semiconductors would soon replace magnetic cores. Within a few months of its start-up, Intel produced the 3101 Schottky bipolar memory, a high-speed random access memory (RAM) chip. The 3101 proved popular enough to sustain the company until the 1101, a metal oxide semiconductor (MOS) chip, was perfected and introduced in 1969. The following year, Intel introduced the 1103, a one-kilobit (1K) dynamic RAM, or DRAM, which was the first chip large enough to store a significant amount of information. With the 1103, Intel finally had a chip that really did begin to replace magnetic cores; DRAMs eventually proved indispensable to the personal computer.

The company's most dramatic impact on the computer industry came with its 1971 introduction of the 4004, the world's first microprocessor. Like many of Intel's innovations, the microprocessor was a byproduct of efforts to develop another technology.
When a Japanese calculator manufacturer, Busicom, asked Intel to design cost-effective chips for a series of calculators, Intel engineer Ted Hoff was assigned to the project; during his search for such a design, Hoff conceived a plan for a central processing unit (CPU) on one chip. The 4004, which crammed 2,300 transistors onto a one-eighth- by one-sixth-inch chip, had the power of the old 3,000-cubic-foot ENIAC computer, which depended on 18,000 vacuum tubes. Although Intel initially focused on the microprocessor as a computer enhancement that would allow users to add more memory to their units, the microprocessor's great potential--for everything from calculators to cash registers and traffic lights--soon became clear. These applications were facilitated by Intel's introduction of the 8008, an 8-bit microprocessor developed along with the 4004 but oriented toward data and character (rather than arithmetic) manipulation.

The 8080, introduced in 1974, was the first truly general-purpose microprocessor. For $360, Intel sold a whole computer on one chip, while conventional computers sold for thousands of dollars. The response was overwhelming. The 8080 soon became the industry standard and Intel the industry leader in the 8-bit market. In response to ensuing competition in the manufacture of 8-bit microprocessors, Intel introduced the 8085, a faster chip with more functions. The company was also developing two more advanced projects, the 32-bit 432 and the 16-bit 8086. The 8086 was introduced in 1978 but took two years to achieve wide use, and during this time Motorola produced a competing chip (the 68000) that seemed to be selling faster. Intel responded with a massive sales effort to establish its architecture as the standard. When International Business Machines Corporation (IBM) chose the 8088, the 8086's 8-bit-bus cousin, for its personal computer in 1980, Intel seemed to have beaten out the competition.

During the 1970s, Intel had also developed the erasable programmable read-only memory (EPROM), another revolutionary but unintended research byproduct. Intel physicist Dov Frohman was working on the reliability problems of the silicon gate used in the MOS process when he realized that the disconnected, or 'floating,' gates that were causing malfunctions could be used to create a chip that was erasable and reprogrammable. Since conventional ROM chips had to be permanently programmed during manufacture, any change required the manufacture of a whole new chip. With EPROM, however, Intel could offer customers chips that could be erased with ultraviolet light and reprogrammed electrically. At its introduction in 1971, EPROM was a novelty without much of a market, but the microprocessor, invented at the same time, created a demand for memory, and the EPROM offered memory that could be conveniently used to test microprocessors.

Another major development at Intel during this time was that of peripheral controller chips. Streamlined for specific tasks and stripped of unneeded functions, peripheral chips could greatly increase a computer's abilities without raising software development costs. One of Intel's most important developments in peripherals was the coprocessor, first introduced in 1980. Coprocessor chips were an extension of the CPU that could handle specific compute-intensive tasks more efficiently than the CPU itself. Once again, innovation kept Intel ahead of its competition.

Intel's rapid growth, from the 12 employees at its founding in 1968 to 15,000 in 1980, demanded a careful approach to corporate culture. Noyce, Moore, and Grove, who remembered their frustration with Fairchild's bureaucratic bottlenecks, found that defining a workable management style was important. Informal weekly lunches with employees kept communication lines open while the company was small, but that system eventually became unwieldy. The founders therefore installed a carefully outlined program emphasizing openness, decision making at the lowest levels, discipline, and problem solving rather than paper shuffling. Moreover, the company's top executives eschewed such luxuries as limousines, expense-account lunches, and private parking spaces to establish a sense of teamwork with their subordinates.
In an interview with the Harvard Business Review in 1980, Noyce remarked on the company's hiring policy, stating, 'we expect people to work hard. We expect them to be here when they are committed to be here; we measure absolutely everything that we can in terms of performance.' Employee incentives included options on Intel stock, and technological breakthroughs were celebrated with custom-bottled champagne--'Vintage Intel' marked the first $250 million quarter, in 1983, the year sales reached $1 billion for the first time.

1980s: From 286 to 486

During the 1974 recession, Intel had been forced to lay off 30 percent of its employees, and morale declined substantially as a result. Thus, in 1981, when economic struggles again surfaced, instead of laying off more employees, Intel accelerated new product development with the '125 Percent Solution,' which asked exempt employees to work two extra hours per day, without pay, for six months. A brief surge in sales the following year did not last, and again, instead of more layoffs, Intel imposed pay cuts of up to ten percent. Such measures were not popular across the workforce, but by June 1983 all cuts had been restored and retroactive raises had been made. Moreover, in December 1982 IBM paid $250 million for a 12 percent share of Intel, giving the company not only a strong capital boost but also strong ties to the undisputed industry leader. IBM would eventually increase its stake to 20 percent before selling its Intel stock in 1987.

During the early 1980s, Intel began to slip in some of its markets. Fierce competition in DRAMs, static RAMs, and EPROMs left Intel concentrating on microprocessors. While competitors claimed that Intel simply gave away its DRAM market, Moore told Business Week in 1988 that the company deliberately focused on microprocessors as the least cyclical field in which to operate. Customer service, an area Intel had been able to overlook for years as it dominated its markets, became more important as highly efficient Japanese and other increasingly innovative competitors challenged Intel's position. In addition, Intel's manufacturing record, strained in years past by undercapacity, needed fixing. Fab 7, Intel's seventh wafer-fabrication plant, opened in 1983 only to face two years of troubled operations before reaching full capacity. Between 1984 and 1988, Intel closed eight old plants, and in 1988 it spent some $450 million on new technology to bring its manufacturing capacity into line with its developmental prowess.

Despite these retrenchments, the company continued to excel in the microprocessor market. In 1982 Intel introduced its 80286 microprocessor, the chip that quickly came to dominate the upper-end PC market when IBM came out with the 286-powered PC/AT. The 286 was followed in 1985 by Intel's 80386 chip, popularized in 1987 by the Compaq DESKPRO 386, which, despite bugs when it first came out, became one of the most popular chips on the market. While the 286 brought to the personal computer a speed and power that gave larger computers their first real challenge, the 386 offered even greater speed and power together with the ability to run more than one program at a time. The 386 featured 32-bit architecture and 275,000 transistors--more than twice the number in the 286.

In 1989 Intel introduced the 80486, a chip Business Week heralded as 'a veritable mainframe-on-a-chip.' The 486 included 1.2 million transistors and the first built-in math coprocessor, and was 50 times faster than the 4004, the first microprocessor. In designing the i486, Intel resisted an industry trend toward RISC (reduced instruction-set computing), a chip design that eliminated rarely used instructions in order to gain speed. Intel argued that what RISC chips gained in speed they lost in flexibility, and that, moreover, RISC chips were not compatible with software already on the market, which Intel felt would secure the 486's position. A new chip, the 64-bit i860 announced in early 1989, did, however, make use of RISC technology to offer what Intel claimed would be a 'supercomputer on a chip.'

Also in 1989, a major lawsuit that Intel had filed against NEC Corporation five years before was decided. Intel had claimed that NEC violated its copyright on the microcode, or embedded software instructions, of Intel's 8086 and 8088 chips.
Although Intel had licensed NEC to produce the microcode, NEC had subsequently designed a similar chip of its own. At issue was whether microcode could be copyrighted. The court ruled that it could, but that NEC had not violated any copyright in the case at hand. The suit brought to light some issues surrounding Intel's reputation. Some rivals and consumers, for example, claimed that Intel used its size and power to stifle competition through such tactics as filing 'meritless' lawsuits and tying microprocessor sales to other chips. Other observers, however, praised Intel's protection of its intellectual property and, by extension, its profits. The Federal Trade Commission conducted a two-year investigation of Intel's practices and did not recommend criminal charges against the company, but two rival companies--Advanced Micro Devices Inc. and Cyrix Corp.--filed antitrust lawsuits against Intel in 1993.

The History of Intel


Intel began in 1968. It was founded by Gordon E. Moore, a physicist and chemist, together with Robert Noyce, a fellow physicist and co-inventor of the integrated circuit, after both had left Fairchild Semiconductor. During the 1980s Intel was run by Andy Grove, a chemical engineer and the third member of the original Intel family. Many other Fairchild employees went on to other Silicon Valley companies. Today Andy Grove is considered one of the company's essential business and strategic leaders. As the 1990s concluded, Intel had become one of the largest and most successful businesses in the world.

Intel has gone through many phases. In the beginning, Intel was set apart primarily by its ability to create memory chips such as SRAM. When the firm was founded, Gordon Moore and Robert Noyce considered naming their company Moore Noyce. Spoken aloud, however, the name sounds like 'more noise,' so the idea was quickly abandoned in favor of a more suitable name, one not associated with interference. The name NM Electronics was chosen shortly thereafter and used for nearly a year, until the company changed its name to Integrated Electronics, or Intel for short. The rights to the name, however, had to be purchased, as it was already in use by a fairly well-known hotel chain.

Though Intel had produced the first microprocessor, the Intel 4004, in 1971, as well as one of the world's first microcomputers in 1972, into the early 1980s its focus remained primarily on random-access memory chips.

In the early 1970s a new client from Japan wanted to enlist the services of Intel to design twelve chips for its calculators. Knowing that Intel did not have the manpower or the resources to complete the job effectively, Ted Hoff agreed to it just the same. His idea was: what if one chip could be designed to do the work of twelve? Hoff's idea was completely embraced by Moore and Noyce. If the project succeeded, the chip would be able to receive command functions, and this is where the 4004 came from. After a painstaking nine months, the chip was finished: it measured one-eighth of an inch by one-sixth of an inch and contained 2,300 transistors. History was made and changed that day. (For comparison, the later Pentium Pro processor contained 5.5 million transistors, and microprocessors eventually became affordable enough to be embedded in common household appliances.) After this success, Intel decided to embrace the microprocessor fully and to pursue its production.

Some notable dates in the history of Intel are:

1968: Robert Noyce and Gordon Moore incorporate NM Electronics.
1970: The first dynamic RAM (DRAM) chip is developed.
1971: The world's first microprocessor is introduced.
1974: The first general-purpose microprocessor is introduced to the world.
1980: An Intel microprocessor is chosen by IBM for its first personal computer.
1992: Intel's net income tops one billion dollars.
1993: The Pentium, a fifth-generation chip, is introduced.
1996: Intel's revenue exceeds twenty billion dollars and net income surpasses five billion dollars.
1997: The Pentium II microprocessor is introduced to the world.
1999: Intel is added to the Dow Jones Industrial Average.
2000: Intel's first 1-gigahertz processor hits the shelves.

To this day Intel continues to make strides in the computing and microcomputing world.
