
TECHNOLOGICAL UNIVERSITY OF PANAMA, PANAMA WEST REGIONAL CENTER, FACULTY OF COMPUTER SYSTEMS ENGINEERING, YEAR I, SEMESTER II, 2011

COMPUTER CONCEPTS

Presented by: Julio Cesar Ruiz

Cédula (ID number): 8-860-695

Belonging to the group: 9IL111

Aimed at: Prof. Teodoro Castillo

Introduction

In this work you will find definitions of technological concepts used throughout the development of computing, together with explanations of what each of them was or is used for and of its importance. I hope the work is to your liking and meets your expectations, as it did mine. Without further ado, I leave you with it.

VCR

The videocassette recorder (or VCR, also known as the video recorder) is a type of electromechanical device that uses removable videocassettes containing magnetic tape to record analog audio and analog video from broadcast television so that the images and sound can be played back at a more convenient time. This facility afforded by a VCR machine is commonly referred to as time shifting of television programs. Most domestic VCRs are equipped with a television broadcast receiver (tuner) for TV reception, and a programmable clock (timer) for unattended recording of a certain television channel at a particular time. These features began as simple mechanical counter-based single-event timers, but were later replaced by multiple-event digital clock timers that afforded greater flexibility to the user. In later models the multiple timer events could be programmed through a menu interface displayed on the playback TV screen. This allowed multiple programs to be recorded easily, and this particular feature of the video recorder quickly became a major selling point and benefit to people working unsociable hours who usually missed many television broadcasts.

A VCR operates in a different way from a video tape recorder (VTR). VTRs originated as individual tape reels, serving as a replacement for motion picture film stock and making recording for television applications cheaper and quicker. VTRs are reel-to-reel devices that require hand-threading of the tape from a single supply reel, through the recording mechanism and onto a separate take-up reel. VCRs tend to be lower maintenance than reel-to-reel VTRs, since the tape path is usually fully enclosed to keep dust out of the mechanism, and the tape is almost never touched by the user except when malfunctions occur.

USB

USB (Universal Serial Bus) is an industry standard developed in the mid-1990s that defines the cables, connectors and protocols used for connection, communication and power supply between computers and electronic devices. USB was designed to standardise the connection of computer peripherals, such as keyboards, pointing devices, digital cameras, printers, portable media players, disk drives and network adapters, to personal computers, both to communicate and to supply electric power. It has become commonplace on other devices, such as smartphones, PDAs and video game consoles. USB has effectively replaced a variety of earlier interfaces, such as serial and parallel ports, as well as separate power chargers for portable devices. As of 2008, about 2 billion USB devices were sold each year, and approximately 6 billion devices had been sold in total. At first we had the serial and parallel interfaces, but it became necessary to unify all the connectors by creating a simpler, higher-performance one. Thus was born USB (Universal Serial Bus), with a speed of 12 Mb/sec, and its evolution, USB 2.0, dubbed USB high-speed, with speeds of up to 480 Mb/sec, i.e., 40 times faster than a USB 1.1 cable connection. USB is a new bus architecture, or a new type of bus, developed by a group of seven companies (Compaq, Digital Equipment Corp., IBM PC Co., Intel, Microsoft, NEC and Northern Telecom). It is part of the advances in plug-and-play and allows you to install peripherals without having to open your machine to install hardware; that is, you simply connect your peripherals to the back of your computer and go.

CPU

The central processing unit (CPU) is the portion of a computer system that carries out the instructions of a computer program, to perform the basic arithmetical, logical, and input/output operations of the system. The CPU plays a role somewhat analogous to the brain in the computer. The term has been in use in the computer industry at least since the
early 1960s. The form, design and implementation of CPUs have changed dramatically since the earliest examples, but their fundamental operation remains much the same. On large machines, CPUs require one or more printed circuit boards. On personal computers and small workstations, the CPU is housed in a single chip called a microprocessor. Since the 1970s the microprocessor class of CPUs has almost completely overtaken all other CPU implementations. Modern CPUs are large scale integrated circuits in small, rectangular packages, with multiple connecting pins. Two typical components of a CPU are the arithmetic logic unit (ALU), which performs arithmetic and logical operations, and the control unit (CU), which extracts instructions from memory and decodes and executes them, calling on the ALU when necessary. Not all computational systems rely on a central processing unit. An array processor or vector processor has multiple parallel computing elements, with no one unit considered the "center". In the distributed computing model, problems are solved by a distributed interconnected set of processors.
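The fetch-decode-execute cycle carried out by the control unit and the ALU can be sketched in a few lines of Python. The three-instruction set (LOAD, ADD, HALT), the encoding as tuples, and the single accumulator register are all invented here purely for illustration; real CPUs decode binary opcodes and have many more registers.

```python
# Minimal sketch of a fetch-decode-execute cycle.
# The instruction set and accumulator design are invented for illustration.

def run(program):
    acc = 0                        # accumulator register
    pc = 0                         # program counter
    while True:
        op, arg = program[pc]      # fetch: the control unit reads the next instruction
        pc += 1
        if op == "LOAD":           # decode + execute
            acc = arg
        elif op == "ADD":
            acc = acc + arg        # the ALU would perform this addition
        elif op == "HALT":
            return acc

result = run([("LOAD", 2), ("ADD", 3), ("HALT", None)])
print(result)  # 5
```

The control unit corresponds to the `while` loop and the `if` chain (extracting and decoding instructions), while the additions stand in for the ALU.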

RAM

Random-access memory (RAM) is a form of computer data storage. Today, it takes the form of integrated circuits that allow stored data to be accessed in any order with a worst case performance of constant time. Strictly speaking, modern types of DRAM are therefore not random access, as data is read in bursts, although the name DRAM / RAM has stuck. However, many types of SRAM, ROM, OTP, and NOR flash are still random access even in a strict sense. RAM is often associated with volatile types of memory (such as DRAM memory modules), where its stored information is lost if the power is removed. Many other types of non-volatile memory are RAM as well, including most types of ROM and a type of flash memory called NOR-Flash. The first RAM modules to come into the market were created in 1951 and were sold until the late 1960s and early 1970s. Other memory devices (magnetic tapes, disks) can access the storage data only in a predetermined order, because of mechanical design limitations.

An early type of widespread writable random-access memory was magnetic core memory, developed from 1955 to 1975, and subsequently used in most computers up until the development and adoption of the static and dynamic integrated RAM circuits in the late 1960s and early 1970s. Before this, computers used relays, delay line memory, or various kinds of vacuum tube arrangements to implement "main" memory functions (i.e., hundreds or thousands of bits), some of which were random access, some not. Drum memory could be expanded at low cost, but retrieval of non-sequential memory items required knowledge of the physical layout of the drum to optimize speed. Latches built out of vacuum tube triodes, and later out of discrete transistors, were used for smaller and faster memories such as random-access register banks and registers. Prior to the development of integrated ROM circuits, permanent (or read-only) random-access memory was often constructed using semiconductor diode matrices driven by address decoders, or specially wound core memory planes.

CD-ROM

CD-ROM stands for Compact Disc Read-Only Memory. It is a means of mass data storage that uses an optical laser to read microscopic pits stamped into the surface of an aluminum disc coated with polycarbonate. It was designed for use as a storage device to replace the audio cassette tape, but later gained great popularity on personal computers, so much so that on a computer without a CD-ROM drive most of the commercially available software simply could not be installed. It is based on the same system used by audio CDs. With its high storage capacity, safety and low cost, the CD-ROM became an increasingly popular storage medium. The first standard CD format was used to produce audio CDs that would play on all CD players: digital audio CD (CD-Digital Audio, CD-DA). The storage capacity of a CD is around 650 megabytes, the equivalent of more than 450 high-density 3.5" diskettes, or something like 250,000 pages of typed text. To store data on a CD, the standard CD-DA format had to be modified. In 1984, Philips and Sony issued the Yellow Book standard, which defined the CD-ROM for storing computer data. The Yellow Book defined two new classes of content sectors: Mode 1 sectors and Mode 2 sectors. Mode 1 sectors store computer data, and Mode 2 sectors are used to store compressed audio, video and graphical data. This new standard recognized that a CD-ROM needs to store data more accurately than audio CDs. Because the Yellow Book was severely restrictive for producers, Philips, Sony and Microsoft Corporation joined together to develop CD-ROM XA. CD-ROM XA formats can mix Mode 1 and Mode 2 sectors for storing computer data, compressed audio content, graphics and video. This format is interleaved, meaning that it combines different types of content on the same CD, allowing music, data, graphics and programs to share a single disc.
By 1986, due to the growing demand for multimedia, the interactive CD (CD-i) format was created, which contains text, graphics, audio and video on a single disc. Special hardware was used to connect CD-i players to television screens for operation.
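The capacity figure given above can be checked with simple arithmetic. The sketch assumes a nominal 74-minute disc read at the audio rate of 75 sectors per second, with 2048 bytes of user data per Mode 1 sector:

```python
# Rough capacity check for a nominal 74-minute CD-ROM (Mode 1 data sectors).
sectors = 74 * 60 * 75             # 75 sectors pass the laser per second
bytes_per_sector = 2048            # user data in a Mode 1 sector
capacity = sectors * bytes_per_sector
print(capacity)                    # 681984000 bytes
print(capacity / 2**20)           # roughly 650 MiB
print(capacity / (1.44 * 10**6))  # roughly 473 high-density 3.5" diskettes
```

This is why the text speaks of "around 650 megabytes": 333,000 sectors times 2048 bytes is just under 682 million bytes.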

PC

A personal computer (PC) is any general-purpose computer whose size, capabilities, and original sales price make it useful for individuals, and which is intended to be operated directly by an end user with no intervening computer operator. In contrast, batch-processing and time-sharing models allowed large, expensive mainframe systems to be used by many people, usually at the same time. Large data processing systems require a full-time staff to operate efficiently. Software applications for personal computers include, but are not limited to, word processing, spreadsheets, databases, Web browsers and e-mail clients, digital media playback, games, and myriad personal productivity and special-purpose software applications. Modern personal computers often have connections to the Internet, allowing access to the World Wide Web and a wide range of other resources. Personal computers may be connected to a local area network (LAN), either by a cable or a wireless connection. A personal computer may be a desktop computer or a laptop, tablet PC, or a handheld PC. While early PC owners usually had to write their own programs to do anything useful with the machines, today's users have access to a wide range of commercial software and free software, which is provided in ready-to-run or ready-to-compile form. Since the 1980s, Microsoft and Intel have dominated much of the personal computer market, first with MS-DOS and then with the Wintel platform. Alternatives to Windows include Apple's Mac OS X and the open-source Linux OSes. AMD is the major alternative to Intel. Applications and games for PCs are typically developed and distributed independently from the hardware or OS manufacturers, whereas software for many mobile phones and other portable systems is approved and distributed through a centralized online store.

VDT

A video display terminal (VDT) is a computer monitor: an input/output device with a display screen and an input keyboard.

APL
APL (named after the book A Programming Language) is an interactive array-oriented language and integrated development environment which is available from a number of commercial and non-commercial vendors and for most computer platforms. It is based on a mathematical notation developed by Kenneth E. Iverson and associates which features special attributes for the design and specifications of digital computing systems, both hardware and software. APL has a combination of unique and relatively uncommon features that appeal to programmers and make it a productive programming language:

It is concise, using symbols rather than words and applying functions to entire arrays without using explicit loops. It is solution focused, emphasizing the expression of algorithms independently of machine architecture or operating system. It has just one simple, consistent, and recursive precedence rule: the right argument of a function is the result of the entire expression to its right. It facilitates problem solving at a high level of abstraction.

APL is used in scientific, actuarial, statistical, and financial applications where it is used by practitioners for their own work and by programmers to develop commercial applications. It was an important influence on the development of spreadsheets, functional programming, and computer math packages. It has also inspired several other programming languages. It is also associated with rapid and lightweight development projects in volatile business environments.
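APL's right-to-left precedence rule can be contrasted with conventional operator precedence using Python as a stand-in (the APL expressions themselves appear only in the comments, since APL uses its own symbol set):

```python
# In APL, 2×3+4 is read right-to-left: the right argument of × is the
# result of the entire expression to its right, i.e. 2×(3+4) = 14.
# Conventional precedence in most languages gives (2×3)+4 = 10 instead.
apl_style = 2 * (3 + 4)         # how APL reads the expression
conventional = 2 * 3 + 4        # how most languages read it
print(apl_style, conventional)  # 14 10

# APL's array orientation: a function applies to a whole array at once,
# e.g. 2×1 2 3 yields 2 4 6 with no explicit loop. Sketched with a list:
doubled = [2 * x for x in [1, 2, 3]]
print(doubled)  # [2, 4, 6]
```

In APL itself even the loop in the last line disappears; the comprehension is only a Python approximation of applying a function to an entire array.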

CUA
Common User Access (CUA) is a standard for user interfaces to operating systems and computer programs. It was developed by IBM and first published in 1987 as part of their Systems Application Architecture. Used originally in the OS/MVS, VM/CMS, OS/400, OS/2 and Microsoft Windows operating systems, parts of the CUA standard are now implemented in programs for other operating systems, including variants of Unix. It is also used by Java AWT and Swing.

CBT
Computer-Based Trainings (CBTs) are self-paced learning activities accessible via a computer or handheld device. CBTs typically present content in a linear fashion, much like reading an online book or manual. For this reason they are often used to teach static processes, such as using software or completing mathematical equations. The term Computer-Based Training is often used interchangeably with Web-based training (WBT), with the primary difference being the delivery method: where CBTs are typically delivered via CD-ROM, WBTs are delivered via the Internet using a web browser. Assessing learning in a CBT usually comes in the form of multiple-choice questions, or other assessments that can be easily scored by a computer, such as drag-and-drop, radio button, simulation or other interactive means. Assessments are easily scored and recorded via online software, providing immediate end-user feedback and completion status. Users are often able to print completion records in the form of certificates.

CAD
Computer-aided design (CAD), also known as computer-aided design and drafting (CADD), is the use of computer technology for the process of design and design documentation. Computer-aided drafting describes the process of drafting with a computer. CADD software, or environments, provides the user with input tools for the purpose of streamlining design, drafting, documentation, and manufacturing processes. CADD output is often in the form of electronic files for print or machining operations. The development of CADD-based software is in direct correlation with the processes it seeks to economize; industry-based software (construction, manufacturing, etc.) typically uses vector-based (linear) environments, whereas graphic-based software utilizes raster-based (pixelated) environments. CADD environments often involve more than just shapes. As in the manual drafting of technical and engineering drawings, the output of CAD must convey information, such as materials, processes, dimensions, and tolerances, according to application-specific conventions.

ASCII

The American Standard Code for Information Interchange (ASCII) is a character-encoding scheme based on the ordering of the English alphabet. ASCII codes represent text in computers, communications equipment, and other devices that use text. Most modern character-encoding schemes are based on ASCII, though they support many more characters than ASCII does. US-ASCII is the Internet Assigned Numbers Authority (IANA) preferred charset name for ASCII. Historically, ASCII developed from telegraphic codes. Its first commercial use was as a seven-bit teleprinter code promoted by Bell data services. Work on ASCII formally began on October 6, 1960, with the first meeting of the American Standards Association's (ASA) X3.2 subcommittee. The first edition of the standard was published during 1963, a major revision during 1967, and the most recent update during 1986. Compared to earlier telegraph codes, the proposed Bell code and ASCII were both ordered for more convenient sorting (i.e., alphabetization) of lists, and added features for devices other than teleprinters.
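The alphabetical ordering described above is easy to observe in Python, whose `ord` and `chr` built-ins convert between characters and their code points (which coincide with ASCII codes for values below 128):

```python
# ASCII assigns each character a 7-bit numeric code (0-127), ordered so
# that letters sort alphabetically.
print(ord("A"), ord("Z"), ord("a"))    # 65 90 97
print(chr(65))                         # A
assert ord("A") < ord("B") < ord("Z")  # the ordering makes sorting easy
# Every code in a pure-ASCII string fits in 7 bits:
assert all(ord(c) < 128 for c in "ASCII text")
```

Because letters occupy consecutive codes, sorting strings by code value gives alphabetical order, which is exactly the "convenient sorting" property the standard was designed for.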

UPS
An uninterruptible power supply (UPS) is a device that, thanks to its battery, can provide power to all attached devices after a blackout. Another function of a UPS is to improve the quality of the electrical energy that reaches the load, filtering voltage surges and sags and eliminating voltage harmonics from the mains when alternating current is used. UPSs provide power to equipment called critical loads, such as medical devices and industrial computers which, as mentioned above, must always receive power of good quality, because they need to be operational at all times and without failures (spikes or drops in voltage).

Gb
A gigabit is a unit of computer storage, usually abbreviated Gbit or sometimes Gb, which is equivalent to 10^9 bits. The gigabit is closely related to the gibibit, which in binary is equal to 2^30 bits. Note, however, that a gibibit (1,073,741,824 bits) is greater than a gigabit (1,000,000,000 bits) by over 7%.

GB
The gigabyte is a multiple of the unit byte for digital information storage. The prefix giga means 10^9 in the International System of Units (SI); therefore 1 gigabyte is 1,000,000,000 bytes. The unit symbol for the gigabyte is GB or Gbyte, but not Gb (lowercase b), which is typically used for the gigabit. Historically, the term has also been used in some fields of computer science and information technology to denote the gibibyte, or 1,073,741,824 (1024^3 or 2^30) bytes.
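The roughly 7% gap between the decimal (SI) and binary prefixes mentioned in the two entries above is easy to verify with a short calculation:

```python
gigabit = 10**9   # SI prefix: 1 Gbit = 1,000,000,000 bits
gibibit = 2**30   # binary prefix: 1 Gibit = 1,073,741,824 bits
print(gibibit - gigabit)              # 73741824 bits
print((gibibit - gigabit) / gigabit)  # about 0.0737, i.e. just over 7%
# The same ratio holds for the gigabyte (10**9 bytes) vs the
# gibibyte (2**30 bytes), since only the unit changes, not the prefixes.
```

This discrepancy is why a "1 GB" drive as labeled by a manufacturer (decimal) appears smaller when reported by software that counts in binary units.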

I/O
Input/output, or I/O, refers to the communication between an information processing system (such as a computer) and the outside world, possibly a human or another information processing system. Inputs are the signals or data received by the system, and outputs are the signals or data sent from it. The term can also be used as part of an action; to "perform I/O" is to perform an input or output operation. I/O devices are used by a person (or other system) to communicate with a computer. For instance, a keyboard or a mouse may be an input device for a computer, while monitors and printers are considered output devices for a computer. Devices for communication between computers, such as modems and network cards, typically serve for both input and output.
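The input and output directions can be sketched in Python using an in-memory stream as a stand-in for a real device; writing corresponds to an output operation and reading back corresponds to an input operation:

```python
import io

# An in-memory text stream stands in for an I/O device here.
buf = io.StringIO()
buf.write("hello\n")    # output: the program sends data to the "device"
buf.seek(0)             # rewind to the start of the stream
line = buf.readline()   # input: the program receives data back
print(line.strip())     # hello
```

With a real device the same pattern appears, only the stream object changes (a file, a socket, a serial port), which is why I/O APIs tend to share the read/write vocabulary.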

Kb
A kilobit is a unit of information (abbreviated kb or kbit). In practice, the kilobit is used to measure traffic on a digital channel, expressed as kilobits per second (kbps); this unit represents the number of bits that are transferred from one point to another in one second. Be careful not to confuse the transmission rate in bits with the unit of capacity in bytes, which is 8 times larger: 1 byte = 8 bits.
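The bits-versus-bytes distinction matters whenever transfer times are estimated. The file size and link speed in this worked example are arbitrary assumptions:

```python
# Downloading a 1 MB file over a 512 kbps link.
file_bytes = 1_000_000
file_bits = file_bytes * 8    # 1 byte = 8 bits, so 8,000,000 bits
rate_bps = 512 * 1000         # 512 kbps expressed in bits per second
seconds = file_bits / rate_bps
print(seconds)                # 15.625
```

Forgetting the factor of 8 (treating 512 kbps as 512 kB/s) would underestimate the download time by a factor of eight.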

KB
A kilobyte is a storage unit whose symbol is kB and which is equivalent to 10^3 bytes. On the other hand, like the rest of the SI prefixes applied to computing, it is often confused with 2^10 = 1,024 bytes, which according to standard IEC 60027-2 should be called a kibibyte. The prefix kilo comes from the Greek, meaning thousand.

BIT
A bit (a contraction of binary digit) is the basic unit of information in computing and telecommunications; it is the amount of information stored by a digital device or other physical system that exists in one of two possible distinct states. These may be the two stable states of a flip-flop, two positions of an electrical switch, two distinct voltage or current levels allowed by a circuit, two distinct levels of light intensity, two directions of magnetization or polarization, etc. In computing, a bit can also be defined as a variable or computed quantity that can have only two possible values. These two values are often interpreted as binary digits and are usually denoted by the Arabic numerical digits 0 and 1. The two values can also be interpreted as logical values (true/false, yes/no), algebraic signs (+/−), activation states (on/off), or any other two-valued attribute. The correspondence between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program. The length of a binary number may be referred to as its "bit-length."
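The two-valued nature of bits and the idea of bit-length can be illustrated directly with Python's binary literals and bitwise operators:

```python
value = 0b1011            # a 4-bit binary number, 11 in decimal
print(value)              # 11
print(value & 0b0010)     # isolate one bit with AND: 2 (that bit is set)
print(value ^ 0b1111)     # flip every bit with XOR: 4 (0b0100)
print((77).bit_length())  # minimum bits needed to represent 77: 7
```

Each position in `0b1011` is one bit; the AND and XOR operations show how single bits are tested and toggled, which is how the two-valued states described above are manipulated in software.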

BPI

Baseline Privacy Interface (BPI) provides a data-encryption scheme that protects data sent to and from cable modems in a data-over-cable network. BPI can also be used to authenticate cable modems and to authorize the transmission of multicast traffic (communication between a single sender and multiple receivers on a network). Multicast can also be defined as two-way communication between multiple sites, or as transmitting a single message to a select group of recipients (e.g., sending an email to a mailing list). BPI gives subscribers data privacy across the RF network between the CMTS (Cable Modem Termination System) and the CM (Cable Modem). BPI uses DOCSIS 1.0 (Data Over Cable Service Interface Specifications). DOCSIS 1.0 was ratified by the ITU (International Telecommunication Union) in March 1998. It is a standard interface for cable modems.

DBMS

A database management system (DBMS) is a software package with computer programs that control the creation, maintenance, and use of a database. It allows organizations to conveniently develop databases for various applications, administered by database administrators (DBAs) and other specialists. A database is an integrated collection of data records, files, and other database objects. A DBMS allows different user application programs to concurrently access the same database. DBMSs may use a variety of database models, such as the relational model or the object model, to conveniently describe and support applications. A DBMS typically supports query languages, which are in fact high-level programming languages: dedicated database languages that considerably simplify writing database application programs. Database languages also simplify the database organization as well as retrieving and presenting information from it. A DBMS provides facilities for controlling data access, enforcing data integrity, managing concurrency control, recovering the database after failures and restoring it from backup files, as well as maintaining database security.
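These ideas can be sketched with Python's built-in sqlite3 module, which embeds a small relational DBMS; the table and rows below are invented for illustration:

```python
import sqlite3

# A DBMS mediates all access to the data: the query language (SQL here)
# describes WHAT to retrieve, and the engine handles storage, integrity
# and concurrency.
con = sqlite3.connect(":memory:")  # throwaway in-memory database
con.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT)")
con.execute("INSERT INTO employee (name) VALUES (?)", ("Ada",))
con.execute("INSERT INTO employee (name) VALUES (?)", ("Grace",))
rows = con.execute("SELECT name FROM employee ORDER BY name").fetchall()
print(rows)  # [('Ada',), ('Grace',)]
con.close()
```

Note that the application never touches the storage format directly: the `SELECT` statement is the high-level database language the entry describes, and the DBMS decides how to execute it.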

CRT
The cathode ray tube (CRT) is a vacuum tube containing an electron gun (a source of electrons) and a fluorescent screen, with internal or external means to accelerate and deflect the electron beam, used to create images in the form of light emitted from the fluorescent screen. The image may represent electrical waveforms (oscilloscope), pictures (television, computer monitor), radar targets and others. The CRT uses an evacuated glass envelope which is large, deep, heavy, and relatively fragile.

Mb
The megabit (Mbit or Mb) is a unit of information widely used in data transmission in telematics. The megabit, equivalent to 10^6 (1,000,000) bits, is often confused with the mebibit, which is equivalent to 2^20 (1,048,576) bits. The difference is that the former uses one of the prefixes of the International System while the latter uses one of the binary prefixes.

MB
The megabyte (MB) is a unit of measure of the amount of computer data. It is a multiple of the byte, or octet, equivalent to 10^6 bytes. It is represented by MB, not Mb, which would correspond to the megabit. Colloquially, megabytes are called "megs".

DASD
In mainframe computers and some minicomputers, a direct access storage device, or DASD, is any secondary storage device which has relatively low access time relative to its capacity. Historically, IBM introduced the term to cover three different device types: disk drives, magnetic drums, and data cells. The direct access capability of those devices, occasionally and incorrectly called random access (although that term survives when referring to memory or RAM), stood in contrast to the sequential access used in tape drives. The latter required a proportionally long time to access a distant point in a medium.

WORM
WORM is an acronym for Write Once, Read Many; that is, media that are written once and read many times. This designation is given to storage media (usually removable) that have this property: data already written cannot be erased or overwritten later. The importance of WORM media lies in guaranteeing the integrity and preservation of the information stored on them. They are used in electronic document-management infrastructure, where they can store documents of value or documents that must be retained under tax regulations or other legal requirements.

EGA

EGA is the acronym for Enhanced Graphics Adapter, the IBM PC standard specification for graphics display, positioned between CGA and VGA in terms of graphics performance (i.e., range of colors and resolution). Introduced in 1984 by IBM for its new IBM Personal Computer/AT, EGA had a color depth of 16 colors and a resolution of 640×350 pixels. The EGA card had 16 kilobytes of ROM to extend the BIOS with additional functions and included a Motorola 6845 video address generator. In the high-resolution 640×350 mode, each of the 16 on-screen colors could be assigned an RGB color from a palette of 64 (two bits per pixel each for red, green and blue). EGA also included full-function 16-color versions of the CGA 640×200 and 320×200 graphics modes; only the 16 CGA/RGBI colors were available in these modes. The original CGA modes were present, but EGA was not 100% compatible with CGA. An EGA card could also drive an MDA monitor by adjusting jumpers on the board, at 640×350 only.

RISC

Reduced instruction set computing, or RISC, is a CPU design strategy based on the insight that simplified (as opposed to complex) instructions can provide higher performance if this simplicity enables much faster execution of each instruction. A computer based on this strategy is a reduced instruction set computer (also RISC). There are many proposals for precise definitions,[1] but the term is slowly being replaced by the more descriptive load-store architecture. Well-known RISC families include DEC Alpha, AMD 29k, ARC, ARM, Atmel AVR, Blackfin, MIPS, PA-RISC, Power (including PowerPC), SuperH, and SPARC.

EPROM

An EPROM (rarely EROM), or erasable programmable read-only memory, is a type of memory chip that retains its data when its power supply is switched off. In other words, it is non-volatile. It is an array of floating-gate transistors individually programmed by an electronic device that supplies higher voltages than those normally used in digital circuits. Once programmed, an EPROM can be erased by exposing it to strong ultraviolet light from a mercury-vapor light source. EPROMs are easily recognizable by the transparent fused quartz window in the top of the package, through which the silicon chip is visible, and which permits exposure to UV light during erasing.

MODEM

A modem (modulator-demodulator) is a device that modulates an analog carrier signal to encode digital information, and also demodulates such a carrier signal to decode the transmitted information. The goal is to produce a signal that can be transmitted easily and decoded to reproduce the original digital data. Modems can be used over any means of transmitting analog signals, from light-emitting diodes to radio. The most familiar example is a voice-band modem that turns the digital data of a personal computer into modulated electrical signals in the voice frequency range of a telephone channel. These signals can be transmitted over telephone lines and demodulated by another modem at the receiver side to recover the digital data.
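Modulation can be sketched as mapping bits onto tones. The sketch below uses frequency-shift keying with the Bell 103 originate-side tone pair purely for illustration; a real modem would also keep the phase continuous between bits and implement the demodulation side:

```python
import math

# Frequency-shift keying: each bit selects one of two audio tones.
MARK, SPACE = 1270.0, 1070.0  # Hz (Bell 103 originate-side pair)
RATE = 8000                   # samples per second
BAUD = 300                    # bits per second

def modulate(bits):
    """Turn a list of 0/1 bits into a list of audio samples."""
    samples = []
    per_bit = RATE // BAUD    # 26 samples carry each bit
    for bit in bits:
        f = MARK if bit else SPACE
        for n in range(per_bit):
            samples.append(math.sin(2 * math.pi * f * n / RATE))
    return samples

signal = modulate([1, 0, 1, 1])
print(len(signal))  # 4 bits * 26 samples each = 104 samples
```

The receiving modem would perform the inverse operation: detect which of the two frequencies dominates each 26-sample window and output the corresponding bit, which is the demodulation half of "modulator-demodulator."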

MS-DOS

MS-DOS (em-es-doss; short for Microsoft Disk Operating System) is an operating system for x86-based personal computers. It was the most commonly used member of the DOS family of operating systems, and was the main operating system for IBM PC compatible personal computers from the 1980s to the mid-1990s, until it was gradually superseded by operating systems offering a graphical user interface (GUI), in particular by various generations of the Microsoft Windows operating system. During its life, several competing products were released for the x86 platform,[4] and MS-DOS itself would go through eight versions, until development ceased in 2000. Ultimately it was the key product in Microsoft's growth from a programming-languages company to a diverse software development firm, providing the company with essential revenue and marketing resources. It was also the underlying basic operating system on which early versions of Windows ran as a GUI.

Conclusion

1. First of all, I have learned about some devices that are little used now, having been replaced today by others that serve the same purpose but offer greater ease of use and performance.

2. I have also gained knowledge about some concepts of which I previously knew little.
