
PREFACE

"A scientific truth does not triumph by convincing its opponents and making them see
the light, but rather because its opponents eventually die and a new generation grows
up that is familiar with it." - Max Planck

Let's have a talk...


No one had heard of an operating system (OS) until the late 1940s; back then, programs
had to be entered bit by bit. History credits General Motors Research Laboratories with
implementing the first operating system in the early 1950s for its IBM 701, alongside the
introduction of punch cards. By the early 1960s, commercial computer vendors were
supplying quite extensive tools for streamlining the development, scheduling, and
execution of jobs on batch-processing systems. Examples were produced by UNIVAC
and Control Data Corporation, amongst others.

The operating systems originally deployed on mainframes, and, much later, the original
microcomputer operating systems, only supported one program at a time, requiring
only a very basic scheduler. Each program was in complete control of the machine while
it was running. Multitasking (timesharing) first came to mainframes in the 1960s.

The growth of multitasking had its own reasons (as you will see inside the manual). One
reason was the design of real-time computing systems (RTCS), where a number of
possibly unrelated external activities need to be controlled by a single processor system.
A real-time operating system (RTOS) such as VxWorks, eCos, QNX, or RTLinux is a
multitasking operating system intended for applications with fixed deadlines (real-time
computing). Such applications include some small embedded systems (PDAs, cellphones,
point-of-sale devices, VCRs, industrial robot controllers, or even your toaster),
automobile engine controllers, industrial robots, spacecraft, industrial control, and some
large-scale computing systems. Some embedded systems use operating systems such as
Palm OS, Windows CE, BSD, and Linux, although these operating systems do not
support real-time computing.

From the PDP of 1964 to DragonFly BSD of 2008, the categories of OS keep evolving,
and so do their complexities (at least for students like us). Operating systems are
categorized by technology, ownership, licensing, working state, usage, and many other
characteristics. In practice, many of these groupings overlap; however, classification by
usage, such as disk OS, networking OS, web-based, embedded, etc., seems most
meaningful. (We, however, focus only on desktop OSes, whose primary goal is twofold:
convenience for the user and efficient operation of the computer system.)
Together with OS categories, the computing environments and models in which an
operating system is used are also evolving. Single/sequential, distributed, associative,
and networked are common words now.
The 1960s definition of an operating system was "the software that controls the
hardware". Today, however, because of microcode, we need a better definition. We see
an operating system as the program that makes the hardware usable, efficient, and even
interactive. Structurally, an operating system can be designed as a monolithic system, a
hierarchy of layers, a virtual machine system, an exokernel, or using the client-server
model. Whatever the structure, the basic concepts of operating systems, that is,
processes, memory management, I/O management, file systems, and security, must still
be taken care of first. The last word, 'security', deserves a few more lines.

The security of systems, as a whole, is a wide issue. Roughly speaking, computer
security can be viewed from two perspectives: one, internal security, and the other,
external security, which in today's world is often used as a synonym for data security.
Internal security loosely means preventing unwanted interference by, or access of, one
program with another, the central theme of a multiprogramming environment.
Since almost all computers today are multiprogrammed, internal security is
indispensable. If programs can directly access hardware and resources, they cannot be
secured. Various hardware protection strategies have been developed and are
successfully used.
Windows was heavily criticized for many years for its inability to protect one running
program from another. To remedy this, Microsoft has added limited user accounts and
more secure logins in recent years; however, most people still operate their computers
using Administrator accounts, which negates any internal security improvements
brought about by these changes. It is only recently, with the release of Vista, that even
Administrator accounts have certain restrictions. Regardless, these measures can be,
and are, circumvented by users. (Let's see whether Microsoft's next version, code-named
Windows 7 and scheduled for release in late 2009 to mid-2010, can have tighter
measures.) Linux and UNIX, on the other hand, both have two-tier security, which
limits any system-wide changes to the root user.
Most computers today are also on one network or another and run many processes
besides the usual system processes. In such cases, the host OS has to support a variety of
networking protocols, application software, drivers, and other programs that are
indispensable to a networked computer. This further emphasizes the need for a modern
OS to support hardware protection and internal security at the kernel level.
External security, on the other hand, deals with preventing unauthorized access, mostly
on networked systems, but not always. Even when a computer is not connected to any
network, issues of external security may arise; for example, you won't want your system
to be searched by others in your absence. This, however, is not a big issue: simply
tracing a log file on Windows can tell you who accessed your system while you were
away.
On a network, though, the problem grows enormously. With the advent of the internet
and, later, the web, the complete definition of an OS seems to be growing, with both
data security and hardware protection added as important aspects that an OS must
provide. Securing standalone desktop resources is not the issue now; the growth of
multi-transactional and distributed applications and e-commerce has further fueled the
need for more secure systems.
Although modern cryptographic and digital signature techniques have gone a long way
towards data security, the quest to protect a system from any and all unauthorized
access and malicious programs is still active. At the operating system level itself, there
are a number of software firewalls available, as well as intrusion detection/prevention
systems.
Security is so much a focus in the IT industry and business that even the OS is chosen
for offering the best and most secure model. NetWare (Novell's networking OS, whose
later releases are based on SUSE Linux), for example, is meant almost symbolically for
networking environments because its default security model is relatively strong. One
hardly needs any third-party software while working on Linux platforms. In my B.Tech
career, I haven't yet heard of anyone using additional firewalls or antiviruses on Linux;
if you know of some, please let me know (although working heavily with email may
require additional spam guards turned on).
Besides security, support for scalability at the hardware level is another important issue
that the OS model (especially on servers and workstations) must deal with when
implementing big scalable applications whose computational (hardware) resources may
need to be upgraded more or less often.
Truly, modern digital computers have undergone a phenomenal change in terms of both
speed and accuracy in the past 15 years. Efficiency, performance, and durability are all
now necessities as we make computers do much more than just arithmetic calculations.
The growing IT industry's needs for large-scale data processing and information
retrieval systems, together with today's research and development tasks, require
computing speeds that a normal desktop resource can't provide. The list of classes of
problems that require faster processing is ever growing. Some of them include:
simulation and modeling problems requiring numerous approximations and
calculations; problems dependent on computations and manipulations of large amounts
of data, such as image and signal processing, entertainment (image rendering),
databases and data mining, and seismic analysis; and other challenging problems that
include space exploration, climate modeling, fluid turbulence, pollution dispersion, the
human genome, ocean circulation, quantum chromodynamics, semiconductor modeling,
superconductor modeling, combustion systems, and vision & cognition.
While achieving faster programs is still a mammoth task, most often the solution
involves parallelism. Parallelism is a strategy for performing large, complex tasks faster.
A large task can either be performed serially, one step following another, or be
decomposed into smaller tasks to be performed simultaneously, i.e., in parallel. Among
many approaches, two of the most common methods for achieving parallelism are
message passing, where the user makes calls to libraries to explicitly share information
between processors, and data parallelism, where data partitioning determines the
parallelism. Other paradigms of parallel programming, such as shared memory, remote
memory, threads, and combined models, also exist. The major field of exploitation is
supercomputing. Traditional supercomputers have major drawbacks compared with
parallel supercomputers: the large expense of single high-performance processors,
significant cooling requirements, and single-processor performance reaching its
asymptotic limit are some of them.
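The serial-versus-parallel decomposition described above can be sketched in a few lines of Python (a minimal illustration of my own, not taken from the manual; the function names are invented for this sketch):

```python
# A minimal sketch of the two styles described above: perform a large task
# serially, one step after another, or decompose it into smaller tasks
# executed simultaneously and combine the partial results.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # One of the smaller tasks: a sum of squares over one partition of the data.
    return sum(x * x for x in chunk)

def serial_sum(data):
    # Serial execution: the whole task performed as a single step.
    return partial_sum(data)

def parallel_sum(data, workers=4):
    # Data-parallel style: the partitioning of `data` determines the
    # parallelism; each worker handles its own slice independently,
    # and the partial results are combined at the end.
    chunks = [data[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    data = list(range(100_000))
    assert serial_sum(data) == parallel_sum(data)
```

Threads keep this sketch portable, but in CPython they won't speed up CPU-bound arithmetic; swapping in `ProcessPoolExecutor` gives true multi-processor parallelism at the cost of moving data between processes, which is essentially the message-passing trade-off mentioned above.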

With the exploration of advanced techniques like QTC and then physical computing (I
first heard about physical computing at the MIET ETCS national conference, when Dr.
O. P. Ojha delivered a note on the subject), researchers are trying hard to further
increase computing speeds and to find more secure and efficient OS models. But as
needs advance at the same rate as resources, whether the latter will ever step ahead is
still unclear. What's clear for sure is that to design any system, however powerful,
learning the basic concepts is a prime need. We can't develop an operating system with
infinite hardware usability but no hardware protection.

Needless to say by now, operating systems are an essential part of EVERY computer
system (the capitalization does matter), and so is the need to master their concepts for
EVERY CS student (the capitalization does matter again), so that if we ever attempt any
innovation, we are not held back by a lack of the fundamentals that it really begins with.

Hope you enjoy developing your concepts too!


AMIT NIGAM
