
ASSIGNMENT

SUBJECT: BROADBAND COMMUNICATION

SUBMITTED BY: NABEEL KHAN, MUHAMMAD ZARIN

SUBMITTED TO: DR. YOUSAF KHALIL

SEMESTER: 1ST

MSC ELECTRICAL COMMUNICATION

UNIVERSITY OF ENGINEERING AND TECHNOLOGY


Evolution of Telecom services in last three decades
The telecommunication industry has changed and is merging with other industries, such
as information technology. Technological change creates benefits for consumers, but
also opportunities for companies to reduce competition.
Some of the major evolutions are:
 Voice Telephony.
Telephony is the technology involving the development, application and
deployment of telecom services for the purpose of electronic transmission of
voice between distant parties. The history of telephony is intimately linked to the
invention and development of the telephone.
Telephony commonly refers to the construction or operation of telephones
and telephonic systems, and to a system of telecommunication in which telephonic
equipment is employed in the transmission of speech or other sound between
points, with or without the use of wires. The term is also used frequently to refer
to computer hardware, software and computer network systems that perform
functions traditionally performed by telephone equipment; in this context the
technology is specifically referred to as internet telephony, or voice over internet
protocol.
HISTORY:
The first telephones were connected directly in pairs: each user had a separate
telephone wired to each location to be reached. This became inconvenient and
unmanageable when users wanted to communicate with more than a few
people.
The invention of the telephone exchange provided the solution for establishing a
telephone connection with any other telephone in service in the local area. Each
telephone was connected to the exchange, at first with one wire, later with a wire
pair, the local loop. Nearby exchanges in other service areas were connected with
trunk lines, and long-distance service could be established by relaying the calls
through multiple exchanges.
Initially, exchange switchboards were manually operated by an attendant,
commonly referred to as the switchboard operator. When a customer cranked a
handle on the telephone, it activated an indicator at the exchange; the operator
would in response plug a headset into that jack and offer service. The caller had
to ask for the called party by name, later by number, and the operator connected
one end of the circuit into the called party's jack to alert them. If the called station
answered, the operator disconnected the headset and completed the station-to-station
circuit. Trunk calls were made with the assistance of operators at other exchanges
in the network.
Until the 1970s, most telephones were permanently wired to the telephone lines
installed at customer premises. Later, conversion to installation of jacks that
terminated the inside wiring permitted simple exchange of telephone sets with
telephone plugs and allowed portability of the set to multiple locations in the
premises where jacks were installed. Inside wiring to all jacks was connected in
one place to the wire drop, which connects the building to a cable. Cables usually
bring a large number of drop wires from all over a district access network to one
wire center or telephone exchange. When a telephone user wants to make a
telephone call, equipment at the exchange examines the dialed telephone number
and connects that telephone line to another in the same wire center, or to a trunk
to a distant exchange. Most of the exchanges in the world are interconnected
through a system of larger switching systems, forming the public switched
telephone network.
Today telephony uses digital technology in the provisioning of telephone services
and systems. Telephone calls are provided digitally, but may be restricted to cases
in which the last mile is digital or where the conversion between digital and
analogue signals takes place inside the telephone. This advancement has reduced
the cost of communication and improved the quality of voice services. The first
implementation of this was ISDN, which permitted fast end-to-end data transport
over telephone lines.
NEW ERA:
Starting with the introduction of the transistor and moving to computer-based
electronic switching systems, the PSTN has gradually evolved towards automation
and digitization of signaling and audio transmissions.
The field of technology available for telephony has broadened with the advent of
new communication technologies. Telephony now includes the technologies of
internet services and mobile communication, including video conferencing.

VOICE TELEPHONE ADVANCEMENT

New technologies based on Internet Protocol (IP) concepts are often referred to
separately as voice over IP telephony or internet telephony. Unlike traditional
phone services, IP telephony services are relatively unregulated by government.
In the US, the FCC regulates phone-to-phone connections, but says it does not
plan to regulate connections between a phone user and an IP telephony service
provider.
A specialization of digital telephony, internet protocol telephony involves the
application of the digital networking technology that was the foundation of the
internet to create, transmit, and receive telecommunication sessions over
computer networks.
Internet telephony is commonly known as voice over internet protocol (VoIP),
reflecting the underlying principle, but it has been referred to by many other
terms. VoIP has proven to be a disruptive technology that is rapidly replacing
traditional telephone infrastructure. As of January, up to 10% of telephone
subscribers in Japan and South Korea have switched to this digital telephone
service.

 DATA FAX.
The first fax machines were very primitive and lacked many of the capabilities
you see today. Still, as you'll see, the technology was impressive for the time period.
It started with Scottish inventor Alexander Bain. A few years removed from
inventing the electric clock, he was ready for his next endeavor. He used elements
from the clock to assist in his invention of the fax machine (specifically the
movement of two pendulums for line-by-line message scanning). He patented the
fax invention on May 27, 1843, and faxing finally came into existence.
However, that was just the beginning.

Evolution of fax

Several inventors would improve on Bain's machine over the next half century.
Frederick Bakewell and Giovanni Caselli made arguably the biggest improvements.
Once they were through with it (around 1865), people could send images over
telegraph lines.
This is even more impressive when you consider that a workable telephone
would not be invented for another 11 years!
Growing through competition: without steady and consistent competition within
the fax industry, fax may not be where it is today.
There always seemed to be different types of fax machines and different inventors
competing with each other throughout the decades.
There was Shelford Bidwell in the 1880s, who made it possible to scan two-dimensional
originals without manual plotting or drawing. Then Elisha Gray in 1888 made
it possible for users to send signatures over long distances.
The innovation continued in 1924, when the Radio Corporation of America (RCA)
transmitted the first picture through radio fax.
In that same year, AT&T transmitted the first color fax.
Fax grows in popularity and usage.
All of this competition and innovation led to the following developments in fax:
The ability to wirelessly send images.
The ability to send a fax over a radio signal.
The ability to scan and transmit 3D data.
Businesses like Western Union began using radio fax receivers regularly, and the US
Army even transmitted its first photograph via fax to Puerto Rico in the 1960s!
But the biggest development was still to come…
Xerox invents the first commercial fax machine.
This machine could be connected to a telephone line and transmit letter-sized
documents in six minutes.
This set the standard for fax, made it less expensive, and increased its popularity and
usage.
 The internet:
The world's largest network of computer networks got its original name from the US
military arm that funded it: ARPANET was named for the Advanced Research Projects
Agency. Back in 1969, when ARPANET was created, it connected five sites: UCLA,
Stanford, UC Santa Barbara, the University of Utah and BBN. In 1983, the US
Defense Department spun off MILNET, which was the part of ARPANET that
carried unclassified military communications. ARPANET was renamed the internet
in 1984, when it linked 1,000 hosts at university and corporate labs. MILNET
was later renamed the Defense Data Network and finally NIPRNET, for Non-classified
IP Network.
Internet users top 1 billion. Internet usage has exploded since 1995, when
researchers first started tracking this statistic.

Although estimates vary from the internet having 1 billion to 1.5 billion users,
everyone agrees that the Net has room for growth as the worldwide population tops
6 billion.
That leaves more than 4 billion people around the world without internet access
today.
Internet traffic keeps climbing. Experts quibble about how much traffic is on the
internet and how fast it is growing. Is it growing at 50% to 60% a year? Or 100% a
year? But there's no question that the figure has exploded since 1974, when daily
traffic on the internet surpassed 3 million packets.
First measured in terabytes and petabytes, scientists say the future points to
monthly traffic volumes in the exabytes, which is 10 to the 18th power bytes.
Whatever you call it, that's a lot of packets!
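To put these traffic units in perspective, here is a small illustrative Python calculation; the terabyte/petabyte/exabyte factors follow the powers of ten given above, while the 5 GB per-movie figure is only an assumption for scale, not from the text:

# Illustrative only: relating terabytes, petabytes and exabytes.
TB = 10**12  # terabyte in bytes
PB = 10**15  # petabyte in bytes
EB = 10**18  # exabyte in bytes

hd_movie_bytes = 5 * 10**9  # assumption: roughly 5 GB per HD movie

print(f"1 EB = {EB // PB:,} PB = {EB // TB:,} TB")
print(f"1 EB holds roughly {EB // hd_movie_bytes:,} five-gigabyte movies")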
Security threats rise along with usage. Back in 1988, the Morris Worm was the
first major attack on the internet, disabling 10% of the internet's 60,000 host
computers.
Today hundreds of more sinister attacks are aimed at internet users each day.
Indeed, the US Computer Emergency Readiness Team (US-CERT) stopped
counting the number of security incident reports it received in 2004 because
attacks against internet-connected systems had become so commonplace that it
felt the figure was getting too big to track.
 ISDN (Integrated Services Digital Network).
In the past, the phone network consisted of an interconnection of wires that
directly connected telephone users via an analog-based system. This system was
very inefficient because it did not work well for long-distance connections and
was very prone to noise. In the 1960s, the telephone companies began converting
this system to a packet-based, digital switching network.
Today nearly all voice switching in the US is digital; however, the customer
connection to the switching office is still primarily analog.
ISDN is a system of digital telephone connections that enables voice and data to be
transmitted simultaneously end to end. This technology has been available for
more than a decade and is designed to enable faster, clearer communications
for small offices and home users. It came about as the standard system began
its migration from an analog format to a digital ISDN format.

ISDN Protocol Analyzer

HISTORY:
The concept of ISDN was introduced in 1972.
The concept was based on moving the analog-to-digital conversion equipment
onto the customer's premises to enable voice and data services to be sent
through a single line. Telephone companies also began using a new kind of
digital communication link between each central office. A T1 link could carry
24 voice channels of 64 kb/s each, and it used the same amount of copper wire as
only two analog voice calls. Through the 1970s, the telephone companies continued
to upgrade their switching offices. They began rolling out T1 links directly to
customers to provide high-speed access. The need for an efficient solution was
greater than ever.
In the early 1990s, an effort was begun to establish a standard implementation
of ISDN. The National ISDN-1 (NI-1) standard was defined by the industry so that
users would not have to know the type of switch they were connected to in order
to buy equipment and software compatible with it.
Because some major office switches were incompatible with this standard,
some major telephone companies had trouble switching to the NI-1 standard.
This caused a number of problems in trying to communicate between these
nonstandard systems and everyone else. Eventually all the systems were brought
up to standard. A set of core services was defined in the basic rate interface of the
NI-1 standard. The services include data call services, voice call services, call
forwarding and call waiting.
Most devices today conform to the NI-1 standard.
A more comprehensive standardization initiative, National ISDN-2 (NI-2), was later
adopted. Several major manufacturers of networking equipment have become
involved to help set the standard and make ISDN a more economical solution.
The NI-2 standard had two goals: first, to standardize the primary rate interface
(PRI) as NI-1 did for the basic rate interface (BRI), and second, to simplify the
identification process. Until this point, PRIs were mainly vendor-dependent, which
made it difficult to interconnect them.
Furthermore, a standard was created in NI-2 for identifiers.
The basic rate interface (BRI) consists of two 64 kb/s B channels and one 16 kb/s
D channel, for a total of 144 kb/s. With BRI, only 128 kb/s is used for data transfer,
while the remaining 16 kb/s is used for signaling information.
BRI was designed to enable customers to use their existing wiring, because this
provides a low-cost solution for customers. It is the most basic type of service,
intended for small business or home use.
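The BRI arithmetic above can be checked with a minimal Python sketch; the channel sizes are the ones stated in the text:

# ISDN Basic Rate Interface (BRI) capacity, using the channel sizes from the text.
B_CHANNEL_KBPS = 64   # each bearer (B) channel carries user data or voice
D_CHANNEL_KBPS = 16   # the delta (D) channel carries signaling
NUM_B_CHANNELS = 2

total_kbps = NUM_B_CHANNELS * B_CHANNEL_KBPS + D_CHANNEL_KBPS  # 144 kb/s on the line
data_kbps = NUM_B_CHANNELS * B_CHANNEL_KBPS                    # 128 kb/s usable for data
signaling_kbps = D_CHANNEL_KBPS                                # 16 kb/s for signaling

print(f"BRI total rate:     {total_kbps} kb/s")
print(f"Usable for data:    {data_kbps} kb/s")
print(f"Used for signaling: {signaling_kbps} kb/s")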

 Internet service provider.


The company that provides internet connections and services to individuals and
organizations. In addition to providing access to the internet, ISPs may also provide
software packages. ISPs can host websites for businesses and can also build the
websites themselves. ISPs are all connected to each other through network
access points.

BASIC ISP CONCEPT
The rise of commercial internet services and applications helped fuel a rapid
commercialization of the internet. This process was the result of several other
factors as well.
One important factor was the introduction of the personal computer and the
workstation in the early 1980s, a development that in turn was fueled by
unprecedented progress in IC technology and an attendant rapid decline in
computer prices.
Another factor which took on increasing importance was the emergence of the
local area network (LAN) to link personal computers. But other forces were at work
too: following the restructuring of the AT&T Corporation in 1984, the US National
Science Foundation took advantage of various new options for its national-level
digital backbone service, known as NSFNET.
In 1988 the US Corporation for National Research Initiatives received approval to
conduct an experiment linking a commercial e-mail service to the internet.
This application was the first internet connection to a commercial provider that
was not also part of the research community.
Approval quickly followed to allow other e-mail providers access, and the
internet began its first explosion in traffic.
Microsoft Corporation became interested in supporting internet applications on
personal computers and developed its Internet Explorer web browser and other
programs.
These new commercial capabilities accelerated the growth of the internet, which
as early as 1988 had already been growing at the rate of 100% per year.
 Corporate services.
These are the activities that combine or consolidate certain enterprise-wide needed
support services, provided based on specialized knowledge, best practices and
technology to serve internal (and sometimes external) customers and business
partners. The term corporate services provider is also used. In the United Kingdom,
the public audit agencies produced a report in May 2007 called Value for Money in
Public Sector Corporate Services. This provides performance indicators in five
categories: finance, human resources, information and communication technology,
procurement, and estate management.
E-Banking.
Internet banking or e-banking allows customers of a financial institution to conduct
financial transactions on a secure website. To access a financial institution's online
banking facility, a customer must be a registered user of the institution and must
have a password for customer authentication.
The password for online banking is normally not the same as for m-banking. A
customer can access his or her account anytime. E-banking has given birth to 24/7
banking, which earlier was restricted to banking hours.
To access online banking, the customer goes to the financial institution's website.
Some institutions have set up additional security for access, but there is no
uniformity in the approach adopted.
Today nearly every bank provides this facility to its customers to encourage
paperless banking.

Online banking has opened doors for all customers to operate beyond
boundaries.
It allows people to carry out their banking transactions using a safe website
operated by their respective banks.
 Evolution of E-banking.
The story of technology in banking started with the use of punched-card
machines like accounting machines.
The use of technology at that time was limited to keeping the books of the bank.
It developed further with the birth of online real-time systems and vast
improvements in telecommunication systems during the late 1970s and 1980s.
This resulted in a revolution in banking.
Through convenience banking, the bank is carried to the doorstep of the
customer. The 1990s saw the birth of distributed computing technologies and
relational database management systems. The banking industry was simply waiting
for these technologies. With distributed technologies, one could configure
dedicated machines called front-end machines for customer service and risk
control, while communication in batch mode continued without hampering the
response time of the front-end machines.
You can facilitate payment of electricity and telephone bills, mobile phone, credit
card and insurance premium bills, as each bank has tie-ups with various utility
companies, service providers and insurance companies across the country.
To pay your bills you must complete a one-time registration for each biller; you can
also set up standing instructions online to pay your bills.
Generally, banks do not charge customers for online bill payment.

 NADRA (National Database and Registration Authority).


After the independence of Pakistan, the Prime Minister launched the Personal
Identification System (PIS) program to register, manage and issue national
identification cards to all citizens of Pakistan and Muslim refugees settling in
Pakistan.

Changes were carried out by the Election Commission of Pakistan in 1965 for the
process of voter registration to hold the nationwide 1965 presidential election. In
1969-70, amendments to the PIS programme continued as the Election Commission
of Pakistan supervised the 1970 general election.

After the 1971 war resulted in East Pakistan gaining independence as Bangladesh,
a new statistical database system was needed to ensure the safety of Pakistan's
citizens as well as the national security of the country, as questions were being
raised over who was a Pakistani and who was not.
So the Bhutto regime introduced a new registration act in the Parliament of
Pakistan to establish an authority to issue photo IDs.
Biometric system.
Just hearing the word biometrics, we draw a picture of a fingerprint in the back of
our mind, but originally biometrics is a term related to the human body itself.
The term "biometrics" actually refers to measurements and different sorts of
calculations that are related to the human body.
It is the measurement of human characteristics in a matrix form.
Biometric measurements are unique and irreversible depending on their type.
This is why, in modern-day identification, biometric measurements ensure maximum
efficiency.
The history of biometrics originated back in 1981, when criminals' fingerprints
began to be analyzed and stored. Since then, this practice has spread all over
the world.
However, biometric authentication has now become not just important but
welcome in all aspects of our lives. Biometric measurements have also evolved
beyond the fingerprint to different parts of our body, such as finger vein, palm
vein, iris scan, voice recognition, facial recognition, brain waves, heart signals,
DNA identification, etc.
Breaking the barrier of use only for criminal identification, biometrics has
now evolved to the personal security level and reached cloud computing as well.
Biometrics helps change our lives, reforming manual tasks into automation and
providing that extra layer of security. It is not only affecting our lives as individuals
but also as a community, and even as a species in the environment.
Let us have a quick review of how biometric technology is affecting our lives.
Security:
Undoubtedly the first way that biometric technology makes our lives easier is
by enhancing security.
Biometric identification and verification are changing the way we do and see
things. Smart mobile devices are adding extra layers of security through
fingerprint scanners, voice or facial recognition. Recently, Samsung added an
iris scanner in its latest smartphone. The data centres of Google use multimodal
biometric verification to keep track of security.
Biometric visitor management is helping to manage visitors in any environment and
keep a record of their activities on the premises.

Also, passwords and PINs are too weak in terms of security. Moreover, PINs and
passwords are hard for most people to remember. A recent study performed on
Canadians revealed that they tend to forget their PINs or passwords and leave
online authentications unfinished. Security checks are boring and tiresome in
all facilities.
To automate this process, Japan uses facial recognition gates in one of its airports
that automatically scan passersby as they go through the gate and match them
with stored images.
Many airports now use iris scans as well. Even the upcoming Olympics to be
held in Japan will deploy the same facial recognition technology to scan staff,
media representatives, and athletes, as reported by the Japan Times.
Just recently, Ukraine established biometric border control on its border.
These technologies save time, create less hassle, reduce staff costs and maintain
maximum efficiency. One such tool created to help border control forces is
SecuredPass, which allows biometric enrollment of travellers.
It is a central system that connects the whole country, covering all the airports and
docks. There is little need for passports or manual face matching when one's own
body becomes the ultimate password.
 Online shopping.
It first began way back in 1979, when Michael Aldrich used Teletext, a two-way
message service, which revolutionized business.
In 1981 he saw the first business transaction, with Thomson Holidays from the UK.
In 1982 Minitel (an online service accessible by phone lines) was used to make online
purchases, book train tickets, chat, check stock prices, etc.
In 1984 the first ever shopper bought online at a Tesco store.
After this it jumps forward to 1990, when Tim Berners-Lee created the first browser
and web server.

In 1991 the internet became commercialized and saw the birth of e-commerce.
Amazon started selling books online and Pierre Omidyar founded eBay.
Both companies are now the go-to places for everything for sale online
throughout the world.
Over the next few years, after witnessing the potential of a few online stores,
many competitors and alternatives were created.
1997 saw the emergence of comparison sites, and shortly after, in 1998, PayPal
was founded.
In subsequent years, online commerce and shopping have become the norm and
we cannot comprehend what the world would look like without them.
Although the number of online sales, products and services has exploded, nothing
fundamental has changed much.
With the passage of time, slowly and steadily, telecommunication has changed the
world; today almost anything can be bought and sold online.

 Business connectivity
Business connectivity is defined as the way your organization talks to the wider
world. It could be your employees, so people to people; it could be the way your
systems connect to the systems of other companies; or finally it could be the sum
of your data links.
o Video conferencing.
Video conferencing as we enjoy it today is carried through the internet and
telephone networks, but it got its start in television.
The man who first demonstrated a working television, John Logie Baird, also
pioneered the first attempts to create a two-way visual medium in the 1920s and
30s. Despite his efforts and those of the German postal service, which eventually
employed him, video conferencing technology of the era never progressed
beyond AT&T's first crude video call between New York and Washington, DC in
1927. That call produced little more than a silhouette on even the largest
screens.
AT&T's Bell Labs experiments revealed what video conferencing could offer
at the 1964 World's Fair. The Picturephone worked by using a camera to take
pictures every couple of seconds and then relaying those images over a regular
phone line to be displayed on a television screen.


The first Picturephone, however, was doomed to failure with the public due to its
expense, as well as people's hesitation to be seen on camera during a telephone
call. While it floundered, video conferencing entered the commercial market
proper in the early 1980s, but the cost of equipment and calls kept it out of
reach for the general public.
Luckily, the computer revolution of the 1980s drove a rapid rise in software
sophistication. By the 1990s, advances in video compression and internet
protocols made it possible for video conferencing to be staged across desktops.
Video calling of this kind first appeared in free services such as NetMeeting and
Yahoo Messenger, albeit with low-quality visuals to match the radically reduced
price tag.

But that changed with the experimental work of Cornell University.

Cornell's CU-SeeMe was the first free video conferencing application to prove
video could be sent efficiently over the internet.
Originally built for the Mac, Apple's near-death experience in the 1990s forced
researchers to make it compatible with the PC.
The application connected up to 30 people at once across an early version of a
peer-to-peer network, and hosted the simulcast of radio and TV on the internet.
The CU-SeeMe experience was powered by $100 webcams such as the Connectix
QuickCam, which by the mid-1990s could send images over the internet at 15
frames per second (today a basic camera operates at 24 frames per second) at a
resolution of 320x240 pixels.
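A rough calculation shows why the compression advances mentioned above were essential. The resolution and frame rates come from the text; the bit depths and the HD comparison stream are illustrative assumptions only:

# Raw (uncompressed) video bandwidth, for illustration.
def raw_bitrate_mbps(width, height, fps, bits_per_pixel):
    """Uncompressed video bitrate in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

# Mid-1990s webcam stream as described in the text (bit depth assumed: 8-bit greyscale).
early = raw_bitrate_mbps(320, 240, 15, 8)

# A modern HD stream for comparison (assumed: 1280x720, 30 fps, 24-bit colour).
hd = raw_bitrate_mbps(1280, 720, 30, 24)

print(f"320x240 @ 15 fps, 8-bit:   ~{early:.1f} Mb/s uncompressed")
print(f"1280x720 @ 30 fps, 24-bit: ~{hd:.0f} Mb/s uncompressed")

Even the early stream would saturate a dial-up modem many times over if sent uncompressed, which is why codec improvements, not just faster links, made desktop video calling possible.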
Video conferencing today:

Video conferencing today owes its reach and power to the incredible growth
of the internet over the past 20 years, and the internet is really, really big:
every second there are 3.2 billion people using the internet, generating 1
zettabyte of traffic, which is equal to 36,000 years of high-definition video.

Video conferencing in the future

Video conferencing is headed towards two kinds of futures.
First, there is the future we can see, the one built on increased accessibility,
reliability and affordability of today's technologies.
And there is the future we cannot see. That one still belongs to prototype,
speculation and imagination.
 Telepresence.

In the minds of many of our clients, the term "telepresence" means an
immersive, three-screen video conferencing system featuring acoustics,
lighting, furniture, and directional audio all designed to give participants
the feeling that their co-workers are across the table rather than across
the country or world.
Companies like HP, Cisco, Teliris, Tandberg, Teletex, and Polycom
pioneered the rise of telepresence. But it was Cisco, through its extensive
marketing of telepresence as something different than video
conferencing, that created this new class of products in the marketplace.
Thanks to Cisco's marketing (and market) successes, just about every
other video conferencing vendor has reclassified their video conferencing
products as "telepresence," and the telepresence space has become
confusing, with single-room telepresence systems and even "personal
telepresence" offerings by some vendors. But redefining video conferencing
as "telepresence" isn't just a marketing exercise.
The core features of telepresence (1080p HD video, full-size images,
directional acoustics, and active speaker switching) are increasingly
available beyond the three-screen immersive room system, escaping the
conference room and moving into room systems, the desktop, the laptop and
even the living room. The expansion of telepresence is even moving into the
mobile world, thanks to HD cameras on tablets and phones. The end result is
that video-based collaboration is increasingly available, and better able to
offer a richer alternative experience to voice by providing a natural meeting
experience in which conversations aren't limited, or encumbered, by
technology constraints. Telepresence technologies and products continue to evolve.
Financial services firms, for example, are using telepresence to improve
collaboration throughout the organization for tasks such as analyst meetings to
review covered firms, investment opportunities, or market analysis. In addition,
they are leveraging telepresence to meet with large business and personal
account customers; in-branch kiosks let customers in branch locations engage
with portfolio managers, loan officers, or insurance representatives via a
high-quality, immersive video experience. Large branch room-based systems
enable customers to meet with portfolio managers in a more private setting.

By leveraging telepresence in these scenarios, companies can make internal
resources more widely available to meet customer needs. The end result is
higher customer satisfaction, better ability to up-sell, and higher customer
retention.

 VPN.

For as long as the internet has existed, there has been a need for protocols to
keep data private and secure. The history of VPN (virtual private network)
technology dates back to 1996, when a Microsoft employee developed the
Point-to-Point Tunneling Protocol, or PPTP. Effectively the precursor to modern
VPNs, PPTP creates a more secure and private connection between a
computer and the internet.

As the internet took off, demand for more sophisticated security systems
arose. Anti-virus and related software could be effective at preventing
damage at the end-user level, but what was really needed was to improve the
security of the connection itself. That’s where VPNs came in.

A VPN is a private connection over the internet. It’s a broad term that
encompasses several different protocols, which will be explained in detail
later. What they all have in common is the ability to connect remotely to a
private network over a public connection.

 DXX leased circuit.

The predecessor of the optical cross-connect was the digital cross-connect
system, typically based on SONET, which first emerged around 1990. The
purpose of the digital cross-connect was to provide network operators with
greater control and protection over their networks, and to allow them to
better utilize their communications infrastructure by strategically
interconnecting low-level TDM streams.
At this time, digital cross-connects were mostly used for grooming network
traffic, which at the time meant switching traffic between circuits. Digital
cross-connects allowed carriers to reroute traffic quickly and cost-effectively.

The first major breakthrough in digital cross-connect technology can be
traced back to 1991, when Tellabs introduced the SONET-based TITAN 5500
cross-connect. The TITAN 5500 switched internal and external circuits,
allowing traffic to flow between networks. While Tellabs was not the first
provider to enter this space, the TITAN proved to be a superior and
revolutionary product, putting Tellabs on the map and making the company a
top telecommunications vendor of the time.

The age of the digital cross-connect plateaued.


Q.NO# 2: Advancement in networks.
 Transmission networks.
Transmission networks are also constantly evolving with time. In the past few decades the
pace of advancement has risen exponentially.
Initially, a telephone exchange had a person called a telephone operator. That operator
used to switch incoming calls to the required destination. That system was very slow: if
one person was using the line, a second person had to wait for that call to end. This system
was upgraded with the introduction of automatic exchanges.
After that came optical fiber technology. As fiber optic communications technologies
continue to demonstrate their advantages in telecom and data center networks, we are
also witnessing attractive and promising progress on optical wireless communications,
which can potentially work with RF communications to meet the ever-increasing
demands for higher data transmission rates and smart home networking.

Meanwhile, the innovations in optical communications and networking systems
can never be realized without fundamental support from the physical infrastructure
layer. The recent advances in few-mode fibers open up appealing opportunities for
equipment vendors and network operators to solve the capacity crunch problem using
mode-division multiplexing. To integrate the momentum gained from these technical
advances, an efficient and intelligent network control and management (NC&M)
mechanism is essential, especially for building heterogeneous networks to support a
wide range of services with various traffic patterns. Therefore, the implementation of
software-defined networking (SDN) in optical communications networks appears
inevitable, and we expect to see more advances in this area in 2017.

MODERN TELEPHONE EXCHANGE


Today, nearly all transmission networks are built on these modern technologies.

 Switching Networks.
Today, we’ll take a look at some of the top trends you can expect to see in
2017 regarding network infrastructure and equipment. Consider it a snapshot of what
the year 2017 will hold for businesses involved in network installations.
By preparing and planning ahead, you’ll be all set to position yourself as an invaluable
resource during the fast-paced times ahead.

Expanding and improving networks is your primary responsibility to clients. But at the
end of the day, you also need to be able to advocate for them, and advise them on their
most effective course of action.

With that said, let’s get down to brass tacks and cover the trends as we head into 2017.

Despite the name, Smart Switches aren't necessarily the smartest solution for many
installations. In fact, Smart Switches have been around for a while. The
differences between Smart/Web-Smart (S/WS) Switches and Managed Switches come
down to nuanced performance. First and foremost, S/WS Switches don't offer the full
range of functionality that's available in Managed L2 Switches.

S/WS Switches do allow the user to configure link speeds and duplex and to view certain
statistics of the Ethernet interface, but this hardly scratches the surface of the
customizability one finds in managed switches. Managed L2 Switches allow the user to
become far more precise in their configuration by allowing changes to just about
every setting related to layer 2 of the OSI model. VLANs, trunking, and access control lists
are just a few of the features that are available on a Managed L2 Switch.
If S/WS Switches were knives, they'd be butter knives. They'll get the job done, sure, but
they're not particularly versatile. The Managed L2 Switch, however, would be a Swiss
Army knife.

PoE features can be added to both smart and managed switches. PoE S/WS Switches and
Managed L2 Switches allow the user to control PoE output or enable/disable PoE directly
from the interface.

PoE Switches are becoming more and more common since they reduce clutter. The
added benefit is that you will NOT need to install (or hire an electrician to install) an
outlet at remote locations to power Ethernet devices.
The race continues. Ethernet switches are providing higher and higher Gbps rates with
the new 802.3bz release [not yet available commercially].

Let's look at some of the most important questions surrounding this topic. Ethernet
speeds will always be a hot topic. As of yet, it's been unclear exactly when the network
installation industry should step away from the 10/100 Mb standard and lean towards
1000 Mb instead. The official commercial release of 802.3bz is still pending. Switch
manufacturers should be fine at this point, as the market and demand for 10/100/1000
is still very high. Pushing to anything faster is all well and good, but current-gen devices
will not be able to take advantage of speeds greater than 1000 Mbps.
 Access networks.

Communication networks play an important role in our daily life because
they allow communicating and sharing content between heterogeneous nodes
around the globe. The emergence of multiple network architectures and
emerging technologies has resulted in new applications and services over a
heterogeneous network. This heterogeneous network has undergone significant
challenges in recent years, such as the evolution to a converged network with the
capability to support multiple services, while maintaining a satisfactory level of
QoE/QoS, security, efficiency and trust.

With the advancement of next-generation mobile and wireless networking
technologies, "smart healthcare" or "connected healthcare" is getting
tremendous attention from academia, governments, industry, and the healthcare
community. The next-generation mobile and wireless networking technologies
such as 5G wireless networks, mobile edge computing (MEC), software-defined
networking (SDN), and cloud radio access networks (C-RANs) can play a
significant role in smart healthcare by offering better insight into heterogeneous
healthcare media content to support affordable and high-quality patient care.
While researchers have been making advances in the study of next-generation
networking and healthcare services individually, very little attention has been
given to making cost-effective and affordable smart healthcare solutions.
Connected or smart healthcare has the potential to revolutionize many aspects of
our society. With the growing emergence of wearable biosensors and wireless
communication technologies, Internet of Things (IoT) devices in smart healthcare
are vulnerable to privacy breaches and safety threats. Security and privacy
protection are certainly very important issues of healthcare IoT applications, as
IoT devices lack capabilities to protect other connected devices from attack. To
this end, the article “Privacy in the Internet of Things for Smart Healthcare”
presents security vulnerability issues related to password strength in order to
have a higher degree of security in smart healthcare. August 01, 2018 - The
impending release of 5G wireless has organizations considering how they can
leverage the technology. The ever-increasing number of connected medical
devices leaves wireless networks strained, and the potential of 5G can help
increase bandwidth for more devices.
5G is the fifth generation of wireless technology with speeds that could reach up
to 20 Gbps, edging out the current 4G LTE which typically clocks in around 1
Gbps. This improvement over the current wireless broadband technology
healthcare organizations are using can support bigger data sets and faster
network connections.
Connected medical devices are not limited to mobile devices or wireless
networks. Organizations need to balance network traffic among wired
connections, wireless internet, and cellular connections. This allows organizations
to prioritize traffic.
Access points should be broadcasting in 2.4 GHz and 5 GHz, as well as
rebroadcasting AM frequency for pagers and cellular for mobile devices.
Improving the bandwidth of cellular connections can give clinicians and patients
using smartphones better and faster connections. The more advanced 5G can
also better serve telehealth clinicians and patients who are streaming video
conferences or transmitting large data sets.
Report authors stated that the 5G wireless ecosystem is expected to grow soon
because of the initiatives taken by national and regional governments along with
network providers and wireless carriers.
Future optical Access Networks
The report predicted that large-scale commercial trials will increase by five times
through 2021. Report authors also predicted that 5G will have a large impact on
IoT devices, haptic internet, virtual reality, and robotics.
Healthcare organizations are eager to embrace IoT devices because they save
money by keeping patients out of the hospital.
“If IoT devices can diagnose people in advance then that saves huge costs,”
Taoglas Co-Founder and Co-CEO Dermot O’Shea told HITInfrastructure.com in a
previous interview. "We can see nothing but benefits from medical devices being
connected. Working with medical device companies brings a much larger delta of
savings and benefits than any other vertical.”
The benefits of 5G in healthcare have prompted vendors to collaborate and seek a
standardized technology to improve device connectivity.
Earlier this year, Qualcomm, Ericsson, and AT&T announced plans to collaborate
and conduct interoperability testing and over-the-air field trials based on the
expected 5G New Radio (NR) specifications under development by the 3rd
Generation Partnership Project (3GPP).
The partnership was formed in response to the demand for advanced wireless
technology for enterprises seeking new revenue streams requiring mobile or
remote data exchanges, such as telehealth and remote care.
Vendors and customers alike are seeking faster and more reliable cellular
connections, but much testing still needs to be conducted before a standard
technology can be deployed.
Q.NO# 3:

How have wireless networks transformed with the increasing demand for bandwidth and data rates?

Network architectures have changed significantly based on bandwidth and data rate
demands. LAN architectures have remained essentially unchanged since the late 1990s, so why
change LAN architectures now? Apart from ensuring a future-ready network that can support
the direction of technology and application growth, there are financial advantages to a
wireless LAN. The installation of fibre in both the vertical and the horizontal has the potential to
lower installation cost when compared to traditional cable designs, which have fibre in the riser
but copper in the horizontal. For example, the cost of installing a wireless LAN with fibre in both
pathways is significantly less than the traditional architecture.
In addition to the financial advantages, the installation time of a wireless LAN architecture can
be reduced by as much as one-third when compared to traditional cabling, and there are fewer
support costs, such as reduced space requirements. Fewer and smaller intermediate distribution
frames (IDFs) create an overall smaller footprint, which opens up more physical space and
therefore more cost savings.
While cost and convenience, along with the ultimate long-term benefits, play an important
role in the decision-making process, the transition to a wireless LAN offers immediate
benefits. The scalability of the architecture makes it possible to meet various degrees of network
needs, and offers increased LAN bandwidth that is particularly crucial for MACs and edge
devices. With fewer ports exposed, the connectivity is more secure than with current cabling
architectures.

Wireless Networking
Best Environment :
These benefits and capabilities, however, are not exclusive to a certain type of environment; the
move away from end-to-end copper in the horizontal and toward composite cable, made up of
fibre and copper, fits a range of facilities. Hospitals, large venues such as convention centres and
stadiums, hospitality, school campuses, various office spaces, and research labs can all benefit –
and should.
Benefits to the end-user:
Collaboration across all fronts, especially in today’s digital age, is foundational to most business
successes. At Sullivan Park in particular, collaboration is achieved through electronic lab
notebooks and interactive video. Having all wireless access is a necessary requirement.
Within this video environment are high-speed, high-definition video control systems and
high-performance computing, modelling and simulation, which demand substantial network
capacity. The facility itself is a complex environment with both new and legacy lab systems,
research equipment of all kinds, and sensitive and critical control systems. The LAN architecture
needs to be able to support all of these different needs, and enable a range of technologies
from varying generations.

Analogue mobile phones first appeared in the early 1980s, and were used for voice calls only
(imagine that!). Second-generation (2G) digital mobiles made their debut a decade later with
GSM, offering text messaging (SMS) as the 'killer application' on top of voice services, becoming
the dominant technology worldwide. A roughly 10-year cycle has continued ever since, with
each generation adding more data bandwidth and therefore enabling a richer set of services:
around the turn of the millennium, 3G (UMTS or CDMA 2000) offered data rates of around
1Mbps and could be described as 'mobile broadband', while 2010 saw 4G (LTE) reaching
100Mbps.
Of course, as in any evolutionary process, there have been intermediate stages: GPRS and EDGE
were '2.5G' packet-switching technologies that made internet connections possible, for
example, while HSPA and HSPA+ brought '3.5G' data rates up to 2Mbps. More recently, '4.5G'
LTE-Advanced and LTE-Advanced Pro have paved the way from 4G to 5G, taking data rates up
to 1Gbps.
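A back-of-the-envelope comparison makes these generational rates concrete. The 1 Mbps, 100 Mbps and 1 Gbps figures are the ones quoted above, the ~20 Gbps peak is the 5G figure quoted earlier in this report, and the 1 GB file size is purely an illustrative assumption:

# Time to move a 1 GB file at the nominal peak rate of each generation (illustrative).
FILE_SIZE_BITS = 8e9  # 1 gigabyte expressed in bits (assumed example payload)

rates_bps = {
    "3G (~1 Mbps)": 1e6,
    "4G LTE (~100 Mbps)": 100e6,
    "LTE-Advanced (~1 Gbps)": 1e9,
    "5G peak (~20 Gbps)": 20e9,
}

for name, rate in rates_bps.items():
    seconds = FILE_SIZE_BITS / rate
    print(f"{name:<22} -> ~{seconds:,.1f} s for a 1 GB transfer")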
We are now on the cusp of the 5G era, with standards, spectrum allocation, network
infrastructure, chipsets and devices all moving into place around the world.
All the new services discussed in the first question show that the data rates and bandwidth of
the last decade were unable to support current services; that is why mobile and wireless
services have moved to the fifth generation to achieve that bandwidth. To meet future
requirements, 5G has been designed with smaller cells that use lower power for data
communication, and each cell serves fewer customers so that each customer gets the highest
available bandwidth.
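The effect of smaller cells can be illustrated with a toy calculation; the cell capacities and user counts below are assumptions chosen only to show the sharing effect, not measured values:

# Average bandwidth share per active user in a large cell vs. a small cell (illustrative).
def per_user_mbps(cell_capacity_mbps, active_users):
    """Average share of the cell's capacity per active user."""
    return cell_capacity_mbps / active_users

macro_cell = per_user_mbps(cell_capacity_mbps=1_000, active_users=200)  # assumed 4G-style macro cell
small_cell = per_user_mbps(cell_capacity_mbps=5_000, active_users=10)   # assumed dense 5G small cell

print(f"Macro cell: ~{macro_cell:.0f} Mb/s per active user")
print(f"Small cell: ~{small_cell:.0f} Mb/s per active user")

Fewer users per cell, combined with higher per-cell capacity, is what lets each customer see a larger share of the available bandwidth.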

Q.NO# 4:
HOW DO YOU FORECAST THE FUTURE BANDWIDTH AND DATA REQUIREMENTS IN TELECOM
NETWORKS.
Bandwidth requirement prediction is an important part of network design and service planning.
The natural way of predicting bandwidth requirements for an existing network is to analyze past
trends and apply an appropriate mathematical model to predict future demand.
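A minimal sketch of this trend-based approach is shown below. It fits a simple exponential growth model (linear in the logarithm of traffic) to historical figures and extrapolates it forward; the yearly traffic values are hypothetical placeholders, not real operator data, and a real forecast would use a richer model and many more data points:

# Fit an exponential growth trend to past traffic and extrapolate it (illustrative sketch).
import numpy as np

years = np.array([2014, 2015, 2016, 2017, 2018])
traffic_gbps = np.array([10.0, 16.0, 25.0, 41.0, 65.0])  # hypothetical peak demand per year

# Assume roughly exponential growth, so log(traffic) is approximately linear in time.
slope, intercept = np.polyfit(years, np.log(traffic_gbps), 1)
annual_growth = np.exp(slope) - 1

def forecast(year):
    """Extrapolate the fitted exponential trend to a future year."""
    return np.exp(intercept + slope * year)

print(f"Estimated annual growth: {annual_growth:.0%}")
for y in (2019, 2020, 2021):
    print(f"Forecast peak demand in {y}: {forecast(y):.0f} Gb/s")

In practice an operator would validate such a model against measured utilization, compare alternative models (linear, logistic, seasonal), and add headroom before committing to capacity upgrades.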

The rapid growth in the use of the Internet has led to a huge and increasing demand for
bandwidth, both international and domestic. Ensuring sufficient bandwidth in their networks,
from the core to the customer premises, has been a constant challenge for network operators
and service providers.
Timely and accurate prediction of bandwidth is very helpful to plan network
resources, expansions and upgrades on time so that bottlenecks and QoS degradations are
avoided.

This, in turn, is crucial for operators to attract, retain and regain customers. Hence, proper
network traffic prediction is very useful for operators in initial planning as well as later on for
upgrades and optimizations.
During the last few decades, the pervasive evolution of telecommunications transformed
not only technologies, networks, services and applications, but also the structure of the
telecom sector players. Earlier, the sector was characterized by state-owned national
companies in a monopolistic situation. We have recently witnessed the deployment of a
competitive environment, the withdrawal of state ownership, the emergence of new service
providers, transnational service providers and service provider alliances, and the
continuous rearrangement of their ownership structures. The telecom sector thus kept
evolving globally during the 20th century in terms of technology, business and regulatory
aspects.

We observe an overall slow adoption pattern of broadband, the reasons for which are not part of
this report. However, statistics for the last three years reveal an astonishing growth of broadband
subscriptions in metropolitan areas. Broadband has started to gain popularity as the demand
for a high-speed Internet connection increases among businesses and residential
consumers. The country could be rated among the economies where almost all technological
versions of broadband are being offered: wired broadband in the shape of xDSL, HFC, and FTTx
over GPON, and wireless broadband in the form of WiMAX and EVDO. Mobile broadband
through GPRS and EDGE is also being offered but is not part of this survey study.
