
G00214660

Hype Cycle for Communications Service Provider Infrastructure, 2011


Published: 27 July 2011
Analyst(s): Peter Kjeldsen, Ian Keene

When analyzing and adjusting their technology portfolios, communications service providers need to keep costs down while securing future revenues through innovations that offer tangible end-user benefits in terms of bandwidth, mobility, sophisticated devices and near ubiquitous video.
Table of Contents

Analysis .... 4
  What You Need to Know .... 4
  The Hype Cycle .... 4
  The Priority Matrix .... 11
  Off The Hype Cycle .... 12
  On the Rise .... 13
    Terabit-per-Second Transport .... 13
    Cognitive Radio .... 13
    Network Virtualization .... 15
    OneAPI .... 16
    Cellular to Wi-Fi Authentication .... 18
    Consumer Telepresence .... 19
    3D TV Services .... 21
    Diameter Protocol .... 23
    CMTS Bypass .... 24
    WiMAX 802.16m .... 26
    LTE-A .... 28
    Smart Antennas .... 29
    VDSL2 Enhancements .... 31
    Mobile CDN .... 32
    Network Intelligence .... 34
    Cloud-based RAN .... 35
    Socialcasting .... 36
    Contact Center Interaction Analytics .... 38
  At the Peak .... 39
    Convergent Communications Advertising Platforms (CCAPs) .... 39
    IPX for LTE .... 42
    Rich Communication Suite .... 44
    VoIP Wireless WAN .... 46
    WDM-PON .... 47
    Self-Organizing Networks .... 49
    4G Standard .... 50
    CDN for Fixed CSPs .... 52
    Integrated Policy and Charging Control Solutions .... 53
    Augmented Reality .... 55
    Public Cloud Computing/the Cloud .... 57
    100 Gbps Transport .... 58
  Sliding Into the Trough .... 59
    Mobile Subscriber Data Management .... 59
    Energy Management Gateways .... 61
    802.22 .... 63
    Addressable TV Advertising .... 63
    RF Over Glass .... 67
    White Spaces: Unlicensed Spectrum TV .... 69
    TD-LTE .... 70
    Next-Generation Service Delivery Platforms .... 72
    IMS .... 75
    10G-PON .... 77
    Network Sharing .... 79
    IPv6 .... 82
    802.16-2009 .... 84
    Broadband Over Power Lines .... 85
    802.11k-2008 .... 86
    802.11r-2008 .... 87
    Long Term Evolution .... 88
    MPLS-TP .... 90
    Video Telepresence .... 92
    40 Gbps Transport .... 94
    Mobile Advertising .... 95
  Climbing the Slope .... 99
    Femtocells .... 99
    802.11n .... 100
    IPTV .... 101
    Mobile DPI .... 104
    Mobile Application Stores .... 105
    FTTH .... 107
    HSPA+ .... 109
    Network DVR .... 111
    TD-SCDMA .... 113
    VDSL2 .... 114
    DOCSIS 3.0 Cable .... 116
    Online Video .... 117
    Interactive TV .... 119
    MPEG-4 Advanced Video Coding .... 123
    OTN and GMPLS/ASON .... 125
  Entering the Plateau .... 126
    Next-Generation Voice .... 126
    ROADMs .... 128
    Mobile TV Streaming .... 129
  Off the Hype Cycle .... 130
    Mobile TV Broadcasting .... 130
    Residential VoIP .... 132
Appendixes .... 134
  Hype Cycle Phases, Benefit Ratings and Maturity Levels .... 136
Recommended Reading .... 137

List of Tables

Table 1. Hype Cycle Phases .... 136
Table 2. Benefit Ratings .... 136
Table 3. Maturity Levels .... 137


List of Figures

Figure 1. Hype Cycle for Communications Service Provider Infrastructure, 2011 .... 10
Figure 2. Priority Matrix for Communications Service Provider Infrastructure, 2011 .... 12
Figure 3. Hype Cycle for Communications Service Provider Infrastructure, 2010 .... 135

Analysis
What You Need to Know
Communications service providers (CSPs) need to develop their network infrastructure to meet cost center challenges such as bit-wise economy-of-scale and network ecology-of-scale. They also need to adjust their business models and product portfolios, and leverage the mass-customization enabled by network intelligence, in order to optimize existing revenue streams and generate new ones. Selecting the right technologies, choosing the right architectures and getting the timing right are key challenges that CSPs must address. Gartner's 2011 Hype Cycle for CSP Infrastructure can assist CSPs and their suppliers in selecting technology portfolios that match CSP service strategies.

In the following section, we sort technologies according to whether the primary CSP attraction is cost/performance, revenue protection/generation, or a combination of both. A clear trend is that the continued growth in mobile data and various types of video traffic, the interest in cloud solutions, a less austere economic outlook and continued innovation have increased the hype around technologies that offer more bandwidth at lower cost per bit. CSPs should, of course, consider which combination of raw bandwidth and network intelligence provides the best end-to-end performance, but the increase in hype relative to last year's report is most visible for the technologies related to raw bandwidth. A key challenge for CSPs will be to orchestrate investments in different parts of their infrastructure to ensure the best possible end-to-end support of, and alignment with, associated CSP service strategies and visions.

The Hype Cycle


CSP network infrastructures are, if international connections are included, among the largest and most complex man-made constructions, measured both in physical size and in terms of investment. The vast size, complexity and importance of these networks tend to make CSP decision makers quite cautious, and the heavy reliance on available standards can either slow the adoption of new technologies (when standards are not yet in place) or speed it up (when standards are available). However, the rapid increase in traffic volume driven by mobile data and various types of video means that CSPs face some fundamental logistical challenges, as they need to realize:


- "Bit-wise economy-of-scale": networks must be built and operated in ways that ensure that, year after year, as more traffic is handled, the cost per transmitted bit decreases, keeping a cap on the total network cost.
- "Network ecology-of-scale": energy-efficient architectures and infrastructure that prevent energy consumption from getting out of hand as network traffic continues to rise.
- "Mass-customization through network intelligence": new services, content and features used to protect and drive up revenue.
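The bit-wise economy-of-scale requirement is easy to quantify. As a back-of-the-envelope illustration (not taken from this report, and the 50% growth figure is purely hypothetical): if traffic grows by a factor (1 + g) each year and the total network budget is to stay flat, cost per bit must fall by g/(1 + g) each year.

```python
def required_cost_per_bit_decline(traffic_growth: float) -> float:
    """Annual decline in cost per transmitted bit needed to keep total
    network cost flat while traffic grows by `traffic_growth` per year.

    total_cost = traffic * cost_per_bit, so holding total_cost constant
    requires cost_per_bit to shrink by the factor 1 / (1 + growth).
    """
    return 1.0 - 1.0 / (1.0 + traffic_growth)

# With traffic growing 50% a year, cost per bit must fall by a third
# every single year just to keep the cost center flat.
print(f"{required_cost_per_bit_decline(0.50):.1%}")  # 33.3%
```

Compounded over a decade of such growth, cost per bit would have to fall to roughly 2% of today's level, which is why the report stresses that this is a challenge "for decades with no real end in sight."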

Both bit-wise economy-of-scale and network ecology-of-scale can be thought of as cost center challenges. It is important for CSPs to realize that these are not simply challenges for today or tomorrow, but for decades to come, with no real end in sight.

In order to justify continued investment in ever more sophisticated network infrastructure, CSPs need to tap into new revenue streams as well as protect existing revenues. Here, mass-customization through network intelligence is essential, as CSPs are transforming both their business models and their networks to become more cost-effective, prevent customer churn and protect or increase revenue by offering new types of service. CSPs that fail to innovate may not survive, but CSPs that decide on the wrong type of transformation, or that get the timing wrong, may find themselves in equally dire straits, even before more passive competitors do.

An apparent disconnect exists between the CSP investment climate and the changes required by the challenges CSPs are facing. CSPs do realize the imperative of transforming their networks, business models and organizations. However, like most other organizations, they tend to focus more on near-term risk than on longer-term risks and opportunities. In the current climate, the imbalance between short- and long-term investments is even stronger than usual, although it has been reduced over the past year. Hence, most CSPs are focusing on short-term impact, which offers limited room for true differentiation. Those CSPs that can make the right long-term investments, aligning services, networks and underlying business models to a durable long-term vision with sustained differentiation, and taking into account the threats and opportunities related to the emerging public cloud, stand a real chance of breaking away from their competitors.
CSPs and their suppliers face changing competitive landscapes, driven by consolidation, globalization, innovation, evolving consumer behavior and a challenging financial environment. Competition is arriving in many forms and from all directions: both traditional and nontraditional competitors (such as Google) are seeking a share of the fast-changing market for communications, entertainment and information services.

Gartner's 2011 Hype Cycle for CSP Infrastructure covers key network infrastructure technologies that CSPs should evaluate to meet the challenges outlined above. The Hype Cycle features both fixed and mobile carrier infrastructure technologies, as the technological underpinnings of both are increasingly intertwined. The intended audience for this Hype Cycle is technology stakeholders in CSPs and their suppliers.


It is perhaps easiest to understand the technologies on the Hype Cycle and their potential impact on CSP infrastructure by categorizing them according to whether CSPs introduce them for reasons of cost/performance, revenue, or both:

- Technologies introduced based on cost or performance considerations: Diameter Protocol, IPv6, Cognitive Radio, Network Virtualization, Cloud-based RAN, OneAPI, Cellular to Wi-Fi Authentication, 802.11k-2008, Terabit-per-Second Transport, 100 Gbps Transport, 40 Gbps Transport, Network Sharing, Multiprotocol Label Switching Transport Profile (MPLS-TP), 802.11n, Optical Transport Networks and Generalized Multiprotocol Label Switching/Automatically Switched Optical Network (OTN and GMPLS/ASON), and Self-Organizing Networks.
- Technologies introduced to protect or drive up revenue: Mobile CDN, CDN for Fixed CSPs, Energy Management Gateways, Online Video, Consumer Telepresence, Video Telepresence, Augmented Reality, Socialcasting, Rich Communication Suite, Voice over Internet Protocol (VoIP) Wireless WAN, Convergent Communications Advertising Platforms (CCAP), Mobile Application Stores, Addressable TV Advertising, Next-Generation Service Delivery Platforms, Network DVR, IPTV, Mobile TV Streaming, Interactive TV, Public Cloud Computing/the Cloud, and 3D TV Services.
- Technologies introduced for reasons of both cost/performance and revenue: Integrated Policy and Charging Control Solutions, Mobile DPI, Contact Center Interaction Analytics, Mobile Subscriber Data Management, IPX for LTE, Network Intelligence, Cable Modem Termination System (CMTS) Bypass, Long Term Evolution Advanced (LTE-A), Smart Antennas, Radio Frequency (RF) Over Glass, 10G Passive Optical Network (10G-PON), Wavelength Division Multiplexing (WDM) PON, Fourth-Generation (4G) Standard, Femtocells, Reconfigurable Optical Add/Drop Multiplexers (ROADMs), Long Term Evolution, VDSL2, VDSL2 Enhancements, 802.16-2009, IP Multimedia Subsystem (IMS), 802.11r-2008, MPEG-4 Advanced Video Coding, Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Data-Over-Cable Service Interface Specification (DOCSIS) 3.0 Cable, FTTH, Next-Generation Voice, Broadband Over Power Lines, WiMAX 802.16m, TD-LTE, High-Speed Packet Access Evolution (HSPA+), White Spaces: Unlicensed Spectrum TV, and 802.22.

The technologies that have progressed most on the Hype Cycle come from either the cost/performance category (100 Gbps Transport and Network Sharing) or the combined category (RF Over Glass, 802.22, 10G-PON, Long Term Evolution, Femtocells and White Spaces: Unlicensed Spectrum TV). It is interesting to note that all of the fast-moving technology profiles relate either to increased bandwidth or to lowering the cost of bandwidth.

In the 2010 Hype Cycle for CSP Infrastructure, we observed that many technologies were piling up at the Trough of Disillusionment. A residual effect of this phenomenon is still visible in the 2011 version of the Hype Cycle, although several of these technologies have moved toward the Slope of Enlightenment as CSPs have started investing in them. This illustrates that a key challenge for CSPs will be to orchestrate investments in different parts of their infrastructure to ensure the best possible end-to-end support of, and alignment with, associated CSP service strategies and visions.


Major Changes to the 2011 Hype Cycle

Gartner's "Hype Cycle for Communications Service Provider Infrastructure, 2011" replaces the 2010 report of the same name. As is always the case, some technologies have been deleted and some added; both sets are listed below. We also list those technologies that have seen noticeable change since last year's report.

Notable Positioning Changes

Most of the technologies have changed position since the 2010 iteration of the Hype Cycle, though most only slightly. Some, however, have changed position more substantially:

- 100 Gbps Transport
- Network Sharing
- RF Over Glass
- 802.22
- 10G-PON
- Long Term Evolution
- Femtocells
- White Spaces: Unlicensed Spectrum TV

Further details of the reasons for the revised positions are given in the individual entries in this Hype Cycle. In addition, it should be noted that the following technologies have not moved since last year's report:

- CMTS Bypass
- Rich Communication Suite
- VoIP Wireless WAN
- Public Cloud Computing/the Cloud
- 802.16-2009

The lack of movement of the Public Cloud Computing/the Cloud technology may seem surprising, but the sustained hype surrounding cloud-based solutions means that this technology has experienced a prolonged stay close to the Peak of Inflated Expectations. Finally, there is one technology that has moved backward since last year's report:

- Convergent Communications Advertising Platforms (CCAP)


The reason behind the unusual backward movement of the CCAP technology is that its capability to deliver convergent, integrated advertising functions across multiple channels, media types and media formats has been advancing more slowly than previously assessed. This is mainly due to the complexity and challenges of integrating the broad array of components across the data, process and systems layers that make up a CCAP architecture. The lack of open, standardized advertising-related data models and interfaces further challenges the evolution of CCAP.

New to the Hype Cycle

The following technologies are new to the Hype Cycle:

- Mobile CDN
- CDN for Fixed CSPs
- 802.11k-2008
- Cellular to Wi-Fi Authentication
- Integrated Policy and Charging Control Solutions
- Network Intelligence
- Diameter Protocol
- VDSL2 Enhancements
- Energy Management Gateways
- Terabit-per-Second Transport
- Cognitive Radio
- IPv6
- Cloud-based RAN
- Mobile Subscriber Data Management
- Network Virtualization
- Augmented Reality
- Socialcasting
- Video Telepresence
- Consumer Telepresence
- Online Video
- Mobile DPI
- Contact Center Interaction Analytics

- IPX for LTE
- OneAPI
- Mobile Advertising

Deleted From the Hype Cycle

The following technologies no longer appear on the Hype Cycle:

- Switched Digital Video: its functionality is covered within other technologies.
- High-Speed Uplink Packet Access: matured beyond the scope of the Hype Cycle.
- Mobile TV Broadcasting: matured beyond the scope of the Hype Cycle.
- Residential VoIP: matured beyond the scope of the Hype Cycle.

Renamed Entries

Three entries have been renamed on this year's Hype Cycle:

- OTN and GMPLS/ASON was previously called GMPLS/ASON.
- Convergent Communications Advertising Platforms (CCAP) was previously called Convergent Communications Advertising Data Platforms.
- 802.16-2009 replaces 802.16e-2005, which was used in previous iterations of this report.


Figure 1. Hype Cycle for Communications Service Provider Infrastructure, 2011

[Chart: the technologies profiled in this report plotted by expectations against time, across the phases Technology Trigger, Peak of Inflated Expectations, Trough of Disillusionment, Slope of Enlightenment and Plateau of Productivity. Legend (years to mainstream adoption): less than 2 years; 2 to 5 years; 5 to 10 years; more than 10 years; obsolete before plateau. As of July 2011.]

Source: Gartner (July 2011)


The Priority Matrix


A characteristic feature of the Priority Matrix is that no transformational technologies are less than two years from mainstream adoption. This reflects the strong momentum of the CSP infrastructure market and the inherent difficulty of transforming networks quickly.

As the Priority Matrix shows, five of the included technologies are transformational in nature. Their transformational aspects relate to network performance (FTTH), the ability to create new services (Next-Generation Service Delivery Platforms) and changed business models (Network Sharing, Network Virtualization and Public Cloud Computing/the Cloud). These are technologies that will make the networks of 2015 to 2020 very different from those we know today.

The Priority Matrix also shows a number of technologies expected to have a high impact that will reach the Plateau of Productivity in less than two years. These technologies relate to basic network cost and performance (HSPA+), or enable new services and revenue streams (Mobile Application Stores, MPEG-4 Advanced Video Coding and Next-Generation Voice). These technologies will have a noticeable impact on CSP networks by 2012.

Further details of the reasons for the technology positions in the Priority Matrix are given in the individual entries in this document.


Figure 2. Priority Matrix for Communications Service Provider Infrastructure, 2011

Benefit: transformational
- 2 to 5 years: FTTH; Network Sharing; Network Virtualization; Next-Generation Service Delivery Platforms; Public Cloud Computing/the Cloud

Benefit: high
- Less than 2 years: DOCSIS 3.0 Cable; HSPA+; Mobile Application Stores; Mobile DPI; MPEG-4 Advanced Video Coding; Next-Generation Voice; Online Video; VDSL2
- 2 to 5 years: CDN for Fixed CSPs; Diameter Protocol; Integrated Policy and Charging Control Solutions; Interactive TV; Long Term Evolution; Mobile Advertising; Mobile CDN; Mobile Subscriber Data Management; MPLS-TP; OneAPI; Self-Organizing Networks; Socialcasting; VDSL2 Enhancements
- 5 to 10 years: Augmented Reality; Cellular to Wi-Fi Authentication; Cloud-based RAN; Contact Center Interaction Analytics; Convergent Communications Advertising Platforms (CCAP); Energy Management Gateways; TD-LTE
- More than 10 years: Cognitive Radio; LTE-A

Benefit: moderate
- Less than 2 years: 40 Gbps Transport; 802.11k-2008; Femtocells; Mobile TV Streaming; Network DVR; OTN and GMPLS/ASON; ROADMs; Video Telepresence
- 2 to 5 years: 802.22; 100 Gbps Transport; 802.11n; IMS; IPTV; IPX for LTE; Network Intelligence; RF Over Glass; Rich Communication Suite; Smart Antennas; White Spaces: Unlicensed Spectrum TV
- 5 to 10 years: 10G-PON; 4G Standard; Addressable TV Advertising; VoIP Wireless WAN; WDM-PON
- More than 10 years: 3D TV Services; Terabit-per-Second Transport

Benefit: low
- Less than 2 years: 802.11r-2008
- 2 to 5 years: 802.16-2009; IPv6

As of July 2011
Source: Gartner (July 2011)

Off The Hype Cycle


Mobile TV Broadcasting and Residential VoIP are considered mature technologies and have therefore been removed from the 2011 Hype Cycle.


On the Rise
Terabit-per-Second Transport
Analysis By: Peter Kjeldsen

Definition: We define terabit-per-second transport as systems with one or more wavelengths each operating at one terabit per second (1 Tbps) or above.

Position and Adoption Speed Justification: "Hero experiments," seeking to demonstrate ever higher bit rates per channel, have over the past few decades been a key ingredient in the innovation around optical transport systems. In March 2011, ZTE presented a post-deadline paper at the OFC/NFOEC 2011 conference demonstrating transmission of a 10 Tbps single-channel signal over 640 kilometers of fiber. While such impressive technological achievements attract significant interest and "R&D hype," it is still very early days in terms of "real market hype," which explains the position of the technology at the far left-hand side of the Hype Cycle. During the second half of the 1990s, there were similar announcements for 100 Gbps single-channel systems and, considering the time lag relative to the introduction of 100 Gbps systems in today's market, we are probably still more than 10 years away from commercially available terabit-per-second transport systems.

User Advice: Communications service providers should expect a continued requirement for bit-wise economies of scale in their networks, and should plan their network architectures and fiber deployments accordingly. With fiber life spans of 20 years or more, it is essential that deployments are made in a way that keeps investments sufficiently future proof.

Business Impact: Commercial transport systems with per-channel bit rates in excess of 1 Tbps may be hard to envision, but so were 100 Gbps systems 10 years ago. This technology may eventually offer a cost-effective means of addressing traffic growth, acting as an enabler for the continued expansion of network capacity.

Benefit Rating: Moderate

Market Penetration: Less than 1% of target audience

Maturity: Embryonic

Sample Vendors: Alcatel-Lucent; Ciena; ZTE

Cognitive Radio
Analysis By: Jim Tully; Sylvain Fabre

Definition: Cognitive radio dynamically identifies how spectrum is being used and chooses appropriate frequencies, protocols and modulation to coexist with other devices. Cognitive radio gives flexibility of operation that goes a long way beyond that offered by software-defined radio (SDR). SDR enables wireless devices to switch dynamically between protocols and frequencies, and is also a foundation that can be used to build a cognitive radio system. Cognitive radio is a more sophisticated concept than SDR, enabling devices to dynamically negotiate protocol and spectrum use, depending on the needs of these and other devices in the vicinity.

The motivation for cognitive radio stems from various measurements of spectrum utilization, which generally show that spectrum is under-utilized. This means that there are many "holes" in the radio spectrum that could be exploited by secondary users. The secondary user must exploit these spectrum opportunities without causing harmful degradation to the primary network. In cognitive radio, communication systems are aware of their internal state and environment, such as location and utilization of the radio frequency spectrum at that location, and can make decisions about their radio mode of operation by mapping that information against predefined objectives. Cognitive radio is further defined by many to utilize SDR, adaptive radio and other technologies to automatically adjust its behavior or operation to achieve desired objectives. The utilization of these elements is critical in allowing end users to make optimal use of available frequency spectrum and wireless networks with a common set of radio hardware. This will reduce cost to end users while allowing them to communicate with whomever they need, whenever they need to and in whatever manner is appropriate.

Position and Adoption Speed Justification: Cognitive radio is often viewed as an extension of SDR, but it is far more complex and applies to the entire system. SDR is a more localized technology, where base stations and handsets could adopt the technology at different times. Some handset suppliers could even utilize SDR within the same system while others do not.
Cognitive radio needs to be adopted by the entire system (infrastructure and handsets) at the same time. This systemwide dimension increases the complexity and the time scale considerably compared with SDR. Three major issues must be addressed in the development of cognitive radio systems:

■ Hardware and software architecture of handset devices. These require relatively high-performance digital signal processing technology with the added complexity of reconfigurable capabilities. Such technology typically consumes too much power for handsets.

■ Design of the polling and exploration scheme among neighboring devices. This requires agreement and standardization. It requires a solution that does not consume more bandwidth than is saved through the use of cognitive radio. The IEEE is active in the early stages of this work.

■ Legislation and regulation. New schemes need to be devised for the allocation and licensing of spectrum. These are likely to include agreements that give designated owners of spectrum the highest priority, while others can then use unused bandwidth.

These are formidable hurdles and the technology is unlikely to reach the mainstream before 2020.
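The spectrum-sensing step at the heart of cognitive radio, finding the under-utilized "holes" described above, can be sketched as a simple energy-detection scan. The power values and threshold below are illustrative assumptions, not measured data, and a real system must also vacate a channel when a primary user returns:

```python
# Minimal energy-detection sketch of the "spectrum hole" search a
# cognitive radio performs across a band of candidate channels.

def find_spectrum_holes(channel_power_dbm, threshold_dbm):
    """Return indices of channels whose measured energy is below the
    detection threshold, i.e. candidates for secondary use."""
    return [i for i, p in enumerate(channel_power_dbm)
            if p < threshold_dbm]

# Hypothetical sweep across eight channels, with -90 dBm as the decision
# threshold separating "occupied" from "idle".
sweep = [-55.0, -92.0, -61.0, -95.0, -93.0, -58.0, -90.5, -40.0]
holes = find_spectrum_holes(sweep, threshold_dbm=-90.0)
print(holes)  # [1, 3, 4, 6] -> channels a secondary user could borrow
```

Real detectors are considerably more subtle (fading can hide a primary transmitter below any fixed threshold), which is one reason the polling and standardization work noted above matters.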


User Advice: Cognitive radio can allow for more optimal use of finite radio resources, and improve the user experience, as radio access can occur on multiple networks, depending on what is currently available to the user.

Business Impact: End users will benefit from a seamless wireless experience, as it will become increasingly unlikely for them ever to be completely out of radio coverage. For communications service providers, this could allow for cheaper radio coverage, as multiple legacy access networks could be leveraged to achieve coverage and capacity.

Benefit Rating: High

Market Penetration: Less than 1% of target audience

Maturity: Embryonic

Sample Vendors: Alcatel-Lucent; Intel; NEC; ST-Ericsson; Thales

Network Virtualization
Analysis By: Akshay K. Sharma; Peter Kjeldsen

Definition: Network virtualization is the process of combining hardware and software network resources and network functionality into a single virtual network. This offers access to routing features and data streams, which can provide newer service-aware resilient solutions; newer security services that are native within network elements; subscriber-aware policy control for peer-to-peer traffic management; and application-aware, real-time session control for converged voice and video applications, with guaranteed bandwidth on demand.

Newer cloud computing architectures have emerged, with vast mega-plexes of massively parallel server farms in huge data centers. These mega-plexes comprise application servers typically running on 1 to 10 Gigabit Ethernet general-purpose computers with storage and high-performance processing power. However, while vendors like VMware have virtualized storage or applications that can be deployed across data centers, network connectivity between the data centers has been fixed. Network virtualization is a new concept whereby network connectivity is virtual (router-, port-, speed-, latency- and resiliency-agnostic), with the potential of a future on-demand model of network connectivity relying on real-time policy and charging.

For communications service providers (CSPs), the essential benefits of network virtualization can include:

■ Energy and space savings from converged network elements and an elastic network, rather than speed-specific nailed-up circuits connected to dedicated ports/routers.

■ New service networks introduced without new overlays.

■ Reduced total cost of ownership.


■ Risk mitigation.

■ Streamlined asset utilization.

■ Support of new business models, with on-demand real-time charging.

■ Improved profitability.

Position and Adoption Speed Justification: Cisco, Ciena and Juniper Networks have announced initial network virtualization capabilities; however, in the area of operations support systems/business support systems, newer solutions with real-time policy control are needed to allow this capability to be offered as a managed service.

User Advice: CSPs should consider network virtualization in the context of their overall network strategies, and monitor announcements from equipment vendors in this area. Enterprises should look to CSPs for leveraged data center offers based on network virtualization.

Business Impact: To begin with, this emerging technology will have a minor business impact for data center CIOs, until de facto standards emerge.

Benefit Rating: Transformational

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: Ciena; Cisco; Juniper Networks

Recommended Reading: "Dataquest Insight: Network Virtualization is An Emerging Trend in Data Center Communications"
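As a toy illustration of the elastic, on-demand connectivity model described in this section, the sketch below slices one physical link among several virtual networks instead of dedicating nailed-up circuits to each. Capacities and slice names are invented for the example:

```python
# Toy model of network virtualization: several virtual networks share
# one pool of physical link capacity, each drawing bandwidth on demand.
# Names and capacities are illustrative assumptions.

class PhysicalLink:
    def __init__(self, capacity_gbps):
        self.capacity = capacity_gbps
        self.allocated = 0.0

    def reserve(self, gbps):
        """Carve a bandwidth slice out of the shared physical capacity."""
        if self.allocated + gbps > self.capacity:
            raise ValueError("link oversubscribed")
        self.allocated += gbps
        return gbps

# One 100 Gbps physical link, sliced among three virtual networks
# instead of three speed-specific nailed-up circuits.
link = PhysicalLink(capacity_gbps=100)
slices = {"voice-vpn": link.reserve(10),
          "video-vpn": link.reserve(40),
          "data-vpn": link.reserve(30)}
print(link.capacity - link.allocated)  # 20.0 Gbps left for on-demand bursts
```

The point of the model is the headroom: unreserved capacity stays in a common pool that any virtual network can draw on, which is what enables the on-demand, real-time-charged connectivity the section describes.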

OneAPI
Analysis By: Mentor Cana

Definition: The increase in the use of mobile devices and mobile data, as well as the sophisticated over-the-top (OTT) services consumed by communications service providers' (CSPs') subscribers, has put pressure on CSPs globally: to innovate, to find new business models in order to monetize their network and operational infrastructure, and to offer new services to users. This pressure has led many CSPs to open up their network capabilities for consumption by third-party developers for integration into their services.

Many CSPs have been exposing limited network capabilities (mostly messaging, location and charging) through proprietary interfaces, either directly to third-party developers or through application programming interface (API) aggregators as intermediaries. While API aggregation via intermediaries helps third-party developers somewhat, by providing a layer of administrative abstraction, the underlying infrastructure is still fractured, because there are a number of aggregators and consolidators, each specializing in a limited set of APIs. Further, as the aggregators are aimed at third-party developers, they do not enable the use and reuse of services internally or between CSPs. The aggregators also dilute the branding and loyalty of any particular CSP; do not help with multivendor manageability; and do not reduce the cost and complexity of creating applications that span multiple CSPs (multiple interfaces for the same service across different CSPs).

Thus, the telecom industry, led by the Global System for Mobile Communications (GSM) Association (GSMA), has defined OneAPI as a set of standardized and lightweight Web-friendly APIs for CSPs to expose their network capabilities in a standardized way. OneAPI interfaces can be used internally by a CSP, between CSPs and, more importantly, by third-party providers and developers. Building on the earlier OneAPI specifications, which exposed messaging, location and charging capabilities, the current OneAPI specifications are at version 2.0, finalized in May 2011. OneAPI version 2.0 adds the following new capabilities: voice call control (enabling phone calls from Web, desktop and mobile phones), data connection profile, and device capabilities profile.

Position and Adoption Speed Justification: Network-exposure APIs such as OneAPI can add value to CSP networks, and help them compete with OTT providers and other CSPs. The API window of opportunity is open now, as mobile Internet and content services begin to take off. Given the rapid acceleration of mobility and mobile data use, and the overwhelming competition from OTT providers, Gartner estimates that the window of opportunity for CSPs is during the next three to five years, while content services and mobile Internet are maturing. After that, it will be more difficult to compete with established brands, and APIs will have become just a necessary feature. The business case for developing specific APIs will depend on the value of the applications and the user base that a provider can offer.
Currently, Aepona (http://oneapi.aepona.com), working with the GSMA, has enabled a reference implementation of OneAPI specification v.2.0, with the following capabilities exposed for testing: messaging (using Simple Object Access Protocol [SOAP] and Representational State Transfer [REST]), location (using SOAP and REST), data connection profile (using REST only) and charging (using SOAP only). The following CSPs have made their APIs available for messaging and location services in the GSMA's reference implementation (for testing purposes): Vodafone (Betavine), Orange, Telenor, Telecom Italia, Telus and T-Mobile (in the U.K., Austria, Germany and the Netherlands). For example, Vodafone (via its Betavine developer platform) has exposed limited messaging capabilities for commercial use via OneAPI. The Canadian CSPs Bell, Telus and Rogers have teamed up, in cooperation with Aepona, to provide a common platform for developers to access these major CSPs in Canada in a standardized way using the OneAPI implementation.

User Advice: Given the current context of constant service innovation, especially with the emergence of OTT and disruptive services related to mobility and mobile devices, CSPs may not have the choice of maintaining their current state and stagnating. Rather, by standardizing and opening up their network capabilities for consumption and integration beyond their immediate control, CSPs can avoid the risk of being marginalized and kept outside of the innovative service creation cycle and potential revenue stream. Thus, CSPs need to implement OneAPI specifications: to keep up with other CSPs already offering them, and to take advantage of the economies of scale in architecture and in user base.
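To make the REST flavor of the messaging interface concrete, the sketch below constructs (but does not send) an outbound-SMS request in the style of the OneAPI/ParlayREST binding. The host and phone numbers are placeholders, and the exact path segments should be checked against the specification version a given CSP deploys:

```python
import json
from urllib.parse import quote

# Sketch of a OneAPI-style outbound-SMS request over REST. Only the URL
# and JSON payload are built here; no network call is made.

def build_sms_request(host, api_version, sender, recipients, text):
    """Return (url, json_body) for a OneAPI-style SMS send request."""
    url = ("https://%s/%s/smsmessaging/outbound/%s/requests"
           % (host, api_version, quote(sender, safe="")))
    body = {"outboundSMSMessageRequest": {
        "address": recipients,                 # list of recipient URIs
        "senderAddress": sender,
        "outboundSMSTextMessage": {"message": text}}}
    return url, json.dumps(body)

url, body = build_sms_request("api.example-csp.com", "1",
                              "tel:+15550001111",
                              ["tel:+15550002222"], "OneAPI test")
print(url)
```

The same resource-oriented pattern (a capability, a version, an addressed resource, a `requests` collection) recurs across the location, charging and data-connection APIs, which is what makes the interfaces reusable across CSPs.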


The Wholesale Application Community (WAC), a global alliance that aims to create an open application ecosystem for mobile devices by bringing together application developers, independent software vendors, handset manufacturers, OS owners, CSPs and end users, plans to utilize the GSMA OneAPI standard to access CSPs' network layer. The Open Mobile Alliance (OMA) has also undertaken initiatives to expose Parlay X Web Services and ParlayREST via OneAPI.

Business Impact: CSPs can benefit from the adoption and implementation of OneAPI, because it can also be used in the future to enable them to interoperate and use each other's network capabilities. Ultimately, the real test will come from the developer community, and its willingness to adopt and use OneAPI interfaces to build applications and widgets. Uniformity in the application layer and the network-exposure layer will ensure real portability of applications.

CSPs can further benefit from the adoption and standardization of interfaces by applying service-oriented architecture principles to ensure that the capabilities and functionalities they expose are widely discoverable, usable (technology- and platform-agnostic), reusable, maintainable, and supportable over time. The open and direct relationship (based on standardized interfaces) among CSPs, third-party vendors and application developers will provide CSPs with a wealth of intelligence about the patterns of use of each individual service in the ecosystem (both internal and external). Based on this knowledge, CSPs can improve their services, and may also induce innovation in value and service co-creation with their partners and end users.
Benefit Rating: High

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: Aepona; Locatrix

Recommended Reading: "Competitive Landscape: SOA for Communications Service Providers, Worldwide"

"Market Trends: Worldwide, How and Why Communications Service Providers Are Developing APIs to Expose Their Networks, 2011 Update"

Cellular to Wi-Fi Authentication


Analysis By: Tim Zimmerman; Michael J. King

Definition: Cellular to Wi-Fi authentication provides a foundational component for dual-mode smartphones to move freely between cellular and Wi-Fi connectivity for voice and data applications. This multivendor, multiple-physical-layer authentication allows cellular connections to be transferred to Wi-Fi, as well as among multiple Wi-Fi vendors from one installation to another, whether it is a hot spot or, ultimately, within an enterprise. Mobile users, whose devices can move between 3G and Wi-Fi networks at a low level using an 802.21 handoff, also need a unified and reliable way to authorize their access to all of those networks. 802.11u provides a common abstraction that all networks, regardless of protocol, can use to provide a common authentication experience.

Position and Adoption Speed Justification: The ratification of 802.11u, which provides the necessary functionality for internetwork communication, such as network discovery and selection, has boosted the momentum for initiatives such as the Wi-Fi Alliance's Next-Generation Hotspot, or Hotspot 2.0. Initial vendor testing is just beginning with the Wireless Broadband Alliance, and will begin in the fall of 2011 for the Wi-Fi Alliance. Vendor trials will show the ability to authenticate, but vendors will still need to negotiate roaming agreements and to roll out the necessary hardware and software for the end-to-end solution for all components of the multivendor solution.

User Advice: Users should expect that, once the functionality has been agreed on by the industry, it will take time for the necessary hardware and software to be implemented. They should also expect issues with initial implementations of multivendor solutions in hot spots, as well as within the network infrastructure. Enterprises looking to use the technology additionally to provide a migration strategy for roaming among Wi-Fi vendors will have more initial success, since they control the end-to-end solution.

Business Impact: The ability to roam seamlessly from cellular to Wi-Fi will be significant, as Wi-Fi now provides 450 Mbps of connectivity through a single access point, and is architecturally capable of more than 1 Gbps of aggregate Wi-Fi connectivity through layering and load balancing using multiple access points. The offloading ability will provide a solution for density-rich environments, such as stadiums, metropolitan areas or university classrooms.

Benefit Rating: High

Market Penetration: Less than 1% of target audience

Maturity: Embryonic

Sample Vendors: Aruba Networks; AT&T; Cisco; Verizon
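One way to picture the selection step that 802.11u enables: before associating, the device compares the realms a hotspot advertises (via ANQP queries) against the credentials it holds, and only joins a network it can authenticate to without user intervention. The sketch below is a simplification of that matching logic; the realm names are hypothetical and the real protocol exchange is considerably richer:

```python
# Illustrative sketch of 802.11u-style network selection: match the
# device's home-operator realms against the realms each hotspot
# advertises, and pick the first network that can authenticate us.

def select_hotspot(home_realms, hotspots):
    """Return the SSID of the first hotspot advertising a realm the
    device holds credentials for, or None to stay on cellular."""
    for ssid, advertised_realms in hotspots:
        if home_realms & set(advertised_realms):
            return ssid
    return None

device_realms = {"csp.example.net"}          # from the SIM/credential store
scan = [("coffeeshop-free", ["guest.example.org"]),
        ("operator-hotspot", ["csp.example.net", "partner.example.com"])]
print(select_hotspot(device_realms, scan))   # operator-hotspot
```

The commercial hard part is not this matching but the roaming agreements behind it: a realm match is only useful if the hotspot operator will actually settle with the device's home CSP.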

Consumer Telepresence
Analysis By: Fernando Elizalde; Elia San Miguel

Definition: Consumer telepresence is a set of technologies that enables users to feel they are "present" during videoconferences. Telepresence has the potential to break down the consumer dissatisfaction that has been prevalent with video telephony. Telepresence solutions share common traits, including: life-size (or near life-size) image displays, high-quality audio, full-motion video capabilities and a minimal camera-to-eye angle to provide virtual eye-to-eye contact.

Telepresence systems are being driven by the enterprise market through companies such as Cisco, HP and Polycom. Consumer telepresence presents a greater challenge than its enterprise counterpart. While an enterprise system can cost hundreds of thousands of dollars and may require an ad hoc environment, a consumer system will need to embrace a variety of environments, in which the majority of costs can be absorbed by other home devices, such as high-definition television (HDTV), broadband and home theater sound systems.


Position and Adoption Speed Justification: Consumer telepresence market adoption is a long way off, despite the launch of much-hyped products by Cisco and Skype. These types of product will address only 1% of the addressable market: the "innovators" and those who are financially and technologically advanced. The quality of experience related to consumer telepresence video will be the primary requirement. The cost of high-end, stand-alone systems will be a deterrent for most consumers. Two models of telepresence system appear to be "shaping up" in the market:

■ The first is a managed service provided by carrier service providers. This type of service (Cisco is its biggest advocate) delivers a high-quality consumer experience, but is expensive.

■ The second is delivered over a broadband connection to an Internet-enabled HDTV using a kit. Panasonic, LG and Samsung have all introduced Skype-enabled HDTV sets to be commercialized within the year.

Both models need to incorporate the consumer's HDTV and home sound systems to provide the necessary sound quality at a price that is economically viable for the consumer market. In addition to software, a high-definition camera, high-definition sound and a quality microphone system will be required.

The first model, commercially introduced by Cisco in the U.S. in October 2010 under the umi brand, has not shown signs of strong acceptance. Cisco's umi was initially priced at $599 for the stand-alone unit, plus a $24.95 fee per month for unlimited calls, video messaging and video storage. However, as an indication of the lack of success of this model, Cisco reduced the price of the umi box to $499 and the service to $9.95 a month in March 2011, while in April 2011 it announced that it was moving the umi to its enterprise product line. Meanwhile, connected-TV manufacturers are already offering a Skype client that, with the use of a high-definition camera retailing for between $100 and $200, allows Skype video calls to be made on an HDTV set.

Only very few countries, and sections of their populace, will have the standard of living necessary to accommodate consumer telepresence within the next decade. It is suitable for some vertical segments, in which it may be adopted faster; for example, distance learning and remote health applications.

User Advice: For technology providers, consumer telepresence must adapt to the consumer's wallet; this will require the use of a range of different home entertainment systems. The experience that vendors are gaining in enterprise telepresence will be directly applicable to consumer telepresence, but only with regard to features such as quality of service. Consumer telepresence is a market worth studying as it evolves, but it will be many years before it becomes a viable market for many players.
Those first to market may have an advantage, but the real gains will come only when the experience offered approaches the feeling of "being there." Telepresence is about quality of experience; rudimentary solutions will not suffice.


Business Impact: Until truly affordable consumer telepresence systems come to market, rather than pared-down business systems for the elite, the eventual success of this technology will remain in question. Even the potentially less expensive Skype offering requires the purchase of new Internet-enabled HDTV sets and kits, but the markets most likely to adopt this technology have only just upgraded to "regular" HDTV.

Consumer telepresence's potential to deliver the experience of "being there" offers the promise of significant change in the dynamics of consumer communications. However, and most important, it is a technology solution that requires a change in consumer habits. This is a market fraught with peril for technology and service providers.

Benefit Rating: Low

Market Penetration: Less than 1% of target audience

Maturity: Embryonic

Sample Vendors: Cisco; HP; LG; LifeSize; Logitech; Panasonic; Samsung; Skype; Tandberg; Teliris

Recommended Reading: "Predicts 2011: CSPs Must Rethink Business Paradigms to Meet Market Challenges"

"Vendor Rating: Cisco"

3D TV Services
Analysis By: Fernando Elizalde

Definition: Three-dimensional television (3D TV) services deliver 3D images to TV sets using stereoscopic imaging, in which two slightly different images are superposed, one transmitted to each eye. Several technologies are currently used to deliver 3D images on TV sets. They fall into two broad groups: those that require glasses, and those that don't.

Position and Adoption Speed Justification: The resurgence of 3D screening, and the enormous commercial success of some recent 3D films in 2010, generated high expectations around the delivery of 3D content to the home TV. Hollywood studios are interested in extending the reach of the cinema experience to the home, because this will create an additional pay-TV revenue stream for their growing portfolio of 3D content. For example, Walt Disney's Pixar subsidiary has announced that it will produce all future content for 3D release. Similarly, Sony Pictures, Discovery Communications and IMAX have formed a joint venture to produce native 3D content, including original series.

At the same time, TV service providers are looking for the next premium TV experience after high-definition TV (HDTV). Several broadcasters and pay-TV operators have been trialing 3D TV services since mid-2009, and many have gone live from mid-2010 onward (most notable among these are the U.S. sports broadcaster ESPN and the U.K. satellite TV operator Sky). The BBC, the U.K.'s public broadcaster, has also announced that it will film the 2012 London Olympics in 3D.
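The stereoscopic principle is often carried today in a "frame compatible" form: the two views are packed side by side, at half horizontal resolution each, inside an ordinary HD frame, so existing transmission chains and set-top boxes need no change. A minimal sketch of that packing, with frames modeled as lists of pixel rows (toy data, not a broadcast format implementation):

```python
# Minimal sketch of side-by-side frame packing, a common "frame
# compatible" way to carry stereoscopic 3D inside an ordinary HD frame:
# each eye's view is horizontally subsampled to half width, then the
# two halves share one frame.

def pack_side_by_side(left, right):
    """Pack left/right views into one frame: left half | right half."""
    packed = []
    for l_row, r_row in zip(left, right):
        packed.append(l_row[::2] + r_row[::2])  # drop every other column
    return packed

left = [[1, 1, 1, 1], [2, 2, 2, 2]]   # toy 2x4 "left eye" frame
right = [[9, 9, 9, 9], [8, 8, 8, 8]]  # toy 2x4 "right eye" frame
print(pack_side_by_side(left, right))  # [[1, 1, 9, 9], [2, 2, 8, 8]]
```

The TV set (with the viewer's glasses) unpacks and re-expands the halves; the cost of the approach is the halved horizontal resolution per eye, one of the quality compromises discussed below.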


Most of the 3D services launched consist (as expected) of a single 3D channel for linear TV showing mixed content, rather than genre-dedicated channels. Content remains limited to sports events and documentaries. Some operators are starting to launch subscription-based 3D video-on-demand (VOD) services in France, Germany, Poland and other countries, while TV manufacturer Samsung has launched a 3D VOD service on its connected-TV platform. Broadcasters and pay-TV providers in geographies other than the U.S. and selected markets in Europe and Asia/Pacific will continue to concentrate on introducing HDTV for the next three to five years at least.

Current 3D TV technologies present pay-TV providers with a relatively easy setup to deliver the next significant consumer experience; compatible, in some cases, with the 3D-ready HDTV set-top boxes already deployed in some consumers' homes. However, the ready-to-deploy technologies require either the use of special glasses to view the content in 3D, or special glasses and a filter for the television screen (or even a new set), for an experience that, ultimately, may not be optimal. In any case, the lack of industry standards remains an issue that translates into limited deployment on fragmented, proprietary platforms.

Understandably, consumer electronics vendors are heavily advocating 3D TV services, so that they can introduce equipment to capture 3D content and products for the consumer household. Although 3D-ready TVs have started to come down sharply in price, sales have remained low, hindering adoption and indicating an initial lack of mass-market interest in 3D TV services. The biggest issues holding back anything like mass-market adoption are the need to wear special glasses to see the 3D effect, and the limited viewing angles at which one can appreciate it. 3D TV sets that don't require glasses won't be available at mass-market prices for many years to come.
User Advice: Early commercial launches in the U.S., the U.K., France and a few other countries will provide good early insight into how well consumers receive 3D TV services. Vendors and service providers interested in this market must set up a process to keep track of how the uptake of these services progresses, and remember that successful early trials do not always translate into commercial mass-market opportunities. Industry players must set up industry standards quickly, and avoid creating a market with fragmented technologies for 3D TV services.

Currently, the 3D content available for TV viewing is restricted to films and sports events, and is limited. There is a strong drive from film studios to release 3D content in the near future. The genres most suitable for 3D viewing are horror, sports, action, adult content and children's animation, plus certain types of performance, such as music and dance. These types of content are very suitable for VOD. The appeal of 3D for soap operas, situation comedies, reality shows and news remains questionable.

Business Impact: Consumer electronics manufacturers will continue to introduce 3D-ready equipment and to increase production of this product type in the coming years. However, demand


will be concentrated in markets where multi-TV-set households are starting to replace their second TVs, and where sports bars are popular.

Despite the hype and the initial commercial launches, it is not clear that the consumer market is ready for 3D TV yet. However, for the more specialized and less family-oriented content for which people are willing to pay for a better-quality experience (such as sports TV), 3D offers a chance for service providers to differentiate their services from those of their competitors and to drive additional revenue.

Benefit Rating: Moderate

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: ESPN; France Telecom; LG; Samsung; SingTel; Sky; Sony; Virgin Media

Recommended Reading: "Market Insight: 3D TV, Larger-Than-Life Expectations?"

"Dataquest Insight: 3-D TV; A Mass-Market Product or a Niche Technology?"

Diameter Protocol
Analysis By: Deborah Kish

Definition: Diameter is an authentication, authorization and accounting (AAA) protocol for computer and telecommunications networks, and is the successor to Remote Authentication Dial-In User Service (RADIUS). RADIUS is the legacy AAA solution for Password Authentication Protocol and Challenge Handshake Authentication Protocol. IP Multimedia Subsystem (IMS) architecture needs twice the security, efficiency, reliability and flexibility; hence the name (a diameter is twice a radius).

The main features of Diameter that overcome the limitations of RADIUS include operation over reliable connections (TCP/Stream Control Transmission Protocol [SCTP]). RADIUS operates over User Datagram Protocol, which does not provide reliable connections, whereas Diameter operates over either TCP or SCTP, which are used widely in the all-Internet-Protocol (IP), service-oriented IMS and Long Term Evolution (LTE) architectures. Diameter supports 32-bit vendor-specific attributes, which translates into greater efficiency than RADIUS, which supports only eight bits. Diameter supports more pending AAA requests. Diameter's challenge/response attributes can be secured using end-to-end encryption and authentication, and "keep alive" messages indicate that a server is going down for a period of time. The combination of reliable transport and keep-alive messages allows Diameter-based systems to detect and recover from server failures more efficiently.

Position and Adoption Speed Justification: Solutions using the Diameter protocol have been slowly increasing since 2009 and, with the onset of smartphone and device adoption and the expected increase in mobile data connections, it is clear that the RADIUS protocol will not be sufficient to handle the increasing number of AAA requests. Gartner expects that mobile data traffic will grow at a compound annual rate of 91% between 2010 and 2015. With CSPs ramping up their


LTE infrastructures, we expect that the time to the Plateau of Productivity for the Diameter protocol will be two to five years.

User Advice: Carriers should create a sequential road map for controlling the traffic explosion in a planned way, and CSPs must be careful to provide adequate levels of control over subscribers and the infrastructure. There is no "one solution fits all," due to consolidation among CSPs and the inheritance of varying technologies and solutions. So CSPs must use a multiplicity of tools to address the continuous increase in data consumption; for example, tariff changes, policy management, traffic shaping and optimization, as well as adding new infrastructure, using offload and leveraging spectrum. CSPs that have technology and network road maps targeted at increasing mobile subscriptions, with immediate goals of high quality of experience and service, should look to vendors with routing solutions using the Diameter protocol, as it will be useful in scenarios such as policy enforcement, admission control and service provisioning.

Business Impact: When implementing an IMS or LTE network, as traffic levels grow, the lack of a capable signaling infrastructure poses a number of challenges from which Diameter protocol solutions promise to provide relief. While LTE defines Diameter-based rather than Signaling System 7 (SS7) interfaces, most of the functionality performed by SS7 will have to be carried through Diameter. As the adoption of Diameter-based solutions grows, the number of interfaces and the complexity will increase, creating the need to optimize networks for Diameter-related tasks. Solutions using Diameter signaling relieve LTE and IMS endpoints of routing, traffic management and load-balancing tasks, and provide a single interconnect point to other networks. The resulting architecture enables IP networks to grow incrementally and systematically to support increasing service and traffic demands.
Diameter routing can also add other advanced network functionalities, such as address resolution, Diameter interworking and traffic steering. Additionally, the Diameter protocol supports subscriber roaming through Diameter policy exchange across CSPs, as it is better suited to handle the interdomain exchange of user information.

Benefit Rating: High

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: F5; HCL Technologies; Metaswitch Networks; Radvision; Tekelec
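As a concrete illustration of the protocol plumbing discussed in this section, the sketch below packs the fixed 20-byte Diameter message header defined in RFC 3588: version, 3-byte message length, command flags, 3-byte command code, application ID, and the hop-by-hop and end-to-end identifiers used for routing and matching answers to requests. The field values are illustrative, not taken from a live deployment:

```python
import struct

# Pack the fixed 20-byte Diameter message header (RFC 3588).
# Length and command code are 24-bit fields, hence the hi-byte/lo-word
# split in the struct format string.

def diameter_header(length, flags, command, app_id, hop_by_hop, end_to_end):
    return struct.pack("!BBHBBHIII",
                       1,                                  # version is always 1
                       (length >> 16) & 0xFF, length & 0xFFFF,
                       flags,
                       (command >> 16) & 0xFF, command & 0xFFFF,
                       app_id, hop_by_hop, end_to_end)

# Capabilities-Exchange-Request: command code 257, Request flag 0x80,
# application ID 0 (Diameter common messages); header-only, so length 20.
hdr = diameter_header(20, 0x80, 257, 0, 0x1234, 0x5678)
print(len(hdr))  # 20
```

The hop-by-hop and end-to-end identifiers in this header are precisely what Diameter routing agents rewrite and track when they relieve LTE and IMS endpoints of routing and load-balancing tasks.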

CMTS Bypass
Analysis By: Ian Keene
Definition: Cable modem termination system (CMTS) bypass is an architecture that enables the deployment of IPTV technology on a cable network. This architecture redirects video traffic away from the CMTS unit that handles the Internet Protocol (IP) data traffic and, instead, routes it through edge quadrature amplitude modulation (EQAM) units. CMTS bypass can reduce the load on CMTS ports by redirecting the cable operator's IPTV traffic. This approach maintains separate streams for video and data, even if the video is delivered as IP-based video. The alternative for cable operators is to pass all IPTV traffic through the CMTS ports.
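The routing decision at the heart of this architecture can be sketched as a simple classifier: managed IPTV flows (here identified by a hypothetical multicast group range) are steered to an EQAM, while everything else stays on the CMTS. The function name and address range below are illustrative assumptions, not part of any cable specification.

```python
import ipaddress

# Hypothetical multicast range reserved for the operator's managed IPTV service.
IPTV_GROUPS = ipaddress.ip_network("239.10.0.0/16")

def egress_device(dst_ip):
    """Steer managed IPTV flows to the EQAM; all other traffic to the CMTS."""
    addr = ipaddress.ip_address(dst_ip)
    if addr in IPTV_GROUPS:
        return "EQAM"   # broadcast/on-demand IPTV bypasses the CMTS ports
    return "CMTS"       # Internet data and OTT video stay on DOCSIS ports
```

Note that OTT video arrives as ordinary Internet traffic, so a classifier like this cannot offload it; that is precisely the limitation discussed later in this section.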
Page 24 of 139
Gartner, Inc. | G00214660

CMTS bypass gained some traction from 2006 to 2009, when EQAM units were significantly less expensive than CMTS hardware. It remains a valid solution for cable operators looking to deliver on-demand IPTV services. However, as CMTS costs fall over time, the attraction of CMTS bypass fades somewhat.
Position and Adoption Speed Justification: There are a number of deployments of CMTS bypass-based IPTV in existence, mainly by small and midsize cable providers. The concept of delivering IPTV via cable networks is an approach being considered for the evolution of multisystem operator (MSO) architecture into a converged IP-oriented model. MSOs face a dual challenge to their business models: on the one hand, demand for high-definition content is increasing and taxing the capacity of MSOs' broadcast video delivery architectures; on the other hand, competition from wireline communications service providers (CSPs) with fiber-to-the-home and VDSL2 access networks is raising the bar for consumers' expectations of data bandwidth. CMTS bypass offers a potentially cost-effective way to address these issues. Networkwide implementations of CMTS bypass by large MSOs are doubtful during 2011 to 2012, but large MSOs will study the deployments by smaller cable providers. The long-term vision of most cable operators is moving toward an all-IP network. In the meantime, many are looking to provide on-demand IPTV to their subscribers, and CMTS bypass is one option for achieving this. However, doubt is cast on CMTS bypass ever reaching the Plateau of Productivity and mainstream deployment for the following reasons:

- Data-Over-Cable Service Interface Specification (DOCSIS) 3.0 has reached early mainstream adoption, delivering higher subscriber bandwidth through the DOCSIS 3.0 CMTS. An increasing amount of this available bandwidth is being consumed by over-the-top (OTT) video; hence, the relative bandwidth savings from CMTS bypass of IPTV traffic are being continually eroded.
- CMTS costs have fallen over time and are expected to continue to do so; hence, any lower-cost attraction of CMTS bypass is diminished.
- A growing number of MSOs are moving to support IPTV delivery to a wide selection of devices, such as smartphones and PCs, not just set-top boxes.
- In June 2011, CableLabs released an architectural overview of the Converged Cable Access Platform (CCAP) specification. Backed by Comcast, Cox Communications and Time Warner Cable, among others, CCAP integrates the functions of EQAM and DOCSIS 3.0, plus the ability to support passive optical network in the same chassis at the headend. This is likely to be favored over CMTS bypass in the three-year to five-year time period if, and when, products are brought to market.

User Advice: Larger service providers should examine reference points from early adopters as they consider CMTS bypass as one of a number of existing solutions for the challenges of bandwidth optimization and video delivery. While there may be some benefits in deploying CMTS bypass, the implementation and management processes demand caution. Smaller players can be more aggressive in deploying the technology, since it offers some potential benefits at a reasonable price and, with smaller footprints, integration and implementation risks are more easily mitigated. Take your OTT video strategy into consideration.


Business Impact: Smaller players will benefit from a cost-effective strategy of service expansion. In the medium term, the CMTS bypass architecture may prove very beneficial in reducing costs by optimizing network capacity and offloading video traffic in IP format from the CMTS platform. However, this offload does not apply to OTT video, which is the fastest-growing type of video traffic in broadband networks. Additionally, MSOs (as well as wireline CSPs) are looking at OTT video as a complementary delivery tool for video programming, mostly as part of a convergence strategy. This limitation looms large as MSOs analyze their alternatives for moving to an IPTV architecture, a move that seems inevitable at some point as convergence becomes a more important factor in the evolution of video delivery business models.
Benefit Rating: Moderate
Market Penetration: 1% to 5% of target audience
Maturity: Emerging
Sample Vendors: Aurora Networks; BigBand Networks; Harmonic; Motorola
Recommended Reading:
"Forecast Analysis: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"
"Forecast: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"

WiMAX 802.16m
Analysis By: Joy Yang; Phillip Redman
Definition: In April 2011, the Institute of Electrical and Electronics Engineers (IEEE) approved WiMAX 802.16m, also called Mobile WiMAX Release 2 or WirelessMAN-Advanced. WiMAX 802.16m is a proposed technology for next-generation high-speed services. It is being prepared and submitted to the International Telecommunication Union (ITU) as a candidate for standardization as International Mobile Telecommunications-Advanced (IMT-Advanced), or fourth-generation (4G) wireless communications, together with Long Term Evolution-Advanced (LTE-Advanced). The final decision on the 4G standard is planned for release in 2012. The specification continues to evolve, but currently includes a 100 Mbps downlink in mobile situations and a 1 Gbps downlink in nomadic situations. The group behind the proposal, the IEEE 802.16 Broadband Wireless Access Working Group, has defined the main principles for ".m." These include backward compatibility with the WiMAX 802.16e-2005 standard, support for spectrum up to 100 MHz, multiple input/multiple output (MIMO), and time division duplexing/frequency division duplexing (TDD/FDD) modes. As well as meeting the requirements for 4G mobile networks and ensuring backward compatibility with existing WiMAX technology, the WiMAX 802.16m working group aims to achieve:

- High spectral efficiency and voice over Internet Protocol (VoIP) capacity, leading to data throughput rates of up to 1 Gbps.
- Improved cell coverage, with "optimized" performance within 5 km, "graceful" performance at 5 km to 30 km, and "functional" performance at 30 km to 100 km.


- Better handover capabilities than 802.16e when users are on the move; the technology should be able to maintain connections even when people are traveling in vehicles at speeds of 120 km/h to 350 km/h.
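The coverage goals above amount to a simple distance-to-tier mapping. The sketch below encodes it; the tier labels follow the working group's wording, but the function itself is purely illustrative.

```python
def coverage_tier(distance_km):
    """Map cell range to the 802.16m performance tier targeted by the working group."""
    if distance_km <= 5:
        return "optimized"    # full performance targets apply
    if distance_km <= 30:
        return "graceful"     # performance degrades gracefully with range
    if distance_km <= 100:
        return "functional"   # connectivity maintained at reduced performance
    return "out of scope"     # beyond the stated 100 km design goal
```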

The key technologies used in WiMAX 802.16m are orthogonal frequency division multiple access (OFDMA), MIMO, smart antennas, carrier aggregation, relay and intercell interference coordination. WiMAX 802.16m is being developed to support TDD, FDD and half-duplex FDD (H-FDD) schemes, and to operate in licensed spectrum allocated for mobile and fixed broadband services and future IMT-Advanced services.
Position and Adoption Speed Justification: Although the WiMAX 802.16m specification has been approved by the IEEE and has been chosen as one of two candidates for the ITU's 4G standard, WiMAX will face challenges because of the strength of vendor support for, and carrier adoption of, TD-LTE. Infrastructure and terminal vendors have invested less in WiMAX than in LTE, and Gartner expects even less investment in 802.16m. Currently, 802.16e-2005 has had limited rollout and support from Tier 1 operators. As many WiMAX operators are already indicating a migration to LTE (Clearwire in the U.S., for example), there is a decreasing chance that WiMAX 802.16m will ever be implemented, even if it is considered a true 4G alternative. The delayed broadband wireless access (BWA) licenses in India, which were released in late 2010, also eliminated the potential WiMAX opportunity, since most of the Indian BWA license holders have chosen to go with TD-LTE.
User Advice: WiMAX technology is being challenged by the rapidly maturing TD-LTE technology. WiMAX 802.16e has appeal as a last-mile access technology for fixed broadband networks in emerging markets, and still has its niche market. However, 802.16m competes directly with TD-LTE or TD-LTE-Advanced, which leaves little chance for 802.16m. The business case for WiMAX 802.16m as a mobile solution is not clear. Also, there is weak support from the ecosystem, including infrastructure, terminal and chipset vendors.
Business Impact: The IEEE plans to add features to WiMAX 802.16m to meet operators' requirements for quality-of-service management, location-based services, self-organization, security and interoperability with Wi-Fi networks and femtocells, among other things. This will improve WiMAX's ability to advance from current IT-grade services to telco-grade services, which will help WiMAX's market position. Assuming that the standard also supports full backward compatibility, including network and device support for 3G cellular systems, WiMAX 802.16m could become a competitive alternative to LTE-Advanced systems and could be used as a wireline replacement in many cases.
Benefit Rating: Moderate
Market Penetration: Less than 1% of target audience
Maturity: Embryonic
Sample Vendors: Alvarion; Huawei; Motorola; Samsung; ZTE


LTE-A
Analysis By: Sylvain Fabre
Definition: Long Term Evolution-Advanced (LTE-A), the 3rd Generation Partnership Project (3GPP) Release 10, is intended to be the first version fully compliant with the International Telecommunication Union Telecommunication Standardization Sector (ITU-T) specification for fourth-generation (4G) systems; LTE itself is technically still 3G/WCDMA. The targeted peak rate is 1 Gbps for the downlink and greater than 500 Mbps for the uplink for stationary devices. This should be achieved with scalable usage of up to 100 MHz of spectrum. LTE-A should support various cell types, including picocells and femtocells (the segment now called "small cells"), and should improve uplink speeds, as well as relay technologies to improve coverage. LTE-A should be backward-compatible with LTE Release 8. Some functionality proposals include relay nodes, flexible spectrum usage and cognitive radio. Relays use over-the-air (OTA) links to macro base stations as backhaul connections, so OTA backhaul will need to be taken into consideration when choosing between relay access and direct access. Backhaul could pose challenges, as devices may get hundreds of Mbps, up to a theoretical limit of 1 Gbps, although no application needs hundreds of Mbps per user, and most backhaul networks cannot handle 1 Gbps. It is also worth noting that direct access to macro base stations by user equipment in relay coverage may cause significant interference with relay base stations. The scoping of LTE-A is part of the ITU-T specification of 4G.
Position and Adoption Speed Justification: LTE-A standardization is likely to be complete by 2011 at the earliest. Therefore, certified infrastructure network equipment for LTE-A can be expected 18 to 24 months after standardization is complete, which means 2013.
After 2013, trials, early commercial rollouts and upgrades to LTE systems will start, and mass-market deployment will happen during the following five years, until around 2018. However, the timeline for LTE-A could vary and depends on the success of LTE, which may still take time, as adoption of a new wireless generation can easily take up to a decade, as was the case for Global System for Mobile Communications (GSM) and wideband code division multiple access (WCDMA). We expect LTE-A to focus initially on urban areas of high data usage.
User Advice: It is too early for enterprises to plan for LTE-A. Operators procuring LTE equipment should ensure that the equipment can be upgraded to LTE-A, or at least that their vendors are designing LTE-A for backward compatibility with LTE, as today's CSPs struggle with WCDMA and LTE interworking. Although equipment vendors are likely to yet again promise software-only upgrades, CSPs should plan for some hardware changes on radio sites, as well as another quantum leap in transport capacity needs.
Business Impact: LTE-A aims mainly at high-speed wireless data for low-mobility users. Network sharing as a way for carriers to save money will be part of LTE-A, and could lead to new operational models for mobile networks (some LTE networks will already rely on sharing, such as LightSquared-Sprint in the U.S. or Yota in Russia). LTE-A also has implications for relay functions to create mesh networks and base station routing, which will minimize backhaul transport for nearby peer-to-peer traffic. Consumer and enterprise users can expect LTE-A to be positioned as a premium service initially.
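As a rough illustration of where the 1 Gbps target comes from, peak rate scales with aggregated bandwidth times peak spectral efficiency. The figures below (five aggregated 20 MHz carriers and a 10 bps/Hz peak spectral efficiency from high-order MIMO) are illustrative assumptions chosen to reproduce the stated target, not 3GPP numbers.

```python
def peak_rate_mbps(carriers, carrier_mhz, spectral_eff_bps_hz):
    """Peak rate = aggregated bandwidth x peak spectral efficiency."""
    aggregated_mhz = carriers * carrier_mhz
    return aggregated_mhz * spectral_eff_bps_hz  # MHz * bps/Hz = Mbps

# Five aggregated 20 MHz carriers at an assumed 10 bps/Hz peak efficiency
# reproduce the 1 Gbps (1,000 Mbps) downlink target quoted above.
rate = peak_rate_mbps(carriers=5, carrier_mhz=20, spectral_eff_bps_hz=10)
```

This back-of-the-envelope view also explains the backhaul concern in the definition: a single site approaching these rates implies transport capacity well beyond most deployed backhaul links.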


Benefit Rating: High
Market Penetration: Less than 1% of target audience
Maturity: Embryonic
Sample Vendors: Alcatel-Lucent; Ericsson; Fujitsu; Huawei; NEC; Nokia Siemens Networks; Samsung; ZTE
Recommended Reading:
"Magic Quadrant for LTE Network Infrastructure"
"Market Trends: Demand for LTE Infrastructure From CSPs, 2011"
"Market Trends: Worldwide, How Mobile CSPs can Control the Mobile Data Explosion, 2011"
"Dataquest Insight: LTE and Mobile Broadband Market, 1Q10 Update"
"Emerging Technology Analysis: Self-Organizing Networks, Hype Cycle for Wireless Networking Infrastructure"
"Dataquest Insight: IPR Issues Could Delay Growth in the Long Term Evolution Market"

Smart Antennas
Analysis By: Deborah Kish
Definition: Smart antennas provide the signal-processing function behind antenna arrays. The technology has two basic functions: beamforming and identifying the direction of arrival of signals. Beamforming in cellular networks has advanced through the various generations of mobile technology to reach higher-density cells with higher throughput, and it has played an integral part in the migration toward third-generation networks. Based on the calculated direction of arrival of the signal, smart antennas can optimize the transmitter antenna beam, maximizing the energy directed to the subscriber radio and minimizing the energy radiated to other subscriber radios. Smart antennas can therefore increase the signal-to-interference-and-noise ratio and the channel capacity. Determining the direction of the signal is important because portions of the signal are scattered, and the late arrival of scattered signals causes problems such as fading, cutout and intermittent reception. The use of smart antennas can reduce, or eliminate, the trouble caused by multipath wave transmission. Smart antennas can be considered a space division multiple access technology that will significantly increase capacity in the same spectrum. It is becoming a key technology in Time Division-Synchronous Code Division Multiple Access, WiMAX and Long Term Evolution (LTE). Smart antennas have become the basis of multiple input/multiple output (MIMO) technology, especially in WiMAX and LTE. However, here we are referring to smart antennas specific to cellular systems (similar solutions are seen in neighboring areas, for example, MIMO for Wi-Fi [802.11n]). MIMO technology places multiple antennas at both the source and the destination, eliminating the negative effects of multipath transmission. MIMO can also be advantageous in femtocell deployment, enabling a femtocell to switch between providing high data rates and strong transmission. Smart antennas are positioned as a low-cost alternative to deploying additional cell sites.
Position and Adoption Speed Justification: Wireless communications are increasing due to growth in subscriptions, the provisioning of advanced new services, and an increase in data and multimedia traffic. The adoption of Apple's iPhone, Android-based phones and other devices with mobile connectivity, such as iPads and tablets, has created service issues due to the increased traffic from mobile Internet usage, mobile video streaming and application store activity, as well as everyday SMS traffic. The pressure on communications service providers (CSPs) to provide subscribers with more reliable services from a coverage perspective, amid increased mobile usage, is leaving them increasingly compromised. CSPs need solutions that will assist with backhaul and traffic offload. Smart antennas have been deployed in most large wireless carrier networks to improve capacity and tackle overall quality of service (QoS) issues caused by increasing voice and data traffic. Smart antennas will become popular for mobile networks as LTE becomes more widely deployed; the module that supports real-time communications processing will be mandatory technology once LTE build-outs become more apparent. As the technology becomes more popular, it will be important for major chip vendors to produce smart antenna technology modules.
User Advice: Adding more antenna arrays, with smaller beam-width antennas, would help to improve channel capacity. However, it would also significantly increase the complexity of signal processing. In other areas of telecommunications, the notion of equipment sharing has been a point of discussion, and the same could apply to smart antennas. Over time, smart antennas have gained additional applications, such as sharing, and they can also be found as virtual antennas embedded in various residential wireless gateways. Wireless carriers can benefit from smart antennas as they maximize spectral efficiency and provide improved coverage and QoS, reducing customer churn and increasing subscriber levels. Consider smart antennas for cellular systems, as MIMO for Wi-Fi (802.11n) is already climbing toward the Plateau of Productivity.
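In its simplest form, the beam steering described above reduces to computing phase weights for a uniform linear array. The delay-and-sum sketch below (assuming half-wavelength element spacing; all names are illustrative) shows how the array response concentrates energy toward the chosen steering angle and suppresses it elsewhere, which is the mechanism behind the interference and capacity gains discussed in this section.

```python
import numpy as np

def steering_weights(n_elements, theta_deg, spacing_wavelengths=0.5):
    """Delay-and-sum weights for a uniform linear array steered to theta."""
    k = np.arange(n_elements)
    phase = -2j * np.pi * spacing_wavelengths * k * np.sin(np.radians(theta_deg))
    return np.exp(phase) / n_elements  # normalized so peak gain is 1

def array_response(weights, theta_deg, spacing_wavelengths=0.5):
    """Magnitude of the array output for a plane wave arriving from theta."""
    k = np.arange(len(weights))
    v = np.exp(2j * np.pi * spacing_wavelengths * k * np.sin(np.radians(theta_deg)))
    return abs(np.dot(weights, v))

# Steer an 8-element array to 30 degrees: the response is maximal on the
# steering angle and suppressed for arrivals well off the beam.
w = steering_weights(8, theta_deg=30)
on_beam = array_response(w, 30)
off_beam = array_response(w, -40)
```

Direction-of-arrival estimation, the second function named in the definition, is the inverse problem: searching over angles for the response peak of the received snapshot.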
Business Impact: From a network perspective, smart antennas have been very useful in increasing channel capacity. If handset vendors could introduce the same technology into their devices, it would improve radio performance and help to reduce the power consumption of handsets. Increased broadband penetration may also provide adoption opportunities for smart antenna technology in relation to femtocells. While smart antennas are a good solution to traffic-related issues in mobile communications, femtocells are an emerging alternative within the enterprise, providing backhaul assistance.
Benefit Rating: Moderate
Market Penetration: 1% to 5% of target audience
Maturity: Emerging
Sample Vendors: Andrew Solutions; Antenova; ArrayComm; Ruckus Wireless; Westell; ZTE


VDSL2 Enhancements
Analysis By: Ian Keene
Definition: Very-high-bit-rate DSL (VDSL2) started to gain significant commercial traction in 2008 and 2009, with deployments in networks worldwide. Since then, shipments have continued to grow as a percentage of the total DSL market. Current VDSL2 deployments typically supply subscriber download speeds of 20 Mbps to 50 Mbps, depending on the length of the copper local loop and the quality of the cable. However, an increasing number of communications service providers (CSPs) are concerned that, in the future, such bandwidth will not be sufficient to compete and to deliver new high-bandwidth services to subscribers, both consumer and business. One solution is to deploy fiber directly to the subscriber, but recent enhancements to basic VDSL2 technology promise to increase bandwidth to the point where fiber deployment, and its cost, may not be necessary. Recent laboratory demonstrations and field trials of VDSL2 enhancements have used one or more of the following technologies:

- Copper twisted-pair bonding (International Telecommunication Union [ITU] standard G.998.2), which utilizes two or more copper twisted pairs to increase the aggregate bandwidth to the subscriber.
- "Vectoring" (ITU standard G.993.5), which enhances the cancellation of crosstalk between neighboring copper pairs in a cable, enabling higher bandwidths over longer distances.
- DSL "phantom mode" techniques that create a third virtual pair from two bonded copper pairs.

For example, in 2009, Ericsson demonstrated 500 Mbps over 500 meters using six bonded pairs and vectoring. In April 2010, Alcatel-Lucent demonstrated 300 Mbps over 400 meters using two bonded pairs, vectoring and phantom mode. Since then, Huawei and Nokia Siemens Networks (among other vendors) have demonstrated significantly improved speeds over basic VDSL2. During late 2010 and early 2011, vendors have been developing and productizing these demonstrated technologies. Vendors are keen to demonstrate high user bandwidths for their VDSL2 solutions to help them compete on functionality rather than price as VDSL2 moves to a higher-volume, more commoditized market position. Copper twisted-pair bonding requires that at least two twisted pairs be available for a subscriber; while some networks have this spare capacity, others do not. Vectoring requires a large amount of processing, which increases as the square of the number of active twisted pairs in a cable. Phantom mode is a technique that was used for creating extra lines in old analog networks, and is now being applied to DSL lines.
Position and Adoption Speed Justification: In terms of practical subscriber deployments, these laboratory demonstrations need to be developed and proven in the field. User bandwidth will depend on the length of the copper local loop, the quality of the installed cables containing many copper pairs, and the availability of redundant copper pairs to enable bonding. Current vectoring chipsets can scale up to 32 lines, but for large-scale deployment, products that can process 100 or more lines will be needed. CSPs are also likely to want system-level vectoring, as opposed to the current line card vectoring. Initial products suitable for high-volume deployment are expected in 2012 to 2013, with anticipated subscriber bandwidths of 50 Mbps to 150 Mbps. However, the cost and power consumption of the products currently under development are unclear and may not meet CSPs' needs. In the meantime, available solutions may prove attractive for fiber-to-the-building architectures where the building serves multiple subscribers via VDSL2. With many CSPs building out their fiber networks much closer to subscriber premises, VDSL2 enhancement product development will increasingly become an interesting technology to track and test. Many CSPs will view this as an incremental opportunity to improve service revenue alongside the deployment of gigabit passive optical network technology to replace their ADSL or ADSL2 networks.
User Advice:
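The "square of the number of active pairs" observation is easy to quantify: joint crosstalk cancellation needs on the order of one coefficient per disturber pair for every victim pair. The sketch below uses this simple n-squared model (a deliberate simplification, not a chipset specification) to show why scaling from today's 32-line chipsets to 100 or more lines is roughly a tenfold jump in processing.

```python
def vectoring_coefficients(active_pairs):
    """Crosstalk-canceller coefficients grow as the square of the line count."""
    # Each victim pair needs a cancellation coefficient for every other
    # (disturber) pair in the same cable binder.
    return active_pairs * (active_pairs - 1)

small = vectoring_coefficients(32)    # today's chipset scale: 992 coefficients
large = vectoring_coefficients(100)   # large-scale deployment: 9,900 coefficients
```

This scaling is why the section flags cost and power consumption as the open questions for high-line-count vectoring products.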

- CSPs should consider VDSL2 enhancements both as complementary to and as an alternative to fiber-to-the-home deployments. Monitor vendor progress with product development and conduct field trials.
- When considering the business viability of VDSL2 enhancements, CSPs need to take into account both capital expenditure and any increased operating expenditure.
- CSPs should initially consider VDSL2 enhancements, particularly bonding and phantom mode, to enhance bandwidth availability to business customers.
- Vendors need to focus on developing the components to deliver cost-effective, high-line-count vectoring solutions.

Business Impact: If low-cost, high-performing products can be developed during 2011 to 2013, VDSL2 enhancements will prove attractive to many CSPs as a way of providing higher-bandwidth services without the need to bring fiber all the way to the home. The technology may also have an impact in providing cellular network backhaul.
Benefit Rating: High
Market Penetration: Less than 1% of target audience
Maturity: Emerging
Sample Vendors: Alcatel-Lucent; ECI Telecom; Ericsson; Huawei; Nokia Siemens Networks
Recommended Reading:
"Emerging Technology Analysis: Next-Generation Broadband Access Caters for End-User Bandwidth Appetite"
"Forecast Analysis: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"
"Forecast: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"

Mobile CDN
Analysis By: Akshay K. Sharma

Definition: A mobile content delivery network (CDN) is used to improve performance, scalability and cost efficiency. It is like a traditional fixed CDN, but with added intelligence for device detection and content adaptation, and technologies to solve the issues inherent in mobile networks, which have high latency, higher packet loss and huge variation in download capacity. Mobile CDNs are a type of distributed computing infrastructure, where devices (servers or appliances) reside in multiple points of presence on multihop packet-routing networks (such as the Internet) or on private WANs, but are used to deliver mobile network services. A mobile CDN offloads origin servers via edge caching, and offers improved latency via closer proximity to the user, as well as intelligent optimization techniques. A mobile CDN can be used to distribute rich media, such as audio and video, as downloads or streams, including live streams. It can also be used to deliver software packages and updates as part of an electronic software delivery solution. Finally, a mobile CDN may also provide services such as global load balancing, Secure Sockets Layer acceleration and dynamic application acceleration via optimization techniques. In the media sector, all these uses are common, and rich-media delivery via progressive downloading is the most frequently used service. Use of all of these services is also common within the e-commerce industry, where content offload is the most frequently used service. While common in fixed networks, the mobile CDN is emerging as a new technique in mobile core networks. Although some elements of mobile offerings, such as bandwidth optimization through caching and video transcoding, may be more established, the umbrella concept of a mobile CDN with global load balancing (redirecting sessions to optimal websites as people roam) is relatively new.
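Edge caching, the core offload mechanism described above, can be sketched as a small least-recently-used (LRU) cache sitting in front of the origin server. The class, capacity and URLs below are illustrative only; production CDN nodes add admission policies, TTLs and device-aware content adaptation on top of this basic loop.

```python
from collections import OrderedDict

class EdgeCache:
    """Minimal LRU cache modeling a CDN edge node's origin offload."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()
        self.hits = self.misses = 0

    def fetch(self, url, origin):
        if url in self.store:
            self.store.move_to_end(url)    # mark as most recently used
            self.hits += 1
            return self.store[url]
        self.misses += 1                   # cold content: fetch from origin
        content = origin(url)
        self.store[url] = content
        if len(self.store) > self.capacity:
            self.store.popitem(last=False) # evict the least recently used asset
        return content

# Popular assets are served from the edge; only cold content hits the origin.
cache = EdgeCache(capacity=2)
origin = lambda url: f"payload:{url}"
cache.fetch("/promo.mp4", origin)   # miss, pulled from origin
cache.fetch("/promo.mp4", origin)   # hit, served from the edge
```

The hit ratio this loop achieves is what translates directly into the bandwidth-cost and latency benefits claimed for mobile CDNs.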
Position and Adoption Speed Justification: Using a mobile CDN is less expensive than buying servers and bandwidth, which is critical for mobile communications service providers (CSPs). Newer optimization techniques (including video caching, transcoding and multicasting) are occurring in the mobile core.
User Advice: Wireless CSPs should carefully assess the opportunities for partnering with mobile CDNs. As more consumers look to online service options for searching and acquiring content, an efficient and seamless experience will mean the difference between success and failure. CDNs can assist in improving end-user performance, such as the streaming of cached assets, and help to reduce bandwidth costs for high-volume and content-heavy sites.
Business Impact: We believe that mobile CDNs will continue to increase the breadth and scope of these additional services, expanding the range of application-fluent network services and facilitating relationships between e-commerce partners, including advertisers.
Benefit Rating: High
Market Penetration: Less than 1% of target audience


Maturity: Emerging
Sample Vendors: Akamai; Alcatel-Lucent; AT&T; Ericsson; F5 Networks
Recommended Reading:
"Emerging Technology Analysis: Enterprise Video Content Management and Distribution, an Opportunity for Communications Service Providers"
"Market Insight: LTE CSPs Can Offer Session Control of Carrier-Grade Solutions With Over-the-Top Web-Based Streaming Video"

Network Intelligence
Analysis By: Kamlesh Bhatia
Definition: Network intelligence (NI) is an enabling technology that allows communications service providers (CSPs) to capture the subscriber-, service- and application-level awareness contained in network traffic. This information is then analyzed and exposed for integration with other applications in the back office, allowing CSPs to apply granular policies to influence customer experience and adapt to dynamic shifts in application and service usage. The solution is based on nonproprietary hardware and software platforms and can be used by CSPs on any network. It is modular and can be deployed stand-alone, rather than being embedded in network elements such as routers or switches, allowing it to evolve independently of underlying network or application complexity. This is essential as the number (and complexity) of applications increases and CSPs are faced with supporting hundreds of protocols and standards. NI implemented as a stand-alone solution forms a middleware layer between the network and application layers, enhancing applications such as billing and charging, revenue assurance, policy management, bandwidth management, service assurance and security.
Position and Adoption Speed Justification: Adoption of NI as part of business and operations systems is driven largely by the need for a more granular view of, and control over, subscriber consumption patterns. This is especially relevant as CSPs face a surge in data consumption and as network upgrade costs outpace revenue growth. In addition, as subscriber and network awareness become an integral part of new services, tapping into NI in real time can significantly enhance the perceived value of a service and create new revenue opportunities. NI has, until now, been embedded in proprietary hardware and software network technology, which has made it inflexible and unable to cope with the rate of change (more protocols and features), especially in Internet Protocol environments.
A nonstandard approach to NI has also made it difficult to integrate the underlying NI with traditional business intelligence and other business and operations applications. We expect that modular NI software will help to close this gap, allowing better integration of back-office software with real-time NI. This will, in turn, play into the trend of making back-office applications more event-driven to monetize content and transaction services. Standardization of interfaces and a platform-based OEM approach by specialized vendors will help to increase the uptake of integrated NI. In some cases, CSPs may want to integrate NI capability with policy management to gain more granular control of charging or service assurance, with the aim of improving customer experience.
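As a toy illustration of the awareness-to-policy loop described above, the sketch below classifies flows against a purely hypothetical port-based signature table and returns a per-application policy. Real NI engines use deep packet inspection across hundreds of protocols rather than port numbers, so every name and value here is an illustrative assumption.

```python
# Hypothetical signature table: real NI engines inspect payloads, not just ports.
APP_SIGNATURES = {443: "secure-web", 1935: "streaming-video", 5060: "voip"}

POLICIES = {
    "streaming-video": {"rate_limit_kbps": 2000, "priority": "normal"},
    "voip":            {"rate_limit_kbps": None, "priority": "high"},
}
DEFAULT_POLICY = {"rate_limit_kbps": None, "priority": "normal"}

def classify(dst_port):
    """Map a flow to an application label; unknown traffic falls through."""
    return APP_SIGNATURES.get(dst_port, "unclassified")

def policy_for(dst_port):
    """Expose the application awareness to the policy/charging layer."""
    return POLICIES.get(classify(dst_port), DEFAULT_POLICY)
```

The point of the stand-alone, modular NI architecture is that the classification layer can gain new signatures without any change to the back-office applications consuming `policy_for`-style output.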

User Advice:

- Evaluate bandwidth consumption patterns in your network to make a business case for an NI solution. Keep in mind the influence of enhanced subscriber and application awareness on new product development.
- Revenue and subscriber intelligence is crucial to monetize investment in advanced networks, including Long Term Evolution. Consider building a robust NI capability that integrates with applications such as policy management, charging and subscriber data management to address aspects of the revenue and customer life cycles.
- Technology development cycles are becoming shorter and require specialization. Work with vendors that offer standardization in their approach and a high level of modularity.

Business Impact: A granular understanding of subscriber and network behavior is vital to monetize data and new content-based services. Enhancing the functional capability of back-office applications by adding subscriber and network awareness can help CSPs to improve customer experience, create upsell opportunities and optimize resource consumption.
Benefit Rating: Moderate
Market Penetration: 1% to 5% of target audience
Maturity: Emerging
Sample Vendors: Cisco; ipoque; Qosmos; Sandvine

Cloud-based RAN
Analysis By: Sylvain Fabre; Akshay K. Sharma

Definition: We define cloud-based radio access networks (RANs) as a new architecture for the mobile RAN that combines several key attributes, with some variations by vendor. The "cloud" aspect refers to the fact that parts of the base station architecture are located in the cloud; typically, these are the control elements, such as the base station controller, radio network controller or mobility management entity. One clear goal of this technology is cost savings for the communications service provider (CSP) through lower site costs, because of a reduced footprint (although the site still needs to be of the right height, in the right location and so on, so CSPs may not have complete freedom of placement), as well as more efficient use of resources. For example, capacity can be directed where usage requires it, rather than being allocated equally across the cell, by using smart/active antennas, while beamforming alone can add extra capacity to a site. Generally, parts of the architecture of the base transceiver station can be pooled and shared among several sites, for example baseband processing. The use of self-organizing networks allows unified management of a network with a mix of cell sizes, from macro cells to "small cells," which is typical of a heterogeneous network (also known as a HetNet).


Position and Adoption Speed Justification: So far, the main proponents of this architecture are Alcatel-Lucent, with its Light Radio concept, and Nokia Siemens Networks, with Liquid Radio. It will take several more of the leading vendors to present products before this segment can be considered mature and stable enough to gain wider CSP adoption.

User Advice: CSPs should consider this architecture for areas where new site acquisition is an issue due to cost or local regulations, or where current sites no longer allow larger form factors of RAN equipment to be installed.

Business Impact: For CSPs, this concept provides a distributed architecture in which components can be placed in different locations and capacity can be dynamically allocated where it is most needed. Vendors in this space claim significant cost savings: Alcatel-Lucent, for example, claims that in the context of Long Term Evolution and small cells, its Light Radio concept could reduce total cost of ownership by over 50% compared with a legacy single-RAN architecture. The very small form factor with system-on-chip components means that the antenna equipment on site can be so small that it is no longer visible. However, more data is needed to establish whether many such very small cells will be cheaper than fewer, larger traditional sites.

Benefit Rating: High

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: Alcatel-Lucent; Freescale Semiconductor; Nokia Siemens Networks

Recommended Reading:

"Magic Quadrant for LTE Network Infrastructure"

"Forecast: Mobile Data Traffic and Revenue, Worldwide, 2010-2015"

"Emerging Technology Analysis: Self-Organizing Networks, Hype Cycle for Wireless Networking Infrastructure"
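The pooling of baseband resources described above can be sketched with a toy allocation model that contrasts a fixed per-site split with demand-proportional sharing from a common pool. All numbers and the proportional policy are illustrative assumptions, not any vendor's algorithm.

```python
# Hedged sketch: how pooled baseband capacity (one idea behind
# cloud-based RAN) can track uneven load better than a fixed per-site
# allocation. Demand figures and the policy are purely illustrative.

def fixed_allocation(pool, cells):
    """Legacy model: every site gets an equal, static share."""
    share = pool / len(cells)
    return {c: share for c in cells}

def pooled_allocation(pool, demand):
    """Cloud-RAN model: direct capacity where usage requires it."""
    total = sum(demand.values())
    return {c: pool * d / total for c, d in demand.items()}

demand = {"macro-1": 60.0, "small-2": 30.0, "small-3": 10.0}  # load units
pool = 100.0  # shared baseband capacity units

print(fixed_allocation(pool, demand))   # equal thirds, regardless of load
print(pooled_allocation(pool, demand))  # tracks the 60/30/10 demand split
```

The congested macro cell receives most of the pool instead of an equal share, which is the efficiency argument the vendors make for centralizing baseband processing.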

Socialcasting
Analysis By: Allen Weiner

Definition: Socialcasting provides lightweight, portable tools for the creation of scheduled and ad hoc live broadcasts. This space has evolved from consumer webcam videocasts to a clear bifurcation between a growing number of professional news organizations and consumer webcasts. Professional news organizations use socialcasting ("backpack journalism") to cut costs in mobile broadcasts (truck rolls, microwave transmission) or to create live broadcasts in difficult-to-reach areas. An example is Fox News' use of Kyte's (now part of Kit Digital) socialcasting services to broadcast from Haiti using smartphones. The socialcasting components include back-end, online video platform provider (OVPP) capabilities, complete with content distribution network (CDN) agreements, and applications that run on smartphones and FireWire-connected video cameras. Firms such as Qik (now part of Skype), Ustream, Livestream and Kyte are working with local and
national broadcasters as well as publishers that want to enhance their content offerings. In early 2011, Google announced that YouTube would begin experimenting with "socialcasts" on an invite-only basis.

Socialcasting offerings for consumers are services for end users to broadcast live video streams, photos or video clips from their mobile phones directly to other PCs, mobile phones or televisions. Socialcasters can send live video to specific family members, friends and fans who have the service. Socialcasting allows both one-to-one and one-to-many broadcasting over ad hoc networks. The intent of socialcasting is for the video to be viewed live, while the action is happening, although content is posted at an Internet site when the intended recipients are not available to view the videos live, or when the content is for a large audience. Socialcasting also includes elements of social TV, in that socialcasting platforms often include links or direct feeds to Facebook and Twitter.

Position and Adoption Speed Justification: The opportunity in this space will continue to expand as both incumbent and new digital publishers look to add more immediacy and social interaction to their content portfolios. The consumer opportunity will continue at a fairly steady pace, but as the demand for video continues to grow, the need for media firms to differentiate will help drive socialcasting. YouTube's entry into this space may be a disrupter, depending on how Google uses the new socialcasting service within YouTube as an element of its Google TV strategy.

User Advice: Live content is becoming an increasingly large part of the future of television. Socialcasting needs to be viewed as a component of that vision, not as a separate endeavor. A TV network, for example, should look to create socialcasts in support of a new TV show, in which the show's star or producer is featured in a socialcast to enable live interaction that promotes the new offering.
Socialcasting can also be viewed from a cost-savings perspective, as local and national news media do more with less and perhaps trade some broadcast fidelity for continued live, on-the-scene coverage. Socialcasting also provides an easy platform for including social media as part of a live broadcast strategy. Most consumers do not know that socialcasting is an option for them, or how to do it; marketing is necessary to educate consumers if socialcasting is to become mainstream. One reason socialcasting by consumers is not more strongly promoted is that socialcasting for business is often supported by advertising. Sponsors are less willing to have their advertisements associated with content from individual consumers, because there is no control over the subject matter and video quality. For OVPPs such as Flumotion, live content streams are becoming a differentiator as part of their service packages. OVPPs are positioned to offer end-to-end solutions that include encoding on the fly, distribution, social media application development and analytics. For concerts, sporting events and special events, live broadcasting has a place in the future of social TV, and socialcasting provides an easy, established path to execute that strategy.

Business Impact: Google's entry into socialcasting moves this technology up the Hype Cycle slightly as the market for social TV begins to heat up. With the number of enhanced TV platforms
emerging, ranging from Apple's to Google's efforts in this space, socialcasting might be poised for greater market acceptance.

Benefit Rating: High

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: Google; Kyte; Livestream; Qik; Ustream

Contact Center Interaction Analytics


Analysis By: Jim Davies

Definition: Interaction analytics solutions provide a holistic understanding of customer-agent interactions through the combined analysis of multiple data streams associated with the interaction. The data analyzed includes the dialogue (i.e., the audio or text, such as email or Web chat), call flow dynamics (such as speaking at the same time or pauses in the conversation), emotion (customer and agent), operations (such as call length, transfers or Web chat durations), screen activity (i.e., what the agent was doing on his or her desktop) and customer feedback (through the capture of post-interaction survey data).

Position and Adoption Speed Justification: Interaction analytics solutions are beginning to appear in the market, but adoption within contact centers is less than 1%. The solutions lack maturity, and many have functional omissions, such as support for multiple channels (e.g., chat and email). These problems are likely to be overcome during the next two years. The insights obtained from analyzing customer-agent interactions provide new perspectives on customer understanding and organizational feedback. Early deployments have delivered high value, which will fuel accelerated adoption during the next few years.

User Advice: Calculate the potential value-add of this integrated analytical technology suite above and beyond siloed technologies, such as speech analytics or performance management. Pay particular attention to the technical architecture, and ensure alignment with the organization's overall customer analytics strategy.

Business Impact: Deployment in a contact center may uncover a diverse range of insights that can be used to improve the performance of the contact center and its agents, as well as provide customer and departmental insights (such as customer perceptions of a marketing campaign or a new product pricing strategy).
The challenge is in building the business case, because the insights (and, therefore, the ROI potential) won't be known until the investment has been made.

Benefit Rating: High

Market Penetration: Less than 1% of target audience

Maturity: Emerging


Sample Vendors: Autonomy etalk; Nice Systems; Verint Systems
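The call-flow dynamics named in the definition (talking at the same time, pauses) can be derived from timestamped speaker segments. The following is an illustrative sketch only; the segment format, interval arithmetic and metric names are assumptions, not any vendor's implementation.

```python
# Toy example of two interaction-analytics signals: "talk-over"
# (both parties speaking at once) and total silence, computed from
# (start, end) speech intervals in seconds. Format is assumed.

def overlap_and_silence(agent, customer, call_end):
    """Return (seconds both spoke at once, seconds nobody spoke)."""
    overlap = 0.0
    for a0, a1 in agent:
        for c0, c1 in customer:
            overlap += max(0.0, min(a1, c1) - max(a0, c0))
    # Walk the merged timeline to find gaps where nobody is speaking.
    silence, cursor = 0.0, 0.0
    for s0, s1 in sorted(agent + customer):
        if s0 > cursor:
            silence += s0 - cursor
        cursor = max(cursor, s1)
    silence += max(0.0, call_end - cursor)
    return overlap, silence

agent = [(0.0, 5.0), (12.0, 20.0)]
customer = [(4.0, 10.0), (18.0, 25.0)]
ov, si = overlap_and_silence(agent, customer, call_end=30.0)
print(ov, si)  # 3.0 seconds of talk-over, 7.0 seconds of silence
```

In a real product these intervals would come from speaker-separated audio, and the metrics would be correlated with the other streams (emotion, screen activity, survey data) rather than reported in isolation.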

At the Peak
Convergent Communications Advertising Platforms (CCAPs)
Analysis By: Mentor Cana; Jean-Claude Delcroix

Definition: A convergent communications advertising platform (CCAP) is a scalable, multichannel set of interrelated applications and technologies used by communications service providers (CSPs) to deliver targeted advertising services. "Convergent" means interoperable with several types of devices, content, applications and telecom services (such as fixed voice, mobile voice, Short Message Service [SMS], Multimedia Messaging Service [MMS], fixed and mobile Internet, TV/video, e-books, car displays and train displays). It may extend to machine-to-machine (M2M) services such as public displays or public terminals. However, initial implementations of convergent platforms may be limited to one basic type of telecom offering (mobile communications or multichannel video, for example), as long as they handle advertising bound to several individual services and different types of content in relation to user profiles managed in a unified way. A CCAP draws on user-related and contextual data that is controlled or collected by CSPs, such as a user's location, network presence, status and type of device. CCAPs require deep integration with communications services data, which may be accessed through network-exposure APIs. From an architectural point of view, a CCAP includes a range of dedicated processes and data systems, but also connects to other subsystems and common data management systems to exchange relevant customer data. These include CRM, business intelligence (BI), business support systems (BSSs), operations support systems (OSSs) and service delivery platforms (SDPs). Combining real-time location-based contextual data with specific customer demographic and service information, by means of sophisticated analytics and business rules, allows CSPs to deliver dynamically targeted advertising content in a variety of situations, on a customer-by-customer basis or to specific groups.
The scope of a CCAP may include capabilities and data such as: general advertisement processing, various types of analytics, rule and recommendation engines, advertisement forwarding to users, campaign management, response management, agency management, advertiser management, advertising exchanges or links to them, billing, charging and rating, advertisement content management, digital rights management, campaign metrics, gateways and interfaces, user data, network and device data, storage, security and privacy (opt in and opt out). Market analysis and advertising design in the broadest sense may never become a strength of CSPs, but that does not limit the potential of the CCAP as a platform. Other providers in the value chain, and perhaps advertisers in self-service mode, would most likely interface with CCAPs using APIs. Today, several nonconvergent telecommunications-based advertising solutions exist, with different maturity levels, some being just Internet applications not specific to CSPs. These isolated advertising solutions do not build on a CSP's cross-service user, advertising and network data. Over time, however, CSPs that intend to "monetize" their unique competitive position may implement
converged platforms, spanning the different networks, devices and services they operate. Orange, to give one example, provides similar content across fixed and mobile networks, but does not yet offer converged advertising. Orange is working on algorithms to process user profiles and has announced a co-operative venture with OpenX, an advertising technology and service firm, to set up a European advertising exchange. AT&T's directory advertising business accounts for 3.2% of its revenue (with $1 billion from online advertising). It is building further on the directory advertising business with its Yellow Pages (YP) app, which is available on 40 million devices. AT&T has also launched a location-based SMS advertising initiative targeting shoppers in supermarkets. An example of a CSP building a converged advertising platform is Telecom Italia's (TI's) purchase of Italy's well-known Internet brand Virgilio, in essence providing TI with an integrated Internet advertising platform that also includes an ad agency (Niumidia Adv). This approach does not come without challenges and risks (over-the-top [OTT] advertising companies are also targeting TI's mobile and IPTV subscribers), but it has also given TI an opportunity for additional revenue sources. The move toward a CCAP in terms of CSP architecture is similar to the convergence in telecom operations management systems, which are moving away from silos into an end-to-end environment that covers multiple services and network technologies. Such a platform will increase its value by linking and co-operating with advertising agencies, content distribution services (for music and IPTV, for instance), interactive services, games and social networking. It may fully exploit Web 2.0 technologies, such as mashups and user-generated content.

Position and Adoption Speed Justification: CSPs started to use mobile advertising in 2001, mainly to promote their own services.
In 2005 and 2006, mobile advertising gained momentum, but revenue remained, and still remains, relatively low. Since the first appearance of CCAPs on this Hype Cycle in 2007, the need for them has become recognized. In particular, CSPs' CIOs see the need for system integration when they are faced with multiple applications dealing with user profiles and context data. Today, CSPs are slow to adopt a convergent or integrated approach to delivering content and advertising in a massive and productive way. Few (if any) platforms cover all screens and all channels, such as video, SMS, voice and other channels. Instead, the implementation of advertising solutions is progressing through narrow approaches, covering mobile advertisements, new forms of BI, advertisement exchanges and the selling of aggregated user data and analytics to advertising agencies. The lack of open and standardized advertising-related data models and interfaces further challenges the evolution of CCAPs. In mid-2011, the need for CSPs to offer advertising is gathering pace, but the related CSP technology is still not at its peak. After the 2009 retraction of the advertising market, 2010 and 2011 appear to be years of transition. Neither the development nor the penetration of advertising by CSPs will be very fast. Moreover, social networks and related advertising are leading the hype in 2011. Software vendors targeting the telecom industry in the broad sense will progressively integrate their multiple applications and will likely include CCAPs, partly through acquisitions. Several vendors are now emphasizing convergent and integrated solutions through advertising platforms with a rich set of features. AOL probably has the largest set of technologies and companies supporting advertising, due to its many acquisitions. In the video world, Microsoft is also offering broad video and TV solutions, including a video-related advertising platform. Oracle and Amdocs also offer broad solutions.
As CSPs are slow to move into advertising, vendors will try two alternative options: selling advertising platforms to media companies, and offering advertising as a service. Indeed, we see new interest in offering cloud-based value-added services and advertising to CSPs or with CSPs. Two major technology trends will boost the need for CSP advertising platforms: (1) the growth of mobile data services with location information; and (2) the growth of the IPTV and mobile TV user base. We forecast moderate revenue growth through 2012. Major obstacles are regulations and consumer trends in personal data protection, as well as CSPs' difficult business transformation. We believe that CSP CCAPs will reach the Plateau of Productivity between 2016 and 2019.

User Advice: Utilizing their strengths in a synergistic approach, CSPs may capitalize on their user information assets in progressive steps, both by the incremental integration of advertising-related support systems and applications, and by the incremental inclusion of advertising channels across service offerings, such as mobile services, SMS, mobile data, Internet content and portals, IPTV/video and new connected devices (e-books, tablets and public digital displays). CSPs should favor systems that: (1) build on all their services; (2) minimize duplication with their data systems; and (3) connect well with their service-oriented architecture and business process management systems. Beyond the collection of user and context data, CSPs should consider vendors that offer integration expertise and solutions in massive data management and storage, security and content management. For analytics, CSPs should look for marketing knowledge as much as tools. As the need for integrated and convergent advertising-related systems emerges within CSPs, they should assess their expertise across the different systems, technologies and processes encompassing end-to-end advertising solutions.
Based on this assessment, CSPs should determine a strategic path concerning the role they might be able to play in the advertising ecosystem, driven by business models that maximize their profits.

Business Impact: The benefits of convergent advertising stem from more comprehensive and coherent profiling of customers, including demographics, presence data, location data and local time, and communications, searching and browsing history. Moreover, data can be grouped to offer instant topical or local trends. It is the intelligent converging of customer data and context that creates a CSP's profiling power. With convergent advertising it will be possible to improve targeted advertising, addressing the questions of "which customer?," "where?" and "when?" Real-time interactive solutions work better with convergent solutions, as they give consumers the option to watch something on IPTV and then comment on it or forward it via a mobile phone or the Internet. In addition, multichannel interactive advertising will give advertisers a better understanding of customer behavior across locations and time slots. Social networks within wireless communities are becoming increasingly important for advertisers. When related to payments and purchases, CCAPs can offer lucrative solutions for advertisers by linking advertising directly to sales results. At the same time, a CCAP would make it easier to deal with privacy protection in a uniform way, due to its integrated and unified approach to consumer data. A CCAP will enable CSPs to play a new role in the online advertising and content market, particularly in mobile, IPTV and multichannel telecom-based advertising. Only a convergent platform will enable a CSP to play to its strengths. Such convergent platforms, along with converged
services, will support new CSP business models by increasing indirect revenue, that is, revenue beyond that related to transmission services. Convergent advertising platforms will cover both fixed and mobile communications. Pure mobile CSPs also need convergent platforms that link advertising across different services, such as the mobile Internet, games, instant messaging, email, location, route planning, payment and video services.

Operating a CCAP will provide CSPs with additional analytics and BI based on how their customers consume advertising and the context within which the ads were consumed. This knowledge can be further utilized to design new offers, or flavors of offers, to better serve their customers. However, a CCAP on its own is not a guarantee of success for a CSP. A successful advertising strategy is also required, as well as advertising and marketing skills. It is unknown which CSPs will use their strengths to compete in the advertising market. The competitive position of CSPs will depend not only on the market's acceptance of CSP-based solutions; competition in telecommunications and regulations on data protection will also determine adoption rates. Other factors, such as the progressive transformation of CSPs' business models over time (as CSPs identify new revenue opportunities beyond their traditional offerings), as well as the progressive transformation of the media and advertising landscape, will introduce challenges and risks for CSPs with respect to the adoption and implementation of CCAPs. As a result, CSPs are expected to adopt CCAPs to varying degrees. Some CSPs will implement minimal advertising capabilities; CSPs with a multitude of service offerings and channels and a large user base will attempt to implement more comprehensive CCAPs, with varying success.
Benefit Rating: High

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: Acision; Alcatel-Lucent; Amdocs; Atos Origin; Capgemini; Citex Software; Comverse Technology; Huawei; IBM; MADS; Medio Systems; Mobixell; Openet; Oracle; RGB; SLA Mobile; Wmode; Zad Mobile

Recommended Reading:

"Dataquest Insight: New Revenue Opportunities for Telecom Carriers, 2013"

"Emerging Technology Analysis: Convergent Communications Advertising Platforms, Communications Service Provider Operations, 2010"

"Hype Cycle for Media Industry Advertising, 2010"

"Key Issues for Media, 2011"

"Market Trends: Worldwide, CSP Mobile Marketing and Advertising, 2010"
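The "which customer?," "where?" and "when?" targeting described above can be sketched as a minimal rule engine that matches an ad to a user profile plus real-time context. The ads, rule fields and matching policy below are hypothetical illustrations, not a real CCAP interface.

```python
# Illustrative sketch of CCAP-style targeting: business rules combine a
# subscriber profile (demographic segment) with real-time context
# (local time, location). All campaigns and fields are made up.

ADS = [
    {"id": "coffee-2for1", "min_hour": 7, "max_hour": 10,
     "near": "station", "segment": "commuter"},
    {"id": "streaming-upsell", "min_hour": 18, "max_hour": 23,
     "near": None, "segment": "heavy-data"},
]

def select_ad(profile, context):
    """Return the first campaign whose rules match profile + context."""
    for ad in ADS:
        if ad["segment"] != profile["segment"]:
            continue  # wrong customer
        if not (ad["min_hour"] <= context["hour"] <= ad["max_hour"]):
            continue  # wrong time
        if ad["near"] and ad["near"] != context["location"]:
            continue  # wrong place
        return ad["id"]
    return None  # no campaign matched; fall back to untargeted inventory

profile = {"segment": "commuter", "opt_in": True}
context = {"hour": 8, "location": "station"}
print(select_ad(profile, context))
```

A production CCAP would layer opt-in/privacy checks, frequency capping and recommendation analytics on top of such rules; the sketch only shows how the three context questions combine.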

IPX for LTE


Analysis By: Deborah Kish; Sylvain Fabre


Definition: IP eXchange (IPX) was developed by the GSMA in collaboration with communications service provider (CSP) members to foster open standardized Internet Protocol (IP) connectivity for multiple types of service providers (mobile, fixed, ISPs, application service providers). It provides for end-to-end quality of service (QoS) in support of both roaming and interworking for Long Term Evolution (LTE) and IP Multimedia Subsystem via an IP-based network-to-network interface (NNI). IPX is not intended to replace or compete with the Internet, but it does offer an alternative for CSPs. The intent of IPX is to provide interoperability of IP-based services between all service provider types within a commercial framework that enables all parties in the value chain to receive a commercial return. The commercial relationships are underpinned with SLAs that guarantee performance, quality and security. IPX is an evolution of the GPRS Roaming Exchange. A GSMA working group called Roaming and Interconnect in LTE is responsible for defining how roaming and interconnection will be enabled in LTE. Some key components of IPX are:

IPX traffic has managed QoS at various levels of performance.

Payments associated with the business model are identified and settled between operators.

Operators are free to select bilateral or efficient multilateral modes of interconnection for different types of traffic.

All IP traffic is protected.

Services not requiring premium quality can use less demanding QoS bearers.

It is not evident to end users whether or not their service provider uses the IPX model, but the ability for CSPs to differentiate services using the flexibility provided by the IPX model ultimately provides for end-user choice. For more information, go to: www.gsmworld.com/our-work/programmes-and-initiatives/ipnetworking/ip_exchange.htm.

Position and Adoption Speed Justification: With CSPs transitioning to LTE, data roaming and interconnect will be an issue, and CSPs are emphasizing quality of customer experience. IPX promises QoS, quality of experience and the ability to differentiate services. The principles of IPX have been successively tested and validated by the GSMA. From 2004 onward, GSMA Session Initiation Protocol trials tested an IP-based NNI with numerous IMS-based services. IPX Precommercial Implementation trials have been ongoing since April 2007, focusing particularly on packet-switched voice services. In February 2008, the GSMA indicated that IPX trials had been completed successfully and that several international carriers were preparing to roll out IPX services, including BICS, BT, CITIC Telecom International, Deutsche Telekom International Carrier Sales & Solutions, iBasis, Reach, Sybase 365, Syniverse Technologies, Tata Communications, Telecom Italia Sparkle, Telecom New Zealand International, Telefonica International Wholesale Services, Telekom Austria, Telenor Global Services and TeliaSonera International Carrier.


IPX is dominated largely by European carriers and is still not considered a global phenomenon; therefore, we see it nearing the Peak of Inflated Expectations. Based on when we expect LTE to become more apparent, it is likely that IPX will reach the Plateau of Productivity in two to five years, and that it will gain wider adoption in other regions as LTE blossoms in markets such as the U.S. and Japan. Since the trials in 2008, Deutsche Telekom has announced an IPX platform that offers security, reliability options and multiple QoS levels. Other CSPs, such as Orange and TeliaSonera, have also launched IPX capabilities. In June 2011, the Amsterdam Internet Exchange launched a service-level interconnection to large IPX networks.

User Advice: As IPX services are standardized, CSPs can expect a common charging principle and a technically interoperable end-to-end network. CSPs can also decide to use IPX on an individual service basis, choosing the services they want to interwork over the IPX and the CSPs with which they want to interwork. Doing so leads to agreements containing specific connectivity models, SLAs between CSPs and more flexibility for end users.

Business Impact: For mobile CSPs and end users, security will always be of the highest importance. Because the IPX is not accessible from the Internet, attacks from it are not possible. Those involved in agreements sign up to a security code of conduct, creating a trusted community. End-user terminals and devices have no visibility of the IPX, so they are unable to access the core networks involved in the management and delivery of IP services. Additionally, the IPX creates an environment that is above and beyond the traditional "clearing houses" used in regular Global System for Mobile Communications/wideband code division multiple access networks. CSPs, as part of the value chain and managed network, can expect payment for the IP services they provide at specific quality levels.
Benefit Rating: Moderate

Market Penetration: 1% to 5% of target audience

Maturity: Emerging

Sample Vendors: Acme Packet; Deutsche Telekom; Sybase; Syniverse Technologies; Tata Communications; Telecom Italia
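The "managed QoS at various levels" that underpins IPX SLAs can be illustrated with a toy traffic-class-to-DiffServ mapping applied at the network-to-network interface. The class names and code-point choices below are common DiffServ conventions used for illustration, not the GSMA's normative IPX specification.

```python
# Hedged sketch: marking traffic classes with DiffServ code points
# (DSCPs) at an IPX interconnect so each class can receive the QoS
# level its SLA promises. Class names here are illustrative only.

DSCP_BY_CLASS = {
    "conversational-voice": 46,  # EF: strict latency/jitter bounds
    "streaming-video": 34,       # AF41: high-priority assured forwarding
    "interactive-data": 18,      # AF21: medium priority
    "background": 0,             # best effort, no premium quality
}

def mark_packet(service_class):
    """Pick the DSCP with which packets of this class leave the NNI."""
    # Unknown classes default to best effort rather than premium QoS.
    return DSCP_BY_CLASS.get(service_class, 0)

print(mark_packet("conversational-voice"))
print(mark_packet("unknown-service"))
```

This reflects the bullet above that "services not requiring premium quality can use less demanding QoS bearers": only classes the interconnect agreement names are lifted out of best effort.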

Rich Communication Suite


Analysis By: Deborah Kish; Charlotte Patrick

Definition: Rich Communication Suite (RCS) services bring together a number of different functionalities in a single place on the consumer's device. The Apple iPhone address book is an early example of part of the intelligent address book functionality, followed by Android-based phones that allow the user to start an email, see an address in Google Maps and click on any telephone number to call or text; these, however, are not part of RCS. RCS aims to provide features and benefits in the following three phases/releases:


Release 1 focuses on voice, messaging and enhanced phonebook capabilities for mobile devices. It was developed in 2008 and was the steppingstone to define a stable and proven framework, interworking guidelines and a technical reference implementation.

Release 2 was introduced in mid-2009 and focuses on making RCS services accessible via broadband access. While Release 1 was defined for mobile devices, Release 2 extends the use of rich communication services to other devices with broadband access (that is, a PC connected via a wireless LAN access point). The supported RCS services via broadband access are identical to the services already defined in Release 1. For legacy messaging, however, RCS clients connected via broadband access can only send SMS messages. They are not able to receive SMS messages or to send/receive Multimedia Messaging Service messages.

Release 3, initially defined in February 2010, focuses on consolidating the Release 2 features and adds some enhancements, such as the IP Multimedia Subsystem (IMS) Primary Device feature, which allows customers to use broadband access as the primary device in the absence of mobile devices.

For a full description of the features, goals, members and scope of RCS, visit the GSMA's website (www.gsmworld.com/our-work/mobile_lifestyle/rcs/index.htm).

Position and Adoption Speed Justification: The RCS initiative continues to develop across three phases that will consist of broadband access, content sharing and enhancements to sharing and presence, as well as geolocation information; however, the initiative has still not gone live, and communications service provider (CSP) participation varies from region to region. For example, North American CSPs have a tendency to simply develop rich applications on their own, while in Europe and Asia/Pacific there is more activity and participation in RCS. Many technology and software vendors continue to be involved in RCS, and some have decided to develop RCS as an enterprise software package in order to reach critical mass. Due to the length of time since the inception of RCS and sporadic launches, coupled with adoption that is not worldwide, RCS's position has advanced in the Hype Cycle and is now nearing the Trough of Disillusionment.

User Advice: Equipment and technology vendors and software developers should focus on creating case studies. The value of the RCS platform needs to be proved by real-world examples. They should work toward developing applications that differentiate them from the competition, either organically or by working with software vendors, such as IBM and Oracle. CSPs should use RCS to support IMS as a service delivery platform and consider offering software suites either as cloud or hosted services to enterprise customers. They should mitigate risk by continuing to invest in other service initiatives and/or equipment, such as Mobile 2.0 (as over-the-top and cloud-based providers [such as Skyfire] have), next-generation service delivery platforms, application stores and inter-CSP initiatives, such as joint innovation labs. They should be aggressive in making trials a reality.
The farther out RCS is pushed, the more likely it is that it will fail to take off. Alternatives exist that are less costly.

Business Impact: It is hard to forecast the likely success of RCS, as there are unknown factors concerning user experience, such as whether consumers will find the intelligent address book
features very attractive on devices with smaller screens or whether the mass market of customers will be interested in some of the more complex pieces of RCS functionality. It is also unclear whether the initiative's road map is moving at sufficient speed to ensure commercial success. The possible benefits that will drive trials and deployments among CSPs include:

Consumers' ability to see the status of friends, which could trigger additional or simultaneous communication sessions, such as picture sharing with voice, that would not have taken place previously. It could also encourage them to use more niche functionality, such as video. By having these functionalities, consumers are likely to increase their general use of mobile data and voice services, therefore increasing the average revenue per user. The functionality may also increase subscriber stickiness.

Ownership of this type of functionality allows the operator, device manufacturer or over-the-top player to have a degree of influence over consumers' choice of communication service, and to act as a portal to their social contacts (rather than allowing entities such as social network sites to own the relationship completely).

Benefit Rating: Moderate

Market Penetration: 1% to 5% of target audience

Maturity: Emerging

Sample Vendors: Alcatel-Lucent; Critical Path; Ericsson; Huawei; Metaswitch Networks; Motorola; Nokia Siemens Networks; Sony Ericsson

Recommended Reading:
"Dataquest Insight: Are Carriers Leaving Money on the Table With Half-Deployed IMS Architectures?"
"Key Issues for Carrier Service and Control Infrastructure, 2009"
"Dataquest Insight: The Future for Telecommunications Operators in Social Networking"
"Dataquest Insight: Telecom Service Providers: Evolving Toward the 2015 Horizon"
"Dataquest Insight: Carriers Can Keep Control of LTE With IMS"
"Network Operators Should Strive to Be Community Owners, Not Technology Providers"

VoIP Wireless WAN


Analysis By: Phillip Redman

Definition: Wireless network service providers deliver voice over circuit-switched technology, rather than the packet-switched technology they use for data transmissions. The next-generation technology using Long Term Evolution (LTE) is the first all-Internet Protocol (IP) network, which will include support for voice. Voice standards over LTE haven't been fully decided on yet, with most providers looking to the voice over LTE (VoLTE) standard. In February 2010, led by the Global
System for Mobile Association (GSMA), many in the industry came forward to support VoLTE. The GSMA's VoLTE initiative has the backing of more than 40 organizations across the mobile ecosystem, including many of the world's leading mobile operators, handset manufacturers and equipment vendors. They support the principle of a single IP Multimedia Subsystem (IMS)-based voice solution, which will be run over LTE. This is the reverse of what was proposed by a contingent looking to evolve using voice over LTE via Generic Access (VoLGA). The use of IP in the wireless link for packetized voice transmission (as compared with circuit-switched transmission) comprises a voice over IP (VoIP) wireless wide-area network (WWAN). Although this isn't supported today, Gartner expects initial rollouts of this service during the next few years.

Position and Adoption Speed Justification: With the launch of LTE, network carriers took a step toward supporting voice over WWAN (VoWWAN). However, LTE networks are only now being deployed. Low-cost alternatives to support voice still abound, and most calls worldwide still go over second-generation (2G) GSM systems, which are ubiquitous and offer low-cost calling. It will still be four or five years before the transition away from 2G voice to third-generation (3G)/LTE voice support begins. In the end, voice traffic will be subsumed by the increasing data traffic, so much of the focus and revenue will be directed toward data capabilities, rather than toward voice, for these networks.

User Advice: Although some smartphones support VoIP clients that can use 3G networks, current high-speed networks support VoIP without quality of service, which results in lower quality and an inconsistent experience. Some operators continue to use VoIP for push-to-talk over cellular services, but it is not supported for cellular calling.
Look to third-party fixed mobile convergence (FMC) systems to support handoff and higher-quality voice.

Business Impact: There is still little use of VoWWAN capabilities. Many companies have begun to support VoIP and are looking for the same support outside their offices. Consumer users of popular VoIP services also want to extend that capability to their mobile phones. VoIP could add 20% to 40% in additional capacity on all-IP networks, which could drive costs lower to support wireless voice services. VoIP could also support many softphones, which will increase the number of devices that support mobile voice capabilities, and it can enable the simultaneous use of voice and data on a device. It is also the main technology to support FMC and mobile unified communications.

Benefit Rating: Moderate

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: AT&T; fring; Skype; Verizon; most cellular infrastructure vendors
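The bandwidth economics of packetized voice can be made concrete with a back-of-the-envelope calculation. The figures below (AMR 12.2 kbps codec, one 20 ms frame per packet, uncompressed RTP/UDP/IPv4 headers) are standard illustrative assumptions, not measurements from any carrier network:

```python
def voip_bandwidth_kbps(codec_rate_kbps, frame_ms, header_bytes=40):
    """Air-interface bandwidth of one VoIP stream.

    header_bytes: RTP (12) + UDP (8) + IPv4 (20) = 40 bytes per packet,
    before any header compression (ROHC shrinks this dramatically on LTE).
    """
    packets_per_sec = 1000 / frame_ms
    payload_bytes = codec_rate_kbps * 1000 * (frame_ms / 1000) / 8
    return packets_per_sec * (payload_bytes + header_bytes) * 8 / 1000

# AMR 12.2 kbps voice, one 20 ms frame per packet
print(round(voip_bandwidth_kbps(12.2, 20), 1))  # 28.2 kbps on the wire
```

The point of the sketch is that uncompressed packet headers more than double the codec's nominal rate, which is why header compression and quality of service are prerequisites for efficient VoLTE rather than optional refinements.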

WDM-PON
Analysis By: Ian Keene


Definition: Wavelength division multiplexing passive optical network (WDM-PON) solutions are characterized by: (a) the use of a passive fiber tree, as found in other PON technologies; and (b) the use of dedicated wavelengths for each user. The use of multiple wavelengths increases the capacity of WDM-PON systems beyond those where multiple users share the same wavelength, and the use of WDM splitters in the fiber tree means that any user receives only the intended wavelength, which WDM-PON suppliers emphasize as a security benefit over traditional PON architectures with shared wavelengths. However, the use of dedicated wavelengths requires more advanced optical components and, therefore, WDM-PON suppliers face a cost challenge that must be overcome before WDM-PONs will find widespread use. While WDM-PON has beaten 10-Gigabit PON (10G-PON) in terms of getting nonstandardized products to the market, the standardization of WDM-PON is not complete, and it is uncertain when final standards will emerge. WDM-PON is being standardized by the International Telecommunication Union, supported by the Next-Generation Access initiative from the Full Service Access Network group. WDM-PON can scale to 1,000 wavelengths, with more than 1 Gbps per wavelength and a reach of up to 100 km, supplying ultrahigh bandwidths to subscribers without the need for localized central office infrastructure and, therefore, with reduced operating costs.

Position and Adoption Speed Justification: The challenge facing current-generation gigabit PON (GPON)/Ethernet PON (EPON) technologies is more related to deployment cost than to limited capacity. With a premium cost related to the more advanced optical components, the most likely success scenario for widespread deployments of the more advanced WDM-PON solutions is an upgrade scenario unfolding when current-generation PON deployments start to run out of bandwidth or when the additional cost of WDM-PON becomes negligible from a total-cost-of-ownership perspective.
If products that are stable under variable temperatures can be brought to market at an effective price, then WDM-PON has the potential to significantly reduce the number of central offices needed by a communications service provider (CSP) and, therefore, to significantly reduce operating expenditure. WDM-PONs will, in these scenarios, compete against 10G-PON upgrade solutions. The combination of 10G-PONs and WDM-PONs is a potential long-term scenario, as is the development of 40-Gigabit PON (40G-PON) or 100-Gigabit PON (100G-PON) systems. For a more detailed discussion of 10G-PON and WDM-PON, see "Emerging Technology Analysis: 10G and WDM PON." Some deployments of residential WDM-PONs have already taken place in South Korea. WDM-PONs also have potential applications in the backhaul and business market segments. Despite this, the time it is expected to take for WDM-PONs to reach the Plateau of Productivity exceeds the corresponding estimate for 10G-PONs, mainly because of the expected price premium related to the more advanced optical components already mentioned. Indeed, before WDM-PONs can pass the Trough of Disillusionment, approved standards need to be in place, and the technology needs further innovation and maturity to reduce component costs.

User Advice: Ensure you include WDM-PONs when evaluating future network scenarios, but wait for standardization and component innovation that produces steep price erosion before deploying.

Business Impact: WDM-PONs offer a future upgrade path for CSPs that have deployed current-generation PON solutions.
Page 48 of 139

Gartner, Inc. | G00214660

Benefit Rating: Moderate

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: Adva Optical Networking; Huawei; Nokia Siemens Networks; Tellabs

Recommended Reading:
"Emerging Technology Analysis: Next-Generation Broadband Access Caters for End-User Bandwidth Appetite"
"Emerging Technology Analysis: 10G and WDM PON"
"Forecast Analysis: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"
"Forecast: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"
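The headline capacity figures cited in this profile (up to 1,000 wavelengths at more than 1 Gbps each) can be put in perspective with simple arithmetic; the GPON comparison assumes a 64-way split, a common but illustrative choice:

```python
# Dedicated-wavelength WDM-PON: each subscriber gets a full wavelength.
wavelengths = 1000          # per fiber tree (cited upper bound)
gbps_per_wavelength = 1.0   # per-subscriber rate (cited lower bound)

aggregate_gbps = wavelengths * gbps_per_wavelength
print(aggregate_gbps)       # 1000.0 Gbps, i.e. ~1 Tbps per tree

# Shared-wavelength GPON for contrast: 2.5 Gbps downstream
# divided across an assumed 64 subscribers on the same tree.
gpon_per_sub_mbps = 2.5 * 1000 / 64
print(round(gpon_per_sub_mbps, 1))  # ~39.1 Mbps if all subscribers draw at once
```

The roughly 25-fold per-subscriber difference illustrates why WDM-PON is positioned as the upgrade path for when shared-wavelength PONs run out of bandwidth, and also why its cost premium is the decisive factor today.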

Self-Organizing Networks
Analysis By: Sylvain Fabre

Definition: Self-organizing networks (SONs) are a key feature of Long Term Evolution (LTE) and next-generation networks. In a nutshell, SON functionality resides partly in the operations support system (OSS) framework and partly in the radio access network eNodeB, and is a set of rules and algorithms that defines automatic actions to be performed by the system upon given events. In the extreme, this would be the network running and managing itself. In practice, a SON is likely to be used gradually as an adjunct to human operators for planning, alarm management and maintenance. SONs will provide self-configuration, self-optimization, fault management and fault correction functions at the base station of an LTE or next-generation mobile network. They should help mobile operators to manage the coverage, capacity, traffic and backhaul of the base station more easily. Vendors push this feature for its operating expenditure (opex) savings: less extra head count will need to be added to accommodate the new, additional LTE layer. In practice, there will still need to be expert operations, administration and maintenance (OA&M) and radio planning personnel in place, but SONs may reduce the total numbers, or at least enable faster and more efficient decision making. However, effective opex savings, compared with traditional OA&M, still need to be measured. The feature requirements for SONs are expected to be updated by operators as ongoing LTE trials and early deployments continue to yield results and operators have a clearer idea of how to run their LTE networks. But at its most basic, a SON should allow self-configuration: the ability to simply add a cell to the network and have it configure itself by negotiating neighbor relations with its peers.


SONs require specialized skills, and although all leading vendors have some ability in this area, there will be differences among them. Since the quality of SON products will vary, SONs will become key differentiating features in early LTE releases.

Position and Adoption Speed Justification: SONs are a key issue for many CSPs, after backhaul costs and scalability, and have been a key requirement for next-generation mobile networks. Over the past two years, extra SON requirements have been defined in the 3rd Generation Partnership Project (3GPP) and the Next Generation Mobile Networks (NGMN) alliance. The rationale behind SON functionality is that second-generation (2G), third-generation (3G) and LTE networks will coexist for some time, so cost-optimized operations, maintenance and planning are needed. A SON can help achieve stable LTE networks for operators, especially in the early stages. It is possible that femtocells may play a greater role in wider-scale LTE; incidentally, SON is already a key feature of femtocells. Interface specifications and use cases for SONs have been defined and are part of 3GPP specification 32.500 in Release 8. The first use cases of SON were in early deployments of LTE in 2010. Leading operators have indicated that they value SONs. While LTE has enjoyed a lot of attention and hype recently, what constitutes the key building blocks of early LTE releases is less understood. It will take time to determine what level of SON is adequate in the first few LTE network releases, and which architectures will be favored (for example, centralized versus distributed SONs, or a hybrid approach).

User Advice: Operators should place requirements on vendors for valued features, and evaluate the cost of these features based on estimated savings in operations, administration and maintenance, as well as in network planning and head count.
Business Impact: SONs can provide significant savings in operations, and greatly automate the way future wireless networks are set up, managed and planned. SONs are integral to some advanced RAN architectures, such as cloud-based RANs.

Benefit Rating: High

Market Penetration: 1% to 5% of target audience

Maturity: Emerging

Sample Vendors: Actix; Alcatel-Lucent; Ericsson; Huawei; NEC; Nokia Siemens Networks; ZTE

Recommended Reading:
"Magic Quadrant for LTE Network Infrastructure"
"Emerging Technology Analysis: Self-Organizing Networks, Hype Cycle for Wireless Networking Infrastructure"
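The basic self-configuration behavior described above (a new cell negotiating neighbor relations with its peers) can be sketched in a toy model. The distance threshold below stands in for the UE measurement reports a real automatic-neighbor-relation function would use; all names and values are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Cell:
    """Toy eNodeB model for an ANR-style self-configuration sketch."""
    cell_id: str
    x: float
    y: float
    neighbors: set = field(default_factory=set)

def auto_configure(new_cell, network, radius=1.0):
    """Add a cell to the network and establish symmetric neighbor
    relations with every cell within `radius` (a stand-in for
    overlap detected via UE measurement reports)."""
    for cell in network:
        if (cell.x - new_cell.x) ** 2 + (cell.y - new_cell.y) ** 2 <= radius ** 2:
            cell.neighbors.add(new_cell.cell_id)
            new_cell.neighbors.add(cell.cell_id)
    network.append(new_cell)

network = [Cell("A", 0, 0), Cell("B", 2, 0)]
auto_configure(Cell("C", 0.5, 0), network, radius=1.0)
print(sorted(network[0].neighbors))  # ['C'] -- A and C paired themselves automatically
```

The operational point is that no human planner entered the A-C relation: adding the cell triggered it, which is precisely the head-count saving vendors claim for SON.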

4G Standard
Analysis By: Sylvain Fabre

Definition: A fourth-generation (4G) worldwide standard being developed for a next-generation local- and wide-area cellular platform is expected to enter commercial service between 2012 and
2015. International Mobile Telecommunications-Advanced (IMT-A) is now called 4G. The development effort involves many organizations: the International Telecommunication Union Radiocommunication Sector (ITU-R); the Third Generation Partnership Project (3GPP) and 3GPP2; the Internet Engineering Task Force (IETF); the Wireless World Initiative New Radio (WINNER) project, a European Union research program; telecom equipment vendors; and network operators. Agreement on the initial specification has yet to be reached, but discussions point to some key characteristics. These include support for peak data transmission rates of 100 Mbps in WANs and 1 Gbps in fixed or low-mobility situations (field experiments have achieved over 2.5 Gbps); handover between wireless bearer technologies, such as code division multiple access (CDMA) and Wi-Fi; purely Internet Protocol (IP) core and radio transport networks for voice, video and data services; and support for call control and signaling. Many technologies have competed for inclusion in the 4G standard, but they share common features, such as orthogonal frequency division multiplexing (OFDM), software-defined radio (SDR) and multiple input/multiple output (MIMO). 4G technology will be all-IP and packet-switched. In addition, we believe that the network architecture will be radically different from today's networks. In particular, it will feature an all-IP, low-latency, flat architecture and the integration of femtocells and picocells (as the "small cell" layer) within the macrolayer. The radio access network will be made up of more cells of increasingly smaller sizes to enable increased frequency reuse for maximum performance.

Position and Adoption Speed Justification: The 4G standard is still in the early stages of development and has to incorporate a wide range of technologies. But these are not the only reasons why its introduction is some way off.
Deployments of High-Speed Downlink Packet Access (HSDPA), High-Speed Uplink Packet Access (HSUPA) and Long Term Evolution (LTE) technology will extend the life of third-generation (3G) infrastructures for voice and, to some extent, for data. Also, network operators will want to receive a worthwhile return on 3G investments before moving to 4G. Then there is the problem of how to provide adequate backhaul capacity cost-effectively; this is already difficult with the higher data rates supported by High-Speed Packet Access (HSPA), and it will become harder with LTE and then 4G. Ultra Mobile Broadband (UMB), which, unlike wideband code division multiple access (WCDMA) and HSPA, is not being assessed for 4G, had been under consideration as a next-generation mobile standard in the U.S. and parts of Asia and Latin America, but it failed to gain a foothold and will not be widely adopted. It appears likely at this point that LTE-Advanced (LTE-A) is the clear leader for 4G, with 802.16m a possible distant contender. WiMAX had been considered, but is no longer in the race; in the U.S., Sprint had been advertising WiMAX as 4G, but has since changed its future plan to LTE. WiMAX worldwide is losing momentum to time division LTE (TD-LTE).

User Advice: It is too soon to plan for 4G. Instead, monitor the deployment and success of 3G enhancements such as HSDPA, HSUPA, High-Speed Packet Access Evolution (HSPA+) and LTE, as these need to provide a worthwhile return on investment before network operators will commit themselves to a new generation of technology. Carriers will also need to ensure interoperability with today's networks, as backward compatibility might otherwise be an issue (as interworking between WCDMA and LTE networks is proving).


Business Impact: The business impact areas for 4G are high-speed, low-latency communications, multiple "pervasive" networks and interoperable systems. Additionally, carriers should recognize that the cost of deploying and operating an entirely new 4G network might be too high to justify unless a different network business model is found, including, for example, increased use of network sharing and "small cells."

Benefit Rating: Moderate

Market Penetration: Less than 1% of target audience

Maturity: Embryonic

Sample Vendors: Alcatel-Lucent; Ericsson; Fujitsu; Huawei; NEC; Nokia Siemens Networks; Samsung Electronics; ZTE

Recommended Reading:
"Magic Quadrant for LTE Network Infrastructure"
"Emerging Technology Analysis: Self-Organizing Networks, Hype Cycle for Wireless Networking Infrastructure"
"Dataquest Insight: IPR Issues Could Delay Growth in the Long Term Evolution Market"
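The 1 Gbps low-mobility target discussed in this profile can be approximated from first principles, since OFDM delivers roughly one symbol per second per hertz of spectrum per spatial layer. The parameters below (aggregated bandwidth, modulation order, MIMO layers, overhead fraction) are illustrative assumptions, not figures taken from the IMT-A specification:

```python
def peak_rate_gbps(bandwidth_mhz, bits_per_symbol, mimo_layers, overhead=0.25):
    """First-order OFDM peak-rate estimate: about one symbol per second
    per hertz per layer, minus an assumed control/coding overhead.
    Illustrative arithmetic only, not a link-budget calculation."""
    raw_bps = bandwidth_mhz * 1e6 * bits_per_symbol * mimo_layers
    return raw_bps * (1 - overhead) / 1e9

# 100 MHz of aggregated spectrum, 64-QAM (6 bits/symbol), 4x4 MIMO
print(round(peak_rate_gbps(100, 6, 4), 2))  # ~1.8 Gbps
```

Even this crude estimate shows why the gigabit-class targets require wide aggregated carriers and multiple MIMO layers simultaneously, neither of which early LTE deployments provide.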

CDN for Fixed CSPs


Analysis By: Akshay K. Sharma; Peter Kjeldsen

Definition: Fixed-carrier content delivery networks (CDNs) are a type of distributed computing infrastructure in which devices (servers or appliances) reside in multiple points of presence on multihop packet-routing networks (such as the Internet) or on private WANs, and are used to deliver network services from fixed carriers. A CDN for fixed communications service providers (CSPs) offloads origin servers via edge caching and offers improved latency via closer proximity to the user, as well as intelligent optimization techniques. A fixed-CSP CDN can be used to distribute rich media, such as audio and video, as downloads or streams, including live streams. It can also be used to deliver software packages and updates as part of an electronic software delivery solution. Finally, a fixed-CSP CDN may also provide services such as global load balancing, Secure Sockets Layer acceleration and dynamic application acceleration via optimization techniques. In the media sector, all these uses are common, and rich-media delivery via progressive downloading is the most frequently used service. Use of all of these services is also common within the e-commerce industry, where content offload is the most frequently used service. While common in the cloud within media networks, the fixed-CSP CDN is emerging as a new technique in CSPs' core networks to provide bandwidth optimization for an enhanced user experience.


Position and Adoption Speed Justification: Using a fixed-CSP CDN is less expensive than buying servers and bandwidth, which is critical for fixed CSPs. Newer optimization techniques (including video caching, transcoding and multicasting) are occurring in the core.

User Advice: CSPs should carefully assess the opportunities for partnering with cloud-hosted CDNs. As more consumers look to online service options for searching and acquiring content, an efficient and seamless experience will mean the difference between success and failure. CDNs can assist in improving end-user performance, such as the streaming of cached assets, and help to reduce bandwidth costs for high-volume and content-heavy sites.

Business Impact: We believe that fixed-CSP CDNs will continue to increase the breadth and scope of these additional services, expanding the range of application-fluent network services and facilitating relationships between e-commerce partners, including advertisers.

Benefit Rating: High

Market Penetration: 5% to 20% of target audience

Maturity: Early mainstream

Sample Vendors: Akamai; AT&T; Verizon
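The origin-offload mechanism at the heart of edge caching can be sketched with a minimal least-recently-used cache. This is a toy illustration of the principle, not a representation of any vendor's CDN design:

```python
from collections import OrderedDict

class EdgeCache:
    """Minimal LRU cache of the kind a CDN edge node uses to keep
    repeat requests off the origin server (illustrative only)."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()
        self.origin_hits = 0

    def fetch(self, url, origin):
        if url in self.store:
            self.store.move_to_end(url)      # refresh recency on a hit
            return self.store[url]
        self.origin_hits += 1                # miss: go upstream once
        content = origin(url)
        self.store[url] = content
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least recently used
        return content

cache = EdgeCache(capacity=2)
origin = lambda url: f"<content of {url}>"
for url in ["/a", "/b", "/a", "/a"]:
    cache.fetch(url, origin)
print(cache.origin_hits)  # 2 -- only the first request per object reached the origin
```

Four end-user requests produced two origin fetches; at CDN scale, that ratio is the bandwidth and latency saving the profile describes.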

Integrated Policy and Charging Control Solutions


Analysis By: Kamlesh Bhatia

Definition: Integrated policy and charging control solutions are part of the communications service provider's (CSP's) operational functions that allow dynamic, real-time control of subscriber entitlements to network resources, such as bandwidth, volume utilization and traffic priority. The solutions allow CSPs to monetize the value delivered to subscribers through personalized and targeted pricing offers based on usage and contextual information. The design of these solutions is intended to allow CSPs to centralize policy and charging functions in a multiservice, multinetwork environment and to have access to integrated subscriber information. Industry standards bodies like 3GPP (in Releases 7, 8 and 9) have laid down detailed guidelines around the design and use of the Policy and Charging Rules Function (PCRF) within the same environment.

Position and Adoption Speed Justification: The demand for bandwidth continues to grow as subscribers increasingly consume Internet Protocol content and bandwidth-intensive applications on multiple devices. This has led to a situation where CSPs, especially mobile CSPs, are often faced with clogged networks, and users see suboptimal service performance. The traditional flat-rate or static pricing model, especially for data access, has been unable to compensate for growing capital and operational expenses, despite increasing revenue. As a result, CSPs are looking at solutions to help them overcome this challenge while maintaining their position as a trusted provider in a highly competitive market environment.


The use of integrated policy and charging control solutions allows CSPs to employ a more granular policy mechanism at the application, service and network levels to control entitlements, realize higher margins (through differential pricing of applications and services) and manage customer experience. It also allows subscribers greater leverage over their spending and consumption patterns through dynamic changes in their subscription plans. As CSPs move into new business models that hinge on applications, converged services and smart devices, the focus on integrated policy and charging control solutions is expected to grow. CSPs that have already invested in solutions for online charging and subscriber data management (SDM) are expected to take an evolutionary approach to integrating these with policy solutions that offer real-time policy control.

Since last year, policy and charging have been moving closer together. This is largely a result of growing concern in the industry around changing consumption patterns for data services and the need to move beyond traditional pricing models. In some cases, local regulatory bodies have mandated the use of policy decisions to address concerns around bill shock (for example, the European Union). We expect this trend to continue and gather momentum, resulting in high adoption of integrated solutions for policy and charging control. Vendors of these solutions are working to alleviate challenges around configurability, integration with operating components and real-time performance. The inclusion of policy management solutions is now a common feature in proposals for real-time charging systems, justifying the hype around these solutions.

User Advice: CSPs should consider integrated policy and charging management solutions as an extension to their investment in subscriber data management offerings.
These solutions should extend beyond the network into the IT layer, allowing CSPs to create dynamic, personalized offerings (such as by integrating with customer care or self-care capabilities). The solution should also enable a convergent approach across services and lines of business, and allow easy configuration of individualized policy and charging rules. Adherence to industry standards (like 3GPP) is a must. Given the market hype around these solutions, CSPs should look beyond plain technology platforms and focus on their ability to create new policy and charging scenarios using these solutions. Local regulations around the use of policy decisions should be kept in mind.

Business Impact: Integrated policy and charging control solutions are key to enforcing a fair-usage policy among CSPs' subscriber bases (where a few users account for a large share of bandwidth consumption). They also have the ability to enhance the quality of experience for end users by dynamically altering network resource allocation based on subscriber preference. As CSPs engage in new (two-sided) business models, the use of integrated policy and charging solutions can enhance subscriber loyalty through personalization and create new opportunities to monetize resource consumption.

Benefit Rating: High

Market Penetration: 1% to 5% of target audience

Maturity: Adolescent

Sample Vendors: Bridgewater Systems; Comptel; Elitecore Technologies; HP; Huawei; Openet; Orga Systems; Tekelec; Telcordia; Volubill


Recommended Reading: "Emerging Technology Analysis: How CSPs can Cut Costs and Charm Customers with Integrated Policy and Charging Control"
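The kind of rule evaluation a PCRF performs (mapping usage and plan data to a bandwidth entitlement) can be sketched as follows. The tiers, thresholds and entitlements are invented for illustration and do not reflect the 3GPP interface definitions:

```python
def decide_policy(subscriber):
    """Toy PCRF-style decision: map a subscriber's usage and plan tier
    to a bandwidth entitlement. All thresholds are hypothetical."""
    used, quota = subscriber["used_gb"], subscriber["quota_gb"]
    if used >= quota:
        # Fair-usage rule: throttle rather than bill overage,
        # addressing the bill-shock concern noted above
        return {"max_mbps": 0.5, "action": "throttle"}
    if subscriber.get("tier") == "premium":
        return {"max_mbps": 100, "action": "allow"}
    return {"max_mbps": 20, "action": "allow"}

print(decide_policy({"used_gb": 12, "quota_gb": 10}))
# {'max_mbps': 0.5, 'action': 'throttle'}
```

A production PCRF evaluates rules like these in real time against session and charging events; the integration challenge the profile describes is keeping this decision loop synchronized with online charging and subscriber data systems.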

Augmented Reality
Analysis By: CK Lu; Tuong Huy Nguyen

Definition: Augmented reality (AR) is a technology that superimposes graphics, audio and other virtual enhancements on a live view of the real world. It is this "real world" element that differentiates AR from virtual reality. AR aims to enhance users' interaction with the environment, rather than separating them from it. The term has existed since the early 1990s, when it originated in aerospace manufacturing.

Position and Adoption Speed Justification: The maturity of a number of mobile technologies, such as GPS, digital cameras, accelerometers, digital compasses, broadband, image processing and face/object recognition software, has made AR a viable technology on mobile devices. As all these technologies converge in maturity, AR has also benefited from a growing number of open OSs (promoting native development), the increasing popularity of application stores (increasing awareness and availability of applications), and the rising availability of overlay data such as databases, online maps and Wikipedia. The combination of these features and technologies also allows AR to be used in a number of different applications, including enhancing user interfaces (UIs), providing consumers with information and education, offering potential for marketing and advertising, and augmenting games and entertainment applications. We also believe that AR will play a role in mobile contextual interactions, and will be particularly powerful for:

Exploration: finding things in the vicinity.

Suggestion: indicating real-world objects of interest.

Direction: indicating where a user should go.

In 2010, AR reached the peak of its hype, as many vendors exploited this technology to differentiate their products, both services and hardware. For example, AR browser vendor Layar boasts more than 700,000 active users. The vendor is working with LG (to preload its application on new Android devices) and Samsung (to be supported on bada). This year, we observed that the hype surrounding AR has slowed down. Nevertheless, its uses are still being explored. Panasonic provides the Viera AR Setup Simulator (as a promotional tool) to help consumers feel how their TV will fit into a room. Word Lens is an AR translation application that allows users to translate text from one language to another; for example, by pointing a camera at a traffic sign. The Nintendo 3DS also uses AR as a differentiator to enrich gaming experiences on its 3D display. Despite the hype and potential, a number of factors will slow adoption of AR:

Device requirements for AR in mobile devices are rigorous; so, although mobile services provide a great use case for this technology, it will be restricted to higher-end devices.

Mobile devices have smaller screens than other consumer electronics devices, such as laptops and even handheld gaming consoles, restricting the information that can be conveyed to the end user.


The interface (a small handheld device that needs to be held in front of you) limits usage to bursts, rather than continued interaction with the real world.

GPS technology also lacks the precision to provide perfect location data, but can be enhanced by hardware such as accelerometers, gyroscopes or magnetometers.

As with other location-based services (LBSs), privacy is a potential concern and a hindrance to adoption.

As a newer solution, there are also issues with compatibility: competing AR browsers use proprietary APIs and data structures, making the AR information from one vendor's browser incompatible with that from other browsers.

User Advice:

Communications service providers (CSPs): Examine whether AR would enhance the user experience of your existing services. Compile a list of AR developers with which you could partner, rather than building your own AR from the ground up. Provide end-to-end professional services for specific vertical markets, including schools, healthcare institutions and real-estate agencies, in which AR could offer significant value. A controlled hardware and software stack, from database to device, will ensure a quality user experience for these groups. Educate consumers about the impact of AR on their bandwidth, to avoid being blamed for users going over their data allowances.

Mobile device manufacturers: Recognize that AR provides an innovative interface for your mobile devices. Open discussions with developers about the possibility of preinstalling application clients on your devices, and document how developers can access device features. Build up alliances with AR database owners and game developers to provide exclusive AR applications and services for your devices. Secure preloading agreements, and examine how you could integrate AR into your UIs or OSs.

AR developers: Take a close look at whether your business model is sustainable, and consider working with CSPs or device manufacturers to expand your user base, perhaps by offering white-label versions of your products. Integrate AR with existing tools, such as browsers or maps, to provide an uninterrupted user experience. Build up your own databases to provide exclusive services through AR applications. Extend your AR application as a platform that individual users and third-party providers can use to create their own content. Explore how to apply AR, through different applications and services, to improve the user experience, with the aim of predicting what information users need in different contexts.

Providers of search engines and other Web services: Get into AR as an extension of your search business. AR is a natural way to display search results in many contexts.

Mapping vendors: Add AR to your 3D map visualizations.

Early adopters: Examine how AR can bring value to your organization and your customers by offering branded information overlays. For workers who are mobile (including factory, warehousing, maintenance, emergency response, queue-busting or medical staff), identify how AR could deliver context-specific information at the point of need or decision.

Page 56 of 139

Gartner, Inc. | G00214660

Business Impact: AR browsers and applications will be the focus of innovation and differentiation for players in the mobile device market in 2011. There are interesting branding opportunities for companies and businesses. Points of interest can be branded with a "favicon" (that is, a favorites or website icon) that appears when the point of interest is selected. Companies such as Mobilizy are offering white-label solutions that allow core Wikitude functionality to be customized. AR products such as Wikitude can lead to numerous LBS advertising opportunities. CSPs and their brand partners can leverage AR's ability to enhance the user experience within their LBS offerings. This can provide revenue via set charges, recurring subscription fees or advertising. Handset vendors can incorporate AR to enhance UIs, and use it as a competitive differentiator in their device portfolios. The growing popularity of AR opens up a market opportunity for application developers, Web service providers and mapping vendors to provide value and content to partners in the value chain, as well as an opportunity for CSPs, handset vendors, brands and advertisers.

Benefit Rating: High

Market Penetration: 1% to 5% of target audience

Maturity: Adolescent

Sample Vendors: GeoVector; Google; Layar; Mobilizy; Tonchidot

Recommended Reading: "Emerging Technology Analysis: Augmented Reality Shows What Mobile Devices Can Do" "Contextual Smartphone Applications Will Exploit Augmented Reality"

Public Cloud Computing/the Cloud


Analysis By: Daryl C. Plummer

Definition: Gartner defines public cloud computing as a style of computing in which scalable and elastic IT-enabled capabilities are provided as services to external customers using Internet technologies. Therefore, public cloud computing involves the use of cloud computing technologies to support customers that are external to the provider's organization. Through public consumption of cloud services, economies of scale and sharing of resources will be generated to reduce costs and to increase the choices available to consumers. Public cloud computing carries with it the concerns that security, data management, trust, control and guarantees of appropriate performance will not be sufficient to support enterprise needs. Enterprises want the value delivered through cloud computing services, but also need to ensure that the concept is ready for delivering services that a company can rely on. However, public cloud computing has proved itself time and again, in the context of the Internet and the Web, from what is commonly referred to as the "consumer perspective." Sites such as Flickr and Facebook, as well as countless business sites delivering services from entertainment to healthcare records, have been in use for some time in the public context.


It's important to distinguish among the cloud, cloud services and cloud computing. The cloud is an abstract concept that refers to a collection of one or more cloud services. "The cloud" is used euphemistically to indicate that work is being performed somewhere else, with little regard for where or how. Beyond this, the idea of cloud services is more tangible, in that each cloud provider (e.g., Amazon or Workday) delivers cloud services that provide value to its consumers. In the end, cloud computing is a style of computing, whereas cloud services are used to deliver value through this abstract thing that we call the cloud.

Position and Adoption Speed Justification: The public cloud is at (and a little past) the Peak of Inflated Expectations. As enterprises experiment heavily with the concept, they begin serious budgeting efforts for real projects. The evaluation of peer projects that solve actual problems is under way. In addition, cloud providers are advertising their ability to deliver enterprise services and reduce cost. Customers should still be cautious about the claims of most providers, because their models are still unproved for enterprise use. The potential advantages in terms of agility, as well as in time to market to stand up new applications or set up new users in an existing environment, are worth investigating for cloud computing adoption. Many providers are unprepared to deliver these advantages to the enterprise, but besides reducing the initial investment in assets, agility is certainly something that clients will want to look at closely.

User Advice: User companies should be moving experimental projects to feasibility discussions for serious implementation in 2010. There will be continued investment in 2011, with rapid growth in cloud computing through 2012.
Business Impact: The business impact of cloud computing in the public sense can be varied, but the basic opportunity is for businesses to consume services from other companies, freeing them from the need to provide those services themselves. This can enable companies to eliminate work that previously might have been done in-house. It can also lead to massive changes in the way money is spent; for example, using operating expenses to fund external services, rather than using capital expenses to fund IT projects.

Benefit Rating: Transformational

Market Penetration: More than 50% of target audience

Maturity: Early mainstream

Sample Vendors: Amazon; Google; Rackspace; salesforce.com
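The elasticity at the heart of this definition can be illustrated with a toy sketch (the unit capacity and price figures below are invented for illustration): provisioned capacity tracks demand hour by hour, and cost follows usage rather than peak capacity, which is what shifts spending from capital to operating expenses.

```python
def provision(demand_per_hour, unit_capacity=100, unit_cost_cents=10):
    """For each hour, scale instances to demand and pay only for what runs."""
    plan = []
    for demand in demand_per_hour:
        instances = -(-demand // unit_capacity)  # ceiling division
        plan.append((instances, instances * unit_cost_cents))
    return plan

# Demand of 250, 900 and 120 requests/hour provisions 3, 9 and 2 units;
# a fixed on-premises deployment would have to be sized for the 900 peak.
plan = provision([250, 900, 120])
```

The contrast with a fixed deployment is the point: the elastic consumer pays for 14 unit-hours here, while an owned asset sized for the peak would sit at 27.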

100 Gbps Transport


Analysis By: Peter Kjeldsen

Definition: Updates to optical transport systems enabling the delivery of 100 gigabits per second (Gbps) data rates per wavelength represent a tenfold increase over the commonly deployed 10 Gbps-per-channel systems in communications service provider (CSP) networks. Some of the same developments utilized for achieving 40 Gbps throughput are being used at 100 Gbps as well, but to maintain transmission distances, more advanced modulation schemes are being considered for 100 Gbps, with coherent dual-polarization quadrature phase shift keying (DP-QPSK) modulation being
emphasized by the Optical Internetworking Forum (see http://www.oiforum.com/public/documents/OIF-FD-100G-DWDM-01.0.pdf). With further advances in transceiver technologies and the push for even higher per-channel line rates in dense wavelength-division multiplexing (DWDM) systems, it is possible that other advanced modulation schemes will play a role in 100+ Gbps transport. 100 Gbps (as well as 40 Gbps) line rates are already standardized for Synchronous Digital Hierarchy/Synchronous Optical Network (SDH/SONET) and optical transport network (OTN) by the International Telecommunication Union (ITU). The Institute of Electrical and Electronics Engineers (IEEE) ratified the 802.3ba standard in June 2010 (see http://standards.ieee.org/announcements/2010/ratification8023ba.html), which will allow CSPs to carry 40 Gbps and 100 Gbps Ethernet directly over transport networks supporting these line rates. This allows CSPs to consider the move to higher line rate systems in the wider context of what their future optical transport architecture should look like.

Position and Adoption Speed Justification: 100 Gbps commercial trials and early deployments are ongoing in the transport networks of large CSPs such as Verizon, AT&T and Comcast. The 100 Gbps technology is clearly leveraging advances related to the realization of 40 Gbps commercial solutions. However, realizing 100 Gbps solutions raises the bar in terms of more advanced transceiver designs, and cost-effectiveness (cost per bit) of 100 Gbps solutions compared to 40 Gbps is likely still a couple of years away. We expect that 100 Gbps will reach the Plateau of Productivity within the next three to four years, paced by continued growth in mobile data and video traffic and the associated core router capacity requirements.

User Advice: Evaluate the cost-effectiveness and maturity of 100 Gbps technology vs. 40 Gbps technology when addressing traffic growth challenges.
CSPs should look at 100 Gbps wavelengths as part of an architecture evolution, rather than a simple capacity upgrade, to cost-effectively support mobile data and video services.

Business Impact: This technology will eventually offer a cost-effective way to address traffic growth, acting as an enabling technology for the expansion of network capacity.

Benefit Rating: Moderate

Market Penetration: 1% to 5% of target audience

Maturity: Emerging

Sample Vendors: Alcatel-Lucent; Ciena; Cisco; Huawei; NEC; Nokia Siemens Networks
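A rough illustration of why DP-QPSK is attractive at 100 Gbps: the modulation carries four bits per symbol (two polarizations times two bits per QPSK symbol), so the required symbol rate stays manageable for the transceiver electronics. The back-of-envelope sketch below assumes an illustrative 12% FEC-plus-framing overhead; actual overheads depend on the FEC scheme used.

```python
def dpqpsk_baud_rate(payload_gbps=100.0, overhead=0.12):
    """Approximate symbol rate (Gbaud) for a DP-QPSK wavelength.

    DP-QPSK carries 4 bits per symbol: 2 polarizations x 2 bits (QPSK).
    The line rate adds FEC and framing overhead on top of the payload.
    """
    bits_per_symbol = 2 * 2
    line_rate_gbps = payload_gbps * (1 + overhead)
    return line_rate_gbps / bits_per_symbol

print(dpqpsk_baud_rate())  # 100 Gbps payload -> ~28 Gbaud
```

At roughly 28 Gbaud, a 100 Gbps DP-QPSK channel runs at a symbol rate comparable to earlier-generation designs, which is part of what makes it feasible on existing 50 GHz DWDM grids.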

Sliding Into the Trough


Mobile Subscriber Data Management
Analysis By: Kamlesh Bhatia


Definition: Mobile subscriber data management (SDM) is the activity whereby parts of datasets related to the same subscriber, but physically located in different repositories, are pulled together, either physically (onto a consolidated database) or virtually (into a consolidated view), and then leveraged for insights. Several existing data repositories and subscriber network databases hold subscriber data within the wireless communications service provider's (CSP's) infrastructure, including:

- Home location register (HLR)
- Home subscriber server (HSS)
- Authentication, authorization and accounting (AAA)
- Other repositories for subscriber-related data that reside within the CSP's network

Other subscriber databases, which may or may not be centralized in main data centers, include:

- Presence servers
- Location servers
- Device preferences
- "Buddy lists"

These databases are fundamentally different from other IT-based systems, such as those used in operations support systems or business support systems. Subscriber network databases are distributed in the network, close to switching and routing; they contribute to real-time call flow and service delivery, and low latency is extremely important, whereas for IT systems it is not. Mobile subscriber databases also exclude databases that reside in the Internet Protocol cloud but are not distributed in the CSP network, and are used by over-the-top providers such as Skype and Google. While the HLR and even the HSS are no longer new network elements, the past decade has seen these concepts evolve from proprietary, limited-capacity, switch-based platforms to commercial platforms: massively scalable distributed systems with increased geographical redundancy, and the ability to deliver innovative services in addition to just voice and data.

Position and Adoption Speed Justification: The concept of mobile SDM has been spreading throughout the vendor community during the last decade and, in the past few years, rollouts have begun. The CSP community has to actively pursue financial opportunities from its subscriber data (beyond simply upgrading platforms for network databases) as CSPs become more skilled over time at exploiting customer data insights for real-time marketing. But CSPs and vendors alike are struggling to find services and revenue that could be directly linked to mobile SDM. Having said that, SDM is now being closely aligned with other control plane elements, such as the Policy and Charging Rules Function (PCRF), to bring out more subscriber- and context-aware capabilities that CSPs can leverage to generate new revenue or improve the customer experience.

User Advice: While the initial driver to move to mobile SDM may sometimes be the need to upgrade legacy switches, CSPs need to think beyond this.
SDM, combined with CRM, Web analytics, policy management and deep packet inspection capabilities, can unlock a whole new
set of revenue opportunities, by using real-time marketing insights about subscribers to improve the customer experience and proactively offer more tailored products and services to certain customer groups. However, this may also trigger a much deeper integration of networking and IT technologies; vendors and CSPs should actively plan for such integration over time. Some vendors have now started to embrace this coherent network and IT concept in their latest subscriber data management offerings, especially in conjunction with next-generation service delivery platforms.

Business Impact: The ability to capture and leverage mobile subscriber data offers the potential to fundamentally change the role of the mobile CSP. Rather than just sitting on their subscriber data, CSPs have to actively develop new service and revenue opportunities. This could mean hardware and software platform upgrades (which take time and investment); or, for example, CSPs securely exposing and utilizing subscriber data from the network to profile customers for business purposes. This might enhance CSPs' ability to proactively offer more tailored products and services to certain customer groups, generating new revenue opportunities and, most importantly, improving the customer experience. A unified view of subscribers across networks can also help to resolve problems with dirty or duplicate data, cutting down revenue leakage.

Benefit Rating: High

Market Penetration: 1% to 5% of target audience

Maturity: Adolescent

Sample Vendors: Alcatel-Lucent; Bridgewater Systems; Ericsson; HP; Huawei; Nokia Siemens Networks; Openet; Tekelec; ZTE

Recommended Reading: "Competitive Landscape: Subscriber Data Management Solutions, Worldwide" "Emerging Technology Analysis: How CSPs Can Cut Costs and Charm Customers With Integrated Policy and Charging Control"
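The "virtual consolidation" idea behind SDM can be sketched in a few lines. The repository and field names below are hypothetical; the point is that record fragments held in separate network databases (HLR, HSS, presence and so on) are merged into one subscriber view without physically moving the source data.

```python
def consolidated_view(subscriber_id, repositories):
    """Merge one subscriber's fragments from several repositories into a profile.

    `repositories` maps a repository name (e.g., "hlr") to its records,
    keyed by subscriber ID. Keys in the result are namespaced by source,
    so conflicting field names from different repositories cannot collide.
    """
    profile = {"subscriber_id": subscriber_id}
    for repo_name, records in repositories.items():
        fragment = records.get(subscriber_id, {})
        for key, value in fragment.items():
            profile[f"{repo_name}.{key}"] = value
    return profile

# Hypothetical fragments for one subscriber, spread over three repositories.
repos = {
    "hlr": {"sub1": {"imsi": "26201xxxx", "roaming": True}},
    "hss": {"sub1": {"ims_enabled": True}},
    "presence": {"sub1": {"status": "online"}},
}
view = consolidated_view("sub1", repos)
```

A production SDM layer adds what this sketch omits: low-latency access from the call path, geographical redundancy and conflict resolution between sources.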

Energy Management Gateways


Analysis By: Ian Keene; Zarko Sumic

Definition: Customer energy management gateways allow consumers to become part of the smart grid by enabling self-service energy provisioning on-premises, linking and integrating consumer energy management, and having smart appliances increase energy efficiency and enhance the consumer's quality of life. The gateway is the interactive energy interface to the customer's home, or for energy management in buildings. Opportunities are emerging for CSPs to provide the telecommunication services needed to connect devices and homes to the utility's smart-grid infrastructure and services.

Position and Adoption Speed Justification: High and volatile energy prices, coupled with an emerging consumer willingness and desire to take control of previously inaccessible consumer decisions, are leading to a resurgence in gateway deployments. Customer gateways, along with an
advanced metering infrastructure (particularly for the mass market), as envisioned by the U.S. Energy Policy Act (EPAct) of 2005, the Energy Independence and Security Act of 2007 and the American Recovery and Reinvestment Act of 2009 in the U.S., and by numerous initiatives in other countries, enable the consumer to make effective choices about energy consumption, primarily in response to price signals through programs such as economic demand response. Often, all of this is done through systems that are programmable by the consumer to alleviate the impact of high prices by reducing consumption or shifting consumption to lower-priced hours. Customer gateways are an enabling technology for energy technology consumerization, and an interface between the utility and home energy management (HEM) and home-area network (HAN) systems. These gateways are critical components for customer inclusion in energy markets through on-site renewable generation and storage, as well as energy-efficiency programs. In addition to being a stand-alone device, a consumer gateway can be provided by utilities and integrated in an AMI (smart-metering) solution, or can be an integral part of a home energy management/consumer energy management solution.

User Advice: Energy IT organizations should prepare for the impact of IT and energy technology consumerization by providing communication links and control schemas to incorporate customer-installed gateways, home energy management systems and smart devices. Gateways are also a point of handshake between utility-deployed smart-grid IT and operational technology (OT) and the home energy management solutions provided by consumer technology vendors. CSPs should work with utilities and look for new service revenue opportunities from the supply of the communications infrastructure needed. While this technology initially had a U.S. focus, opportunities are now global.
Business Impact: Customer gateways will affect energy retail, and will be key enablers of energy technology consumerization by acting as portals between utility and consumer energy technology, such as on-site generation and storage, and smart appliances supporting consumer home energy management and enabling energy provisioning transformation. Costs will be offset, at least partially, by energy savings.

Benefit Rating: High

Market Penetration: 1% to 5% of target audience

Maturity: Emerging

Sample Vendors: Comverge; Echelon; Tendril

Recommended Reading: "Management Update: Top 10 Business Trends Impacting the Utility Industry in 2011" "The Utility of the Future: The Information Utility" "Energy Technology Consumerization: The Quest for Lean and Green"
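The consumer-programmable, price-responsive behavior a gateway enables can be sketched simply. This is an illustrative example under invented prices, not any vendor's implementation: a deferrable load (say, a dishwasher cycle) is shifted to the cheapest contiguous window of hourly prices received from the utility.

```python
def schedule_load(prices_per_hour, run_hours):
    """Pick the cheapest contiguous window of `run_hours` for a deferrable load.

    Returns (start_hour, total_cost). Prices are whatever the utility's
    price signal delivers, e.g., $/kWh for a 1 kWh-per-hour appliance.
    """
    best_start, best_cost = 0, float("inf")
    for start in range(len(prices_per_hour) - run_hours + 1):
        cost = sum(prices_per_hour[start:start + run_hours])
        if cost < best_cost:
            best_start, best_cost = start, cost
    return best_start, best_cost

prices = [0.32, 0.30, 0.12, 0.10, 0.11, 0.28]  # hypothetical $/kWh by hour
start, cost = schedule_load(prices, run_hours=2)  # picks the 0.10/0.11 window
```

Real demand-response programs layer consumer preferences and utility override events on top of this basic shift-to-cheap-hours logic.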


802.22
Analysis By: Akshay K. Sharma; Peter Kjeldsen; Sylvain Fabre

Definition: The Institute of Electrical and Electronics Engineers (IEEE) 802.22 is a standard for wireless regional area networks that uses white space (the unused guard bands in the TV frequency spectrum). It leverages newer cognitive radio techniques to reuse unused spectrum for wireless broadband access, and operates in the very high frequency/ultrahigh frequency TV broadcast bands between 54 MHz and 862 MHz. This standard could lead to devices that enable broadband access via white space spectrum, and to newer communications service providers (for example, over-the-top providers such as Microsoft and Google) entering this area. This spectrum can also be used in rural areas for broadband services.

Position and Adoption Speed Justification: Initial drafts of the 802.22 standard specify that the network should operate on a point-to-multipoint basis, whereby the system has a base station and customer premises equipment (CPE) topology, much like a cellular network. One key feature of the wireless regional area network is that the CPE will sense the spectrum to determine whether newer channels should be used, dynamically selecting channels to avoid interference. Manufacturers and users of semiconductors, PCs, enterprise networking devices, consumer electronic devices, home networking equipment and mobile devices should follow the progress of this standard. Newer prototypes, termed "Super Wi-Fi," are appearing. This technology provides Wi-Fi over the TV white space spectrum, in which lower frequencies can reach further and penetrate buildings more easily than standard Wi-Fi radios, which implement the IEEE 802.11 specification and run in the unlicensed 2.4 and 5 GHz bands.

User Advice: It is too early to plan for 802.22 devices.

Business Impact: The main aim of utilizing white space is to provide access to high-speed wireless data for low-mobility users.
Benefit Rating: Moderate

Market Penetration: Less than 1% of target audience

Maturity: Emerging

Sample Vendors: Dell; Google; HP; Microsoft; Samsung

Recommended Reading: "Emerging Technology Analysis: Potential of TV White-Space Spectrum for Enterprise/Business Use"
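The spectrum-sensing behavior that 802.22 CPE must implement can be illustrated with a simplified sketch (the detection threshold and readings below are hypothetical, and real sensing is far more involved): the device measures power on candidate TV channels and selects the quietest one on which no incumbent signal is detected.

```python
def select_channel(sensing_results, threshold_dbm=-90.0):
    """Return the quietest vacant channel, or None if all are occupied.

    `sensing_results` maps channel number to measured power in dBm; any
    reading at or above the threshold is treated as an incumbent TV
    broadcaster (or wireless microphone) that must not be interfered with.
    """
    vacant = {ch: p for ch, p in sensing_results.items() if p < threshold_dbm}
    if not vacant:
        return None  # nothing usable; keep sensing
    return min(vacant, key=vacant.get)

readings = {21: -60.0, 24: -95.0, 27: -101.5, 30: -88.0}  # dBm per channel
channel = select_channel(readings)  # channels 24 and 27 are vacant; 27 is quietest
```

In the standard itself, this decision loop runs continuously, so the network vacates a channel as soon as an incumbent reappears.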

Addressable TV Advertising
Analysis By: Andrew Frank


Definition: Addressable TV advertising technologies enable advertisers to selectively segment TV audiences and serve different ads within a common program or navigation screen. Segmentation can occur at geographic, demographic, behavioral and (in some cases) self-selected individual household levels, through cable, satellite and Internet Protocol television (IPTV) delivery systems and set-top boxes (STBs). Because broadband-delivered advertising is inherently addressable, this technology does not apply to broadband-delivered video programming services.

Position and Adoption Speed Justification: The value proposition of TV broadcasting for advertisers was once based on the unique opportunity the medium gave advertisers to reach a nationwide mass audience with relative ease. The proliferation of video content options, first on cable and satellite TV and then on IPTV and the Internet, has fragmented television audiences and raised advertisers' expectations of being able to apply targeting capabilities developed on the Internet across all media. Internet TV providers and video ad networks have little difficulty providing such capabilities to connected TVs, as well as to TV programming viewed on PCs, media tablets and smartphones, whose share of consumer time is growing, although it still lags behind national TV audiences in most regions. Although addressable TV advertising was long envisioned as part of television's evolutionary road map, the competitive threat from online video has led TV service providers to accelerate efforts to provide addressable targeting solutions at scale. These efforts are closely tied to service providers' underlying technology infrastructure, so the efforts of cable companies, satellite providers, telco IPTV providers and hybrid approaches combining over-the-air (OTA) broadcast and the Internet are progressing at varying rates in different regions.
A major factor in TV service providers' efforts to provide addressability is the potential scale of their deployments. Because addressability is used to segment audiences, the underlying audience that can be addressed by a given solution must be large enough to produce segments big enough to interest advertisers. This factor has limited interest in the technology outside the largest broadcasting markets, such as the U.S. and the U.K. (As Nielsen Research has noted, advertiser goals for addressability can be divided into a "reach" strategy, whose main objective is to cover a broad target segment with a message to generate brand awareness, and an "effectiveness" strategy, whose main objective is to engage a more limited target segment that is likely to respond to a specific call to action. In either case, scale is essential, although effectiveness places greater emphasis on the depth and quality of the targeting data.) The scale factor has narrowed the focus on addressability to cable and satellite providers, although a few telco IPTV providers, such as Verizon FiOS, are offering zone-based addressability. In the U.S., cable serves about 60% of the 115 million television households, giving it ample scale; however, cable-based addressability on a national scale has proved elusive. A few cable operators have achieved regional addressability (such as Cablevision, using Visible World's Connect product to deliver addressable ads to about 3 million households in mid-Atlantic states), but technical issues with legacy set-top boxes and headend systems continue to impede the industry-backed efforts of companies such as Canoe Ventures, a joint venture of the cable industry's largest operators, to offer nationwide multisystem addressability solutions. Digital broadcast satellite (DBS) providers are further along, and are in the process of rolling out addressability solutions based on DVR-based local storage of addressable ads. DirecTV plans to
reach 10 million subscribers when it rolls out in the second half of 2011. Its provider, NDS, is also deploying this technology (called NDS Dynamic) in other regions, notably Sky's AdSmart project in the U.K. At least as significant as the technology and audience size issues are business issues affecting the adoption of addressable TV advertising. There are several to consider:

- General inertia and recalcitrance of the TV advertising market, for which addressability represents a disruption to entrenched business practices.
- Fragmentation of audiences, which represents a large jump in the complexity of media packaging and sales processes.
- Uncertainty as to how much additional value advertisers will assign to targeting (especially in high-spending, low-consideration sectors, such as consumer packaged goods), and whether this additional value will be sufficient to offset costs.
- Significant privacy concerns associated with household segmentation.
- Issues regarding the allocation of revenue and control over addressable ads among broadcasters, distributors and third parties.

It has become clear in the past year or so of trials that the measurable value of addressability varies considerably among marketing sectors and objectives. Communications service providers (CSPs) have found the technique to be effective in marketing new capabilities to customers on their networks, based on knowledge of what those customers already have. Automotive manufacturers have found value in the ability to switch the final image of an ad (referred to as an "art card") to a screen that contains the address, phone number and Web link of a consumer's nearest dealer. (Comcast Spotlight, the cable operator's advertising arm, refers to this capability as "Adtag.") Automotive advertisers are also among the sectors most willing to pay a premium to reach consumers who are "in-market," which, in their case, means shopping for a car. The cable industry's legacy infrastructure barriers have led to estimates of six years or more to provide nationwide addressability in the U.S. A popular belief is that more-advanced providers will break ranks and deliver the capability before then, possibly by employing over-the-top (OTT) technologies that use broadband rather than broadcast-based solutions. In any case, it is likely that the efficiencies derived from audience segmentation will eventually prevail, and the practice will become widespread, at least within certain marketing sectors. Some regions will certainly outpace others, but the overall process of change represented by addressable TV ads appears slower than expected, leading us to extend its time to maturity to five to 10 years from a more optimistic assessment in the previous year. Finally, it remains to be seen whether incumbent TV distributors, Internet-based challengers, or broadcasters and programmers will control the most lucrative part of the value chain.

User Advice:


- Advertisers must consider how to position their media and sales strategies and privacy policies against emerging TV-ad-targeting technologies.
- Agencies must offer multichannel campaign management services that include support for various emerging segmentation and targeting capabilities in media. They must also invest in data analytics to gain the ability to use emerging targeting techniques effectively.
- Agencies and advertisers should work with platform developers and metric providers to define standard metrics that will enable transparency and optimization amid the complexity that will result from addressable capabilities.
- Broadcasters must ensure that they preserve their direct relationships with advertisers and don't get intermediated by platforms over which they have little control. They must bargain aggressively to minimize intermediary revenue splits, and may play competitive factions against one another to prevent lock-in to any single-provider solution.
- Internet portals, OTT services and ad networks can continue to exploit delays in TV addressability by engaging more with traditional video-oriented advertising agencies and advertisers to develop online targeting and segmentation strategies and capabilities. They should also press for interoperability of TV standards with Internet and mobile channels, and challenge proprietary service provider data, such as customer addresses, being used for ad-targeting purposes without explicit advance informed consent.
- Multichannel video programming distributors (MVPDs) must re-evaluate how various Internet advertising technologies might present opportunities to offer advanced advertising capabilities alongside in-band approaches, and be wary of platform-specific solutions that require extensive alignment among fragmented operators.

Business Impact: Addressable TV advertising technologies affect advertisers, advertising agencies, MVPDs, TV networks, CE manufacturers, marketing data providers, television regulators and privacy advocates. These technologies also affect marketing technologists considering the CRM implications of new targeting capabilities. Addressable TV advertising represents an opportunity for mainstream advertisers and broadcasters to benefit from the scourge of audience fragmentation, thus turning a big problem into a benefit. We have reduced the benefit rating from "High" to "Moderate" based on the observation that broadband technologies, which are not included here, are increasingly likely to dilute any impact MVPD in-band solutions might have on the marketplace.

Benefit Rating: Moderate

Market Penetration: 1% to 5% of target audience

Maturity: Adolescent

Sample Vendors: Alcatel-Lucent; BigBand Networks; Black Arrow; Canoe Ventures; Invidi; Microsoft; NDS; OpenTV; Packet Vision; SeaChange International; Visible World

Recommended Reading: "New Television Meets Context-Aware Computing"


"A Scenario for the Future of Television in the Cloud" "Two Roads to TV 2.0"

RF Over Glass
Analysis By: Ian Keene

Definition: Radio frequency over glass (RFoG) is a standard that was proposed by the main cable standards body, the Society of Cable Telecommunications Engineers (SCTE). It is effectively the equivalent of fiber to the home (FTTH) for cable networks, and has been described as "combining the best of RF cable delivery and passive optical network (PON) technologies." The SCTE Engineering Committee completed the first RFoG standard (SCTE 174 2010) in December 2010, and this was adopted by the American National Standards Institute (ANSI) as a standard in May 2011. As with many telecom standards, enhancements continue to be discussed in committee. Outside of North America, international standardization of the SCTE proposal is under way. This move comes at a time when cable operators are under pressure to increase the capacity of their networks to support not only ultra-high-speed broadband, to compete with FTTH services being deployed by wireline communications service provider competitors, but also a future in which large amounts of on-demand and high-definition programming, over-the-top and peer-to-peer video, as well as other bandwidth-heavy applications and services, are expected to become the norm. Reduced operating expenditure (opex) is also a key aspect of the technology. SCTE 174 2010 is a suite of technical standards to support wider use of optical fiber in the cable plant, while also supporting the coexistence of current legacy technologies over cable's hybrid fiber-coaxial (HFC) system architecture, in which voice, video and data share the same spectrum. The RFoG standard defines interoperability with existing headend equipment, back-office systems, digital set-top boxes and Data-Over-Cable Service Interface Specification (DOCSIS) modems. RFoG and HFC can operate out of the same hub, allowing the gradual replacement of coaxial cable with fiber within the delivery network.
RFoG is also compatible with standard PON technologies such as gigabit PON (GPON) and Ethernet PON (EPON), allowing the transition from HFC to RFoG and eventually to PON and an all-Internet Protocol (IP) environment. RFoG can be an overlay to a PON network for the delivery of linear broadcast TV, so it is not only existing HFC networks that may utilize RFoG technology. Proprietary RFoG solutions have been on the market since 2006. Note that the term "RFoG" is often used to describe prestandard and proprietary technologies. Most of these vendor solutions are expected to adopt the SCTE standard, but many will also offer proprietary enhancements on top of the standards-based functionality.

Position and Adoption Speed Justification: A number of cable operators have deployed RFoG (proprietary implementations at the time of writing, but these deployments have mainly been by
small cable operators, are field trials, or are built on new "greenfield" sites). Large multisystem operators (MSOs) are deploying fiber deeper into their networks, but consider HFC and DOCSIS 3.0 to be a sufficiently competitive architecture, at least in the medium term, for the bulk of their subscribers. The "endgame" for a growing number of MSOs is an all-fiber and an all-IP network. RFoG offers a transition strategy, where portions of the existing HFC network can be changed to RFoG while protecting the investment in existing headend, back-office and customer premises equipment. With the RFoG standard supporting PON, MSOs could move to an all-IP infrastructure at some time in the future. The main advantages of RFoG are:

- More RF spectrum, both downstream and upstream, to accommodate more linear broadcast and high-definition TV channels.
- Lower opex, with lower maintenance of outside plant and reduced power consumption.

The main disadvantages are:

- Increased capital expenditure in non-greenfield deployments, particularly the high cost of deploying fiber cable.
- Still limited to DOCSIS 3.0 IP bandwidths, less than potential GPON bandwidths.

Now that the standard has been ratified, cable operators will have less risk of vendor lock-in with RFoG deployments, and deployments are expected to ramp up gradually over the next five years. However, many urban and suburban areas will remain HFC (with fiber continuing to get closer to the street corner) for at least the next three years. User Advice: Cable HFC network operators need to consider opex factors as part of their competitive advantage, as well as what RFoG technologies can mean to that equation. Also, in terms of providing business services to the enterprise and small or midsize business customers, they need to consider what the use of a single network (instead of separate networks) will do to improve their operational success. Operators need to examine the benefits of the longevity of fiber, and the transition path that RFoG can provide toward an all-fiber broadband IP network. Implementation of this technology will require operators to analyze their network needs and requirements in terms of the density of homes passed versus the distance from central headends/hubs, and strike the right balance between bandwidth expansion and opex savings. The costs of deploying RFoG solutions, relative to HFC, in a greenfield environment with medium to low urban density are favorable. On the other hand, in denser urban environments the cost of replacing coaxial cable with fiber is likely to be prohibitive, with the possible exception of multitenant buildings. Some broadband expansion initiatives may also help to drive RFoG. The technology is well suited to extending cable footprints into surrounding rural areas, and could give MSOs a cost-effective way to compete for subsidized rural broadband dollars.

Business Impact: Opex savings will deliver the biggest impact. Key areas are improved network reliability and uptime. RFoG is also expected to deliver major improvements in the powering of outside plant, removing all requirements for outside-plant power (such as backup power and emergency generators), as well as providing more environmentally "green" solutions for operators. Maintenance costs are also a major factor: customers with all-fiber plant could achieve as little as 20% of the maintenance costs of an HFC or copper plant. Altogether, vendors of this solution estimate opex savings for their customers of up to 70% in ideal conditions. Taken together with its bandwidth expansion properties, this means that RFoG could have a significant overall business impact. RFoG solutions can be overlaid/installed as needed within an existing HFC network; they are compatible with all digital services for voice, video and data; and they preserve existing protocols in both the downstream and upstream paths. As more MSOs see an all-IP, all-fiber future, RFoG will be used in a number of scenarios during the next five years:

- New network build (greenfield sites), and in replacement networks where existing plant has reached end of life.
- High-density multidwelling units.
- By small cable operators.
- In low-density (typically rural) areas where opex savings are maximized and government broadband initiatives might come into play.
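The opex claims above (maintenance dropping to as little as 20% of HFC levels, outside-plant power eliminated, total savings of up to 70% in ideal conditions) can be sanity-checked with simple arithmetic. The sketch below is illustrative only; the opex breakdown figures are assumptions, not operator data.

```python
# Illustrative sketch of how plant-maintenance and powering savings
# translate into overall opex savings after an RFoG migration.
# All input numbers are assumptions for illustration, not measured data.

def rfog_opex_savings(annual_opex, maintenance_share, power_share,
                      maintenance_factor=0.2, power_factor=0.0):
    """Return (new annual opex, percentage saved).

    maintenance_share / power_share: fractions of today's opex spent on
    outside-plant maintenance and outside-plant powering (assumed).
    maintenance_factor: remaining maintenance cost relative to HFC
    (the text cites "as little as 20%").
    power_factor: remaining outside-plant power cost (RFoG removes the
    outside-plant power requirement, hence 0 by default).
    """
    maintenance = annual_opex * maintenance_share
    power = annual_opex * power_share
    other = annual_opex - maintenance - power
    new_opex = other + maintenance * maintenance_factor + power * power_factor
    return new_opex, (1 - new_opex / annual_opex) * 100

new_opex, saved_pct = rfog_opex_savings(
    annual_opex=10_000_000,   # $10M/year, illustrative
    maintenance_share=0.5,    # assume half of opex is plant maintenance
    power_share=0.15)         # assume 15% is outside-plant power
print(f"New opex: ${new_opex:,.0f} ({saved_pct:.0f}% saved)")
```

With these assumed shares the model yields roughly half of today's opex, which shows why the vendor-quoted "up to 70%" requires near-ideal conditions (a very maintenance- and power-heavy plant).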

Benefit Rating: Moderate
Market Penetration: 5% to 20% of target audience
Maturity: Early mainstream
Sample Vendors: Arris; Aurora Networks; Cisco; CommScope; Hitachi; Motorola
Recommended Reading: "Forecast Analysis: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"
"Forecast: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"

White Spaces: Unlicensed TV Spectrum


Analysis By: Akshay K. Sharma; Sylvain Fabre Definition: On 12 June 2009, the U.S. moved to digital TV. The "white spaces" in the unused TV frequencies between TV channels can now be used to provide wireless broadband that delivers high-speed Internet access (at 10 Mbps and above) to fixed and low-mobility consumers. In October 2009, the first white space trial was launched in Claudville, Virginia.

In Cambridge in the U.K., Microsoft demonstrated high-definition TV (HDTV) streaming in white spaces. In the rest of the world, the abandoned television channels are very high frequency (VHF). The potential exists for further devices from members of the White Space Coalition, which includes Microsoft, Google, Dell, HP, Intel, Philips, EarthLink and Samsung. In February 2010, the city of Wilmington in North Carolina (and the surrounding county of New Hanover) partnered with the companies TV Band Service and Spectrum Bridge to launch a new experimental network that uses white space spectrum to provide wireless connectivity to surveillance cameras and environmental sensors in a "smart city" deployment. Position and Adoption Speed Justification: In November 2008, the Federal Communications Commission (FCC) in the U.S. unanimously granted free, unlicensed wireless access to chunks of unused airwaves on the broadcast spectrum that had previously been used to buffer TV channels. This access is dependent on mobile technology companies incorporating geolocation capabilities into their devices. These capabilities bar interference with TV signals and give devices the ability to access (via the Internet) a database that confirms which white spaces are available at the device's location. Newer prototype solutions are emerging, called "Super Wi-Fi." In these, Wi-Fi runs over white spaces at lower frequencies, and can reach farther and penetrate buildings more easily than standard Wi-Fi radios, which implement the Institute of Electrical and Electronics Engineers (IEEE) 802.11 specification and run in the unlicensed 2.4 and 5 GHz bands. User Advice: It is too early to plan for white-space devices. Business Impact: The main aim of white space is to provide high-speed wireless data to low-mobility users.
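The geolocation requirement described above can be sketched as a simple lookup: before transmitting, a device reports its position and uses only TV channels with no protected incumbent in range. The sketch below is a toy model; the transmitter records, protection radii and channel range are invented for illustration and do not reflect any real FCC database.

```python
# Toy sketch of the FCC-mandated geolocation lookup: a white-space
# device queries a database of licensed TV transmitters near its
# position and transmits only on channels with no protected incumbent
# within its protection radius. All data below is invented.
import math

# (channel, transmitter_lat, transmitter_lon, protection_radius_km)
LICENSED_TV = [
    (21, 34.40, -77.90, 80.0),
    (36, 34.10, -78.30, 60.0),
]
ALL_TV_CHANNELS = range(21, 52)  # illustrative channel range

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance (haversine), Earth radius 6371 km."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(a))

def available_channels(lat, lon):
    """Channels with no protected TV transmitter within its radius."""
    blocked = {ch for ch, tlat, tlon, r in LICENSED_TV
               if distance_km(lat, lon, tlat, tlon) <= r}
    return [ch for ch in ALL_TV_CHANNELS if ch not in blocked]

# A device near Wilmington, NC asks which channels it may use;
# channels 21 and 36 are protected there, so they are excluded.
print(available_channels(34.23, -77.94))
```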
Benefit Rating: Moderate
Market Penetration: Less than 1% of target audience
Maturity: Emerging
Sample Vendors: Dell; Google; HP; Microsoft; Samsung
Recommended Reading: "Emerging Technology Analysis: Potential of TV White-Space Spectrum for Enterprise/Business Use"

TD-LTE
Analysis By: Joy Yang Definition: TD-LTE, or LTE TDD as it is called by the Third Generation Partnership Project (3GPP), is a time-division duplexing (TDD) version of Long Term Evolution (LTE). According to the 3GPP's definition, TD-LTE will be the successor to Time Division Synchronous Code Division Multiple Access (TD-SCDMA).

TD-LTE differs from LTE frequency-division duplexing (FDD) only at the physical layer. LTE FDD, by contrast, uses paired frequency spectrum, separated by a guard band, to provide uplink and downlink data communications in dedicated spectrum. There is no operational difference between LTE TDD and LTE FDD at higher layers or in the system architecture. TD-LTE has a different frame structure from LTE FDD at the physical layer and requires greater synchronization in the system. Position and Adoption Speed Justification: TDD technology does not require paired spectrum. Transmit and receive signals use the same frequency band and are separated through time-division duplexing, unlike FDD, in which transmit and receive signals work in different frequency bands. TDD has the flexibility to allocate channel capacity to the uplink or downlink dynamically, according to the demands of the traffic. Also, unlike FDD, TDD does not require a guard band between uplink and downlink. Therefore, TD-LTE can make more efficient use of frequency resources. TDD is also an attractive alternative where FDD spectrum is too costly or insufficiently available, as licenses for FDD spectrum are often expensive and difficult to obtain. China Mobile, which owns the Global System for Mobile Communications (GSM) network with the largest number of mobile subscribers and is the only major TD-SCDMA operator, is heavily behind the TD-LTE ecosystem. TD-LTE is also gaining momentum among technology vendors. In 2010, the equipment vendors Motorola, Ericsson, Nokia Siemens Networks, Huawei and ZTE cooperated with China Mobile and demonstrated their TD-LTE solutions at Expo 2010 Shanghai China. By the end of 2010, two commercial launches had been announced: one by Hi3G in Sweden and Denmark, and another by Aero 2 in Poland. In 2011, China Mobile is conducting trials in six cities with seven infrastructure vendors.
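The capacity-allocation flexibility described above can be illustrated with rough arithmetic: a TDD carrier splits its capacity between downlink and uplink by subframe ratio, while FDD dedicates a full paired carrier to each direction. The peak-rate figure and splits below are illustrative assumptions, not values from the 3GPP specifications.

```python
# Illustrative comparison of how TDD reallocates one unpaired carrier
# between downlink and uplink, versus FDD's fixed paired carriers.
# CARRIER_PEAK_MBPS and the subframe splits are illustrative
# assumptions, not 3GPP figures.

CARRIER_PEAK_MBPS = 100.0  # assumed peak rate of one fully loaded carrier

def tdd_capacity(dl_subframes, ul_subframes, total_subframes=10):
    """Split one carrier's capacity by the DL/UL subframe ratio
    (ignoring special-subframe overhead for simplicity)."""
    dl = CARRIER_PEAK_MBPS * dl_subframes / total_subframes
    ul = CARRIER_PEAK_MBPS * ul_subframes / total_subframes
    return dl, ul

# Download-heavy traffic: give the downlink 7 of 10 subframes.
dl, ul = tdd_capacity(7, 3)
print(f"TDD 7:3 split -> DL {dl:.0f} Mbps, UL {ul:.0f} Mbps on one carrier")

# FDD: two paired carriers, each permanently dedicated to one direction,
# regardless of how asymmetric the actual traffic is.
print(f"FDD          -> DL {CARRIER_PEAK_MBPS:.0f} Mbps, "
      f"UL {CARRIER_PEAK_MBPS:.0f} Mbps on twice the spectrum")
```

The point of the sketch is that TDD matches capacity to the traffic mix within a single unpaired band, which is why it suits operators without affordable paired spectrum.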
At the end of April 2011, there were 30 TD-LTE trial networks worldwide. TD-LTE is also attracting the interest of WiMAX operators. WiMAX is another wireless broadband technology based on TDD technology. However, not enough effort has been invested in WiMAX's successor technology, WiMAX 802.16m. With several equipment vendors having announced that their current WiMAX solutions could in future support TD-LTE through software upgrades, many WiMAX operators are likely to migrate to TD-LTE as their next-generation technology. In 2011, India's broadband wireless access license holders are actively testing TD-LTE with vendors. The TD-LTE ecosystem is in the process of maturing. All the mainstream mobile infrastructure vendors, including Ericsson, Huawei, Nokia Siemens Networks, Alcatel-Lucent and ZTE, together with Chinese TD-SCDMA vendors such as Datang, FiberHome Technologies, New Postcom and Potevio, are contributing to it. As of April 2011, seven TD-LTE data cards and USB dongles were available from ZTE, ST-Ericsson, Qualcomm, Altair Semiconductor, Sequans Communications, Innofidei and HiSilicon Technologies. There were also some TD-LTE handsets, tablets, booklets (mini-notebook PCs) and MiFi routers. China Mobile is pushing dual-mode (TDD/FDD) handsets. Apple has announced a dual-mode iPhone road map, to support TD-LTE in 2014. User Advice: Operators that lack the opportunity, or are unwilling, to pay the high license fees for FDD spectrum should consider TD-LTE as an alternative technology for providing mobile broadband services.

Current WiMAX operators should bear TD-LTE in mind as an alternative for technological evolution. When choosing a WiMAX vendor, communications service providers should evaluate the likely ability of the solution to migrate to TD-LTE in the future. WiMAX equipment vendors without TD-LTE offerings should consider using their experience in TDD technology and their "footprint" in the WiMAX market to break into the TD-LTE market. Business Impact: TD-LTE is intended to deliver high-bandwidth, high-quality mobile broadband services to enterprises and residential users, with potential reductions in operational costs for operators.
Benefit Rating: High
Market Penetration: Less than 1% of target audience
Maturity: Emerging
Sample Vendors: Alcatel-Lucent; Ericsson; Huawei; Nokia Siemens Networks; Qualcomm; Samsung; ZTE

Next-Generation Service Delivery Platforms


Analysis By: Martina Kurth Definition: A next-generation service delivery platform (NG SDP) is a set of integrated software components that supports the delivery of Internet Protocol (IP) and non-IP carrier services. The aim with NG SDPs is to create the core of a network/resource-neutral service delivery system that can automate service creation and service management. It provides more flexibility, facilitates faster service creation and enables the combination of many services and service features. Vendors' NG SDP offerings vary according to their key competencies in networks or IT. NG SDPs include many functions, some of which may be considered a business support system or operations support system (OSS) by other vendors. These functions may include:

Service creation:

- Service element creation.
- Service composition.
- Service business conditions.

Service management (traditionally the domain of OSSs):


- Fulfillment.
- Assurance.

Service operation subsystems (traditionally the domain of network resources):

- Databases/registers (centralized or distributed).


- Application execution.
- Integration platform and service-oriented architecture (SOA) environment (SOA service orchestration).
- Identity management (and profile management).
- Content management subsystems.
- Call-processing subsystems.
- Web portal technology.
- Mobile portal technology.

Enablers and application programming interfaces (APIs):


- A set of APIs to interact with and virtualize the carrier's network elements.
- A set of APIs to connect services, including third-party content services and virtual network operator services.
- A set of APIs to connect to carriers' existing applications, such as billing, customer relationship management, ERP and OSS applications.
- Service enablers providing presence and location data, which can be shared between applications. Increasingly, enablers also include application storefronts and devices to create new content and services.
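The third-party exposure idea behind these APIs can be sketched as a small broker: partners authenticate, call shared enablers such as presence, and their usage is metered for later settlement. This is a toy model; all class, method and data names below are hypothetical, not any vendor's actual SDP API.

```python
# Toy sketch of an SDP exposure layer: third parties call network
# enablers (here, presence) through a broker that authenticates the
# partner and meters calls as input to settlement/charging.
# All names and data are hypothetical.

class ExposureBroker:
    def __init__(self):
        self._partners = {}   # api_key -> partner name
        self._usage = {}      # partner name -> number of enabler calls
        self._presence = {"alice": "available", "bob": "busy"}  # sample data

    def register_partner(self, name, api_key):
        self._partners[api_key] = name

    def get_presence(self, api_key, subscriber):
        """Presence enabler, exposed to authenticated partners only."""
        partner = self._partners.get(api_key)
        if partner is None:
            raise PermissionError("unknown API key")
        self._usage[partner] = self._usage.get(partner, 0) + 1
        return self._presence.get(subscriber, "unknown")

    def usage_report(self, partner):
        """Per-partner call count, e.g. as input to partner settlement."""
        return self._usage.get(partner, 0)

broker = ExposureBroker()
broker.register_partner("maps-startup", api_key="k-123")
print(broker.get_presence("k-123", "alice"))   # a permitted enabler call
print(broker.usage_report("maps-startup"))     # one metered call so far
```

The same pattern generalizes to location, charging and storefront enablers; the broker is where policy, abstraction and settlement logic would live in an NG SDP.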

The underlying transmission services can be traditional telecom services (voice, data or mobile), Session Initiation Protocol services or Internet services. Over time, NG SDPs will evolve toward an environment in which various NG SDPs, handling specific services, are integrated and can work together through the use of common capabilities. The concept of an end-to-end integrated service infrastructure or service network is emerging. This software and hardware infrastructure will include all service-enabling functions of a future communications service provider (CSP). Therefore, it will include functions such as device management, user device clients, content management, end-user data, policies, next-generation OSSs, parts of the control layer (IP Multimedia Subsystem or Internet protocols) and even end-user applications. Position and Adoption Speed Justification: CSPs have gone through numerous generations of SDP and consequently have a complex, expensive and slow service delivery environment. NG SDPs have an architecture that aims to simplify the complex service delivery environment to a minimal number of SDPs that work together and use modern software principles, such as SOA, for greater efficiency and lower costs.

CSPs understand the concept of a horizontal service delivery environment with third-party exposure. However, the current reality is that SDPs are deployed as multiple (often vertical) platforms, which interoperate. At present, SDP implementations that support mobile services and mobile functionality are the most advanced. Nevertheless, the worldwide market for NG SDP products and services has matured over the past year. Many CSPs around the world are either evaluating NG SDPs or at various stages of implementing them, in order to enable the business case for new converged services while mitigating legacy investment in telecom networks and IT. NG SDP infrastructures are gradually maturing as CSPs pursue modularized, evolutionary enhancements of their existing architectures. Additionally, SDP architectures show greater alignment with CSPs' needs. NG SDPs are becoming key business enablers as the focus shifts toward new business models that entail third-party participation, revenue generation and improvements to the customer experience. A key factor in this context is the monetization of existing network and IT assets. Therefore, application development and legacy interoperability are among the main drivers of CSPs' investments in NG SDPs. This trend is also accelerating network and service exposure, amplifying partnerships with the "ecosystem" of third-party developers and content and application providers, and driving next-generation intelligent network (NG IN) integration. The main focus of CSPs is the migration of NG IN and telco services to maximize revenue based on existing infrastructures. We also see a lot of hype around the enablement of application stores, and around device management and analytics in this context. Over time, we expect the SDP increasingly to take on the role of service broker, with parts of the SDP moving into the cloud.
Network and service exposure, and APIs for third-party content providers (with a focus on content creation and delivery, as well as related partner settlement and charging), are very likely to follow. User Advice: CSPs should invest swiftly in a more agile creation and delivery environment for innovative services so as to be able to anticipate new value chains and sources of revenue. However, they should refrain from a risky "big bang" approach. Instead, a step-by-step deployment model should be used, based on a modularized, horizontal evolution and proven return on investment for each module. CSPs should center their immediate efforts on pragmatic enhancements to existing telco services in order to leverage legacy assets for new composite services. However, IT matters such as Web 2.0, service exposure and device enablers, application stores, as well as improvements to the user experience, will simultaneously become more imperative. CSPs should not wait too long to tap into new domains, such as third-party abstraction and application stores, in order to gain experience of the new service delivery environment. For example, enablers such as presence and location could be exposed to third parties to help build and later charge for innovative IT services, using, for example, social media on the Internet. Consumers, small and midsize businesses and branch-office users could all benefit from an

increased choice of services. However, large business users can develop or source customized services, so the effect is likely to be less dramatic there. To link all software components in a service delivery environment, CSPs will need to select and implement appropriate enterprise service bus and SOA tools. They must define their architecture and ensure ease of integration by minimizing the number of tools and vendors they use. For cost-efficiency, scalability and time-to-market reasons, CSPs should evaluate alternative SDP delivery models, such as hosted software as a service (SaaS), platform as a service (PaaS) and the cloud. Multitenant services, third-party content creation and settlement, as well as enterprise services, may be particularly well-suited to these models. Business Impact: NG SDPs will have a profound effect on the service experience of end users in the long term. From a technological perspective, NG SDPs support the business case for new converged services without requiring heavy network investments. Once these platforms are well understood and embraced by major carriers, additional innovative services will become available to users through a high-performance and highly secure "carrier-grade" service environment. The effect on CSPs' product creation capabilities will be significant as SDPs enable new content and services. Flexible and open NG SDPs will also play an important role at the core of NG service networks, where they will be crucial for enabling the "multidimensional" next-generation telco business model. In this model, end users can also be "producers," additional revenue streams can come from non-end-user third parties (for example, advertisers), and CSPs work together to increase their reach. Additionally, carriers are enablers and wholesale providers.
Benefit Rating: Transformational
Market Penetration: 5% to 20% of target audience
Maturity: Emerging
Sample Vendors: Accenture; Alcatel-Lucent; Ericsson; HP; Huawei; IBM; Nokia Siemens Networks; Oracle

IMS
Analysis By: Bettina Tratz-Ryan Definition: IP Multimedia Subsystem (IMS) is a standardized, open architecture based on Session Initiation Protocol (SIP). It defines how applications and services are delivered to customers, regardless of the access network on which they run. IMS separates session control from the actual applications for maximum flexibility, and standardizes the signaling and control layer, together with network-based and Web-enabled applications and services. It helps carriers build their strategy on the convergence of platforms, technology solutions and services, as well as on end-user devices and terminals, including handsets and customer premises equipment. Within IMS, the Policy and Charging Rules Function (PCRF) is the policy entity that forms the link between the service and transport layers. The PCRF collates subscriber and application data, authorizes quality-of-service

resources, and instructs the transport plane on how to proceed with the underlying data traffic. This function becomes especially interesting when communications service providers need to deal with traffic from over-the-top players and Web service providers. Position and Adoption Speed Justification: The architecture around the IMS topology and the logic behind it has matured. However, implementation has been mostly deployed by service providers to fulfill certain needs and requirements on a network level or a service environment, such as to build out converged data, voice and collaboration applications for specific enterprise and consumer customers. Many deployments stem from fixed-line operators in their voice migration toward next-generation voice, implementing rich communication suites in lockstep. During the past few years, the migration from an existing network topology to IMS continued to be quite complex. There are many competing IMS offerings from vendors, and service providers have selected vendor "ecosystems" to optimize an IMS-based network transformation solution around their specific core network requirements. In addition, the IMS application enablers, together with initiatives such as IMS Rich Communication Suite (RCS), will support the creation of compelling new services that will provide subscribers with an end-user experience they are willing to pay for. This remains a critical issue, because many blended services receive only a limited return on investment. Mobile operators are pursuing the IMS topology from a home subscriber server and application layer perspective and are interested in IMS RCS delivering a standardized set of applications (presence, instant messaging and active directory) using existing telephony. Wireline operators need a carrier-grade, future-proof voice over Internet Protocol (VoIP) platform blended with IPTV, and they are interested in fixed-mobile service convergence for enterprise customers.
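The PCRF role described in the definition above (collate subscriber and application data, authorize quality of service, instruct the transport plane) can be sketched as a small policy-decision function. This is a toy model: the tiers, application classes and bit rates are invented for illustration, and a real PCRF communicates its decisions over standardized interfaces rather than a Python dictionary.

```python
# Toy sketch of a PCRF decision: combine subscriber data and the
# application class of a traffic flow into a QoS rule for the
# transport plane. All tiers, classes and rates are invented.

SUBSCRIBERS = {"sub-1": {"tier": "premium"}, "sub-2": {"tier": "basic"}}

# (tier, app_class) -> (max_downlink_kbps, scheduling priority)
POLICY_RULES = {
    ("premium", "video"): (8000, 2),
    ("premium", "voip"):  (256, 1),
    ("basic",   "video"): (2000, 5),
    ("basic",   "voip"):  (128, 3),
}

def pcrf_decide(subscriber_id, app_class):
    """Return the QoS rule the transport plane should install."""
    sub = SUBSCRIBERS.get(subscriber_id)
    if sub is None:
        return {"action": "reject", "reason": "unknown subscriber"}
    # Unknown application classes fall back to a best-effort rule.
    rate, prio = POLICY_RULES.get((sub["tier"], app_class), (64, 9))
    return {"action": "install", "max_dl_kbps": rate, "priority": prio}

print(pcrf_decide("sub-1", "video"))  # premium video gets a generous rule
print(pcrf_decide("sub-2", "voip"))   # basic VoIP gets a narrow one
```

The same decision shape is what lets operators treat over-the-top traffic differently per subscriber and application, which is why the PCRF is singled out in the definition above.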
IPTV should not be seen as a panacea for successful IMS implementation, but rather as a supplemental service capability. Control, as well as abstraction layers into service delivery architectures, will enable the communications service providers (CSPs) to build out services to vertical customers such as energy utilities. An increasing number of service bundles for smarter applications, such as e-healthcare and video surveillance via sensors and remote cameras, will become offerings of CSPs, and IMS will become a facilitator for those services. User Advice: IMS is a promising long-term architecture for session control, but it needs considerable system integration to deliver the promised results. Consider vendors that have experience in technology migration and have the necessary capabilities to provide the integration and application development skills. Deploy IMS-capable softswitches and proceed with IMScompliant service delivery platforms. Watch for new service opportunities in the RCS community to justify mobile network development, especially for Long Term Evolution (LTE) multimedia services. The success hinges on the quick availability of IMS terminals. Business Impact: Network service providers will be able to support, control and charge for differentiated session delivery for multimedia services with a standards-based architecture. The world's largest carriers will embrace IMS, so it will help in carrier interoperability. In the longer term, IMS will be applicable to fixed and mobile operators, as well as cable operators. Network service providers will also be able to collapse various network layers to gain cost savings, because in future iterations of the architecture, IMS will be able to support both fixed and mobile sessions. Residential and enterprise customers will be able to obtain more "carrier grade" services via a single terminal, with one authentication point for address book, voice and multimedia mail, and value-added services. 
In the future, even television via IP could be linked using IMS. IMS will have an impact on
future mobile LTE broadband network designs and, therefore, future services that will be enabled on this architecture.
Benefit Rating: Moderate
Market Penetration: 1% to 5% of target audience
Maturity: Adolescent
Sample Vendors: Alcatel-Lucent; Ericsson; Huawei; Italtel; Metaswitch Networks; NEC Japan; Nokia Siemens Networks; Sonus; ZTE
Recommended Reading: "Magic Quadrant for Softswitch Architecture"
"Vendor Rating: Alcatel-Lucent"
"Vendor Rating: Ericsson"
"Vendor Rating: Huawei"
"Vendor Rating: ZTE"

10G-PON
Analysis By: Ian Keene Definition: 10 Gbps passive optical network (10G-PON) is a next-generation solution following the current-generation gigabit passive optical network (GPON) (ITU-T G.984) and Ethernet passive optical network (EPON) (IEEE 802.3ah) solutions, basically offering higher bandwidth and additional features. Like its predecessors, 10G-PON will allow multiple users to share the capacity over a passive fiber-optic "tree" infrastructure, where the fibers to individual users branch out from a single fiber running to a network node. In September 2009, the Institute of Electrical and Electronics Engineers (IEEE) approved 802.3av as a 10G-EPON standard, including both 10/1 Gbps and symmetrical 10 Gbps implementations. In October 2010, the International Telecommunication Union (ITU) approved the ITU-T G.987.1 and G.987.2 10G-PON standards with asymmetrical and symmetrical implementations. Asymmetrical 10G-PON (specified by the Full Service Access Network [FSAN] as XG-PON1) provides 10 Gbps downstream and 2.5 Gbps upstream. Symmetrical 10G-PON (specified by FSAN as XG-PON2) provides 10 Gbps both ways. 10G-PON uses different downstream and upstream wavelengths (1,577 nanometers [nm] and 1,270 nm, respectively) to those used by GPON, so that both systems can coexist on the same fiber architecture. This allows communications service providers (CSPs) to supply GPON services to the majority of their subscribers, while providing higher-bandwidth 10G-PON to premium subscribers, such as enterprises, or for the deployment of broadband to high-density multidwelling units.

Position and Adoption Speed Justification: Outside Japan and China, 10G-PON is expected to dominate over 10G-EPON. The U.S. might be an exception to this, as multiple system operators there have traditionally tended to favor EPON over GPON. Growing numbers of CSPs are conducting 10G-PON trials, with examples in the U.S., Portugal and France. The fixed-access market is a high-volume market, and the challenge facing current-generation fiber to the home (FTTH) PON technologies is related more to deployment cost than to limited capacity. This implies that the time it will take 10G-PON to reach the Plateau of Productivity on the Hype Cycle will probably be determined by how quickly the total cost of ownership (TCO) for 10G-PON comes close enough (usually within 15% to 25%) to the TCO of current-generation PON to become attractive. When it reaches this point, most "greenfield" deployments will switch to 10G-PON. However, reaching this cost point can really be a possibility only for asymmetrical 10G-PON, due to the high cost of 10 Gbps upstream lasers within the optical network terminal. The evolution in the TCO for CSPs will be different from the evolution in the equipment cost for technology providers. The equipment cost is a relatively minor component of the FTTH TCO, and the price tag that technology providers will put on 10G-PON will be shaped not only by production cost, but also by the competitive landscape that providers of 10G-PON solutions will be facing. Another scenario for widespread deployments of 10G-PON solutions that could potentially move 10G-PON to the Plateau of Productivity is an upgrade scenario in which current-generation PON deployments start to run out of bandwidth; however, with CSPs struggling to fully leverage the bandwidth of current-generation GPON/EPON solutions, this scenario is less likely than the near-cost-parity scenario.
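The cost-parity reasoning above can be made concrete with rough arithmetic: because equipment is a minor share of FTTH TCO, even a large equipment price premium produces a much smaller TCO premium. The shares and premium below are illustrative assumptions, not market data.

```python
# Rough arithmetic behind the total-cost argument above. The equipment
# share and price premium are illustrative assumptions, not market data.

def tco_premium(equipment_share, equipment_premium):
    """TCO premium of 10G-PON over current-generation PON when only the
    equipment portion of the total cost carries a price premium.

    equipment_share: fraction of FTTH TCO that is equipment (civil
    works and fiber installation make up the rest).
    equipment_premium: 10G-PON equipment premium, e.g. 1.0 = +100%.
    """
    return equipment_share * equipment_premium

# Assume equipment is 20% of TCO and 10G-PON equipment costs double.
premium = tco_premium(equipment_share=0.20, equipment_premium=1.0)
print(f"TCO premium: {premium:.0%}")  # 20%, inside the 15%-25% window cited
```

Under these assumptions, 10G-PON equipment could cost twice as much as GPON equipment and still land within the 15% to 25% TCO window that the text argues triggers greenfield adoption.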
In either scenario, 10G-PON is likely to compete against wavelength division multiplexing (WDM) PON solutions, with 10G-PON lower on capital expenditure and WDM-PON potentially lower on operating expenditure due to its long (100 km) reach, which can eliminate many central office locations. While the expected time to reach the Plateau of Productivity in this Hype Cycle is based on consumer FTTH as the target market, there is another shorter-term market opportunity for 10G-PON products. There is growing interest in overlaying symmetrical 10G-PON on existing GPON networks to supply enterprises with high-bandwidth, low-latency access to cloud computing services. While low in volume compared with the consumer access market, symmetrical 10G-PON is a technology that CSPs can leverage in their effort to be effective providers of public cloud services. For this reason, 10G-PON has been moved significantly along the Hype Cycle. User Advice: Expect the price premium for asymmetrical 10G-PON relative to current-generation PON to erode over time, as when GPON replaced broadband passive optical network (BPON) (ITU-T G.983). When evaluating the price premium for 10G-PON, ensure you do so from a total cost perspective. With civil works and fiber installation cost typically accounting for most of the total cost, the price

difference between current-generation PON and 10G-PON is lower from a total cost perspective than from a pure equipment cost perspective. When deploying or evaluating current-generation PON solutions, consider future upgrades either to 10G-PON or to WDM-PON solutions. CSPs should evaluate symmetrical 10G-PON as a potential part of any plan to provide enterprise cloud services. Business Impact: 10G-PON could become the mainstream PON technology as the price premium relative to current-generation PON diminishes, but it also offers itself as one of the possible upgrade paths for CSPs that have already deployed current-generation PON solutions.
Benefit Rating: Moderate
Market Penetration: Less than 1% of target audience
Maturity: Adolescent
Sample Vendors: Alcatel-Lucent; Huawei; Motorola
Recommended Reading: "Emerging Technology Analysis: Next-Generation Broadband Access Caters for End-User Bandwidth Appetite"
"Forecast Analysis: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"
"Forecast: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"
"Magic Quadrant for Fiber-to-the-Home Equipment"

Network Sharing
Analysis By: Peter Kjeldsen

Definition: Network sharing is a situation in which two or more communications service providers (CSPs) share network resources, either through joint ownership of network resources or by third-party-enabled network sharing (open networks). In principle, network sharing can happen with any technology, but it is most widely discussed in the context of open fiber-to-the-home (FTTH) networks and the joint ownership of various infrastructure components of mobile networks. Network sharing, especially radio access network (RAN) sharing, is seriously considered in mobile infrastructure construction. The Third Generation Partnership Project (3GPP) has defined network-sharing scenario requirements, architectures and functions in its Release 6. This introduces two network-sharing architectural configurations: the Gateway Core Network (GWCN) configuration and the Multi-Operator Core Network (MOCN) configuration.
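The essential difference between the two Release 6 configurations is how far the sharing extends beyond the RAN. The sketch below is a simplified illustration, not a reading of the standard; the element lists are assumptions chosen to show the contrast.

```python
# Simplified sketch of the two 3GPP Release 6 network-sharing configurations.
# Element lists are illustrative assumptions, not an exhaustive reading of the spec.
SHARING_CONFIGS = {
    # Multi-Operator Core Network: the RAN is shared; each CSP keeps its own core.
    "MOCN": {"shared": {"RAN"}, "per_operator": {"core network"}},
    # Gateway Core Network: sharing extends into core gateway nodes as well.
    "GWCN": {"shared": {"RAN", "MSC", "SGSN"}, "per_operator": {"HLR", "billing"}},
}

def is_shared(config: str, element: str) -> bool:
    """Return True if the given network element is shared under the configuration."""
    return element in SHARING_CONFIGS[config]["shared"]
```

For example, `is_shared("MOCN", "MSC")` is False but `is_shared("GWCN", "MSC")` is True: GWCN pools more of the network, at the cost of tighter coupling between the sharing CSPs.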


In most cases, it will be simpler to share passive infrastructure (fibers, radio towers and so on) than active components (network nodes, radio units and so on), both in terms of allocating shared costs and in terms of the operational challenges of running the network. Both passive and active network sharing require appropriate operations, administration and maintenance (OAM) tools, together with processes that take the shared interests and responsibilities fully into account, and both types of sharing can be significantly affected by regulatory decisions. Finally, depending on the context, it may be relevant to distinguish between scenarios where CSPs decide to share their existing infrastructure and scenarios where two or more CSPs invest in new shared infrastructure. Network sharing as defined in this summary is (a) technology-agnostic; (b) includes both active and passive sharing; and (c) includes both the sharing of existing infrastructure and investments in new shared infrastructure.

Position and Adoption Speed Justification: Network sharing as a business model has been around for some years:

- Joint ownership of resources has been seen as a way to cap rollout costs for mobile networks, and the financial crisis has sharpened CSPs' focus here. Scenarios include different types of sharing that involve joint ownership of backhaul infrastructure, cell towers and RAN base stations, as well as core network components like data switches and softswitches.

- Third-party-enabled network sharing via open networks has attracted regulatory attention because of this model's ability to lower entry barriers for CSPs, while still allowing for differentiation in the higher parts of the value chain. Until recently, open networks were seen mainly in FTTH rollouts involving utilities and municipalities, but the open-network business model has experienced a renaissance as a result of government stimulus packages implemented to mitigate the effects of the financial crisis. FTTH initiatives in many countries illustrate this development, with Singapore and Australia as prime examples.

- In the third-generation (3G) cellular era, RAN sharing has been practiced by T-Mobile and Hutchison 3G in the U.K., Vodafone and Orange in Spain, Telus and Bell Canada in Canada, TeliaSonera and Tele2 in Sweden, Telstra and Hutchison 3G in Australia, and Hutchison 3G and Telenor in Sweden. Ericsson, Nokia Siemens Networks and Huawei are network infrastructure vendors that have proved able to support RAN sharing. Gartner expects that network sharing will attract more attention for Long Term Evolution (LTE) than 3G, as it could prove a cost- and spectrum-efficient way to deploy LTE; examples include Sprint and LightSquared in the U.S. and Yota in Russia.

Note that the business models associated with network sharing will not appeal to all CSPs, and that the estimated time to reach the Plateau of Productivity applies to adoption by relevant CSPs, not to all CSPs. It should also be noted that some CSPs will probably pursue more aggressive network-sharing strategies abroad than in their home markets, as seen with the June 2011 announcement from Telenor and TeliaSonera about their plans for creating a shared mobile infrastructure (2G, 3G and 4G) in Denmark. It is worth noting that network sharing has gone from being a highly controversial undertaking to a commonly accepted strategic option.

User Advice: Network sharing should be considered by CSPs wanting to minimize their investments in the lower parts of the value chain. These are often unattractive, as CSPs are looking for real differentiation in the higher parts of the value chain. However, CSPs should carry out careful analyses before embarking on network-sharing schemes, especially when the sharing will have an irreversible impact on their market position. They should consider network sharing only when this approach resonates with their core strategy, and should carefully analyze whether critical abilities to differentiate against the competition would be lost. Regulatory risks should be evaluated, and the management overhead associated with network sharing should be explicitly addressed in the underlying business case.

Business Impact: The sharing of network infrastructure by CSPs lowers the overall investment needed for basic network infrastructure, thereby reducing risk factors for the individual CSPs' business cases. As such, it tends to attract most attention in capital expenditure (capex)-intensive scenarios unfolding during tough economic conditions. Network sharing levels the playing field between CSPs by reducing the entry barriers for prospective players. The business impact of any instance of network sharing depends on the type of sharing (open or bilateral, for example), the type of infrastructure shared, the regulatory conditions and the competitive landscape in which the sharing occurs. Network sharing is sometimes associated with a return to the monopoly days that preceded widespread telecom deregulation.
However, there is a big difference between the vertically integrated monopolies of the past, which spanned the entire value chain, and the network-sharing schemes of today, which affect only a small part of the value chain. In fact, network sharing can stimulate competition throughout the value chain by breaking down barriers to entry in its lower parts.

Benefit Rating: Transformational

Market Penetration: 5% to 20% of target audience

Maturity: Adolescent

Sample Vendors: Alcatel-Lucent; Ericsson; Huawei; Netadmin Systems; Nokia Siemens Networks

Recommended Reading:
"Dataquest Insight: Radio Access Network Sharing Is One of the Key Success Factors in LTE"


"Dataquest Insight: The Devil Is in the Detail; Making Radio Site Sharing in LTE and 3G Environments Different"
"Australian Government Addresses Competition Problem with National Fiber-to-the-Premises Plan"
"Governments Can Bring Moore's Law to Broadband Access (February 2006 Update)"
"A Business Model for Next-Generation Broadband Access (February 2006 Update)"
"Why Governments Should Care About Fiber-to-the-Home"

IPv6
Analysis By: David A. Willis

Definition: Internet Protocol version 6 (IPv6) is the next version of the Internet Protocol (IP), designed to overcome several key limitations of IP version 4 (IPv4), the most widely used networking protocol. The main benefits of IPv6 are vastly increased address space, integrated security and quality-of-service mechanisms, and support for autoconfiguration and mobility. In addition, large network operators may see better routing stability as platforms mature.

Position and Adoption Speed Justification: IPv4 address exhaustion is occurring. In 2011, the largest blocks were fully assigned by the Internet Assigned Numbers Authority, and the regional registry responsible for assignments in Asia/Pacific (APNIC) is ceasing to allocate IPv4 addresses for networks with more than 1,024 hosts. Other registries will follow, increasingly restricting new IPv4 allocations. As the Internet grows, IPv6 usage will grow along with it. By 2015, 17% of the global Internet will use IPv6, with 28% of new Internet users running the protocol. Yet the installed base of IPv4 devices is huge and will not migrate soon. Through 2020, both the public Internet and the typical corporate/government network will carry both IPv4 and IPv6 traffic. While many public networks will move aggressively, the situation is different for enterprises. For them, the actual depletion date for public addresses is not critical, because most enterprises use relatively few public IP addresses, relying on private IP addressing inside their organizations while managing their pool of routable addresses conservatively. As enterprises migrate to Windows 7, they will see pockets of IPv6 appearing more frequently, because protocol support is built in and used in certain functions (for example, the DirectAccess feature for remote management).
Networked systems that connect the physical world, such as sensor networks, process automation systems, advanced metering infrastructure, asset management and building automation networks, are another area of development. The commercial availability of wireless sensor products that support the v6-only protocol known as 6LoWPAN (IPv6 over low-power wireless personal-area networks), together with industrial automation protocols such as ISA100, could suddenly drive billions of devices onto an IPv6 Internet.


User Advice: The time for selective deployment of IPv6 is near, but a large-scale replacement of existing IPv4 services is ill-advised. Through 2013, we will see the migration of many public-facing services to dual-stack IPv4/IPv6. Because an expanding part of the Internet's population will be natively IPv6, especially in Asia and the developing world, public-facing services (such as Web and mail hosts) should be first on the list to migrate. Serving IPv6 customers via IPv4-only hosts will result in a poor experience for many users, especially as the trend "picks up steam." We expect that carrier-grade network address translation gateways will become overwhelmed, so clients must take charge of providing native support. Another alternative is for the enterprise to use translation gateway capabilities from vendors of application delivery controllers, such as F5, Citrix, Brocade and Radware.

The protocol will rapidly appear in pockets, so IPv6 management must become a system administrator competency. Enterprises should plan to provide limited internal IPv6 support by 2012 if they plan a broad move to Windows 7 and expect to use the DirectAccess feature extensively. This support will take the form of dual-stack, gateway and tunnel capabilities. Those aggressively adopting wireless sensor network technologies should use IPv6-based systems (6LoWPAN).

Beyond the applications already noted, most of the benefits of IPv6 can be delivered with current IP (IPv4) workarounds, such as network address translation and IPsec. Migration costs are very high for established IP networks, and attempts to transition even moderate-size networks have revealed many unexpected problems and hidden costs. The transition should be made in a measured fashion. Fortunately, IPv4/IPv6 coexistence technologies work well at a relatively small scale, which allows nearly all backbone networks to stay with the current protocol.
Enterprises transitioning to new operating systems should be prepared to support some remote IPv6 clients, providing gateway services and dual-stack capabilities as needed. By the middle of the decade, systems should be ready to support IPv6, IPv4 and mixed environments at scale. As businesses replace network infrastructure, they should ensure that their vendors have a credible IPv6 strategy: every piece of infrastructure equipment should have IPv6 support implemented at least in software, and foundational equipment (switches, routers, high-performance gateways) should implement IPv6 handling in hardware. Expect to support both IPv6 and IPv4 through at least 2020. Look for IPv6 Ready Logo certification as a measure of a product's conformance to standards and interoperability.

Business Impact: There will be little or no effect for businesses, which need only avoid the disruptions that new users running the protocol could create. Benefits will be seen only by very large networks operated by carriers and the military. Aggressive adoption will often be disruptive to operations. However, external services serving IPv6 customers must be moved, or enterprises risk delivering a poor user experience.

Benefit Rating: Low

Market Penetration: Less than 1% of target audience


Maturity: Adolescent

Sample Vendors: AT&T; Brocade; BT Group; Cisco; Citrix Systems (NetScaler); F5; Juniper Networks; Nortel; NTT Communications; Orange Business Services; Radware

Recommended Reading:
"Internet Protocol Version 6: It's Time for (Limited) Action"
"Changeover to IPv6: The Deadline Approaches"
"Q&A: Windows 7 DirectAccess Challenges Remote-Access VPNs"
"Planning for the Security Features of Windows 7"
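Two of the points above, the expanded address space and IPv4/IPv6 coexistence on dual-stack hosts, can be illustrated with Python's standard `ipaddress` module. This is a generic illustration, not specific to any vendor product named in this section.

```python
import ipaddress

# The headline IPv6 benefit: address space (2^128 addresses vs. 2^32 for IPv4).
V4_ADDRESSES = 2 ** 32
V6_ADDRESSES = 2 ** 128

# Dual-stack hosts commonly represent IPv4 peers as IPv4-mapped IPv6
# addresses, one of the coexistence mechanisms discussed above.
mapped = ipaddress.ip_address("::ffff:192.0.2.1")   # an IPv6Address
embedded_v4 = mapped.ipv4_mapped                    # the IPv4 address it carries

print(V6_ADDRESSES // V4_ADDRESSES)  # each IPv4 address maps into 2^96 of v6 space
print(embedded_v4)                   # prints 192.0.2.1
```

The `::ffff:a.b.c.d` form is how a dual-stack socket typically reports a connection from an IPv4-only client, which is why IPv6 address handling must become a system administrator competency even on networks that remain mostly IPv4.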

802.16-2009
Analysis By: Phillip Redman; Joy Yang

Definition: IEEE 802.16 is a series of wireless broadband standards authored by the IEEE, with current implementations known as WiMAX technology. The current version is IEEE 802.16-2009, which consolidates and obsoletes IEEE Standards 802.16-2004, 802.16e-2005, 802.16f-2005 and 802.16g-2007. The most popular implementation of the IEEE 802.16 standard is the Mobile Wireless MAN, originally defined by the 802.16e-2005 amendment, which is being deployed around the world in more than 140 countries by more than 475 operators. The current standard succeeds and replaces the one originally defined in 802.16e-2005.

Position and Adoption Speed Justification: Mobile WiMAX still sees limited uptake globally and is used mostly for semimobile fixed access in developing countries. We have not seen accelerated growth, any new infrastructure or device suppliers, or any additional Tier 1 operators choosing WiMAX as their next-generation technology. Even Clearwire, the joint venture between Sprint and other investors and the largest carrier to use mobile WiMAX in a national network, is already testing TD-LTE as a potential replacement for WiMAX. TD-LTE, which is another time division duplex (TDD) technology, has gained strong support from the ecosystem, has been widely tested by communications service providers (CSPs) worldwide, and will be adopted as the replacement for WiMAX. In the end, we think WiMAX will remain a niche technology with minor service adoption.

User Advice: Consider WiMAX as a broadband access service, especially if no alternative infrastructure is commercially available and national roaming isn't needed. WiMAX can also be considered an alternative to Wi-Fi on campuses for enterprise wireless data connections, if deployed in an unlicensed spectrum.
However, enterprise customers should leverage product portfolios or ecosystems in which as many products and solutions as possible are certified, to gain the cost economics and vendor support for technology road maps, especially for end-user devices and terminals.

Business Impact: WiMAX is a semimobile technology that will be used for defined areas, such as DSL fill-in in rural regions, rather than as a nationwide system for voice and data.

Benefit Rating: Low


Market Penetration: 1% to 5% of target audience

Maturity: Adolescent

Sample Vendors: Alvarion; Clearwire Communications; Motorola Solutions; Samsung

Broadband Over Power Lines


Analysis By: Zarko Sumic

Definition: Broadband over power line (BPL) technology, also called power line communications (PLC), is a landline means of communication that uses established electrical power transmission and distribution lines. A service provider can transmit voice and data traffic by superimposing an analog signal over the standard alternating electrical current of 50Hz or 60Hz. Traditionally, the promise of BPL appeared to reside in electrical engineering domains in which looping the transformers was cost-effective (for example, in Europe and Asia/Pacific, where, because of the higher secondary distribution service voltage, several hundred consumers are served by one transformer, as opposed to North America, where only up to seven consumers are served by one transformer). However, with the recent development of new technologies and technological improvements, embedded utility infrastructures can be used to deliver voice, video and data services.

Position and Adoption Speed Justification: Utilities, searching for options to increase revenue, are revisiting BPL and, at the same time, exploring its potential to improve utility functions. Business models that highlight utility-focused applications, such as advanced metering infrastructure (AMI), appear to be driving new implementations, particularly in Europe, where they still have a strong presence. However, other broadband technologies, particularly WiMAX, are evolving faster and moving into position to take large portions of the addressable market for Internet access.

User Advice: BPL technology is maturing, but some technical issues still must be resolved (such as tunneling/bypassing distribution transformers, signal attenuation and radio interference). Distribution feeders are dynamic in nature, resulting in changing network parameters as a consequence of capacitor and line regulator switching for voltage control, as well as sectionalizing and transfer switching.
Utilities should understand that most BPL systems must be retuned for optimal performance every time a distribution component is switched in or out of the network. Therefore, close collaboration should be established between BPL personnel and planning engineers to take BPL dynamics into account in circuit design and operations. BPL continues to lag behind other mainstream broadband communications technologies, which are attracting substantially more R&D investment. Although BPL is not yet fully mature, electric utilities and broadband service providers should follow BPL development and conduct technical feasibility and economic viability studies. BPL appears to be more appropriate as a communications channel for AMI and other utility control and monitoring functions (although some initial deployments have raised performance concerns), and less appropriate for Internet access services. BPL must be evaluated as a vehicle that can increase system reliability, improve the use of the distribution asset, and enable sophisticated energy management and demand-response options, rather than as a new revenue source from entry into the broadband market.

Utilities considering deployment of AMI, focused either on achieving utility-centered operational benefits (such as meter reading cost reduction, outage notification and revenue protection) or on avoiding out-of-schedule meter reads to support retail switching, can still find BPL-based communications appropriate for their needs. However, utilities focused on finer-granularity consumption data to support energy efficiency initiatives, or on the use of consumption data for asset optimization, will find BPL's limited bandwidth an obstacle. Users must ensure that business models, regulatory issues and the proper division between broadband service and utility functions have been addressed before attempting rollouts. In addition, users need to consider that, due to its smaller scale and investment level compared with other mainstream communication technologies, BPL will become obsolete, which will impact product and supplier viability and deployment, resulting in a "stranded asset."

Business Impact: Affected areas include broadband communications and energy management services, such as on-premises, "home-plug-type" provisioning for consumer energy management applications, and AMI or automated-meter-reading deployment projects.

Benefit Rating: Moderate

Market Penetration: 1% to 5% of target audience

Maturity: Obsolete

Sample Vendors: Ambient; Amperion; Echelon; MainNet

Recommended Reading:
"Management Update: Top 10 Technology Trends Impacting the Energy and Utility Industry in 2011"

802.11k-2008
Analysis By: Tim Zimmerman; Michael J. King

Definition: The 802.11k-2008 radio resource measurement is an Institute of Electrical and Electronics Engineers (IEEE) specification designed to standardize the measurement and reporting of performance information from the wireless client to the access point (AP) about the Wi-Fi environment. By evaluating the quality of connections, better decisions can be made about roaming, load balancing and the quality of service offered by the wireless infrastructure.

Position and Adoption Speed Justification: The IEEE approved this standard in mid-2008, but the adoption of 802.11k, as well as 802.11r, has been linked to the Wi-Fi Alliance Voice-Enterprise certification, which is not expected until 2012. The Wi-Fi Alliance has taken the lead as the organization that measures and enforces interoperability within the industry, and until the certification process is complete, few vendors will invest in developing the functionality.

User Advice: As wireless LANs (WLANs) increasingly become the access layer of choice for a growing number of users, confidence in the WLAN must match that in the standard wired LAN. This is only possible when the client and AP have information regarding client interference issues, AP load balancing, client roaming and AP subscription, enabling the wireless infrastructure to make the best decisions based on real-world reporting, rather than solely on AP-to-AP measures or other vendor algorithms. Enterprises that have voice requirements should use the standard, once available, as a baseline for vendor capabilities, because it will be the common denominator against which many will have tested their products. Gartner expects that vendors will continue using proprietary extensions beyond the standard to differentiate functionality and improve quality of service for data, voice and video applications.

Business Impact: 802.11k increases the amount of information about the client that is available to the AP, the client and, by way of AP reporting, the controller application. This will enable better decisions to be made between the client and the AP, for better handoffs and more warning about signal drop-offs or AP oversubscription. As part of the Wi-Fi Alliance Voice-Enterprise certification requirements, the implementation of 802.11k will become a standard part of any voice over WLAN (VoWLAN) solution.

Benefit Rating: Moderate

Market Penetration: 5% to 20% of target audience

Maturity: Emerging

Sample Vendors: Aruba Networks; Avaya; Cisco; Motorola
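The kind of decision that 802.11k-style measurement reports enable can be sketched as a toy example. The field names and the scoring rule below are invented for illustration; real infrastructures weigh many more factors than signal strength and channel load.

```python
# Toy roaming decision using 802.11k-style measurement reports.
# Field names and the scoring rule are invented for illustration only.
def pick_best_ap(reports):
    """Pick the AP with the best mix of signal strength and spare capacity."""
    def score(report):
        # Stronger signal (less negative RSSI) and lower channel load both help.
        return report["rssi_dbm"] - report["channel_load_pct"]
    return max(reports, key=score)["bssid"]

reports = [
    {"bssid": "aa:bb:cc:00:00:01", "rssi_dbm": -62, "channel_load_pct": 70},
    {"bssid": "aa:bb:cc:00:00:02", "rssi_dbm": -68, "channel_load_pct": 15},
]
print(pick_best_ap(reports))  # the lightly loaded AP wins despite weaker signal
```

The point of the example is that with standardized reports the "best" AP is not necessarily the loudest one; an oversubscribed AP with a strong signal can lose to a quieter neighbor, which is exactly the load-balancing benefit the section describes.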

802.11r-2008
Analysis By: Tim Zimmerman; Michael J. King

Definition: The Institute of Electrical and Electronics Engineers (IEEE) has ratified 802.11r-2008 as a standard, an amendment covering fast Basic Service Set (BSS) transitions. The amendment enables a secure, fast hand-off experience for clients while they roam among wireless LAN (WLAN) access points. Fast roaming is achieved by quick reassociation and transfer of security credentials to a new access point after the client moves out of coverage range of the access point with which it was associated. The need for standardized, fast-roaming functionality is most noticeable when deploying loss-sensitive applications, such as voice over wireless LAN (VoWLAN) and mobile video. It is also useful for applications that are designed around wired latency levels. The intent of the amendment is to simplify the process that access points and clients must perform to hand off clients from one access point to the next.

Position and Adoption Speed Justification: Because all vendors have proprietary versions of fast hand-offs for their networks, the adoption of 802.11r has been linked to the Wi-Fi Alliance Voice-Enterprise certification, which is not expected until 2012. The Wi-Fi Alliance has taken the lead as the organization that measures and enforces interoperability in the industry, and, until the certification process is complete, few vendors will invest in developing the functionality.


User Advice: Although ratifying 802.11r as a standard will improve the base level of roaming functionality for devices as they move through the infrastructure, there may be no perceptible difference for most mobile users where the requirement is for data only. Enterprises with voice requirements should use the standard, once available, as a baseline for vendor capabilities, because it will be the common denominator against which many will have tested their products. Gartner expects vendors to continue using proprietary extensions beyond the standard to differentiate functionality and improve quality of service for data, voice and video applications.

Business Impact: 802.11r will become the baseline requirement for enterprises that are looking to deploy VoWLAN. The more metrics that are defined to solidify the wireless environment, the more confidence enterprises will have in using it as their primary access layer communication medium. The standardization of another basic wireless element will motivate vendors to find new capabilities with which to differentiate themselves.

Benefit Rating: Low

Market Penetration: 5% to 20% of target audience

Maturity: Adolescent

Sample Vendors: Aerohive Networks; Aruba Networks; Cisco; HP; Motorola

Long Term Evolution


Analysis By: Joy Yang

Definition: Long Term Evolution (LTE) is a Third Generation Partnership Project (3GPP) initiative to define the requirements and basic framework for the wideband code division multiple access (WCDMA) mobile radio access network beyond third-generation (3G) technology. It is also known as Release 8, probably the last step before fourth-generation (4G) technology. The core specifications for Release 8 were completed by the end of 2007. Objectives for LTE include theoretical data rates of 100 Mbps downstream and 50 Mbps upstream in 20MHz of spectrum; full mobility at speeds of up to 500 kilometers per hour; support for 3G network overlays; and handovers between 3G and LTE. LTE is likely to employ multiple input/multiple output (MIMO), Orthogonal Frequency Division Multiple Access (OFDMA) and single-carrier frequency division multiple access (SC-FDMA) in the link layers. Notably, it will not use code division multiple access (CDMA) for the radio layer, and there is a major operator-driven effort by the European Telecommunications Standards Institute (ETSI) to cap intellectual property royalties for LTE at a maximum of 5% of the cost of the equipment. (For definitions of MIMO, OFDMA, System Architecture Evolution [SAE] and UMTS Terrestrial Radio Access Network [UTRAN], see "Glossary of Mobile and Wireline Carrier Network Infrastructure Terminology, 2010.")

LTE will come in two types, frequency division duplexing (FDD) and time division duplexing (TDD), to support deployments in FDD and TDD spectrum. Currently, most 3G communications service providers (CSPs) are adopting the FDD-based 3G technologies WCDMA and cdma2000. For them, FDD LTE will be a reasonable next-step migration technology. TD-SCDMA, which has been adopted only by China Mobile, and WiMAX 802.16e are TDD-based 3G technologies. China Mobile is heavily promoting TDD LTE and is likely to adopt it as soon as it is ready for commercial use. The WiMAX Forum has claimed that WiMAX 802.16e will be able to migrate to next-generation 802.16m technology, which is intended to provide features competitive with LTE Advanced (LTE-A). WiMAX 802.16e has appeal as a last-mile access technology for fixed broadband networks in emerging markets, and still has its niche market. However, 802.16m competes directly with TDD LTE and TDD LTE-A, which leaves it little chance of success. Furthermore, several major mobile equipment vendors left the WiMAX market in 2008 and 2009, which leaves the WiMAX ecosystem looking slim.

Position and Adoption Speed Justification: LTE gained momentum in 2010 and is continuing to see early adoption in 2011. According to Global mobile Suppliers Association statistics, by the end of May 2011, 20 commercial FDD LTE networks had been launched in 14 countries, and two TDD LTE networks had been launched, by Hi3G in Sweden and Denmark and by Aero2 in Poland. CSPs had announced 154 trials and commitments in more than 60 countries. Western Europe, North America and Japan are the early adopters; CSPs in other markets are aiming to launch services in 2012 or 2013. Some operators have chosen to extend the life of their WCDMA networks by rolling out High-Speed Packet Access Evolution (HSPA+) technology. This may delay the need for LTE by a couple of years. The evolution from High-Speed Packet Access (HSxPA) to HSPA+ is less disruptive than going straight to LTE, and it could also be cheaper if CSPs do not add all the possible enhancements for HSPA+, such as 2x2 MIMO, 64 quadrature amplitude modulation (QAM) and additional carriers. Although LTE's performance is better overall, on 5MHz bands HSPA+ is just as spectrally efficient.
Within LTE, the core network evolution from the general packet radio service (GPRS) packet core is covered by System Architecture Evolution (SAE), along with the Evolved Packet Core (EPC) network elements. LTE still faces challenges in providing traditional voice and short message services, which rely on circuit-based 2G and 3G technology. Options include circuit-switched fallback to 2G/3G networks and voice over IP Multimedia Subsystem (IMS), which the One Voice initiative is working on. Voice over LTE via Generic Access (VoLGA), another LTE voice solution, has been abandoned by the industry. The LTE ecosystem is maturing quickly. By March 2011, there were 98 available LTE user devices, including modules, routers, USB dongles, tablets and phones. However, in the early stages of LTE, applications will focus on mobile data usage based on data cards and USB dongles. The development of tablets will boost subscribers' demand for data bandwidth, which will help drive the LTE market.

User Advice: CSPs have been deploying WCDMA, High-Speed Downlink Packet Access (HSDPA) and High-Speed Uplink Packet Access (HSUPA), and they must carefully consider further upgrades to LTE, which will require new core and radio access networks, as well as new spectrum. End users should not wait for promises of an ideal technology, but rather evaluate price/performance criteria, choose the operator with the strongest service package, and investigate upgrade options for higher-bandwidth packages. Users should expect better performance, but, as with each of the preceding 3GPP network releases, typical data rates for mobile users are likely to be only 10% to 20% of the maximum theoretical rate, though this would still provide a significantly improved experience compared with HSxPA.

Advice for CSPs:

- Evaluate the revenue potential of mobile broadband to justify the cost of investing in technologies like LTE. Plan to segment your service portfolio and offer value-added services to generate more revenue.

- Prepare users for device upgrades by educating them about the benefits of fast mobile access. Focus on the user experience improvements that come with high speeds, including lower latency for voice and better Web surfing, rather than selling technology.

- Set realistic expectations about bandwidth speeds and resilience. Mobile broadband will not be a perfect substitute for fiber or very-high-bit-rate DSL.

- Challenge LTE infrastructure vendors regarding the extent to which their HSxPA/HSPA equipment is forward-compatible with, or "upgradable" to, LTE. Upgrades from WCDMA to HSxPA have shown that "software-only upgrades" can have unexpected hardware impacts.
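The realistic-throughput guidance above translates into rough numbers. The sketch below applies the 10% to 20% rule of thumb to the theoretical peaks quoted in the Definition (100 Mbps down, 50 Mbps up in 20MHz); the helper function is illustrative, not a capacity model.

```python
# Rule-of-thumb check for the realistic-throughput guidance above:
# typical user rates at 10%-20% of the theoretical peak. Peak figures
# are the LTE objectives cited in the Definition (20MHz of spectrum).
PEAK_DOWN_MBPS = 100
PEAK_UP_MBPS = 50

def typical_range(peak_mbps, low=0.10, high=0.20):
    """Return the (low, high) typical user data rate for a theoretical peak."""
    return (peak_mbps * low, peak_mbps * high)

print(typical_range(PEAK_DOWN_MBPS))  # (10.0, 20.0) Mbps downstream
print(typical_range(PEAK_UP_MBPS))    # (5.0, 10.0) Mbps upstream
```

Setting user expectations around the 10 to 20 Mbps downstream figure, rather than the 100 Mbps headline rate, is the practical consequence of the advice to sell experience rather than technology.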

Business Impact: LTE has been widely trialed since 2009, and has been deployed by up to 20 major CSPs in leading markets, such as Western Europe, North America and Japan. It is being deployed to deliver high-bandwidth, high-quality mobile broadband services to enterprises and residential users, with potentially reduced operational costs for CSPs. Benefit Rating: High Market Penetration: Less than 1% of target audience Maturity: Adolescent Sample Vendors: Alcatel-Lucent; Ericsson; Huawei; Motorola; NEC; Nokia Siemens Networks; Samsung; ZTE Recommended Reading: "Marketing Essentials: How to Market LTE as the Next-Generation Mobile Infrastructure" "Magic Quadrant for LTE Network Infrastructure" "Forecast: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update" "Market Trends: China, Third-Generation Cellular Networks, 2011"

MPLS-TP
Analysis By: Peter Kjeldsen


Definition: Multiprotocol Label Switching Transport Profile (MPLS-TP) is a connection-oriented packet-switched protocol which offers communications service providers (CSPs) a data-centric transport solution with conventional transport operations, administration and maintenance (OAM) capabilities. The technology is an attempt to marry the best ingredients of "Ethernet economy" and Synchronous Digital Hierarchy (SDH)/Synchronous Optical Network (SONET) reliability and, as such, aspires to eventually replace SDH/SONET in CSP networks. CSPs expect that MPLS-TP will provide cost-effective mechanisms for aggregating and transporting data with well-defined network management capabilities. MPLS-TP was started as a joint effort between the International Telecommunication Union (ITU) and the Internet Engineering Task Force (IETF). The network architecture and network management principles behind MPLS-TP will be the same as those for SDH/SONET and Optical Transport Network (OTN), and together with its natural fit with MPLS, this should make it a technology that most CSPs will welcome. Recently there have been disagreements between the ITU and the IETF regarding the finalization of the standards, with the ITU-T Study Group 15 in February 2011 voting to proceed with its own OAM solution. It now appears that there will be more than one flavor of MPLS-TP, at least from a standards perspective. This will be a disappointment to CSPs, as the collaboration between the IETF and the ITU toward a unified standard has been seen as a clear advantage of MPLS-TP. Position and Adoption Speed Justification: CSPs are expected to welcome MPLS-TP as a standardized technology for cost-effective transport solutions for carrier Ethernet services.
The battle that raged a few years ago between the Provider Backbone Bridge Traffic Engineering (PBB-TE) (also referred to as Provider Backbone Transport [PBT]) and Transport (T)-MPLS camps did not encourage CSPs to adopt either technology, even though the need for innovation in this space has been evident for some time. The standardization process is now in advanced stages and prestandard commercial products have emerged. CSPs are still expected to quickly adopt the MPLS-TP technology, but the disagreements between the ITU and the IETF will be a concern, and something whose full implications CSPs will want to understand. Therefore, despite the technology having matured and the standards having progressed, we still expect the technology to be more than two years away from the Plateau of Productivity. User Advice: CSPs should evaluate MPLS-TP as a potential technology for their technology portfolio to reap the benefits of a standardized data-centric transport architecture. If they have already deployed PBB-TE or T-MPLS-based solutions, a potential move to MPLS-TP should be evaluated as a likely scenario. CSPs should be aware of the conflict between the ITU and the IETF regarding the final shape of the standard, and evaluate potential risks and possible mitigation options specific to their installed networks and future network strategies. Business Impact: MPLS-TP is expected to offer a standardized solution for cost-effective transport of carrier-class Ethernet traffic.


It is significant that MPLS-TP was started as a joint ITU and IETF effort (even with the recent dispute between the two organizations) and that its architectural principles are based on proven technology that is widely adopted among CSPs. CSPs are risk-averse when adopting new technologies, and MPLS-TP is perceived as a relatively low-risk path toward more packet-centric optical transport architectures. Benefit Rating: High Market Penetration: 5% to 20% of target audience Maturity: Emerging Sample Vendors: Alcatel-Lucent; Ciena; Cisco; Ericsson; Huawei; Nokia Siemens Networks
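For background on the "natural fit with MPLS" noted above: MPLS-TP reuses the standard MPLS data plane, including the 32-bit label stack entry defined in RFC 3032 (20-bit label, 3-bit traffic class, 1-bit bottom-of-stack flag, 8-bit TTL). A minimal encoding sketch:

```python
# Sketch of the 32-bit MPLS label stack entry (RFC 3032), which the
# MPLS-TP data plane reuses unchanged: 20-bit label, 3-bit traffic
# class (formerly EXP), bottom-of-stack flag, 8-bit TTL.
import struct

def encode_label_entry(label, tc, bottom_of_stack, ttl):
    assert 0 <= label < 2**20 and 0 <= tc < 8 and 0 <= ttl < 256
    word = (label << 12) | (tc << 9) | (int(bottom_of_stack) << 8) | ttl
    return struct.pack("!I", word)  # network byte order

def decode_label_entry(data):
    (word,) = struct.unpack("!I", data)
    return {
        "label": word >> 12,
        "tc": (word >> 9) & 0x7,
        "bottom_of_stack": bool((word >> 8) & 0x1),
        "ttl": word & 0xFF,
    }

entry = encode_label_entry(label=16, tc=0, bottom_of_stack=True, ttl=64)
# decode_label_entry(entry) -> {'label': 16, 'tc': 0,
#                               'bottom_of_stack': True, 'ttl': 64}
```

Because the forwarding plane is identical, MPLS-TP's differences lie in the control and OAM layers, which is precisely where the ITU/IETF dispute described above plays out.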

Video Telepresence
Analysis By: Scott Morrison; Robert F. Mason Definition: Video telepresence is a form of immersive video communication that creates the impression of being in the same room as other conference participants. Conference participants appear as life-size individuals on large plasma, LCD or projection screens. Multiple cameras and microphones pick up individuals or pairs of individuals, so that all audiovisual information becomes directional, with good eye contact and spatial sound aligned with the location of the person speaking. Telepresence suites are designed and configured by the system supplier to provide layout, acoustics, color tones and lighting that maximize the perception of realism. The term "telepresence" has slowly but surely become synonymous with "videoconferencing." With most videoconferencing solutions now capable of running high-definition (HD) video resolution at 1,080p (progressive scan) lines of resolution, the technological differences between immersive and nonimmersive forms of telepresence have become blurred. Adaptive solutions, both single- and multiscreen, fit into established environments and cost considerably less than fully immersive suites, while "lite" solutions may be no different from regular HD room videoconferencing, unless integrated to provide some of the nontechnology aspects that make the environment immersive: the way in which the room is configured (with the look, feel and lighting of different rooms being set up to aid the impression of sharing the same physical space with other counterparts), and the integration and management of these systems being such that they don't get in the way of collaboration. Operational simplicity and high availability remain a key part of the added value in telepresence. Systems are designed to enable anyone to use them to their full potential, with little or no prior training, without the connectivity problems associated with traditional room videoconferencing solutions.
Telepresence systems make high demands on the network, with low-compression, three-screen, HD rooms taking anything from 8 Mbps to 45 Mbps of dedicated bandwidth for video and content. They are typically deployed across Multiprotocol Label Switching (MPLS) networks, often dedicated to and designed for video traffic with minimal latency so that users can retain natural levels of spontaneity during interactions with other participants. However, providers are increasingly adopting new encoding standards, such as H.264 Scalable Video Coding (SVC), to allow for the use of best-effort or contended bandwidth. While this initially increases the mix of encoding standards in the market, its future widespread use, combined with a common architecture for enabling multiscreen system interoperability, using Telepresence Interoperability Protocol (TIP), will help to push telepresence islands toward interconnectivity. While many service providers have continued to slow the pace of interconnection, through protracted negotiations on bilateral peering, the process is inexorable. Position and Adoption Speed Justification: The primary use of telepresence in most organizations remains for internal meetings or meetings with clients and partners who come to the company premises as a way to reduce travel. However, the collaboration features lend themselves increasingly to project management and engineering environments, where travel is not necessarily being avoided, but the added dimension afforded by telepresence makes new forms of virtual communications feasible for users who benefit from the increased productivity and speed of decision making as a result. Many telepresence vendors are taking the technologies and capabilities first provided in immersive environments and placing them in lower-cost solutions, down to the desktop level. Vendors are now going beyond interlinking telepresence rooms with other videoconferencing endpoints; a key focus now is on enabling any unified communications soft client to connect. The wider use of H.264 SVC is the key technology mechanism being used, and most vendors have either adopted this or stated publicly their intention to do so by the first half of 2012. Gartner expects telepresence adoption to continue to be driven by larger organizations and specific vertical industries, including financial services, banking, pharmaceuticals, telemedicine, high-technology manufacturing, consumer products and motion pictures/entertainment.
The hospitality and managed office industries are now beginning to roll out telepresence suites for pay-per-use in their conference centers. Growth in demand has been very strong, but from a small base. Only around 2,000 multicodec systems (that is, systems with multiple video screens and cameras) were sold in 2010, along with around 4,000 single-screen systems from the same product lines, against a total market of nearly 200,000 videoconferencing endpoints. The addressable market for telepresence systems is limited to a combination of larger sites in Fortune 5000 companies and specific applications in other sectors, such as government, professional services and healthcare. Year-over-year growth in demand for desktop video is equally strong, and reaches far more users. User Advice: Organizations should consider video telepresence as the high end of their videoconferencing technology requirements, rather than a stand-alone solution. Most mature telepresence deployments have a ratio of one telepresence system per 10 regular room-based videoconferencing systems. Telepresence can deliver a more immersive, higher-quality group videoconferencing experience than a traditional room-based or desktop videoconferencing solution, albeit at a substantially higher cost. When selecting a solution, organizations should consider video telepresence in the context of their wider productivity, collaboration and unified communications programs.
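As a capacity-planning aid, the per-room figures quoted earlier (8 Mbps to 45 Mbps of dedicated bandwidth for low-compression, three-screen rooms) translate directly into WAN requirements. A rough sketch; the 20% headroom factor is an illustrative assumption, not a vendor recommendation:

```python
# Back-of-envelope WAN provisioning for immersive telepresence rooms,
# using the 8-45 Mbps per-room range quoted in this section.
# The headroom factor is an illustrative assumption.

def required_wan_mbps(rooms, per_room_mbps, headroom=1.2):
    """Dedicated bandwidth needed if all rooms can be in calls at once."""
    return rooms * per_room_mbps * headroom

# e.g., four three-screen rooms at a low-compression 45 Mbps each:
print(required_wan_mbps(4, 45))  # 216.0
```

The point of the exercise: even a modest estate of immersive rooms quickly justifies the dedicated, latency-engineered MPLS connectivity described above.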


Business Impact: For regular telepresence users, travel cost reductions and improved productivity will provide a business case with a relatively short payback period, often less than 18 months. Telepresence typically demands a utilization rate of more than three hours per day, three to four times what most organizations achieve with traditional videoconferencing. The cost of some immersive endpoint solutions has fallen to around $35,000 per screen/codec (so a three-screen environment will cost around $100,000), but the costs associated with infrastructure are significant, and the networking and managed services costs will typically be double the capital investment in the system over a three- to five-year period. Early adopters indicate that telepresence has boosted usage rates into the 30% to 40% range for organizations, based on a 10-hour business day, compared with less than an hour per day for unmanaged videoconferencing systems. Increased usage is key to cost justification and customer success with telepresence. Public utility services are still not widely available, limiting the benefit they bring to enterprises wishing to extend the reach of their own telepresence footprint. A key benefit of telepresence, even over other forms of video communications, is its ability to displace the need for travel by highly mobile executives. From an environmental perspective, this can help reduce Scope 1 and Scope 2 greenhouse gas emissions. However, to lower travel costs and emissions, additional governance, policy and behavioral measures are required, particularly top-down mandating of video as an alternative to face-to-face meetings. Benefit Rating: Moderate Market Penetration: 1% to 5% of target audience Maturity: Early mainstream Sample Vendors: Cisco; HP; LifeSize; Magor Communications; Polycom; Teliris Recommended Reading: "MarketScope for Video Telepresence Systems" "Telepresence Is Coming Home: Are You Ready?" "Building a Global Videoconferencing Strategy"

40 Gbps Transport
Analysis By: Peter Kjeldsen Definition: To address accelerating traffic growth, transport systems are increasing channel capacity from 10 Gbps data rates to 40 Gbps. To facilitate higher line rates without sacrificing transmission reach, solution providers have had to introduce more advanced modulation schemes and thus more complex transceivers, with differential phase shift keying being the most widely adopted enabling technology to address this evolution. 40 Gbps (and also 100 Gbps) line rates are already standardized for Synchronous Digital Hierarchy/ Synchronous Optical Network (SDH/SONET) and optical transport network (OTN) by the International Telecommunication Union (ITU). The Institute of Electrical and Electronics Engineers (IEEE) ratified the 802.3ba standard in June 2010 (see http://standards.ieee.org/announcements/
2010/ratification8023ba.html), which will allow communications service providers (CSPs) to carry 40 Gbps and 100 Gbps Ethernet directly over transport networks supporting these line rates. This allows CSPs to consider the move to higher line rate systems in the wider context of what their future optical transport architecture should look like. Position and Adoption Speed Justification: Commercial deployments are gaining momentum in long-haul backbones and metropolitan networks alike. With large CSPs adopting the technology, economies of scale have kicked in to the point where the carrier cost of deploying 40 Gbps technology compares favorably with alternative approaches based on 10 Gbps technology (when comparing cost per transmitted bit). A number of routing platforms utilize 40 Gbps interfaces, which is an important driver toward wider adoption of 40 Gbps transport solutions. 40 Gbps will be a natural step on the way to 100 Gbps transport line rates, just as 10 Gbps was the step that preceded 40 Gbps solutions. 100 Gbps will take a few more years than 40 Gbps to be cost-effective, and 40 Gbps will continue as part of the solution hierarchy after 100 Gbps is introduced (just as 10 Gbps and 2.5 Gbps are still being used). However, it should be noted that the 40 Gbps life cycle will likely be compressed compared to the 10 Gbps life cycle (which experienced an early onset with the "optical bubble" back in 2000 as well as extended "prime time" due to the financial crises) and the 100 Gbps life cycle (which appears to be in line for an early onset due to early standardization and technology progress). User Advice: With many CSPs already adopting this technology, economies-of-scale are being realized and the pricing barrier is being lowered. The technology is mature enough for deployment as traffic demand and price points line up. 
CSPs should evaluate investments in 40 Gbps systems as an integral part of their overall transport strategy, and also consider whether waiting for 100 Gbps systems is a viable path. Business Impact: Relentless traffic growth is driving the demand for more capacity at lower cost per bit, both in metropolitan and long-haul networks. Traditionally, moving to a channel rate that is four times higher only increases the cost by a factor of 2.5 to 3, so there is an economy-of-scale aspect to moving up in terms of channel rate. 40 Gbps is almost there in terms of cost, and is rapidly closing the remaining gap. Benefit Rating: Moderate Market Penetration: 20% to 50% of target audience Maturity: Early mainstream Sample Vendors: Alcatel-Lucent; Ciena; Ericsson; Huawei; Infinera; Nokia Siemens Networks; ZTE
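The economy-of-scale argument above (a fourfold channel rate for 2.5 to 3 times the cost) can be made concrete as cost per transmitted bit. A sketch using a normalized unit cost rather than real prices:

```python
# The economy-of-scale argument in numbers: if a 40 Gbps channel costs
# 2.5x-3x as much as a 10 Gbps channel, compare cost per Gbps.
# The 10 Gbps unit cost is a normalized illustration, not a real price.

def cost_per_gbps(channel_gbps, channel_cost):
    return channel_cost / channel_gbps

COST_10G = 1.0  # normalized cost of one 10 Gbps channel
for multiplier in (2.5, 3.0):
    c40 = cost_per_gbps(40, COST_10G * multiplier)
    saving = 1 - c40 / cost_per_gbps(10, COST_10G)
    print(f"40G at {multiplier}x the 10G cost: {saving:.0%} lower cost per bit")
```

At a 2.5x channel cost the per-bit saving is 37.5%; even at 3x it is 25%, which is why the comparison "compares favorably" once volumes bring the multiplier down.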

Mobile Advertising
Analysis By: Andrew Frank


Definition: Mobile advertising is advertising or other paid placement on mobile device screens. This category was formerly limited to handset-based screens; however, with the introduction of media tablets, most notably Apple's iPad, it has been expanded to include these screens as well. Mobile advertising encompasses a number of formats:

- Mobile Web banners and display ads (including rich media)
- Mobile in-application ads
- Mobile search and map-based ads
- Mobile in-stream video and audio ads
- Mobile display ads affixed to SMS or Multimedia Messaging Service (MMS) messages

Mobile ads may be acquired through ad networks, directly from mobile publishers or mobile app developers, or from mobile communications service providers (CSPs) or manufacturers that provide portals on certain devices. Since smartphones and media tablets are increasingly used to access Web content (much of which is not yet optimized for display on these devices), as well as broadcast content (TV and radio), the status of advertising within these cross-platform content formats becomes ambiguous. To reduce ambiguity, we consider mobile advertising to apply only to formats that are specifically optimized for wireless Internet delivery to a mobile device. Position and Adoption Speed Justification: Aided by a notable recovery in ad spending overall, the mobile advertising category has accelerated its evolution over the past year, and, despite lingering issues, such as privacy, metrics, standards and so forth, Gartner expects the mobile advertising market to more than double over the next two years and to increase 12-fold by 2015 to $20.6 billion worldwide, or about 4% of total ad expenditures. This growth is being driven by robust consumer adoption of smartphones and media tablets, which is changing the way consumers use and think about mobile devices: from primarily phones to all-purpose information, entertainment and social networking devices. Gartner forecasts nearly 1 billion smartphones and about 350 million media tablets will be sold in 2015, making these platforms nearly indispensable for advertisers. The growth of mobile advertising is also being streamlined by the many lessons learned through the 10-year emergence of Internet advertising, as many of the business models and practices, such as ad networks and exchanges, automation platforms, and context-aware interactive design techniques, are replicated and refined for the mobile channel.
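The forecast above implies a steep compound annual growth rate, which is straightforward to back out from the 12-fold, five-year multiple (2010 to 2015):

```python
# Back out the compound annual growth rate implied by the forecast:
# a 12-fold increase over the five years from 2010 to 2015.

def implied_cagr(multiple, years):
    """Annual growth rate that yields `multiple` after `years` years."""
    return multiple ** (1 / years) - 1

print(f"Implied CAGR: {implied_cagr(12, 5):.0%}")       # ~64% per year
print(f"Implied 2010 base: ${20.6 / 12:.1f} billion")   # from the $20.6B 2015 figure
```

A sustained ~64% annual growth rate underlines why the "more than double over the next two years" claim is the conservative end of the same curve.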
Along with mobile adoption of Web advertising concepts, advertisers and content providers are discovering the utility of using a mobile device in concert with other media, including television, radio, print and out-of-home signage. The key to these applications is the use of microphones and cameras as input devices that can recognize audio and visual cues, such as audio fingerprints and watermarks (to synchronize with TV and radio) or quick response (QR) codes and image recognition (to extend messaging from print ads and signage).


In addition to microphones and cameras, other device sensors, such as GPS, a compass and an accelerometer, can provide targeting input that advertising platforms can use to optimize selection and presentation of ads based on location and other contextual factors. Add to these benefits projected growth in mobile payments (forecast by Gartner to have almost 350 million users worldwide in 2015) and mobile becomes an even more attractive promotional and transactional platform for advertisers, especially in retail sectors and other businesses that rely on direct response marketing, such as publishing and financial services. Despite these positive signs, significant challenges do remain. For example:

Formats and standards. Existing ad standards from organizations such as the Mobile Marketing Association (MMA) and the Interactive Advertising Bureau (IAB) are widely considered to trail the capabilities of more-advanced smartphones and media tablets, leading to the emergence of nonstandard device-specific platforms, such as Apple iAds, that have high creative potential but are expensive and limited in terms of reach and expertise.

Metrics and measurement. The mobile metrics picture, considered by many advertisers and agencies to be a baseline requirement for any major media investment, remains uncertified and hampered by technical complexity.

Privacy and targeting. The issue of privacy norms and regulations, especially for potentially attractive but controversial location-based concepts, has also created controversy and reluctance, particularly on the part of CSPs, to use phone-based customer data for ad targeting. Meanwhile, many of the targeting methods developed for the Web, such as the use of third-party browser cookies, are widely unavailable on the mobile Web because of such factors as the default settings of Apple's mobile Safari browser, which are set to reject third-party cookies.

In summary, growth is likely to accelerate in the coming years, although many fundamental issues remain to be resolved.

User Advice:

- Brands and agencies must develop methods of evaluating the effectiveness of mobile campaigns across various mobile channels to optimize the use of mobile media in the marketing mix. This is likely to vary considerably by product category, audience profile and region.
- In particular, brands and agencies must consider ways to use mobile channels as a response mechanism in concert with other noninteractive formats, such as print and TV, and not just as a stand-alone channel.
- Local advertisers, in particular, must understand how to leverage the medium's ability to deliver nearby traffic to their offline stores and venues in a privacy-friendly way.
- Advertisers and agencies must also revise privacy policies to address new and potentially controversial targeting capabilities of mobile devices and systematically assess regional variations and partner practices.


- Content providers, developers and publishers need to understand how to incorporate elements such as social features, maps and video into applications that will attract both users and advertisers.
- CSPs and manufacturers need to be decisive about their intended roles in mobile advertising and acknowledge that, with few exceptions, success will require both strong partnerships and strategic acquisitions to quickly establish key roles in end-to-end solutions that can deliver efficiency and scale to advertisers.
- CSPs and advertisers should not overlook handset telephony capabilities for contextual click-to-call and save-contact features in ads.
- For developing markets, SMS will remain a good way to distribute marketing messages to mass audiences, and it may provide enough economic value to subsidize the expansion of access to more-advanced handsets and service plans.

Business Impact: Mobile advertising will siphon most of its revenue from print and outdoor categories, although it will be complementary to, and often used in concert with, those categories, making them more efficient through direct response and thus raising their value to advertisers; this will limit its impact on these media by preventing overall spending from becoming a zero-sum game. Mobile's impact on television will be minimal, although the overall effect of mobile will be to emphasize direct, targeted, pull-style interactions that may accompany a long-term reduction in the share of marketing resources directed at general media advertising. The question of how mobile advertising will affect growth in Web advertising highlights the issue of categories. Mobile Web will be increasingly difficult to separate from PC-based Web delivery as more content is designed to adapt to different form factors, and competition will center more on Web ads versus in-app ads on these emerging platforms. In this competition, we see in-app display taking an early lead because of the industry-leading efforts of Apple, although we expect Web display to ultimately prevail as HTML5 comes of age and delivers advertisers a uniform standard that eliminates the overhead and reach limitations of platform-specific development. On the opportunity side, mobile advertising seems to be delivering most of its benefit to Web leaders, such as Google and Apple, which have successfully exploited the channel, along with a number of mobile ad networks and platform providers whose timing coincided with the long-anticipated growth in the channel. Publishers, content providers and application developers appear to face a problem on mobile similar to the one that has challenged their efforts online, namely the "long tail" fragmentation of audiences and usage that makes it difficult for all but a few providers to achieve the scale necessary to attract substantial ad revenue.
A few CSPs have succeeded in building momentum around mobile advertising efforts, although the majority find themselves marginalized by reliance on their flagging portals. Manufacturers other than Apple have also found difficulty gaining a foothold in the ad platform market. Benefit Rating: High Market Penetration: 5% to 20% of target audience Maturity: Adolescent


Sample Vendors: Apple; Google; Greystripe; Jumptap; Microsoft; Millennial Media; Velti; Yahoo Recommended Reading: "Forecast: Mobile Advertising, Worldwide, 2008-2015"

Climbing the Slope


Femtocells
Analysis By: Deborah Kish Definition: A femtocell is a small, A5-size base station box aimed at improving indoor coverage, especially for higher-frequency services such as third-generation (3G) services. Similar to picocells, femtocells are even smaller cellular base stations that are designed for use in residential or corporate environments and that connect to the customer's own broadband connection using an Internet Protocol (IP) link for backhaul. Their advantages include their lower cost than existing microcellular technology, their physically smaller unit size and their greater network efficiency. Femtocells are offered in two form factors: as stand-alone units, much like a cable modem or wireless router; and as integrated solutions, which are simply Wi-Fi routers or cable modems with a femtocell inside. Alternative technologies exist, such as Wi-Fi (femtocells' strongest competitor) and other devices such as MiFi, which is essentially a femtocell in reverse. A femtocell "talks" cellular to the consumer and broadband backhaul to the carrier, whereas MiFi "talks" Wi-Fi broadband to the consumer and cellular 3G and Long Term Evolution (LTE) to the carrier. Position and Adoption Speed Justification: Still in their early commercial rollout stage, femtocells could make mobile communications more pervasive and encourage more users to switch over from fixed to mobile as their main means of communication. The business case for femtocells has been in question due to their cost, but some technological improvements have been taking place to bring the bill of materials (BOM) cost down. The main drivers for deploying femtocells are poor cellular coverage in rural areas, and user density in urban areas where the growing use of cellular data services is putting pressure on the existing base station architecture.
However, with communications service providers (CSPs) ramping up their LTE networks, consumers and enterprise end users are expected to consume more bandwidth via data and multimedia traffic, so femtocells may be positioned to provide some mobile backhaul relief. CSPs will need to work toward lower or tiered price plans to control demand and may consider including a femtocell as part of high-speed access plans, thereby increasing adoption. Most femtocell initiatives are still at the trial stage and are aimed mainly at consumer markets. There have been some advances in increasing the number of users per femtocell to up to 32, to target the small and midsize business market. User Advice: CSPs should offer incentives; offering rebates, for example, is likely to attract subscribers, as consumers look for ways to cut their monthly living expenses. Alternatively, femtocells could be bundled with broadband and/or mobile plans. CSPs and vendors should work together to develop lower-cost solutions, and should be more aggressive with integrated solutions such as femtocells embedded in broadband routers or set-top boxes. Integrated solutions will increase adoption, as they will decrease the BOM, thus driving down cost and eliminating crowded desktops. Additionally, developing complementary mobile applications, such as "in-house presence alert," which will offer more value per subscriber dollar, will also increase femtocells' attractiveness. Business Impact: Government incentives (such as those in the U.S. and Germany) to increase broadband access in rural areas could be advantageous to CSPs offering femtocells, as a broadband connection is needed for a femtocell service. Mobile services in these areas are likely to increase, as subscribers may opt to switch to mobile as their main means of communication, rather than voice over IP (VoIP), so the impact on consumer markets could be significant. From an enterprise perspective, as technology advances and the number of registered users supported per femtocell grows, this could prove to be a driver in this market too, although the number of public switched telephone network lines in the enterprise is not decreasing as quickly as VoIP is growing. Benefit Rating: Moderate Market Penetration: 1% to 5% of target audience Maturity: Emerging Sample Vendors: Airvana; Fujitsu; Huawei; ip.access; Kineto; Motorola; Nokia Siemens Networks; Picochip; RadioFrame Networks; Sagemcom; Samsung; Ubiquisys; ZTE Recommended Reading: "Femtocells: The State of the Market" "Emerging Technology Analysis: The Mutual Benefits of Femtocells and LTE" "Magic Quadrant for LTE Network Infrastructure" "Forecast: Mobile Data Traffic and Revenue, Worldwide, 2010-2015" "Market Insight: Are Mobile Applications and Video Facing a Shortage of Bandwidth?"

802.11n
Analysis By: Tim Zimmerman; Michael J. King Definition: 802.11n is the latest wireless LAN (WLAN) standard ratified by the Institute of Electrical and Electronics Engineers (IEEE). Improvements in the technology have expanded the throughput and range that can be implemented in 2.4GHz or 5GHz. A single spatial stream operating in a 20MHz channel width can achieve 75 Mbps, compared with the 54 Mbps of a similar 802.11a or 802.11g solution. Dual-stream radios providing 300 Mbps at 5GHz using the bonded channel functionality are common in the market, and several vendors have introduced three-stream radios that can provide up to 450 Mbps. Theoretically, 802.11n is expected to deliver as much as 600 Mbps of networking performance using four spatial streams, but actual performance will depend on each vendor's implementation. Additionally, the performance of 802.11n in clients may be limited to less than the capacity of the infrastructure by the specific implementation, such as the number of antennas that are integrated into the device. Like previous 802.11 standards, 802.11n provides for a 20MHz channel width to enable backward compatibility with 802.11a, 802.11b and 802.11g (a/b/g) standards on the market. Position and Adoption Speed Justification: Since the ratification of the standard, the market movement to 802.11n has been swift. Many vendors continue to report that more than 70% of new access points being purchased are now 802.11n, although they continue to be purchased for different architecture considerations: autonomous versus coordinated, controller-based versus in the cloud, or with one, two or three integrated radios within the access point. User Advice: IT leaders should consider 802.11n for all their WLAN requirements, because there is no longer a premium to be paid in comparison with 802.11a/b/g components. The number of radios within an access point, as well as the number of spatial streams supported and the type of multiple input/multiple output (MIMO) support needed, will be determined by the enterprise WLAN requirements, including capacity and level of service. Vendors will still have points of differentiation that will not only improve wireless network performance in terms of capacity and robustness of communication, but will also create the need for use case testing, because implementation choices will affect data, voice and video applications. Business Impact: 802.11n should be considered for all wireless LAN scenarios, as mainstream adoption of the technology in small, medium or remote office environments, as well as in higher education and healthcare, continues to drive the technology deeper into the enterprise. IT organizations that have answered the call to implement wireless for conference rooms and reception areas can tackle the additional hurdles (such as voice over WLAN [VoWLAN]) that are impeding the implementation of 802.11n across the access layer.
We believe that 802.11n will enable sufficient bandwidth and the required capabilities (such as quality of service) for enterprises to consider moving not only data, but also voice and video for many enterprise applications to the WLAN.

Benefit Rating: Moderate

Market Penetration: 5% to 20% of target audience

Maturity: Early mainstream

Sample Vendors: Aruba Networks; Cisco; HP; Motorola Solutions

Recommended Reading:

"Magic Quadrant for Wireless LAN Infrastructure"
"Toolkit: Technology Section of a WLAN RFP"
"Toolkit: Checklist for Building a Solid WLAN Access Layer"
"Critical Components of Any WLAN Site Survey"
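The throughput figures quoted in the definition follow a simple multiplicative pattern: roughly 75 Mbps per spatial stream in a 20MHz channel, doubled by 40MHz channel bonding. The sketch below is a simplification of that relationship; real 802.11n MCS rates also depend on guard interval and coding rate.

```python
def dot11n_peak_rate_mbps(spatial_streams, channel_mhz):
    """Rough 802.11n peak PHY rate: ~75 Mbps per spatial stream in a
    20MHz channel, doubled by 40MHz channel bonding (simplified; the
    actual MCS tables also vary with guard interval and coding rate)."""
    per_stream = 75 if channel_mhz == 20 else 150
    return spatial_streams * per_stream

# Figures quoted in the text:
assert dot11n_peak_rate_mbps(1, 20) == 75    # single stream, 20MHz
assert dot11n_peak_rate_mbps(2, 40) == 300   # dual-stream, bonded channel
assert dot11n_peak_rate_mbps(3, 40) == 450   # three-stream radios
assert dot11n_peak_rate_mbps(4, 40) == 600   # theoretical four-stream maximum
```

As the text notes, client-side limits (fewer antennas, single-stream radios) often cap real devices well below what the infrastructure advertises.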

IPTV
Analysis By: Ian Keene; Fernando Elizalde


Definition: Internet Protocol television (IPTV) refers to the network architecture, equipment and technologies, middleware and software platforms used to deliver standard or high-definition television (HDTV) signals, in real time, over managed communications service provider (CSP) networks. Until recently, IPTV services were a TV delivery solution for wireline CSPs over either DSL or FTTH access networks. Now, IPTV is beginning to be used by cable operators and satellite broadcasters as a means of providing additional TV channels to their subscribers. In the future, wireless-based distribution networks may be added to this list. IPTV delivery systems increasingly employ advanced video compression (AVC) technologies, such as MPEG-4 or VC-1, whereas early implementations of IPTV used MPEG-2. IPTV should be contrasted with over-the-top (OTT) Web streaming video, which carries no quality of service (QoS) guarantees and can suffer issues such as buffering delays.

Position and Adoption Speed Justification: IPTV constitutes the CSP's response to competition from cable and satellite operators, plus the newer threat of OTT TV and video services. It is a major area in the field of next-generation telecom architecture and services, with the potential to be a transformational enabler for CSPs and for those end users who have not been able to receive interactive TV. However, the factors that hold back the rapid and widespread adoption of IPTV are numerous. These include mature pay-TV markets in some countries and competitive bundled offerings from, for example, cable competitors. These issues can be overcome, however. In the U.S., for example, IPTV is doing very well, with people switching from the cable operators to Verizon's FiOS and AT&T's U-verse. There are also technological issues, as the required end-to-end solution is complex. IPTV means that CSPs need to manage complex server farms, home devices and networks.
Not all copper loops can offer enough noise-free bandwidth for standard-definition video and HDTV, so poor customer experience can be an issue. The inability of some CSPs to procure different and compelling content, or even content similar to current content deals, and consumer inertia when it comes to changing service providers are issues as well. Consumers are not universally convinced about the benefits of premium content, and the market for over-the-top video on the Internet is growing fast. In most areas, CSPs cannot engage in effective advertising and marketing campaigns. Cable and satellite operators are now also starting to invest in IPTV with hybrid set-top boxes to deliver an expanded selection of on-demand and interactive TV alongside linear broadcast TV and Internet services.

Global subscribers to IPTV services delivered by wireline CSPs reached nearly 36 million in 2010, and are forecast to grow to 74 million by the end of 2014. While this is a small number in terms of household penetration (less than 2% worldwide in 2010, and forecast only to reach 3.6% in 2014), not all households are considered the target market for IPTV. Perhaps a clearer picture of the current market penetration is that 9.2% of worldwide, noncable, consumer fixed-broadband households subscribed to IPTV services in 2010. The main activity is in China, the U.S., South Korea and Western Europe (aggregated countries). Recent activity by cable MSOs and satellite broadcasters to integrate IPTV with their traditional broadcast offerings and Internet services (which could include access to OTT TV and video) will lead to more households consuming IPTV than indicated in the above statistics for wireline CSP subscriptions. It is becoming clear that IPTV is not a quick, easy answer to new revenue generation. Rather, it is a value-added service that completes the offering to the household and reduces churn.
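The copper-loop constraint can be illustrated with rough per-stream bitrates. The figures below are illustrative planning assumptions (about 3.5 Mbps for MPEG-2 SD, 2 Mbps for H.264/AVC SD and 8 Mbps for H.264/AVC HD), not vendor specifications:

```python
# Illustrative check of whether a DSL loop can carry a household's IPTV load.
# Per-stream bitrates are rough planning assumptions, not vendor figures.
STREAM_MBPS = {"mpeg2_sd": 3.5, "avc_sd": 2.0, "avc_hd": 8.0}

def tv_load_mbps(streams):
    """Total downstream bandwidth needed for a set of concurrent TV streams."""
    return sum(STREAM_MBPS[s] for s in streams)

# One HD plus two SD AVC streams need 12 Mbps: beyond many long ADSL
# loops, comfortable on short VDSL2 loops or FTTH.
print(tv_load_mbps(["avc_hd", "avc_sd", "avc_sd"]))  # 12.0
```

The same arithmetic shows why the move from MPEG-2 to AVC mattered: it roughly halves the per-stream load, bringing more copper loops within reach of multiroom service.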
Some CSPs have started to think about IPTV as a service delivery platform and are experimenting with, for example,

digital advertising and marketing solutions, in addition to the "three-screen strategy" of providing their customers with content and content-related services on TVs, PCs and mobile screens.

User Advice: Keys to success will be:

- Avoid "me too" offerings.
- Gain access to compelling content.
- Bundle IPTV with other services at favorable prices.
- Ensure that the network can deliver the bandwidth needed for reliable, high-quality viewing.

If CSPs are to drive customer uptake in mature markets, they need to come to market with services that are either equal to those of their competitors at a lower price, or superior to existing video services in terms of content, convenience, ease of use and, importantly, customer service, to improve the overall customer experience. Cable operators across geographies have been notorious for poor customer service. CSPs will need to evolve new applications, usage and user behaviors to differentiate themselves from the established broadcast alternatives. In less-saturated markets, they will need to use the best combination of price, technology and content (as well as bundling with nonentertainment services) to bring new customers into the pay-TV market. Expect market development to vary by region and by country.

The complexity of delivery means that integrated solutions will likely be the fastest and most cost-efficient way of deploying the necessary architecture. Increased video content in networks will drive capacity upgrades. The upside for service providers is still largely speculative and contingent on the ability to differentiate services and price aggressively, especially in regions with significant satellite and cable deployment. The most immediate positive effects for CSPs are reduced churn and the ability to sell more broadband to users. CSPs need to embrace OTT video to complement their offerings and facilitate the discovery of OTT content. They should consider signing partnerships with premium OTT service providers to expose their subscribers to OTT and create a win-win scenario, protecting their subscriber base while improving the user experience and the content available.

Business Impact: The effect of IPTV will be felt primarily in the residential market. CSPs promised to deliver a new viewing experience compared with cable or satellite TV.
However, most cable and satellite offerings have been upgraded to deliver what IPTV originally promised to differentiate on. There is the potential for virtually unlimited programming, thanks to the "switched" nature of the network architecture. In addition, more cross-platform integration between entertainment, communications and information services is possible. This can also be achieved between PCs, TVs and mobile phones, even though the real value, and "killer application," from this kind of three-screen strategy is still unproven, as is user willingness to pay a premium for this functionality. Other potential benefits include more integrated search and navigation among broadcast/linear programming, on-demand, OTT and personal content, such as stored music, photos and videos.

Benefit Rating: Moderate


Market Penetration: 5% to 20% of target audience

Maturity: Adolescent

Sample Vendors: Alcatel-Lucent; Cisco; Ericsson; Motorola; Nokia Siemens Networks; Technicolor

Recommended Reading:

"Forecast: IPTV Subscribers and Service Revenue, Worldwide, 2008-2014"
"Forecast Analysis: IPTV Subscribers and Service Revenue, Worldwide, 2008-2014"
"Market Share: Leading IPTV Carriers and Their Technology Vendors, Worldwide, 1Q11 Update"
"Market Definitions and Methodology Guide: IPTV Service Forecast and Business Models, Worldwide"

Mobile DPI
Analysis By: Akshay K. Sharma

Definition: Mobile deep packet inspection (DPI) is a technique used to monitor the data traffic in mobile applications. As a business model evolves in which data services become more important than voice for revenue generation, and in which the network is upgraded to Long Term Evolution (LTE) and becomes Internet Protocol (IP) end-to-end, the ability to perform traffic shaping, and perhaps blocking, becomes important. Mobile DPI can be a stand-alone network element, or part of existing network elements, such as:

- The gateway general packet radio service (GPRS) support node (GGSN) switch in wideband code division multiple access networks.
- The packet data serving node (PDSN) switch in code division multiple access networks.
- Part of the evolved packet core of LTE and High-Speed Packet Access Evolution networking elements.

Mobile DPI has received a lot of hype in prior years within net neutrality debates, as a means to determine which traffic should be traffic-shaped. However, mobile DPI has now become a mainstream technique to determine how over-the-top traffic is managed, and can be a proactive method for communications service providers (CSPs) to achieve session awareness, subscriber awareness and context awareness with market intelligence techniques. It can be used effectively to prioritize sessions such as emergency calls with subscriber data management systems, and can also facilitate tiered services.

Position and Adoption Speed Justification: Mobile DPI is seen as a key feature of mobile data switches (for example, GGSN and PDSN), or of the evolved packet core network elements of LTE. Mobile DPI is considered a reactive way to monitor traffic, while proactive methods include the policy decision function in IP Multimedia Subsystems (IMSs). However, it is now clear that mobile DPI can be used with the Policy and Charging Rules Function (PCRF) for real-time charging with policy


rules, allowing certain carriers to provide tiered services for data plans and usage-based billing. Content-based billing or traffic control could be next.

User Advice: Enterprises and consumers should appraise their chosen LTE operator for the SLA and quality of service (QoS) promised. Operators may not openly mention whether mobile DPI is used for traffic shaping or to filter high-volume, peer-to-peer traffic.

Business Impact: End-to-end QoS is critical for voice over IP voice quality, as well as for video quality. For mission-critical applications, mobile DPI can be a way to filter noncritical traffic.

Benefit Rating: High

Market Penetration: 20% to 50% of target audience

Maturity: Early mainstream

Sample Vendors: Alcatel-Lucent; Allot; Cisco; CloudShield; ipoque; Sandvine

Recommended Reading:

"Dataquest Insight: Mobile DPI; How Mobile Deep Packet Inspection Became Deep Pocket Inspection"
"Emerging Technology Analysis: How CSPs can Cut Costs and Charm Customers with Integrated Policy and Charging Control"
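At its core, DPI classifies traffic by looking past the IP/TCP headers into the application payload. The toy classifier below illustrates the idea with simplified byte signatures; these are stand-ins for the far richer pattern libraries and behavioral analysis that commercial DPI engines apply, and the policy rule is a deliberately crude sketch:

```python
# Toy payload-signature classifier: inspect the application payload
# rather than just the transport header. Signatures are simplified
# stand-ins for real DPI pattern libraries.
SIGNATURES = [
    (b"GET ", "http"),
    (b"\x13BitTorrent protocol", "bittorrent"),
    (b"INVITE sip:", "sip"),
]

def classify(payload: bytes) -> str:
    """Return the protocol whose signature starts the payload, if any."""
    for magic, proto in SIGNATURES:
        if payload.startswith(magic):
            return proto
    return "unknown"

def shape(payloads, deprioritized=("bittorrent",)):
    """Crude policy rule: split flows into (normal, deprioritized) lists."""
    normal = [p for p in payloads if classify(p) not in deprioritized]
    slowed = [p for p in payloads if classify(p) in deprioritized]
    return normal, slowed
```

In a real network element this classification feeds the PCRF-driven policy rules mentioned above, which then decide on shaping, charging or prioritization per subscriber and session.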

Mobile Application Stores


Analysis By: Monica Basso

Definition: Application stores offer downloadable applications to mobile users via a storefront that is either embedded in the device or found on the Web. Application categories in public application stores include games, travel, productivity, entertainment, books, utilities, education and search, and applications can be free or charged-for. Private application stores can be created by enterprises for mobile workers.

Position and Adoption Speed Justification: Mobile application stores are targeted at smartphone and media tablet users, for a range of applications that includes entertainment, utilities, social networks, music, productivity, travel and news. One of the original application stores was offered by GetJar, which is still in the market today. In 2008, Apple introduced the App Store, with free, advertisement-based or charged-for applications. The App Store generated huge interest and adoption by its device customers, and has been a main differentiator for the success of the iPhone and iPad. In January 2011, Apple announced the achievement of over 350,000 apps and 10 billion downloads. Apple paid out over $1.5 billion in revenue sharing to developers in 2010. The App Store generated excitement in the market and forced other handset and OS manufacturers to try to reproduce similar dynamics and introduce their own application stores, for example, Google Android Market, Nokia Ovi Store (now rebranded Nokia Store), Research In Motion BlackBerry App World, Microsoft Windows Marketplace for Mobile and Palm Software


Store. Microsoft and Nokia will pursue synergies between the two stores, as part of a major partnership that brings Windows Phone 7 to Nokia devices. Carriers are also offering upgrades to their own application stores and offerings for their feature phones, with a view to exposing services such as billing, location and messaging to developers, e.g., Orange App Shop and Vodafone 360. A number of third parties, such as Handmark, GetJar and Qualcomm, offer white-label solutions to carriers. An increasing number of enterprise portals promote applications that employees should, or are recommended to, download through either pass-through to the store or local download. Public application stores are relevant to enterprises for two reasons: (1) consumerization and personal device models are bringing progressive usage to employees; and (2) mobile business-to-consumer (B2C) application initiatives to target end customers can leverage application stores as channels for application distribution and discovery by target users. Among enterprise-specific application stores, Citrix Dazzle works across a range of client and mobile devices, and provides a mobile app store for internal deployment (i.e., the enterprise runs the store). Other vendors, such as MobileIron and Zenprise, enable private application stores. Due to the expectation that the adoption of smartphones and high-end feature phones will increase, along with the popularity of applications, we expect application stores to accelerate rapidly to the Plateau of Productivity in less than two years.

User Advice: Enterprises should evaluate opportunities that originate from application stores to target end customers with mobile applications (e.g., to engage them in community-based activities to implement market campaigns, collect customer feedback and preferences, and provide new services).
Application providers and developers should look for application stores that are associated with popular handsets and that can create a good user experience, and should weigh that against the difficulty of developing and porting applications and the potential popularity of an application. It is also important to choose application stores that have good distribution in terms of outlets and service from the application development community. Other features of application stores that would benefit developers include advertisement support (like the Google model, to allow vendors to be "top of deck"), user reviews, rankings and recommendations (as with Amazon.com), and good billing and reporting features. Application stores are a "scale game," and those offering them need to create some unique selling points that will bring developers to their stores, rather than to those of their competitors. An "ecosystem" needs to be created in which developers have the tools to easily write and port applications; individuals can easily access, download and use applications; and all sides have visibility into the accounting of application sales and an efficient billing system that allows everyone to get paid in a timely manner.

Business Impact: Mobile application stores are likely to have an impact on:

- Brands, which can advertise and segment customers based on applications
- Application providers, giving them access to additional customers in a well-organized ecosystem

Benefit Rating: High

Market Penetration: 5% to 20% of target audience

Maturity: Early mainstream

Sample Vendors: Apple; Google; Microsoft; Nokia; O2; Orange; Palm; Research In Motion; Vodafone

Recommended Reading:

"Marketing Essentials: How to Decide Whether to Start a Mobile Application Store"
"Dataquest Insight: Application Stores; The Revenue Opportunity Beyond the Hype"

FTTH
Analysis By: Ian Keene

Definition: Fiber to the home (FTTH) is deployed as the most radical way (in terms of cost and performance) to facilitate very-high-speed broadband access. This high performance can be achieved thanks to the ultrahigh bandwidth of single-mode optical fibers. The high cost associated with the technology relates particularly to the civil works involved in installing the fibers. There are two categories of FTTH technology deployed today: point-to-multipoint passive optical networks (PONs) and Active Ethernet point-to-point (PTP). The former comes in two types: Gigabit PON (GPON, ITU-T G.984) and Ethernet PON (EPON, IEEE 802.3ah). The latter is standardized by the International Telecommunication Union's G.985 specification and the 802.3ah standard from the Institute of Electrical and Electronics Engineers. Broadband PON (BPON) is an older standard, but there are few or no new deployments. Active Ethernet PTP can operate at either 1 Gbps or 100 Mbps transmission rates, EPON at 1 Gbps in each direction, and GPON at 2.5 Gbps downstream and 1.25 Gbps upstream. GPON bandwidth is shared by time division multiplexing and optical splitters, typically servicing up to 64 users per central office port with a geographical coverage of up to 20 km. Note that 10-Gigabit PON and wavelength division multiplexing PON are not included in this FTTH description, as these two technologies are covered separately.

Position and Adoption Speed Justification: Communications service providers (CSPs) are already deploying significant amounts of FTTH in, for example, China, Japan, Russia, the U.S., South Korea, France, Italy, Denmark, Norway, Sweden, Australia, Singapore and Portugal. FTTH has proved to be the most future-proof, secure, reliable and bandwidth-agile technology for broadband access and video services delivery.
As the cost of FTTH equipment decreases and deployment techniques improve, FTTH is gaining momentum in the market, although the high per-user capital expenditure (mainly related to the fibers and construction costs) remains a barrier to even wider deployments, especially as end users are often reluctant to pay more simply for


getting additional bandwidth. Successful FTTH deployments will increasingly depend on CSPs also investing in appropriate content delivery networks (CDNs), and, in some cases, CSPs will favor near-term-oriented CDN investments over longer-term FTTH deployments. The speed of adoption of FTTH is linked to CSPs wanting to expand their range of services and to the increasing bandwidth demands from the growing levels of network traffic. FTTH deployment depends significantly on the competitive and regulatory situation that an operator faces. Service offerings and deployments do vary, and will continue to vary, significantly by country and region. As such, the position of FTTH on the Hype Cycle should be understood as a "geographic mean" of deployments across different geographies. Additionally, note that FTTH is moving more slowly toward the Plateau of Productivity than most other technologies, simply because of the huge investment and effort required to deploy it. FTTH is a large, strategic and long-term investment with an expected service life of 30 years. Many CSPs, particularly those with a legacy copper access infrastructure, are opting for shorter-term VDSL2 investments instead. Enhancements to VDSL2, such as bonding and vectoring, will extend the lifetime of the copper local loop. Incumbent CSPs are often active in extending the reach of fiber in their access networks, deploying fiber to the building where it is economically practical, but also relying on copper for direct access to the majority of their subscriber base. However, some government stimulus packages have offset this and added to the heterogeneous picture seen across different geographies for FTTH.

User Advice: Wireline CSPs need to consider the implications of continuing investment in DSL technologies versus the large capital expenditure outlay for large-scale FTTH deployments.
Take into account the reduced operational expenses that typically follow an FTTH rollout (due to fewer active nodes and the passive nature of the infrastructure). Be sure to incorporate an analysis of the particular regulatory environment concerned and to take into account the impact of relevant government stimulus packages. Consider network sharing to reduce costs. Be aware that while most FTTH business cases are built on expectations of new revenue streams associated with video services, it is typically a competitive challenge that triggers FTTH deployments. An example is Verizon's deployment of FTTH as a response to rival cable operators' move to Data-Over-Cable Service Interface Specification 3.0 (DOCSIS 3.0) and AT&T's comparable services for ultra-high-speed broadband. Research the business case for new revenue streams from services that require 100 Mbps and above. Track initiatives such as Google's plans for a 1 Gbps city network: which applications and services would subscribers pay for?

Understand your target subscriber base before deploying FTTH infrastructure. Uptake of higher-bandwidth services through the deployment of FTTH varies considerably from country to country. The percentage of subscribers per homes passed varies from more than 70% in parts of Asia to less than 20% in Europe as a whole. This disparity can be attributed to pricing, competition and cultural differences affecting the perceived value of high-bandwidth services. In some cases, slow take-up is due to poor marketing and inappropriate service bundles.

Note that the popularity of the different types of FTTH varies by region. The main explanation for this is that the choice of technology is closely related to the type of CSP: incumbent wireline operators mainly favor PON, while municipalities and small local CSPs have

tended to favor Active Ethernet PTP. National alternative operators and multiple system operators considering FTTH also tend to favor PON. The EPON variety has been popular in Japan and Asia/Pacific, while the GPON variety has been popular elsewhere. However, GPON deployments are now growing rapidly in China.

Business Impact: The primary business impact will be the availability of higher bandwidth in the residential and small to midsize business markets. However, the long reach of FTTH systems allows CSPs to operate with fewer active network nodes, which, along with the passive nature of the infrastructure, can have a significant impact on the operational expenses of CSPs deploying this technology.

Benefit Rating: Transformational

Market Penetration: 5% to 20% of target audience

Maturity: Early mainstream

Sample Vendors: Alcatel-Lucent; Calix; Cisco; Ericsson; Huawei; Mitsubishi; Motorola; NEC; PacketFront; Tellabs; ZTE

Recommended Reading:

"Emerging Technology Analysis: Next-Generation Broadband Access Caters for End-User Bandwidth Appetite"
"Forecast Analysis: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"
"Forecast: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"
"Magic Quadrant for Fiber-to-the-Home Equipment"
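The GPON sharing described in the definition (2.5 Gbps downstream divided among up to 64 users on a splitter) can be sanity-checked as a worst-case even share. The ~10% overhead figure below is an assumed allowance for framing and FEC; in practice, dynamic bandwidth allocation gives each user far more whenever the splitter is lightly loaded:

```python
def gpon_fair_share_mbps(line_rate_gbps=2.488, split_ratio=64, overhead=0.10):
    """Worst-case even share of a GPON downstream among all users on a
    splitter. Assumes ~10% framing/FEC overhead; real dynamic bandwidth
    allocation reassigns idle capacity, so typical per-user rates are
    much higher than this floor."""
    usable_mbps = line_rate_gbps * 1000 * (1 - overhead)
    return usable_mbps / split_ratio

print(round(gpon_fair_share_mbps(), 1))  # ~35 Mbps per user at a 1:64 split
```

Even this conservative floor comfortably exceeds typical DSL rates, which is part of why the text calls FTTH the most bandwidth-agile access technology.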

HSPA+
Analysis By: Joy Yang

Definition: HSPA+ is also known as HSPA Evolution and Evolved HSPA, the abbreviation "HSPA" standing for High-Speed Packet Access. The Third Generation Partnership Project's (3GPP's) Release 7 specification has HSPA+ theoretically achieving 28 Mbps on the downlink and 11 Mbps on the uplink by using downlink 16 quadrature amplitude modulation (QAM), uplink 16 QAM, and downlink 2x2 multiple input/multiple output (MIMO) technology. In Release 8, the HSPA+ downlink rises to 42 Mbps by using 64 QAM with 2x2 MIMO. In Release 10, with multicarrier technology, the theoretical HSPA+ downlink peak rate can reach 168 Mbps by using 64 QAM with 2x2 MIMO over 2x20MHz spectrum. Proposed enhancements to HSPA+ in Release 11 include eight-carrier HSDPA, uplink dual-antenna beamforming and MIMO, and downlink multipoint transmission. HSPA+ works in the same spectrum as current Universal Mobile Telecommunications System (UMTS) networks. The 3GPP requires HSPA+ to be backward-compatible with Release 99 (R99)


UMTS and with R5 and R6 HSPA networks and devices. This makes it possible for operators to make use of their existing UMTS and HSPA investments. This analysis focuses on the frequency division duplexing (FDD) type of HSPA+. There is also a time division duplexing (TDD) version, which is an upgrade to Time Division Synchronous Code Division Multiple Access (TD-SCDMA). TD-SCDMA is a 3GPP TDD-based third-generation technology, which has been deployed on a large scale only by China Mobile.

Position and Adoption Speed Justification: According to Global mobile Suppliers Association statistics, by April 2011 there had been 123 commercial HSPA+ launches in 65 countries. A total of 31% of HSPA networks have been upgraded to HSPA+. And there are now 144 available HSPA+ user devices, which represents great progress from the 40 or so user devices that were available in April 2010. Dual-carrier HSPA+ (DC-HSPA+), which can support 42 Mbps data bandwidth, has been deployed in 23 networks. There are 45 user devices supporting DC-HSPA+. Although the performance of Long Term Evolution (LTE) is better overall, on 5MHz bands HSPA+ is just as spectrally efficient. In May 2010 at the Shanghai Expo, Nokia Siemens Networks' demonstration of HSPA+ achieved a peak rate of 112 Mbps using 64 QAM, 2x2 MIMO and four bundled wideband code division multiple access (WCDMA) channels (20MHz of spectrum). HSPA+ can accommodate circuit-switched voice, is a software upgrade from basic 3G networks, and has sufficient capacity in most cellular operator cell sites. Therefore, HSPA+ together with Wi-Fi offload at busy locations is a solution that will delay the deployment of LTE by many operators. Also, because of the high bandwidth promised by HSPA+, some bandwidth-hungry mobile applications are becoming possible, such as mobile video.
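The Release 7, 8 and 10 peak rates quoted in the definition follow a multiplicative pattern: a per-5MHz-carrier rate set by the modulation order, multiplied by the number of MIMO streams and bonded carriers. The sketch below uses rounded planning numbers (the exact 3GPP per-carrier figures are 14.4 and 21.1 Mbps), not the full transport-block calculation:

```python
# Rounded per-5MHz-carrier peak rates; the exact 3GPP figures are
# 14.4 Mbps (16 QAM) and 21.1 Mbps (64 QAM).
PER_CARRIER_MBPS = {"16qam": 14, "64qam": 21}

def hspa_plus_peak_mbps(modulation, mimo_streams=1, carriers=1):
    """Rough HSPA+ downlink peak: per-carrier rate x MIMO streams x carriers."""
    return PER_CARRIER_MBPS[modulation] * mimo_streams * carriers

assert hspa_plus_peak_mbps("16qam", mimo_streams=2) == 28               # Release 7
assert hspa_plus_peak_mbps("64qam", mimo_streams=2) == 42               # Release 8
assert hspa_plus_peak_mbps("64qam", mimo_streams=2, carriers=4) == 168  # Release 10
```

The same scaling explains why HSPA+ matches LTE's spectral efficiency on a single 5MHz band: the gains come from modulation, MIMO and carrier aggregation rather than a new air interface.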
User Advice: Together with deploying HSPA+, network operators should consider transforming their service and control layers to improve their ability to enable and manage new applications, including voice over Internet Protocol (VoIP) and multimedia applications. Mobile device vendors should synchronize their HSPA+ handset road maps with operators' road maps for network rollout. They should also recognize that HSPA+ is good for the brand image of handset designs incorporating MIMO, which will be a key technology for LTE devices.

Business Impact: HSPA+ will significantly improve the mobile broadband experience. It offers enhanced bandwidth and has the potential to increase voice capacity for VoIP services. The recession has made operators conservative about investing in LTE. The backward-compatibility of HSPA+ enables them to provide high-performance mobile broadband services in phases, as demand arises, and to keep using their existing UMTS and HSPA networks for basic services in areas without HSPA+ coverage. For existing HSPA operators, there is the possibility of migrating to HSPA+ through a software upgrade, depending on their vendor. This would protect operators' UMTS and HSPA investments. Some operators, such as Vodafone, have chosen HSPA+ as a more cost-efficient technology than LTE.

Benefit Rating: High

Market Penetration: 5% to 20% of target audience

Maturity: Early mainstream

Sample Vendors: Alcatel-Lucent; Ericsson; Huawei; Nokia Siemens Networks; Qualcomm; Sierra Wireless; ZTE

Recommended Reading:

"The Impact of LTE on Corporate Wireless Strategy"
"Dataquest Insight: Mobile Operators Must Manage Costs While Nurturing LTE Revenue"
"Early Commercial LTE Networks To Reach Sweden, Norway"
"Vendor Rating: Ericsson"
"Market Share: Mobile Carrier Network Infrastructure, Worldwide, 2008"
"Forecast: Carrier Network Infrastructure, Worldwide by Country, 2003-2014, 1Q10 Update"
"Dataquest Insight: Femtocell Market is Unlikely to Take Off Before 2012"
"Emerging Technology Analysis: Long-Term Evolution (LTE), Hype Cycle for Wireless Networking Infrastructure, 2008"

Network DVR
Analysis By: Fernando Elizalde

Definition: Network digital video recorders (DVRs), also known as network personal video recorders (PVRs), have a similar function to their stand-alone DVR/PVR counterparts, enabling consumers to record, store and play back content with DVD-like functions. The main difference is that a stand-alone DVR stores content on a hard drive within the set-top box, while content for a network DVR is stored on the service provider's network. For larger operators, the required network storage will grow to thousands of terabytes or even petabytes. Network DVRs have evolved into two main versions: in one, consumers make predetermined decisions as to what they will record; in the other, the service provider records all video content, or selected broadcast channels, which are then available to consumers for a set number of days. There are also hybrids: users may have a combination of both versions, and there are implementations of combined network and set-top DVRs.

Position and Adoption Speed Justification: IPTV and cable operators are interested in the technology as a way to provide value-added services to their customer base. Despite its attractiveness as a customer retention mechanism and generator of incremental revenue, the network DVR (and in particular the version that allows consumers to make predetermined decisions) has network storage capacity and quality of service issues that need to be addressed. On the other hand, it means fewer overheads and management expenses compared with complex set-top boxes fitted with hard disks. To date, most network DVRs deployed worldwide provide access to programs (already broadcast) for a set number of days. Copyright issues are generally


agreed beforehand with the broadcasters and/or content owners, who make a selection of programs available to TV service providers to distribute on their networks. In other cases, such as On TV in Greece, service providers ask subscribers to authorize them to record shows on their behalf, to be stored for a limited time on the operators' servers. Within this type of service, the features most widely deployed are selected time-shifted capabilities, network DVR for selected shows and, to a lesser degree, start-over capabilities. European operators such as Orange, Free and SFR (formerly Neuf) in France, BT Vision and TalkTalk TV in the U.K., Imagenio in Spain, Teo LT in Lithuania (the Gala TV IPTV service), Portugal Telecom (Meo TV) and On Telecoms in Greece (On TV), among many others, offer this type of network DVR service. In the U.S., Comcast, Time Warner Cable and Verizon FiOS have introduced different versions of this type of network DVR, while in China several cable and IPTV operators offer most channels on demand after the real-time broadcast.

Network DVRs that allow consumers to decide what to record on the operators' servers have seen limited deployment worldwide. The complexity, storage capacity and stress on the network, together with copyright issues and the cost of storage, have pushed large operators away from network DVR platforms toward successful deployments of premium set-top DVRs. This type of consumer-driven network DVR is implemented most often in small IPTV deployments, for example, Teo's Gala TV in Lithuania, Invitel and Wist in Poland, Iskon in Croatia, Amis in Slovenia and Minsk TV in Belarus. Copyright issues that held up full network DVR services in the U.S. have been resolved, with the U.S. Court of Appeals declaring network and set-top DVRs to be the same, and the U.S. Supreme Court refusing to hear a final appeal on the matter.
Cablevision launched network DVR services in January 2011, joining other operators worldwide that already offer such services.

User Advice: Regional implementations will vary because of copyright laws and their interpretation; carriers must be prepared to address these discrepancies with different strategies and innovations. Regardless of regional differences, operators must have a working relationship with the studios and networks. A network DVR can offer significant benefits over a set-top DVR. For example, network DVRs can control the treatment of advertisements, from preventing viewers from skipping ads to enabling advertisers to update and replace old ads with more targeted ones. However, for large operators, set-top box DVRs are better positioned to address consumers' recording decisions; this solution avoids the issues that surround service provider network capacity and copyright, and results in less capital expenditure.

Business Impact: Network DVRs will affect most, if not all, of the players in the consumer pay-TV value chain. For consumers, network DVRs facilitate greater time shifting and a move to an "on-demand" environment. Cable and telecommunications companies will benefit from lower capital and operating expenditures, but satellite operators will lose out if they fail to emulate network DVR services (by providing hard drives in set-top boxes) to compensate for the lack of a return path in their offerings.

Benefit Rating: Moderate

Market Penetration: 5% to 20% of target audience

Maturity: Early mainstream

Sample Vendors: Alcatel-Lucent; C-Cor; Cisco/Scientific Atlanta; Ericsson Television; Espial; Microsoft

Recommended Reading: "Emerging Technology Analysis: Internet TV, Global Consumer Communications Services"

"Forecast: IPTV Subscribers and Service Revenue, Worldwide, 2007-2013"

"Dataquest Insight: Worldwide IPTV Growth to Remain Steady, but not Spectacular"

"Leading IPTV Carriers and Their Technology Vendors, Worldwide, 2Q09 Update"

TD-SCDMA
Analysis By: Joy Yang; Sandy Shen

Definition: Time Division-Synchronous Code Division Multiple Access (TD-SCDMA) is China's homegrown third-generation (3G) standard for cellular networks. It requires only a single frequency spectrum to provide the uplink and the downlink, unlike frequency division duplexing (FDD), which requires a pair of frequencies. This is a possible advantage, as there is a lot of unused time division duplexing (TDD) spectrum globally.

Position and Adoption Speed Justification: China Mobile received a TD-SCDMA license on 7 January 2009, becoming the only carrier running this technology. Given China Mobile's financial and management strength, TD-SCDMA may have a chance to keep its place in the market. These developments are mostly driven by politics, and although they give momentum to TD-SCDMA in the short term, we still believe this technology will become obsolete before reaching maturity, as Long Term Evolution (LTE) is expected to leapfrog TD-SCDMA before it reaches the Plateau of Productivity.

In 2010, China Mobile invested heavily in TD-SCDMA network construction. By the end of the year, China Mobile's 3G network covered all county-level cities (those without a subregion beneath them), with 220,000 TD-SCDMA base stations. It has deployed Time Division-High-Speed Downlink Packet Access (TD-HSDPA), which can support a 2.8 Mbps downlink and a 384 Kbps uplink. In 2011, China Mobile will focus on increasing its TD-SCDMA subscribers. It has released a group purchase RFP for 30 million units in 2011. As of 1Q11, China Mobile had 27 million TD-SCDMA subscribers.

TDD LTE is gaining momentum with the support of vendors in the ecosystem. Major TD-SCDMA equipment vendors have claimed that current TD-SCDMA Node Bs will be able to migrate to TDD LTE with software upgrades. It is very likely that China Mobile, the only TD-SCDMA communications service provider, will adopt TDD LTE in the future. This leaves limited room for a TD-SCDMA market.
However, TDD LTE deployment in China is highly reliant on government licenses. There is little chance of China's government releasing TDD LTE licenses this year or next, since TD-SCDMA licenses were released just two years ago.

User Advice: Companies should resist signing up for TD-SCDMA data services until performance and indoor coverage improve. In the meantime, they should look at 3G options from other carriers that operate mature technologies such as wideband code division multiple access (WCDMA) and cdma2000.

Business Impact: TD-SCDMA could adversely affect China Mobile's performance if it fails to live up to expectations. Vendors with multiple product lines are less exposed to the risks presented by TD-SCDMA.

Benefit Rating: Low

Market Penetration: 1% to 5% of target audience

Maturity: Adolescent

Sample Vendors: Alcatel-Lucent; China Putian; Datang Telecom; Ericsson; Huawei; Nokia Siemens Networks; ZTE

Recommended Reading: "Market Trends: China, Third-Generation Cellular Networks, 2011"

VDSL2
Analysis By: Ian Keene

Definition: Very-high-bit-rate DSL 2 (VDSL2) is a DSL standard (G.993.2) ratified by the International Telecommunication Union (ITU) in May 2005. Theoretically, it can deliver asymmetrical or symmetrical aggregate bandwidth of 200 Mbps on twisted pairs at short distances (up to 100 Mbps downstream and upstream). However, practical deployments typically deliver 20 Mbps to 50 Mbps services. VDSL2 essentially doubles the downstream data rates delivered by VDSL, and quadruples those delivered by ADSL2+. Based on discrete multitone line code, the VDSL2 standard specifies eight bandplans, or profiles, optimized for different deployment scenarios. VDSL2 allows for operation in spectrum ranging from a minimum of 8 MHz up to 30 MHz, allowing for better performance under various loop length and noise/crosstalk scenarios. Its achievements include longer reach than VDSL (up to approximately 2.4 kilometers from the DSL access multiplexer [DSLAM]), and speeds of up to 100 Mbps (symmetric) on short loops. A significant feature is that VDSL2 uses Ethernet as a multiplexing technology, eliminating asynchronous transfer mode.

Position and Adoption Speed Justification: While most service providers are still upgrading their networks to ADSL2+, VDSL2 is being positioned for extra-strength triple play: the delivery of high-speed data, voice and digital video, with the capability of handling multiple high-definition television (HDTV) streams. Its deployment is being spurred selectively in markets that are facing strong competitive threats from cable multisystem operators and broadband cellular services. For communications service providers (CSPs) that do not want to pull fiber all the way to the home, VDSL2 is seen as a potentially economical last step.
CSPs can still use their legacy copper infrastructure, while being able to provision advanced services such as IPTV, video on demand, interactive gaming and peer-to-peer applications, all of which require not only high bandwidth, but also better upstream capability than was possible with earlier DSL flavors.
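The distance sensitivity described above can be sketched as a simple rate ladder. The break points below are illustrative values chosen to match the rough figures in this profile (100 Mbps on short loops, 20 Mbps to 50 Mbps typical, ~2.4 km maximum reach); they are not rates taken from the G.993.2 standard itself.

```python
# Approximate VDSL2 downstream rate by copper loop length (illustrative).
RATE_LADDER = [
    (300, 100),   # very short loops (fiber to the building/curb): ~100 Mbps
    (1000, 50),   # typical FTTN serving radius: 20-50 Mbps services
    (1500, 20),
    (2400, 8),    # near VDSL2's maximum reach; ADSL2+-like rates
]

def vdsl2_downstream_mbps(loop_length_m):
    """Rough achievable downstream rate, or None beyond VDSL2 reach."""
    for max_length_m, mbps in RATE_LADDER:
        if loop_length_m <= max_length_m:
            return mbps
    return None  # ADSL/ADSL2+ territory

print(vdsl2_downstream_mbps(250))    # short loop
print(vdsl2_downstream_mbps(1200))
print(vdsl2_downstream_mbps(3000))   # beyond practical reach -> None
```

The ladder makes the FTTN economics concrete: every step the DSLAM moves toward the customer buys back a rate tier, which is why the build-out of the fiber node infrastructure, not the DSL electronics, dominates deployment cost.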

The main deployments of VDSL2 will be where CSPs are expanding their fiber networks out of the central office with a fiber to the node (FTTN) architecture. This brings the DSLAMs closer to the end user, in order to benefit from the higher-bandwidth services that VDSL2 can provide over shorter copper loop lengths. Delays in building out the FTTN architecture caused a slow initial build-out of VDSL2, but deployments are now ramping up in many countries. Vendor revenue from VDSL sales is forecast to match that from ADSL DSLAMs by 2012. Several CSPs have deployed VDSL2 in selected areas, often alongside FTTH deployments. It is deployed in multitenant buildings where fiber is terminated at the building ingress and DSL is used to deliver services to the individual dwellings. With further expansion of FTTN, and increasing CSP interest in deploying high-bandwidth services, more deployments are expected, particularly where there is competition from Data-Over-Cable Service Interface Specification (DOCSIS) 3.0 cable services. Where there is less competition, deployments are not expected during the next two to three years, and ADSL or ADSL2 will dominate.

User Advice:

- CSPs should consider VDSL2 where there is a strong likelihood of intense competition from DOCSIS 3.0 cable services or High-Speed Packet Access (HSPA) cellular services. Factor in the expected arrival of Long Term Evolution (LTE) cellular services in some areas from 2011.
- Be aware that performance degrades as loop length increases. The best deployment scenarios will be FTTN, fiber to the building or fiber to the curb, with VDSL2 serving the rest of the copper-based access network outside and in-building. Building out the fiber infrastructure will be a significant cost factor for carriers in provisioning for VDSL2 services. To ensure extra bandwidth, carriers may want to consider VDSL2 enhancements.
- CSPs that face weak competition should consider whether ADSL2+ is sufficient to deliver any additional broadband services that are planned, such as IPTV and triple-play services, especially if take-up of HDTV is still sparse. Keep in mind, however, that ADSL2+ offers only about 1 Mbps of upstream bandwidth without ADSL2+ channel bonding.

Business Impact:

- It will primarily be the residential and small or midsize business markets that experience significant bandwidth upgrades from VDSL2. The standard may also have an impact in providing cellular network backhaul.
- VDSL2 will prove popular in countries where regulators favor the unbundling of FTTH, making the business case for fiber direct to the premises weaker.

Benefit Rating: High

Market Penetration: 5% to 20% of target audience

Maturity: Early mainstream

Sample Vendors: Alcatel-Lucent; ECI Telecom; Ericsson; Huawei; Nokia Siemens Networks; ZTE

Recommended Reading: "Forecast Analysis: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"

"Forecast: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"

DOCSIS 3.0 Cable


Analysis By: Ian Keene

Definition: A cable modem is a piece of customer premises equipment that modulates data signals over cable operators' hybrid fiber-coaxial infrastructure to deliver broadband Internet access. Data-Over-Cable Service Interface Specification (DOCSIS) is an international standard, specified by CableLabs, that defines a protocol for the bidirectional exchange of signal data between a cable modem and a cable modem termination system (CMTS) at the headend. DOCSIS 3.0, the latest version, was completed in 2006 and supports up to four times as much bandwidth as DOCSIS 2.0. It achieves this chiefly by:

- Bonding four or more 6MHz/8MHz channels for downstream transmission rates of 160 Mbps or more, and upstream rates of 120 Mbps or more.
- Incorporating statistical multiplexing, to give more users a higher peak capacity.
- Using Internet Protocol version 6 (IPv6) to expand IP address space.
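The headline rates above follow directly from channel-bonding arithmetic. As a sketch: the per-channel payload figures used here (roughly 38 Mbps for a 6 MHz 256-QAM downstream channel, roughly 50 Mbps for the 8 MHz EuroDOCSIS variant) are common approximations, not values quoted in this research or taken verbatim from the DOCSIS specification.

```python
def bonded_rate_mbps(num_channels, per_channel_mbps):
    """Aggregate payload rate of a DOCSIS 3.0 channel-bonding group."""
    return num_channels * per_channel_mbps

# Four bonded 6 MHz downstream channels at ~38 Mbps usable payload each:
print(bonded_rate_mbps(4, 38))   # 152 Mbps, in line with the quoted "160 Mbps or more"
# Four bonded 8 MHz (EuroDOCSIS) channels at ~50 Mbps each:
print(bonded_rate_mbps(4, 50))   # 200 Mbps
```

Because the aggregate scales linearly with the number of bonded channels, operators can add downstream capacity incrementally, which is how the 100 Mbps-plus services discussed below are provisioned.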

It also employs the Advanced Encryption Standard for more secure connections. DOCSIS 3.0 requires compatible modular CMTS equipment to enable channel bonding in both downstream and upstream directions.

Position and Adoption Speed Justification: Cable operators worldwide are demanding ultra-high-speed products, so that they can compete effectively against very-high-bit-rate DSL 2 (VDSL2) and fiber to the home (FTTH) services. In addition, mobile operators are offering attractive High-Speed Packet Access (HSPA) services, and Long Term Evolution (LTE) is on the horizon. As a result, most high-profile cable operators in North America, Europe, Asia/Pacific and Japan have upgraded a significant portion of their infrastructure to be DOCSIS 3.0-ready. Consumer services have been steadily rolled out through 2010 and 2011, and rollouts are expected to continue as this technology reaches the Plateau of Productivity. Generally, services have been limited to 50 Mbps. Providing higher speeds requires further investment in infrastructure and the deployment of fiber closer to the customer premises. However, 100 Mbps services are expanding, and a few cable operators can provide services in excess of 150 Mbps. While DOCSIS 3.0 deployment started in the U.S. to combat competition from FTTH services, such as Verizon's FiOS offerings, some cable operators are using the technology to leapfrog current ADSL bandwidth offerings from the wireline communications service providers (CSPs). Bundled TV, voice and Internet services, together with higher-bandwidth access than that provided by the

incumbent CSP, have helped to reduce customer churn. DOCSIS 3.0 has now been adopted in all worldwide regions. Most cable operators consider that, in the medium term at least, DOCSIS 3.0 will be sufficient to remain competitive with wireline CSPs' broadband offerings that use either VDSL2 or FTTH access networks.

User Advice: Volume manufacturing of DOCSIS 3.0-based products has ramped up. While the consumer market is the main target of the operators, enterprise and small or midsize business (SMB) customers might want to consider these solutions for their remote workers or for their offices. With the addition of DOCSIS 3.0 to their portfolios, cable operators wanting to attract SMBs will have more solutions to offer targeted segments of the commercial sector, beyond using out-of-band overlay technologies that provide Ethernet-type bandwidth or direct fiber connections.

Business Impact: The initial impact of DOCSIS 3.0 equipment has been mainly on the residential consumer market. Higher transmission rates of 100 Mbps and above will give cable operators more effective competitive firepower against VDSL2- and FTTx-based competitors. It also allows cable operators to provide on-demand IPTV services in addition to their broadcast products.

Benefit Rating: High

Market Penetration: 20% to 50% of target audience

Maturity: Early mainstream

Sample Vendors: Arris; Cisco; Motorola; Netgear; Technicolor

Recommended Reading: "Forecast Analysis: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"

"Forecast: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"

Online Video
Analysis By: Andrew Frank

Definition: Online video describes the delivery of video as a digital data stream over a broadband Internet connection to any device capable of receiving and displaying it. The focus of this definition (in contrast to similar profiles, such as Internet TV) is on the use of Internet Protocol and infrastructure as the primary transmission channel for streaming video, regardless of the nature of the content or the device. Display devices may include PCs, smartphones, media tablets and TVs that connect directly or indirectly to broadband sources by a variety of methods, both wired and wireless, including wireless data protocols such as 3G and LTE. We stipulate only that the transmission between the video source and the receiver must traverse public Internet infrastructure, excluding arrangements such as telco IPTV distribution architectures that rely on managed networks.

Online video encompasses several classes of content:

- Internet TV, which refers to licensed professional content produced for television broadcast or, increasingly, for Web distribution
- Consumer-generated video content posted on social sites such as YouTube and Facebook
- Private and corporate communications and training video, including one-way, two-way and multipoint interactions

We can also divide delivery modes into the following:

- Live streams
- On-demand content libraries
- Programmed, pre-recorded content delivered on a linear schedule

The motivation for taking a high-level, content-neutral view of this definition is to capture the assertion, challenged by some, that the Internet is on its way to becoming the predominant channel for delivering video of any type to any device, and that, while it may not soon displace broadcasting for mainstream TV content, other delivery options are likely to be marginalized.

Position and Adoption Speed Justification: In terms of consumer adoption, online video continues to make inroads, moving well beyond its initial preoccupation with the short-form, user-generated content typical of the early years of YouTube. Consumers are increasingly viewing licensed professional content, such as movies and TV shows, as online video streams. This is evidenced, for example, by the fact that Netflix has used Internet streaming to build 24 million video on demand (VOD) subscriptions, surpassing Showtime and Starz and closing in on HBO. As a result, Netflix now generates nearly 30% of peak Internet traffic in North America, according to a recent report by Sandvine. HBO, for its part, recently rolled out its own online video service, HBO Go, which relies on TV service providers to handle subscriber relationships, including billing and authentication.

The success of Apple's iPad in redefining the tablet category around media has also contributed to the rise in online video usage, which is occupying an increasing share of smartphone time as well. The popularity of these devices has helped accelerate the embrace of online video by incumbent TV distributors (cable and satellite operators) through the TV Everywhere model, which aims to provide subscribers with access to the television content they subscribe to, on any device at any location, subject to the licensing restrictions of programmers. At the same time, Web portals and publishers continue to raise their investments in online video for the Web and mobile applications, both through original productions and curated services.
The falling costs and rising quality of handheld digital video cameras, the integration of video into smartphones and media tablet camera apps, and easy-to-use video editing and management software are also contributing to the acceleration of this trend. Additionally, corporations have realized they can create acceptable quality videos in-house without hiring an expensive production crew. All of this growth in online video usage has led to renewed speculation that additional caps or metering on fixed or mobile broadband connections will be needed to bring usage in line with
bandwidth capacity and costs, as has recently been observed in some mobile data plans. It has also led to predictions of a coming "broadband crisis" that will force many regions to resolve controversial issues affecting investment, such as spectrum reallocation and net neutrality regulations. Nonetheless, demand for online video seems unlikely to reverse its steep growth trend anytime soon, and consumers are growing accustomed to watching video on a number of broadband-connected devices. These factors suggest that the investment needed to bring capacity in line with growing demand will generally be available, even if it disrupts some legacy businesses.

User Advice: Beyond media applications, online video is creating opportunities for organizations to leverage the power of video for corporate communications, training, collaboration and marketing initiatives.

Advertisers should insist that TV advertising investments be tailored for online video delivery as well, and ensure that they're suitably enhanced with features that exploit the interactivity and social connections of the online environment. Advertisers should also consider how best to utilize the increased measurability of online video for optimizing campaigns, by observing engagement factors such as completion rates and tune-out points.

Communications service providers need to explore aggressively how best to balance public demands for net neutrality and unmetered access to high-quality online video (especially in fixed-line contexts) with tiering concepts that ensure escalating usage is equitably compensated by consumers and network peers.

Video content providers and investors need to recognize that, although online video can lower barriers and reduce distribution costs, it doesn't solve major distribution problems, such as how to make video discoverable and monetizable in an increasingly crowded field of competition for the attention of fragmented audiences.
Business Impact: Online video will impact media distribution and advertising, especially in political and cause-related domains that resonate with social distribution. Online video has a global impact on market research, corporate communications, and copyright protection and licensing. It will increasingly be an expected part of any online promotional or merchandising presence.

Benefit Rating: High

Market Penetration: 20% to 50% of target audience

Maturity: Early mainstream

Sample Vendors: BBC; Brightcove; Comcast; Hulu; Netflix; Ooyala; Yahoo; YouTube

Interactive TV
Analysis By: Andrew Frank

Definition: Interactive TV can be defined as television programming (that is, licensed video content professionally produced for a home-viewing audience) made interactive by the addition of overlays or other synchronized signals indicating that a viewer response is enabled (for instance, to vote in a poll, play a game, or indicate interest in an advertised product). Because the profile of interactive TV (hereafter ITV, not to be confused with the U.K. broadcast network) predates the use of the Internet as a video delivery platform, the term has conventionally been used to refer to methods of achieving interactivity that do not involve Internet technologies. Thus, a legacy definition would restrict ITV to programming delivered by a platform that enables interactive elements to be bound to a program's video stream, distributed over a terrestrial or multichannel video service such as cable or satellite TV, and decoded by a set-top box (STB). ("Multichannel" in this context refers to a broadcast-style transmission through which video feeds are assigned to fixed channels that are tuned by a receiver.)

The growing use of the Internet to deliver TV programming (sometimes referred to as "over-the-top," or OTT, services) to a variety of devices, including connected TVs, Blu-ray players and game consoles, but also PCs and, increasingly, media tablets and smartphones, has created a competing vision to the STB-based notion, as these platforms, like the Internet itself, are inherently and richly interactive. OTT has also created ambiguity in the definition of "television" itself. Some consider TV to refer only to broadcast-based transmission. Others take the position that it's the presentation of content on a TV set that's the defining factor. We take the position here that it's the programming that defines a service as "TV."
There is also an increasing trend to deliver interactivity synchronized with TV programming to a mobile app, which a user may access on a smartphone or media tablet while watching a broadcast, bypassing the video delivery system altogether. This fissure in definitions creates a challenge in pinpointing the progress of this technology. For the purposes of this analysis, we will consider ITV in the broad sense of binding interactivity to licensed video content, and examine the competing positions. We will use the term "in-band" to refer to multichannel-based services, "OTT" to refer to Internet-based services, and "dual-screen" to refer to the co-viewing approach. Similarly, we'll use the term "multichannel video programming distributor" (MVPD) to refer to pay-TV distributors (cable, satellite or IPTV) that operate in-band services, and "online video distributor" (OVD) to refer to TV distributors that operate OTT services. These follow the U.S. FCC's formal definitions of these terms.

Position and Adoption Speed Justification: MVPDs have been trying for more than 20 years to produce an interactive TV platform that is economically viable and appealing to consumers, yet that goal remains elusive. Meanwhile, the Internet has grown into a credible rival to channel-based video delivery systems, and has significantly raised expectations for the kinds of interactive experiences marketers and consumers might share. This contrast has produced a complex, high-stakes battle for the future of interactivity on television. It also makes ITV an erratic and slow-moving technology on the cycle.

For advertisers and marketers, the high value of interactivity is twofold: it is a direct-response mechanism that can lead directly to a sale (often referred to as t-commerce), and it has been proven to significantly increase brand recall, purchase intent and other metrics of high importance to brand marketers. These factors have contributed to the growth of Internet advertising into the second-largest ad medium behind television, although television still has a significant lead overall.

Many of the difficulties MVPDs have had with ITV can be traced to the high cost of replacing customer premises equipment, combined with 10-year-old decisions to deploy digital STBs whose cost constraints made them nearly obsolete at the time of their deployment (compared with the capabilities of contemporary PCs). In 2001, as the first Internet bubble was bursting and MVPDs were transitioning to digital cable and satellite services, the threat that broadband Internet would soon re-emerge to mount a credible challenge to multichannel digital TV technologies for the delivery of even high-definition TV programming seemed remote to most service providers, who were understandably more focused on near-term economic trade-offs. In the intervening period, the capabilities of PCs and Internet connections followed Moore's Law, roughly doubling every 18 months, and are now supplemented by smartphones and media tablets whose capabilities far exceed those of the legacy digital STBs.

Even when STB upgrades are deployed, MVPDs are generally reluctant to add disruptive features, such as open broadband connectivity and support for Internet standards, because this would undermine their control and their ability to monetize ITV features. MVPDs have often clashed with broadcasters over control of interactive video platforms, and this continues to motivate MVPDs to design platforms that isolate interactions from the open Internet.
Despite these restrictions, the strategy of subsidizing "free" STBs bundled with service has effectively kept competing consumer electronics products from gaining much traction in the market. The U.S. cable industry has tried to make the most of its base of legacy STBs by developing and deploying Enhanced TV Binary Interchange Format (EBIF), an in-band CableLabs standard for basic interactivity that is currently installed on about 25 million U.S. STBs. In early 2010, Canoe Ventures, a joint venture of the six largest U.S. cable companies, launched a request-for-information (RFI) product that leverages EBIF to provide advertisers with a text overlay to a TV commercial, to which viewers can respond with their remotes. Although no results have been published, indications are that uptake has been slow.

In Europe and South Korea, Digital Video Broadcasting Multimedia Home Platform (DVB-MHP) is available over the air on at least 20 million STBs, often with a telephone-service-based return path (wired or wireless), while in the U.K., MHEG-5 has been deployed by Freeview (a digital terrestrial TV service reaching 70% of U.K. households) and Freesat (a joint satellite venture between the BBC and ITV), and OpenTV has been deployed by the Sky TV satellite service. These deployments offer interactivity, but no return path. To address this, the BBC, ITV, Channel 4 and Channel 5 are collaborating on a project called YouView (formerly Project Canvas), which is planning to launch early in 2012. These developments suggest that, for these regions, ITV is becoming a reality with less contention than seen in the U.S.

Another challenge to ITV has been the difficulty of designing compelling interactions based on a standard remote control pointed at a screen about 10 feet away (this is sometimes called the "10-foot interface problem"). As smartphones and media tablets find their way into homes, many marketers and developers have latched onto the dual-screen notion that interacting with an application on a smartphone or tablet, synchronized with televised programming content, might be a more compelling model for ITV, while providing a way around both the isolation of STBs and the fragmentation of OTT platforms. This approach is disruptive both to the MVPD model for ITV and to OTT models from OVDs and connected TV manufacturers.

Gartner's assessment of the progress and impact of ITV is an average of competing models. Standards such as EBIF are fundamentally challenged by the rising global trend of Internet TV consumption, including on mobile devices and connected TVs, which bypasses closed STBs. These alternate TV consumption devices, which will tend to support interactivity through native apps or HTML5 programming techniques, are unlikely to recognize EBIF triggers, which will limit the ubiquity of EBIF or any other MVPD standard that is not compatible with broadband-connected devices. And the dual-screen approach shows that ITV can also be achieved through ensemble interaction techniques that don't rely on any sort of in-band signaling. So, although ITV continues to make progress and is increasingly spurred by high-stakes competition, the outcome remains murky and still requires programmers to hedge their bets with multiple approaches.

User Advice:

- Service providers need to align with their regional industry groups and negotiate collectively for interoperable standards that allow their network platforms (cable, satellite, IPTV, broadcast and online video) to remain competitive and economical to develop for. Service providers also need to focus on multiscreen strategies (TV, PC and mobile) for service bundling and integration. However, service providers must be wary of standards that are incompatible with Internet protocols.
- Broadcasters and content providers should focus on how to incorporate standards-based interactivity into programming to bring more value to audiences and sponsors.
- Manufacturers should resist the temptation to create differentiation on the level of standards implementations that would undermine interoperability, and should seek advantage on the application level instead (such as better support for Internet TV and video device controls).
- Advertisers and ad agencies need to press for control over metrics and reporting standards, and work to ensure full transparency and openness in interactive TV advertising markets.
- All commercial parties should focus in the near term on partnerships and alliances in the newly forming "ecosystem" for interactive TV services, and hedge their bets on any single technology solution.
- Regulators should focus on ensuring fair competition among service providers and standards bodies, and be aware that technology is creating media environments in which legacy regulations are often inapplicable or irrelevant.

Business Impact: TV service providers (MVPDs and OVDs) have a substantial opportunity to increase their revenue share from advertisers and direct marketers by offering interactive features that can support transactions and consumer engagement. Consumer electronics, middleware and STB vendors face potentially decisive competition over where to strike the right balance between
features and cost. TV networks and advertisers, for which DVR-based ad skipping and shifts in Internet advertising spending are significant disruptive trends, rely on interactive features, along with more dynamic targeting, to shore up the value of the TV medium to advertisers.

Benefit Rating: High

Market Penetration: 5% to 20% of target audience

Maturity: Adolescent

Sample Vendors: Apple; BBC; Canoe Ventures; Ensequence; Ericsson Television; Google; Intel; Invidi; Microsoft; Nielsen; OpenTV; Rovi; TiVo; Yahoo

Recommended Reading: "New Television Meets Context-Aware Computing"

"A Scenario for the Future of Television in the Cloud"

MPEG-4 Advanced Video Coding


Analysis By: Ian Keene
Definition: MPEG-4 Part 10, or MPEG-4 Advanced Video Coding (AVC), is the standard defined by the Moving Picture Experts Group (MPEG) for compressing audio and visual data for:

Internet over-the-top (OTT) streaming video.
Internet Protocol television (IPTV).
Digital broadcast TV.
Cable and satellite standard-definition/high-definition (SD/HD) linear broadcast.
CD/DVD distribution.
Videoconferencing.

It is part of the larger collection of MPEG specifications introduced in late 1998, which were designated as the standard ISO/IEC 14496 for a group of audio and video coding formats and related technologies. MPEG-4 AVC is technically identical to the video compression standard known as H.264, issued by the International Telecommunication Union's Telecommunication Standardization Sector (ITU-T), which coordinates standards for telecommunications on behalf of the ITU. The H.264 and MPEG-4 Part 10 standards are jointly maintained so that they have identical technical content. MPEG-4 AVC aims to provide good video quality at significantly lower bit rates (half or less of the bit rates of MPEG-2, H.263 or MPEG-4 Part 2), without an increase in design complexity that would make it impracticable or too expensive to deploy. In practice, this means bit rate savings of around 50% at the same quality as MPEG-2. It also aims to be flexible enough to be applied to a wide variety of applications on a variety of networks and systems.
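The 50% bit rate saving translates directly into channel capacity, which is why the codec matters so much to satellite and cable operators. The sketch below makes the arithmetic concrete; the transponder payload and per-channel rates are illustrative assumptions, not figures from this research:

```python
# Rough sketch: how many HD channels fit in one satellite transponder
# carrying an assumed 50 Mbps of usable payload. The per-channel bit
# rates below are illustrative assumptions, not measured values.

TRANSPONDER_MBPS = 50.0              # assumed usable payload
HD_MPEG2_MBPS = 16.0                 # assumed HD channel rate, MPEG-2
HD_MPEG4_MBPS = HD_MPEG2_MBPS * 0.5  # ~50% saving claimed for MPEG-4 AVC

channels_mpeg2 = int(TRANSPONDER_MBPS // HD_MPEG2_MBPS)
channels_mpeg4 = int(TRANSPONDER_MBPS // HD_MPEG4_MBPS)

print(channels_mpeg2)  # 3 HD channels with MPEG-2
print(channels_mpeg4)  # 6 HD channels with MPEG-4 AVC
```

Under these assumptions, moving to MPEG-4 AVC doubles the number of HD channels per transponder, which is the economic case behind the transition described below.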


MPEG-4 MVC (Multiview Video Coding) is an extension of MPEG-4 AVC used for 3D TV, in which alternate left and right frames are carried in the video stream. The key features of the technology, which are embedded in encoders, decoders, receivers, set-top boxes (STBs) and other devices, include significantly improved coding efficiencies (which reduce the amount of bandwidth required for video transmissions), the ability to encode mixed media data (video, audio, speech), error resilience for robust transmissions and the ability to interact with the audio-visual scene generated at the receiver.

Position and Adoption Speed Justification: MPEG-4 AVC/H.264 technology has been adopted in many countries by digital terrestrial broadcasters and service provider networks for the transmission of HDTV content over satellite, cable and wireline DSL/FTTH IPTV systems. It is also used for cellular IPTV and is embedded in 3G smartphones. MPEG-4 is part of the Blu-ray Disc format of the Blu-ray Disc Association (BDA). It is used by OTT services, such as YouTube, and is the codec of choice for enterprise video applications. Wireline communications service providers (CSPs) have been quick to embrace MPEG-4 for IPTV services. Satellite and cable operators have been slower to make the transition from MPEG-2, due to the cost of replacing legacy equipment, particularly the large installed base of customer premises equipment (CPE). Programmers and satellite service providers are adopting advanced compression technologies to save transponder capacity on the satellites they use to transmit their programming to customers nationally and globally. MPEG-4 is becoming essential as the number of broadcast HDTV channels increases. Many cable operators still have a significant installed base of MPEG-2-based headend encoders and STBs. While they are moving more aggressively toward adopting MPEG-4, it is expensive to overhaul their entire customer base with MPEG-4-compatible STBs. Dual MPEG-2/4 codec cable STBs are a solution for some.
As more customers upgrade to an HDTV service tier, the penetration of MPEG-4 in STBs increases.

User Advice: Cable and satellite service providers need to align their business plans with their technology evolution plans with regard to MPEG-4 AVC. The technology is a key enabler of HDTV, and the overall strategy for promoting HDTV should dictate how aggressively cable operators need to transition to MPEG-4-enabled STBs. Something similar applies to those considering 3D TV services. Furthermore, the integration of linear broadcast, IPTV and OTT video will drive the transition to more-advanced codecs that enable a multiscreen strategy. Cable and satellite service providers must weigh the costs and operational effects of implementing MPEG-4 AVC against operational efficiency improvements and revenue generation. This includes calculations for STB swap-outs. Any service provider using legacy MPEG-2 is increasingly exposed to competitors.

Business Impact: The implementation of advanced video encoding technologies will be crucial to the success of network service providers' operational efficiency for video, and their attempts to pursue additional revenue. This will enable cost savings due to lower bandwidth and capacity

requirements, as well as new revenue from HDTV content. The technology is important for the contribution and distribution of content, especially in video-on-demand distribution systems.
Benefit Rating: High
Market Penetration: 20% to 50% of target audience
Maturity: Mature mainstream
Sample Vendors: Cisco; Envivio; Ericsson-Tandberg Television; Harmonic; Harris; Motorola; Technicolor
Recommended Reading: "Forecast Analysis: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update" "Forecast: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"

OTN and GMPLS/ASON


Analysis By: Peter Kjeldsen

Definition: The International Telecommunication Union (ITU) defined the now incumbent Synchronous Digital Hierarchy/Synchronous Optical Network (SDH/SONET) standards to establish a digital hierarchy of "circuits," i.e., network connections established using time division multiplexing (TDM) technology. SDH/SONET was successful because it offered a standards-based approach to implementing multiplexing, transport, management, supervision and survivability (including sub-50-millisecond protection switching). However, as wavelength division multiplexing (WDM) technologies entered communications service provider (CSP) networks, it became apparent that the "wavelengths" in the WDM layer would need operations, administration and management (OAM) capabilities similar to those defined by SDH/SONET for the TDM layer, in order to manage the ever-increasing wavelength counts in CSP networks. At the same time, TDM technology had matured, and 40 Gbps and 100 Gbps per-channel transport was being developed. Consequently, the ITU defined the Optical Transport Network (OTN) G.709 standard to provide a mechanism that allows CSPs to control diverse traffic types, including both circuits in the TDM layer (at up to 100 Gbps) and wavelengths in the WDM layer. The underlying principles of OTN's OAM capabilities are inherited from SDH/SONET. The Internet Engineering Task Force's Generalized Multiprotocol Label Switching (GMPLS) standard and the ITU's Automatically Switched Optical Network (ASON) standard enable CSPs to automate their transport networks by means of an intelligent control plane and associated signaling. GMPLS/ASON functionality can be applied to different types of transport network, such as SDH/SONET, and to emerging packet-oriented transport solutions such as Multiprotocol Label Switching Transport Profile (MPLS-TP).
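The digital hierarchy that G.709 defines can be illustrated with a small sketch. The container rates below are approximate nominal values, rounded for illustration (the exact figures are specified in ITU-T G.709), and the mapping logic simply picks the smallest container that fits a client signal:

```python
# Sketch: choosing a G.709 ODU container for a client signal. Rates are
# approximate nominal values in Gbps, rounded; see ITU-T G.709 for the
# exact figures and mapping rules.

ODU_RATES_GBPS = [
    ("ODU0", 1.24),    # e.g. a ~1 Gbps client
    ("ODU1", 2.50),    # e.g. STM-16/OC-48
    ("ODU2", 10.04),   # e.g. STM-64/OC-192
    ("ODU3", 40.32),   # e.g. STM-256, 40 Gbps transport
    ("ODU4", 104.79),  # e.g. 100 Gbps transport
]

def smallest_container(client_gbps):
    """Return the smallest ODU container that fits the client rate."""
    for name, rate in ODU_RATES_GBPS:
        if client_gbps <= rate:
            return name
    raise ValueError("client rate exceeds ODU4 capacity")

print(smallest_container(1.0))    # ODU0
print(smallest_container(9.95))   # STM-64 -> ODU2
print(smallest_container(100.0))  # ODU4
```

In a GMPLS/ASON-controlled network, this kind of client-to-container assignment is performed by the control plane through signaling, rather than through the manual provisioning procedures it replaces.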


Position and Adoption Speed Justification: Adoption of OTN and GMPLS/ASON started slowly, but is speeding up as CSPs upgrade and automate their transport networks to cost-effectively support bandwidth-hungry video services and mobile data. OTN and GMPLS/ASON technology is adopted primarily by CSPs that need to automate their transport networks; although it started as a technology only for the largest CSPs, it has now also found its way into smaller networks as the wavelength count in these networks increases beyond the "automation threshold." While CSPs need to rapidly add bandwidth to their infrastructure, it is rare that they need to take bandwidth away, so this relatively monotonic growth scenario does not utilize the full flexibility offered by OTN and GMPLS/ASON technology, which could easily cater to more dynamic provisioning scenarios. Examples of such dynamic requirements have appeared, one being AT&T's Optical Mesh Service, where the ability to reallocate bandwidth as needed, by increasing or decreasing capacity in near real time, is central to the offer in the business segment.

User Advice: Investments in OTN and GMPLS/ASON must be justified by operational savings from the automation of tasks that previously were manual or semi-manual. The larger the network, and the larger the amount of traffic, the easier it is to justify this kind of investment. An additional proven factor is the ability to handle complex (and even malicious) failure scenarios by means of sophisticated protection schemes. As the technology has matured and the cost of implementing it has come down, OTN and GMPLS/ASON is changing its character, away from being a key differentiator toward becoming merely "table stakes," especially among the largest CSPs.
Business Impact: OTN and GMPLS/ASON enable service providers to optimize network operations by replacing centralized, network management-controlled manual procedures with decentralized, signaling-controlled automated ones.
Benefit Rating: Moderate
Market Penetration: 20% to 50% of target audience
Maturity: Mature mainstream
Sample Vendors: Alcatel-Lucent; Ciena; Ericsson; Fujitsu; Huawei; Nokia Siemens Networks; ZTE

Entering the Plateau


Next-Generation Voice
Analysis By: Deborah Kish
Definition: Next-generation voice refers to the network architecture, equipment and protocols needed to replace the traditional time division multiplexing public switched telephone network (PSTN) with voice over Internet Protocol (VoIP), and to provide enhanced voice functions and applications in both fixed and mobile networks. In mobile networks, the architectural design for


next-generation voice includes an IP Multimedia Subsystem (IMS) core, as well as the GSMA's voice over Long Term Evolution (VoLTE) initiative. These approaches allow mobile communications service providers (CSPs) to deploy telephony in an efficient way, while moving to upgrade their networks to accommodate broadband-based multimedia services.

Position and Adoption Speed Justification: Approaches to next-generation voice are helping CSPs to reduce the cost of delivering telephony services. Although voice services, because of their low ROI, don't lead to revenue growth, they do provide a steady income stream, so CSPs have to reduce the cost of delivering them, while simultaneously innovating toward provisioning new voice-enabled service bundles, particularly in fixed networks. The cost to provide mobile voice is relatively low, but the ROI is considerably higher, so it is an important revenue stream for future infrastructure investments. IMS in the core will complement LTE when mobile next-generation voice becomes available. CSPs are under pressure to maintain good-quality voice services, so are working toward achieving high-definition voice services. Vendors and CSPs should participate in initiatives such as Rich Communication Suite, and should work with software and handset vendors to improve interoperability and accelerate time to market.

User Advice: Continue to investigate and take advantage of the benefits of open and standardized technology architectures with interoperable interfaces, such as Session Initiation Protocol (SIP). Investigate the benefits of telephony emulation, using VoIP rather than IMS architecture for multiservice delivery. Users should evaluate advanced features accurately, ignoring technology hype while acknowledging true added value and understanding the key benefits.
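As an illustration of the kind of standardized, interoperable interface referred to above, SIP is a plain-text protocol; a minimal INVITE request can be assembled as below. All user names, hosts and tags here are hypothetical, and a real SIP stack would add authentication, an SDP body and further headers:

```python
# Minimal SIP INVITE request assembled as plain text. All users, hosts
# and tags are hypothetical examples; see RFC 3261 for the full grammar.

CRLF = "\r\n"
invite = CRLF.join([
    "INVITE sip:bob@example.com SIP/2.0",
    "Via: SIP/2.0/UDP alice-pc.example.com;branch=z9hG4bK776asdhds",
    "Max-Forwards: 70",
    "To: Bob <sip:bob@example.com>",
    "From: Alice <sip:alice@example.com>;tag=1928301774",
    "Call-ID: a84b4c76e66710@alice-pc.example.com",
    "CSeq: 314159 INVITE",
    "Contact: <sip:alice@alice-pc.example.com>",
    "Content-Length: 0",
]) + CRLF + CRLF  # blank line terminates the header section

print(invite.splitlines()[0])  # -> INVITE sip:bob@example.com SIP/2.0
```

Because every compliant endpoint parses this same text format, equipment from different vendors can interoperate, which is the practical benefit of standardized interfaces that the advice above points to.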
Fixed voice revenue still running on the PSTN will continue to be the "cash cow" that helps pay for investments in advanced technology and the applications that will help differentiate CSPs, so waiting out the long life cycles of traditional equipment in select parts of the network may be a good short-term solution (no longer than two years).

Business Impact: The impact of next-generation voice is widespread and will affect CSPs, their corporate customers and residential users, and vendors of next-generation voice technology. VoIP and VoLTE will increase competition between service providers, and should encourage the appearance of a wide range of new Web application providers, such as Google and Skype, as well as cable operators offering voice services. Lower price points, due to increasing competition and lower production costs, will encourage residential users and enterprises to adopt services at an increased rate. Government initiatives will encourage service providers to increase their reach and upgrade their networks.
Benefit Rating: High
Market Penetration: More than 50% of target audience
Maturity: Mature mainstream
Sample Vendors: Alcatel-Lucent; BroadSoft; Cisco; Ericsson; Genband; Huawei; Italtel; Metaswitch Networks; Motorola; NEC; Nokia Siemens Networks; ZTE
Recommended Reading: "Magic Quadrant for Softswitch Architecture"


"Forecast: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update" "Forecast Analysis: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update" "Magic Quadrant for LTE Network Infrastructure"

ROADMs
Analysis By: Peter Kjeldsen

Definition: Reconfigurable optical add/drop multiplexers (ROADMs) are the wavelength division multiplexing (WDM) equivalent of the add/drop multiplexing that has been used in the Synchronous Digital Hierarchy (SDH) and Synchronous Optical Network (SONET) markets for more than a decade. ROADMs enable communications service providers (CSPs) to automate the way individual wavelengths of WDM systems are routed through their networks, mainly in ring configurations, with protection switching and easy provisioning being the major benefits. ROADMs are a trade-off between capital expenditure (capex) and operating expenditure (opex): a capex premium is paid (relative to less-dynamic WDM solutions) to save subsequently on opex, due to enhanced flexibility and efficiency in the use of network wavelengths. In Gartner's market statistics for optical transport systems, ROADMs are included in the optical exchange equipment (OXE) segment, which includes all types of node equipment that handle traffic in the optical domain without relying on client-layer functions (such as those provided by SDH/SONET). The OXE segment also includes optical switches and optical cross-connects.

Position and Adoption Speed Justification: ROADM equipment has been deployed for several years, especially in large, bandwidth-hungry networks that are being upgraded to cater for mobile data and video services. Many CSPs have already invested in this technology.

User Advice:

Be sure to weigh the opex advantage offered by ROADMs against their intrinsic capex premium, ensuring that investment decisions are made from an appropriate total cost of ownership point of view.

Smaller CSPs are likely to find that they will not have the same economic incentives to invest in ROADMs as larger CSPs, simply because of the lower wavelength count in their networks.

Equipment vendors should focus their ROADM marketing efforts on midsize and large CSPs, and on those that are early adopters of bandwidth drivers such as fiber to the home and Long Term Evolution.
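The capex/opex trade-off described above can be sketched as a simple total-cost-of-ownership comparison. All monetary figures below are illustrative assumptions, not Gartner data:

```python
# Sketch of the ROADM capex/opex trade-off: a higher up-front cost is
# paid back through lower annual provisioning opex. All figures are
# illustrative assumptions for a single node over a planning horizon.

def tco(capex, annual_opex, years):
    """Total cost of ownership over the horizon (no discounting)."""
    return capex + annual_opex * years

roadm_tco = tco(capex=500_000, annual_opex=40_000, years=5)
fixed_tco = tco(capex=350_000, annual_opex=90_000, years=5)

print(roadm_tco)              # 700000
print(fixed_tco)              # 800000
print(roadm_tco < fixed_tco)  # True: the premium pays back in 5 years
```

Under these assumed figures the break-even point is (500,000 - 350,000) / (90,000 - 40,000) = 3 years; a smaller CSP with fewer wavelengths would see lower annual opex savings and therefore a later break-even, or none, which is the economic intuition behind the advice above.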

Business Impact: By automating wavelength handling, ROADMs enable CSPs to achieve "bit-wise economies of scale" through faster and more flexible provisioning in their WDM networks.
Benefit Rating: Moderate


Market Penetration: 20% to 50% of target audience
Maturity: Mature mainstream
Sample Vendors: Adva Optical Networking; Alcatel-Lucent; Cisco; ECI Telecom; Fujitsu; Huawei; Nokia Siemens Networks

Mobile TV Streaming
Analysis By: Shalini Verma

Definition: Mobile TV streaming is the streaming of live TV from cellular networks to mobile and portable handsets using narrowcasting or multicasting technology.

Position and Adoption Speed Justification: In the past year, the industry focus has been mainly on securing broadcast rights for live sporting events. In April 2011, the satellite broadcaster BSkyB signed a new multiyear agreement with the Union of European Football Associations (UEFA) for U.K. satellite rights to 129 live UEFA Champions League matches per year, in addition to the UEFA Super Cup. In this agreement, Sky also gained cross-platform rights, including rights for broadcasts via Sky Mobile TV. Telstra won the 2012 to 2016 rights for broadcasting every Australian Football League match live on its mobile network. In India, Cricket World Cup and Indian Premier League cricket matches in 1H11 became key drivers of mobile TV streaming services. Apalya Technologies, which had the mobile rights for the matches, added more than a million new subscribers to its mobile TV services in the first five months of 2011. In mature markets, communications service providers (CSPs) and media companies are focused on extending their TV broadcasting services via apps on the iOS, Android and BlackBerry platforms. In particular, the tablet is emerging as a major device of choice for mobile TV streaming; the average viewing time on some sports apps on tablets is twice as much as on smartphones. Mobile apps are largely offered free to subscribers of TV program packages. Verizon is extending its FiOS TV broadcasting service to the iPad, to be watched at home by its TV subscribers. Telenor has partnered with Aspiro TV to offer new mobile TV apps on Android and iOS devices in the Nordic countries. Broadcasters are also releasing optimized mobile TV apps for the key mobile platforms. ESPN has extended its TV Everywhere service to the iOS platform.
To honor existing carriage agreements, Time Warner Cable and Cablevision Systems offer live TV iPad apps only within the subscriber's home over Wi-Fi. In the U.K., in addition to the BBC's iPlayer, other catch-up TV services have been extended to iOS platforms. In emerging markets, traditional mobile TV streaming services are being used to attract subscribers. In India, CSPs are using mobile TV as bait to recruit subscribers to their newly launched third-generation (3G) services. In Africa, CSPs want to use mobile TV to gain a competitive advantage. Vodacom has partnered with MultiChoice to offer a mobile TV streaming service on 2.5G and 3G devices in Tanzania. Live streaming of the 2010 FIFA World Cup helped to popularize mobile TV streaming in emerging markets. MobiTV, a key mobile TV platform provider, continues to build on its success in terms of content partnerships and subscribers. In September 2010, it had 13 million subscribers globally, driven by


the extension of the service to multiple mobile platforms such as iOS, Android and BlackBerry. MobiTV has found that Android devices have higher TV viewing attach rates (the ratio of video clips viewed per device) and conversion rates from free trials to paid subscriptions than iOS and BlackBerry devices. The YouTube app's presence on iOS, Android and BlackBerry devices translates into a massive installed base. In addition, YouTube on HTML5-compliant mobile browsers offers a rich experience. In June 2010, YouTube had 100 million mobile views per day, and the number of videos streamed on mobile devices increased by 160% year over year. Long Term Evolution (LTE) network rollout will be a key driver of mobile TV streaming, once capable devices are available. As HTML5 becomes more pervasive, the industry will start to evaluate it for mobile TV streaming. There is no significant change in pricing from last year. CSPs use various pricing models, including free and subscription-based models, as well as bundling streaming with other data services.

User Advice: CSPs offering TV and video streaming services need to consider viewing patterns and the growth of mobile TV and video streaming traffic in their LTE infrastructure rollout planning. In the interim, they can use video optimization solutions, which can improve the video experience by 40% to 70%. While HTML5 will become a key delivery option for mobile TV streaming, broadcasters that rely on digital rights management will not find HTML5 suitable for their business model. They will have to look again at their business model and content rights if they want to use HTML5. Broadcasters and CSPs need to invest in video analytics to understand the mobile TV viewing habits of their users.

Business Impact: Mobile TV streaming has an impact on mobile data services in terms of mobile data traffic usage and mobile data revenue.
With the introduction of tiered pricing for mobile broadband, the increased usage of mobile TV streaming on Wi-Fi will adversely affect CSPs' revenue from mobile TV streaming services. On the upside, it will help CSPs to manage bandwidth utilization more effectively. Mobile TV streaming will also affect the business of content providers and broadcasters as they make mobile TV streaming a core part of their content strategy. Content rights also come into play.
Benefit Rating: Moderate
Market Penetration: 20% to 50% of target audience
Maturity: Early mainstream
Sample Vendors: MobiTV; YouTube
Recommended Reading: "Forecast: Mobile Application Stores, Worldwide, 2008-2015"

Off the Hype Cycle


Mobile TV Broadcasting
Analysis By: Shalini Verma

Definition: The broadcasting of digital TV programs to mobile handsets and media tablets using technologies such as Digital Video Broadcasting Handheld (DVB-H), Terrestrial Digital Multimedia Broadcasting (T-DMB) and MediaFLO.

Position and Adoption Speed Justification: 2010 continued to be an unexciting year for mobile TV broadcasting, which is fighting an uphill battle against TV streaming and mobile apps for video on demand. During 2010, mobile apps for TV streaming were the key focus areas for media companies and communications service providers (CSPs). Another reason for a lackluster year for mobile TV broadcasting was that very few smartphones released onto the market had native support for mobile TV broadcasting technologies; a few Android smartphones and media tablets were launched with T-DMB TV tuners in Korea. To extend mobile TV broadcasting onto the newer smartphones, technology providers have been releasing external receivers as accessories. SoftBank made a new iPhone app available, allowing users to view One-Seg without needing to keep the One-Seg TV tuner in close proximity. Nokia, which is still trying to push DVB-H, announced the launch of a mobile headset that acts as a DVB-H receiver for its Symbian devices. MultiChoice also launched a separate mobile TV decoder, the Drifta, which receives a DVB-H signal and converts it into a Wi-Fi signal for Wi-Fi-enabled devices, such as laptops, and Windows and iOS devices. During the past year, CSPs have continued the trend of shutting down their mobile TV broadcasting services. With the industry focus on mobile apps, it was impossible for AT&T, for example, to justify the premium monthly price of $10 to $15 for FLO TV (MediaFLO). In October 2010, Qualcomm suspended new sales of the service to consumers, and in March 2011 the FLO TV service was discontinued.
Dutch operator KPN Telecom is also terminating its DVB-H mobile TV service, MobielTV, in June 2011; the reason given was that DVB-H did not develop into a global standard for mobile TV broadcasting, which led to a shortage of devices supporting the technology. However, some countries are still seeing the launch of mobile TV broadcasting services. In Africa, broadcaster MultiChoice continued to extend its DVB-H mobile service DStv into new countries, launching in South Africa during November 2010. The service costs 36 rand per month, though the broadcaster sees the service as only delivering an ROI in the long term. Notably, these services were launched alongside mobile TV streaming services. In France, Virgin Mobile and broadcast infrastructure and service provider TDF have planned to launch a mobile TV broadcasting service, though the project has been delayed beyond the second half of 2011 because of a lack of consensus on which technology to adopt. In the U.S., broadcasters are keen to launch mobile TV broadcasting services, and as many as 12 broadcasters and television content owners there have created the joint venture Open Mobile Video Coalition (OMVC). Members have pooled their broadcasting spectrum and are planning to launch a mobile TV service based on the Advanced Television Systems Committee-Mobile/Handheld (ATSC-M/H) standard by the end of 2011. The service will be made available on a variety of consumer products, such as in-vehicle displays and netbooks. Mobile TV broadcasting was put to good use during Japan's recent calamity, when the national broadcaster NHK gave 24/7 coverage of the disaster across multiple screens, including One-Seg, the country's free mobile digital TV broadcast system. Japan's Integrated Services Digital Broadcasting-Terrestrial (ISDB-T)-based mobile TV was able to transmit emergency warnings, which helped save lives.


As in the previous year, analog TV receivers continued to see strong demand from emerging markets such as Brazil, Peru, Argentina, Russia, Nigeria, Thailand, Egypt and China. Ad-supported free-to-air service continues to be more popular than monthly subscriptions. Gartner expects mobile TV broadcasting services to be restricted to specific markets. Recent demand for analog TV receivers is indicative of some latent demand in emerging markets. However, on smartphones this technology will continue to be upstaged by mobile apps for mobile TV streaming and time-shifting, and by Web-based video services. It does have opportunities on other device form factors, such as media tablets and in-car systems, but price and availability of channels will be the key success criteria.

User Advice:

Device vendors should explore alternative portable form factors for delivering mobile TV broadcasting, with the objective of improving the in-home and on-the-go TV viewing experience. They should build in technologies such as 3D and high-definition (HD) optimization, as well as gesture controls that let users turn on the device or flick content to another device.

CSPs and device vendors should concentrate their efforts on specific geographic pockets, and target the segments where opportunities for mobile TV broadcasting are present. They should also review their current business models and consider a freemium model, keeping basic content free and charging for premium content (for example, upcoming 3D or HD-optimized content).

Content providers should explore delivering content related to major events beyond sports. Mobile TV broadcasting will be more suitable for live events that have significance for the target audience.

Government agencies should ensure that the emergency warning system attached to mobile TV broadcasting systems is available for use when required.

Business Impact: Mobile TV broadcasting will affect all areas of video production, rights management, syndication and advertising.
Benefit Rating: Low
Market Penetration: 5% to 20% of target audience
Maturity: Mature mainstream
Sample Vendors: DiBcom; LG; Samsung; Telegent Systems; Texas Instruments
Recommended Reading: "Forecast: Mobile Application Stores, Worldwide, 2008-2015"

Residential VoIP
Analysis By: Deborah Kish
Definition: Residential voice over Internet Protocol (VoIP) is a telephone service delivered via broadband cable, DSL or fiber-to-the-x connections using specialized end-user customer premises


equipment (CPE), primarily a telephone adapter that is integrated with, or attached to, a broadband modem and associated headend/central-office broadband access platforms. These platforms include either Internet Protocol (IP) DSL access multiplexers for telcos or cable modem termination systems for cable operators, along with softswitches and other associated call and customer management servers and software. Residential VoIP can be a managed service that provides quality of service (QoS), or a best-effort service, such as Vonage and magicJack, which plugs into an existing broadband connection or PC but does not provide a managed service with QoS. Service providers can provide these services themselves over their own networks, or they can offer VoIP that is hosted and managed by a third party. We do not define PC-to-PC-based calling services, such as Skype, which are best-effort "over the Internet" services, as residential VoIP services.

Position and Adoption Speed Justification: Residential VoIP has been available for almost a decade and is becoming more widely adopted. It is, therefore, moving much closer to the Plateau of Productivity. Mobile substitution has been ongoing for several years, so wired phone lines have become less relevant for end users. Voice traffic has substantially shifted from fixed connections to mobile connections, or to peer-to-peer VoIP calling such as Skype and Google. Consumers, however, will continue to keep their voice lines. In most cases, a voice service is included in low-cost and bundled service plans: as a double play with broadband, as a triple play when video is added, or as a quad play when mobile service is added.

User Advice: Communications service providers have been challenged with finding ways to differentiate themselves from competing providers, including "over the top" competitors such as Google and Skype, with services beyond voice.
They need to add value to the VoIP platform by offering cross-platform integration with data and multimedia services, as well as integrating fixed and wireless platforms that can be used with any device. At a time when consumers are looking for ways to cut spending, service providers need to find ways to encourage subscribers to add services and pay for them at lower additional costs. An example would be software pricing models in which consumers can download additional features if they pay for the license. Look for variations of CPE, including gateway devices that incorporate combinations of broadband modems, voice-signaling adaptors, multiport routers, and Wi-Fi access points or femtocells.

Business Impact: Residential VoIP is delivering on the promise of new service and revenue opportunities. Its bundling with broadband is having a positive effect on the uptake of services and, in many cases, of digital video services as well. In an IP Multimedia Subsystem environment, it also serves as an additional platform for integration with broadband data and video services, as integrated portals are emerging to manage all services from a single location. Caller ID on televisions, voice mail on PCs and televisions, voice-to-text or voice-to-email services, single voice mailboxes for mobile and fixed lines, and the ability to view call logs are becoming common, offering greater simplicity and convenience to consumers, and increasing productivity and the perceived value of the bundle. As technology platforms become more converged, the portability and mobility of applications will increase. Residential VoIP can be used with traditional "black phones" or dedicated IP phones; however, the kind of terminal used is a matter of end-user preference. For residential users, we do not expect an uptake of IP phones but of base station phones, as the device needs to be plugged directly into the modem or router, limiting the location of a single terminal.


Benefit Rating: High
Market Penetration: 20% to 50% of target audience
Maturity: Mature mainstream
Sample Vendors: Arris; AT&T; BT; Cablecom; Cablemas; Cablevision; Cedar Point Communications; Charter Communications; Cisco; Comcast; France Telecom; Genband; Jupiter Telecommunications; Megacable Comunicaciones; Motorola; Nokia Siemens Networks; Rogers Communications; Shaw Communications; Telenor; TeleNet; TeliaSonera; Time Warner Cable; UPC; Verizon Communications; Videotron; Vonage

Appendixes


Figure 3. Hype Cycle for Communications Service Provider Infrastructure, 2010

[Figure: the prior-year Hype Cycle chart, plotting expectations against time across the Technology Trigger, Peak of Inflated Expectations, Trough of Disillusionment, Slope of Enlightenment and Plateau of Productivity phases, with each technology marked by its years to mainstream adoption (less than 2 years, 2 to 5 years, 5 to 10 years, more than 10 years, obsolete before plateau). As of July 2010.]

Source: Gartner (July 2010)


Hype Cycle Phases, Benefit Ratings and Maturity Levels

Table 1. Hype Cycle Phases

Technology Trigger: A breakthrough, public demonstration, product launch or other event generates significant press and industry interest.

Peak of Inflated Expectations: During this phase of overenthusiasm and unrealistic projections, a flurry of well-publicized activity by technology leaders results in some successes, but more failures, as the technology is pushed to its limits. The only enterprises making money are conference organizers and magazine publishers.

Trough of Disillusionment: Because the technology does not live up to its overinflated expectations, it rapidly becomes unfashionable. Media interest wanes, except for a few cautionary tales.

Slope of Enlightenment: Focused experimentation and solid hard work by an increasingly diverse range of organizations lead to a true understanding of the technology's applicability, risks and benefits. Commercial off-the-shelf methodologies and tools ease the development process.

Plateau of Productivity: The real-world benefits of the technology are demonstrated and accepted. Tools and methodologies are increasingly stable as they enter their second and third generations. Growing numbers of organizations feel comfortable with the reduced level of risk; the rapid growth phase of adoption begins. Approximately 20% of the technology's target audience has adopted or is adopting the technology as it enters this phase.

Years to Mainstream Adoption: The time required for the technology to reach the Plateau of Productivity.

Source: Gartner (July 2011)

Table 2. Benefit Ratings

Transformational: Enables new ways of doing business across industries that will result in major shifts in industry dynamics.

High: Enables new ways of performing horizontal or vertical processes that will result in significantly increased revenue or cost savings for an enterprise.

Moderate: Provides incremental improvements to established processes that will result in increased revenue or cost savings for an enterprise.

Low: Slightly improves processes (for example, improved user experience) that will be difficult to translate into increased revenue or cost savings.

Source: Gartner (July 2011)


Table 3. Maturity Levels

Embryonic. Status: In labs. Products/Vendors: None.

Emerging. Status: Commercialization by vendors; pilots and deployments by industry leaders. Products/Vendors: First generation; high price; much customization.

Adolescent. Status: Maturing technology capabilities and process understanding; uptake beyond early adopters. Products/Vendors: Second generation; less customization.

Early mainstream. Status: Proven technology; vendors, technology and adoption rapidly evolving. Products/Vendors: Third generation; more out of box; methodologies.

Mature mainstream. Status: Robust technology; not much evolution in vendors or technology. Products/Vendors: Several dominant vendors.

Legacy. Status: Not appropriate for new developments; cost of migration constrains replacement. Products/Vendors: Maintenance revenue focus.

Obsolete. Status: Rarely used. Products/Vendors: Used/resale market only.

Source: Gartner (July 2011)
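For readers who track many technology profiles programmatically, the taxonomy in Tables 1 through 3 maps naturally onto a small data model. The sketch below is our own illustration, not a Gartner artifact; it encodes the benefit ratings and maturity levels as enumerations and uses the Residential VoIP profile from this Hype Cycle as an example record:

```python
from dataclasses import dataclass, field
from enum import Enum

class BenefitRating(Enum):
    TRANSFORMATIONAL = "Transformational"
    HIGH = "High"
    MODERATE = "Moderate"
    LOW = "Low"

class Maturity(Enum):
    EMBRYONIC = "Embryonic"
    EMERGING = "Emerging"
    ADOLESCENT = "Adolescent"
    EARLY_MAINSTREAM = "Early mainstream"
    MATURE_MAINSTREAM = "Mature mainstream"
    LEGACY = "Legacy"
    OBSOLETE = "Obsolete"

@dataclass
class TechnologyProfile:
    name: str
    benefit: BenefitRating
    penetration: str                 # e.g. "20% to 50% of target audience"
    maturity: Maturity
    sample_vendors: list = field(default_factory=list)

# Example record taken from the Residential VoIP profile above
residential_voip = TechnologyProfile(
    name="Residential VoIP",
    benefit=BenefitRating.HIGH,
    penetration="20% to 50% of target audience",
    maturity=Maturity.MATURE_MAINSTREAM,
    sample_vendors=["Arris", "AT&T", "Cisco", "Vonage"],
)
print(residential_voip.maturity.value)  # Mature mainstream
```

Keeping the ratings as closed enumerations makes it easy to validate a portfolio spreadsheet against the definitions in the tables above.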

Recommended Reading

Some documents may not be available as part of your current Gartner subscription.

"Understanding Gartner's Hype Cycles, 2011"

"Forecast: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"

"Forecast Analysis: Carrier Network Infrastructure, Worldwide, 2008-2015, 2Q11 Update"

"Forecast: Telecom Operations Management Systems (BSS, OSS and SDP), Worldwide, 2007-2015, 2Q11 Update"

"Forecast Analysis: Telecom Operations Management Systems (BSS, OSS and SDP), Worldwide, 2007-2015, 2Q11 Update"


This research is part of a set of related research pieces. See Gartner's Hype Cycle Special Report for 2011 for an overview.


Regional Headquarters

Corporate Headquarters
56 Top Gallant Road
Stamford, CT 06902-7700
USA
+1 203 964 0096

European Headquarters
Tamesis
The Glanty
Egham
Surrey, TW20 9AW
UNITED KINGDOM
+44 1784 431611

Japan Headquarters
Gartner Japan Ltd.
Aobadai Hills, 6F
7-7, Aobadai, 4-chome
Meguro-ku, Tokyo 153-0042
JAPAN
+81 3 3481 3670

Asia/Pacific Headquarters
Gartner Australasia Pty. Ltd.
Level 9, 141 Walker Street
North Sydney
New South Wales 2060
AUSTRALIA
+61 2 9459 4600

Latin America Headquarters
Gartner do Brazil
Av. das Nações Unidas, 12551
9º andar, World Trade Center
04578-903, São Paulo, SP
BRAZIL
+55 11 3443 1509

© 2011 Gartner, Inc. and/or its affiliates. All rights reserved. Gartner is a registered trademark of Gartner, Inc. or its affiliates. This publication may not be reproduced or distributed in any form without Gartner's prior written permission. The information contained in this publication has been obtained from sources believed to be reliable. Gartner disclaims all warranties as to the accuracy, completeness or adequacy of such information and shall have no liability for errors, omissions or inadequacies in such information. This publication consists of the opinions of Gartner's research organization and should not be construed as statements of fact. The opinions expressed herein are subject to change without notice. Although Gartner research may include a discussion of related legal issues, Gartner does not provide legal advice or services and its research should not be construed or used as such. Gartner is a public company, and its shareholders may include firms and funds that have financial interests in entities covered in Gartner research. Gartner's Board of Directors may include senior managers of these firms or funds. Gartner research is produced independently by its research organization without input or influence from these firms, funds or their managers. For further information on the independence and integrity of Gartner research, see "Guiding Principles on Independence and Objectivity" on its website, http://www.gartner.com/technology/about/ombudsman/omb_guide2.jsp.
