
FEBRUARY 2015 VOL. 13 ISS. 02
CYBERTREND.COM

WHY LOOK TO HYBRID CLOUD?
IMPROVE EFFICIENCY WITH FEWER RISKS


Volume 13 : Issue 2 : February 2015

THIS IS THE YEAR OF HYBRID CLOUD COMPUTING

8 COVER STORY
IT outsourcing and its underlying complexities

12 BUSINESS
the evolution of EMC from storage to cloud services

22 CLOUD
security issues related to cloud computing and how organizations deal with them

26 MOBILITY
a survey of the most popular approaches to enterprise mobility

30 DATA
the importance of real-time data in today's business analytics solutions

CONTACT US
P.O. Box 82545
Lincoln, NE 68501
or
120 W. Harvest Drive
Lincoln, NE 68521

TECHNOLOGIES TO WATCH IN 2015

34 ENERGY
the latest news and research into energy-conscious tech

36 IT
how to get more value out of your current and outdated IT hardware

38 NETWORKING
how to add a guest Wi-Fi hotspot to your company's wireless network

40 SECURITY
learn about security and virtualization, disaster recovery planning, how identity theft works, and unified threat management

52 STORAGE
enterprise backup strategies and network-attached storage solutions for small and midsize businesses

58 ELECTRONICS
the latest in premium consumer electronics

60 TIPS
smartphone, social media, and business travel tips

Advertising: (800) 247-4880
Fax: (402) 479-2104
Circulation: (800) 334-7458
Fax: (402) 479-2123
www.cybertrend.com
email: feedback@cybertrend.com

Copyright © 2015 by Sandhills Publishing Company. CyberTrend™ is a trademark of Sandhills Publishing Company. All rights reserved. Reproduction of material appearing in CyberTrend™ is strictly prohibited without written permission.

Gartner: IT Spending On Pace For Growth In 2015

Gartner expects a 2.4% increase in worldwide IT spending in 2015 compared to 2014, according to the research firm's updated Worldwide IT Spending Forecast. Gartner had previously forecast 3.9% growth; despite this adjustment, Gartner says IT spending is on pace for reasonable growth in 2015. Gartner anticipates the most significant growth areas will be in enterprise software (5.5% in 2015 over 2014) and all types of devices (5.1% growth). According to Gartner's announcement, the firm expects "[more] price erosion and vendor consolidation" in the $335 billion 2015 enterprise software market because of fierce competition between cloud and on-premises software providers. Gartner says total IT spending should reach $3.8 trillion in 2015, compared with $3.7 trillion last year.

Ready Or Not, The Internet Of Things Has Arrived

The IoT (Internet of Things), which involves the interaction of sensors, the Internet, and applications, will, in the near term, impact utilities, manufacturing, and the government, in that order, according to Gartner. However, the research firm adds, IoT will impact all of us, including businesses, in a big way. Gartner forecasts that there will be 4.9 billion connected things in 2015, a 30% increase over last year. Within a few years, Gartner adds, there may even be a standard expectation among buyers that all manner of things include intelligence and connectivity features. The accompanying chart shows the number (in millions) of connected things Gartner expects will exist, organized by category.
CATEGORY                2013       2014       2015       2020
Automotive                96      189.6      372.3    3,511.1
Consumer             1,842.1    2,244.5    2,874.9   13,172.5
Generic Business       395.2      479.4      623.9    5,158.6
Vertical Business      698.7      836.5    1,009.4    3,164.4
Grand Total            3,032      3,750    4,880.6   25,006.6

1&1 Introduces New Business Line, Powered By Dell Servers

In its new business line of dedicated servers geared toward small and midsize businesses, Web hosting firm 1&1 has incorporated Dell servers that use Intel's newest Haswell processors. "The combination of the new Dell servers and the all-new Intel Xeon generation [processors] is capable of delivering performance improvements of up to 30% over the previous generation," says Robert Hoffmann, CEO of 1&1 Internet. The company's dedicated servers, which include upgraded DDR4 memory and optional SSD storage, are designed to provide the levels of high performance and security businesses require.

Thin-Client Market Takes A Dip, But Growth Is Expected

The most recent report from IDC covering thin and terminal client shipments shows a drop of 1.8% in 2014 Q3 compared with 2013 Q3. This was unexpected, as IDC had originally anticipated 5% growth for the quarter. IDC points to the winding down of Windows XP migration projects, which had been a contributing factor as some users switched from PCs to thin client-based computing. When the results are in for the 2014 calendar year, IDC expects to see a modest 1.2% growth compared with 2013. IDC expects the zero client portion of the market (terminals without OSes) to resume volume growth this year.

2014 Good Year For Semiconductor Revenue, DRAM

Worldwide semiconductor revenue reached about $340 billion last year, up 7.9% from 2013, according to preliminary results from Gartner. The memory market saw the best growth among all device categories, up 16.9%, Gartner reports. "DRAM vendors outperformed the rest of the semiconductor industry, with a 31.7% growth in revenue as undersupply and stable pricing continued," says Andrew Norwood, research vice president at Gartner. Norwood says DRAM revenues of $46 billion last year marked an all-time high, surpassing the previous record set back in 1995.


Tablet Sales Continue To Slow, But PC Sales Are Slower

Shipments are increasing for all types of computers, according to Gartner, but at a slower pace than in past years. After the PC market began its sharpest decline, the tablet market boomed in 2013, but it has since tapered off dramatically. As Ranjit Atwal, research director for Gartner, observes, "The collapse of the tablet market in 2014 was alarming. In the last two years global sales of tablets were growing in double digits." Atwal cites the extended life cycle of tablets, software upgrades, and lack of innovation in new tablets among the forces behind the tablet market's decline. PC sales, meanwhile, including both desktop and laptop computers, continue to grow slowly, but Gartner anticipates a better year for this category. "From 2015," says Atwal, "we expect Windows to grow faster than iOS, as the PC market stabilizes and the challenge for the next iPhone to find significant growth becomes greater, narrowing the gap between the two operating systems." The accompanying chart shows Gartner's current and forecast figures for shipments, in millions.
CATEGORY                                   2014     2015     2016
Traditional Desktop & Laptop Computers      279      259      248
Premium Ultramobile PCs                      39       62       85
PC Market Total                             318      321      333
Tablets                                     216      233      259
Mobile Phones                             1,838    1,906    1,969
Other Mobile Computers                        6        9       11
Mobile Market Total                       2,060    2,148    2,239
Overall Computer Market                   2,378    2,469    2,572

The Year Ahead For Mobile Enterprise Applications

Organizations are increasingly optimizing their enterprise applications to meet the demands of mobile users, according to research firm IDC. Among IDC's 2015 predictions for mobile enterprise applications, the firm expects that more than 50% of large organizations will use EMM (enterprise mobility management) solutions to secure their apps and data, and that IT departments will allocate at least 25% of their software budget, on average, to mobile app capabilities. John Jackson, IDC program vice president for mobility research, says, "Organizations that execute effectively will be positioned to enable innovation across all facets of their business."

Enterprise & Government Push Wireless LAN Sales Higher

With organizations investing more in employee mobility, it follows that investments in wireless networking technology would also rise. In its most recent quarterly report on the WLAN (wireless LAN) market, Infonetics Research says demand for Wi-Fi increased 14% in 2014 Q3 compared to 2013 Q3. Use of the new 802.11ac standard is increasing especially quickly, with 10 times as many 802.11ac access points (about 1 million units altogether) shipping in 2014 Q3 compared to the year-ago quarter. According to Infonetics, this upward trend will continue, thanks in large part to enterprise and government customers.

Big Data Requires Fast Access, Experimentation

When it comes to big data, organizations that are setting themselves apart as leaders are changing how they measure operations, customer interactions, and resource allocations, says Dan Vesset, IDC program vice president for business analytics and big data research. As part of that, they're working to get faster access to more relevant data and undergoing constant experimentation, he says. IDC predicts that by 2018 enterprises will be required to invest in visual data discovery tools and that, within five years, hybrid on/off-premises big data and analytics solutions will be required.


Wireless Personal Area Networking Market Expanding

The market for devices using the WPAN (wireless personal area networking) standard technically known as 802.15.4 will grow to five times its current size within five years, according to ABI Research. With 425 million devices as of late 2014 and 2.1 billion predicted by the end of 2019, as ABI forecasts, the world of mobile networking will start to look markedly different. Andrew Zignani, research analyst at ABI, says, "The low-power, low-cost, mesh networking capabilities of 802.15.4 networks make them the primary wireless technology choice across a number of different vertical markets including home, building and industrial automation, smart metering, and home entertainment." ABI adds that ZigBee, a similar wireless standard, will face stiff competition from 802.15.4.

Trying To Reach Travelers? Don't Neglect Mobile Devices

According to research from ShareThis, travelers tend to be social about their travel plans, and travel recommendations posted on social media often translate into clicks for businesses. Between June and September 2014, ShareThis tracked 52 million people sharing travel-related information on social media sites, which amounted to roughly 1.9 billion social signals, meaning posts, likes, retweets, etc. Furthermore, ShareThis examined a subset of users, about 3.2 million people who searched for or viewed travel planning content online during previous weeks. Overall, ShareThis found that while 10% of users shared travel content via social networks, 19% of those who were planning a trip shared travel content. ShareThis also pointed out that users are more likely to seek out information on desktop browsers but share it via mobile devices, and on average each share generated 18 click-backs, or 40% more than other types of social media shares. The accompanying chart shows the breakdown of researching vs. sharing.
                       Tablet   Smartphone   Desktop
Travel Info Sharing     10%        41%         49%
Travel Research         13%        26%         60%

Greater Affordability Bolsters Emerging Market Tablet Sales

In emerging markets around the world, the average selling price of tablet devices declined 7.8% from 2013 to 2014, according to ABI Research. When combined with increased earnings among buyers, says ABI, tablet computers became 30% more affordable over the same time period. ABI points out that the most dramatic change occurred in Indonesia, where it took approximately four weeks of income on average to purchase a tablet in 2013, a time period that dropped to 1.5 weeks in 2014. ABI Research expects this trend to continue throughout this year.

Analysts Offer Mobile Payment Predictions

There were 72 billion mobile commerce transactions worldwide in 2014, according to Juniper Research, and the firm forecasts that this figure will reach 195 billion by 2019. In terms of dollars, Forrester Research puts the 2014 total at $52 billion, with $142 billion expected by 2019. Analysts with both research firms agree that the fastest growth will occur in point-of-sale and other in-person transactions, but that the largest increase in transaction volume will take place in remote payments, including such things as in-app purchases. Juniper also sees pent-up mobile commerce demand in ticketing services.

PC Shipments Show Modest Year-Over-Year Growth

One percent growth may seem like a small increase to get excited about, but for a PC market rattled by the economic downturn and a boom in tablet sales, it's a significant figure. According to Gartner, the research firm that identified the 1% PC market growth in 2014 Q4 over 2013 Q4, this indicates a slow return to growth after two-plus years of decline. "The PC market is quietly stabilizing after the installed base reduction driven by users diversifying their device portfolios," says Mikako Kitagawa, principal analyst at Gartner. With tablet sales having peaked in some markets, Kitagawa says, consumer spending is slowly shifting back to PCs.


STARTUPS

Find Out More About Your Users Before They Leave

If your company offers Web-based services, you may wonder if there's a better way to understand how your customers are interacting with those services and why some of them leave. Framed Data, a San Francisco-based startup, developed a prediction engine to monitor usage patterns and identify those users who might be thinking of leaving. Framed Data recently raised $2 million in seed funding from Google Ventures, Innovation Works, Jotter, NYU Innovation Venture Fund, and numerous angel investors. The company's next phase will be to move beyond providing predictive data to offer specific recommendations, or prescriptive information.

Agricultural Tech Startup Seeks To Help Farmers Better Understand Their Crops

New and innovative analytics solutions are becoming available for most industries, including agriculture. FarmLogs, a startup based in Ann Arbor, Mich., offers a cloud-based service that makes detailed farmland information available to users via multiple device types. Users can view yield status, precipitation statistics, soil maps, inventory, and crop performance data at their computer or on the go. The platform also organizes information relevant to crop planning and budgeting and presents the data in easy-to-use dashboards. FarmLogs recently received $10 million in Series B funding from investors including Drive Capital, Huron River Ventures, Hyde Park Venture Partners, SV Angel, and Sam Altman, bringing total investments in FarmLogs to $15 million. The company also claims customers use its software to manage more than $12 billion worth of crops. "We will continue to add great people to our team of engineers, data scientists, and designers," said Jesse Vollmar, FarmLogs CEO and co-founder, in a press release. "Having additional capital behind us accelerates our ability to bring the best science and technology to every farm through intuitive software."

Startup Recommends Examining Actions, Not Page Views

San Francisco-based startup Mixpanel says it analyzes 34 billion actions in its 2,800-plus customers' mobile apps and websites each month. These actions, says Mixpanel, include far more than the simple page view and installation data that competing solutions track. It's the deeper data, such as search results and user types (say, enterprise vs. consumer), that Mixpanel offers in various panel views, so that businesses get more meaningful insights into their customers' app and website usage. Mixpanel recently received $65 million in new funding, which it will use to improve its platform.

Company Offering Private Cloud As A Service Raises $4M More

In May 2014, Blue Box announced the general availability of its Blue Box Cloud service, a PCaaS (private cloud as a service) solution, including dedicated firewalls and virtual networking, for businesses of all sizes. Now the Seattle-based company has announced the completion of a Series B financing round totaling $14 million, thanks to a new $4 million investment. Blue Box will use the financing to hire more engineering, marketing, sales, and business development personnel to further what CEO Matthew Schiltz calls its "consistent, reliable, efficient, and agile private cloud infrastructure . . ."

Seismic Wants To Make It Easier For You To Close More Deals

Are your marketing materials up-to-date? Do they contain, for example, the latest logos and branding, company statistics, sales forecasts, and pricing? San Diego-based startup Seismic offers a platform designed to keep your sales deck current at all times, drawing the latest information from CRM, Excel, SharePoint, and other sources, as well as updated content from the marketing team, and updating electronic marketing documents in real time. Seismic recently raised $20 million in Series B funding led by JMI Equity. CEO Doug Winter says the financing will go toward continued innovation and growth.


Hybrid Cloud Computing

ESTABLISHING A DEFINITION & DISCOVERING THE TRUE BENEFITS OF A HYBRID APPROACH

KEY POINTS

Hybrid cloud computing has varying definitions, but the goal is to give your business more agility and scalability in terms of performance and access to resources.

Hybrid cloud environments let you keep your sensitive data and workloads onsite while moving other workloads to the public cloud for less expensive storage and equal or better performance.

IT needs to establish a centralized process for managing and monitoring the hybrid environment.

Cost savings are possible, but you need to consider the other major benefits of the hybrid cloud first when making a decision.

JUST AS COMPANIES are starting to get a better understanding of public and private cloud environments, the concept of hybrid cloud computing is growing in popularity and changing the way companies view the cloud in general. The hybrid cloud aims to blur the line between public and private as well as on-premises and off-premises, but it can be difficult to determine what that actually means and how adopting hybrid cloud computing will impact your business. It helps to revisit the technology at its core, try to understand it at a foundational level, and then discover the many potential use cases and benefits of implementing a hybrid cloud computing strategy.

Hybrid Means Different Things To Different People

Given how popular cloud computing is today and how many companies are using the hybrid variation of the technology, one would think there would be a well-established definition that clears up any potential confusion around how hybrid can benefit businesses. But the fact of the matter is that as more and more vendors and providers enter the market, the definition of hybrid cloud computing only gets . . . cloudier.

Amy DeCarlo, principal analyst at Current Analysis, for example, says that her company follows the NIST (National Institute of Standards and Technology) definition of hybrid cloud, which is "a composition of two or more distinct cloud infrastructures that remain unique entities, but are bound together by standardized or proprietary technology that enables data and application portability." In other words, if there are two different, separate cloud environments that allow you to migrate data from one to the other or access resources using certain tools and solutions, then it's probably a hybrid cloud arrangement.

Ed Anderson, research vice president at Gartner, agrees with DeCarlo on this definition, and adds that the explanation of hybrid as an on-premises private cloud combined with an off-premises public cloud is actually a limited definition, and only a subset of a broader, cross-vendor hybrid cloud definition that is more aligned with the way most organizations are constructing hybrid clouds. In addition to two separate clouds that can work together, Anderson considers a situation where you "take two or more cloud service offerings and combine them together to function as if they were a single cloud service" to be hybrid cloud computing as well.

But these definitions of hybrid cloud, in which at least two separate cloud environments must be involved, are not the only ones out there. In fact, Dave Bartoletti, principal analyst at Forrester Research, says that Forrester's definition of hybrid cloud is very direct and involves any situation where a public cloud service is connected to any other type of system or service. This definition opens itself up to quite a few other potential hybrid scenarios that aren't covered in the "two or more cloud environments" alternative.

"To us, it means any public cloud service connected to anything else," says Bartoletti. "That could be another public cloud service, a private cloud service, or it could be connected back to a data center application or database that is not in a cloud at all. We take that approach because what we mean by 'hybrid cloud' is not descriptive enough. What's important for people to talk about is: what is the hybrid application use case you're trying to solve? Hybrid could be private and public, it could mean multiple public clouds, and it could also mean a public cloud resource connected to something in your own data center."
Bartoletti digs a bit deeper into why there are so many different definitions for the hybrid cloud by explaining that it comes from looking at it from a consumer standpoint, a vendor standpoint, and a service provider standpoint. For example, when an app and software developer looks at the hybrid cloud, they see it as an opportunity to gain access to resources they wouldn't normally have in a data center-only environment. Bartoletti says a developer might not be able to put certain data in a public cloud, but in a hybrid arrangement, they could at least put some resources in the cloud and still get that level of elasticity without jeopardizing the other, more sensitive aspects of the project.

"Most of these implementations are too new to really derive good numbers yet, but, anecdotally, I think, today the movement to the cloud is as much about agility as cost. Businesses expect to save on capital expenses and maintenance in managed cloud environments but they also may incur some steep design and migration costs. There are also concerns about commoditized IaaS [infrastructure as a service] pricing, which is dirt cheap today, skyrocketing in the future, [with] customers getting caught at a terrible disadvantage."
AMY DECARLO
Principal Analyst : Current Analysis
When vendors, particularly those who offer software or cloud management platforms, look at the cloud, they see hybrid cloud as a way to manage resources across multiple locations from one location, Bartoletti says. For instance, you could be gathering monitoring data from one environment and performance data from another, but have one central platform or console to see all of that information. Their view of hybrid would be to bring it all together into a single location, even though the service could live in a bunch of different clouds.

And finally, when a cloud provider looks at the hybrid cloud, they often see it as a way to offer multiple flavors of cloud. Instead of simply offering a pure public cloud, a hosted private cloud, or managed private cloud services, they can also bring in the ability to combine multiple clouds. But regardless of which of these definitions you follow, the key to the technology is the same: it's a way to combine multiple environments, to increase flexibility, scalability, and elasticity, and to gain access to resources as you need them.

Unique Problem-Solving Benefits Of Hybrid

When it comes to the benefits of implementing a hybrid cloud strategy, first and foremost is the ability to run related workloads in two separate operating environments without a major performance disadvantage. Hybrid also allows companies to take advantage of the public cloud without jeopardizing mission-critical data. "Hybrid cloud computing allows some sensitive application workloads to run in a more secure and protected environment while making it possible for the business to run other applications in lower-cost multi-tenant cloud infrastructures," DeCarlo says.

Another unique benefit of the hybrid cloud is that because companies can put different data and apps in different locations, they can pair application and workload scenarios with the cloud environment that best addresses their needs, Anderson says. This gives companies a best-of-breed approach that ensures the highest possible level of performance, regardless of where the workload is hosted.

Anderson does warn, however, that hybrid models can introduce complexity, but that when good hybrid cloud tools are used and when organizations plan for and implement a hybrid cloud framework, this complexity can be addressed.


Bartoletti agrees that perhaps the best benefit of using the hybrid cloud is that ability to match workloads with different environments, but he offers a specific example of how this might work. "Let's say I'm building a mobile application and I want the elasticity and the scalability of the cloud," he says. "[The cloud is] a great place to host a mobile application back end, but the mobile application might require data from a customer database that I have inside my data center. I could extract that data, directly connect the mobile service to that database with an API [application programming interface] to get to the data, or I might extract some of that data and put it in another cloud where I'm doing big data processing." In other words, with a hybrid approach, the application you build could be developed and hosted in the cloud but still have access to all of the information it needs from your on-premises data center infrastructure.
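To make that scenario concrete, here is a minimal Python sketch of one way such a split might look: a back end hosted in a public cloud serves mobile clients but fetches customer records through an API that fronts the on-premises database. The endpoint, environment variable names, token, and field names are hypothetical illustrations, not anything Bartoletti or Forrester prescribes.

# Sketch: cloud-hosted mobile back end pulling data, on demand, from an
# API that fronts a database kept inside the corporate data center.
# All names and URLs below are invented for illustration.

import json
import os
import urllib.request

# Address of the gateway the data center exposes (for example, reachable
# over a VPN or a dedicated link between the cloud network and on-prem).
ONPREM_API = os.environ.get("ONPREM_API_URL", "https://gateway.example.internal")
API_TOKEN = os.environ.get("ONPREM_API_TOKEN", "")

def fetch_customer(customer_id: str) -> dict:
    """Fetch one customer record from the on-premises database API."""
    req = urllib.request.Request(
        f"{ONPREM_API}/customers/{customer_id}",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.load(resp)

def handle_mobile_request(customer_id: str) -> dict:
    """Cloud-hosted handler: elastic compute here, sensitive data there."""
    customer = fetch_customer(customer_id)
    # Only non-sensitive, display-ready fields leave the back end.
    return {"name": customer.get("name"), "tier": customer.get("tier")}

The design point is that the sensitive database never moves; only a narrow, authenticated API crosses the boundary between the two environments.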
Bartoletti says that the hybrid cloud lets development and application teams get to market faster with new features, but it also means they have to think very differently about how they architect and design applications. You can't take a traditional application designed for physical infrastructure and simply migrate parts of it over to the cloud. You have to redesign that application from the ground up so that it can be naturally split into different components that can be hosted in different types of environments.

"Hybrid clouds can be implemented by either IT organizations or by providers. When a third party implements a hybrid cloud and then delivers that hybrid cloud experience to their client, we generally refer to that as cloud services brokerage. Whether an organization decides to implement a hybrid cloud environment, or use a third party to do it for them, is really a function of what role they want to play and what responsibilities they want to retain. We see both."
ED ANDERSON
Research Vice President : Gartner

"In a typical three-tier application, you have the data tier, the application server tier, and the Web front-end tier," says Bartoletti. "You want to make sure you have strong decoupling between those components so that you can easily move the Web tier out to a public cloud for elasticity, but keep everything else inside. If you have very strong coupling between those components and don't have well-defined API interfaces between them, then it's going to be hard to decouple those apps. The hybrid cloud really lets you build modern-style applications because you now have the freedom to chop them up into bits and run each piece on exactly the right cloud that gives you the right security, performance, and other profiles."
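A minimal sketch of the decoupling Bartoletti describes, assuming each tier is reached only through a configurable endpoint; the environment variable names and default addresses are invented for illustration, not a prescribed pattern.

# Sketch: tiers know their neighbors only by configurable endpoints,
# never by direct imports or shared hosts, so any tier can be rehosted
# (say, the Web tier in a public cloud) without touching the others.

import os
from dataclasses import dataclass

@dataclass
class TierEndpoints:
    app_tier: str   # where the Web tier finds the application servers
    data_tier: str  # where the application tier finds the data services

def load_endpoints() -> TierEndpoints:
    """Resolve tier locations from the environment at deploy time."""
    return TierEndpoints(
        app_tier=os.environ.get("APP_TIER_URL", "http://app.internal:8080"),
        data_tier=os.environ.get("DATA_TIER_URL", "http://db.internal:5432"),
    )

With interfaces this explicit, moving the Web tier out to a public cloud becomes a configuration change (pointing APP_TIER_URL across the hybrid link) rather than a rewrite.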

How Hybrid Impacts Your IT Team

In addition to impacting your application development teams, the hybrid cloud will also impact your IT team as a whole. Anderson stresses that the role of any IT department is to deliver the technology solutions needed to support business operations. And in the same way developers can put different components of their apps into different cloud environments, adopting a hybrid cloud approach gives IT departments the flexibility to seek out and incorporate the right cloud technologies for any given scenario, he says.

With this added flexibility and access to resources comes a potential role change for the IT team as they work to manage "the uneven handshake," Bartoletti says. As you move different workloads or, at a more granular level, different components of applications, your IT team needs to communicate with the cloud service provider or cloud software vendor to determine the lines of responsibility. This fundamentally changes the way IT has to look at its company-wide infrastructure. "The challenge for hybrid cloud from IT's standpoint is to start building a consolidated view of IT operations that spans public, private, and hybrid infrastructures," Bartoletti says.

To help with these monitoring and management challenges, Bartoletti recommends that companies develop a monitoring framework and then have all of their cloud vendors contribute to it. This means taking an independent view of operations, instead of focusing on one cloud's monitoring and management offerings. You have to develop a game plan and then find cloud providers that fit that mold. "IT teams should take a look at the current tools they use to manage their data center infrastructure and talk to those tool vendors about whether or not they support hybrid today," he says.
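One way to read that advice, sketched below under invented names: define a neutral health-reporting interface that IT owns, and have each environment contribute through an adapter, so the consolidated view belongs to the organization rather than to any one provider's console. The adapters here return canned numbers where real code would call each provider's monitoring API.

# Sketch: a vendor-neutral monitoring interface with one adapter per
# environment, feeding a single consolidated view of IT operations.
# Class names and metric fields are placeholders.

from abc import ABC, abstractmethod

class EnvironmentMonitor(ABC):
    """What every environment, public or private, must report."""

    @abstractmethod
    def health(self) -> dict:
        """Return {'env': name, 'cpu_pct': float, 'errors_per_min': float}."""

class PublicCloudMonitor(EnvironmentMonitor):
    def health(self) -> dict:
        # In practice this would call the cloud provider's monitoring API.
        return {"env": "public-cloud", "cpu_pct": 41.0, "errors_per_min": 0.2}

class OnPremMonitor(EnvironmentMonitor):
    def health(self) -> dict:
        # In practice this would query the data center's existing tools.
        return {"env": "on-prem", "cpu_pct": 67.0, "errors_per_min": 1.4}

def consolidated_view(monitors: list) -> list:
    """One view of IT operations spanning every environment."""
    return [m.health() for m in monitors]

print(consolidated_view([PublicCloudMonitor(), OnPremMonitor()]))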

Cost-Saving Potential & Other Benefits

Anderson says that the two main reasons most organizations adopt cloud computing are agility benefits and cost reduction. And while you can certainly see an increase in agility and flexibility with a move to the hybrid cloud, it's the cost savings that are more elusive, unless you put a strong foundation in place and use the technology in the right way. "Most organizations realize that cloud computing may not result in cost savings, but will produce other types of benefits, agility being the primary benefit," says Anderson. "Hybrid cloud environments can be part of the solution for recognizing cost savings, but can also produce the other benefits I mentioned above."

DeCarlo agrees that, anecdotally, the movement in the hybrid cloud space today is as much about agility as cost, but says that most of these implementations are too new to really derive good numbers in terms of concrete savings. She warns that businesses expect to save on capital expenses and maintenance in managed cloud environments, but they also may incur some steep design and migration costs. There is also a risk that the price of commoditized IaaS offerings, which DeCarlo says is "dirt cheap" today, could skyrocket in the future, so you have to be careful with how you implement hybrid cloud computing to get the most benefit out of it, cost-related or otherwise.
Ultimately, how much money you save with a move to hybrid cloud depends entirely on how you plan to use the technology as well as how you'll incorporate it into your overall business approach in the future. Bartoletti says that some companies see the hybrid cloud as a transition technology where they can move part of an application to the public cloud now, with the long-term goal of moving everything to the cloud later on. These companies have the potential to benefit more from the migration than those that only justify cloud usage based on cost savings.

In the case of the hybrid cloud, companies may not see as much savings as they would in a public cloud arrangement because it's still necessary to host some infrastructure onsite. "If you have an application that requires 100 virtual machines to run it and at first you can't move the data tier out of your data center, or decide you don't want to, and you only move 50 VMs [virtual machines] to the public cloud, you're probably going to save money paying for only what you use in the public cloud for those 50, as opposed to moving the whole thing to the public cloud right away," Bartoletti says.
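Back-of-the-envelope arithmetic makes the point; the rates below are invented purely for illustration, and real pricing varies widely by provider, region, and workload.

# Sketch of the 100-VM example with hypothetical rates: on-prem VMs
# carry a fixed monthly cost, cloud VMs are paid for only when used.

VM_COUNT = 100
ONPREM_COST_PER_VM_MONTH = 120.0   # assumed: hardware, power, and staff
                                   # amortized per VM per month
CLOUD_COST_PER_VM_HOUR = 0.10      # assumed pay-per-use rate
HOURS_USED_PER_MONTH = 400         # assumed: cloud VMs run only when needed

def monthly_cost(vms_moved: int) -> float:
    """Cost when `vms_moved` VMs go to the public cloud, the rest stay home."""
    onprem = (VM_COUNT - vms_moved) * ONPREM_COST_PER_VM_MONTH
    cloud = vms_moved * CLOUD_COST_PER_VM_HOUR * HOURS_USED_PER_MONTH
    return onprem + cloud

for moved in (0, 50, 100):
    print(f"{moved:3d} VMs in public cloud: ${monthly_cost(moved):,.2f}/month")

With these invented rates, moving 50 of the 100 VMs drops the monthly bill from $12,000 to $8,000, but change the utilization assumption and the gap can shrink or vanish, which is why the savings are described as elusive.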
Bartoletti stresses that determining whether or not to use a hybrid cloud model based solely on cost savings is too limiting and too general. It's true that companies will more than likely save money by moving certain workloads to the cloud, because the cloud platforms are so much more efficient than data centers are, but this is only one part of the puzzle. Instead of focusing on cost alone, you have to look at the benefits you are receiving from the cloud from a performance, agility, and flexibility standpoint. With a proper hybrid setup, you should be able to realize performance gains and cost savings at the same time, but you might also have a situation where you only break even. You have to weigh the potential cost savings against the benefits.

"The cloud is probably going to save you money for almost any workloads you move there because the cloud platforms are so much more efficient than data centers are. We like to encourage people that if you're justifying a move to hybrid cloud, just like public or private, justify it first on benefits to the business in terms of shorter release times, faster provisioning of infrastructure, and the cost savings for operations team members not setting up and installing servers anymore. You're going to get those raw cost savings, but that shouldn't be at the top of your business case."
DAVE BARTOLETTI
Principal Analyst : Forrester Research

It's important when you're building your business case for cloud, whether it's public, private, or hybrid, that you try to attach value to time to market, new feature releases, and gaining a competitive advantage. Don't just try to justify your cloud journey on infrastructure cost savings. Justify it first on benefits to the business in terms of shorter release times, faster provisioning of infrastructure, and the cost savings for operations team members not setting up and installing servers anymore. You're going to get those raw cost savings, but they shouldn't be at the top of your business case.

Hybrid Cloud Continues To Evolve & Grow In Popularity

The reason there are so many different definitions for hybrid cloud, and why there are more hybrid cloud vendors and providers than ever before, is that the technology itself is becoming more popular in the business world. Anderson says organizations are realizing that one single cloud environment is unlikely to meet all of their needs, and with this realization, those organizations have discovered that implementing a combination of cloud services, configured as a hybrid cloud, is most likely to meet their needs. In fact, Anderson says that according to Gartner survey data, nearly 75% of organizations believe they will eventually implement a hybrid solution.

In addition to this growth in popularity, the way companies are using the cloud is changing as well. "Many organizations began their cloud deployments with private cloud implementations that might from time to time require additional resources from an external third-party public cloud, such as during cyclical or seasonal periods with an extraordinary number of transactions," DeCarlo says. She adds that many businesses and government entities are also looking at the hybrid cloud as a way to allow different applications running in different clouds to tap into highly sensitive data [that] might be running in a private cloud environment.

As these use cases continue to grow in popularity and new ones are introduced, even more providers will enter the market and offer hybrid solutions for businesses. The key is to not lose sight of what hybrid cloud computing is supposed to be about. As long as you are finding ways to spread out workloads, gain access to new resources, and use the right cloud environments for the right tasks, then you are on the right track.


Technologies To Watch In 2015

NEW DEVELOPMENTS TO LOOK FOR IN THE YEAR AHEAD

ASK ANY BUSINESS owner, executive, professional, or entrepreneur the one ability they'd most like to possess if anything were possible, and at least one response would surely be the ability to look into the future. While this obviously isn't possible, we've attempted the next best thing by asking a group of analysts and researchers across numerous industries what technologies business owners, executives, and others should have on their radar in 2015 and why.

Analytics & Big Data

Analytics and big data were dominant business themes in 2014, and both promise tremendous potential. Dave Cullinane, Cloud Security Alliance board member, for example, says leveraging big data analysis can grant organizations incredible insights into their customers and markets. Conversely, there's potential to raise significant privacy issues as consumer buying data, for example, is aggregated and analyzed, he says.

Holger Mueller, Constellation Research vice president and principal analyst, hails big data as the foundation of next-generation enterprise software, in part because for the first time it allows enterprises to store all information, regardless of relevancy, in one structured and query-able way. This enables insight potential not previously available, as well as the ability to achieve better information business agility, Mueller says.

While gaining greater value from information and IT investments via big data and analytics is attractive for all companies, Charles King, Pund-IT principal analyst, says this is especially so for those under increasing pressure to lower expenses and raise bottom lines. Numerous vendors, he says, are aiming to deliver analytics solutions that virtually any employee can easily use, and 2015 could be the year products and services in this category make their marketplace debut.

Wesley McPherson, Info-Tech Research Group research manager, cites analytics in terms of security as something to watch. Specifically, security analytics is enabling organizations to use security data already available to better ID security incidents and understand their points of exploit. Accuracy and depth are among the primary improvements over existing security-focused models, he says. Overall, "security analytics is entering a frontier of machine learning and advanced recommendations," McPherson says.

Cloud Computing

The cloud isn't a new topic by any means, but it will remain extremely relevant in 2015. Mueller views the cloud as the only way to build next-generation applications, partly because on-premises infrastructure lacks the scale and technical capability, and isn't financially viable for even exploratory trials of next-generation applications.

From a business perspective, Cullinane says, leveraging cloud technologies and abilities will be essential to staying competitive with others leveraging cost savings from the cloud to gain market share. King, meanwhile, points to hybrid cloud developments as noteworthy. While still in its early stages, this market became increasingly competitive in 2014 due to offerings from major cloud service providers, he says. As a result, hybrid cloud is becoming increasingly complex but also a buyer's market that organizations would be foolish to ignore, King says.
In organizations where technology has become a constraint rather than an enabler due to overly long purchasing and provisioning cycles, Clive Longbottom, Quocirca founder, says owners and executives should look to the public cloud, which can decrease these cycles to days or hours vs. weeks or months. Similarly, Bob Tarzey, Quocirca analyst and director, says organizations that aren't considering the public cloud for new IT deployments should question their reluctance. "Quocirca research indicates a continuous shift in acceptance of public cloud as a viable option for many IT deployment types," Tarzey says. Furthermore, public cloud platforms are more secure and stable than the ones many organizations can produce in-house.

APPLICATIONS

Moving forward, Alan Boehme, Cloud Security Alliance board member, says the intersection of mobile, cloud, and video, along with predictive analytics, will impact enterprises in a major way. "The need to provide secure, composite-built applications delivered through real-time, API-driven platforms and the associated security challenges of operating in a perimeter-less environment are challenges for 2015 and beyond," he says.

COLLABORATION

Collaboration gained many organizations' attention last year. Expect collaboration and productivity tools to do the same this year, including where they intersect with analytics, says Alan Lepofsky, Constellation Research vice president and principal analyst. Employees are overwhelmed with content, he says. "To help them work more effectively, next-generation collaboration tools will use analytics to help sort, filter, and prioritize what employees are working on."

"The ability to provide a PC experience for the cost of the phone exists, and the promise of an appliance-like experience, when achieved, is very compelling."
ROB ENDERLE
President : Enderle Associates

"Businesses are finding they need the control they lost with BYOD, and court rulings are forcing them to pay for data plans anyway."
JIM RAPOZA
Senior Research Analyst : Aberdeen Research

Customer Experience

Mastering customer records (relevant information such as customer interactions, purchases, preferences, and context) will become increasingly important this year, says R Ray Wang, Constellation Research chairman. Examples include loyalty technologies, master data management, and master marketing profiles.

To better engage customers, Natalie Petouhoff, Constellation Research vice president and principal analyst, suggests keeping an eye on customer-facing technology as it relates to multiple facets: advertising, marketing, sales, customer service, after-service, and data analytics. Also watch for employee-facing technology that enables the back office to empower the front office with relevant content; this includes information about customer awareness, interest, purchases, loyalty, advocacy, and referrals.

Also worth noting, Petouhoff says, is the use of data and analytics with dashboards that show customer-facing and internal-facing key performance indicators. These can enable every level of employee to know whether they're on track, and thus drive unprecedented transparency into how well the company is doing. Without giving employees resources to provide successful customer interactions, she says, keeping pace with the competition becomes nearly impossible. In short, Petouhoff says, product and price are commodities, and it's the personal touch and emotional bond to a brand that makes the loyalty, advocacy, and referral process that drives better margins, profits, and revenue.

Internet Of Everything

Starting in 2015 and on through 2020, Longbottom says, intelligent businesses will start implementing more Internet-connected devices and systems. Planning, however, is key; otherwise, networks will be unable to cope with all the data chattering around them, he says. Longbottom recommends a nodal model of remote, intelligent aggregation devices; that is, devices that are intelligent enough to decide what information to discard, keep, and push forward as significant events. Big data approaches will also be needed to ensure that dealing with the IoE isn't just a rearview mirror system. "Using intelligence derived from analyzing the IoE data can give valuable direction for the future, as well," he says.
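A toy sketch of what such an aggregation node might do, with the threshold and sampling rate invented for illustration: readings above an alert level are pushed forward immediately as significant events, a thin sample is kept for later batch analysis, and the rest is discarded at the edge to spare the network.

# Sketch of Longbottom's nodal model: an edge device triages raw sensor
# readings into discard / keep / push-forward. Values are hypothetical.

KEEP_EVERY_NTH = 10        # sample routine readings for later analysis
ALERT_THRESHOLD = 90.0     # push anything above this immediately

def triage(readings: list) -> dict:
    """Split raw sensor readings into kept, pushed, and discarded."""
    kept, pushed = [], []
    for i, value in enumerate(readings):
        if value >= ALERT_THRESHOLD:
            pushed.append(value)    # significant event: forward now
        elif i % KEEP_EVERY_NTH == 0:
            kept.append(value)      # representative sample: batch upload
        # everything else is dropped at the edge
    return {"kept": kept, "pushed": pushed,
            "discarded": len(readings) - len(kept) - len(pushed)}

print(triage([20.5, 21.0, 95.2, 20.7] * 5))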
Mike Battista, Ph.D., senior manager at Info-Tech Research Group, says the IoE has finally reached a point of being more action than talk. He says uses are shifting from nifty Kickstarter projects to technology used in large companies across multiple industries to incorporate massive interconnectivity into products and services. Nearly all businesses will find a clear use for something IoE- or IoT (Internet of Things)-related, "if only to save a few bucks on electricity," he says.

DATA

Joseph Pucciarelli, IDC vice president and IT executive advisor, expects data management and architecture to be important trends in 2015. "The business value is in the data, not the IT infrastructure," he says. With the advent of new computing options, including cloud services, it's vital that business leaders know where their data is and how to access and protect it. Monetization of data is the next frontier, he says.

EMPLOYEES & WORKFORCE

If Holger Mueller, Constellation Research vice president and principal analyst, is correct, enterprise workforces will experience major changes this year. "The cost of people and skills shortages of people will only go up, and enterprises will respond with more automation, offshoring, and outsourcing," he says. Also expect new approaches to re-skilling and education, and next-generation talent management that no longer falls into functional areas but enables modern, 21st-century business processes, such as transboarding, he says.


". . . it's the personal touch and emotional bond to a brand that makes the loyalty, advocacy, and referral process that drives better margins, profits, and revenue."
NATALIE PETOUHOFF
Vice President & Principal Analyst : Constellation Research

". . . technology-led disruption of long-standing industries . . . is a result of waking up and realizing that extremely powerful technology is suddenly in every person's pocket."
MIKE BATTISTA, PH.D.
Senior Manager : Info-Tech Research Group

As organizations move to capture data from proliferating Internet-connected devices and potentially connected devices, expect IoE gateways to provide communication translation and security over the sensors IoT solutions require, says Rob Enderle, president, Enderle Associates.

IT, Data Centers & Technology

Data may indeed be an organization's most valuable possession, but effectively extracting value from it requires a capable IT foundation and modern technologies. Cullinane says there's a huge savings potential for companies in leveraging shared data center or colocation approaches, which also enable significant burst capacity for peak load times. He cautions that shared data center approaches do raise issues regarding the mixing of various companies' data. Dedicated servers and services mitigate this risk, however.

Elsewhere, look for SDN (software-defined networking) to continue to blossom in 2015. "We've been talking about it for a while, but it will continue to enter the tech mainstream," says Jim Rapoza, Aberdeen Research senior research analyst. As SDN's hype cycle dies down, he says, look for more real-world examples of how SDN can change data centers and networks.

More broadly, Longbottom expects interoperability among different applications, functions, and services spanning the entire IT platform to gain attention. "Everything needs to be capable of talking to everything else, even outside of the organization, as it looks to the value chain of suppliers and customers, as well," he says.

Enderle, meanwhile, cites cloud clients as noteworthy. "Thin client and networking technology have advanced to the point that you can not only provide a PC-like experience for wired employees, you can increasingly provide it to wireless employees," he says. This means decreased costs, increased security, and enhanced user satisfaction and productivity if done right. "The ability to provide a PC experience for the cost of the phone exists, and the promise of an appliance-like experience, when achieved, is very compelling," Enderle says.

Similarly, Battista expects technology-led disruption of long-standing industries to make an impact. Though not a specific technology, this movement is "a result of waking up and realizing that extremely powerful technology is suddenly in every person's pocket," he says. "I suspect we'll see plenty of 'Uber for ___' technologies and services in the next year," he says. Battista advises businesses to start contemplating how ubiquitous technology (mobile devices and IoT particularly) will affect their industries.

Also expect machine learning and AI (artificial intelligence) to generate excitement. "The machines are getting smarter, and we're moving to automate at digital scale," says Wang. Battista expects AI to continue its slow march to overtaking and destroying every technology radar. Though AI will take small steps this year, he says, "each step is getting closer to something very disruptive and important to businesses and to the world."

In the communications realm, Tim Banting, Current Analysis principal analyst, expects organizations will start looking more at cost-effective ways to hold online team meetings via UC (unified communications) soft clients, conferencing, etc.; recruit from a wider talent pool and retain employees via videoconferencing; and save real estate and office costs thanks to remote and home working. "I think we'll see departments owning parts of the IT budget and given the freedom to procure their own services with the blessing from IT," he says.

Mobile Devices & Mobility

Not surprisingly, numerous analysts tab mobility as a technology to eye in 2015. Joseph Pucciarelli, IDC group vice president and IT executive advisor, dubs mobility mandatory and "the next-generation platform and new business model." Alan Lepofsky, Constellation Research vice president and principal analyst, meanwhile, says just providing employees a mobile experience for the tools they currently use isn't enough. Instead, he says, organizations must start viewing mobile as a new way of working, rethinking entire business processes to find innovative ways to provide employees access to the people and content they require to more effectively perform their jobs.

Peter Crocker, Smiths Point Analytics founder, expects business owners and executives will focus on making the mobile apps they've built and deployed more engaging and valuable. The number of mobile apps individuals use as a percentage of those downloaded is very low, he says. Thus, brands are realizing that app downloads aren't a good measure of the success of mobile strategies. A better measurement is how often and how long users engage apps, he says.

Crocker expects tools and services that drive mobile app engagement to be an important investment in 2015. Some of the best are platforms that integrate messaging, analytics, and marketing automation, he says. Such tools and services monitor app usage and use predictive models to identify the most opportune time and place to deliver a personal and contextually relevant message.

"The cost of people and skills shortages of people will only go up, and enterprises will respond with more automation, offshoring, and outsourcing."
HOLGER MUELLER
Vice President & Principal Analyst : Constellation Research

". . . brands are realizing that app downloads aren't a good measure of the success of mobile strategies. A better measurement is how often and how long users engage apps."
PETER CROCKER
Founder : Smiths Point Analytics

SOCIAL MEDIA

Clive Longbottom, Quocirca founder, sees social media fading as a means for interacting with a broad swathe of customers. "It's now apparent that the old ways are still best," he says. Constellation Research vice president and principal analyst Alan Lepofsky, meanwhile, expects improved integration between social networking and enterprise software. A desire to shift conversations from email to more collaborative platforms, with more integration of comments, likes, and real-time chats into core business software, will continue, he says.

Device-wise, expect CYOD (choose your own device) approaches and company-provisioned devices to gain steam, says Rapoza. BYOD is losing favor as a technology trend, he says. "Businesses are finding they need the control they lost with BYOD, and court rulings are forcing them to pay for data plans anyway." More companies will return to buying work phones for employees, and more employees will return to carrying two phones, he says.

Security

As in 2014, security will remain a chief concern for organizations this year. In 2015, Cullinane says, an ability to quickly share incident information (in minutes, not days or weeks) that is truly actionable (specific data about what you need to look for) will be critical as cybercriminals and attacks become more sophisticated. Sharing information anonymously will be essential in a post-Snowden era, Cullinane says. Recent security incidents involving major corporations are essentially only indicators of what's to come, he says.

King says 2014 was a firestorm of sometimes embarrassing, always injurious security breaches resulting in huge losses for the companies attacked. "The industry really needs to step up with new ideas that are both usable and effective for corporate customers," he says. "If not, increasingly dangerous, rapidly evolving threats will make 2015 even more painful." Pucciarelli believes 2015 will be about protecting intellectual property and customer information, as well as reassuring customers and partners that the organization is safe. Once, he says, a company was assumed to be a safe partner. Going forward, organizations must proactively demonstrate they are, he says.

In terms of solutions, Michela Menting, ABI Research practice director, expects more security management tools for Web applications. Considering the widespread use of cloud-based services such as Dropbox, Box, and Google Docs, she says, expect an emergence of security-centric management abilities, via third-party software or from the service providers themselves, to secure such services.

Rapoza expects new security authentication models to gain traction. "We're finally seeing recognition that security can't continue the way it has," he says. New authentication models, such as FIDO [Fast IDentity Online], are moving to replace passwords and other insecure authentication processes. On a related note, as wearables and other devices add health-tracking abilities, expect them to include forms of biometrics (for example, fingerprint or iris scans) and other standard authentication tools, he says.

Finally, McPherson expects more companies will gravitate to MSS (managed security services), which provide the means to outsource security monitoring and management to a third party armed with dedicated teams to monitor systems and networks 24/7. Beyond enabling businesses to focus on other organizational goals, MSS can reduce costs related to security staffing and budgets. What's more, MSS providers generally possess much higher levels of security and threat knowledge than many organizations do internally, he says.

"Big data approaches will also be needed to ensure that dealing with the IoE [Internet of Everything] isn't just a rearview mirror system. Using intelligence derived from analyzing the IoE data can give valuable direction for the future, as well."
CLIVE LONGBOTTOM
Founder : Quocirca

"[Regarding artificial intelligence,] each step is getting closer to something very disruptive and important to businesses and to the world."
R RAY WANG
Principal Analyst & Founder : Constellation Research

THE BOTTOM LINE

Analytics & Big Data - Analytics solutions are increasingly valuable, extending beyond unearthing business insights to provide better security and decision recommendations.

Customer Experience - Determine what customer-facing employees need, including social tools, to improve engagement, and don't underestimate analytics for learning more about your customers.

IT, Data Centers & Technology - Take a careful look at how data center colocation, SDN (software-defined networking), and cloud clients can elevate your business and save money.

Cloud Computing - It's time to leverage cloud computing capabilities to keep a competitive edge. Consider implementing a hybrid cloud, as it is currently a buyer's market.

Data - The growth of solutions to monetize data is on the horizon, but in order to make the best use of them you'll need to understand your data's location, accessibility, and value.

Mobile Devices & Mobility - The time for mobility lip service and add-on mobile solutions is over. Organizations that don't fully embrace mobility as a new way of working will falter or fail.

Collaboration - Developers of tools that help employees communicate and work together remotely will begin integrating more analytics capabilities for sorting, filtering, and prioritizing.

Internet Of Everything - Ready or not, devices are becoming decision-making entities. Exploit these devices for all kinds of things, from improving customer engagement to reducing energy costs.

Security - Watch for more security solutions to incorporate biometrics (such as fingerprint or iris scans), and investigate the benefits of MSS (managed security services) for your business.

Joseph M. Tucci is Chairman and CEO of EMC. Photo courtesy of EMC.

Over 30 Years Of Evolution & Innovation


EMC HAS ALWAYS GIVEN ITS CUSTOMERS THE RIGHT SOLUTIONS AT THE RIGHT TIME

WHEN YOU ASK industry professionals

KEY POINTS
EMC started off as a physical
storage and memory company, but
has evolved to offer a wide range of
big data and cloud solutions.
EMC offers a hybrid cloud approach for companies so they can
safely take advantage of public
cloud environments in addition to
their internal storage infrastructure.
EMC's big data and analytics
solutions give companies the tools
they need to learn more about customers as well as their own internal
infrastructure.
EMC Mobile is a free application that can be used to plan out
an EMC deployment as well as
manage it after implementation.

what a given technology company is best known for, they'll often be able to pin down a specific market or product type that exemplifies everything that organization is about. However, with EMC (www.emc.com), it's difficult to determine exactly what the company is most known for because it excels in so many different spaces. One person may know EMC for its traditional storage and memory solutions that it has built on for over 30 years, whereas someone else might know EMC primarily for its more recent endeavors in cloud computing and big data.

It's in this longevity that you find EMC's recipe for success. Rather than focus on one type of technology and hope it continues to be relevant, EMC constantly breaks new ground and offers innovative products and services for its customers in a wide variety of categories. But even though EMC's focus has shifted to embrace the cloud, big data, and analytics, every solution the company offers has strong ties to the one that came before it and continues to build on a proven formula.

Building A Strong Foundation


Established in Newton, Mass., in
1979, EMC hit the ground running
in an attempt to establish itself as a
leading innovator in the storage and
memory space. In 1981, the company
developed one of its first products,
which was a 64-kilobit chip memory
board for Prime Computer. By 1985,
EMC was increasing RAM (random-access memory) by offering 1-megabit memory upgrades. Shortly afterward, in 1986, the company made an even bigger splash by doubling the capacity of HP 3000 computers and then going public on the NASDAQ stock exchange. These
were two major milestones that would
start EMC on its fast upward trajectory.
After moving its headquarters
from Newton to Natick, Mass., in


1983, the company once again had to move headquarters to Hopkinton, Mass., in 1987 in order to facilitate its growth. That same year, EMC delivered split-capability controllers for IBM Midrange disk subsystems in an effort to improve performance. In the following year, EMC started offering controllers for Wang VS computers, opened a new manufacturing plant in Ireland, and listed its stock on the New York Stock Exchange. To close out the decade, EMC continued its partnership with IBM to offer storage subsystems for IBM computers and also revealed its own solid-state mainframe storage system called Orion.

The 1990s would see EMC expand its product offerings as well as put its storage systems in more products than ever before. In 1990, the company created the EMC Symmetrix 4200 integrated cached disk array, which had a capacity of 24GB. In 1991 and 1992, EMC released its Champion Integrated Cached Tape Subsystem and Harmonix ICDA system, which were specifically designed for AS/400 computers. And in 1994, EMC released the newest version of its Symmetrix product, the Symmetrix 5500, which was the first system with 1TB (terabyte) of storage. Also in 1994, EMC released its first software product, EMC Symmetrix Remote Data Facility, which was designed for disaster recovery and remote replication.

A Meteoric Rise

The years from 1992 to 1999 would prove to be game-changing for EMC in terms of success. In 1992, the company had so much demand for its products that it actually had to cancel a planned summer shutdown to make sure it could fill orders for its customers. In 1994, EMC entered the Fortune 500 list and surpassed $1 billion in sales. (For added perspective, EMC currently ranks 139th in the Fortune 500, with $21.7 billion posted in revenue in 2012.) In 1995, EMC released another entry in its Symmetrix product line with the Symmetrix 3000, and in that same year the company not only earned $200 million from its open storage solutions but also became the market leader in mainframe storage.

The growth continued in 1996 as EMC released its EMC Data Manager product and also announced it had shipped a total 1,000TB worth of EMC Symmetrix products. That capacity would continue to grow, and in 1997 the company revealed an entirely new line of Symmetrix products. Also in 1997, EMC became the worldwide market leader in the open storage space. By the end of the 1990s, EMC was growing at an exponential rate both from sales and revenue standpoints and in terms of public perception. Revenues hit $4 billion in 1998, with $1 billion of that coming from the European market, and only four years after its first software release, EMC became the world's fastest major software company with software revenue of $445 million. EMC was also named the Stock of the Decade by the NYSE in 1999 due to its unrivaled decade-long performance, and in Fortune magazine's first e-50 index, EMC was among the top 50 leading Internet companies.

Even More Growth In The 2000s

In the early 2000s, EMC branched out into other areas and entered new markets along the way. EMC not only embraced network-based storage, it also became a major leader in the space. The release of EMC Clariion, a SAN (storage area network) disk array, ushered in a new era for the company as it aimed to help its customers take advantage of the emerging Internet and networking era.

The EMC Clariion SAN disk array ushered in a new era for EMC. The newest models offer customizable configurations ideal for small to midsize businesses.

Throughout 2002, EMC continued to expand its offerings by introducing EMC Centera, which was the first content-addressed storage system in the world, as well as offering storage consulting services for its own products and for third-party storage systems. Over the next few years, EMC built upon these foundations and introduced new Clariion and Centera products, including both hardware and software solutions. EMC also continued releasing new models within its Symmetrix line.

Another major part of EMC's success relied on key acquisitions that bolstered the company's product and service lineup. For example, in 2003, EMC acquired Legato Systems, which helped add new ILM (information life cycle management) capabilities to its products, and Documentum, which also helped with ILM in addition to improving the archiving and disposal processes. In 2004, EMC acquired VMware to start its early foray into virtualization, and Dantz Development to improve EMC's disk-based backup and recovery products.

Skipping ahead to 2009 helps show just how much EMC evolved (and how


much the storage industry changed around it) in only a decade. Where the focus had been primarily on physical storage solutions and network-based storage systems earlier in the decade, EMC moved increasingly toward the cloud near the end of it. In fact, the company acquired FastScale Technology in August of 2009 to specifically help improve its private cloud service portfolio.

In addition to the cloud, EMC also focused heavily on securing and managing data. In 2009, it acquired Kazeon Systems, which provided e-discovery software for businesses in multiple industries. EMC also released products designed to help with better scaling and managing storage both in traditional storage systems and in cloud-based environments. This included the introduction of EMC FAST (fully automated storage tiering) technology, which was designed to ease management responsibilities and introduce automation whenever possible.
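FAST's actual policy engine is proprietary, but the core idea behind automated tiering is easy to sketch. The toy Python example below (the extent names, counters, and threshold are invented for illustration, not EMC's algorithm) ranks storage extents by recent access counts and pins the hottest ones to flash:

from dataclasses import dataclass

@dataclass
class Extent:
    name: str
    accesses_last_24h: int

def assign_tiers(extents, flash_slots):
    # Hottest extents fill the available flash slots; the rest go to disk.
    ranked = sorted(extents, key=lambda e: e.accesses_last_24h, reverse=True)
    return {e.name: ("flash" if i < flash_slots else "disk")
            for i, e in enumerate(ranked)}

extents = [Extent("orders-db", 90_000),      # hypothetical workloads
           Extent("web-cache", 40_000),
           Extent("archive-2013", 12)]
print(assign_tiers(extents, flash_slots=2))
# {'orders-db': 'flash', 'web-cache': 'flash', 'archive-2013': 'disk'}

Rerunning a policy like this on a schedule is what turns tier placement from a manual chore into background automation.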

EMC Meets A New Era With New Technologies

From the year 2010 until today, EMC has continued to develop new technologies, acquire new business units, and release new products that have helped its customers stay up-to-date with current trends. When network-based storage became popular, EMC was right there to offer its products. When virtualization and the cloud started going mainstream, EMC made purpose-built products and solutions to meet customer needs. And now that big data and analytics are hot topics for many organizations, EMC is ready to offer its products, services, and expertise to customers.

Whether it's an IT-based organization, a manufacturer, or even a health care facility, EMC offers solutions designed to help customers take advantage of new technologies and ensure their future success. For this reason, EMC has recently decided to take a four-pronged approach to offering storage and memory solutions for its customers. The four main solution categories are hybrid cloud computing, big data, trusted infrastructure, and flash storage.

And while some of these are relatively new technologies, all of EMC's products have roots planted deep in the company's history, which means you can draw on decades of experience if you choose to take advantage of EMC's unique angle on these different technology categories.

Hybrid Cloud
EMC's approach to hybrid cloud is all about giving you, the customer, choices, and providing support throughout the entire migration process. You have the option of setting up a hybrid cloud based on OpenStack, Microsoft, or VMware infrastructure, and you can decide which public clouds you want to utilize as well. And to make this decision easier, EMC offers a list of cloud service providers that are either Silver, Gold, or Platinum level partners. EMC clearly explains what types of services are supported with a given provider and what EMC products you'll be able to use to set up the specific type of environment you want.
The process of setting up an EMC
private cloud is designed to be as easy
as possible and the company claims
it can get most environments up and
running in roughly 28 days. But perhaps the best services EMC offers to newer cloud users are its educational programs. EMC's IT Transformation Workshop, Cloud Advisory Service, and cloud education services are all designed to provide companies that are new to hybrid cloud environments with an in-depth understanding of what they can do with the cloud and how the process actually works. Not only will EMC cloud experts support you throughout the process, they'll also train your employees on how to properly manage and maintain the hybrid cloud on their own.
EMC also offers a wide range of
cloud-enabled solutions that will help
you beef up the performance of your
hybrid cloud and make sure you get
the level of performance you need.
VCE VBlock, for example, is a converged infrastructure platform designed to improve the overall agility and efficiency of the hybrid cloud. EMC Vspex serves as a secure bridge between your private and public clouds to make sure that all of your information and applications are safe. Additionally, EMC's ViPR software-defined storage solution makes it much simpler for your organization's IT


staff to manage your onsite and offsite


storage with the use of better automation and easy-to-use tools.

Big Data & Analytics


EMC understands that with storage costing less than ever before and the cloud giving companies access to ever greater amounts of capacity, big data and analytics are two categories that are making more and more sense for organizations. For that reason, the company now offers a wide range of big data-related solutions designed to help companies prepare for or even predict the future. It all starts with EMC's Hadoop and Hadoop-as-a-service offerings that help companies take all of the data they have and transform it into actionable insights. The main solution with this strategy is the Hadoop Starter Kit, which is designed to ease companies into big data analytics.
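As a concrete taste of the kind of batch analysis Hadoop makes routine, here is a minimal token-count job written in the Hadoop Streaming style. It is a generic Python sketch, not an EMC or Hadoop Starter Kit sample; in a real job the mapper and reducer would be two separate scripts.

import sys

def mapper(stream):
    # mapper.py: emit "<token><TAB>1" for every token in the raw input.
    for line in stream:
        for token in line.strip().split():
            print(f"{token}\t1")

def reducer(stream):
    # reducer.py: Hadoop delivers keys sorted, so count consecutive runs.
    current, count = None, 0
    for line in stream:
        key, _, value = line.rstrip("\n").partition("\t")
        if key == current:
            count += int(value)
        else:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = key, int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    # A single file with a mode flag keeps the sketch self-contained.
    (mapper if "--map" in sys.argv else reducer)(sys.stdin)

Launched through the stock streaming JAR (along the lines of hadoop jar hadoop-streaming.jar -input logs -output counts -mapper ... -reducer ...), the same two functions scale from a laptop test to a full cluster.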
For organizations that prefer to dig a little deeper into their data, or those that have so much data that it's difficult to manage and analyze, EMC offers its Federation Data Lake solution. This approach also takes advantage of Hadoop to help you set up large-scale data lakes in a much more manageable way. You can scale the size of the lake up or down and improve efficiency to make sure that you aren't spending most of your time just trying to find the data you want to analyze. Instead, you can get right to the process of gaining insight from your information.
In addition to Hadoop-based big data
and more traditional analytics solutions, your organization can also take
advantage of EMC solutions to dig a
bit deeper into your IT operations and
try to spot places where improvements
might be made in performance and
overall efficiency. Many companies assume that big data is mainly used to
bring information to the surface that
can generate sales and revenue, but big
data analytics can also be used to improve the infrastructure from within
and thereby drive the company forward.
Employing a similar approach, EMC
also uses big data science and analytics


for solutions designed to improve overall security efforts. Using solutions such as EMC's RSA Security Analytics software enables you to get a more in-depth view of your organization's security strategy and pinpoint specific areas that require extra attention. The greater visibility that the RSA security software provides allows for earlier detection of threats from all angles.

Trusted Infrastructure
Rather than relying on the traditional method of setting up your physical and virtual infrastructure, EMC takes the process to the next level and helps you build what it calls Trusted Infrastructure. EMC's Trusted Infrastructure approach is built on the three pillars of availability, security, and backup and recovery. Availability is particularly important because it determines how much downtime your company will have to deal with over the course of a year. EMC uses VMware technology to help companies set up virtual environments that are essentially downtime-proof and thereby deliver as much as 99.999% availability for your mission-critical infrastructure.
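For perspective, the arithmetic behind that "five nines" figure is simple; the unavailable fraction of a year works out to

(1 - 0.99999) \times 365 \times 24 \times 60 \approx 5.3 \text{ minutes of downtime per year}

which is why each additional nine is dramatically harder, and more expensive, to deliver.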
The security arm of EMC's Trusted Infrastructure strategy is all about fighting off potential threats and protecting sensitive company information. In addition to the RSA Security Analytics solution that helps you get a better view of your overall security approach and make improvements where needed, EMC offers security tools that provide a more granular view. For example, EMC can help you determine how susceptible your company and employees are to fraud and identity theft, two things many traditional security solutions aren't able to detect. EMC's Risk Identity Calculator, for example, lets employees set up a profile with some basic information and determine how at risk they are to identity theft and other threats.

The third pillar of backup and recovery is absolutely crucial because in the event that your data is compromised or if there is downtime, you need to be able to get back up and running quickly. In addition to solutions to protect data across internal and remote environments, EMC offers solutions for better managing the overall backup and recovery process, and for establishing a strong disaster recovery strategy. All of these solutions make it possible to quickly respond to events and reduce their impact on productivity.

HOW EMC STORAGE DOES MORE

Why choose EMC for storage? The benefits your organization can derive from EMC's Protection Storage Architecture, according to EMC, all come down to speed: faster delivery of services and applications, faster scaling of operations, and faster business growth.

Flash Storage

Data and applications are more demanding than ever before, which means that companies will sometimes turn to flash storage to speed up performance and ensure they can keep up with the needs of other hardware and software solutions. EMC understands that some companies are ready to make the jump to flash-only arrays, while others require a hybrid approach to flash storage. That's why EMC offers an All-Flash Array, a Hybrid Flash Array, and Flash in the Server approaches.

EMC's XtremIO All-Flash Array uses the fastest flash storage technology available to accelerate applications and deliver the highest possible performance. Flash storage has lower latency and higher density than traditional hard disk storage solutions, which means you'll not only get those higher speeds, you'll also fit more capacity into a smaller footprint. The All-Flash Array is particularly helpful for companies that use resource-intensive applications, including those that are constantly accessing information and cycling through different data sets.

The Hybrid Flash Array uses EMC's VNX, VNXe, VMAX, Isilon, and Flash PowerPacks solutions to give you a more flexible approach to storage. This approach uses EMC's FAST (fully automated storage tiering) software to essentially automate the process of moving data to different storage tiers. Frequently used data will automatically move to the flash portion of your storage array, whereas data that is used more sparingly can be moved down to lower, less expensive storage tiers. This is a great way to make sure that you are placing applications and data on the types of storage that make the most sense financially and from a performance standpoint.

EMC's Flash in the Server strategy is designed for companies that want to specifically pinpoint where flash is used. Instead of having an entire array or half of an array based on the flash storage architecture, you can decide which servers or blades truly require flash to perform specific tasks. For example, you can use EMC's XtremCache solution for high-speed flash caching to speed up the performance of a given application rather than offer that high level of performance to an entire set of applications.

EMC Mobile

One of EMC's most helpful tools won't actually cost you anything to use. The company's free EMC Mobile application helps you throughout the planning and deployment processes. EMC Mobile provides access to a vast research library that is custom-tuned to your specific deployment and gives you the information you need to properly maintain your EMC implementation. You'll also have access to My Communities, which is a way to interact with and learn from other EMC customers, as well as all of the newest videos and information from EMC.

EMC has developed mobile applications for many of its solutions, including Documentum for enterprise content management. Documentum Mobile users can access, collaborate on, and share company content on the go.

EMC also offers mobile apps for other solutions, such as Documentum and Symmetrix. Regardless of the solutions you choose to implement, EMC offers ongoing support as well as access to the complementary tools your organization needs to succeed. EMC's development of mobile apps to help customers manage their particular EMC solutions illustrates how the company is once again prepared to support the technologies and platforms that its customers want to use the most.


The Current State Of Cloud Security


DESPITE IMPROVEMENTS, IT'S STILL IMPORTANT TO CONSIDER POTENTIAL RISKS

MANY COMPANIES that have avoided the

KEY POINTS
The cloud adds the potential for more people to have access to your data, but it's also important to consider human error and cloud provider longevity as risk factors.

Major CSPs (cloud service providers) and many smaller ones have improved their security approaches over the years, and many now support regulatory compliance, as well.

Make sure your cloud provider is properly certified and get third-party evaluations if at all possible.

Some companies, specifically those in health care, financial, and other heavily regulated industries, still need to carefully calculate the risks of a move to the cloud.


cloud up to this point have done so due to security concerns, some of which are merely perceived while others are well-founded. But over the years, as more and more vendors have come to the market, many cloud providers have used security as a specific area to differentiate themselves from the competition. This has led to industry-wide security improvements, for the most part. Still, some concerns remain, and for that reason, it's important to revisit the current cloud security landscape and determine whether or not it's time to take on a little risk in order to receive some of the major potential benefits that come with embracing the cloud.

Should You Still Be Worried?


Although strides have been made, there
are certain aspects of moving data and
workloads to the cloud that will inherently
add risk. For example, James McCloskey,
director of advisory services covering security and risk at Info-Tech Research Group,

says that the fact that you're moving things to the cloud in the first place means that at least another party has been added into the mix and you have security challenges associated with their access to it. This isn't just a security issue, but also a privacy issue because even if someone doesn't steal your data, they might still be able to look at it, which could be a problem for companies that want to move sensitive information to the cloud.
McCloskey says that nearly every aspect of moving to the cloud is a risk
management decision. The key is to look
at what the potential security issues are,
determine how much of a risk they pose
to your organization, and then weigh that
risk against the potential benefits. If I am
able to outsource the hosting of a certain
function of my IT services to a CSP (cloud
service provider), even if that particular
service may be subjected to slightly elevated risk, the benefit that comes to my
internal organization from not having
to manage the day-to-day security may

mean that I can actually deploy better security on some of my more sensitive systems that are internal, McCloskey says.
In other words, moving some workloads
that can handle more potential risk to
the cloud may free up IT resources and
let you better secure your internal infrastructure.
Jay Heiser, research vice president
at Gartner, agrees that no computing
model is safe from risk, but he does add
that many of the concerns leveled against
the cloud tend to be exaggerated. He compares this idea, obviously to a much lesser
degree, to the recent Ebola outbreaks
that have whipped some people into a
frenzy. The Ebola crisis here is a perfect
example of how an unfamiliar scenario
triggers anxiety and people tend to focus
on the wrong things to the wrong degree,
says Heiser. One of the biggest concerns
about the cloud is that the provider will be
hacked or that nefarious people work for
the company. But, Heiser says, it's hard to see much evidence of that.
Heiser points out that companies
should actually be less concerned about
the provider security and more concerned about their own internal handling
of the cloud. He says that sloppy Web
code and general sloppy usage of the
cloud can often put companies at risk
more than any cloud provider's security
measures would. This is especially an issue with the public cloud because it's easy for users to set up their own public shares and inadvertently store sensitive information in poorly protected places. To help protect your company from this, you'll have to set up internal policies on how to properly use the cloud and put controls in place, neither of which is an issue tied to a given cloud provider's security offerings.
Another area where companies should
direct their concern is when it comes to
the actual longevity of the cloud provider
itself. Heiser says that many companies
have suffered data loss incidents due to
providers going bankrupt and practically falling off the face of the earth. This
specific situation happened with a major
storage-as-a-service provider that went bankrupt and wasn't purchased, which

In addition to these [security and identity and access


management] elements, an organization that decides to
use third-party security services will have to balance any
additional security with performance and integration with
both on-premise and cloud-based applications.
MICHELA MENTING
Cybersecurity Practice Director : ABI Research

essentially put its clients' data in limbo. You'd think that storage-as-a-service would be relatively easy to migrate away from, but they were claiming that some customers had petabytes of data, so just the volume of data made migration very difficult, he says.

Cloud Security Has Improved


Over The Years
Because CSPs are often under heavy
public scrutiny and would suffer greatly
if a security breach were to occur, many
CSPs are taking great strides to improve
their overall security approaches and
even support regulatory compliance initiatives for a variety of industries. Heiser
says that these providers are being vulnerability scanned on an ongoing basis
by every college student and credit card
thief on the Internet, so they have huge
incentive to maintain a very thorough
set of processes for change management
and vulnerability management. And the
CSPs are well aware that if a security
breach happens, everybody will know
about it, he says.
CSPs are so concerned about damaging
their reputation and losing customers
that theyre putting robust security solutions in place that rival what many data
centers can do internally. The reason for
this, according to McCloskey, is that CSPs
deal in economies of scale and are able
to spread security costs across all of their
users, which means it's more cost-effective for them to put strong security and data protection measures in place than it would be for an individual enterprise's
data center. By moving to the cloud, some
organizations may actually be able to improve their risk profile rather than add
new vulnerabilities.

One really good example is security


information and event management
[SIEM], says McCloskey. It is tough for
a smaller organization to staff a security
operations center that is monitoring 24/7
the activity against their firewalls and
intrusion prevention and detection systems. If your organization is strapped for
resources and doesn't necessarily have
the priority placed on dealing with those
vulnerability management concerns, you
may find that the CSP is actually better
positioned to be responsive to those
types of issues.

Comparing Providers
& Keeping Them In Check
Michela Menting, cybersecurity practice director at ABI Research, says optimizing a public cloud security approach
means that companies need to take many
factors into consideration. For instance,
she says, companies need to consider
compliance with laws and regulations
requiring knowledge of data location
and the ability to perform electronic discovery. This is crucial for companies
in the health care or financial industries
that need to know exactly where data is
stored and who has access to it. A solid
cloud provider should be able to give you
that information.
She adds that cloud providers should
be able to give you a wide range of security features to protect your data. This
includes tools like encryption, key management, and VPN-enabled cloud instances, Menting says. She also adds
that things such as software isolation;
visibility and transparency of security
controls; and data isolation, retention,
and sanitization methods are all important security measures to look for in

CyberTrend / February 2015

23

. . . a lot of health care organizations aren't doing a great job of protecting that information themselves. They may find that in order to achieve their higher security obligations, they might actually benefit from going to the cloud, but they have to be very careful.
JAMES MCCLOSKEY
Director Of Advisory Services, Security & Risk : Info-Tech Research Group

We're seeing less and less of this attempt to figure out whether a cloud provider is secure or not by asking just the right set of questions. . . . Third-party evaluations, like ISO 27001 and SOC 2, primarily, are the only thing that scales to a multi-tenanted solution. We need to see service providers offering these things, and we do see quite a lot of it and it's growing, but we need to see buyers demanding it.
JAY HEISER
Research Vice President : Gartner

a provider. You have to not only make sure that your data is safe while it's being stored, but also that there is an endgame plan in place for if you need to migrate data to a different service or expunge it altogether.
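One practical safeguard on both fronts is client-side encryption: if data is encrypted before it leaves your environment, the provider only ever stores ciphertext, and destroying the key renders any leftover copies unreadable. The following is a minimal sketch assuming Python and the third-party cryptography package; upload_to_cloud is a hypothetical placeholder, not a real API.

from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this in your own key store,
fernet = Fernet(key)          # never alongside the cloud-hosted data

ciphertext = fernet.encrypt(b"quarterly patient billing export")
# upload_to_cloud("backups/billing.enc", ciphertext)  # hypothetical call

# Only holders of the key can ever recover the plaintext.
assert fernet.decrypt(ciphertext) == b"quarterly patient billing export"

Key management then becomes the critical discipline, which is exactly why Menting lists it alongside encryption itself.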
If you want to dig even deeper into
what a cloud provider offers in terms of
security, there are also established standards to look for that guarantee a certain
level of security. Menting says there are
standards for assessing information security risks, which help determine the
level of security a cloud provider offers.
She also recommends that you find a service provider that offers authentication,
identity, and access management services
for cloud instances. Support for these
standards could help you better compare providers and find one that will adequately meet your company's needs.

Once you've established a relationship with a cloud provider and know that they meet certain certifications, you also need to check on that provider and make sure it's holding up its end of the deal. In fact, Heiser says, the greater issue of continuous monitoring of CSPs is one that the world has not fully addressed.


He says that there isn't a specified way of doing this yet, although groups including FedRAMP are working on it. FedRAMP is a program dedicated to making sure cloud providers are capable of handling government data; it was established to help government agencies make a safe transition to the cloud.

Heiser stresses the importance of bringing more attention to the idea of third-party cloud evaluations because they are the only way to truly prove that a provider is meeting the relevant requirements without simply having to take its word for it. He says that many of the larger cloud providers, such as AWS and Microsoft, tend to have a lot of acronyms, meaning they are heavily certified and monitored for performance, reliability, and security. But this same amount of attention isn't always paid to smaller SaaS (software as a service) providers, which begs the question of how we're going to extend this down to two- and five-person organizations, Heiser says. Still, he does think that in the future, we'll see some refinements to the third-party evaluation model as it becomes clear that it misses links in the chain.

Some Companies May Still Need


To Limit Cloud Usage
Even with all of the security improvements, more common support for regulatory compliance issues, and third-party
monitoring of the larger cloud providers,
there are still some situations where you
may not want to move data to the cloud
at all or at least minimize cloud usage.
Generally speaking, highly regulated
data should be treated with more caution, and as a pattern, highly regulated
organizations are less likely to put data
into the public cloud, and are certainly
cautious about their highly sensitive
data, says Heiser. Small to medium businesses have less to lose and less ability to manage it themselves, and to be fair, they're less likely to be hit with a huge fine from a regulator, so SMBs are generally more adventurous and large enterprises are making very measured use of the public cloud.
McCloskey agrees and adds that the
type of industry your company is in will
also have a major impact on your overall
cloud decision-making process. For instance, companies in the health care
industry have to worry about HIPAA
regulations, including personally identifiable information for patients and other
medical records. And financial institutions have to worry about their own regulations as well.
Still, McCloskey says, organizations
need to be careful about how they make
their cloud decisions. For example, he
says, they shouldn't focus solely on the
potential security risks and use those
as a way to avoid using the cloud altogether. He says that the cloud can
offer lower costs for organizations, add
more flexibility to their infrastructure
approach, and provide a wide range of
other benefits that can potentially make
it worthwhile to accept a little bit more
risk to the organization.
It's all about finding balance and, according to McCloskey, finding a cloud provider that is in a position to deliver a more professional, comprehensive, and capable security operations experience than what the organization can do themselves.

Modernize Your Mobility Approach


UNDERSTAND TODAY'S WORKPLACE MOBILITY TRENDS

JUST AS ONE MOBILITY trend gains trac-

KEY POINTS
The profound impact that the BYOD
(bring your own device) trend has had
on businesses has been accompanied
by various data and security worries.
Many experts believe BYOD will
eventually give way (or is already
giving way) to a model known as
CYOD (choose your own device),
which is said to balance control between employers and employees.
Cloud-based services can be a
boon to worker collaboration, file
syncing/sharing, and productivity, but
such services require careful company
oversight.
In coming years, expect wearables
and various Internet-connected devices to increase mobility demands.


tion with organizations, another comes


along offering the promise of even greater
capabilities for bolstering worker productivity, managing devices, securing sensitive data, enabling collaboration, and other
sought-after results.
Take BYOD (bring your own device),
for example. While some enterprises are
just now implementing official BYOD
strategies and policies to allow employees
to connect their personal devices to corporate networks, some experts are recommending a CYOD (choose your own
device) approach, whereby organizations provide employees with lists of company-approved devices from which to choose.
The reality is that mobile strategy is a
shifting target, involving numerous trends,
some of which come and go, and some of
which overlap each other. Although this
reality is daunting, it remains vital that
businesses find an appropriate path to keep
pace with competitors leveraging their own
mobility initiatives. This article provides an

overview of several dominant workforce


mobility movements, all of which can be
key to modernizing a mobility strategy.

Understand BYOD
Despite its relative youth, BYOD has had
a monumental impact on organizations.
Not too long ago, it was more common
for companies to deploy, say, a fleet of the
same BlackBerry model to its workers, and
many companies retain this model for various mobility management purposes. By
contrast, BYOD ushered in an era where
employees are hopping on the corporate
networks with whatever personal device
they feel most productive with.
Beyond boosting productivity and employee satisfaction, BYOD is praised for
introducing potential cost savings for companies because BYOD saves companies the
trouble of buying devices and associated
accessories. Another positive is that today's
increasingly tech-savvy workers are more
likely to troubleshoot their own device
problems or find a co-worker who can.

As BYOD uptake has mushroomed,


however, IT, organizations, and experts
point out there's plenty not to like about
BYOD, including the added demand it can
place on network infrastructures, the difficulties it poses in supporting custom applications, and the complexity it can introduce
in terms of paying device/data plan reimbursements. The biggest challenge BYOD
can pose to organizations, however, is lack
of control. Particularly worrisome is the
possibility of workers accessing and sharing
company data over corporate networks
via cloud-based services and applications,
something that introduces security, compliance, and other issues.
For companies that do embrace BYOD,
experts generally advise establishing
strong policies that spell out responsibilities and expectations. Experts also recommend starting small, using pilot programs
to determine which employees should be
included in the BYOD program, what infrastructure upgrades or updates might be
necessary, and which data and resources
will be made available. While some organizations have decided to simply deny personal devices access to corporate networks,
others have adopted device and application
management solutions to address data and
security concerns.
Research from J. Gold Associates expects
that in the coming years, BYOD will transition into BYOT (bring your own thing,
sometimes referred to as bring your own
technology), which presents a more complex and diverse scenario that will require
infrastructure investments to meet the demand that scores of sensors and peripherals
employees bring into the workplace will
create. Enterprises must focus on longer-term platforms rather than short-term niche products and build a strategy now to deal with this emerging phenomenon, J. Gold Associates stated in a recent paper.

BYOD vs. CYOD


CYOD is an approach that lands somewhere in between the two extremes of
companies mandating the use of specific
devices and employees using whatever
devices they want. Many experts believe
CYOD will eventually supplant BYOD, in
part because it can address many primary

You need application access and data access regardless


of the device you're using at that moment. So companies
will look to manage things like access, policy, and security
across all of these things in a much more consistent way.
CHRISTIAN KANE
Analyst : Forrester Research

organizational concerns over BYOD while


still leaving employees a certain degree of
freedom of choice.
With CYOD, an organization provides
employees with a choice of approved devices, operating systems, and platforms to
choose from. Whereas BYOD might entail a
large enterprise ending up with hundreds of
mobile workers using potentially dozens of
different device models, operating systems,
and application versions, CYOD narrows
this down to a more reasonable number,
one that organizations and their IT departments are prepared to manage and support.
Although certain CYOD policies entail
the company paying for devices, data plans,
accessories, etc., a more typical CYOD
policy involves employees purchasing the
device and data plan and the company
overseeing support responsibilities. Key to
a successful CYOD program, experts say, is
not being too heavy-handed in the devices
employees can choose. If a CYOD policy is
too draconian, there exists the potential for
employees to bypass it and connect unapproved devices anyway.
Another option is COPE (corporate
owned, personally enabled). Like CYOD,
COPE offers employees a choice of devices,
except the devices are company-owned.
While COPE policies differ from one to
the next, COPE typically allows for employees using devices for personal activities
(calls, email, social media use, and so on).
For companies, COPE provides control and
security measures and enables IT to more
easily update and back up devices, wipe data
remotely, and more. As with CYOD policies, it's possible that employees will balk at
overly restrictive COPE policies.

Manage Your Devices


As popular as BYOD is with employees,
it's arguably unpopular in equal measure
with executives and IT personnel responsible

for company security, data integrity, compliance issues, and regulatory requirements.
The lack of control enterprises generally have
over personal devices accessing corporate
networks and data has led many to implement MDM (mobile device management),
MAM (mobile application management),
EMM (enterprise mobility management),
and other solutions to regain control.
Increasingly, experts point to EMM as
the future of corporate mobility. Gartner
dubs EMM the future evolution and convergence of several mobile management, security, and support technologies, including
MDM, MAM, application wrapping, containerization, and file sync/sharing. Gartner
says it eventually expects EMM will address
an array of mobility needs spanning various
smartphone, tablet, and PC OSes.
Similarly, Christian Kane, Forrester
Research analyst, believes EMM will eventually be part of workspace management
solutions that address mobile and PC management aspects. You need application access and data access regardless of the device
you're using at that moment, he says. So
companies will look to manage things like
access, policy, and security across all of these
things in a much more consistent way.
Forrester has noted that EMM improves
upon legacy MDM solutions in three main
areas: security, support, and experience.
Where experience is concerned, Forrester
states the ability to make application recommendations to employees, provide automation and self-service abilities, and enable IT
to roll out and integrate services quickly will
make EMM more enticing to companies.
Chris Silva, Gartner research director,
notes that via just one console, EMM suites
enable managing multiple policies and enforcement methods, which is essential to
managing mobility on a large scale. As mobile OSes mature, he says, more control is
being placed natively into the device. This


will shift the evolution of EMMs to focusing


more on app- and data-level controls.
J. Gold Associates predicts the EMM
market will consolidate in coming years
to the point where two or three pure-play vendors will exist. For most organizations,
this means EMM functions will merge
into corporate systems from major vendors, which they may already possess.
Furthermore, the research firm expects
base EMM systems will extend security
and management abilities to accommodate the array of Internet-connected devices organizations will likely add.

Tap Into The Cloud


To do their jobs, mobile workers need fast, easy access to data, files, contacts, and other resources wherever they are.
Increasingly, employees are turning to
cloud-based services (including free consumer-oriented ones) that provide the
productivity, collaboration, sync/sharing,
communication, data access, and other
abilities they covet for real-time decision-making, brainstorming, idea sharing,
project planning, and other functions.
There are online services today that enable users to maintain calendars, schedule
meetings, manage projects and deadlines,
edit and save presentations and documents, manage websites, collaborate with
white boards, engage in videoconferences,
make video calls, and exchange messages,
all from their mobile devices.
Largely, mobile workers now view such
cloud-based services as a necessity. If their
organizations don't offer suitable and accessible corporate equivalents, they'll step
outside the company and acquire tools
themselves. This practice has spawned
BYOD-like trends known as BYOS (bring
your own software/services) and BYOA
(bring your own applications).
Although cloud services can enhance productivity and collaboration,
sometimes because workers are already
familiar with the tools from personal experience, they can introduce additional
concerns. Cloud-based services can, for
example, introduce data, security, and
compliance problems when employees
use them to store and share company
data, leaving the company with no insight


. . . while managing mobile devices is already difficult, smart


watches, wearable computing, the IoT (Internet of Things),
and other new technologies will complicate matters.
JIM RAPOZA
Senior Research Analyst : Aberdeen Group

into (or control over) what information is


going where.
In response to these concerns, a
growing number of service providers are
offering enterprise versions. Additionally,
vendors of traditional enterprise collaboration solutions are adding cloud applications, support, and other features to
their solutions to appeal to mobile professionals. Experts do caution that before
implementing, say, a cloud-based collaboration tool, the enterprise should ensure
that it's compatible with the various devices, OSes, and platforms the company uses. Organizations should also consult with employees about what tools they're already using, what tools they don't want to use, and what features they would like.

Strategize For The Future


While some companies have ditched
PCs in favor of smartphones, tablets, and
other mobile devices, at least for some
workers, the desktop PC will likely have
a place within enterprises for the foreseeable future, despite predictions of the
PC's demise. That said, there are many
factors involved these days in matching
a particular device to a particular user in
order to maximize productivity. This can
be helpful, but it can also complicate decision-making. Furthermore, managing a
fleet of devices that potentially consists of
various 2-in-1 laptops, ultra-small-form
factor desktops, ultrabooks, all-in-one
desktops, or thin clients on top of smartphones and tablets can be considerably
complex. Managing a diverse array of devices, however, seems to be the future.
In a recent paper detailing the collision
of PC and mobile device management,
Forrester Research noted that while most
workers still rely on mobile devices as secondary computing devices, this will change
as mobile app options continue to surface.
Different jobs will require different tools,

meaning employees' computing arsenals


will expand to include PCs, smartphones,
tablets, wearables, and other devices. A
siloed management system that only works
for mobile devices and not PCs will make it
harder to take an employee-centric rather
than a device-centric view, Forrester
stated.
Similarly, Jim Rapoza, senior research
analyst at Aberdeen Group, says while
managing mobile devices is already difficult, smart watches, wearable computing,
the IoT (Internet of Things), and other new
technologies will complicate matters. J.
Gold Associates, meanwhile, predicts that
within two years, enterprise users will regularly use three to five personal devices and
eight to 10 business applications. Further,
they'll connect to at least five enterprise
systems. These numbers are expected to
multiply two to three times by 2018-19,
meaning new burdens for enterprises in
terms of access and interaction with corporate systems and infrastructure.
Beyond a changing device landscape,
expect enterprise workspaces to continue
changing in coming years, dramatically so
in some cases. Recently, there has been increasing acceptance of flexible and remote
working practices due largely to affordable
high-speed Internet connectivity and technologies that enable easier collaboration
and communication. Younger workers with
nontraditional workplace expectations and
knowledge workers reluctant to relocate but
who possess skills organizations desire are
also influencing this trend. For companies,
remote working situations can lead to savings in real estate and resources.
Another workplace trend making waves
is open workspace designs that do away
with cubicles and closed offices in favor
of more collaborative environments. Time
and a variety of implementations will tell
whether such efforts affect productivity and
creativity positively, negatively, or both.


New, Real-Time Data


IS THAT WHERE THE REAL VALUE LIES FOR BUSINESSES?

KEY POINTS
A new school of thought about big
data posits that the most valuable data
isn't necessarily the massive amounts
of data organizations already possess.
Some experts argue that the most
valuable business data arrives as a
constant influx from mobile devices,
social networks, and Internetconnected machines.
Apache Hadoop is viewed by some
as a vital part of any big data environment, because it handles large-scale
storage and processing well.
Historical data can have a great
deal of relevance, such as in tracing
the success history of a specific mechanical part. Real-time data can be
more important when tracking competitors and customer response.


FOR MANY PEOPLE, big data conjures an image of massive amounts of


data coming from a variety of sources:
new data constantly flowing into the
organization, traditional data the organization produces internally, years' worth of historical data the organization
has archived, and external data (such
as customer service forum threads and
social networks) about the organization.
R Ray Wang, principal analyst and
founder of Constellation Research, aptly
equates big data to a stream that includes all the bits and bytes around us,
meaning every touch point, every click,
every like, every comment, every log
entry in a machine, every purchase.
While few would argue organizations
are amassing more data than ever today,
there is an argument as to which type
of data within the big data vault is
of most value in a business sense. One
school of thought, for example, suggests that the most valuable data is that
flowing in from mobile devices, social

networks, Internet-connected machines,


and similar real-time sources. Further,
there's a notion that aging data actually
holds little business value, thus leading
to questions concerning what older
data is worth keeping, what data is less
useful, and at what point keeping old
data simply amounts to digital hoarding.
Not everyone, however, is buying into
the new school of thought, at least
not completely.

A New Way Of Thinking?


Whether there's credence to the idea
that most of an organizations aging data
is worth little depends on what information is being sought and what time
frame the data covers, says Nik Rouda,
Enterprise Strategy Group senior analyst. If you want to see how a manufactured part has performed over the years,
social media could give you an idea
when it starts breaking but won't tell
you much about the build lot, materials,
distribution, service records, and usage

Organizations must analyze historical trends to be able


to plan better, diagnose problems, and discover opportunities. . . . Often the real value comes from combining
both [old and new] data sources.
CINDI HOWSON
Founder : BI Scorecard

circumstances, he says. If, however, an


organization wants to know how an ad
campaign is doing currently and adjust accordingly, real-time data analysis
might be all it needs. Still, Rouda says,
comparing the current campaign to previous ones, macro-environments, and
other circumstances requires a historical
view from past data.
Even companies looking to make future predictions usually must do so by
spotting similar events and trend patterns from the past, Rouda says. For
example, he asks, how much data concerning health care clinical studies that
track participants for their lifetimes is
available via Twitter? Geophysical
survey data collected years ago, meanwhile, might have indicated drilling
for resources wasn't profitable, but using
new techniques the financial outcome is
now considerably different. Rather than
undertake the cost for new surveys, the
same data from years ago could be reutilized, Rouda says.
Similarly, Cindi Howson, BI Scorecard founder, points to the quote,
Study the past, if you would divine the
future. Organizations must analyze
historical trends to be able to plan better,
diagnose problems, and discover opportunities, she says. Further, if equating
old data to transactional data, she
says, this type of data is critical for daily
operations. You can't commit to an order, for example, unless you know inventory on hand, she says.
That said, Howson does believe there's
additional value to gain from real-time

communications and social data. Thus,


companies should be prepared to analyze both new and old data. Often the
real value comes from combining both
data sources, she says. Wang concurs
that old data, which he says is typically very accurate, organized, and siloed, is useful when added to new data. What we're looking for is building a level of
relevancy. You get relevancy through
context. Context comes from location,
roles, relationships, time, sentiment, and
intent, he says.

A Matter Of Context
John Myers, research director with
Enterprise Management Associates,
describes a scenario that depicts how
old data, which he frames as traditional, structured data a retailer might pull from a POS (point of sale) system, combines with new, multistructured data, such as that generated by a customer using the retailer's mobile app.
This might include what the customer
is saying about the retailer via social
networks, portions of the mobile app
the customer is surfing to, and where
the customer is in a location-based-service perspective.
Say a home improvement retailer
wants to cross-sell or up-sell to a customer, Myers says. Here, real-time data
is extremely important. For example,
if the retailer knows the customer has
its app on, is driving down the street
nearing a competitor's store, the retailer
could entice him with a real-time offer
to visit one of its locations instead. If the

customer enters the competitor's store,


Myers says, the opportunity is gone.
This doesn't mean there isn't real-time information to pull from the old-school, traditional data coming from the retailer's POS system to act upon, Myers
says. This information is similar to, but
different from, what might arrive via
a mobile app, he says. For example, the
home improvement retailers traditional
data may show the customer buys lawn
fertilizer every four months. Using this
data, the retailer can identify the customer's preferences and trends and use
them to identify the universe of location-based offerings it could send,
making them contextual to the customer,
Myers says.
Here, the real-time aspect is being
able to apply that universe of cross-sale,
up-sell at a specific point in time, Myers
says. For example, if a retailer doesn't know why a customer isn't responding to offers it's sending to him, is it better to continue sending offers that don't
apply to him or ask what about this
customer's purchases and history can we assume, or what about what's going on in a certain location at a specific time can we assume, might be a good idea to entice the customer with? Myers says.
If the retailer knows the customer owns
a home in Florida and a hurricane is
forecast to strike the area, for example,
it could up-sell the customer on power
generators, plywood sheets, etc., applied
to the specific location and time frame.
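The real-time half of that decision reduces to little more than a geofence check plus a lookup into the customer's precomputed offer universe. The Python sketch below is purely illustrative; the coordinates, segment name, and push mechanism are invented, not any retailer's actual system.

import math

def distance_m(lat1, lon1, lat2, lon2):
    # Haversine great-circle distance in meters.
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

COMPETITOR_STORE = (28.5383, -81.3792)   # hypothetical coordinates
OFFER_UNIVERSE = {"lawn-care": "20% off fertilizer, today only"}

def on_location_ping(customer, lat, lon):
    # Fire the contextual offer only while the moment still exists.
    if distance_m(lat, lon, *COMPETITOR_STORE) < 500:
        offer = OFFER_UNIVERSE.get(customer["segment"])
        if offer:
            return f"push to {customer['id']}: {offer}"
    return None   # once the customer walks in next door, the chance is gone

print(on_location_ping({"id": "c42", "segment": "lawn-care"},
                       28.5380, -81.3790))

The historical data builds the offer universe ahead of time; the location ping merely decides, in the moment, whether to spend it.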

Get Agile
Boris Evelson, Forrester Research
vice president and principal analyst, says
statements about new data being more
valuable or old data being less valuable are essentially simplified ways of
addressing a more complex issue. The new business mantra isn't about historical vs. real-time data or small volumes vs. large volumes, Evelson says. Rather, it's about business agility

THERE'S ADDITIONAL VALUE TO GAIN FROM REAL-TIME COMMUNICATIONS AND SOCIAL DATA; THUS, COMPANIES SHOULD BE PREPARED TO ANALYZE BOTH NEW AND OLD DATA.


in the sense of being able to do something very fast but also being able
to turn around on a dime and rebuild
it from scratch very fast again, because
otherwise we're going to lose our customers to competitors, he says.
Evelson says differentiators between
new, more real-time big data and historical-type data are just one part of the
equation in the overall business agility
theme. Broadly, Evelson says, we've entered the age of the customer, meaning
while internal business processes (finances, HR, etc.) are still important, customer-facing processes really trump all
that. Evelson says today's customer is
empowered with mobile phones and has
cloud-based access to your business and
all your competitors, and they really can
switch from one service provider, or one
product to another, literally with a click
of the button on their mobile phone. So
basically customers rule.
Business agility, or being able to adjust and adapt to ever-changing customer demands literally on a dime or at the speed of thought, is what will set successful businesses apart from less successful ones, Evelson says. Data professionals must ask what they can do to support business agility, he says. The answer is embracing a complete framework that enables agile BI (business intelligence), agile business analytics, and agile big data, he says. Data agility isn't just about technology, including from a historical vs. real-time data perspective, but rather involves different components, Evelson says.

These components include agile software development, meaning quick software development cycles. Even a few days may be too late because conditions may have changed, Evelson says. Agile organization is another component. In the age of the customer, siloed organizations aren't good, but overly centralizing data management and data support doesn't work well either. This, Evelson says, is because centralization always breeds bureaucracy, leading to an inability to react quickly. Agile BI technology is another component. The ability to act in an agile manner on really large data sets involving billions of rows of data is where Hadoop comes in, Evelson says. Any type of big data environment without Hadoop somewhere as part of that equation, we don't really see that as a valid situation.

What we're looking for is building a level of relevancy. You get relevancy through context. Context comes from location, roles, relationships, time, sentiment, and intent.
R RAY WANG
Principal Analyst & Founder : Constellation Research

What's Worth Keeping?

Myers says determining what aging data is worth keeping depends on the organization's perspective, including what monetary value it feels that data holds. For example, detailed call records may hold no monetary value to a telecommunications company after a certain period, but they do hold value to its network personnel in terms of network planning. Myers says such a scenario is where Hadoop becomes a very economical way to say, How do we preserve this longer time frame for data without using the very expensive storage that we have in, say, our enterprise data warehouse or the time-expensive storage of putting it on archive tape?
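
To make that concrete, here is a minimal PySpark sketch of the pattern Myers describes: sweeping aged, detailed records out of the warehouse into inexpensive Hadoop storage where planners can still query them. The paths, column name, and 90-day cutoff are illustrative assumptions, not a prescribed design.

from datetime import date, timedelta
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cdr-archive").getOrCreate()

cutoff = date.today() - timedelta(days=90)  # assumed retention window

# Read detailed call records staged from the warehouse (path is a placeholder).
cdrs = spark.read.parquet("hdfs:///staging/call_detail_records")

# Keep only records older than the cutoff and append them, partitioned by day,
# to a cheap HDFS archive instead of warehouse disk or tape.
(cdrs.filter(F.col("call_date") < F.lit(str(cutoff)))
     .write.mode("append")
     .partitionBy("call_date")
     .parquet("hdfs:///archive/call_detail_records"))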
For Rouda, any and all data relevant to a strategic decision the company is making is valuable. Doesn't matter the source, but the more creative, the more comprehensive, the more thoughtful it makes teams, the more valuable it is, he says. Rouda stresses the importance of analytics and asking good questions, which should yield insights and spawn more questions. Companies seem proud that they're introducing Hadoop and NoSQL and visualization and other new technologies to do this. Yet, merely keeping up with technology isn't the same as transforming the culture of business, he says. I am continuously surprised that companies think merely adopting technology will give them competitive advantage. I think it's only the starting point to changing the way they operate.

If possible and cost effective, Wang would keep all data. Why? There are a lot of patterns that require a lot of data inputs to surface up with, he says. Overall, he views contextual data as being the most valuable. You want to surface up patterns in data. It starts with all types of data, following the information streams, then identifying patterns in the data, asking questions of the data, and ultimately, making the next decision with that data.


Greenovations
ENERGY-CONSCIOUS TECH

The technologies that make our lives easier also produce some unwanted side effects on the environment. Fortunately, many researchers, manufacturers, and businesses are working to create solutions that will keep us productive while reducing energy demands to lessen our impact on the environment. Here, we examine some of the newest green initiatives.

Toshiba and IHI Corp. are designing turbines that will float in, and be driven by, currents in the ocean.

Toshiba & IHI Corp. Selected To Demonstrate Feasibility Of Using Ocean Currents As Renewable Energy Source
NEDO (New Energy and Industrial Technology Development Organization),
an organization in Japan that is responsible for promoting the development and
deployment of new energy and environmental technologies, recently announced it
has selected Toshiba and IHI Corp. to carry out a new ocean current energy initiative. The $501 million project seeks to further develop turbine engines that float
in, and are driven by, currents in the ocean, and that generate electrical power that
can be transferred for use onshore. The project is a continuation of research that
Toshiba and IHI have been engaged in since 2011. The turbines the companies
have been developing are tethered to the ocean floor while they "float" like kites
in the current. This phase of the research will involve testing working models of
the turbines in an open ocean environment to demonstrate the viability of using
ocean currents as a sustainable, renewable energy resource. The half-billion dollar
project has no set ending date but is expected to continue until sometime in 2017.

Israeli Startup StoreDot Wants To Reinvent The Battery
The BBC was recently given a demonstration of new battery technology developed by
StoreDot, a startup company from Israel. The demo showed how a smartphone using
one of the company's batteries that was nearly dead could be fully recharged in just a few
minutes. The batteries are made with synthetic organic molecules that make it possible to
transfer ions faster than in existing batteries. Although the battery in the demo wasn't as
energy dense as existing smartphone batteries (meaning it would not be able to power a
phone as long), the company says it is on track to match the energy density of existing batteries by 2017. StoreDot plans to halve its recharging time by then, as well. Other products in
the works include batteries for electric vehicles that can recharge in less than three minutes. StoreDot plans to demonstrate its car battery technology sometime this year.


Whirlpool's Energy-Saving Dryer

Whirlpool has created a high-capacity (7.3 cubic feet) dryer that it says uses 73% less energy to dry your clothes. The Whirlpool HybridCare Duet Dryer with Heat Pump Technology has a special built-in refrigeration system that it uses to dry and then recycle air instead of venting hot, moist air outside like other dryers. Whirlpool says this innovation not only helps the dryer save energy, it eliminates the need for outside venting, which means the dryer can be used in more locations throughout the home. The dryer has three modes: Speed, when fast drying time is critical; Eco, when energy efficiency is the most important consideration; and Balanced, which blends the two.

Whirlpool's new HybridCare Duet Dryer with Heat Pump Technology has energy-saving features that let you dry your clothes using about one-fourth as much energy as other dryer models.

Toyota Opens Patent Gates

In January, Toyota announced it would make 5,680 of its worldwide patents related to fuel-cell technology available for other companies to use royalty-free. A number of the patents are for technical innovations developed for the Toyota Mirai, the company's new hydrogen-based FCV (fuel cell vehicle). Toyota says it hopes making the patents royalty-free will spur faster development of other FCVs and related technologies.

Google recently unveiled the first fully functional prototype of its driverless car on its self-driving car page on Google Plus.

Google Shows Off Working Prototype Of Its Autonomous Auto
Google announced to the world in 2010 that its engineers were working on driverless car technologies. Since then, it has used a number of vehicles to develop and test
various configurations of the hardware and software needed to replace the driver in an
automobile. The company also was working on a new look for the automobile itself,
as it said fully autonomous cars in the future wouldn't need to have features such as a
steering wheel or gearshift, as there would be no "driver" to need them. The company
released an early mockup of its driverless car design in May 2014. Now Google has put
together its first fully functional prototype that includes all the driverless technology
(sensors, GPS, laser-based radar system, software) as well as the other fundamental
things you need to have in a car, such as steering and braking systems. The company
said it would begin test runs of the latest prototype immediately. The first model of
Google's driverless car is a two-seater, and it will be for city driving, as the company has
capped its top speed, for now, at 25 mph. And if you're wondering why a driverless
car that relies on radar needs headlights, Google says it's so other people can see you,
because they're not necessary for the driver. Google reports that its driverless car's radar
can detect objects as far away as two football fields from all sides.

Solpro Says Its New Helios Smart Is The World's Fastest Mobile
Solar Power Recharger, Able To Recharge Phones In 90 Minutes
Solar-power charger company Solpro says its new Helios Smart solar-powered charging unit is able to charge products more than four times faster than other mobile solar-powered chargers and can fully recharge a depleted smartphone battery in about an hour and a half of sunlight. Able to charge smartphones, tablets, music
players, and similar mobile products, the Helios Smart also can charge two products
at once. The charger comes with built-in patent-pending technology that allows it to
examine multi-product charging situations and prioritize charging order so that attached devices are charged as quickly as possible. A special trifold design allows for the
maximum solar surface in a minimal space,
and the charger can be folded up like a wallet
to fit into your pocket. At night, you have the
option of recharging the charger using USB.
The Helios Smart weighs 0.63 pound.


Recover Value From IT Assets

OPTIONS FOR DEALING WITH UNWANTED HARDWARE

IT HARDWARE ASSET RECOVERY is often overlooked, despite the fact that it represents an opportunity for enterprises to recoup value from unwanted equipment. Furthermore, poorly implemented IT asset recovery can deal an organization's reputation and bank account a crippling blow in terms of noncompliance with environmental and data-privacy regulations.

Broadly, IT asset recovery, or ITAD (IT asset disposition), involves securely repurposing, donating, recycling, or destroying IT equipment. It's important to note that a company's responsibility for its equipment doesn't end after that equipment exits its doors. A hard drive containing unencrypted personal data lost while in transit to an ITAD provider's facility, for example, can mean lawsuits and fines if data is exposed. Equipment irresponsibly tossed in a landfill can lead to the same. This article explores why IT personnel and executives should take interest in IT asset recovery.


Recover Value

Hardware typically covered under IT asset recovery includes PCs, laptops, servers, monitors, fax machines, copiers, printers, smartphones, and tablets. Increasingly, wearables and IoT (Internet of Things)-related devices are also included. For many companies, donating such equipment to charities, schools, and the like is a viable disposal option with possible tax breaks. Further, says Sandi Conrad, Info-Tech Research Group director, many ITAD providers will manage the process, including properly licensing OSes, ensuring equipment works, and transferring the equipment.

Traditionally, though, IT asset recovery has meant getting value back from unwanted equipment. This is changing as useful life spans for equipment are extending. Companies, for example, are keeping PCs and servers five or more years vs. three. Thus, ITAD providers are receiving older assets with less resale value. Conrad says some disposal companies do work to recover value beyond equipment's seven-year range, though recovering significant value is less likely if the process extends this long.
Also impacting the recovery value of PCs and laptops currently is that more companies are replacing them with mobile devices, thus driving PC/laptop prices down and diminishing their recovery value, says Rob Schafer, Gartner research director. In general, recovering value may be possible for four- and five-year-old assets if an ITAD provider excels at extracting value from the precious metals used in them, although the precious metal market is also declining due to manufacturers using less of such metals.

An oddball exception to the declining recovery trend is mobile devices, Schafer says, primarily because end users tend to replace them after two or three years, well short of their actual useful life. Thus, ITAD providers receive younger equipment. Further, mobile devices have smaller footprints, making them more economical to ship to where resale markets reside. Many larger ITAD providers have integrated mobile devices into their existing documentation, dismantling, recycling, and resale processes, Conrad says.

Given the declining recovery value for most IT hardware, Schafer says, executives expecting to receive checks and not invoices following recovery processes should change their thinking because the trend is the exact opposite: disposal costs may well outweigh recovery revenue. What's more, the costs of properly performing ITAD (meaning compliance with local data-security and environmental regulations and ensuring strong, secure chains of custody and transportation) are increasing substantially, he says. Therefore, Schafer recommends realistically budgeting disposal costs up front, when purchasing assets.

The Associated Risks

Although seeking recovery value is understandable, executives must also remain focused on their brands where risks associated with equipment recovery and disposal are concerned. Risks cover two primary areas: data security and environmental responsibility. Failure to safeguard either area could mean negative publicity that isn't good for the enterprise, Schafer says.

Until recently, it was fairly normal for organizations to just have IT equipment picked up, after which it wound up in a landfill in a third-world country where locals dismantled it for precious metals. Such practices led to poisons leaching into water supplies and other environmental hazards. More recently, laws have been enacted to greatly curb these practices. It still happens, Conrad says, but companies that get caught breaking laws face huge fines. Laws do vary by state and country, she says, thus it's vital that companies stay informed about the laws that apply to them. At minimum, organizations should ensure that any ITAD partner is familiar with local and global laws.

For North America, e-Stewards and R2 (Responsible Recycling) certifications are the predominant guidelines for recycling electronics. When seeking an ITAD provider, look for certification with one or both. According to Schafer, Gartner doesn't consider one superior to the other, although he feels it's unfortunate there are those two bodies because there's a good 90% overlap in the certification requirements.
Where data security is concerned, companies should validate exactly what process a provider uses to dispose of data on memory and drives taken from PCs, laptops, servers, copiers, printers, and other equipment. Traditionally, DoD 5220.22-M was the standard in this domain, Schafer says, although NIST 800-88 has largely replaced it.

Also important is verifying the chain of custody and the transportation logistics a provider uses for such drives. Schafer says these areas, which represent the bulk of total disposition costs, entail securely packing and shipping assets to the provider's facility. Chain-of-custody particulars also include the encryption the enterprise itself uses for drives. Preferably all enterprise data is encrypted for the process, Schafer says, because monitoring which drives are and aren't encrypted is a nightmare.

Logistics details can include whether a provider seals drives at the company and performs a one-to-one serial number match at its facility, which is expensive but especially secure, Schafer says. The inverse occurs when, for example, a provider packs drives on a furniture truck that makes 11 stops on the way, he says. If half the assets show up, count yourself lucky. In other words, you get what you pay for, security-wise. Because some ITAD providers use third parties for transportation, organizations should ensure that transport employees have been well-vetted and that background checks have been performed.
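
That one-to-one match is easy to picture in code. Below is a simple Python sketch, with made-up serial numbers, of reconciling the manifest of drives sealed at the company against what the provider reports receiving; a real program would read both lists from the shipping and receiving documentation.

def reconcile(shipped, received):
    # Compare the outbound manifest against the provider's receiving report.
    shipped, received = set(shipped), set(received)
    return {
        "missing": sorted(shipped - received),     # sealed and shipped, never arrived
        "unexpected": sorted(received - shipped),  # reported received, never shipped
    }

print(reconcile(["SN1001", "SN1002", "SN1003"], ["SN1001", "SN1003"]))
# {'missing': ['SN1002'], 'unexpected': []}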

What's In A Provider

Among the positive traits to look for in an ITAD provider is its ability to help calculate the value an organization can expect to recover from its equipment. For example, a hardware manufacturer may offer to take equipment back for free at time of disposal as part of the purchase price, but this route could mean the manufacturer, rather than the organization, profits from disposed equipment. ITAD providers can generally help determine the right time to dispose of equipment for the most value, Conrad says.

Elsewhere, organizations should conduct site visits for prospective ITAD providers, viewing how they handle and dismantle equipment and noting whether environmental requirements are followed. Also ensure a provider can supply a list of disposed equipment, complete with serial numbers, that notes the state of each piece of equipment. For example, if disposing of 500 PCs, the provider should note that 200 were resold, 100 recycled, and 200 destroyed, Conrad says.

In general, check for equipment destruction, transfer, recycling, and other certifications; how equipment is packaged, refurbished, and shipped; if equipment is sold in bulk or individually; and how equipment is cleaned, licensed, and restored to peak quality. All these little things will increase the resale value, Conrad says.

Location-wise, Conrad says a local ITAD provider isn't necessarily a worse choice than a national one, though local providers probably use partners that must be checked and lack the multiple facilities a larger provider likely possesses; a national provider also means just one contact and contract to manage vs. potentially many if using numerous local providers.


Add A Guest Wi-Fi Hotspot

ESTABLISH A SEPARATE, PUBLIC WIRELESS NETWORK FOR GUEST ACCESS

FOR BUSINESSES, GUEST Wi-Fi hotspots are great for providing customers and visitors with an Internet connection without giving them access to the company's private network. Home users can even benefit from setting up a separate Wi-Fi hotspot to provide wireless access to their Internet connection when friends or family come over to visit. Although this article is geared toward small to midsize businesses, it includes information that is also applicable to larger businesses and consumers.

We will describe what you need to get started, as well as what to expect along the way. Setting up a guest Wi-Fi hotspot is a relatively simple task, whether you want to use an existing router or consider a new model that includes features that make it more appropriate for sharing a Wi-Fi connection.

Internet Connection
A strong, stable Internet connection is the foundation on which you will set up your guest Wi-Fi hotspot. Having a good connection that works properly will stave off potential questions and complaints, particularly if you go out of your way to advertise your free wireless network to customers or visitors.
Pin down the approximate number of simultaneous connections you expect the hotspot to support at any given time and use that figure to determine (1) what type of Internet connection you need and (2) what router capabilities are necessary to handle the expected amount of traffic.

Not all Internet services are created equal. For example, the most basic Internet services available from ISPs (Internet service providers), either broadband or DSL (digital subscriber line), offer speeds that tend to start at between 5Mbps (megabits per second) and 10Mbps. In some cases, the reported speeds are just the peak; actual average speeds can be much lower. In the context of the modern Internet, loaded with streaming media and multi-tab browsing, these speeds are on the lower end of acceptable. To give you an idea of just how slow they are, a 25MB video downloaded on a 5Mbps connection could take as long as 40 seconds. Using a 50Mbps connection, that video will arrive in as little as 4 seconds.
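
The arithmetic behind those estimates is worth keeping handy when sizing a connection: convert the file size to megabits (1 megabyte is 8 megabits) and divide by the line rate. A quick Python sketch:

def download_seconds(size_mb, line_rate_mbps):
    # Ideal transfer time for a file of size_mb megabytes over a
    # line_rate_mbps link; real transfers run longer due to protocol
    # overhead and shared bandwidth.
    return (size_mb * 8) / line_rate_mbps

print(download_seconds(25, 5))   # 40.0 seconds on a 5Mbps line
print(download_seconds(25, 50))  # 4.0 seconds on a 50Mbps line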
On the high end, broadband services can reach data transfer speeds as high as 150Mbps or more, which is more than enough to handle five or more simultaneous connections. Most wireless routers will limit the number of connected users permitted to use the guest network simultaneously.

If you anticipate that your guest Wi-Fi hotspot will attract a large number of users, you may consider upgrading your service to fiber optic or designating multiple dedicated broadband connections. Note that fiber optic Internet service may have limited availability in your area.

Wireless Router

If you plan to use one wireless router to support two separate networks, look for a business-class router that supports a maximum data transfer speed of at least 300Mbps; some models will support much faster rates. The most recent Wi-Fi standard available as we went to press is 802.11ac, which has been shown to offer real-world data transfer rates of 600Mbps and better when communicating with devices using 802.11ac adapters.

A dual-band router is a good option because it operates on both the 2.4GHz frequency (which most current and older devices support) and the 5GHz frequency (which 802.11n/ac devices support, and which offers the fastest data transfer speeds). Most dual-band routers offer the choice of broadcasting in 2.4GHz, 5GHz, or both simultaneously to support the widest range of devices and prevent signal interference.

Many consumer- and business-oriented routers let you easily create two separate networks: one you can use for your company's internal network, and one you can use to provide wireless Internet access for guests with Wi-Fi-enabled devices. Wireless routers and access points can range in price from $20 to as much as $300 or more depending on their speed and feature sets, so make sure you only pay for functionality that you and your customers or visitors are likely to use.

Software

All wireless routers come with a software- or firmware-based user interface that enables you to change the router's settings. Some software might be included on a disc or accessible only via a Web browser and the router's default IP address. When attempting to access the settings menu, always follow the instructions specific to your device. Most often, you can launch a Web browser on a computer connected to the router, type the IP address for your router into the address bar (a commonly used IP address is 192.168.1.1, but this can vary by device), log in, and then manage the router's settings as desired. If the router your organization uses doesn't support this feature, you may be able to install third-party firmware to set up standard and guest networks and tap into numerous additional features. CoovaAP (www.coova.org) and DD-WRT (www.dd-wrt.com) are two examples of free firmware that you can install on select routers.

Setup

Once you have all of the hardware and software you need for a Wi-Fi hotspot, it's time to get it up and running. Most mainstream routers sold in the U.S. have software that is intuitive to use and makes it simple to add a guest hotspot. With some routers, it's as easy as clicking Yes during setup to enable and allow guest access, but with other routers you may need to follow more steps or launch a special setup wizard. Because there are variations in this process depending on the device and manufacturer, check the manual for instructions specific to your router.

In the process of using the router's software to establish a guest Wi-Fi hotspot, you will discover relevant settings that provide you with further control over the hotspot. Some devices allow you to set specific days of the week and ranges of time during which the guest network can be accessed. Use these settings to make sure the hotspot is available when you're open for business and not available when you're closed; this prevents unwanted, unauthorized access.

Security

Perhaps the most important consideration when setting up a public Wi-Fi hotspot is making sure that your guest hotspot is separate from your company network. Most routers support WEP (Wired Equivalent Privacy), WPA (Wi-Fi Protected Access), and WPA2 technologies, which provide for encryption and password protection. Use one of these settings as a minimum safeguard against potential unauthorized access or abuse; we recommend WPA2, as it provides the best security.
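
What that looks like under the hood varies by router, but as a rough illustration, here is a minimal Python sketch, assuming a Linux-based access point running the open-source hostapd daemon, that writes a guest-network configuration enforcing WPA2 on its own SSID; the interface name, SSID, and passphrase are placeholders.

# Writes a minimal hostapd configuration for a WPA2-protected guest SSID.
# wlan1, AcmeGuest, and the passphrase below are placeholders to replace.
GUEST_CONF = """\
interface=wlan1
ssid=AcmeGuest
hw_mode=g
channel=6
wpa=2
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP
wpa_passphrase=ChangeMeOften!
"""

with open("hostapd-guest.conf", "w") as f:
    f.write(GUEST_CONF)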
From a guest's perspective, Wi-Fi security means that they will have to select the SSID (service set identifier, or network name) if it's visible, or type it in when connecting if the SSID is hidden, as well as enter a password in order to log on to the network. To simplify matters, make sure your guests know how to obtain the password, and change the password on a regular basis.

Requiring guest users to accept a ToS (Terms of Service) agreement can also be beneficial. You can do this by employing a captive portal, a common feature of business-class routers, which is essentially a splash page users will see on their device screen when logging on to the network.
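
To illustrate the concept, here is a toy captive-portal splash page built in Python with the Flask microframework; a production portal, usually built into the router, would hook the acceptance step into the firewall rather than keep a list in memory.

from flask import Flask, request, redirect

app = Flask(__name__)
accepted = set()  # client IPs that have accepted the ToS

TOS_PAGE = """
<h1>Guest Wi-Fi Terms of Service</h1>
<p>By continuing, you agree to use this network lawfully.</p>
<form method="post" action="/accept"><button>I Agree</button></form>
"""

@app.route("/")
def splash():
    # Show the splash page until this client has accepted the terms.
    if request.remote_addr in accepted:
        return "<p>You're online. Happy browsing!</p>"
    return TOS_PAGE

@app.route("/accept", methods=["POST"])
def accept():
    accepted.add(request.remote_addr)  # a real portal would open the firewall here
    return redirect("/")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)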

Other Considerations

Using router settings, you can impose bandwidth controls to protect your other networks from experiencing a bottleneck. You can also set guest connection time limits, designate which websites or applications are permitted to use the hotspot, and even charge fees for using the hotspot. Once you have installed any necessary hardware (routers, range extenders, and access points), adjusted the settings, and turned on your guest Wi-Fi network, it's ready to be discovered and used.
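
On routers that expose a Linux shell, the bandwidth-control idea can be as simple as attaching a rate limiter to the guest interface. A minimal sketch, assuming the guest network lives on its own interface (wlan1 here, an assumption) and using the standard Linux tc traffic-control tool:

import subprocess

def cap_guest_bandwidth(interface="wlan1", rate="10mbit"):
    # Attach a token-bucket filter that caps traffic on the guest interface
    # so guests can't starve the company network of bandwidth.
    subprocess.run(
        ["tc", "qdisc", "add", "dev", interface, "root",
         "tbf", "rate", rate, "burst", "32kbit", "latency", "400ms"],
        check=True,
    )

if __name__ == "__main__":
    cap_guest_bandwidth()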


Unified Threat Management Solutions

THE ALL-IN-ONE APPROACH TO SECURING THE PERIMETER

KEY POINTS

UTM (unified threat management) has traditionally appealed to SMBs due to solutions integrating multiple security tools in one appliance.

For security personnel, UTM's consolidated approach means only one appliance to monitor and manage.

Depending on the solution, there can be significant overlap in features and functionality among UTM and next-generation firewall solutions.

NGFWs (next-generation firewalls) are generally considered to offer higher performance than UTMs and are geared toward enterprise organizations.


SIMPLICITY. COMPREHENSIVENESS. Consolidation. Integration. Centralized control. These are but a few terms used to describe UTM (unified threat management), a security approach traditionally popular with smaller companies due to its integration of multiple security technologies in one appliance. For companies, UTM means having just one solution (vs. many) to install, monitor, and manage.

Increasingly, larger enterprises are adopting UTM solutions, albeit those generally offering advanced features. It's these features that are largely the source of confusion about how UTM differs from NGFW (next-generation firewall) offerings. This article explores these and other issues concerning UTM.

What UTM Really Means

In one sense, UTM is about getting the most bang for the security dollar. As Adam Hils, Gartner research director, says, UTM is about getting a variety of network security safeguards for the best price. UTM solves some security use cases splendidly, he says. SMBs (small to midsize businesses), particularly those with 1,000 or fewer users, that value a limited number of low-cost management platforms are primary UTM candidates, especially if their networks aren't complex and firewall rule sets are relatively simple, Hils says.

Fewer security consoles generally means fewer administrators needed to oversee them. All these things should result in a more abstracted management environment with an easier-to-use management plan that allows small teams to work more efficiently, says John Kindervag, Forrester Research vice president and principal analyst.

Overall, UTM specializes in solving security problems at the perimeter. Beyond technologies that address firewall challenges and Web, email, and application security, many UTM vendors are adding features such as cloud-based sandboxing and DLP (data loss prevention) to address various internal security challenges, says Rob Ayoub, NSS Labs research director. UTM is also useful for those seeking high-performance ability to do a full-packet inspection in one go, says Jessica Ireland, Info-Tech Research Group research manager. A UTM [solution] can inspect up the stack with little to no effect on a network's performance. This means packets can be inspected quickly and workflow isn't affected, she says.

At minimum, a UTM solution should provide firewall, application security, and IPS (intrusion prevention system) services, according to Ayoub. Gateway antivirus, email security, Web content filtering, and SSL (Secure Sockets Layer) VPN (virtual private network) components are possible depending on the solution. Chris Rodriguez, Frost & Sullivan senior industry analyst covering network security, says some vendors are adding WAF (Web application firewall), DLP, and NGFW functionality.
Disagreement about how UTM differs from NGFW essentially stems from an overlap in features. Ireland says that while semantics is partially at play, the difference really resides in which solution works for specific organization types. Traditionally, UTM has been targeted at SMBs, but solutions have expanded in capability and are crossing into NGFW territory. Still, some UTM vendors primarily target SMBs. NGFW is typically aligned with large enterprise-sized organizations looking for a higher-performance solution that can perform deep-packet inspection up the stack, Ireland says.

Rodriguez sees NGFWs as integrating a stateful firewall in addition to application control, user identity-aware controls, and IPS. Since the rise in popularity of NGFWs, UTM vendors have also developed their own versions of NGFW, he says. Most UTM vendors offer NGFW-like functionality, and due to hype around NGFWs, many are happy to sell an NGFW, he says. Essentially, though, a UTM solution can be considered an NGFW, but an NGFW is not a UTM, he explains.

Ayoub depicts the UTM vs. NGFW debate as an example of what happens when history and marketing collide. Functionality-wise, there's no difference between the two, he says. Development-wise, the market typically categorizes UTM as targeting SMBs and NGFW as targeting enterprises, but on paper and in our testing there's no difference in efficacy, he says.

Quite frankly, UTM has evolved into NGFW for the most part, Kindervag says. Forrester Research now only thinks of UTM as NGFWs due to advancements in UTM, he says. Any differences that remain are tied to the quality of features, he says.

What's In A UTM Solution

In general, UTM solutions contain security OSes in a hardware appliance. Still, as Ayoub explains, there is no typical UTM solution. The hardware, for example, may be purpose-built and use ASICs (application-specific integrated circuits) to perform specific functions, or be more generic and use COTS (commercial off-the-shelf) processors to keep cost down and maintain product flexibility. ASICs are considered to provide the highest level of performance because they use hardware to accelerate specific functions that a general processor would perform at an average rate, Rodriguez says. At least one UTM vendor using generalized hardware, however, claims it's closing the performance gap by using integrated features in COTS processors, he says.

Software in a UTM solution can be a combination of technologies from different vendors or from one vendor, Ayoub says. The one common thread among all solutions today is they require some sort of subscription service to keep the device up-to-date with current threat information, he explains.

Beyond the traditional hardware-software appliance, virtual UTM appliances are gaining momentum, Rodriguez says. These lack some of the performance acceleration that dedicated hardware provides, but they're necessary to inspect traffic between virtual machines and other virtual environments, such as public and private clouds, he says. Elsewhere, Ireland expects companies will start looking to the cloud for UTM, though presently customers are still hooked on the buzz of it.

Problems Solved

The primary problem UTMs were designed to deal with was that there were just too many security applications at the gateway, and it was becoming difficult to manage them all, says Charles Kolodgy, IDC research vice president, secure products. For example, an organization might have had five pieces of hardware. UTM was designed to reduce the number and centralize management.

Rodriguez says the appliance sprawl organizations faced resulted in administrators being overwhelmed with managing so many solutions, to the point they couldn't.


Appliance sprawl also taxed security budgets. Even worse, a host of network security solutions that aren't talking to each other leaves security gaps for threats to enter, Rodriguez says. Ireland says, for the most part, the move toward a consolidated security approach is the trend.

UTM is best suited for deployment in environments that lack a full-time security staff to install and configure appliances, Kolodgy says. Plug and play. So [UTM solutions] are very effective for small businesses, medium-sized businesses, and branch offices. This includes restaurants and other franchise businesses, he says.

While the value UTM affords can make sense for enterprise organizations, Rodriguez says, these larger organizations typically have bigger security budgets and strong security teams. They place greater emphasis on brand names that are proven to perform well and reliably, he says. Still, UTM vendors are steadily gaining traction with enterprise customers, he says.

Although the increased performance UTM vendors are including in appliances makes solutions suitable for enterprises, Ayoub cautions that deploying UTM in such environments doesn't come without having to meet other requirements concerning central management and policy deployment. Such issues could determine whether a UTM is a good fit for a particular company.

UTM Pros & Cons

For administrators, simplicity is arguably the most notable benefit UTM provides. Kevin Beaver, an independent information security expert, says this is a great antidote to the growing complexity of network environments today. Specifically, UTM enables administrators to push out rules to multiple capabilities. Ireland says rather than deal with inconsistencies among solutions, administrators can apply the same rule set, thus creating fewer potential vulnerability gaps.

On a related note, administrators can control multiple functions from one interface and learn only one tool from one vendor.
For many small businesses, Hils says, regulatory demands drive requirements to administer multiple security functions, which mandates rudimentary levels of security controls. Browser-based management, basic embedded reporting, and localized software and documentation that don't specifically appeal to large enterprises are highly valued by SMBs in this market, he says.

Another key benefit of UTM is the ease with which administrators can add new features, Ayoub says. A single software upgrade is usually all that's needed to add new features and functionality across a wide range of capabilities. The threat of policy mismatch is minimized, he says. Further, one console means just one throat to choke in terms of troubleshooting with one vendor vs. multiple ones. Conversely, a UTM solution can also be a single point of failure, Beaver says. Also, relying on a single vendor for all of your security controls may not be the best fit for your situation, he says.

Rodriguez says that ideally a UTM solution will go beyond consolidation benefits and provide integration that enables breaking down security silos. This is important now because security policies must be complex due to the dynamic nature of networks, which can involve remote employees requiring access, mobile devices, social media, and cloud services. Firewall policies that address who is connecting and from where, as well as what Web applications individuals are using, are important starting points, but the ability to determine that a device isn't infected (via IPS) and to look for sensitive data leaving the network (via DLP) is also important, he says.

The UTM Market

Ayoub views the UTM market as growing at an aggressive pace. As security threats continue to penetrate traditional defenses, many organizations are upgrading legacy perimeter security products to UTM to utilize the latest advancements in detection and protection, he says. As more solutions enter the market that are suitable for enterprises, he says, we believe this market will continue to show strong growth for the foreseeable future.

IDC valued the UTM market in 2013 at $3.5 billion and projects it will reach $6.3 billion by 2018, Kolodgy says. Hils dubs the market robust and growing at roughly a 15% rate. The UTM market is highly penetrated in the SMB segment, so some vendors are trying to find budget-constrained small enterprises to appeal to, Hils says. We've seen some limited success in this segment. As years go by, vendors are asked to add more features to their UTM platforms without raising prices significantly.

Rodriguez says UTM is adapting to new security threats, most notably with vendors adding advanced malware and APT [advanced persistent threat] protection features. Some [vendors] develop these capabilities internally while others develop partnerships, he says. With the rapidly changing nature of technology and cyber threats, Rodriguez expects the UTM market will continue to see an ongoing period of development.

As for where the UTM and NGFW markets are colliding, Ireland says a bit of a divide still exists. They haven't completely merged yet (even if just from a marketing materials standpoint), but we'll see more of a movement toward NGFW: next-generation everything, even IDPS [intrusion detection and prevention systems], she says.

Safety In A Virtual World

WHAT TO KNOW ABOUT SECURITY IN A VIRTUALIZED ENVIRONMENT

KEY POINTS

Many CISOs and executives aren't aware of the security risks possible in a virtualized environment. Some who are may be limited in their authority and ability to protect against them.

One concern regarding security in virtual environments is attackers gaining access to a virtual machine console.

A key issue that executives face is their organization's security tools not extending coverage, management, and policy creation into virtual environments.

Finding a consulting agency that employs penetration testers versed in testing virtualized environments is one approach to starting to improve security in a virtualized environment.

VIRTUALIZATION HAS UTTERLY reshaped how organizations operate their IT infrastructure. As experts point out, the technology is mature and rapidly approaching mainstream status, if it's not already there. Experts also attest to the many benefits virtualization enables, including reducing physical servers and thus realizing power, cooling, and space savings, as well as faster provisioning of resources, deployment of applications, and to-market times.

Less positive and publicized, however, are the vulnerabilities and security worries a virtualized environment can introduce. The Virtualization and Security series of MIT Geospatial Data Center blog posts, for example, details various concerns related to the hacking and manipulation of VMs (virtual machines) and compromise of applications hosted on VMs. Also less positive is a belief among experts that many CISOs aren't aware of such issues and that some who are may be limited in what they can do about them.

Growing Concern

While publicity may be lacking, concerns about security threats in virtual environments are real. According to Kyle
Prigmore, Enterprise Strategy Group associate analyst, 69% of 315 enterprise IT
and information security professionals
the firm surveyed in 2013 reported being
concerned or very concerned about a
virus or other type of malicious code infecting desktops or servers.
Gartner research director Eric Ahlm, meanwhile, detailed in a recent article five security concerns related to server virtualization that impact network security, one being the use of non-virtualized network security controls in a virtual network, something that can break the cloud. Another concern is a vulnerability in the hypervisor circumventing all network security contained within it. A similar risk resides with the concept of the virtual environment's super admin, who could override virtual network access controls without impediment, Ahlm writes.


The MIT Virtualization and Security blog notes concerns related to confidentiality, integrity, authenticity, availability, and nonrepudiation. With regard to confidentiality, the blog states the ability to look inside a virtual machine can be hacked, leading to complications resulting from VMs being transferred among different physical hosts. In terms of availability, there's a concern that the availability of applications housed on VMs could be compromised if a virtualization environment is attacked.

What's At Stake?

To illustrate the seriousness of these concerns, The CISO's Guide To Virtualization Security from Forrester Research details a 2011 security incident in which a fired IT administrator at a pharmaceutical company used a service account to access the company's network. Using an unauthorized installation of a virtualization platform, the former employee deleted 88 virtual servers that held a majority of the company's computer infrastructure, effectively freezing operations for days and leaving employees unable to ship product, cut checks, or communicate by email.

Forrester states that while many CISOs aren't aware of virtualization security risks, some who hold concerns lack the authority or influence regarding infrastructure and operations to enforce policy or implement new security controls. Forrester also writes that most security professionals possess limited knowledge about the usefulness and availability of virtualization-aware security solutions. Some security professionals, especially CISOs and other security leaders who have risen up the technical ranks, lack the confidence in virtualization knowledge they desire, according to Forrester. This is particularly the case when we compare virtualization with more mature security areas, such as network security.


Adrian Sanabria, 451 Research senior security analyst, also believes there's a general lack of awareness, primarily because CISOs and other executives aren't aware of all the options available. There are critical differences between virtual and physical environments that affect security, regardless of whether they are hosted or internal, public or private, he says.

Edward Haletky, The Virtualization Practice principal security analyst, says that although awareness is lacking, we need to take this a step further and ask, Do they possess enough awareness about hybrid cloud security risks? Hybrid cloud is currently virtualization within the data center, within the cloud, and on the device. CISOs and executives may try to cover one of these elements but miss the second and third, he says. The true risk to a virtual environment is access to the virtualization management constructs (portals, servers, etc.), he says. The lowest-hanging fruit is to secure those environments. The same holds true for cloud environments, actually.

Chief Dangers

Compared with traditional security configurations, dangers for a virtualized environment can differ. Virtualized components are just like their physical counterparts but with several additional items to worry about, Sanabria says. Beyond potential vulnerabilities concerning a physical server, there's a worry about the guest tools software installed on its virtual counterpart, he says. Further, there have been a few virtual machine escape vulnerabilities that could allow an attacker to jump from one virtual guest machine to another or allow the virtual machine host operating system to be attacked, he adds.

Any attacker who acquires access to a VM or cloud console could cause significant damage, Sanabria says, including exfiltrating entire virtual drives and guest machines or instances coming under the attacker's complete control. For example, every virtual machine on a host could be stopped and deleted in minutes if there are no controls in place to prevent mass changes to the virtual environment, he says.

Haletky believes badly managed virtualization management security is actually the chief security danger in a virtual environment, something that could include not enforcing least-privilege access within directory services. If the management layers are open to everyone, then your environment will be broken into, he says. The next biggest danger is logging into the management console on the hypervisor without reason, he says. The only need to log in is to fix a hardware problem, not to manage the system, Haletky says. Someone with such access could delete large swaths of virtual machines in seconds using well-known scripting tools. Thus, enacting proper least-privilege, role-based access controls with real-time auditing to determine when such an event occurs is a must.
The virtual machine escape vulnerability Sanabria describes, also known as escape the virtual machine, has received considerable attention. VM escape is commonly described as an exploit that enables an OS running within a VM to break out, or escape. The OS could then interact with a hypervisor directly, ultimately giving a hacker access to the host OS and the VMs operating on that host.
Haletky says it's true that considerable work is happening to subvert the hypervisor, but most attacks in a virtualization environment occur outside the realm of virtualization or the cloud. Generally, attacks are focused on the typical targets, such as websites and databases. Haletky says there aren't many escape-the-VM-like attacks in the wild that actually work effectively. It's far easier to attack using well-known attack methods, he says.

From the attack perspective, Haletky says there aren't any real differences between virtual and physical environments. From the response side, though, there are many new ways to respond within a virtual environment. One is the ability to monitor at layers not previously available and implement solutions quicker, as well as utilize built-in automation and ready-to-go images, or templates, he says.

Similarly, Ahlm doesn't see new or different risks in virtual environments vs. physical ones, bar the hypervisor trust being compromised, which has yet to see big issues. The biggest problem executives currently have is that their existing security investments may not reach into their virtual environments well, including extending coverage, management, and policy creation to all desirable virtual environments, he says. In short, Ahlm says conducting a security controls evaluation is warranted.
Another issue is how security controls will impact the benefits that virtualization enables. For example, if a security control is introduced into the virtual environment but greatly limits how fast the virtual environment works or hinders virtual systems operations, there are problems, Ahlm says. Security controls that are embedded, or otherwise integrated into the virtual systems management, do better than those that are just standalone security islands, he says.
Sanabria says historically, attackers haven't singled out virtualized environments, though they seem aware of the value of virtual machine/cloud consoles that let them control or damage large amounts of infrastructure from a single interface. Incident-wise, he says, virtualized environments do make some forensics tasks as simple as making a read-only copy of a file. Further, the nature of virtualization makes it easy to quarantine VMs by moving them into a group with more limited access. Also, the machine can be paused, stopping any threat of exfiltration or malicious activity, while preserving artifacts important for the investigation, he says.
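
As a concrete illustration of that pause-and-preserve step, here is a minimal sketch using the open-source pyVmomi library for VMware vSphere; the host, credentials, and VM name are placeholder assumptions, and a real response playbook would also move the VM into a restricted network group.

import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

# Lab use only; production code should verify certificates.
ctx = ssl._create_unverified_context()
si = SmartConnect(host="vcenter.example.com", user="ir-team",
                  pwd="secret", sslContext=ctx)
try:
    content = si.RetrieveContent()
    # Enumerate all VMs and suspend the suspect one, freezing any malicious
    # activity while preserving its memory and disk state for investigation.
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.VirtualMachine], True)
    for vm in view.view:
        if vm.name == "suspect-vm-01":
            task = vm.SuspendVM_Task()
            print("Suspend requested:", task)
finally:
    Disconnect(si)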

Mistakes & Advice

A common mistake Sanabria sees companies make is protecting VMs and physical systems but forgetting to protect the virtual console, which usually has full access to all the systems it hosts regardless of how good the security on them is. He uses an analogy of an impeccably protected data center hanging 40 feet above the ground from a single rope to illustrate the importance of protecting the virtual console. The rope represents the VM console and is a single point of potential compromise or disaster if the wrong person obtains access to it.

Sanabria cites a company whose computing resources, website, billing system, databases, and even backups all existed with a cloud computing services provider. Although the company believed it was well-protected, an attacker gained access to the provider's console and cut the rope, deleting all of the company's cloud-based resources permanently and leaving it without a product over a span of a few hours.

One way a security approach that takes virtualization into account can improve upon a traditional security approach is by simply taking what's known about traditional security and extending it into the virtual environment, Haletky says. Virtualization, however, also provides the ability to move security out of the OS and into a substrate below, he says. This allows me to hide security measures from those who can access the virtual machine and therefore still have security but without giving away how it is accomplished.

Done correctly, virtualization can be a net gain to traditional security and change our approach, Sanabria says. I've seen some vendors that remove administrative access from production systems or configure them so that configuration changes cannot be made, he says. Any changes or administration functions are then performed on nonproduction systems and promoted to production, replacing the previous machine, which is then destroyed. Such approaches could revamp how security is viewed and reduce threats to systems, he says.

A question an executive should ask going forward, Ahlm says, is who will be the buying center for virtual system security: security teams or virtual systems operators? Ahlm believes investigating both options is worthwhile.


The Path Of Identity Theft

HOW IT HAPPENS & THE ACTIONS YOU'LL NEED TO TAKE

KEY POINTS

Documents that you throw out, such as old financial and medical records, might provide enough information for a criminal to steal your identity and access your accounts.

Be particularly careful with health care documentation, as it might contain birth dates, policy numbers, and billing information that's extremely valuable to identity thieves.

Many identity thieves are interested in gaining access to your finances, while others will impersonate you for criminal activities.

Collect all evidence of identity theft, including canceled checks, suspicious emails, and credit card statements.


IT'S EASIER THAN YOU might think for criminals to obtain your personal information. For example, a single computer spyware infection can give a cybercriminal the opening he needs to track your online activities and keystrokes. A malicious person can install stealthy spyware or keystroke logger programs that run in the background on your computer, allowing him to monitor your online and keyboard activity as you enter usernames, passwords, and credit card numbers. Non-technical criminals can get the information they need to impersonate you simply by rummaging through your trash in search of old mail, medical records, and other personal documents. Even if you make certain to secure your PC and shred your mail, beware that criminals can also obtain your personal information from third-party data breaches where information about you is made public online. We'll show you the most likely ways a criminal will try to steal your identity.

What They're Looking For

When it comes to identity theft, a malicious person only needs a few key pieces of data to start impersonating you. Michela Menting, practice director at ABI Research, says, Of value are first and last names (including middle and maiden), dates of birth, address, phone numbers, emails (especially corporate), [and] bank account and credit card numbers.

To create actual accounts, a criminal will need a few other key pieces of information. Céline Gravelines, consulting analyst at Info-Tech Research Group, explains that identity thieves are after a person's Social Security number; health records; and any other information that can be used to access credit cards and bank accounts, redirect mail, rent housing or vehicles, open cell phone service, acquire employment, or otherwise commit fraud.

You might think that criminals only want the most critical data, but


every little bit of information can help


them in their quest to impersonate
you. Anything about the history of a
personthis is particularly valuable for
passing verification systemscan be
valuable, including place of birth, parents names, pets, schools, and sibling
names, says Menting. Other important
particulars could include nationality,
your job title and place of employment,
and notable hobbies. This way, the thief
can supplement the personally identifiable information with details that likely
only you would know.

Information Gathering
There are a number of ways in which criminals can find the information they need to steal a person's identity. They generally prefer the easiest routes possible and often start looking for the information online. A first easy place to get information is through social networks. "People can reveal a lot of personal information about themselves that can be easily aggregated," says Menting. Many people will list their birthdays, current jobs, and relationship statuses (including significant others' names) in the public-facing profile.
Automated phishing emails and calls are another relatively effortless method employed by crooks. Scammers attempt to lure you into providing password and other personally identifiable information by impersonating a legitimate company or individual, says Gravelines. Phishing scams are often given away by suspicious threats and prompts that the recipient click links or provide credit card information over the phone or by email, instead of advising a person to call his bank or log into his online account.
Cybercriminals might try to obtain the information directly from the source by hacking into an organization that holds the data. Popular targets include health care providers, tax/revenue departments, schools/universities, banks/financial/insurance offices, service providers (telcos, MNOs, ISPs), retailers, and merchants. If a hacker is after personal information about a company's employees, hacking into the human resources department might be the only "in" he needs to gather Social Security numbers, names, and bank account information.

Some hackers might also try to sell data they obtain from third-party breaches on the black market, where other criminals can purchase the information and use it to steal identities. The health care industry can be particularly useful to a crook, who might use birth dates, diagnosis codes, policy numbers, and billing information to buy and sell medical equipment and drugs. Fraudsters may also use the information to make false insurance claims. Furthermore, providers and patients may not be immediately aware when medical data is stolen, so it could take months before anyone notices a discrepancy. Fortunately, financial institutions can cancel credit cards and fraudulent transactions quickly.
Stealing mail and dumpster diving are other ways identity thieves can access medical records, account information, and other critical personal details. Crooks will often target trash bins behind businesses and apartment buildings where people might toss out their bank statements, credit applications, and old bills. Basic account details, along with your name, might be all that a person needs to impersonate you and acquire additional account information.

Criminals can also be sneaky when acquiring personal identification and credit card numbers. Some use "skimming" to acquire information. This method involves installing a false front on an ATM to capture a user's bank account information when he slides his card, and using a nearby hidden camera or a keypad overlay placed directly on the original keypad to record or store the corresponding PIN. Fraudsters may also "shoulder surf" as you type your PIN at an ATM or enter credit card numbers and other personal details on your laptop or tablet in a public setting.

The Tip-Off
Being on the receiving end of a phishing email may make it seem like you're being targeted for identity theft, but typically, the automated emails are sent to hundreds of random email addresses. That being said, not all phishing emails are random. Menting says, "When you start receiving very personal phishing emails that are directly related either to your career or your hobbies, or reference very specific details of your private life, you may start to worry."

Phishing emails can be tailored to match the look and style of the emails you might receive from a legitimate company. "There's a trend now for phishing emails that purport to be an invoice for something you have ordered. You just need to take the time to think back if you have actually ordered anything or if you recognize the name at all," says Menting.
If your business starts receiving targeted phishing emails, it's a good idea to alert all employees and reiterate that they shouldn't click any of the links within or respond to the fake emails.

PREVENT IDENTITY THEFT
You can avoid falling victim to identity theft with these tips.
Keep personal details on social networks to a minimum.
Be suspicious of phishing scams via email and the phone.
Be aware of people looking over your shoulder when entering PINs and credit card numbers in public.
Regularly monitor financial accounts for fraudulent charges.
Many online services now make it a priority to alert you as soon as major account changes occur. As a result, one big identity theft giveaway is an email from an online account saying that your password or other settings have been edited or that you attempted the maximum number of login attempts, says Gravelines. Unauthorized address changes and withdrawals from your bank account, obviously, are big warnings, as well. If fraudsters are impersonating you when opening new lines of credit, you might start receiving mail or phone calls regarding applications for accounts that you have never attempted to open. It's important that you don't disregard any of these warnings as mistakes by the online or financial institution.

Information Application
So, let's say a criminal has successfully found the credentials necessary to steal your identity. What next? In general, an identity thief's goal is to steal your name and other personal information with the intent of gaining access to finances, incurring debts, or impersonating you for other fraudulent purposes, says Gravelines. Criminals can use your personally identifiable information to log in to sensitive websites, such as online banking sites. Once in control of your accounts, the criminal can make changes to gain complete control. For example, he might alter your mailing address and, of course, change login credentials, so you can't access your account.
If account takeover isn't possible, crooks will move on to the next best thing: creating a new account. Gravelines says that criminals might open a new cell phone plan under your name and credit card, call organizations posing as you to acquire even more information, get medical treatment, or avoid a criminal record by providing your name and driver's license. In effect, the criminal's cover will be your identity. As the victim, you'll be left to deal with the mess, including closing the bad accounts and opening new ones that the criminal can't access.

"Scammers attempt to lure you into providing password and other personally identifiable information by impersonating a legitimate company or individual."
CÉLINE GRAVELINES
Consulting Analyst : Info-Tech Research Group
After he has stolen an identity, the crook will work quickly to maximize the opportunity. "Identities are used and thrown out in a relatively short span of time," says Menting. "It is difficult to hold on to an identity for very long, notably from financial institutions, because they have so many checks and balances in place." To keep the damage to a minimum, it's a good idea to regularly review your bank account statements and credit card bills. This way, you'll notice fraudulent charges right away and can alert the financial institution of the issue. In cases where the identity theft isn't caught right away, the criminal will likely start using your identity in multiple locations, making it that much more difficult for you to regain control of your identity.

Recommended Action
If you notice fraudulent charges on your financial accounts, your first step is to contact your bank, as well as credit, phone, and utility companies, to close or freeze any accounts the thief has taken over. If false accounts, credit, or loans have been opened in your name, you'll also want to immediately go to the police and report the crime to the department with jurisdiction. "Get a copy of the police report to provide proof of identity theft when contacting other organizations," says Gravelines.

After you've contacted the appropriate creditors, banks, and utilities, you might report the identity theft to the Federal Trade Commission (877-438-4338) and provide one of its representatives with the details of your case. The FTC won't resolve any individual identity theft cases, but your report could help overall law enforcement action against identity theft.

The FTC also handles Social Security number fraud, and you can contact the agency to order a copy of your earnings statements, so you'll know if someone has used your Social Security number to get a job. If any fraud involved a business scam, you can file a report with the National Consumers League's Fraud.org.

If you're alerted to the theft by the liable party, there is little for you to do. Menting says, "The liable organization will normally take some proactive steps to cancel your account or require password changes." On your end, it's a good idea to save any evidence of the identity theft. "Keep canceled checks, suspicious emails, and credit card statements," says Gravelines. You might also want to replace critical ID cards, such as your driver's license, health cards, and passport.
As we mentioned previously, the identity thief might try to set up other accounts using your credentials. Start by contacting one of the three credit reporting agencies and request that a fraud alert be placed on your file. This prevents someone from opening new credit accounts in your name without your permission. To be safe, you can also alert the major phone, cable, and utility service providers that someone might attempt to create an account in your name.

Preemptive Action
"[Individuals] should make sure that they have adequate insurance in place, or that at the very least, the services and organizations with which they share their data has adequate anti-fraud and security risk policies in place to counter against such eventualities," says Menting. If you feel a company or financial institution doesn't provide enough protection against identity theft, you might be better served to find one that does.

The Fine Art Of Disaster Recovery


THE INTRICATE DETAILS THAT GO INTO CREATING A SOLID DR PLAN

KEY POINTS
Despite the importance of applications and systems to organizations, many don't have DR (disaster recovery) plans implemented, let alone test their plans.
Key to creating an effective DR plan is getting involvement from all the right departments and personnel and obtaining the right information.
Vital to any DR plan is prioritizing systems, applications, data, etc. in terms of business continuity.
Accounting for the types of disasters that are probable and deciding which systems to include are key to DR plans.

REALISTICALLY, EVERY COMPANY should view DR (disaster recovery) as absolutely essential. Yet, if various surveys and research are to be believed, an alarming percentage of companies don't have a DR plan (and many that do never test it). That means no strategy or procedures are in place to restore communications, servers, Internet connections, applications, and other mission-critical business operations if a security attack, hardware failure, natural disaster, or other event occurs. Ultimately, a company put out of business long enough may never get back in the game.

Developing a comprehensive DR and BC (business continuity) plan is more important than ever due to the complexity now existing within enterprises in terms of systems, technologies, infrastructure, operations, etc. Further, the types of disasters possible are more complex and varied, particularly where cyberattacks are concerned.

Essentially, a DR plan is nothing more than a statement of exactly how a company plans to continue business as quickly as possible following a disaster. Creating a DR plan, however, is far more complex. Beyond involving the right personnel, a DR plan requires properly assessing risks, determining which business processes are most critical, and more. The following details these aspects and more.

All Aboard
Before deciding what to include in a DR plan, it's important to first determine whom to involve in planning. The answer isn't just IT. "Though your IT's input is key, IT doesn't necessarily know in which order computer systems should be restored for business operations," says Roy Illsley, Ovum principal analyst. He advocates forming a cross-business unit team headed by IT or a chief risk officer.
Similarly, consultant Dr. Steve Goldman of Steve Goldman Associates says that because a quality DR plan has an enterprise-wide impact, every department should provide input to meet RTOs (recovery time objectives). "IT can't and shouldn't do that," he says. Prior to a disaster, each department must understand the impact of DR plan implementation on its operations and staff.
Ideally, all LOB (line of business) managers who oversee applications that a DR plan will protect should participate, says Dave Simpson, 451 Research senior analyst, including by signing off on service-level agreements for applications and determining RTOs and RPOs (recovery point objectives). Notably, the lower the RPO/RTO times, the more expensive the DR setup, Simpson says.
Ashar Baig, research director at Gigaom Research, also emphasizes obtaining each department's feedback concerning RTOs/RPOs, because these are specific to application types and should go into plans. A department should indicate, for example, that it needs, say, its customer relationship management program restored within four hours after a disaster to resume business continuity. That's not for IT to decide, Baig says. Further, business-critical priorities for applications must be conveyed to service providers the company uses and stipulated in contracts as part of a DR plan. "The DR plan is no good until the service provider agrees with this, signs the contract, and there's teeth within the contract for noncompliance," Baig says.
David Hill, Mesabi Group founder, says CIOs can help target representatives from operations, application development, technical support, and other units who are essential to DR for planning purposes. Additionally, corresponding C-level executives in business units, legal, human resources, and auditing should assign representatives. Overall, any unit a disaster would potentially impact should have skin in the planning game, says Hill. Depending on the enterprise's size and complexity, multiple people with DR planning skills, experience, and knowledge should be involved to help ensure the plan's validity, integrity, and completeness. Such expertise can come from existing employees with proper training; through hiring those with necessary skills; or via various third parties that can provide support, Hill explains.

"Though purchasing a generic plan from a consultant is an option, you'll get a generic response. This strategy is guaranteed to fail. DR plans must be specific to the organization."
DR. STEVE GOLDMAN
Consultant : Steve Goldman Associates

What To Include
Developing a DR plan that's well-suited to the enterprise is vital. "Though purchasing a generic plan from a consultant is an option, you'll get a generic response," Goldman says. "This strategy is guaranteed to fail. DR plans must be specific to the organization."

What information needs to be gathered largely depends on whether the enterprise already has a DR plan in place or is creating its first one. If an existing plan only requires some fine-tuning, Hill says, the DR team will probably know what information is necessary. If a major overhaul is required, however, a team may feel it's almost starting from scratch. Overall, capturing information for the initial DR plan is daunting, Hill says. Further, there's a danger planning will be nickel-and-dimed if executives aren't committed or feel they're being forced to participate.
Broadly, planning requires understanding the hardware and software assets the company has, the relative business priority of these assets, and how the assets relate to one another, Illsley says. Organizations must also understand when an asset will and won't be needed following a disaster. Quarterly business reports, for example, may be a high priority at quarter's end but not at the beginning of the next quarter. Once assets are known, Illsley says, a risk profile that accounts for who uses an asset, where, and how should be assigned to each asset. "This level of information will enable the organization to start to look at possible solutions," Illsley says.

Beyond gathering information about applications, RTOs/RPOs, staffing, relocation plans, and facilities from departments across the enterprise, evaluating recovery priorities is critical. Goldman cites an organization that makes product manufacturing equipment as an example. The company believed its production line for new equipment (the big moneymaker) should be recovered first. Customers, however, viewed service and replacement parts as more important. Thus, the DR strategy prioritizes service and replacement parts applications over manufacturing to keep customers happy.
Both Goldman and Baig stress that plans must also account for not-so-obvious details. This includes servers and apps that support crisis communication efforts, especially social media, Goldman says. "If these apps are on third-party or cloud servers, does the crisis communications staff have access to them when needed, immediately after a disaster is declared?" he says. Relatedly, staff and customers that a disaster impacts will want and need information about operations. "IT could handle such communication, but why?" Goldman says. "Let the communications professionals do it; it's their job."
Baig emphasizes practical aspects of a DR plan. For example, "What if your entire building burns down and you don't have computers?" he says. "You may have a great DR plan in place that states 'We'll send all data offsite to a cloud provider that can spin it up and make our applications available to employees,' but how are employees going to access that data without computers? This has to be a formal plan." Such details are the difference between a successful vs. nonsuccessful plan, Baig says.

"Risks cannot be totally eliminated. An acceptable risk therefore is one that an enterprise can tolerate and accept. A disaster recovery plan has to balance risk with reward."
DAVID HILL
Founder : Mesabi Group

"When things are fine, no one notices, but when there's a disaster, the disaster recovery plan is what comes under the microscope."
ASHAR BAIG
Research Director : Gigaom Research

Build A Complete Plan

A seemingly obvious but essential question regarding DR plans is: What systems should a plan cover? The answer varies depending on whom you ask. Hill says if any system can be left out, why do you need it in the first place? A disaster, after all, implies there's a possibility of permanently losing the system, and thus an application and all related data. That said, some applications/data are more critical than others. Therefore, Hill recommends a triage approach to recovery.
Goldman says there's a current trend within disaster recovery of data replication and backup data centers. Accordingly, IT disruptions may be transparent to users, including customers, which is good, he says. In most cases, though, IT and business units must prioritize which servers and apps need to be recovered in a given time frame. "Thus security programs should be recovered, for example, well before the basketball pool app," Goldman says.
You'll also want to consider the types of disasters your plan needs to address. Although every plan should include certain disaster responses (hardware and software failures, user error, etc.), an enterprise located in the Midwest won't likely pour considerable resources into planning for a hurricane. Generally, any event that could create unacceptable downtime or unavailability must be included, Hill says, including logical events, such as cyberattacks.
Goldman says he believes a DR plan should apply more to the consequences of a disaster than to what initiated the disaster. For example, some organizations use time-dependent plans. Here, the cause of the disaster doesn't matter; what's important is how long the data center will be out of service. Thus, a plan would include separate response strategies for different outage time lengths. Other organizations, meanwhile, create DR plans that include guidance for situational responses, such as a zero-day virus or long-term power loss.
While natural disasters can be devastating, Baig says they don't happen often. Instead, hardware failure is the cause of 99% of disasters, he says. When devising a DR plan, assuming the mentality of an insurance company in terms of looking at probabilities can be helpful, Baig explains. For example, the probability of something that will definitely happen rates as a 1, while the probability of something that definitely won't happen rates as a 0. All other probabilities fall in between. "The closer you get to 1, the more certainty there is. This is exactly how we should approach a DR plan," Baig explains.
Commonly, a disaster will mean losing one server, drive, rack, etc. vs. an entire site, Baig says. Here, the goal is quickly getting back lost data. Rather than rely on a cloud or managed service provider for this, Baig strongly recommends keeping a local backup copy. "Use your service provider for DR when you have a site-wide disaster," he says. "That local copy will give you land-speed recoveries for those 99.9% of failures you're going to face."
To prioritize applications and systems for recovery, conducting an RA (risk assessment) is beneficial. An RA pinpoints the qualitative or quantitative value of a risk that's associated with a recognized threat, Hill says. A quantitative value calculates two components of risk: the magnitude of the potential loss and the probability that the loss will occur. "Risks cannot be totally eliminated," Hill says. "An acceptable risk therefore is one that an enterprise can tolerate and accept. A DR plan has to balance risk with reward."

Illsley similarly says an RA is key to defining the potential risks the organization faces. He cites the example of a brewery that burned down; a fireworks factory was located next door. Illsley says he bets the new brewery wasn't built next to a fireworks factory. "Once you have the risks, then it's a case of allocating a probability and impact to each." In terms of impact, Simpson says an RA should include a calculation of how much money the company would lose per hour of downtime.
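To make that arithmetic concrete, consider a minimal Python sketch along these lines. The threat list, probabilities, and dollar figures here are illustrative assumptions, not analyst data:

# Minimal sketch of a quantitative risk assessment, as described above.
# All figures are illustrative assumptions, not real data.

def risk_exposure(probability: float, potential_loss: float) -> float:
    """Expected loss: probability of the event times the magnitude of loss."""
    return probability * potential_loss

# Hypothetical annual event probabilities and loss magnitudes (USD).
threats = {
    "hardware failure": (0.90, 50_000),
    "site-wide outage": (0.02, 2_000_000),
}

hourly_downtime_cost = 25_000  # assumed revenue lost per hour of downtime

for name, (p, loss) in threats.items():
    print(f"{name}: expected annual loss ${risk_exposure(p, loss):,.0f}")

# Simpson's downtime calculation: cost of an outage of a given length.
outage_hours = 8
print(f"8-hour outage cost: ${outage_hours * hourly_downtime_cost:,}")

Even rough numbers like these make it easier to compare recovery investments against the losses they avert.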

Visit Again & Again

Having a DR plan alone isn't enough if it's not continually tested, retested, and updated. "IT is continuous, dynamic, and ever-changing. The DR plan needs to adapt as appropriate to ever-changing conditions," Hill says. Succinctly, a DR plan should be a living, breathing document. Overall, testing validates a plan, Baig says. "When things are fine, no one notices, but when there's a disaster, the DR plan is what comes under the microscope," he says.

Simpson recommends updating plans at least annually and testing as frequently as is feasible. Illsley similarly recommends testing a plan "in anger" at least annually to prove its worth. Testing a plan over a weekend, for example, may provide a good indication that DR is doable and the plan works, but it doesn't necessarily show it can cope with real-world demand.


Network-Attached Storage For SMBs


TAKING DATA STORAGE TO THE NEXT LEVEL

KEY POINTS
Organizations large and small have a need for the secure and flexible storage options that a NAS (network-attached storage) unit provides.
Building a NAS system can be a simple and inexpensive way to bolster the integrity and security of your organization's data.
Using RAID (redundant array of independent disks), or a file system that performs some form of data redundancy, is vital to any good NAS device strategy.
When purchasing a third-party NAS unit, you commonly have the option to supply your own HDDs (hard disk drives) or buy a unit that comes with drives already installed.


RARE IS THE BUSINESS TODAY that doesn't rely on digital data in some form or another for profitability and growth. For SMBs (small and midsize businesses), data is just as vital to the bottom line as it is for large enterprises, and that's why the storage systems that house data should be designed with security and flexibility in mind. NAS (network-attached storage) is one of the best ways for organizations of any size to create a data management strategy that will unfetter employees and foster future success.

Data When & Where You Need It

A NAS device most commonly refers to a standalone machine or server that uses a minimalistic OS (operating system) to manage a series of internal storage devices, such as hard drives or SSDs (solid-state drives). The NAS is typically connected to the corporate network using a fast connection, such as Gigabit Ethernet, and it appears as a simple directory that can be made accessible from any device inside the local network or from a secured Internet portal. A properly configured and well-designed NAS unit can store data, accommodate capacity upgrades, perform periodic data backups, quickly recover in the event of a drive failure, and keep massive amounts of data secure from potential cyber thieves.

Smaller businesses will need less overall storage capacity or a device that has fewer drive bays. For these organizations, there are a handful of compact three- and five-bay NAS systems available that support capacities from hundreds of gigabytes to as much as a dozen or more terabytes. The larger your organization (or your data storage needs), the more likely a rackmount NAS may be the best option. These units start at 1U form factors that support four drives and go up to 4U form factor racks and larger with support for a dozen or more drives. These units also can be combined in a server cabinet with multiple NAS systems to meet truly mammoth capacity requirements.

The Necessity Of NAS

There are a few reasons why you might want to store your important data on a standalone NAS device instead of keeping it on a hard drive or SSD inside a PC or multipurpose server. The OSes used in the latter machines are designed to perform multiple functions, including content creation; application, email, and Web hosting; Web traffic filtering and monitoring; and more. The strain on PC and server system resources from all of these activities can put calls for data on the back burner. Alternatively, a single-function NAS device can support multiple simultaneous users without anyone experiencing a drop in performance.

The variety of hardware necessary for PCs and servers can also cause problems. With so many interdependencies, it's not a question of if there will be a failure, but when. And when failure does occur, there's a much higher likelihood that the system's demise results in some form of data loss. It's easy to reinstall an OS if some aspect of it becomes corrupted, but it's often difficult to replace data that resides on the same storage device as the corrupted OS.

The Do-It-Yourself NAS

One of the best things about a standalone NAS device is that you don't necessarily need to purchase a purpose-built machine to perform the data backup and storage functions you require. You can assemble one using off-the-shelf PC components or use legacy components that aren't quite up to handling modern applications. There are a handful of open-source NAS OSes that can make the most of dated hardware and a series of high-capacity storage devices.

A RAID 0 array is the only type in which there's no capacity penalty, but there's also no built-in parity.

Storage servers such as this one from Supermicro are necessary for organizations with massive volumes of data and strict regulatory compliance needs.
The motherboard is easily the most important component in a DIY NAS. Make sure it has one or more vacant PCI (Peripheral Component Interconnect) or PCI Express slots available and a Gigabit Ethernet adapter. The former lets you install one or more aftermarket storage controller expansion cards, and the latter will ensure that network bandwidth never becomes a bottleneck. The storage controller, typically built into the motherboard, is one of its more important aspects. Try to find a board with a controller that supports the features of modern hard drives and SSDs, such as 6Gbps SATA (also called SATA 3). We'll describe RAID (redundant array of independent disks) in more detail later, but it's important to make sure that any storage controller you choose supports some form of RAID.
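As a rough sanity check on the bottleneck point, a few lines of Python show the back-of-the-envelope math. The per-drive throughput figure is an assumption for illustration:

# Back-of-the-envelope check that Gigabit Ethernet won't bottleneck the array.
# Throughput figures are rough assumptions for illustration.

GIGABIT_ETHERNET_MBPS = 1000 / 8        # ~125 MB/s theoretical maximum
drive_read_mbps = 150                   # assumed sequential read of one NAS HDD
drives_striped = 4                      # drives reading in parallel (e.g., RAID 0/5)

array_mbps = drive_read_mbps * drives_striped
print(f"Array can supply ~{array_mbps} MB/s; "
      f"the link carries ~{GIGABIT_ETHERNET_MBPS:.0f} MB/s")
if array_mbps > GIGABIT_ETHERNET_MBPS:
    print("The network link, not the drives, is the bottleneck.")

In practice, even a modest striped array saturates a single Gigabit link, which is why the network adapter deserves as much attention as the drives.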
When it comes to components, NAS devices are also fairly economical. They don't need a sound card or a powerful processor, and they require a minimal amount of memory. Many NAS devices also don't need a monitor, keyboard, and mouse; users can change settings and configurations remotely using a Web-based interface similar to the kind used to configure wireless routers.

The Meat & Potatoes Of Storage

Although SSDs are still considerably more expensive than HDDs (hard disk drives), prices have fallen dramatically in the past few years. SSDs are currently available for as little as 50 cents per gigabyte. HDDs, even ones designed to run in an always-on system, are selling for pennies per gigabyte. We found 2TB NAS-tuned HDDs available for less than $100 each and 500GB SSDs for less than $250. As we went to press, PCI- or PCIe-based storage controllers with SATA ports (2-port or 4-port) were available for between $30 and $100.

RAID is a technology that lies at the core of many NAS offerings. RAID combines two or more storage devices to form a directory that offers faster data read/write operations, built-in redundancy and the ability to self-repair, or all of the above. A RAID 0 array, for instance, is one of the simplest forms of RAID; it combines the capacities of all drives. The files written to such an array are split into small chunks and distributed across the drives, which lets them all perform read and write commands simultaneously, for dramatically better performance than a single drive can deliver. The drawback of a RAID 0 array,


however, is that if any individual drive fails, the whole array fails.

RAID 1, 5, and 10 arrays, on the other hand, offer excellent redundancy designed to be recoverable in the event of a drive failure. RAID 1 requires at least two drives, RAID 5 requires three, and RAID 10 requires a minimum of four drives to operate, but RAID 10 offers the best performance and data security of the RAID options mentioned here. Because most NAS boxes tend to be read-heavy, a RAID 5 setup, striped with distributed parity, can be a more economical choice for a small business-centric NAS. Any configuration with built-in redundancy and self-recovery will sacrifice capacity to some degree, which will need to be factored in during the budgeting process.
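The capacity penalty differs by RAID level. This small Python sketch (drive counts and sizes are illustrative) shows the usable-space math for the levels discussed here:

# Rough usable-capacity math for the RAID levels discussed above,
# assuming n identical drives of a given size (illustrative only).

def usable_tb(level: int, drives: int, size_tb: float) -> float:
    if level == 0:                      # striping, no parity
        return drives * size_tb
    if level == 1:                      # mirroring
        return size_tb
    if level == 5:                      # striping with one drive's worth of parity
        return (drives - 1) * size_tb
    if level == 10:                     # striped mirrors, half the raw space
        return drives / 2 * size_tb
    raise ValueError("unsupported RAID level")

for level, drives in [(0, 4), (1, 2), (5, 4), (10, 4)]:
    print(f"RAID {level} with {drives} x 2TB drives: "
          f"{usable_tb(level, drives, 2.0):.0f}TB usable")

Running the numbers this way during budgeting makes the trade-off explicit: four 2TB drives yield 8TB in RAID 0 but only 6TB in RAID 5 and 4TB in RAID 10.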
Although RAID is a good option for businesses building their own NAS devices, some NAS software uses unique file systems that perform the same function. For instance, there is one available file system that can create a software-based RAID array that offers data integrity and can prevent the silent data corruption that commonly afflicts very large and very fast databases.

Benefits Of A Purpose-Built Machine

IF YOUR ORGANIZATION NEEDS A PARTICULARLY LARGE NAS DEVICE, YOU MAY WANT TO FOCUS ON A PURPOSE-BUILT MACHINE RATHER THAN GOING THE DIY ROUTE.

If your organization needs a particularly large NAS device, you may want to focus on a purpose-built machine rather than going the DIY route. PCs tend to have somewhat limited storage expansion capabilities. Depending on the NAS appliance you select, you can have a dozen or more bays that you can populate with HDDs or SSDs in a variety of sizes, up to 3TB or more, so capacity is very flexible.

Third-party NAS vendors can also help you determine how much capacity is necessary to meet the organization's needs. Post-setup support and a clear upgrade path are two more reasons to consider going with a vendor's NAS offering. These machines also tend to have warranties that cover parts as well as the software, meaning that you can count on the entire system to continue to work for as long as your support agreement lasts. The same cannot be said for all of the open-source alternatives; when problems arise, you'll often need to seek out advice from volunteers on a community-hosted forum.
SMBs often use a variety of cross-platform systems, which can make it difficult to share and access data when and where it's needed. Many NAS devices run software that is platform agnostic, letting systems running Windows, Mac OS, Unix, Linux, and other OSes, as well as VMware hypervisors, access data and files.

When it comes to purchasing a NAS unit, there are two types from which to choose: BYOD (bring your own drive) and diskful. The former often comes without drives installed, letting you source them yourself, which can save you money in the short term. The latter comes prepopulated with HDDs, and as a result can be pricier to repair, but the drives may have additional features or come preconfigured for added convenience. Either type can offer similar levels of flexibility and reliability.

Long Live The Storage

To determine how long you can rely on your NAS device, refer to the recommended life span of the oldest drive in the system (if reusing drives). When a single drive fails, as long as you were using a form of RAID that is capable of recovering, consider replacing all drives in the NAS (one at a time, of course). Many NAS-centric hard drives have manufacturer warranties of between three and five years, but it's best to begin thinking about swapping out old drives before the warranty expires.

Security, A Surety
Like everything else connected to your network, security is an important consideration when setting up a NAS unit. If you plan to use the NAS system for proprietary data, customer records, email records, and anything else that falls under compliance regulations, then that data must be off-limits to all but authorized users. NAS software typically offers users the ability to allow or block guest access. Data encryption is another security measure that you should enforce for the most valuable data.

Naturally, A NAS
Expanding storage can seem like more trouble than it's worth. When properly configured, however, an inexpensive NAS system can be a set-it-and-forget-it proposition that adds convenience, security, and new levels of data availability to all the right employees.

NAS software like the type that FreeNAS offers can display useful logs, such as system load, available disk space, and uptime.

Enterprise Backup
NOTEWORTHY ABILITIES, DEVELOPMENTS & VARIATIONS IN SOLUTIONS

KEY POINTS
While the gap in features and abilities between high- and low-end enterprise backup solutions has narrowed, features among vendors' solutions still vary.
Compared to years ago, more delivery models are available today for backup solutions, including cloud options and purchasing from different types of vendors.
When looking at backup solutions, check each solution's support matrix, or ability to support a company's applications, OSes, and hypervisors.
Automation, deduplication, compression, disk-to-disk-to-tape, cloud computing, and virtualization are noteworthy features available in today's enterprise backup solutions.

ENTERPRISE BACKUP ISN'T exactly a topic that causes the pulse to quicken. Still, its importance can't be overstated in terms of disaster recovery and business continuity. As Charles King, president and principal analyst at Pund-IT, says, ignoring the need for backup, or implementing simplistic processes that are inadequate for an organization's needs, happens more often than one might think. Why? Because backup tends to get viewed as an ongoing cost rather than something of overall value. What enterprise backup is really about, says Dave Russell, vice president and distinguished analyst with Gartner, is availability and remediation and what you're really willing to invest and protect against.

Another way enterprises err concerning backup is by failing to regularly reassess their backup plans and processes or keep current with technology developments and new features from vendors. Enterprise backup may seem straightforward and core features fairly constant among vendors, but there are variations. Thus, it pays to be aware of primary features across a range of solutions.

Changing Times
With regard to backup and storage in general, Russell uses the term "compressed differentiation" to note that differences between high- and low-end solutions have narrowed, or become compressed, over the years, with capabilities moving down market and becoming more accessible. Conversations around backup have also changed. A decade ago, Russell says, when listening to an organization's requirements, he'd say, "Oh, this is kind of the land of the killer feature," one of unique capability, and if not unique, one that only the large, major providers could actually deliver. Today, he says, he can rattle off a half dozen or more vendors, as well as numerous delivery models (cloud, buying from hypervisor or pure-play storage vendors, etc.).


Although there is commonality among primary or core backup processes across enterprise backup solutions, King says, storage hardware vendors typically enhance backup solutions to utilize qualities or features in their own products. Additionally, while a vendor may support backup across competitors' hardware, functionality may be less robust. Thus, understanding how applicable a prospective solution is to the enterprise's existing IT environment and assets, and whether features it desires will be fully available, are important.

It's also valuable to know what the organization is doing for backup and disaster recovery currently, including how often it backs up, what it backs up (everything or certain focus areas), how long it keeps data, and more. Executives in particular should ask what their enterprises are protecting against, says Greg Schulz, Server and StorageIO founder. For example, what threat risks are likely, what is the business impact of doing nothing, and what are the opportunities if doing something? In other words, the thinking should change from "how can we cut costs and get by doing backup as cheaply as possible?" to "how can we enable data protection so that it becomes an asset, an enabler to the business, so that if something does happen, we're surviving and hopefully not losing anything?" Schulz says.

Primary Considerations
Commonly, experts regard backup as one piece of a broader data protection puzzle. In turn, data protection is about disaster recovery, business continuity, data retention, governance, and other issues, says Russ Fellows, Evaluator Group analyst and senior partner. Enterprise backup applications are just one tool for solving these needs. One reason traditional backup products have achieved and retained their preeminent recognition is their use of policies for data protection, Fellows says. Other data protection mechanisms "typically lack in the area of policy implementation, therefore the need to have some level of integration or coordination with the enterprise backup applications in use," he says.

BACKUP PLANNING 101

When creating an enterprise-scale backup plan, vendors can offer considerable information and advice. It's key that organizations also aim their finite resources at the most important corporate data sets, says Mike Karp, Ptak Associates vice president and principal analyst. With that in mind, Karp provides these planning guidelines:
Know which data is most important to the company's survival and allocate resources accordingly.
Ensure major stakeholders buy into the process used to identify which data sets are most valuable by involving them early on.
Few people will ultimately care how much data gets backed up; everybody will care how much is recoverable. Build enough time into the backup process to verify that data is being backed up and is recoverable (a minimal sketch of such a check follows this sidebar).
Most recoveries involve recovering recently lost data, and locally held data is almost always faster to access than data kept remotely, so keep the most recent backups local, and then send them to remote storage in stages.
More data is lost due to user errors (both end users and IT) than other sources. Where IT is concerned, automation technologies can make it possible for even a junior IT staffer to perform backup functions at a senior team member's level.
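Karp's verification guideline can be partially automated. The following minimal Python sketch, in which the file name and backup path are hypothetical, backs up a file and proves the copy matches the original by comparing checksums:

# Minimal backup-and-verify sketch: copy a file, then prove the copy
# matches the original by comparing SHA-256 checksums.
# Paths and file names below are illustrative assumptions.
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_and_verify(source: Path, dest_dir: Path) -> bool:
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / source.name
    shutil.copy2(source, dest)
    return sha256(source) == sha256(dest)

if backup_and_verify(Path("payroll.db"), Path("/mnt/backup")):
    print("Backup verified: copy is bit-identical to the source.")
else:
    print("Backup verification FAILED; do not trust this copy.")

A checksum match proves only that the copy is intact; a full recovery test, as the experts here recommend, is still required to prove the data is usable.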

Beyond policy-based protection of data, Fellows says there has been a high degree of integration with offsite data protection and data protection acceleration tools. Backup target devices, or D2D (disk-to-disk) targets, virtual tape libraries, or similar are one component needed in larger enterprises, he says, as is an efficient method for transporting and managing offsite copies of data. Also important is meeting specific RTOs (recovery time objectives) and RPOs (recovery point objectives) on a per-application basis. Generally, it's unlikely that large enterprises can meet all of these requirements with any one product, he says. Even many small and midsize businesses require multiple products. "Simply installing a backup application with a tape library can't deliver the myriad of business needs required today," he says.
King considers solutions that comprehensively back up core data (regardless of storage technologies involved) and that centrally manage those processes (automated monitoring, analysis, reporting, etc.) as falling under the data protection umbrella. Many enterprises, especially those in compliance-sensitive industries, require an ability to back up data to geographically remote data facilities to ensure data safety in the event of disaster. Enterprise solutions can also include recovery options and services designed for specific customer needs and the ability to back up specific data types. Protecting data on mobile devices is also becoming increasingly important.

UNDERSTANDING HOW APPLICABLE A PROSPECTIVE SOLUTION IS TO THE ENTERPRISE'S EXISTING IT ENVIRONMENT AND ASSETS, AND WHETHER FEATURES IT DESIRES WILL BE FULLY AVAILABLE, ARE IMPORTANT.


"Everyone might say we support or back up the world's most popular or industry's most popular applications, but you have to check that."
DAVE RUSSELL
Vice President & Distinguished Analyst : Gartner

"With automation removing many errors and data reduction techniques in place, sites can protect the same amount of data with fewer backups."
MIKE KARP
Vice President & Principal Analyst : Ptak Associates

King says that as employees become mobile in their work habits and the devices they use, companies must consider and plan for how to best secure the related data.
For Russell, "completeness of solution" is key in a backup product, meaning an ability to support an organization's various applications, OSes, and hypervisors. "First and foremost, you have to start with the support matrix," he says. "Everyone might say we support or back up the world's most popular or industry's most popular applications, but you have to check that." This also means ensuring a solution can handle foreseeable changes the enterprise may have in store, such as moving from one Linux distribution to another.

Russell regards a backup solution's ability to scale as another primary feature. For example, a demo version might run fine on a half dozen machines in a test environment, but after deploying the solution on a broader scale (two dozen machines for a midsize business or potentially thousands for larger companies), scale is a problem, Russell says. This could mean the solution can't handle the increase in data or machines, or that an architectural capability the solution requires only gets exposed after the full deployment.

Russell cites individual object recovery (restoring one email or document, for example, vs. bringing back an entire system) as another important feature. Closely related is support for end user restores, meaning an administrator doesn't necessarily have to get involved in restoring data. In larger enterprises, involving administrators typically means opening a problem ticket and pulling in layers of infrastructure and people rather than letting an end user perform the restore, Russell says.


Roll With The Changes


Mike Karp, vice president and principal analyst at Ptak Associates, counts automation, data reduction, and D2D2T (disk-to-disk-to-tape) technology as noteworthy changes and developments in enterprise backup solutions. Automation removes much of the reason for human error, he says, while data reduction (deduplication and compression) means less disk space is required to store the same amount of data; thus, sites can protect the same amount of data with fewer backups. D2D2T, meanwhile, enables backups and recoveries to occur much faster than previously.
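To see why deduplication shrinks backup storage, consider this toy Python sketch: identical chunks are stored once and referenced by hash. Fixed-size chunking and the sample data are simplifications for illustration; commercial products use far more sophisticated schemes.

# Toy illustration of deduplication: identical chunks are stored once
# and referenced by their hash, shrinking what a backup must keep.
import hashlib

CHUNK = 4096  # fixed-size chunking; real products use smarter schemes

def dedupe(data: bytes) -> tuple[dict[str, bytes], list[str]]:
    store: dict[str, bytes] = {}   # unique chunks keyed by digest
    recipe: list[str] = []         # ordered digests to rebuild the data
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)
        recipe.append(digest)
    return store, recipe

data = b"A" * 4096 * 100 + b"B" * 4096  # highly redundant sample data
store, recipe = dedupe(data)
print(f"Raw: {len(data)} bytes; stored: {sum(map(len, store.values()))} bytes")
# Rebuilding from the recipe proves no information was lost:
assert b"".join(store[d] for d in recipe) == data

In this contrived case, 101 chunks of data collapse to just two stored chunks, which is the effect Karp describes, writ small.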
King also tabs data deduplication as a notable development for its ability to deliver significant benefits in increased hardware efficiency and lower costs. King also considers cloud services that simplify or facilitate key and often-ignored processes (such as backing up employee computers, work group documents, and records) as key developments. Fellows cites private clouds specifically as having had a big impact on the options available for storing offsite disaster recovery copies of data. Virtualization, meanwhile, has altered what and how enterprises back up and protect data, he says.

Similarly, Russell cites the cloud as a key development in terms of a backup delivery model (and not necessarily a replacement for on-premises backup) for augmenting the process of keeping one copy of data onsite and electronically vaulting another copy to the cloud as opposed to taking it to another facility. Finances are another change to note, Russell says. The industry is largely moving to either capacity-based or socket-based charges for backup software, he says. Capacity is typically measured in terabytes, while sockets are measured in terms of CPU sockets. "If nothing else, understand you're likely to get presented with all sorts of different types of cost options, or different vendors might display their capabilities in terms of a bid in different ways," he says.

AVOID MISTAKES
Among the mistakes enterprises make when devising and implementing backup plans is focusing on a particular technology or solution before understanding associated backup problems and expectations. Other mistakes: not accounting for all of the data they need to protect, or backing up data they don't need to. Yet another often-cited mistake is not testing recovery processes. Simply making backup copies and checking they're being stored isn't a valid strategy, much less a disaster recovery capability, says Russ Fellows, Evaluator Group analyst and senior partner. If an application requires a one-minute recovery point objective and a 30-minute recovery time objective, for example, test this regularly. "Too often companies assume that copying data implies they're protected, when too often that isn't the case," Fellows says.
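A recovery point objective like the one Fellows describes can be checked automatically. This Python sketch flags a violation whenever the newest backup is older than the RPO; the backup path, file pattern, and one-minute objective are assumptions for illustration:

# Sketch of an RPO check: alert if the newest backup is older than the
# recovery point objective. Objective, path, and pattern are illustrative.
import time
from pathlib import Path

RPO_SECONDS = 60  # the one-minute recovery point objective cited above

def rpo_violated(backup_dir: Path) -> bool:
    backups = list(backup_dir.glob("*.bak"))
    if not backups:
        return True  # no backups at all is the worst violation
    newest = max(b.stat().st_mtime for b in backups)
    return (time.time() - newest) > RPO_SECONDS

if rpo_violated(Path("/mnt/backup")):
    print("RPO violated: newest backup is too old (or missing).")

A check like this only measures the recovery point; proving the recovery time objective still requires periodically restoring real data and timing it, as Fellows advises.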


THE LATEST PREMIUM ELECTRONICS

Intel's New Computer On A Stick


WWW.INTEL.COM
Why mess with a bulky computer when you can have a PC that fits in your pocket? That's the question Intel wants you to contemplate when you check out the new Intel Compute Stick, which is among the more interesting new devices coming out of this year's International Consumer Electronics Show. Although it isn't a computing powerhouse, the 4-inch-long Intel Compute Stick is a fully functioning computer, complete with an operating system (Windows 8.1 or Linux), ample storage (expandable with a microSD card slot), wireless capabilities (Wi-Fi and Bluetooth), and a quad-core Intel Atom processor. Simply plug it in to an HDMI port on a TV or other display, link up a Bluetooth keyboard and mouse (sold separately), connect to the Internet via Wi-Fi, and you're ready to use the device to surf the Web, use Web-based services (such as productivity apps, online storage sites, social networks, and streaming video services), or use a program such as Windows Remote Desktop to transform the computer/TV combo into a thin-client system for your business. As of press time, Intel had not released details about pricing or availability other than to say the device is coming in 2015.


This Phone Has An Excellent Memory
WWW.ASUS.COM
Not too long ago, 4GB was plenty of RAM for a
PC, and it's still not too shabby. So it's especially
noteworthy that the new ZenFone 2 from Asus
(price to be determined) complements its 2.3GHz
64-bit Intel Atom processor with 4GB of memory.
This is, after all, a phone. Combined with 4G/LTE
and 802.11ac Wi-Fi (that's the latest, fastest Wi-Fi
standard), the ZenFone 2 handles cloud-based
services, high-powered apps, and video streaming
with aplomb, supporting data transfer speeds up
to 150Mbps. The ZenFone 2 uses Asus's ZenUI
mobile interface, features a 5.5-inch Corning
Gorilla Glass 3 display, and uses "fast-charge"
technology, which Asus claims lets you recharge
from 0% to 60% in 39 minutes. Options include
a 1.8GHz or 2.3GHz CPU; 2GB or 4GB of RAM;
16GB, 32GB, or 64GB storage; and many colors.

Compact Wireless Speaker, Big Sound
US.CREATIVE.COM
If you'd like to fill a small space with big sound for a reasonable price, Creative has you covered with its Sound Blaster Roar (formerly $199.99, now $149.99). The Roar uses either Bluetooth or NFC (Near Field Communications) to connect wirelessly with an audio system, smartphone, tablet, or other compatible device. Although the Roar measures a modest 2.2 x 7.9 x 4.5 inches (HxWxD), it features two front-firing drivers to spread high frequencies forward, as well as an active top-firing mid-range and bass driver to deliver full sound, with rich bass even at low volumes. The Roar also includes a built-in microphone with 360-degree voice pickup (which enables it to perform as a speakerphone) and a voice recorder with one-touch control, among other enticing features.



Smartphone Tips
A ROUNDUP OF POPULAR HOW-TOS

BLACKBERRY
Transfer BBM Contacts To A New BlackBerry 10 Smartphone
If you are using BlackBerry Messenger
7.0 or later, all of your BBM contacts
are backed up online. This simplifies
the process of transferring contacts to a
new BlackBerry 10 device. On the new
smartphone, all of your contacts will automatically be transferred when you set
up the phone using the same BlackBerry
ID and password used on your previous
phone. BlackBerry does not permit the
use of two different BlackBerry IDs on a
single smartphone. If you have set up a
BlackBerry 10 smartphone with an older
BlackBerry ID and you would like to use
a newer BlackBerry ID instead, you must
first perform a security wipe, which is
available in general settings. Then set
up the device again, this time using the
newer BlackBerry ID and password.

60

February 2015 / www.cybertrend.com

Assign A Ringtone To A Contact
Sometimes it's nice to have some idea, by the ring of your BlackBerry, who might be calling. It helps to assign a specific ringtone to a certain contact or contacts, which you can do by simply opening the Contacts app, finding the appropriate contact, tapping Edit (pencil icon), choosing a ringtone from the Phone Ring Tone drop-down list, and tapping Save.

Search On A Web Page
BlackBerry 10's Web browser includes a feature that lets you search for text on a Web page. Select More (icon with three vertical dots), select Search (magnifying glass icon), and enter the text for which you would like to search. Or simply press the S key.

Change How Contacts Are Sorted
BlackBerry sorts contacts alphabetically by company name and people's last names. If you'd prefer that people be sorted by first name, or that all contacts be sorted by company, open the Contacts app, tap Settings (gear icon), and choose an option from the Sort Contacts By drop-down list: First Name, Last Name, or Company Name.

ANDROID
Create App Shortcuts

Whether you have 20 apps or dozens of apps installed on your Android smartphone, it can be useful to create shortcuts to the ones you use most often. To begin, access the Applications menu: press the Home button, swipe to access the right screen, and tap the Apps icon. Locate the app you want to add as a favorite, then tap and hold that application icon and drag it to wherever you would like it to appear on the Home screen.

Google Keep For Notes
Although it hasn't received much attention following a small amount of fanfare surrounding its release, the Google Keep service lives on as a Web-based service for maintaining notes via desktop computers and mobile devices. Notes can contain combinations of text, bullet lists, checklists, pictures, reminders, and more, with a modicum of formatting tools available, so it's a handy way of keeping track of projects, tasks, and thoughts. You can start using Keep by accessing the online service at keep.google.com or by using the app, which is available for Android smartphones and tablets via the Google Play store. As with Gmail, you can archive notes you don't presently need but want to hang on to for later. The app interface is intuitive, enabling you to quickly create notes on the go. You can tap the menu icon and then tap Archive to view archived notes; the menu provides access to reminders and deleted notes, as well. Notes are also searchable, but cannot be tagged or downloaded as they can using other note-taking services and apps.

Adjust The Default Zoom Level
Multitouch is a great feature. Sadly, it is not standard on all Android-based devices. On handsets that don't support multitouch, users can zoom in to images and Web pages by double-tapping them. By default, the zoom level when you perform the double-tap is set to Medium, but you can change it to Close or Far by launching the Browser; pressing the Menu key; and tapping More, Settings, and Default Zoom. Tap the radio button beside the option you want, and then back out of the menu by pressing the Back key.

Add Foreign Language Keyboards
To add a foreign language dictionary in Android, open Settings, tap Language & Keyboard, tap Touch Input, tap International Keyboard, scroll to find the keyboards you wish to add, and check them as you find them. Now, when you are typing on Android's on-screen keyboard, you will be able to tap the international keyboard (globe icon) key and select the keyboard you want from the pop-up list.

Android offers multiple foreign keyboards.

Safely Replace A Memory Card


If, for whatever reason, you would like to replace the memory card in your Android smartphone, don't just open the case and remove the installed card. First go to the Home screen, and then tap Menu, Settings, Storage, and Unmount SD Card. When you receive a warning message, tap OK and wait until you see "SD card safe to remove" on the screen. At that point you can remove the smartphone's cover, release the SD card (this can involve moving a small guard, pressing down on the card to unlock it, or a similar action), remove it, and replace it with another SD card.



IOS

Change Or Turn Off Notifications

Although it can be convenient to receive notification of each new iMessage, scheduled reminder, or incoming email message, you may not want these alerts to pop up on your iPhone's Lock screen, especially if you prefer to keep this information private or want to conserve battery life.
To disable alerts for individual apps on the Lock screen, access Settings and select Notifications. Scroll down to the Include section, locate an app you would like to modify, and tap to access the notification settings for that app. You should see three Alert Style options: None, Banners, and Alerts. Either tap None to select it or scroll down, find Show On Lock Screen, and tap to switch it off.
If you would like to completely disable access to the Notification Center, access Settings, tap Notification Center, and turn off Notifications View and Today View.

AirPrint Advice

The iPhone has had the ability to print to a wireless printer since the iPhone 3GS, but for many of you, the ability to print from your iPhone remains a mystery, something that appears to require special incantations and the proper phase of the moon. In reality, most of the problems that individuals experience when trying to print from their iPhones are caused by using the wrong type of printer. The printer must support AirPrint protocols, and it must be connected to the same Wi-Fi network as the iPhone. You can check the printer's manual or the manufacturer's feature list to ensure that it supports AirPrint. If it does, here's how to print from your iPhone.
The AirPrint-enabled printer must be connected directly to your Wi-Fi network. It can't be connected indirectly through a Mac or PC, or through a third-party wireless printer adapter. To print from your iPhone, bring up an app and select the page you wish to print. Tap the Action button, then tap the Print button. In the dialog box that opens, configure any relevant printer options, such as page size or number of copies, and then tap Print.

Customize The Spotlight Search For Better Results

You probably already knew the iPhone's Spotlight Search capability is designed to mimic the universal search function found in other devices, but what you may not have known is that you can remove items from the list of indexed sources to speed up your device, or just eliminate items you never search for.
To access the Spotlight Search settings, tap Settings, General, and Spotlight Search. You can rearrange the sources to give a higher priority to Mail, for instance, by tapping on the right side of the Mail source and dragging it up to the top of the list. You can also tap specific items to eliminate them from Spotlight searches. Reducing the number of sources here can also speed up your searches.


Adjust Brightness To Save Battery Life

To cut down on excessive battery drain in your iPhone, decrease the screen brightness. Tap Settings, tap Display & Brightness, and then drag the slider to the left to dim the screen brightness. Turning on Auto-Brightness with the switch on this screen can also help. Doing this causes the iPhone to automatically adjust brightness based on the current ambient light conditions.

Turning down screen brightness is a good way to prolong battery life.

Quickly Access The Camera App

Apple's iOS (versions 5 through 8) lets you quickly access the camera without unlocking the screen and loading the home page. Getting to the Camera app quickly can be the difference between getting the shot and talking about the shot that got away. To access the Camera app from a locked screen, tap the home button twice. The Camera app's icon will appear next to the Slide To Unlock bar. Press the Camera icon and swipe upward, and then snap your picture. This process doesn't unlock your iPhone, so while the resulting image is stored on your iPhone, you won't be able to view the image until you unlock the phone.

WINDOWS PHONE
At A Loss? Tap & Hold

One question we're often asked is how we discover some of the secret features that Windows Phone smartphones seem to keep well hidden. The answer is, we tap and hold. A lot. Tapping and holding is like right-clicking in Windows; it brings up a menu with additional functionality related to the item you clicked on, or in this case, tapped and held on. Here are a couple of examples of tap and hold features.
Tap and hold a calendar entry to assign actions to an appointment. Tap and hold an email to delete the email, mark it as read, or clear flags. When you're using an app, and you want to see if there are any functions you didn't know about, just try tapping and holding. You may be amazed by what you discover.

Share Contact Information

With Windows Phone 8, you can quickly share contact information (whether it's yours or someone's in your contact list) with someone else via text messaging. Open the Messaging app, tap New, type the recipient's information until the appropriate name or phone number appears, tap Attach, tap Contact, locate the contact file you'd like to share, tap Share, and tap Send. This method can serve as a quick, convenient, and business-card-free way to swap your own contact information with someone else.

Social Media Accounts, All Together Now

If you're using a Windows Phone device, you don't have to hunt in different places to find the latest updates from your LinkedIn, Facebook, and Twitter connections. Simply go to the Start screen, tap People to access the People hub, and swipe left or right to access the What's New page. If you get too much information flooding in at once, you can view updates from one social network at a time. To do this, simply tap All Accounts and then select the account you want to view.
You can also make adjustments so that contacts from social networking accounts won't be included among your main contacts list; access Settings, swipe to access Applications, tap People, tap Filter My Contact List, and tap to remove check marks next to the accounts you don't want to include. When you're finished making selections, tap Done.

Cortana, new with Windows Phone 8.1 smartphones, is designed to be your digital personal assistant.

Get To Know Cortana's Notebook

New to Windows Phone 8.1, Cortana is a digital personal assistant that communicates with users via voice. When you're learning about Cortana here in CyberTrend or in a user guide, you'll notice references to Cortana's Notebook. This is essentially a menu that provides shortcuts to favorite things and recent activities. In Windows Phone, tap Search, go to Cortana, and then tap the icon comprising three horizontal bars at the top right corner of the screen to access Cortana's Notebook. This is your key to changing settings, remembering tasks you might have forgotten about, finding things you might have searched for recently, and more.

Clear Browser Search History


To clear your browser history and access other Internet Explorer settings on
a Windows Phone 8 smartphone, open Internet Explorer, tap the More button
(three dots), tap Settings, and tap the Delete History button. In addition to
erasing the browser history, this will erase cookies, temporary Internet files,
and any saved passwords.


Social Media Privacy Tips


TAKE CONTROL OF YOUR ONLINE PRIVACY

SOCIAL MEDIA IS ALL about sharing our lives with friends and family, and vice versa, from daily musings about life, such as a friend that's excited about an upcoming vacation, to important events, like the birth of a new grandchild. And although it might not seem like the news, photos, personal achievements, failures, and, of course, cute animal videos you post would be of much interest to people you don't know, the information could be useful to cybercriminals trying to steal your identity. The default privacy settings on many social media websites make it so your posts, tweets, and photos are visible to the public. Fortunately, it's easy to adjust the privacy settings so that only the people you know will see the updates. Here, we'll guide you through adjusting the privacy settings on Facebook, Twitter, Google+, and LinkedIn.

Facebook
When setting up a Facebook profile, the service asks for a lot of personal information (including education history, workplace, and phone number) that you might not want visible to everyone. To complicate matters, Facebook hasn't exactly been known for consistency when it comes to users' privacy settings, as past interface changes have reset settings and forced users to continually ensure their posts and personal information remain private. To correct some of these issues, Facebook has made changes in the last year to simplify its privacy controls.
To examine your current settings, visit your Facebook home page, click the drop-down button next to the Lock icon, and select Settings. Click Privacy and you'll see a list of configurable options. For example, under Who Can See My Future Posts, you can select the default audience when you create a post, such as Public, Friends, Friends Except Acquaintances, and Custom groups. This way, you can make certain that your posts won't be viewable to the public at large if you forget to change the privacy settings when you post an update.

Under Who Can See My Stuff, you can also review the posts you've been tagged in, as well as change the audience for updates you've previously posted. This way, you can control whether or not any old updates are available to the public. There are also Who Can Contact Me? and Who Can Look Me Up? settings where you can filter access to non-friends. Because these options are the only settings in the Privacy tab, you might think that's all you'll need to change. It's not.
One of the easiest ways to assess the entirety of your Facebook privacy is to use Facebook's Privacy Checkup. You can access this tool by selecting the Privacy Shortcuts button (Lock icon) in the top right corner of Facebook. Select Privacy Checkup and, in the resulting pop-up window, Facebook shows you the controls for who can see your posts. If you're following our steps, you've already addressed this step. Click Next Step and you'll see what apps you've logged into with Facebook. Delete the apps you no longer use. When you're done, click Next Step.
Finally, Facebook will bring up the information shared on your profile. Here, you'll see options to add a phone number, email, birthday, hometown, and other information. Click Finish Up to finalize your new privacy settings. All of the information in the last step can be found in the About section of your profile, which also contains several other pieces of information you might want to make private. To do so, click your personal timeline and select About. Under the tabs for Work And Education, Places You've Lived, and Contact And Basic Info, you can adjust the privacy settings for details that weren't part of the Privacy Checkup.

Facebook's primary privacy settings can be found in the Privacy window.

Twitter
By default, Twitter's account settings make your tweets available for all to see. The alternative is a protected mode, where your tweets are only visible to your approved Twitter followers. Protected tweets are also not retweetable, so even approved users can't share your tweets. You also cannot share permanent links to your tweets with anyone but approved followers. If you want to use Twitter to drive Web traffic, the restrictions in the protected mode might undermine why you joined Twitter in the first place.
If you want to adjust your tweet privacy level, or the other privacy controls on Twitter, start by signing into Twitter and bringing up your account settings. Next, click Security And Privacy and scroll down to Privacy. If you only want approved followers to see your tweets, click the Protect My Tweets checkbox. You can also control who can tag you in photos, whether your tweets include a location, and how others can find you, such as by email address or phone number. After making your privacy selections, click the Save Changes button.


Google+
For Google+, privacy has been a key consideration from the very beginning. For example, you've always been able to assign a privacy level for each post you share. And based on the Circles (friend groups) you've set up, it's easy to share content with only a specific crowd. Google+ also offers detailed privacy settings where you can control almost every aspect of your profile. Visit your Google+ page, click your name, select the drop-down menu under the Google+ logo, and choose Settings.
In the Settings window, you can customize who can send you notifications,
comment on your public posts, and manage
subscriptions. If you want to configure the
audience settings for your posts, photos,
and profile updates, scroll down to the Your
Circles section and click Customize. By default, Google+ pushes updates to the people
in your Friends, Family, and Acquaintances
groups. To block a particular group, remove the check from the checkbox. If you
want to reach a larger group of people, you
might want to add a check to the Following
checkbox, so followers of your Google+
profile will be added to Your Circles list.
Next, scroll down to the Profile section. Here, you can configure how people are able to find your profile and control what content is displayed in your profile. A setting of interest for businesses is Allow People To Send You A Message From Your Profile, as this setting offers a way for consumers to reach out to you. If the setting is limited to Your Circles or Extended Circles, customers might not be able to contact you.
If you use Google+ on your mobile device, you'll also want to examine the Location Settings section. These settings let you enable or disable location reporting via your smartphone and tablet. If enabled, you can control who can see your current city and/or exact location. The precise location is ideal for those who wish to share their location with friends and family. If that's something you don't plan on doing, it might be best to disable location settings.

Google+ offers a wide variety of privacy controls.

LinkedIn
The business-focused nature of LinkedIn ensures that privacy is a priority. To examine your settings, log in to LinkedIn, hover your cursor over your profile photo in the right-hand corner, and select Manage, which is next to the Privacy & Settings heading. Scroll down to Privacy Controls and you'll find a host of options to control what others can see on your profile and activity feed. For example, you can turn off activity updates that let other connections know when you make a change to your profile or follow another company.
If you use LinkedIn to search for new clients and key connections within an organization, you can opt to remain anonymous, so people won't know that you looked at their profile. To do so, click Select What Others See When You've Viewed Their Profile. There are two anonymous options: one where others will see only an industry and title, and one where you are completely anonymous. You can also manage who can follow your updates, edit blocked connections, and shut down users' ability to view your connections.

Manage All Your Online Accounts


Here, we took you through the basic steps of managing your privacy settings, and it'd be wise to at least check up on your privacy settings with other social networks you might use, such as Instagram, Foursquare, and YouTube. This way, you can have a measure of control over your publicly available online data.


Data Usage & International Travel


HOW TO AVOID RACKING UP UNEXPECTED CHARGES

ASIDE FROM A FEW exceptions, such as the lack of 4G LTE (Long Term Evolution) network availability outside of the United States, advances in communication technologies are making it easier to use your mobile devices when traveling abroad. There's one common problem, however: American travelers using cellular services beyond U.S. shores can easily get stuck with hefty wireless bills. Whether you use a smartphone, tablet, laptop, portable hotspot, mobile broadband modem, or other device, if it relies on cellular communications, it could cost you. We explain how to fend off excessive charges, regardless of the device.

Understand Roaming

Roaming occurs whenever a wireless carrier other than your own provides your device with a cellular signal. International roaming rules and rates are typically different from those that apply in the U.S., so here's the No. 1 roaming rule to keep in mind when traveling abroad: Never assume anything. If you think an "unlimited" plan stays unlimited when you travel outside the U.S., that isn't so. And if you believe that roaming charges only skyrocket with Internet use, or that exchanging a few simple text messages or checking voicemail won't add up very quickly, you will be mistaken; roaming rates often apply to anything you do with your device when it involves a cellular connection.
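To see how quickly those charges compound, it helps to run the numbers. Here is a minimal Python sketch using hypothetical pay-per-use roaming rates (invented for illustration; your carrier's actual price sheet will differ) for one modest week abroad:

# Hypothetical international pay-per-use roaming rates (USD);
# check your carrier's actual price sheet before you travel.
RATE_PER_MINUTE = 1.79  # voice, per minute (voicemail checks count, too)
RATE_PER_TEXT = 0.50    # per text message
RATE_PER_MB = 2.05      # per megabyte of cellular data

# A modest week abroad: a few calls, some texts, light email and maps.
minutes, texts, megabytes = 30, 40, 100

total = (minutes * RATE_PER_MINUTE
         + texts * RATE_PER_TEXT
         + megabytes * RATE_PER_MB)
print(f"Estimated roaming bill: ${total:,.2f}")  # prints $278.70

Even that restrained usage lands near $280, which is why the never-assume-anything rule pays for itself.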
Also, depending on your device's carrier and settings, you may or may not receive an obvious notification when roaming, so you won't necessarily be prompted when the tally of charges starts to rise. On the flip side, you might not be able to roam internationally unless you enable roaming before you travel. It is imperative, then, that you fully understand how your wireless plan's roaming features and rates work before you leave the States.
Barring that, bring your carrier's toll-free customer service phone number with you; if you have any doubt as to how your plan's roaming rates work when traveling, use a local phone to call that number and find out.

Know Your Phone & What It Can (& Can't) Do

When it comes to frequencies, wireless carriers, and foreign cellular networks, there are few absolutes. It's imperative that you determine what frequencies your device uses and whether those frequencies match the networks available in your destination countries, because if they don't match, your device won't work at all for voice, messaging, or data transmissions.
Devices that use CDMA (Code Division Multiple Access) cellular networks are very limited in terms of international use. You need one that works with GSM (Global System for Mobile Communications) networks if you want it to work in the largest number of countries. And if you have a device that uses 4G, research whether your destination supports the technology before you take the device abroad. When identifying what frequencies your device supports, find out the exact frequencies; don't settle for simply "GSM" (because there are multiple GSM networks) or brochure verbiage that suggests the device will work anywhere you go.
Some devices only function on one or two frequencies. Some CDMA phones work only on CDMA networks, while others include added support for two or more GSM frequencies. In general, the best devices for international travel are those labeled "world" (as in "world phone") or "global." These are typically quad-band GSM devices, which means they operate on all four GSM frequencies (850/900/1800/1900MHz), and sometimes on CDMA and 3G/4G frequencies as well, and will therefore work in most locations.
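The compatibility question boils down to simple set overlap: your phone gets service only on frequencies that both it and the local networks use. This Python sketch illustrates the check with sample band lists (the device and destination values here are examples for illustration; real coverage data should come from your carrier and the destination's operators):

# GSM bands in MHz. Sample values for illustration only; confirm real
# band support with your carrier and the destination's operators.
device_bands = {850, 1900}       # a typical North American dual-band phone
destination_bands = {900, 1800}  # bands common in much of Europe and Asia

overlap = device_bands & destination_bands
if overlap:
    print(f"Usable bands: {sorted(overlap)} MHz")
else:
    print("No common bands; this phone won't work there at all.")

world_phone = {850, 900, 1800, 1900}  # quad-band "world" device
print(f"World phone usable bands: {sorted(world_phone & destination_bands)} MHz")

The quad-band device overlaps some band in every GSM market, which is exactly why the "world phone" label matters.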

Unlock Your Device


It might be necessary to unlock your
smartphone or other cellular device
before you travel so that it will work
not only with networks, but also with
wireless carriers, in other countries.
Call your carrier to find out if this is
necessary and to receive a code or other
means to unlock the phone.

Get A New Device Or Plan

If your device or wireless plan is presenting you with limitations or potentially excessive fees, consider switching to a new device or a new plan. This might involve replacing your current device with one that operates on frequencies commonly used worldwide. Or it may involve switching your wireless plan to a "global roaming" or similarly named plan, or adding a special Internet data bundle that will allow you to travel internationally and roam all you want for a relatively affordable price.

Track Usage To Avoid Surprises

If you know your device's international roaming rates and you wish to keep expenses down, it can help to track voice, messaging, and data usage on your device. Most smartphone and tablet operating systems include settings that let you view data usage, and many include options for setting self-imposed limits on usage.
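Those self-imposed limits are nothing more than a running total checked against a threshold. As a rough illustration (the cap and daily readings below are invented, and this is not any operating system's actual API), the logic looks like this in Python:

# Toy data-usage tracker: add up daily megabytes and warn as the total
# nears a self-imposed cap. Numbers are invented for illustration.
DATA_CAP_MB = 100
WARN_LEVEL_MB = 0.8 * DATA_CAP_MB  # warn at 80% of the cap

daily_usage_mb = [12.4, 9.8, 22.1, 31.5, 8.0]  # sample meter readings

total = 0.0
for day, used in enumerate(daily_usage_mb, start=1):
    total += used
    if total >= DATA_CAP_MB:
        print(f"Day {day}: {total:.1f}MB used; cap reached, disable data")
        break
    if total >= WARN_LEVEL_MB:
        print(f"Day {day}: {total:.1f}MB used; nearing the {DATA_CAP_MB}MB cap")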

Use VoIP On Wi-Fi

One inexpensive (and sometimes free) method for placing voice or video calls while traveling is to connect to a Wi-Fi hotspot and use a VoIP (voice over IP) app. Don't rely on this method, however, as common VoIP services can be blocked in some countries and by some hotspot providers. And, as always, make sure that any Wi-Fi hotspot you connect to is secure so that technically inclined eavesdroppers can't listen in on your conversations, view your data transmissions, or nab your website passwords.

Limit Data Usage

It used to be simple to limit data usage on mobile devices: go into settings and switch Data to Off. While this (or something like it) is still possible on many phones, tablets, and other devices, there are now (thanks to a plethora of mobile apps) multiple changes you can make to various device and app settings without shutting off data transmissions entirely.
In the device/OS settings, look for options with phrases that relate to fetching new data, using packet data, and roaming that you can switch off. Devices differ, so experiment with these settings; assuming you want to continue to use the device on international networks but want to limit unnecessary data traffic, the idea behind changing these settings is to turn off only those features that work in the background, searching for new data on an ongoing basis.
Additionally, review any apps installed on your device. You may want to change the settings within individual apps so that they will only update on demand (when you manually tap a Sync button, for example) rather than searching for updated data in the background. You may also find that some apps only work with an active wireless connection, so you may want to install different apps (at least temporarily) that work offline or only when a Wi-Fi connection is present.

Use A Temporary Device


If you prefer to forgo all of the steps
in this article, consider picking up a
rental or pay-as-you-go smartphone.
These are available in many international airports.



Remember, You Have A Lifeline & A Panic Button

Keep in mind that you can call your wireless carrier anytime to speak with a customer service representative and find out details about your phone's cellular capabilities, your wireless plans, and how those things work together in the places you're going. If you're already overseas, though, use a local phone to call customer service so you don't rack up additional charges. Finally, if all else fails, use your device's airplane mode setting as a form of panic button to ensure that you won't be charged for wireless services while you figure everything out.


Laptop-Projector Setup Problems


TROUBLESHOOT COMMON ISSUES WITH THESE HANDY TIPS

YOU'RE READY TO give your presentation, but until that first slide appears on the big screen, you can never be sure that your equipment has got your back. We can't tell you not to worry, but these handy tips should help bail you out if your presentation goes south.

Hardware & Cable Connections

It can be difficult to track down the source of problems that occur when you are connecting a notebook and projector. Following are some things to watch for.
Video. Turn off all equipment and connect your notebook's video out port to the projector. The usual connection choices for a notebook are VGA (Video Graphics Array), DVI (Digital Visual Interface), HDMI (High-Definition Multimedia Interface), and DisplayPort. Many projectors have VGA and one or more digital connections. If possible, use a digital connection for high quality.


Sound. Some HDMI and DisplayPort digital video connections can carry audio through the same port, but both notebook and projector must support audio over the digital video connection. Traditionally, audio is connected using the notebook's audio out jacks and the projector's audio in ports; both of these are often RCA or 3.5mm. If you're not using the projector's built-in speakers, make sure you connect your notebook's audio out to the sound system you intend to use and turn the volume down on the projector's speakers.

Mouse. If you are using a mouse, or a remote mouse controller, make sure the controller/mouse is connected, usually through the notebook's USB port. If you are using a wireless device, make sure the notebook has the appropriate wireless connection enabled. This is typically Bluetooth or a USB port wireless dongle.

Network Connection
Many venues supply network projectors, which are made available as a
shared resource. Making a connection to
a network projector is as easy as plugging


your notebook into the corporate network via wired or wireless Ethernet. Check with the company's IT staff for specifics. Once connected, use the network connection wizard in Windows 7 to find the projector you wish to use:
Click Start (the Windows button in the bottom-left corner of the screen).
Click All Programs.
Click Accessories.
Click Connect To A Network Projector.
The network connection wizard may inform you that your notebook's firewall is blocking the ability to connect with the projector. Click to establish the network connection.
Either have the wizard search for available network projectors or enter the projector's address manually if it is available.
Once connected, a Network Presentation window will minimize to your Taskbar. When you're ready to make your presentation, open the Network Presentation window and select Resume. Your notebook will treat the network projector like an external monitor.

No Video

In many cases, your notebook will detect that you have a projector plugged into one of its video outputs and will automatically turn on the port. Not all notebooks do this, however; and even those that do can still have missing video if the notebook isn't set to duplicate the Desktop or extend it to the secondary monitor (the projector). Many notebooks use a function key combination to toggle the projector port on or off and set how you can use the display. We recommend using the control panels in Win7:
Right-click a blank area on the Desktop.
Select Screen Resolution.
Select the second display from the drop-down menu.
Select Extend These Displays from the Multiple Displays drop-down menu.
Your Desktop background should now appear on the projector.
Win7 also has a pop-up display for selecting the content that is sent to the projector. Press the Windows-P keys to bring up the four possible selections:
Disconnect Projector (turns the projector display off)
Duplicate (mirrors your computer's Desktop on the projector)
Extend (uses the projector as an extension of your Desktop)
Projector only (turns off your notebook's display and uses the projector as the main display)


Video Is Out Of Range

When the projector can't reconcile a video signal from a notebook with its preset resolution, it displays an out-of-range message. To solve this in Win7:
Right-click a blank area on the Desktop.
Select Screen Resolution.
Select the display associated with the projector.
Use the resolution drop-down menu to adjust the resolution to the correct value. Try 800 x 600 or 1,024 x 768, as these are resolutions that many projectors can handle.

Adjusting the screen resolution can resolve out-of-range messages.


Display Turns Off



If the projector's display turns off during your presentation, you'll want to check your notebook's power management feature, especially if you're running the notebook off of its battery. Whenever possible, use your AC adapter to run your notebook.

Video Won't Display Or Is Choppy

Your slide presentation works fine, but when you try to show a video, all you see is a blank window or a choppy rendition of the video. Trying to display a video on two monitors can be too much for a video card that has marginal graphics capabilities. If video isn't displaying correctly, change the Display settings to make the projector the primary display.

NOTEBOOK-PROJECTOR TROUBLESHOOTING TIPS

Turn off all equipment before connecting the notebook to the projector.
If possible, use a digital connection to ensure a high-quality presentation.
If you're not using the projector's built-in speakers, turn them down and connect the notebook's audio out to the sound system.
If you're using a wireless mouse or controller, make sure you can establish the wireless connection.
Use the straightforward network connection feature in Windows 7 to connect to a network projector.
If there is no video, check all the ports and then check Windows Screen Resolution settings.
When a projected image isn't proportionally correct, try repositioning the projector and/or changing the projector's keystone setting.
If a display turns off during a presentation, check the notebook's power management settings.
If video isn't displaying correctly, change the Display settings to make the projector the primary display.


PC Problems On The Road?


HERE ARE SOME QUICK FIXES

IF YOU HAVE USED a computer for any


amount of time, then you know that
PC problems can often occur with little
warning. Maybe you are having trouble
connecting to a Wi-Fi hotspot, or you
can't get your mouse to work. We explore how to troubleshoot these and
other common PC problems so you can
get back to work quickly.

Hotspot Troubleshooting

Ordinarily, when you carry your laptop into an airline lounge, it will automatically connect to the available Wi-Fi hotspot. But what if that doesn't happen? First, check that your notebook's Wi-Fi adapter is turned on. Often, you'll see a backlit Wi-Fi icon near the keyboard. If the icon isn't illuminated, look for a physical switch that you can flip to enable the adapter. Sometimes, the state of your network connection is easily determined by an icon in the notification area of the Taskbar. For instance, a red X on the network icon indicates the adapter is disabled, while an asterisk means the adapter is in the process of detecting the available networks. You can right-click the network icon in Windows 7 or Win8 and select Troubleshoot Problems. When the Windows Network Diagnostics utility opens, it will reset your connection, disable the wireless adapter, and then enable the adapter again.
The utility will display descriptions
of the problems it detects along with
some recommended solutions. In most
instances the utility will repair the connection and report the issue as Fixed.
To enable a disabled adapter, right-click
the Network Connections icon, click
Open Network And Sharing Center,

select Change Adapter Settings, and


then right-click the name of the wireless
adapter. In the resulting menu, you can
choose to disable or enable the adapter,
connect to or disconnect a network, and
diagnose problems, among other options. Click Properties to access detailed
options that may help you troubleshoot
the problem.
When your adapter is working properly, Windows may display a message
indicating there are several available
wireless networks. Select the message
and choose a network SSID (service
set identifier, or name) from the list.
(You may need to input a security password.) To display a list of available
networks in Win 8, go to the Settings
option in the charm bar and click the


Available Networks icon. If the adapter is working and your system appears to be connected, but you still can't access the Internet, check for a browser-based splash screen and/or a Terms Of Use statement to agree to. Launch a fresh browser session and click the Home icon to redirect.
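If you are comfortable with a command prompt, Windows' built-in netsh utility reports the same adapter information the Taskbar icon does; running netsh wlan show interfaces lists each wireless interface's name, state, and connected SSID. The Python wrapper below is just a convenience sketch around that command (Windows only):

# Minimal sketch: query wireless interface state via Windows' built-in
# netsh tool. Works only on Windows; run it from any command prompt.
import subprocess

result = subprocess.run(
    ["netsh", "wlan", "show", "interfaces"],
    capture_output=True, text=True,
)
for line in result.stdout.splitlines():
    stripped = line.strip()
    # Surface the fields most useful for troubleshooting.
    if stripped.startswith(("Name", "State", "SSID", "Signal")):
        print(stripped)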

Fix Broken Outlook PST & OST Files

The PST (personal storage table) file and the offline OST (Outlook Data File) are where Outlook stores messages, calendar events, and notes specific to your email account. If one of these files becomes corrupted, you may find yourself ousted from Outlook. There are a few things, however, that you can do to get a foot in the door.
Microsoft's Inbox Repair tool, Scanpst.exe (Outlook 97-2003, 2007, 2010, and 2013), lets you solve busted PST/OST problems quickly. To access the tool, close Outlook and navigate to C:\Program Files\Microsoft Office\OFFICE12. (This last folder may have a different number; for instance, our version of Office 2013 stores the utility in the \OFFICE15 folder.) Double-click Scanpst.exe. By default, the address for our OST file was already listed, but if the field is blank, look in the C:\Users\USERNAME\AppData\Local\Microsoft\Outlook\ folder. Click the Options button to access the Replace, Append, or No Log functions and click OK. Click Start to begin the scanning process. Windows will inform you of any errors and prompt you to perform a repair when the scan is complete. Before clicking the Repair button, make note of the scanned file's backup location. Click Repair, and click OK when you see the Repair Complete message. Launch Outlook to see if this fixes the problem.

If the file structure was corrupted beyond repair, Scanpst.exe resets your file structure and rebuilds the headers. The Recovered Personal Folders item in your Outlook folders list, if it appears, will contain all the data that is recovered. You can then drag the data to your new PST file and delete the Recovered Personal Folders item from Outlook.

The Microsoft Outlook Inbox Repair Tool (Scanpst.exe) lets you quickly recover corrupted Outlook PST and OST files.
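Because the OFFICE folder number changes with each Office release, it can be faster to search for Scanpst.exe than to guess at paths. Here is a small Python sketch that does so with the standard library; the two search roots are common install locations, not guaranteed ones:

# Look for Scanpst.exe under the usual Office install roots. These
# roots are common defaults; your installation may live elsewhere.
from pathlib import Path

search_roots = [
    Path(r"C:\Program Files\Microsoft Office"),
    Path(r"C:\Program Files (x86)\Microsoft Office"),
]

for root in search_roots:
    if not root.exists():
        continue  # skip roots that aren't present on this machine
    for hit in root.rglob("SCANPST.EXE"):
        print(hit)  # e.g., ...\OFFICE15\SCANPST.EXE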

A Touchy Touchpad
If you use your laptop on a dock
(and use an external mouse and keyboard), you can go weeks or months
with a deactivated touchpad and never
realize it until you hit the road. If you
find yourself in this situation, you


can activate the touchpad by pressing the Fn (function) key simultaneously with the F number key associated with the laptop's touchpad (often labeled with an image of a touchpad). Using this key combination will either automatically activate the touchpad or display a device settings dialog box that gives you the option to enable your touchpad. Alternatively, you can check the notification area in the lower-right corner of the screen for a touchpad icon. Click the icon and the touchpad control panel appears, where you can enable or disable an input device.

An Unresponsive Keyboard Or Mouse

If your programs and applications don't respond to keyboard commands, use your mouse to shut down the computer by clicking Start, then Shut Down (in Win7), or tap the Power button and tap Shut Down (in Win8). Unplug the keyboard from your PC and then reconnect it. Restart your PC to determine whether this process corrected the problem. (If both input devices are unresponsive, you can press and hold the Power button on the tower to manually shut down your system.)
If your mouse isn't responding, but your keyboard is, press the Windows key in Win7 to open the Start menu, use the Right-Arrow key to select Shut Down, and then press ENTER. In Win8, press CTRL-ALT-DELETE, press the Tab key until the power icon is highlighted, and then press ENTER. Unplug your mouse and then reconnect it. (If necessary, you can press and hold the Power button to shut down the PC.) Then restart your computer to see if these instructions fix your problem.
If you're using a wireless keyboard and mouse, ensure that the peripherals are synced and in range of the wireless receiver. You may also need to install new batteries. If these steps don't enable peripheral communication with the PC, try reinstalling device drivers. You can often download these from the mouse and keyboard manufacturers' websites.

