
Interoperability and

Portability
Aradhna Chetal, Balaji
Ramamoorthy, Jim Peterson,
Joe Wallace, Michele Drgon &
Tushar Bhavsar
This section addresses portability and interoperability as it
Cloud Security Alliance relates to moving data and applications securely to and from
Group 4 the cloud.

8/22/2011
Table of Contents

Overview ............................................................................................................ Page 2

What is Interoperability? ................................................................................... Pages 3-8

What is Portability? ............................................................................................ Pages 9-12

Why Do Interoperability and Portability Matter? ............................................ Pages 13-14

What do Interoperability and Portability Mean for Different Cloud Models? .... Pages 15-18

What are the Benefits of Interoperability and Portability? .............................. Page 19

Example Use Cases ............................................................................................. Pages 20-22

What are the Risks Involved with Interoperability and Portability? ............... Pages 23-28

Emerging Standards and Technology ............................................................... Pages 29-31

Sources ................................................................................................................ Page 32

Page 1
Overview

Is cloud computing really so different? It uses the same underlying elements that make up most
computing environments. Hardware, software, networks, and applications all interact to provide critical
business functions. If the cloud environment uses these same elements, what makes it unique? This section
addresses this question as it relates to portability and interoperability when moving data and applications
securely to and from the cloud. The advent of cloud computing offers unprecedented scalability for an
organization's IT processing and administrative capabilities, beyond what traditional in-house
infrastructures can provide. This brings with it the potential for less impact on IT resources and reduced capital costs.
Almost instantaneously, additional capacity can be added, moved, or removed in response to dynamically
changing processing needs. A new application support system can be spun up to meet increased demand
in a matter of hours rather than weeks. Should demand fall back, the additional capacity can be shut down
just as quickly, with no left-over hardware sitting idle. Gaining the benefits of this more elastic
environment requires appropriate planning to avoid being locked into a cloud solution that may not
measure up to the goals for moving to the cloud in the first place.

Why portability and interoperability are important to cloud computing strategies can be
understood by referencing Cloud Taxonomies (see the Domain 1 material), which illustrate the
various components from which a cloud environment may be composed. Combining components into a
solution introduces boundaries between different components. Boundaries here can be defined as any
distinct division across which data or applications move or operate. Different types of boundaries may
exist between the various components of a cloud system, such as operational boundaries or trust
boundaries. Some examples include boundaries between physical locations (internal vs. external), across
operating systems, across different platform or service providers, between applications or application
components, and between separate or distinct data stores.

Data and application processing routinely crosses these boundaries in the course of normal
business operations, or as data and applications migrate to new providers or platforms. Ensuring
operational integrity across these boundaries as processing needs move into the cloud is a critical
consideration which can be addressed through portability and interoperability. To avoid business
disruption, organizations moving to the cloud must address both as part of risk management and
security assurance considerations when planning a cloud program.

Page 2
What is Interoperability?
The concept of interoperability as it applies to cloud computing is, at its simplest, the requirement for
the components of a processing system to work together to achieve their intended result. Components
should be replaceable by new or different components from different providers and continue to work.
Components that make up a system consist of the hardware and software elements required to build and
operate it. Typical components required of a cloud system include:

Hardware
Operating systems
Virtualization
Networks
Storage
Software (Application frameworks, middleware, libraries, applications)
Data
Security

It should be apparent to most readers that these components make up almost any IT system. You
might ask how these components, and therefore the interoperability concerns, differ from those of any
other IT system, where all of the components would logically be expected to work together as well. If
cloud computing uses all of these same components, how is the cloud different?

The difference as it relates to interoperability within the cloud can be introduced by referencing the
architectural framework for cloud computing. This framework describes cloud computing as the
separation of application and information resources from the infrastructure and mechanisms used to
deliver them (reference Domain 1 What is Cloud Computing).

Essentially, in cloud computing, the underlying components of a system that provide for the
infrastructure and delivery of an application function may be abstracted. Other applications that interact
with the application function will receive the same processing capabilities as if the application were
provided through traditional integration models which have greater dependency on where and how a
process operates in order for them to work.

Abstracting an application within cloud computing continues to provide the same capabilities, but
receiving those capabilities has no dependency on the infrastructure components needed for the cloud
application to operate. It can be running on any type of hardware or OS, be written in any language and it
may reside anywhere. The application function provides a service, accessible over the internet, that
interacts with other application modules that themselves may reside and operate from anywhere.
Abstracting what an application does from where and how it does it introduces unique issues for
maintaining interoperability between applications as they move to the cloud.
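The abstraction described above can be sketched in a few lines of Python. The storage backends below are hypothetical stand-ins, not any provider's actual API; the point is that the application function depends only on a stable interface, so the component behind it can be swapped without change.

```python
from abc import ABC, abstractmethod

class BlobStore(ABC):
    """Stable interface: callers depend on this, not on any one provider."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InHouseStore(BlobStore):
    """Stand-in for an internal, infrastructure-bound store."""
    def __init__(self):
        self._data = {}
    def put(self, key, data):
        self._data[key] = data
    def get(self, key):
        return self._data[key]

class CloudStore(BlobStore):
    """Stand-in for an external cloud provider's store."""
    def __init__(self):
        self._objects = {}
    def put(self, key, data):
        self._objects[key] = data
    def get(self, key):
        return self._objects[key]

def archive_order(store: BlobStore, order_id: str, payload: bytes) -> bytes:
    # Application logic is identical regardless of which backend is plugged in.
    store.put(order_id, payload)
    return store.get(order_id)

# Swapping providers requires no change to archive_order.
results = [archive_order(s, "order-42", b"invoice") for s in (InHouseStore(), CloudStore())]
```

Either backend produces the same result, which is the essence of interoperable, replaceable components.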

Interoperability impacts often arise within cloud computing when changing business needs drive a
change of service provider. While a system may be functioning to specifications with one cloud
provider, will it continue to do so with a new provider? Lack of interoperability can lead to being locked
into one cloud service provider. Inevitable business decisions will result in changes over time that may lead
to a change in providers. Reasons for this change include:

An unacceptable increase in cost at contract renewal time

Page 3
A provider ceases business operations.
A provider suddenly closes one or more services being used, without acceptable migration plans.
Unacceptable decrease in service quality, such as a failure to meet key performance requirements
or achieve service level agreements (SLAs).
A business dispute between cloud customer and provider.

The degree to which interoperability can be achieved or maintained when considering a cloud project
often will depend on the degree to which a cloud provider uses open, or published, architectures and
protocols. Refer for a moment to the Jericho Cloud Cube which defines a means for differentiating
between cloud models. An important dimension in this cube defines a measure for interoperability based
on the degree of ownership of the technology behind a cloud solution.

The Cloud Cube defines an axis that identifies two degrees for interoperability as proprietary and
open. Cloud providers whose services are based on technology that is not publicly documented,
or that impose provider-specific technology licensing requirements, fall on the proprietary side of the
axis. Alternatively, cloud providers that offer services built on open or published technologies, or that
rely on recognized standards, fall on the open side of the axis.

The degree to which a provider offers a service based on published or standardized technology
typically will increase the interoperability across cloud systems. The reason for this is that other vendors
providing similar services can build their offerings from the same published specification, which results
in interoperable technologies from many vendors. Either option may provide the best fit for a given need;
however, the use of open and published technologies offers a greater opportunity for
interoperability and reduces the chance of being unavoidably locked into a single provider's offering.

For example, a company may seek to reduce costs to maintain the components required to run an in-
house order entry subsystem. Some components of the order system may remain in-house, while others
may be moved to an external cloud provider. Reasons for such a move may be to take advantage of cost
savings opportunities or new features offered by the provider. The interoperability expectation
would be that the subsystem continues to operate to specification within the larger customer
fulfillment system. This change should not cause the larger order processing and fulfillment operation to
stop working beyond a manageable time period required to complete the switch, or to provide any level of
service less than before the switch to the cloud.

Similarly, changing from one hosted storage provider to another that may offer compelling storage
and transfer cost savings should not result in the moved data being unavailable for periods unacceptably
longer than would be expected for an in-house storage migration of similarly sized data sets. Continued
access to and use of that data by the users and systems that require it should remain similarly unaffected
after the move.

Almost any change related to a cloud migration can be expected to have some operational impact on
the affected systems. Moving data sets and applications will typically result in the business function
being unavailable during the planned period of the move. If appropriate considerations for
interoperability are made, the impact of a service change can be both manageable and cost effective,
avoiding unanticipated costs and unacceptable delays.

Page 4
When appropriate interoperability between components is attained, companies can effectively deploy
cloud solutions from a single cloud provider or from many providers as best meets their needs. Since
cloud applications may be composed of separate elements running from potentially many separate sites,
cloud computing introduces another unique issue which intersects with interoperability requirements.
The issue is how to ensure all the geographically diverse components work together. Interoperability
must be retained for all components of a system and this consideration must extend to cover the
possibility that services and workloads may shift. All components will require appropriate orchestration
in order to continue to operate correctly and securely regardless of where they are hosted or on what
platform.

Interoperability considerations and recommendations for cloud systems include:

Hardware - Hardware will inevitably vary from provider to provider, leaving unavoidable
interoperability gaps if direct hardware access is required.

Recommendations:

o Whenever possible, use virtualization to remove any hardware level concerns.


o If hardware must be addressed, important security considerations are to make sure that
the same or better physical and administrative security controls exist when moving from
one provider to another.

Virtualization - While virtualization can remove concerns about physical hardware, distinct
differences exist between common hypervisors such as Xen, VMware, and others.

Recommendations:

o Consider using open virtualization formats such as OVF to ensure interoperability.

Frameworks - Different platform providers offer different cloud application frameworks, and
differences do exist between them that affect interoperability.

Recommendations:

o Investigate the APIs to determine where differences lie and plan for any changes
necessary that may be required to application processing when moving to a new provider.
o Use open and published APIs to ensure the broadest support for interoperability between
components and to facilitate migrating applications and data should changing a service
provider become necessary.
o Avoid service providers that supply services using unpublished proprietary APIs.

Page 5
o Applications in the cloud interoperate over the internet, and outages can be anticipated to
occur. Determine how failure in one component will impact others and avoid stateful
dependencies that may risk system and data integrity when a remote component fails.

Storage - Storage requirements will vary for different types of data. Structured data will most
often require a database system, or require application-specific formats. Unstructured data will
typically follow any of a number of common application formats used by word processors,
spreadsheets, and presentation managers.

Recommendations:

o Store unstructured data in an established portable format such as the interoperable ZIP
format for both reduced storage and transfer requirements.
o Understand the size of data sets hosted at a cloud provider. Use interoperable data
compression for data moved to and from the cloud to reduce the volume of data being
moved and the time required to move it. Typically there is a cost to move each byte of
data into and, when needed, out of the cloud, as well as costs while it is stored.
o Aggregate files into ZIP archives to reduce complexities of moving sets of multiple files.
o Ensure the storage format selected interoperates regardless of the underlying platform.
Data may need to be accessible from mobile, to desktop, to mainframe.
o Check for compatible database systems and assess conversion requirements if needed.
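The aggregation and compression recommendations above can be illustrated with Python's standard zipfile module; this is a minimal in-memory sketch, and the file names and contents are invented.

```python
import io
import zipfile

# Hypothetical data sets destined for cloud storage.
files = {
    "orders.csv": b"id,amount\n" + b"1001,250.00\n" * 500,
    "report.txt": b"Quarterly summary. " * 200,
}

# Aggregate the loose files into a single compressed ZIP archive.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    for name, data in files.items():
        zf.writestr(name, data)

archive_size = buf.tell()
original_size = sum(len(d) for d in files.values())

# One archive replaces a set of multiple files, and regular data (CSV
# extracts, logs) shrinks substantially before transfer.
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
    names = sorted(zf.namelist())
```

Because ZIP is a published, widely implemented format, the resulting archive remains readable regardless of the platform that later consumes it.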

Security - Data and applications in the cloud reside on systems you don't own and likely have
only limited control over. A number of important items to consider for interoperable security
include:

Recommendations:

o Make sure authentication controls for system and user account access credentials are
compatible to ensure continued and consistent system access integrity and security.
This includes maintaining the same authentication factor level (i.e., single vs.
dual factor).
Assess whether user or process authentication requires a change in credentials.
If smart cards or other removable key storage are utilized, make sure support is
provided, or use roaming credentials.

o The most important way to protect sensitive data moved to the cloud is through
encryption.
Encrypting data before it is placed into the cloud will ensure that it cannot be
accessed inappropriately within cloud environments.

Page 6
Avoid losing control of encrypted data by holding the keys yourself, and avoid
entrusting them to a third party, even if that third party is your service provider.
If only you hold the keys, only you can access your data.
Use only interoperable encryption, such as that available through published
specifications like ZIP or OpenPGP, that directly and persistently protects data
and files regardless of the platform, storage system, or location where they reside.
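A minimal sketch of encrypting data client-side before it leaves for the cloud. This uses the third-party Python `cryptography` package's Fernet recipe purely as a stand-in for a portable published format such as ZIP or OpenPGP; the sample data and key handling are illustrative, not a production design.

```python
# Sketch only: `cryptography` is a third-party package (pip install cryptography);
# Fernet stands in here for a portable encryption format such as ZIP or OpenPGP.
from cryptography.fernet import Fernet

# Generate and hold the key yourself -- it is never handed to the provider.
key = Fernet.generate_key()

plaintext = b"customer ledger: acct 1001, balance 250.00"   # invented sample data
ciphertext = Fernet(key).encrypt(plaintext)                 # what actually gets uploaded

# The provider stores only ciphertext; only the key holder can recover the data.
recovered = Fernet(key).decrypt(ciphertext)
```

Because encryption happens before upload, the data remains protected no matter which provider, platform, or storage system it later passes through.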
o Not all information used within a cloud system may qualify as confidential or fall under
regulations requiring protection. Assess and classify data placed into the cloud. Avoid
the additional work of encryption and key management if the nature of the data does not
warrant it. Encrypted assets should be protected using encryption that is based on
established standards such as AES and providers should comply with any geographic
regulatory mandates such as the U.S. Federal Information Processing Standard (FIPS)
140-2, or national boundary restrictions. Moving to the cloud does not remove
compliance requirements.
Never rely on an encryption algorithm that has not undergone industry scrutiny.
Make sure security services of your provider adhere to the same regulatory
mandates to which you and your data must conform.

o Avoid unnecessary and potentially costly steps of re-encrypting and re-keying
confidential data that may be required if data moves between disparate cloud
environments.
Use only interoperable encryption to protect files such as is available using ZIP
or OpenPGP.

o When keys must be available within the cloud, investigate how and where keys are stored
to ensure access to existing encrypted data is retained.
Understand which staff of your service provider may have access to encryption
keys and under what rules of engagement they may use them.
Understand your responsibilities and liabilities should a compromise occur due to
unanticipated gaps in protection methods offered by your service provider.
If password encryption is used, determine if a stronger means such as a
public/private key may be required to avoid difficulties of sharing a password for
collaborative data.
If keys are provided by your service provider, make sure you understand what
these keys protect. Do they directly encrypt your data, or do they protect only
other keys?

o Applications utilizing service-oriented architectures remain in a state of evolving
security. Look for gaps that may exist when introducing services. Compensate by
ensuring data is protected through portable encryption formats. Require applications to
support reading and writing the necessary application data both when it is unencrypted
and when it is protected using portable data encryption formats.

Page 7
o API security keys used for calls to services requiring authentication should interoperate,
and appropriate maintenance and protection of keys must exist on new platforms.

o Data integrity measures should be incorporated to ensure data remains unaltered while in
the cloud.
Use watermarks where appropriate to identify ownership.
Increase the use of digital signing of files and documents to validate data
integrity and identity.
Incorporate time-stamps on digital signatures to fix content and integrity marks
in time.
Where native document formats do not support digital signing, protect
documents in interoperable formats that do. Examples are ZIP and OpenPGP.
Both of these formats provide digital signing for any type of document or file.
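The recommendation to bind a time-stamp into a digital signature can be sketched as follows. For simplicity this uses an HMAC from the Python standard library as a stand-in for a true public-key digital signature; the key value and envelope layout are assumptions for illustration only.

```python
import hashlib
import hmac
import json
import time

# HMAC stands in for a public-key digital signature; the principle of fixing
# content and a time-stamp together under one signature is the same.
signing_key = b"demo-signing-key"  # illustrative; keep real keys out of source code

def sign_document(content: bytes, key: bytes) -> dict:
    envelope = {
        "sha256": hashlib.sha256(content).hexdigest(),  # fixes the content
        "signed_at": int(time.time()),                  # fixes the moment in time
    }
    payload = json.dumps(envelope, sort_keys=True).encode()
    envelope["signature"] = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return envelope

def verify_document(content: bytes, envelope: dict, key: bytes) -> bool:
    claim = {"sha256": envelope["sha256"], "signed_at": envelope["signed_at"]}
    payload = json.dumps(claim, sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, envelope["signature"])
            and hashlib.sha256(content).hexdigest() == envelope["sha256"])

doc = b"Q3 revenue figures"
env = sign_document(doc, signing_key)
ok = verify_document(doc, env, signing_key)
tampered = verify_document(b"Q3 revenue figures (edited)", env, signing_key)
```

Any alteration to the content, the time-stamp, or the signature itself causes verification to fail, which is exactly the integrity property the bullet points above call for.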
o Log file information should be handled with the same level of security as all other data
moving to the cloud. Log files contain important information for system investigation
and analysis and often contain highly sensitive information that must be protected.
Investigate how log files are created and managed within the cloud environment
where processing occurs.
Determine how log analysis tools interoperate with existing log management to
ensure the routine log management continues without disruption or determine
what new tools are required to support logging requirements for a cloud service.
Make sure sufficient timeliness is provided by log reporting facilities to ensure
alerting and response continuity is maintained and ensure SLAs for cloud
providers address this.
Make sure log retention requirements can meet regulations.
When completing a move to the cloud that includes migration of log information,
make sure copies of log files are properly deleted from the original system to
avoid leaving this now duplicated information behind.
In-house log files are generally not well protected today, and expectations
should not be set too high that more robust solutions are yet available from
cloud offerings.
Encryption of log file data must occur. Choose an encryption method that is
interoperable across platforms. It is recommended to use secure ZIP
compression on all log files to protect them and to store them efficiently. Log
files typically have a high compression factor, often as high as 90%.
Apply the same diligence to the encryption keys used for log files as you do for
all other data placed into the cloud.
Encrypting log files will inhibit search functions. This is a tradeoff that must
be weighed, as searching and security tend to work at odds with each other.
Searching typically requires indexing, which leaves sensitive information
exposed.
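The high compression factor cited for log files above is easy to demonstrate with the standard library; the log format below is invented, but any similarly repetitive log data behaves the same way.

```python
import zlib

# Synthetic log data: timestamped lines with a small set of repeating fields,
# much like real application or audit logs.
log = b"".join(
    b"2011-08-22 10:00:%02d INFO auth service=orders user=u%03d result=ok\n"
    % (i % 60, i % 500)
    for i in range(5000)
)

compressed = zlib.compress(log, 9)      # DEFLATE, the same algorithm ZIP uses
savings = 1 - len(compressed) / len(log)
```

On repetitive log data the space savings are dramatic, so compressing before encrypting reduces both storage and transfer costs with no loss of information.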

Page 8
What is Portability?
The concept of portability as it applies to the cloud provides for application and data components
to continue to work the same way when moved from one cloud environment to another without having to
be changed. Portability is achieved by removing dependencies on the underlying environment. A
portable component can be moved easily and reused regardless of the provider, platform, operating
system, location, storage or other elements of the surrounding environment.

Portability of applications means that an application running on one cloud platform can be moved
to a new cloud platform and operate correctly without having to be re-designed, re-coded, or re-compiled.
If the new cloud platform is a different operating platform, such as when an application moves from a
Windows host to a Linux host, the application will run without needing to be changed.

Application portability can be significantly enhanced by using higher-level languages such as Java,
which provide more portable application support through greater abstraction from the operating
platform. Lower-level languages can be used, but abstraction should then be considered through service-
oriented architectures.
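One concrete form this abstraction takes is letting the language runtime handle platform differences. The sketch below uses Python's pathlib so the same file-handling code runs unchanged on a Windows or a Linux host; the directory and file names are purely illustrative.

```python
import tempfile
from pathlib import Path

# pathlib abstracts the host's path conventions (separators, drive letters),
# so this code is identical on Windows and Linux.
with tempfile.TemporaryDirectory() as tmp:
    data_dir = Path(tmp) / "app" / "data"      # no hard-coded "/" or "\\"
    data_dir.mkdir(parents=True)
    (data_dir / "config.ini").write_text("[storage]\nprovider=cloud\n")
    found = sorted(p.name for p in data_dir.iterdir())
```

Hard-coding OS-specific paths, by contrast, is exactly the kind of environmental dependency that breaks portability when a workload moves to a different host platform.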

Portability of data means that databases, data files or other data elements used within application
or user processing can be moved to any new environment and used without requiring changes to the data
format or to the applications that use it. Applications processing data should be assessed to ensure
appropriate security remains while the data is in use. Leaving unencrypted data or encryption keys within
application memory buffers is not recommended.

Data persisting outside the processing boundaries of applications or databases should be protected
so it can be stored or moved anywhere without risk of exposure that may occur as data passes across
many of the gaps present today in many data security systems. Inherently, these systems are not able to
fully protect data since they operate as point solutions capable only of securing data within the
boundaries in which they operate. Examples of solutions commonly deployed which cannot fully secure
data through its useful life include secure channels, encrypted drives, or secure messaging.

Since data frequently moves through any number of paths, point methods of securing files
require additional diligence to ensure that, as data moves from one point to another, each point retains
the required level of security. To achieve this, data often must be encrypted as it reaches a new
point and then decrypted as it passes to the next. The risk arises that any change to a point may
expose the data. This risk increases with cloud computing, since each point may reside within the
services of different providers.

For example, a sensitive data file may undergo encryption and decryption within an SSL/TLS
channel as it is being passed over a network from one process to another. As it leaves the encrypted

Page 9
channel, it may be destined for storage with a storage service provider. As it arrives at the storage facility
it undergoes another encryption operation as it is written to storage.

Rather than relying on point solutions for encrypting data choose an encryption method that
operates directly on the data independent of the applications or systems that may process, move or store
it. Portable data encryption should be used to encrypt sensitive files moving to the cloud. Either of two
published formats should be considered to persistently and portably protect data.

The OpenPGP format provides one method for encrypting individual files. The ZIP format
provides an important alternative that uses algorithms and encryption strengths of the same durability as
OpenPGP and adds archiving capabilities that make it easier to work with multiple files
across different platforms. ZIP also includes industry-leading portable data compression to reduce a file's
size before encryption, which provides the additional benefit of reducing impacts to storage and transfer
bandwidth. Both formats are universally recognized, and it is likely that a versatile cloud solution will
need to support encrypted data in both formats.

As an example, information that resides protected within a database facility must also be protected
when that data is used outside the database system. Report queries, log files, and other extracts should be
securely zipped to ensure information is not compromised.

Additional considerations for portable data encryption include the types of keys that can be used
and whether metadata can be encrypted. Passwords remain the most common method of keying.
Public/private key methods are available in either the X.509 or OpenPGP format. Metadata is
information about a file that is stored within the encrypted format. Important metadata to encrypt
includes file names and file sizes. Choose a portable format such as secure ZIP that includes protections
for important information about your data, not just the data itself.
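The metadata point above is easy to demonstrate with a standard (non-secure) ZIP archive: even if each entry's contents were encrypted, the archive's central directory still records entry names and original sizes in the clear unless a format that also protects metadata is used. The file name below is invented.

```python
import io
import zipfile

# Build an ordinary ZIP archive (Python's zipfile does not encrypt entries).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("payroll-2011.csv", b"employee,salary\n" * 100)

# Anyone holding the archive can read this metadata without any key: the
# central directory exposes entry names and original sizes even when the
# entry contents themselves are encrypted.
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
    info = zf.infolist()[0]
    visible_name = info.filename
    visible_size = info.file_size
```

A file name like the one above can itself reveal sensitive facts, which is why the text recommends portable formats that encrypt metadata, not just content.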

Moving to the cloud can simplify many of the concerns of traditional systems, replacing
traditional costs such as infrastructure and power with a pay-as-you-use-it approach. Making this type
of move requires addressing appropriate portability issues to ensure essential business needs continue to
be met. Existing applications and data when moved from traditional processing models to the cloud will
encounter portability considerations unique to the cloud.

There are a number of issues standing in the path of moving to the cloud. Portability
considerations and recommendations that impact moving to the cloud include:

Management control - Managing systems and applications becomes much more difficult with a
move to the cloud. Traditional tools operate effectively within the established enterprise boundary, but
they lack the ability to manage services as they move outside the perimeter. Adopt management tools that
can monitor internal as well as external applications regardless of location. Understand what
management control capabilities are provided by a selected service provider and be prepared to adopt a
set of new management tools to maintain hosted functions. This can result in supporting more than one
management tool to cover internal and external systems.
Page 10
Service level - Meeting service level agreements (SLAs) will involve both distance and
boundary transitions that can impact the ability to meet the SLAs a company must honor for
its own customers. Applications that run across the internet from geographically dispersed locations
will not consistently perform to the level of traditional applications. Are the SLAs from a cloud service
provider sufficient to meet the SLA requirements of your customers? If you move to a new provider, can
they achieve the same level of performance? Determine who has the burden of proof for a possible
violation.

Fault management - Services moved from traditional environments can be expected to lack
integration points with existing management tools used to monitor, report, and remedy system faults.
Determine how events will be received to detect and avoid disruptions to service. Log files and messages
may contain sensitive material. Make sure event reporting does not become a point of compromise or a
cause for lack of compliance.

Different architectures - Systems in the cloud may reside on disparate platform architectures.
Moving to the cloud or changing to a new service provider within the cloud can be impacted by
architecture differences. Leading platforms such as Google App Engine, Force.com and Amazon all
provide some degree of support for moving applications. However, each is architected differently enough
so that moving from one to another is not easy. Appropriate portability assessments must be made to plan
for adjustments required to ensure both data and application portability are maintained.

Security integration - End-to-end security remains a requirement for cloud systems to ensure
compliance and data confidentiality are maintained. Cloud systems introduce unique portability concerns
for maintaining security, including:

o Authentication and identity mechanisms for user or process access to systems now must
operate across all components of a cloud system. Single sign-on using authentication
against an enterprise Active Directory may pose a risk when outside components cannot
authenticate to an internal LDAP repository. Ensure authentication extends appropriately
to the divergent systems required for a cloud application.
o Two-factor systems should be assessed to determine whether they will be recognized within
cloud systems where remote system elements are not directly accessible to removable card
or token key storage devices.
o Access controls must extend from the internal environment to the cloud. Preventing
access to users within the corporate perimeter is an established practice today. Cloud
applications running on the internet now require access management to appropriately
restrict access for anyone using the internet.
o Data encryption must be applied to data moving to the cloud. The systems on which
your data resides will not be owned by your organization, and they will most often be
managed and maintained by outside parties. To ensure there is no loss of control of
sensitive data or risk of falling out of compliance, encrypt data before it moves to the
cloud. Remove risks to portability by placing data into universally portable secure
formats that are published, such as OpenPGP or ZIP.
o Encryption keys should not be turned over to cloud providers.

Page 11
Metadata is an aspect of digital information that is often and easily overlooked, generally
because metadata is not directly visible when working with files and documents; it is not part of the
main document or file content. Metadata is data about data, and it has been used for years in the
IT industry. Familiar examples include file owner or time-stamp information. This metadata about a
file records information useful for managing the file but is not needed for most uses of the file.
Users of a spreadsheet typically are most concerned with the rows, columns and charts within the
document and rarely consider the metadata that accompanies that spreadsheet wherever it goes.

Metadata becomes an important consideration in the cloud, simply because this metadata does
move with the document. Encryption functions that protect important financial information for a
spreadsheet may easily leave the metadata for the file unprotected. Further, as the use of metadata evolves
from storing basic information to storing much more advanced information about files and the users of
those files, the risk of exposure through metadata leakage increases, especially since metadata often is
unseen by design.

As the use of metadata expands, portability concerns arise over whether the metadata can be used
and protected within cloud systems, and when moving from one cloud system to another. The use of
metadata increasingly relies on XML to provide for portability of this information. Because XML adds to
the storage requirements for each file, document formats that combine XML to preserve portable
metadata with ZIP compression provide for efficient and portable packaging methods for documents and
files.

Information retention requirements for metadata may fall under the same regulation-driven
retention requirements as apply to the data itself. When moving files and their metadata to new cloud
environments, make sure copies of file metadata are securely removed to prevent this information from
remaining behind and opening a possible opportunity for compromise.

Why Do Interoperability and Portability Matter?
Reasons for moving to the cloud vary from business to business. Each business will make cloud
decisions based on differing needs and objectives. Decision making when selecting suitable cloud service
and deployment models requires analysis of which cloud model is the best fit for a defined need.

Organizations looking to host business processing in the cloud may choose the IaaS model to
extend their infrastructure needs for OS platform support, storage, email and messaging, or other
extensions of basic IT infrastructure. Companies needing application support may decide on the SaaS
model to access business application services for functions such as Customer Relationship Management
(CRM), business collaboration software, or e-commerce. Those looking to expand or move custom
application processing to the cloud may look to the PaaS model for cloud development frameworks. A
single business may likely find they need to adopt each of these models over time as more and more of
their information processing support moves to the cloud.

A factor for successful cloud deployment is achieving processing compatibility for cloud systems
with that of the traditional systems they replace. The ability to move to the cloud requires that
applications now hosted internally can run in the cloud. Applications that worked on in-house
infrastructures must continue to work with the same capabilities and reliability as they move to the cloud.
Storage solutions hosted within internal NAS or SAN facilities must continue to support the same storage
and access needs regardless of which cloud storage provider is selected. Applications built on cloud
frameworks must meet the same business requirements and development efficiencies as in-house efforts.
Once established, each of these scenarios must also provide on-going compatibility should changing
business needs require changing any of the underlying cloud components on which a solution depends.

Portability and interoperability combine to provide compatibility of cloud solutions. They are important considerations for cloud planning because together they ensure that cloud solutions continue to operate (through interoperability) and do so unchanged (through portability). When portability and interoperability between components are not addressed, unanticipated processing failures are the likely result, with the associated costly disruption of business continuity.

Portability and interoperability must be considered whether the cloud migration is to public,
private, or hybrid cloud deployment solutions. They are important elements to consider for service model
selection regardless of whether a migration strategy is to Software as a Service (SaaS), Platform as a
Service (PaaS), or Infrastructure as a Service (IaaS). Failure to appropriately address portability and
interoperability in a cloud migration may result in failure to achieve the desired benefits of moving to the
cloud. Ignoring these aspects of cloud migration can result in costly problems or project delays due to
factors that should be avoided such as:

Application, vendor, or provider lock-in - choice of a particular cloud solution may mean you cannot easily move in the future to another cloud offering from another vendor
Processing incompatibility and conflicts causing disruption of service - provider, platform, or application differences may expose incompatibilities that cause applications to malfunction if a new cloud platform is chosen
Unexpected application re-engineering or business process change - moving to a new cloud provider can introduce a need to rework how a process functions or require coding changes to retain original behaviors
Costly data migration or data conversion - lack of interoperable and portable formats may lead to unplanned data changes when moving to a new provider
Retraining or retooling for new application or management software
Loss of data or application security - differing security policies or controls, key management, or data protection between providers may open undiscovered security gaps when moving to a new provider or platform.

Portability and interoperability are not considerations unique to cloud environments and their
related security aspects are not new concepts brought about by cloud computing. However, the open and
often shared processing environments that exist within the cloud bring a need for even greater precautions
than are required for traditional processing models. Multi-tenancy means data and applications reside
with data and applications of other companies and that access to confidential data (intended or
unintended) is possible through shared platforms, shared storage and shared networks.

Data moving outside protected organization boundaries means encryption is mandatory; traditional perimeter-based security measures are not sufficient in the cloud. To ensure portability and interoperability of data in transit to, and stored within, the cloud, make sure data is protected before placing it in the cloud, and make sure keys remain in the hands of authorized company staff.
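The principle above (encrypt locally, upload only ciphertext, keep the key in-house) can be illustrated with a toy one-time pad. This is a hedged sketch only: real deployments would use a standard format such as OpenPGP, and the function names here are illustrative:

```python
import secrets

def encrypt_before_upload(plaintext: bytes):
    """One-time-pad demo of client-side encryption: the key never leaves the client."""
    key = secrets.token_bytes(len(plaintext))             # generated and kept in-house
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key                                # only ciphertext goes to the cloud

def decrypt_after_download(ciphertext: bytes, key: bytes) -> bytes:
    """Recover the data locally; the provider never had the key."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

record = b"confidential sales figures"
stored_in_cloud, key_kept_locally = encrypt_before_upload(record)
assert stored_in_cloud != record                          # the provider sees only ciphertext
assert decrypt_after_download(stored_in_cloud, key_kept_locally) == record
```

The design choice being illustrated is key custody: whatever cipher is used, the encryption happens before data crosses the organization boundary, so a provider-side compromise exposes only ciphertext.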

What do Interoperability and Portability Mean for Different Cloud Models?
As defined in the earlier section, interoperability means the ability of multiple cloud platforms to work together, or interoperate. The primary goal of interoperability is to make it easier to adopt the cloud.

Compatibility is the next natural step in achieving this interoperability. It is the ability of the application and the data to work the same way irrespective of service model (IaaS, PaaS, SaaS), deployment model (private, public, hybrid), or location (internal or external to the enterprise).

Portability means application components can be moved easily and reused regardless of provider, platform, OS, infrastructure, location, storage, data format, API, etc. The prerequisite for interoperability is the existence of abstraction between application, data, logic, and system interfaces.

One of the key factors in cloud portability and interoperability is data portability. While systems interoperability is primarily the concern of the cloud service provider, issues around data interoperability and portability remain critical. The onus is on the cloud consumer to make sure the data is portable, as the consumer owns the data. Data should be in a format that is sharable between cloud providers, since without the ability to port data it becomes impossible to switch cloud providers.

Organizations must approach the cloud with the understanding that they may have to change
providers in the future. Portability and interoperability must be considered up front as part of their cloud
planning strategy. It is advisable to do basic business continuity planning to help minimize the impact of
worst-case scenarios when various companies might find themselves with urgent needs to switch cloud
providers.

In this section we will look at what portability and interoperability mean for the various cloud models (both service and deployment models) and the factors cloud customers should consider with respect to portability and interoperability.

Before we look into the various cloud models, all cloud customers should acknowledge the following:

Substituting cloud providers is in virtually all cases a negative business transaction for at least one
party, which can cause an unexpected negative reaction from the legacy cloud provider. This must be
planned for in the contractual process as outlined in Domain 3, in your Business Continuity Program
as outlined in Domain 7, and as a part of your overall governance in Domain 2.
Understand the size of data sets hosted at a cloud provider. The sheer size of data may cause an
interruption of service during a transition, or a longer transition period than anticipated. Many
customers have found that using a courier to ship hard drives is faster than electronic transmission for
large data sets.

Document the security architecture and configuration of individual component security controls so they
can be used to support internal audits, as well as to facilitate migration to new providers.

Infrastructure as a Service (IaaS)

As the responsibility of the cloud provider is to provide basic computing utilities such as storage,
computing power etc., a majority of application design tasks related to interoperability rests on the cloud
customer. The cloud provider should provide standardized hardware and computing resources that can
interact with various disparate systems with minimal efforts. The Cloud provider should strictly adhere to
industry standards to maintain interoperability. The provider should be able to support complex scenarios
such as cloud brokerage, cloud bursting, hybrid clouds, multi- cloud federation etc.

Recommendations

Understand how virtual machine images can be captured and ported to new cloud providers, who
may use different virtualization technologies. Ex: Distributed Management Task Force (DMTF)
Open Virtualization Format (OVF).
Identify and eliminate (or at least document) any provider-specific extensions to the virtual machine
environment.
Understand what practices are in place to make sure appropriate de-provisioning of VM images
occurs after an application is ported from the cloud provider.
Understand the practices used for decommissioning of disks and storage devices.
Understand hardware/platform based dependencies that need to be identified before migration of
the application/data.
Ask for access to system logs, traces, and access and billing records from the legacy cloud provider.
Identify options to resume or extend service with the legacy cloud provider in part or in whole if
new service proves to be inferior.
Determine if there are any management-level functions, interfaces, or APIs being used that are
incompatible with or unimplemented by the new provider.
Understand the costs involved in moving data to and from a cloud provider.
Determine what means are supported for moving data to the cloud as efficiently as possible, such as standard data compression capabilities.
Understand what security is provided and who maintains access to encryption keys.
Encrypt data before placing it in the cloud, and retain control of all encryption keys to avoid risks from multi-tenancy or careless or disgruntled cloud provider staff.
Encrypt using standard formats such as OpenPGP or ZIP to ensure portability.
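The recommendation above to identify provider-specific extensions to the virtual machine environment can be partially automated. The sketch below is hedged: the OVF envelope is a trimmed, invented sample, and the namespace check is deliberately naive, flagging any XML namespace in an OVF descriptor that is not a DMTF or W3C one:

```python
import re

# A trimmed, illustrative OVF descriptor; real descriptors are far larger.
OVF_SAMPLE = """<Envelope
  xmlns="http://schemas.dmtf.org/ovf/envelope/1"
  xmlns:rasd="http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/CIM_ResourceAllocationSettingData"
  xmlns:acme="http://cloud.example.com/proprietary/1">
</Envelope>"""

STANDARD_NS_PREFIXES = ("http://schemas.dmtf.org/", "http://www.w3.org/")

def nonstandard_namespaces(ovf_xml: str):
    """Return namespace URIs that are not DMTF/W3C, i.e. likely provider extensions."""
    namespaces = re.findall(r'xmlns(?::\w+)?="([^"]+)"', ovf_xml)
    return [ns for ns in namespaces
            if not ns.startswith(STANDARD_NS_PREFIXES)]

print(nonstandard_namespaces(OVF_SAMPLE))  # → ['http://cloud.example.com/proprietary/1']
```

Anything this scan flags is a candidate for the "eliminate or at least document" treatment before attempting to port the image to a new provider.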

Platform as a Service (PaaS)

The cloud provider is responsible for providing a platform on which consumers can build their systems. It provides a runtime environment and an integrated application stack, allowing developers to quickly develop and deploy custom applications on the offered platforms without the need to build the infrastructure. The cloud provider supplies the entire infrastructure and its maintenance to its consumers.
Recommendations

When possible, use platform components with a standard syntax, open APIs, and open standards.
Ex: Open Cloud Computing Interface (OCCI)
Understand what tools are available for secure data transfer, backup, and restore.
Understand and document application components and modules specific to the PaaS provider, and
develop application architecture with layers of abstraction to minimize direct access to proprietary
modules.
Understand how base services like monitoring, logging, and auditing would transfer over to a new
vendor.
Understand what protections are provided for data placed into the cloud and for data generated and maintained in the cloud.
Protect data using standard encryption formats such as OpenPGP or ZIP, and retain control of all encryption keys.
Understand control functions provided by the legacy cloud provider and how they would translate
to the new provider.
When migrating to a new platform, understand the impacts on performance and availability of the
application, and how these impacts will be measured.
Understand how testing will be completed prior to and after migration, to verify that the services
or applications are operating correctly. Ensure that both provider and user responsibilities for
testing are well known and documented.
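The recommendation above to develop "layers of abstraction to minimize direct access to proprietary modules" is essentially the adapter pattern. A minimal sketch follows; every class and method name here is hypothetical, standing in for a real PaaS SDK:

```python
from abc import ABC, abstractmethod

class QueueService(ABC):
    """Application code depends only on this abstraction, never on a vendor SDK."""
    @abstractmethod
    def send(self, message: str) -> None: ...

class VendorAQueue(QueueService):
    """Adapter wrapping a hypothetical provider-specific messaging module."""
    def __init__(self):
        self.log = []
    def send(self, message: str) -> None:
        # In real code this would call the provider's proprietary API.
        self.log.append(("vendor-a", message))

class VendorBQueue(QueueService):
    """Adapter for a second, equally hypothetical provider."""
    def __init__(self):
        self.log = []
    def send(self, message: str) -> None:
        self.log.append(("vendor-b", message))

def place_order(queue: QueueService, order_id: str) -> None:
    """Business logic is unchanged when the provider behind the adapter changes."""
    queue.send(f"order:{order_id}")

q = VendorBQueue()          # swapping providers is a one-line change
place_order(q, "1042")
print(q.log)                # → [('vendor-b', 'order:1042')]
```

The business function `place_order` never imports a vendor module, so migrating to a new PaaS provider means writing one new adapter rather than rewriting the application.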

Software as a Service (SaaS)

The cloud provider provides application capabilities over the cloud, and the client manages only his or her operations and the information flowing in and out of the system. The client needs only a browser; the majority of administration at all levels rests with the provider.

Recommendations

Perform regular data extractions and backups to a format that is usable without the SaaS provider.
Understand whether metadata can be preserved and migrated.
If possible, use data escrow services.
Understand that any custom tools being implemented will have to be redeveloped, or the new
vendor must provide those tools.
Assure consistency of control effectiveness across old and new providers.
Assure the possibility of migration of backups and other copies of logs, access records, and any
other pertinent information which may be required for legal and compliance reasons.
Understand management, monitoring, and reporting interfaces and their integration between
environments.
Determine whether there is a provision for the new vendor to test and evaluate the applications before migration.
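The first recommendation above, regular extraction to a format usable without the SaaS provider, can be as simple as dumping records to CSV. The record fields below are invented for illustration:

```python
import csv
import io

def export_to_csv(records):
    """Serialize SaaS records to CSV, a format usable without the SaaS provider."""
    if not records:
        return ""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=sorted(records[0]))
    writer.writeheader()
    writer.writerows(records)
    return out.getvalue()

# Hypothetical CRM records pulled from a provider's export API.
crm_rows = [
    {"account": "Acme", "stage": "proposal"},
    {"account": "Globex", "stage": "closed-won"},
]
backup = export_to_csv(crm_rows)
print(backup.splitlines()[0])  # → account,stage
```

Run on a schedule, an extraction like this gives you a provider-neutral copy of your data, which is the fallback that makes switching SaaS vendors feasible at all.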
Private Cloud

A private cloud is one that consumers run within their enterprise, or a private cloud offering they use from a cloud provider.

Recommendations

Ensure interoperability exists between common hypervisors such as KVM, VMware, and Xen.
Ensure standard APIs are used for management functions such as user and privilege management, VM image management, virtual machine management, virtual network management, service management, storage management, infrastructure management, information management, etc.

Public Cloud

Interoperability in the public cloud means exposing the most common cloud interfaces. They may be vendor-specific or open specifications and interfaces such as OCCI, libcloud, etc.

Recommendations

Ensure that the cloud providers expose common and/or open interfaces to access all cloud functions
in their service offering.

Hybrid Cloud

In this scenario the consumer's local private infrastructure should have the capability to work with external cloud providers. A common scenario is cloud bursting: to meet peak demand and better serve its customers, an enterprise can share load with external cloud providers.

Recommendations

Ensure that the cloud providers expose common and/or open interfaces to access all cloud functions
in their service offering.
Ensure the ability to federate with different cloud providers to enable higher levels of scalability.
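Cloud bursting as described above boils down to a routing decision based on current load. A deliberately simplified sketch follows; the threshold and capacity numbers are made up:

```python
LOCAL_CAPACITY = 100  # requests/sec the private infrastructure can absorb (assumed)

def route_request(current_local_load: int) -> str:
    """Send overflow traffic to an external provider once the private cloud is saturated."""
    if current_local_load < LOCAL_CAPACITY:
        return "private-cloud"
    return "public-cloud"   # burst: peak demand spills over the federation boundary

# During normal operation everything stays in-house; at peak, traffic bursts out.
print(route_request(40))    # → private-cloud
print(route_request(250))   # → public-cloud
```

Real implementations hide this decision behind a load balancer or federation layer, but the interoperability requirement is the same: the public-cloud side must accept the workload unchanged, which is exactly why the common-interface recommendations above matter.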

What are the Benefits of Portability and Interoperability?
Infrastructure abstraction - Application developers and administrators no longer need to worry about the hardware the application is running on. Hypervisors give abstraction from the underlying hardware, removing compatibility concerns.
Abstraction between application, data, logic, and system interfaces - This level of abstraction provides an agile application development process, portability, modularity, and loose coupling. Enterprises no longer need to worry about applications and data residing in the same location.
Cloud adaptability and customization - Provides the ability for enterprises to adopt the cloud and to customize cloud environments to fit their needs.
Avoiding vendor lock-in - Portability and interoperability standards give consumers the ability to switch cloud providers without lock-in to a particular provider.
Openness - Transparency is one of the key requirements of cloud computing. It gives consumers confidence in their business continuity planning in the event they want to switch providers.

Example Use Cases

The following scenarios are based on a person we will call Tim. Tim is the IT manager responsible for the sales management tool and its history at his corporation. The use cases explore the concepts of moving this sales management tool from in-house to the cloud, then to another cloud vendor, and back in-house. The sales management tool tracks sales opportunities and the process they go through. It consists of a web interface and a database containing transactions and history.

Tim is facing many decisions that affect his budget, both for people and for hardware.

Move from In-house to Cloud:


o Tim is maintaining an old sales management tool that was built in-house, requires multiple engineers, and will soon require a hardware upgrade. This expense is taking a large portion of his IT budget. He has found two cloud vendors that could help reduce his cost.
PaaS - The first cloud vendor has a configured hardware and software platform to run his internal applications as well as the historical data. Concerns include creating new IDs for each user and setting up permissions. Management of these IDs is separate from corporate ID management, so procedures and processes will need to be put in place to update both identity management systems. There is a possibility of some type of federation offering cross-coordination of the identities, but this too will need to be wrapped in procedures and processes. Example: if someone leaves the company, Tim will need to remove the ID from both systems. The downside of this solution is that Tim will still need to keep his engineers working on the system.
SaaS - The second vendor offers a turnkey sales management tool. Tim may not need as many engineers on the project and would have a predictable cost for the service. The downside with many off-the-shelf solutions is that you must tailor your business process to the software, including identity management. The security concerns are largely the same as for the PaaS solution: two places to manage IDs and permissions, and processes and procedures to ensure proper maintenance, with the added burden of maintaining the historical data (because SaaS solutions rarely have the same database structure as the historical data), which could not be moved to the SaaS vendor.
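Tim's deprovisioning problem, removing a leaver's ID from both the corporate and the cloud identity systems, can be scripted so the two stores never drift apart. Both directory objects below are in-memory stand-ins for real identity APIs:

```python
corporate_ids = {"tim", "alice", "bob"}          # stand-in for the corporate directory
cloud_ids = {"tim", "alice", "bob", "svc-app"}   # stand-in for the cloud vendor's IDs

def deprovision(user: str) -> None:
    """Remove a departing user from BOTH identity systems in one procedure."""
    corporate_ids.discard(user)
    cloud_ids.discard(user)

def drift() -> set:
    """Human IDs present in the cloud but missing from the corporate directory."""
    return cloud_ids - corporate_ids - {"svc-app"}

deprovision("bob")           # someone leaves the company
print(drift())               # → set(): the two systems stay in sync
```

Wrapping both removals in one procedure is the "process and procedure" the use case calls for; the `drift` check is a cheap audit that catches an ID left behind in only one system.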
Move from One Cloud provider to Another Cloud provider:
o Tim uses a turnkey sales management tool hosted by a cloud provider. The tool does not quite match his sales process, so Tim is changing vendors to a sales management tool that more closely aligns with his company's sales process.
SaaS - The new vendor is a competitor of the old vendor, so interoperability would not be high on their list of priorities. On the other hand, if the vendors built on standards as recommended above, the work to move data would be much easier. Identity management becomes an issue: translating identities and permissions from one vendor to another would most likely be work done by Tim. Cloud brokers (addressed later) may provide the expertise to accomplish this movement.
PaaS & IaaS - The issue with both platform and infrastructure cloud vendors is that they are most likely built on different technologies. For example, one platform vendor may be Windows-based and another Linux-based. Infrastructure vendors that provide just CPU and storage, and allow the customer to pick a platform, would likely facilitate the movement of virtual machines from vendor to vendor.
Move parts of applications from In-house to the cloud:
o Part of Tim's sales management application provides information about a customer for the salesperson prior to visits. This portion of the application needs to be accessible from anywhere. Moving this portion of his application to the cloud allows access to the information and allows him to test moving to the cloud.
Identity management and data transfer from on-premises to the cloud application
Move part In-house and part on another cloud:
o Tim cannot find the perfect sales management tool that fits his new sales processes. Tim decides to develop part of the application in-house and leave the data in the cloud.
PaaS - Data warehousing services provide this type of benefit. The applications can be developed in the cloud or in-house.
Move part to Cloud 1 and the rest to Cloud 2:
o Tim has decided to move the application to an application provider and the database to a database provider.
Move from the Cloud(s) to In-house, partially or fully:
o Tim's company has just been acquired. The new company has its own in-house sales management tool. Tim's sales management tool is in the cloud and needs to be moved to the in-house tool.
IaaS & PaaS - Having your cloud-based application built on a platform and an infrastructure provides the option of moving the application in-house by recreating the infrastructure or platform in-house and then moving the application. When moving to an established in-house application, many of the same considerations as moving to the cloud apply. Data migration and transformation are critical, as are identity mapping and data access.
SaaS - Moving from a cloud application to an in-house application is about changing business processes. Consider leaving the cloud application in place for a period of time while its history and information are still needed.

Other use case considerations

When you use cloud provider #1, and cloud provider #1 in turn uses other cloud providers such as #2 and #3, your data is spread across providers #1, #2, and #3 while you have no direct relationship with #2 and #3. What issues arise when you try to migrate out of this arrangement?
o Data migration from one vendor to another means pulling the data in-house, translating it to a new format or schema, and then moving it to its new destination.
o Identity management and permissions on the data also need to be translated.
The cloud broker or aggregator:
o When you use services from a cloud aggregator, each of those services could be offered by different cloud providers while the consumer is entirely unaware of it. What issues arise when you try to move to another cloud aggregator?
Cloud brokers hold the promise of solving many of these migration issues. The idea is that a cloud broker understands many cloud vendors and can facilitate the migration of data and identities to another cloud vendor. Cloud brokers may specialize in data migration, application migration, or identity migration.
Cloud Identity and Access Management
o The key to many of these scenarios is identity management, whether moving from an in-house application to a cloud vendor or from one cloud vendor to another. Identity Management as a Service providers are just beginning to become available.
o Identity Management as a Service providers should provide:
Strong passwords and policies that can be managed by the customer. These standards should exceed internal standards.
The ability to leverage this management within the firewall as well.

What are the Risks Involved with Interoperability and Portability?
According to recent IEEE research, portability and interoperability are the biggest challenges to cloud adoption, ranking even higher than security. Most organizations feel they lose control of their critical business data. There is no doubt that several risks are involved with portability and interoperability in cloud environments; some of them are enumerated below.

Some companies have huge databases, sized at petabytes of data. Moving such databases into the cloud, or from one provider to another, can be slow and painful, especially if there is a lack of appropriate supporting tools.

Several of the public cloud offerings are built on proprietary technologies rather than open standards, and cloud providers often do not support heterogeneous deployments.

The lack of integration between various cloud networks makes it difficult to migrate applications and data among them, and diminishes the realization of cost savings.

There is a lack of standards that cloud providers must abide by to help facilitate the migration of data and applications in and out of their environments.

Enterprise tools that enable management of security and availability in the cloud in the same way as in a data center do not exist today.

Any cloud migration decision needs to be made in the larger context of related application portfolio and infrastructure component management.

Taking advantage of the cloud's scalable infrastructure requires that development teams have experience with concurrent programming, workload decomposition, and data partitioning. The development team should also have experience debugging distributed systems.

Applications may explicitly require file or block storage services and may have specific performance requirements, retention policies, or compliance restrictions. There is a risk these requirements may not be completely met when migrating from one vendor to another, which may mean additional work for the development teams to make the applications compliant in the new cloud provider's environment.
The design of the application, including how the distributed components interact and what messages or data are exchanged in each transaction, can be an issue when migrating from one vendor to another, as the infrastructure environment may be different.

Online transaction processing (OLTP) systems can have stringent requirements regarding
distributed transactions and support for a two-phase commit protocol. Application semantics may
also require custom orchestration of transactions.
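The two-phase commit protocol mentioned above can be sketched minimally: the coordinator first asks every participant to prepare, and commits only if all vote yes. The participants here are in-memory stubs, not real resource managers:

```python
class Participant:
    """In-memory stand-in for a resource manager in a distributed transaction."""
    def __init__(self, name, can_commit=True):
        self.name, self.can_commit, self.state = name, can_commit, "init"
    def prepare(self) -> bool:           # phase 1: vote yes/no
        self.state = "prepared" if self.can_commit else "aborted"
        return self.can_commit
    def commit(self):                    # phase 2a: finalize
        self.state = "committed"
    def rollback(self):                  # phase 2b: undo
        self.state = "aborted"

def two_phase_commit(participants) -> bool:
    """Commit everywhere only if every participant votes yes in the prepare phase."""
    if all(p.prepare() for p in participants):
        for p in participants:
            p.commit()
        return True
    for p in participants:
        p.rollback()
    return False

ok = two_phase_commit([Participant("db"), Participant("queue")])
print(ok)      # → True: both prepared, so both committed
failed = two_phase_commit([Participant("db"), Participant("queue", can_commit=False)])
print(failed)  # → False: one 'no' vote rolls everything back
```

The migration-relevant point is the coordination itself: if the new provider's infrastructure cannot host or reach all participants with the required guarantees, transactional applications like this cannot simply be moved across.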

A quantitative analysis of the number of users (including concurrent users) who will be accessing the application is another important factor to consider for capacity and migration planning in the new cloud environment.

Interactive applications are sensitive to network and processing latency, which affects usability. Because the distance between users and a remote service provider adds latency, teams may need to consider regional proximity. Ideally, the selection of the new cloud provider should be based on a guarantee of an acceptable response-time range.

Licensing requirements: Some cloud providers may not allow in-house software licenses to be migrated to their environment and may require new licenses to be purchased for deployment, causing additional expense. The precise licensing requirements for third-party software required by applications/services should be evaluated upfront and negotiated with the cloud vendor.

For an application to function at optimum performance, the impact of data rates and transaction volumes on data transfer can be an issue. If the application requires a specific transport or network protocol for certain functions, or to provide access to users, these should be considered when planning migration or portability of applications to a particular vendor.

In the case of SaaS applications, relevant data can become locked in because it is difficult to access and use. SaaS vendors design the information model for their own convenience and may not even publish their data models in the documentation for consumption by user organizations. SaaS integration with existing systems and processes may be more difficult than with other migration options because APIs may not be available, well built, well documented, or usable.

Inadequate service levels can be a significant issue for applications with strict quality-of-service requirements. Some risks to service levels cannot even be quantified prior to hosting applications in the cloud. Most public cloud providers do not offer a range of service levels and take a static approach to SLAs.

To maximize the ROI of funds already sunk into in-house application development and IT, organizations may want to leverage existing investments in development and operations skills, tools, infrastructure, and deployed applications. Identifying the net value of these investments can be challenging in some cases.

The lack of widely adopted standards for IaaS virtual machine formats can be an issue for IaaS migration by consumer organizations.

There are no portability audits available upfront that can certify that a particular application environment is portable, nor are cloud providers audited or certified on the basis of portability and interoperability.

Risk management of cloud providers from legal, compliance, and operational perspectives is another issue when it comes to interoperability. If the customer's applications have strict compliance or privacy requirements, the new cloud vendor should be evaluated and audited against those requirements before a decision to migrate is made.

Recommendations

Organizations are showing overwhelming interest in cloud computing for economic reasons and efficient resource management. Most IT environments will require significant changes and will face new challenges of cross-functional and cross-supplier integration. This will bring the long-term benefit of decoupling core and secondary functions so that they can be more effectively distributed. As the potential market for cloud service providers grows and new market competition is introduced, clients with highly portable applications and services will be able to negotiate improved cost savings, lower risk, and maximum ROI with better service-level features and options.

Identifying and selecting an optimal application migration option cannot be done in isolation. Any cloud migration decision, whether it concerns an application or infrastructure, needs to be made in the larger context of related application portfolio and infrastructure portfolio management. Precise migration goals and motivations must be identified, along with a detailed analysis of requirements and constraints across various aspects of quality of service. Several questions must be answered about the service/application, for example:

Is the application used for a key business function?

What is the future roadmap for the application?

Where is the application in its lifecycle: just released, or close to being retired?

What underlying technologies are used: architecture, framework, infrastructure, and integrations?

Cloud migration decisions will impact the application platform decisions. Other decisions
could be:

Data Center sourcing

What are the application platform foundations?


Which programming languages are used for development, and which developer skills are available in-house?

Presentation Technologies, Server Side frameworks

Application modernization requirements

Migration costs

How can the existing investment be leveraged?

Is there a need to build additional operational efficiencies?

Are the individual applications or services self-contained, with limited integration points that are permitted and supported by a cloud model?

Can applications and services use standard communication protocols and front-end access methods supported by both the client and the cloud service provider?

The end-user profile for the intended applications or solutions is open and inclusive, so that all intended end users will have continuous access to the applications or solutions ported to the cloud.

Security for the applications or services is reasonable and simple enough to be included in a cloud service model.

Client teams have pre-approved them for consideration and migration to the cloud.

The cloud service provider has reviewed and approved the applications/solutions and has established that the needs fit into the service provider's landscape of supported offerings.

The applications and services are supported by the behind-the-scenes platforms provided by the cloud service provider.

Other Policy, governance considerations could be:

The application and service SLAs are reasonable and supported by the cloud service provider for long-term maintenance and support. If the application has specific compliance or privacy needs, they must be supported by the cloud provider and should be enforceable at runtime and through audits.

The cloud migration decision may also be affected by the following issues:

- Because of its pay-as-you-go and self-service characteristics, migrating an application to a cloud
provider can support a department's autonomous operations. This principle affects the
migration decision because, without central IT support, some alternatives can be used more
readily than others in certain cases.

- Organizations favoring open solutions will have a greatly reduced choice of cloud
providers. Most alternatives present lock-in challenges related to the service offered, and "closed"
versus "open" can mean different things from virtualization, code, and data perspectives.

- The organization's sourcing principles may dictate the use of single vendors over best-of-
breed vendors and influence the decision when multiple migration options meet an application's
requirements. Some new cloud platform vendors may not have offerings corresponding to all
requirements or alternatives.

Hence it is important to follow a structured methodology and extensive analysis when
deciding to port applications to the cloud. A complete risk assessment must be
conducted for the applications and services being migrated to a cloud environment.

Inventory all the application components that need to be ported to the cloud. A pre-
migration audit should be conducted, including evaluation and ranking of all the
application or service components that will be migrated. This inventory should cover software
versions, software patch levels, software vendor license agreements, the number of users accessing the
application or service, any interfaces to third-party organizations or services, currently supported
hardware platforms, front-end or end-user access and connectivity options, bandwidth
requirements, availability requirements, service level agreements (including RTO and RPO minimums),
the application or service's business criticality level, and confirmation of established business
continuity procedures.
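As an illustration only, the pre-migration inventory described above can be captured in a simple structure and used to rank components for migration order. The field names, example applications, and ranking rule below are hypothetical, not a mandated schema:

```python
from dataclasses import dataclass, field

# Toy pre-migration inventory record for one application component.
# Fields mirror the audit items listed above; all values are illustrative.

@dataclass
class AppComponent:
    name: str
    sw_version: str
    patch_level: str
    license: str
    user_count: int
    third_party_interfaces: list = field(default_factory=list)
    rto_hours: float = 24.0   # SLA recovery time objective
    rpo_hours: float = 24.0   # SLA recovery point objective
    criticality: int = 3      # 1 = highest business criticality

    def rank(self) -> float:
        """Toy rule: less critical components with fewer users and fewer
        third-party integrations migrate first (higher rank)."""
        return (self.criticality * 100
                - self.user_count / 10
                - len(self.third_party_interfaces))

inventory = [
    AppComponent("intranet-wiki", "4.2", "p7", "GPL", 300, criticality=4),
    AppComponent("billing", "2.0", "p1", "vendor", 50, ["bank-gw"],
                 rto_hours=2, criticality=1),
]
# Migrate the highest-ranked (lowest-risk) components first.
order = sorted(inventory, key=AppComponent.rank, reverse=True)
print([c.name for c in order])  # ['intranet-wiki', 'billing']
```

In practice the ranking rule would reflect the organization's own audit criteria; the point is that capturing the inventory in a structured form makes the evaluation and ordering repeatable.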

A pre-portability analysis of governance should be conducted, as all portability activities
will need clearly defined customer and service provider governance. The governance
teams and auditors supporting the cloud migration will need to be available to ensure
that all requirements are met as defined by the pre-portability audit.

A pre-portability audit for security should be conducted in detail, so that all security requirements and
attributes for the application or service are outlined in a checklist and confirmed with the service
provider. In some cases, a higher level of service will be possible with certain cloud configurations.

Another important pre-portability activity is planning fall-back options: in the event that
a no-go is decided, for any reason, during the move to the cloud, a clearly defined fall-back
strategy and plan must be available to the customer. Any in-house fall-back processing will
be the responsibility of the customer. The cloud service provider will work with the customer to
ensure that all the necessary data and content are available to initiate and support the fall-back
plan.

In addition to extensive pre-portability planning and a good portability plan, maximum
ROI and low risk can be further protected with an audit process. An audit can be a
monthly or yearly point-in-time check to ensure that the optimum features, functions, and cost-
savings models of the cloud solution are being applied to the ported application or services.

After the pre-portability plan, it is important that the plan is orchestrated in the order of the
steps to be conducted, with an end-to-end plan that outlines the start-to-finish activities for the
initial migration to the cloud. This plan should also include a mock or dry-run
portability test. If the cloud service provider does not support such a test, then the live
portability activity will need to take place as a first-attempt activity that will then determine
the process for subsequent applications and user groups.

To minimize risk further, a portability audit must be conducted. Risk can be
lowered significantly using a continuous audit process that includes a complete review of
key risk factors, including any changes or updates that could affect the ongoing portability
or interoperability of the application or service. As long as the application or service remains
highly portable, customers will have the option of negotiating improved plans annually or at
service renewal time, which reduces the risk of being dependent on a single cloud service
provider. This ensures maximum ROI and significantly reduces the risk of a non-distributed
cloud service model for larger customers. Clients with many applications and services will
want the option of distributing them among different cloud service providers to maximize
ROI and mitigate risk; portability is crucial for such customers.

In spite of all the due diligence that goes into planning a cloud migration, risks and
unexpected events still occur. Following a sound risk management strategy (accept, avoid,
mitigate, and transfer) before placing applications and IT services in the cloud is therefore
prudent.
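The accept/avoid/mitigate/transfer decision can be sketched as a simple risk register. The risk names, scores, and decision thresholds below are hypothetical examples, not a prescribed methodology:

```python
# Illustrative risk register mapping cloud-migration risks to the four classic
# treatment strategies. All entries and thresholds are invented for this sketch.

RISKS = [
    # (risk, likelihood 1-5, impact 1-5)
    ("Provider outage exceeds SLA", 2, 4),
    ("Vendor lock-in via proprietary APIs", 4, 3),
    ("Data breach at provider", 1, 5),
    ("Minor latency variation", 4, 1),
]

def treatment(likelihood: int, impact: int) -> str:
    """Pick a strategy from the risk score (a toy decision rule)."""
    score = likelihood * impact
    if score <= 4:
        return "accept"      # low residual risk: document and monitor
    if impact >= 5:
        return "transfer"    # catastrophic impact: insurance / contract terms
    if likelihood >= 4:
        return "mitigate"    # likely events: design around them
    return "avoid"           # e.g., keep the workload in-house

for risk, l, i in RISKS:
    print(f"{risk}: {treatment(l, i)}")
```

A real register would also record owners, triggers, and the residual risk left after treatment, which feeds directly into the exit strategy discussed below.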

For risks that cannot be avoided, pre-determined, or quantified prior to hosting
applications in the cloud, no amount of up-front planning can mitigate these accepted or
residual risks. Unfortunately, many organizations do not have a plan to manage them.
Those organizations that do actively manage these risks reapply many of the same
risk management strategies (accept, mitigate, and transfer) toward creating a cloud exit strategy,
distributing services, and acquiring insurance to mitigate and transfer as much of the remaining
cloud risk as possible.

Having a cloud exit strategy, or a strategy to move from one cloud environment to another,
should be part of the decision to migrate to the cloud. It is prudent to define a set of
event triggers and actions that help the organization know when something is going awry,
what action to take, and the resources required to execute the exit strategy. Risk can also be
mitigated to some extent by using multiple sources, which might be internally managed or managed
through a third party such as a cloud broker or federated clouds. The residual risks can be
transferred: for example, losses that cannot be recovered after an outage, damage to a business's
reputation, or data lost in a breach can be covered by risk insurance, which may compensate for
measurable business loss and for the time required to rebuild data.

Emerging Standards and Technologies
Cloud consumers have various methods to interact with cloud services in the form of APIs and
interfaces. Important standards and technologies for cloud portability and interoperability are now
emerging, but it will take a long time for them to mature and be supported by the majority of
providers. Until these standards are widely adhered to, cloud brokers will play an important role in
the overall cloud ecosystem: they will abstract possibly incompatible capabilities and interfaces and
act as a proxy in advance of the arrival of common and open standards.

The following are a few of the many efforts centered on the development of both open and proprietary
APIs that try to address concerns such as management, security, and interoperability for the cloud.

The Open Grid Forum's (OGF) Open Cloud Computing Interface (OCCI) is a protocol and API
originally initiated to create a remote management API for IaaS, allowing the development of
interoperable tools for deployment, autonomic scaling, and monitoring. It has evolved into a flexible
API with a strong focus on integration, portability, interoperability, and innovation, and it supports
all service models.
The Open Data Protocol (OData) is a web protocol for querying and updating data that provides a way
to unlock data from applications. It is used to expose and access information from a variety of
sources such as relational databases, file systems, content management systems, and web sites.
DMTF's Open Virtualization Format (OVF) is a packaging standard designed to address the
portability and deployment of virtual appliances. A workload built for Amazon EC2 cannot be shifted
into Savvis or Terremark without first being reformatted. OVF addresses this problem by providing
an import format that allows a virtual machine to be recast into a form that can be recognized by a
hypervisor, say VMware's ESX Server. Cloud providers today do not allow virtual machines to
be exported directly in OVF format; a workload has to be exported and converted into OVF format
before it can be imported into another provider.
IEEE has created two working groups, P2301 and P2302. The P2301 working group
will provide a portability roadmap for cloud vendors, service providers, and their consumers; its
goal is a standard that lets users know that the services they buy meet standardized definitions and
measures. The P2302 working group will define the topology, protocols, functionality, and
governance for cloud-to-cloud interoperability and federated operations (hybrid cloud). The
resulting standard will provide an economy of scale that is transparent to users.
The Unified Cloud Interface (UCI) Project is an attempt to create an open and standardized cloud
interface for the unification of various cloud APIs. Its goal is to provide a single programming
point of contact that can encompass the entire infrastructure stack as well as emerging cloud-centric
technologies. A Resource Description Framework (RDF)-based language is used to describe a
semantic cloud data model (taxonomy and ontology).
The Cloud Computing Interoperability Forum (CCIF) was formed to enable a global cloud computing
ecosystem in which organizations can work seamlessly toward wider adoption of cloud computing.
The key focus of this group is to create a commonly agreed-upon framework or ontology that
enables two or more cloud platforms to exchange information in a unified manner. (The group
appears to be dormant.)
The Cloud Trust Protocol (CTP) was developed by CSC and is licensed at no cost to the CSA; it is
being integrated into the CSA's GRC Stack. CTP provides cloud consumers with the right
information to make correct information risk management decisions, allowing them to make
informed decisions about which processes and data can be moved to a cloud provider. CTP provides
transparency into the cloud: cloud customers can ask providers about configurations,
vulnerabilities, access, authorization, policy, accountability, anchoring, and operating status
conditions, giving consumers insight into the security configurations and health of the services they
have deployed in the cloud. Its success depends on service providers adhering to the protocol and
on cloud consumers demanding it.
OpenStack was originally founded by Rackspace and NASA and has grown into an open
source community in which developers and cloud computing technologists collaborate globally to
provide an open source cloud computing platform for public and private clouds. It delivers
simple-to-implement, scalable solutions for all types of clouds, with many features geared
toward the IaaS service model. It has three major interrelated projects:
o OpenStack Compute: open source software and standards for large-scale deployments of
automatically provisioned virtual compute instances
o OpenStack Object Storage: open source software and standards for redundant storage of
static objects
o OpenStack Image Service: discovery, registration, and delivery services for
virtual disk images.
For enterprises that want to restrict access to cloud services using their existing IAM
infrastructure, federation (WS-Federation, SAML) is the standard approach. Some service
providers have started supporting OAuth for accessing resources in the cloud.
A few other emerging standards that address users' privacy concerns with respect to accessing their
data include Mozilla's BrowserID, Microsoft's U-Prove (formerly CardSpace), and the Kantara
Initiative's UMA (User-Managed Access). UMA is built on OAuth and gives a uniform way of
managing access to resources across the web with a unified access engine. In the future it may
also help small businesses manage their employees' access to resources across different cloud
providers.
Homomorphic encryption, while not yet on a standards track, may provide a means to
operate on encrypted data within the cloud without requiring decryption. With homomorphic
encryption, operations on data can occur directly on the ciphertext and achieve the same result
as operating on the plaintext form of the same data. A useful application would be processing
encrypted database information that is stored in the cloud and protected with encryption. For
example, calculating a value from two encrypted values yields an encrypted result that, when
decrypted, has the same, correct value as would result from the same calculation on the
unencrypted values.
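A toy additively homomorphic scheme makes the idea concrete. The sketch below implements textbook Paillier with tiny primes, which is insecure and for illustration only: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so the party doing the arithmetic never sees the values.

```python
import random
from math import gcd

# Textbook Paillier cryptosystem with tiny demo primes (NOT secure).
# Additively homomorphic: Enc(a) * Enc(b) mod n^2 decrypts to a + b.

def keygen(p=293, q=433):
    n = p * q
    lam = (p - 1) * (q - 1)   # Euler's phi(n), the simple textbook variant
    g = n + 1                 # common simplification for the generator
    mu = pow(lam, -1, n)      # modular inverse of lambda mod n (Python 3.8+)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    while True:               # pick randomizer r coprime to n
        r = random.randrange(1, n)
        if gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    x = pow(c, lam, n * n)
    return (((x - 1) // n) * mu) % n   # L(x) = (x - 1) / n, then times mu

pub, priv = keygen()
c1, c2 = encrypt(pub, 17), encrypt(pub, 25)
c_sum = (c1 * c2) % (pub[0] ** 2)      # multiplying ciphertexts adds plaintexts
print(decrypt(pub, priv, c_sum))       # 42
```

Paillier supports only addition (and multiplication by a plaintext constant); fully homomorphic schemes, which also support multiplication of ciphertexts, remain far more computationally expensive.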

Areas that need more work

- Currently, if a cloud consumer wants to orchestrate or dynamically move a workload running on
Amazon EC2 to another provider such as Terremark, there is no easy way to move that workload
without first converting it into OVF format. Providers should support exporting workloads
in an open, interoperable format such as OVF.

- Currently, each hypervisor exposes its own API for monitoring the traffic crossing VM backplanes.
Standards need to emerge for these APIs that cloud consumers can leverage for
monitoring and reporting. This will allow cloud consumers to avoid retooling their applications
when they move from one provider to another.
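Part of what makes OVF attractive here is that a descriptor is plain, vendor-neutral XML that any tool can inspect. A minimal sketch, using a hand-written, heavily simplified envelope (real OVF 1.x descriptors carry more namespaces plus References and DiskSection elements):

```python
import xml.etree.ElementTree as ET

# Deliberately simplified OVF-style envelope; the system id, name, and OS
# value are invented for this example.
OVF = """<Envelope xmlns="http://schemas.dmtf.org/ovf/envelope/1">
  <VirtualSystem id="web-01">
    <Name>web-01</Name>
    <OperatingSystemSection>Linux</OperatingSystemSection>
  </VirtualSystem>
</Envelope>"""

NS = {"ovf": "http://schemas.dmtf.org/ovf/envelope/1"}
root = ET.fromstring(OVF)
# List every virtual system described by the package.
for vs in root.findall("ovf:VirtualSystem", NS):
    name = vs.find("ovf:Name", NS).text
    os_ = vs.find("ovf:OperatingSystemSection", NS).text
    print(name, os_)   # web-01 Linux
```

Because the format is self-describing, a target hypervisor (or a cloud broker) can read the same descriptor the source produced, which is exactly the interoperability gap the section above asks providers to close.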

Sources
http://msdn.microsoft.com/en-us/library/cc836393.aspx

http://blogs.msdn.com/b/eugeniop/archive/2010/01/12/adfs-wif-on-amazon-ec2.aspx

David Chappell, "Providing Single Sign-On to Amazon EC2 Applications from an On-Premises
Windows Domain: Connecting to the Cloud," December 2009.
http://download.microsoft.com/download/6/C/2/6C2DBA25-C4D3-474B-8977-
E7D296FBFE71/EC2-Windows%20SSO%20v1%200--Chappell.pdf

http://www.zimbio.com/SC+Magazine/articles/6P3njtcIjmR/Federation+2+0+identity+ecosystem

http://www.datacenterknowledge.com/archives/2009/07/27/cloud-brokers-the-next-big-
opportunity/

http://blogs.oracle.com/identity/entry/cloud_computing_identity_and_access

http://www.opengroup.org/jericho/cloud_cube_model_v1.0.pdf
http://www.burtongroup.com

http://www.pkware.com/appnote

http://www.apps.ietf.org/rfc/rfc4880.html

