
Big Data

Special Report
trading technologies for financial-market professionals

Sponsored by:

waterstechnology.com

June 2012

thomsonreuters.com

READY TO
OUTPERFORM?
HOW MUCH DO YOU EXPECT FROM
MACHINE-READABLE NEWS?
EXPECT MORE. WHETHER YOU ARE
RUNNING BLACK BOX STRATEGIES
THAT NEED SUB-MILLISECOND DATA OR
MANAGING MEDIUM-TO-LONG-TERM
INVESTMENTS, THOMSON REUTERS
NEWS ANALYTICS ENABLES YOU TO
OUTPERFORM THE COMPETITION.
Be the first to react to market-moving economic or company events.
Analyze thousands of news stories in real time to exploit market
inefficiencies or manage event risk. Use statistical output from our
leading-edge news analytics to power quant trading strategies across
all frequencies and provide additional support to your decision makers.
With unmatched depth, breadth and speed of news, razor-sharp news
analytics and both hosted and on-site deployment options, we have
everything you need to gain critical insight. And turn that insight into profit.
THOMSON REUTERS NEWS ANALYTICS. DISCOVER. DIFFERENTIATE. DEPLOY.

For more information: QED@thomsonreuters.com

© Thomson Reuters 2012. All rights reserved.


Thomson Reuters and the Kinesis logo are trademarks of Thomson Reuters.
48003923 001206.

KNOWLEDGE TO ACT

Editor-in-Chief Victor Anderson


victor.anderson@incisivemedia.com
tel: +44 (0) 20 7316 9090
US Editor Anthony Malakian
anthony.malakian@incisivemedia.com
European Staff Writers James Rundle
james.rundle@incisivemedia.com
Steve Dew-Jones
steve.dew-jones@incisivemedia.com
US Staff Writers Jake Thomases
jacob.thomases@incisivemedia.com
Tim Bourgaize Murray
timothy.murray@incisivemedia.com
Head of Editorial Operations Elina Patler
elina.patler@incisivemedia.com
Contributors
Max Bowie, Editor, Inside Market Data
Michael Shashoua, Editor, Inside Reference Data
Global Commercial Director Jo Garvey
jo.garvey@incisivemedia.com
tel: +44 (0) 20 7316 9474
US Commercial Manager Bene Archbold
bene.archbold@incisivemedia.com
US Business Development Manager Melissa Jao
melissa.jao@incisivemedia.com
European Business Development Executive Mark Garvey
mark.garvey@incisivemedia.com
Senior Marketing Manager Claire Light
claire.light@incisivemedia.com
Design Lisa Ling
lisa.ling@incisivemedia.com
Group Publishing Director Lee Hartt
Chief Executive Tim Weller
Managing Director John Barnes
Incisive Media Head Office
32-34 Broadwick Street
London
W1A 2HG, UK
Incisive Media US
55 Broad Street, 22nd Floor
New York, NY 10004
tel: +1 646 736 1888
Incisive Media Asia
20th Floor, Admiralty Center, Tower 2
18 Harcourt Road
Admiralty, Hong Kong, SAR China
tel: +852 3411 4888
fax: +852 3411 4811
Subscription Sales
Hussein Shirwa Tel: +44 (0)20 7004 7477
Dominic Clifton Tel: +44 (0)20 7968 4634
waters.subscriptions@incisivemedia.com
Incisive Media Customer Services
Haymarket House
28-29 Haymarket
London
SW1Y 4RX
Tel (UK): +44 0870 240 8859
Tel (International): +44 (0)1858 438421

To receive Waters magazine


every month you must subscribe
to Buy-Side Technology online,
Sell-Side Technology online or one
of our multi-brand subscription
options. For more information
and subscription details, visit
waterstechnology.com/subscribe

The Industry's Gold Rush


Starting in the mid-1800s with the onset of the California Gold Rush and culminating in the 20th century with the rise and proliferation of large factories and ever-more sophisticated techniques, mining has always been a money-maker. In the second decade of the 21st century, mining is once again big business: data mining, that is.

For those organizations that can assimilate, interrogate, and derive meaning from large, unstructured data sets, a fortune awaits. The judicious application of Big Data tools and technologies can go a long way toward addressing rapidly changing regulatory requirements, while traders can tap into the full potential of social media and other sentiment data, and risk managers can monitor their firms' counterparty, asset-class and country exposure on an intra-day basis.

In the financial services industry, data is king, and taming Big Data therefore holds the key to firms controlling large portions of their operating environments.

But the question remains: Will the capital markets be on the cutting edge of this fast-emerging revolution? When it comes to cloud computing, the adoption of mobile technology, the harnessing of social media data, and the implementation of field-programmable gate arrays (FPGAs) to super-charge compute-intensive processes, the capital markets have, by and large, lagged other industries in terms of adapting to change. Even in the realm of Big Data, the pharmaceutical industry and the military have been leading the charge.

But successfully addressing the Big Data challenge offers game-changing potential which, if fully utilized, can bring about a competitive advantage. Recently, State Street chief scientist David Saul spoke to Waters about the exciting prospect of attacking Big Data using semantic database technology. He described the technology as "cool and exciting stuff" that has limitless potential in the financial services industry.

With various technologies readily available, now is not the time to sit on the sidelines and wait for the technology to mature and trickle down. Now is the time to be an early adopter: this is where research-and-development dollars should be going. This is the financial services industry's gold rush.

Victor Anderson
Editor-in-Chief

Waters (ISSN 1068-5863) is published monthly (12 times a year) by Incisive Media. Printed in the UK by Wyndeham Grange, Southwick, West Sussex.
© Incisive Media Investments Limited, 2012. All rights reserved. No part of this publication may be reproduced, stored in or introduced into any retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior written permission of the copyright owners.


NYSE Technologies Bows Hosted TAQ Analytics Lab


NYSE Technologies, the data and trading
technology arm of NYSE Euronext, will
unveil a new service by the end of this
month, dubbed Market Data Analytics Lab
(MDAL), which will provide access to a
central, managed database of its historical
trade and quote (TAQ) data as well as a
range of hosted analytics and tools for
querying the data, enabling clients to
back-test and implement trading strategies
more easily and without the cost of
acquiring and managing the entire TAQ
database in-house.
An extension of the exchange's Capital Markets Community cloud-based connectivity platform, MDAL allows customers to "churn a lot of data within our environment before bringing it into theirs," especially when performing calculations on large volumes of market data within specific time periods, says Todd Watkins, product manager for US cash and data products at NYSE Technologies. "And it benefits NYSE Technologies because we can deliver the dataset they need rather than the entire database. But we will continue delivering the data via FTP and summary files by email," Watkins explains.
"Clients no longer have to buy and store these different datasets on their own site. Instead, they can download only the subset they need," says Brian Fuller, business development manager for global market data at NYSE Technologies, who adds that the service also includes a library of pre-built, commonly used functions, ranging from simple equations to more sophisticated, moving average-type calculations, which can be created in a simplified version of XML using drop-down menus accessed via a web interface.
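To make the idea concrete, here is a minimal sketch, in Python/pandas rather than MDAL's actual XML function library, of the kind of moving-average calculation the library describes. The file name and the "symbol", "price" and "timestamp" columns are assumptions for illustration, not NYSE's schema.

```python
import pandas as pd

# Hypothetical TAQ subset downloaded from the service.
taq = (pd.read_csv("taq_subset.csv", parse_dates=["timestamp"])
         .set_index("timestamp")
         .sort_index())

# One-minute bars per symbol, then a 20-minute moving average of the
# last trade price: the sort of pre-built function MDAL describes.
ma = (taq.groupby("symbol")["price"]
         .resample("1min").last()
         .groupby(level="symbol")
         .transform(lambda s: s.rolling(20, min_periods=1).mean()))
print(ma.head())
```

The point of the hosted model is that a query like this runs next to the full TAQ database, and only the small result set travels back to the client.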
MDAL provides historical TAQ data for NYSE Euronext's US equity markets, and the exchange is looking to expand the service to cover other asset classes traded on markets within its parent group, such as derivatives and bonds, as well as data from other exchanges in the Northeast US, and other away markets, based on client demand, Fuller says.

Users will be able to upload their own datasets into MDAL in spreadsheets or CSV files, and link them to the exchange's datasets to filter data according to their list of securities.
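In spirit, that linking step is a join. The following hedged sketch shows the shape of it in pandas; the file names and single "symbol" join column are invented for illustration, not MDAL's actual upload format.

```python
import pandas as pd

# Hypothetical user watchlist (one "symbol" column) and exchange dataset.
watchlist = pd.read_csv("my_securities.csv")
quotes = pd.read_csv("taq_quotes.csv")

# Keep only the rows for securities on the user's list.
filtered = quotes.merge(watchlist[["symbol"]].drop_duplicates(),
                        on="symbol", how="inner")
filtered.to_csv("quotes_for_watchlist.csv", index=False)
```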
Currently, users can buy TAQ content online as an FTP download, but cannot perform the calculations themselves in a managed fashion. With MDAL, not only will clients be able to perform calculations online using hosted datasets (and hence not have to manage the capture and storage of that data onsite), but they will also be able to download the resulting calculations and underlying data in a variety of file formats.
"We think we will see a wide range of users, starting with smaller buy-side and quantitative shops, or mid- and back-office staff in larger firms who don't need access to the entire dataset," Fuller says.
Pricing for MDAL will be a monthly per-user fee, the cost of which will vary according to the number of concurrent accesses, Watkins says.

Twitter Hedge Fund Eyes Rebirth as DCM Capital


Launched last year, Derwent Capital Markets' Absolute Return fund fused capital markets and social media, basing its investment decisions on sentiment analysis from Twitter. After posting decent results for its first month of trading, the fund went quiet, eventually wrapping up its operations for a new direction.
"We couldn't have timed it worse to try and launch a new and innovative fund, with the US losing its AAA rating and equity markets in freefall," says Paul Hawtin, CEO and founder of Derwent Capital Markets, which has now rebranded as DCM Capital.
The fund initially held around $40 million in seed capital. Hedge funds need a certain amount of capital, $100 million-plus, before they can reach critical mass, he says. "So, we made the decision to move out of the hedge fund industry, and open up our technology to the mass markets with an online trading platform that has a research tool embedded within it."
DCM plans to launch the platform, aimed at retail investors, in late summer. The technology


powering it has been developed in-house using the financial resources from Derwent's ill-fated venture, building on a core of the sentiment analysis tools initially built for the hedge fund.
"We've spent the last 18 months improving our technology. Initially, we were just focused on global sentiment, but now we're able to monitor it on any individual stock, currency or commodity," he says. DCM partnered with IG Group for the project. Client funds will be held by the latter; IG will then push the prices and tradable instruments to the platform.
DCM says it has developed a nuanced approach to sentiment analysis: the ability of funds to mine the sheer volume of data projected by social media, and generate alpha from that.
"It's quite a complex thing, but to simplify it, we listen to the Twitter firehose of data," says Hawtin. "We do look at a few others as well, and we're looking at growing that, but predominantly it's Twitter."
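DCM has not disclosed its models, but a deliberately naive sketch shows the shape of the problem: score each tweet against word lists and attribute the score to the instruments it mentions. Everything below, the word lists, the cashtag parsing and the tweets themselves, is invented for illustration and is far cruder than anything a production system would use.

```python
from collections import defaultdict

POSITIVE = {"beat", "upgrade", "bullish", "strong"}
NEGATIVE = {"miss", "downgrade", "bearish", "weak"}

def score(text: str) -> int:
    """Crude polarity: positive hits minus negative hits."""
    words = set(text.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

def sentiment_by_symbol(tweets):
    """Attribute each tweet's score to the cashtags it mentions."""
    totals = defaultdict(int)
    for tweet in tweets:
        s = score(tweet)
        for token in tweet.split():
            if token.startswith("$"):
                totals[token.upper()] += s
    return dict(totals)

print(sentiment_by_symbol([
    "$VOD upgrade looks strong today",
    "bearish on $BP after the miss",
]))   # {'$VOD': 2, '$BP': -2}
```

The hard part, which this sketch skips entirely, is exactly what Hawtin alludes to: filtering noise, sarcasm and spam out of the firehose at scale.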
While the platform will be largely web-based, DCM plans to use HTML5 to roll out iOS applications for Apple devices, with plans for a move to other devices. Other vendors have begun their own inroads into the space. Thomson Reuters recently released its own sentiment indicators, while other major vendors are looking at including sentiment in their market data feeds as an additional layer rather than an executable quality.
"What we've found is that companies have fantastic tools, but you get information overload," Hawtin says. "It's great, but what does that mean, and how can you use it to trade? With this, we've spent a lot of time focusing on how to refine it for the end-user to understand and trade off it."


Traders Seek Profits From Unstructured Data


Financial firms are increasing their focus on trying to derive value from analysis of unstructured data, from news to social media sources, though challenges to adoption remain around the trustworthiness and timeliness of non-traditional data sources.
The first challenge lies in determining how much structure a dataset contains, and whether it can be analyzed by existing tools used for other datasets. "There are at least three or four traditional areas of data that we're all used to dealing with: structured data, such as ticks and quotes; and semi-structured data, such as news feeds, because they have some structure applied in terms of a headline and fields that one can filter," says Mark Fischer, vice president of product management at CQG. "Then there's completely unstructured information, like Twitter or blogs, which have no structure associated with them. But we are finding ways to mine that information," Fischer says.
"Structured data is just like market data. For example, non-farm payrolls: we already have that in a structured format before it leaves the lockup," says Rich Brown, head of quantitative and event-driven solutions at Thomson Reuters. "Unstructured data is where the opportunities are. With structured data, the opportunity is over in a thousandth of a second. But unstructured data applies for much longer time horizons, and offers the largest opportunities for people to differentiate their strategies."
"Sentiment analysis has been around for a long time, but it is slow and not what people use in terms of high-frequency trading. So what people are trying to do now is figure out if they can get that any closer to real time," Fischer says. "I'm a skeptic about performing low- or semi-low-latency sentiment analysis for high-frequency trading. These tools won't be instantaneous reactions to the marketplace, but will be longer-term and more thoughtful."
Alexander Abramov, director and corporate relations committee chair for the Information Systems Audit and Control Association, says there's nothing new about the social media rumor mill and how it affects the decision-making process, adding that he hopes the industry will be the beneficiary of new technologies that enable firms to derive greater insight from analyzing unstructured data.
Steve Ellenberg of MDSL points to a key problem with basing decisions on data from social sources, especially those on which algorithms base trading decisions at sub-millisecond speeds: "News feeds are structured and have a certain authority. But there's a very low entry point to some forms of unstructured data and social media," he says.
These challenges can make it hard for developers to fine-tune their systems to get the most out of the morsels of legitimate value hidden in the universe of social media. "In terms of social media, there are a lot of engineering problems: The signal-to-noise ratio is very low, and from an engineering perspective, say, with Twitter, we are working with snippets of text, so there is a lot of research to be done to overcome these limitations," says Ron Papka, global head of client analytics and market data distribution at Citi.
However, he says that companies are increasingly using these channels to quickly disseminate important news or warnings to a mass audience. "Over time, more companies are releasing information over Twitter rather than by traditional means. For example, when Total had a gas leak on its North Sea platform recently, they didn't issue a press release; they tweeted it to get the information out there. So over time, this will become structured information for use in trading and risk management," Papka says.
Still, pending the development of more sophisticated analysis tools, much of the universe of unstructured data, from news to tweets, will be used for risk management, to halt trading strategies in the event of unexpected news. "If you look at the high-frequency space, there are uses, but it is used more to stop trading," Brown says, though he adds that new tools that can use this information more proactively may not be far away. "With news, you can get signals of the volume of news and can use that to build adaptive algorithms that react to news, rather than just stopping a strategy."
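As a rough illustration of the adaptive approach Brown describes, and not any vendor's actual implementation, a strategy might scale order size down, and only ultimately halt, as news volume climbs. The class name, window and thresholds below are invented for the sketch.

```python
from collections import deque

class NewsAwareStrategy:
    """Toy strategy that adapts order size to recent news volume."""

    def __init__(self, window=60, halt_at=50, scale_at=20):
        self.counts = deque(maxlen=window)   # per-second story counts
        self.halt_at, self.scale_at = halt_at, scale_at

    def on_news_tick(self, stories_this_second: int):
        self.counts.append(stories_this_second)

    def order_size(self, base_size: int) -> int:
        volume = sum(self.counts)
        if volume >= self.halt_at:
            return 0                  # news storm: stop trading entirely
        if volume >= self.scale_at:
            return base_size // 2     # elevated news flow: trade smaller
        return base_size              # quiet tape: trade normally

strat = NewsAwareStrategy()
for burst in [1, 2, 30]:
    strat.on_news_tick(burst)
print(strat.order_size(100))   # 50: elevated but not extreme news flow
```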

DTCC Expands Chennai Office to Bolster Operations


The Depository Trust & Clearing Corp. (DTCC) is expanding its India business center in Chennai into a technology infrastructure support and development office to help bolster its round-the-clock transaction-processing, funds delivery, and data storage businesses.
Over the next two years, the DTCC plans to expand its full-time staff in Chennai. In anticipation of this, the office has relocated to a larger site in Chennai where it can potentially acquire additional space as it expands.
The DTCC began working with technology vendors in Chennai in 2004 and has staffed an IT center and vendor oversight function there since 2008.
"Creating a stronger base in India helps us strengthen our presence and IT resources to support regional European and Asia-Pacific business initiatives. The geographic dispersal of our staff also allows us to sustain our follow-the-sun workflow model for managing our IT," explains Robert Garrison, DTCC managing director and CIO.
The decision to expand in Chennai also reflects the DTCC's push to sustain its operations and global data management businesses; provide technology research and development resources; ensure 24-hour business continuity and risk-mitigation support for the DTCC and other securities industry infrastructure organizations that contract with the DTCC for business continuity backup; and manage and support a broader range of IT vendors.


IBM: Using Watson for Analysis Is Elementary


IBM is working with investment banks to identify potential uses for its Watson supercomputer, which appeared as a contestant on US game show Jeopardy!, as a data and sentiment analytics engine.
The vendor signed a deal with Citi to explore potential uses for Watson around its retail operations, and is now in discussions with a number of banks around using Watson to support wholesale and investment banking, says Likhit Wagle, global industry leader for banking and financial markets at IBM Global Business Services.
"Banks are facing exponential growth in the volumes of data they need to process and draw information from; internal data that is not necessarily accurate; and a lot of this data is not structured," Wagle says, adding that Watson can address these issues through its ability to process vast volumes of data.
"It's an adaptable system that learns through doing, so the more you give it, the better output you get, and not just for structured data: Watson can also draw insights from unstructured data, such as news items and blogs, to give analysts more views of data and sentiment, to enhance the quality of their recommendations," he adds.
Banks could use Watson to obtain a better view of risk associated with specific clients, or to analyze large volumes of data to identify drivers of systemic risk. Another potential use, especially in emerging markets where data is not readily available, is around identifying suitable clients for firms' wealth management and private banking services.
Implementations of Watson will be on a bespoke basis. "We see Watson working alongside humans to enhance the quality of the advice being provided, and it depends on parameters set by human beings," he says. However, he says firms could seek to use Watson as an additional input to the development and execution of sentiment-based trading strategies, and IBM would look at building a solution that automates some activities, if clients demand it.

ISE, Hanweck Unveil Hosted Tick Database


The International Securities Exchange (ISE) recently released its managed ISE Premium Hosted Database (ISE PhD) of options and equities data and options analytics, developed in partnership with options analytics provider Hanweck Associates, to support traders' back-testing and analysis requirements.
The ISE began piloting the service, which has been in development for almost two years, with hedge fund, market-maker and bank clients late last year, and is now making it publicly available. ISE PhD provides options tick data from the Options Price Reporting Authority, underlying Level-1 US equities data, and tick-by-tick implied volatilities and greeks calculated by Hanweck's Volera hardware-based options analytics engine, all dating back to 2005.
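Volera performs these calculations on specialized hardware at tick-by-tick rates. As a software-only illustration of what one such calculation involves, the sketch below inverts the standard Black-Scholes call price for implied volatility by bisection; the inputs are invented and this is not Volera's method.

```python
import math

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def implied_vol(price, S, K, T, r, lo=1e-4, hi=5.0, tol=1e-8):
    """Invert Black-Scholes for sigma; price is increasing in sigma,
    so simple bisection converges."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) > price:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

iv = implied_vol(price=5.0, S=100, K=100, T=0.5, r=0.02)
print(round(iv, 4))   # roughly 0.16 for these inputs
```

Doing this for every quote across every strike and underlier is what pushes such workloads onto dedicated hardware.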
The service includes ISE's proprietary open/close prices, which are already available as a separate historical product, and are primarily used by quantitative trading firms and proprietary trading groups to create analytical models and test trading strategies. "PhD is certainly a quant offering, so we included that dataset because many quants who already use that data now will also want to use PhD," says Jeff Soule, head of market data at ISE.
In addition, ISE is providing a database of corporate actions as part of PhD, which users can apply to a query, depending on their needs: for example, to see when a Bear Stearns option became a JPMorgan option, to determine the pricing and implied volatility for the option at that specific point in time.
The exchange is also talking to other, unnamed data providers and exchanges about including their content in the database, such as futures data, which Soule says he expects to add to PhD, with other content to be driven by customer demand.
The database supports back-testing, as
well as pre- and post-trade analysis and
transaction-cost analysis. ISE is also seeing
interest from software vendors looking to
incorporate the historical data to enhance
their existing desktop applications, to
support capabilities such as charting, trade
idea generation and requests for time and
sales data, Soule adds.
At launch, the database will be updated daily after market close, though ISE plans to add real-time data integration by year-end. "Once we add the real-time data, that will expand the prospect base significantly. For example, there are customers that will want to query the database intra-day for trading ideas," Soule says.
Client systems can access PhD over the internet or by cross-connecting to ISE's servers within the exchange's primary datacenter at Equinix's NY4 facility in Secaucus, while traders can use pre-defined queries in PhD's web interface to quickly access the data they need (for example, by simply entering the date range and symbols they want for back-testing) or can use APIs to write their own queries for retrieving data.
Soule says the managed database eliminates a key challenge for firms that may have considered building an infrastructure to store and manage this data themselves: keeping up with the storage capacity and performance requirements of a growing dataset. "We're talking about 200 terabytes of data, and there's a big cost factor for somebody to build this infrastructure out, not just an upfront cost but significant ongoing cost," he says.
ISE is offering a flexible pricing model for the database, to accommodate what it hopes will be a broad range of users. Clients can sign up for annual subscriptions to one-year chunks of content, allowing them to query or download data for the past 12, 24 or 36 months, dating back to 2005. Alternatively, users can pay for one-off queries: for example, to run analysis on six months' worth of data on a specific option that they are thinking of trading.

Sponsor's Statement

Unlocking the
Value in Big Data
With growing volumes, velocity and variety of data, it is no longer enough for financial services firms to limit their analysis to traditional market data. To unlock the real benefits of Big Data, one needs to analyze broader sources, such as unstructured data, and combine that information with existing signals to differentiate and enhance trading, investment, and risk models. By Richard Brown


Big Data has been a big IT story for many years now, but it is only recently that the concept has caught the attention of the financial services industry. While market data volumes have skyrocketed in recent years, some might say the data the financial services industry currently looks at is just the tip of the iceberg. The more complex and interesting aspect of Big Data in financial services lies in its variety, however.

While some businesses deal with more isolated data types that do not necessarily span multiple disciplines, there is a significant range in the variety of data that can have an impact on a firm's risk measurements as well as its trading and investment performance. Unstructured data that may impact the market includes such types as broker research, industry or economic reports, premium and internet news feeds, blogs, tweets, and audio and video programming.

When analyzing this vast array of content, one needs to do it in a consistent manner and note key aspects, including the source and motivations of the data (who wrote it, who published it, for what audience and for what purpose); what it is about, and to what extent (the people, companies, places, and so forth); the relevance of the data; the tone in which it is being talked about; how unique, repetitive or popular the story is, and any acceleration of trends; the psychological aspects being conveyed; the contextual backdrop; and the potential implications for certain trading/investment decisions, to name just a few. Doing this on hundreds, thousands, or even millions of sources can easily overwhelm most systems and cause analysts to quickly become lost in the tsunami of data.

Thomson Reuters News Analytics enables users to understand these key attributes among a wide variety and massive quantity of this unstructured content. It analyzes the data in a consistent, intelligible way to help users quickly unlock the potential in big unstructured data. Whether it be for systematic investment and trading strategies or to deepen a human's comprehension of data, Thomson Reuters News Analytics transforms this qualitative data into structured, quantitative forms to support a variety of analytic use-cases.

Analyzing the Analysis
One of the main goals of this process is to understand the implications the information has for various business processes. When the data has been transformed into a digestible format, it is ready for a broader, or more common, analytics process. To do that, it is necessary to understand the holistic information value chain. Combining the unstructured data analysis with more traditional sources such as pricing and reference data, parent/subsidiary information, supply-chain dynamics, people/titles/roles and products/brand databases, and doing so with an accurate point-in-time perspective, is not easy, but it is required in order to support the downstream uses.

The outcome of the analytics process will likely vary depending on who is ultimately consuming the information, but one of the important things to consider is that, for the most part, the more attributes one has on the data, the more extensible it becomes.

Thomson Reuters can provide the content, technology, and data management capabilities to properly analyze this wealth of unstructured data, enabling financial services firms to focus on the implications for their investment and trading strategies. Together, we can unlock the value of Big Data.

Richard Brown is global head of quant and event-driven trading at Thomson Reuters. Visit www.thomsonreuters.com for more information.


BIG Challenges

Regulatory and competitive pressures, liquidity fragmentation, and increasingly sophisticated trading strategies have led to ballooning data volumes that traditional technologies are no longer equipped to handle. Known as Big Data, these massive data sets must be mined and analyzed if capital markets firms are to stay abreast of their competitors. Other industries have tackled Big Data, but financial services firms have been relatively late to the game, and are looking at new technologies to address the challenges.

How do you define Big Data? Is this a new phenomenon, or simply the next phase of enterprise data
management with a catchy new name?
Louis Lovas, director of solutions, OneMarketData: Big Data can be defined by two salient points. First, there is supporting hardware. Bigger, faster, parallel hardware architectures have not only enabled greater compute power but also massive growth in storage capacities. This classic Moore's Law model has created maximal efficiencies in storage per dollar. Yet hardware has long been subject to commoditization. Practically speaking, it is a necessity, but such entropy creates a trajectory that makes hardware's relevance in the Big Data equation equal to that of electricity.
The advancements in this foundational compute power have paved the way for the true advantage: deriving business benefit through focused Big Data solutions. The ability to tell a story with the data is what elevates a Big Data solution over the underlying commodity hardware and storage architectures. The story is germane to an industry such as finance, and creates relevance and monetizes the data for a business.


Peter Chirlian
Chief Executive Officer
Armanta, Inc.
Tel: +1 973 326 9600
Web: www.armanta.com
Email: bigdata@armanta.com

Peter Chirlian, CEO, Armanta Inc: With competition, new regulations and shortened product lifecycles, managers are forced to run a data-driven business instead of simply relying on instinct. Big Data represents the convergence of trends in software and hardware, along with billions in venture capital, which has led to the emergence of new platforms for data management and analysis. It's given businesses the ability to deploy many platforms, each suitable for a class of business questions. Big Data offers the promise of finally enabling a truly data- and analytics-driven enterprise. In such an enterprise, analytics isn't just a point solution. It is an end-to-end process involving everything from data gathering and cleansing to operationalizing business processes, across the entire spectrum of new Big Data tools and existing data infrastructure.
Dennis Smith, managing director, BNY Mellon: Big Data is
data that has any of the following characteristics: extreme volume,
wide variety of data formats, high velocity of record creation, along
with variable latencies and the complexity of individual data types
within formats. Note that it is about more than just volume. There
is a bit of an evolution. Existing technologies have allowed us to
perform analysis of historical data. Big Data has the potential to not
only provide us better insight into the current situation, but also
positions us to be more predictive.
Andrew Poulter, head of risk analytics and methodology
technology, RBS: I think there is certainly a cultural shift in
terms of how people think about data, the importance of data
retention, and how this can be fed back to improve business
processes and ultimately margins. Technically, I see Big Data as an
evolution as opposed to a new phenomenon or revolution.
Marcus Kwan, vice president of product strategy and design, CQG: Big Data is the issue surrounding the massive increases in the number of data sources, volume of the data, and the speed and granularity of the data, compounded over history. It has become more relevant in the past few years because of the number of exchanges going electronic, data collection methods, and the rate of technological advances. Big Data has become an issue for financial services due to pressures, both regulatory and competitive, and the need to identify opportunities for profit. Traders used to make trading decisions plotting charts with pen and paper. The technology and complexity are now light years from that time.

Ilya Gorelik, founder and CEO, Deltix: In the world of quantitative research and trading, we define Big Data by size (in terabytes), irregularity, and rate of new data arrival. It is one thing to deal with vast quantities of data; it is quite another to deal with data arriving at rates measured in millions of messages per second, especially when the data is distributed irregularly over time. Market data volumes have massively increased since the fragmentation of trading venues post-Regulation NMS and the Markets in Financial Instruments Directive (Mifid), and the increasing adoption of technology, allowing trading firms to increase the number of orders being sent to trading venues, so we regard 2007 as the start of Big Data.
Rich Brown, global head, quantitative and event-driven solutions, Thomson Reuters: The volume, velocity and variety of data that characterize Big Data are unprecedented, and while the popularity of Big Data as the industry's latest catchphrase continues to reach new heights each day, its implications cannot be ignored. Traditional enterprise data management challenges are dwarfed by the scale and scope of problems, particularly surrounding the variety of data. Single asset-class pricing data and cross-asset depth-of-book are nothing compared to the challenges in analyzing unstructured data such as news, social media, audio and video.


What are the specific business applications for Big Data across the buy side and sell side? Which business processes are most affected by the continued growth of data volumes, in addition to its complexity and variety of sources?
Kwan: We look at the market data realm of Big Data in the framework of two pillars: collection and distribution, and analysis and execution. The business first has to be clear on what its strategy is, and then choose solutions for these two pillars that fit. If you choose collection and distribution systems before, or misaligned to, strategy, then it's simply an expensive science experiment. For example, within collection and distribution, firms must decide whether to go for direct connections to exchanges or source data from an aggregator. The deciding issues are around how fast you need the data versus the cost of maintaining a direct connection, infrastructure to collect and house the data, and so forth.


Marcus Kwan
Vice President, Product Strategy and Design
CQG
Tel: +1 720 904 2933
Email: marcus@cqg.com
Web: www.cqg.com, news.cqg.com

In the pillar of analytics and execution, we see a more important shift. Firms need teams who not only can understand the nuances of the data, but can formulate the right big-picture questions. Though these people may be rooted in mathematics and quantitative analysis, the output needs to be a system that provides decision makers, who may not be as technically versed, the ability to participate effectively. Advanced visualization tools need to be able to mash up the multiple sources and the complex analysis, and sum it up in such a way that a business person can grasp it and make an intelligent, well-informed decision.

Poulter: The specific business application for which RBS is using Big Data is to support internal model method (IMM) default risk capital calculations. The calculations require thousands of Monte Carlo paths of market data, representing the future evolution of market data. Apache Hadoop is used to hold the evolved market data and low-level results of interest to the business.

Smith: Three come to mind. The first two are pretty common, while the third will probably become more common: performing batch operations on a massive amount of data, often as a front-end to existing tools, such as data warehouse appliances; analyzing large amounts of varied data to predict tendencies or future outcomes; and processing rapidly changing data such as that now associated with complex event processing (CEP) systems.

Brown: Firms are challenged from the front office through the back office and IT departments, with issues ranging from database management, hardware and software upgrades, and network management to data sourcing, permissioning and reporting requirements. Firms are turning to Thomson Reuters for our managed services offerings, looking to offload basic non-proprietary functions so they can focus on the higher value-added activities like better managing risk or finding alpha in this vast array of Big Data.

Gorelik: We see three main applications. On the buy side, research into alpha generation is key. This involves access to granular (Level-1 or market-depth) data, and the means to do quantitative research on this data. The second application is in modeling execution quality. We are often asked why an alpha model with an apparently high Sharpe ratio in back-testing does not perform well in live trading. There are, of course, several reasons why this might be the case. One is order execution: the smaller the profit potential in each trade, the more susceptible the model is to execution costs, especially slippage. The effective modeling of, and subsequent improvement in, execution costs is achieved by simulation using market-depth data. Thirdly, there are some firms that are using Big Data sets for doing original alpha-generation research. Twitter inevitably appears in such discussions but, less prosaically, quantitative researchers are doing serious research combining market data with machine-readable news, stock-loan and broader economic data.
stock-loan and broader economic data.
for risk management, price discovery, and fraud management will hit
the mark.
Chirlian: Risk management, a critical Big Data application, has historically been limited by technology and use-cases. Before the financial crisis, static risk reports based on independent silos of data were deemed sufficient. This is now not the case. In the past few years, the volume of data and the complexity of the calculations surrounding risk have grown significantly. Existing systems can no longer provide the needed results, both for regulatory and business management purposes. The demand for dynamic, real-time risk measurement often outpaces existing technology capabilities: tasks like liquidity management require complex analysis across vast numbers of existing systems. There is now a sea change both in the way financial institutions look at risk and the technology platforms available to enable this change.

Why has the financial services industry seen such significant growth in data volumes, and how has this growth impacted firms' ability to efficiently manage large data volumes?
Gorelik: Reg NMS in the US and Mifid in Europe resulted in fragmentation of trading, giving rise to more sources of market data. Cheaper hardware and software platforms have made high-frequency trading normal practice for many trading firms, which increases the volume of market data. There are very few tools commercially available that are able to use these large data sets for meaningful analysis. Some firms have been struggling simply to store data, let alone create value from it through analytical research.



Ilya Gorelik
Founder and CEO
Deltix Inc.
Tel: +1 617 273 2540
Email: sales@deltixlab.com
Web: www.deltixlab.com

Poulter: I think financial services has always had the ability to generate far more data than was possible and realistic to store: for example, price histories, transaction-level risk data, and so on. Big Data has made it possible to store more of what is currently generated, to enable more detailed drill-down and trend analysis over time.

Lovas: Looking at US listed options, the Options Price Reporting Authority's daily peak reached 14 million messages in 2011, an increase of 131 percent over the previous year. This resulted in total message volume growing 78 percent. The scale of the options market is quintessentially Big Data. A number of factors have contributed to this exponential growth. Venues such as the Chicago Board Options Exchange's C2 Options Exchange, and new products including Weekly Options and Volatility Index-based products, have increased trading volumes in strikes and underliers. This proliferation has put options on the forefront as a strategic investment tool. The result has been an explosive growth in message traffic. The information flow is a flood, a tsunami, of market data. On a human scale, you cannot consume or make sense of what's inside that tidal wave without fit-for-purpose Big Data solutions.

Brown: The explosion of market data volumes and venues, the increase in the number and types of traded instruments, and the interconnectedness of global markets are dramatically increasing the complexity and cost of capturing, normalizing, processing, storing and adjusting these vast volumes of disparate data. Legacy systems and networks are no longer adequate. Single databases are not easily able to handle the various types of data, or to scale large enough or fast enough to enable users to react quickly to this information. Financial services firms are struggling to keep up with the changes, especially in this economic environment, where it's no longer easy to just throw money at the problem (buy more hardware, hire more people) in order to make the problem go away.

Chirlian: The financial services industry has always been an information business. So it makes sense that firms with the most information and the best and fastest analytics are at a significant advantage. This competitive factor has consistently driven financial services firms to gather as much data as they can access. But it has also strained even the largest datacenters. Firms continue to look for better and more cost-effective solutions for dealing with growth. A data solution alone, however, is not enough. They also must apply sophisticated analytic capabilities across this vastly expanded datascape, which has put additional stress on their infrastructure.

Smith: In many ways, the data has always been there, but we could not cost-effectively do anything with it. Additionally, with the need to become more competitive, organizations realize that there could be benefits to including more and different data types into the mix.

Kwan: The growth of the data has been exponential. Firms used to trade across a small, finite set of instruments; even with the most robust set of analytics applied against them, no problem. There has been rapid expansion of the electronic markets, multiplied by the speed and granularity of the data per instrument, now in microsecond ticks. Factor that with the wealth of internal performance/risk metrics that firms are collecting, and then with advanced analytics across all of that data. Though computing speeds continue to increase, this complexity can bring any system to its knees. And it comes back to being able to articulate company strategy clearly. Firms can easily get overwhelmed by the tide of Big Data, but keeping the strategy clearly in the forefront will enable firms to effectively wrestle with the challenge.
What are the technology and operational challenges that need to be considered when dealing with Big Data? What technologies are available to firms looking to address this Big Data challenge?
Poulter: Data recovery and regeneration options need to be fully considered, with any business-impacting outage understood. Challenges exist in training staff across the development and support teams, and in ensuring the correct infrastructural support is available. Specialist consultancies are being used for training, support and consultancy around the implementation itself. Due to the technology being relatively nascent, there are few experts across the whole community.


Louis Lovas
Director of Solutions
OneMarketData
Tel: +1 201 710 5977
Email: louis.lovas@onetick.com
Web: www.onetick.com

Kwan: The technology is evolving rapidly. While proprietary formats are giving way to application programming interfaces (APIs) and more flexible formats, many firms aren't prepared to make the switch. Legacy systems are so entrenched in processes that even the thought of replacement is too painful. The strategy, benefits and return on investment have to be clear before commitment.

Chirlian: Big Data allows enterprises to deploy a variety of platforms, each targeting a certain analytic challenge. Businesses therefore must think through the kinds of analytic applications they want to build and then tailor their technology. Typically, Big Data platforms are incremental to existing data silos in the form of relational databases and file systems already on the enterprise. A critical goal is to provide analysts and business users with access to all of this data, across silos, for on-demand analytics. From an IT perspective, the question is how to build an integrated architecture that allows the business to view all the data in the enterprise and beyond, retrieve the data as needed, and analyze it at high performance and across any scale. This may be achieved by bringing together multiple independent solutions for each layer of the architecture and integrating them. Alternatively, financial services institutions could use an integrated platform such as Armanta, which is purpose-built to enable this end-to-end analytic process for business applications.

Lovas: Big Data is messy. Market data comes in many shapes, sizes and encodings. It continually changes and requires corrections and an occasional tweak. Discovering new alpha and optimizing existing strategies demands confidence in the resulting derived analytics. Big Data solutions must manage the vagaries of data sources and complex order-book structures, map ticker symbols across a global universe of exchanges and geographies, and accurately reflect pricing through cancelations, corrections, corporate actions and symbol changes. These are challenging financial-data management obstacles, beyond the scope of ordinary storage architectures or file systems. Content-aware solutions leveraging the best of high-performance, scalable compute power are uniquely tuned to fulfill the demanding needs of quantitative analysts and algo traders.

Gorelik: Firstly, recording market data is akin to drinking from a firehose, so this is the first challenge. There are only a handful of vendor products on the market that can do this. Secondly, once you store this data, you do not want to be physically moving it far because of the sheer size. Thus, you need to be able to use it in situ. Today, that typically means leaving it in or near the datacenter where it was first collected or where there are ticker plants located. Thirdly, in terms of processing, a metric more relevant than size is the number of data points, or messages. Because market data is measured in hundreds of thousands or millions of messages per second, any processing needs to be able to run at a similar rate. Finally, there are challenges related to the normalization, cleansing and filtering of data, which often require multiple sets of complex analytical transformations. All these challenges dictate solutions involving a built-for-purpose time-series data warehouse, event-processing and mathematical libraries, all capable of processing data at hundreds of thousands of messages per second.

Brown: In financial markets today, Big Data offers many types of content, both structured and unstructured, that need to be collected, analyzed and stored. Management of these types of data has been a challenge for capital market firms in general, and includes issues like tighter budgets; a skills shortage, both with new technologies as well as new data analysis techniques; legacy systems' inability to scale; an increased number of competitors who may be more nimble; and the need to keep up with regulatory requirements. The solutions to some of these problems have been known for a while and can be summed up as follows:
• Shared-nothing, highly distributed database architectures.
• Consistency is very hard to fulfill in large datasets. Relax: most problems can be solved with eventual consistency.
• Don't insist on normalization; hierarchical data sets, for example, don't normalize well.
• Functional programming frameworks are better at solving most parallel distributed problems.
• Data outages can be handled by maintaining enough replicas.
Technologies such as Cassandra, Hadoop, and MapReduce give firms the ability to massively parallel-process data using functional programming constructs, to store huge datasets in both distributed memory and direct-attached storage, and to query them through a declarative interface that is not limited by SQL's reliance on relational algebra.
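As a minimal, self-contained illustration of the functional map/reduce pattern Brown describes, the sketch below shards invented tick data across local worker processes, maps each shard to partial per-symbol counts, and reduces the partials with an associative merge. It stands in for a real Hadoop/MapReduce deployment; the data and four-way sharding are arbitrary.

```python
from functools import reduce
from multiprocessing import Pool
from collections import Counter

# Invented tick stream: (symbol, size) pairs.
TICKS = [("IBM", 100), ("MSFT", 200), ("IBM", 50), ("MSFT", 25)] * 1000

def mapper(chunk):
    """Map: emit per-symbol message counts for one partition."""
    return Counter(symbol for symbol, _size in chunk)

def reducer(a, b):
    """Reduce: merge partial counts (associative and commutative)."""
    return a + b

if __name__ == "__main__":
    chunks = [TICKS[i::4] for i in range(4)]    # shard the data four ways
    with Pool(4) as pool:
        partials = pool.map(mapper, chunks)     # map phase, in parallel
    totals = reduce(reducer, partials, Counter())
    print(totals)   # Counter({'IBM': 2000, 'MSFT': 2000})
```

Because the reduce step is associative, the same program scales from four local processes to thousands of cluster nodes without changing the logic, which is the essence of the shared-nothing designs Brown lists.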
Smith: There are many challenges; one is that these technologies are not out of the box, and technical skills in these areas are not plentiful. It also changes some of our current thinking in data management, security, and compliance. There are numerous associated technologies. I recently spoke to a group about the various Hadoop projects and sub-projects. I identified at least 20. In just the areas of modeling/development and storage/data management, there are three technologies associated with each: MapReduce, Pig, and Mahout with modeling/development; and Hadoop Distributed File System, HBase, and Cassandra with storage/data management.

Richard Brown
Global Head, Quantitative and Event-Driven Solutions
Thomson Reuters
Phone: +1 646 223 7796
Email: QED@thomsonreuters.com
Web: www.thomsonreuters.com

Are most firms approaching Big Data management through a rip-and-replace strategy, or are they layering it on top of their existing infrastructure?
Chirlian: An interesting thing about Big Data technology is that it is extending enterprise infrastructure rather than replacing it, by adding new fit-for-purpose data management and processing tools. The challenge today is that infrastructure management has become increasingly complicated, both from an IT point of view as well as for business users who must learn new tools. Enterprises must develop a way to package this collection of tools and deliver the benefit of the new technologies, enabling users to perform integrated, end-to-end analytics.
Smith: Big Data technologies are complementary to our legacy
products. Most firms are vetting use-cases and incorporating this
key tool set into the overall solution set.


Brown: In many cases, the interdependencies of various systems would make it impractical to rip and replace it all at once. We see most new initiatives being brought up in isolated environments, with legacy systems or data moving to those new technologies after the new systems have gone through the typical teething pains. Once development and support staff are comfortable with the solutions, the pace at which the older systems can be retired dramatically increases, and firms are able to reap some of the promised rewards of the project.

Gorelik: The underlying ability to process vast quantities of market data is achievable only through a few products designed for purpose. As such, we see mostly replacement strategies.

Poulter: For the current implementation, Hadoop is being introduced alongside traditional relational database data stores. Reporting is done across both data stores; summary results are stored in the database, with detailed drill-down functionality provided using Hadoop. Summary results are held in this way to mitigate any risks with data retention for the newer technology.

Lovas: Big Data is a big deal to customers, so they're not making infrastructure decisions casually. In the end, we'll see different firms employ different models: rip-and-replace and layering. The strategy will weigh numerous factors, with cost being an important aspect. Firms have to analyze the hardware cost and maintenance of insourcing versus outsourcing, and whether it's clustered storage or centralized storage, then compare it against existing architectures, factoring in possible salvage. There never is an easy answer.

Kwan: This is the perpetual enterprise question, and it depends on which part of the Big Data you're talking about. We've seen a new generation of tools that do a better job in both realms. With the sophistication of APIs and aggregation tools, we see that the layering strategy can work for many situations. Rip-and-replace has its place when maintenance costs can be saved.

Are there existing technologies (cloud computing, for example) that can be deployed in a complementary fashion alongside specific Big Data technologies to help alleviate the Big Data burden?
Brown: Cloud computing offers great promise for firms needing to dynamically flex their processing needs, especially at peak times such as market open and close, without having to pay up for idle system time. That capacity can be balanced against other users' needs, particularly in more public clouds, but financial firms are still reluctant to place proprietary data or processes in the public cloud. Instead, they are increasingly building private clouds behind their firewalls to exploit the computational advantages, rescheduling batch jobs when possible to balance workloads, and reducing the overall system footprint. This flexible computing environment also enables firms to deal with sudden data bursts, like the Flash Crash, which require very rapid and extensive analysis so that a firm can adjust its models to respond appropriately the next time it sees such an event, or even at the next market open.

"We see a number of areas that are not fully appreciated when embarking on Big Data projects, particularly in the analysis of unstructured data. Oftentimes, we see clients attempt to analyze text believing they can have control over the secret sauce. While the motivations are understandable, it is a very difficult proposition on which to successfully execute, and one can conjure up the phrase, 'kids, don't try this at home.'" Richard Brown, Thomson Reuters
Smith: With cloud computing, absolutely. The flexible and scalable characteristics of cloud computing make it the ideal underlying infrastructure layer on which the Big Data storage/data management and modeling/development layers lie.

Gorelik: Cloud computing is not only a natural companion to Big Data, but in the case of serious quantitative research, it is an enabler, and in some cases, essential. The ability to distribute complex calculations across multiple nodes in the cloud at an

waterstechnology.com June 2012

11

Special Report Big Data

affordable and variable price is a major


differentiator of cloud computing
architecture. In addition, in many
cases, it is simply impractical to transfer
significant volumes of historical market
data electronically. Rather than physically ship hard drives, cloud computing
services deployed in the datacenters
where data is available, allows analysis
to be done in situ.
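
To make the distribution point concrete, here is a minimal sketch of fanning a quant calculation out across workers, with Python's multiprocessing pool standing in for cloud nodes. The toy Monte Carlo pricer and all of its parameters are invented for illustration, not drawn from any participant's system.

import math
import random
from multiprocessing import Pool

def price_chunk(args):
    # Monte Carlo-price a European call on one worker (toy GBM model).
    seed, n_paths, s0, k, r, sigma, t = args
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        s_t = s0 * math.exp((r - 0.5 * sigma ** 2) * t + sigma * math.sqrt(t) * z)
        payoff_sum += max(s_t - k, 0.0)
    return payoff_sum

if __name__ == "__main__":
    n_workers, paths_each = 4, 200_000
    jobs = [(seed, paths_each, 100.0, 105.0, 0.02, 0.25, 1.0)
            for seed in range(n_workers)]
    with Pool(n_workers) as pool:          # pool workers stand in for cloud nodes
        total = sum(pool.map(price_chunk, jobs))
    price = math.exp(-0.02 * 1.0) * total / (n_workers * paths_each)
    print(f"Estimated option value: {price:.4f}")

The same fan-out/aggregate shape is what cloud deployments scale up: each node prices an independent slice of paths, and only the small partial sums travel back over the network.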
Poulter: Yes, we are using Hadoop and DataSynapse in tandem, with the data being held physically on the same machines as the grid engines. Hadoop is, in effect, replacing the role for which Coherence has traditionally been used in enterprise environments: providing access to cached data.
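
A minimal sketch of the hybrid pattern Poulter describes, with SQLite and local JSON-lines files standing in for the relational store and the Hadoop detail store. The schema, symbols and file layout are hypothetical.

import json
import sqlite3
from pathlib import Path

detail_dir = Path("detail_store")          # stand-in for the Hadoop/HDFS side
detail_dir.mkdir(exist_ok=True)

db = sqlite3.connect(":memory:")           # stand-in for the relational store
db.execute("CREATE TABLE daily_summary (symbol TEXT, day TEXT, volume INTEGER)")

trades = [
    {"symbol": "ABC", "day": "2012-06-01", "qty": 100, "px": 10.1},
    {"symbol": "ABC", "day": "2012-06-01", "qty": 300, "px": 10.2},
    {"symbol": "XYZ", "day": "2012-06-01", "qty": 50,  "px": 55.0},
]

# Full detail goes to the bulk store; only aggregates go to the database.
for t in trades:
    path = detail_dir / f"{t['symbol']}_{t['day']}.jsonl"
    with path.open("a") as f:
        f.write(json.dumps(t) + "\n")

for sym in {"ABC", "XYZ"}:
    vol = sum(t["qty"] for t in trades if t["symbol"] == sym)
    db.execute("INSERT INTO daily_summary VALUES (?, ?, ?)", (sym, "2012-06-01", vol))

# Reporting hits the compact summary table; drill-down reads the bulk store.
print(db.execute("SELECT * FROM daily_summary").fetchall())
print((detail_dir / "ABC_2012-06-01.jsonl").read_text())

Reporting queries stay on the small summary table, while drill-down requests go to the bulk store, mirroring the division of labor described above.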
Lovas: Big Data's complementary technology is real-time analysis through the use of complex event processing (CEP). These two technologies define a solution paradigm for quantitative market analysis covering quantitative trading, research and transaction-cost analysis (TCA). The ideal case is to view historical and real-time activity as a single time continuum. What happened yesterday, last week or 10 years ago is simply an extension of what is occurring today and what may occur in the future. Quants look to compare current market volumes to historic volumes, prices and volatility in the hunt for alpha and to control trade costs.
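
As a toy illustration of that continuum, the sketch below feeds a simulated live stream into a rule that compares intraday volume against a historical average, in the spirit of a CEP alert. The volume figures and the two-times threshold are invented for the example.

hist_daily_volume = [1.1e6, 0.9e6, 1.3e6, 1.0e6, 1.2e6]   # hypothetical history
hist_avg = sum(hist_daily_volume) / len(hist_daily_volume)

cumulative = 0.0

def on_trade(qty):
    # Toy CEP-style handler: history and the live feed meet in one rule.
    global cumulative
    cumulative += qty
    if cumulative > 2 * hist_avg:
        print(f"ALERT: intraday volume {cumulative:,.0f} is twice the historical mean")

for qty in [400_000, 700_000, 900_000, 300_000]:           # simulated live ticks
    on_trade(qty)

The historical series and the live ticks feed the same comparison, which is the "single time continuum" view in miniature.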
Chirlian: Absolutely. On the deployment side, cloud computing
alleviates some Big Data concerns. Customers may now deploy
Big Data solutions on-premises, on physical or virtual resources,
or in the cloud. There are also innovations on the infrastructure
side that allow customers to tackle specific business problems.
However, these advances also push complexity to the user and
require, as we said earlier, a packaging of the technology so
users can realize the true value of Big Data.
Kwan: These existing technologies have to mesh with the strategy and with the appropriate timeliness of the data. The cloud provides greater access, transparency and, ultimately, speed of access to certain types of data. But market data for decision-making on a desktop or in an algo system often has to come over a direct connection.

What do most firms tend to overlook when embarking on Big Data projects?
Gorelik: We find that the focus on storing data sometimes
results in insufficient emphasis being placed on the use of data.
Regulatory requirements aside, storing data is only useful if subsequent analysis yields information that is valuable. Such analysis
is computationally and mathematically complex and demanding.
Having tools to define and test trading ideas quickly, and then
refine ideas, is a major competitive advantage in a trading firm.
By focusing on the logistical headaches of collecting data, firms often overlook the analytical tools themselves.
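
The kind of quick idea-testing Gorelik describes can be as simple as a few lines of script. The following toy backtest of a moving-average crossover on synthetic prices is purely illustrative; the random walk and the 5/20 windows are invented, and no real data or production tooling is implied.

import random

random.seed(42)
prices = [100.0]
for _ in range(499):
    prices.append(prices[-1] * (1 + random.gauss(0.0002, 0.01)))  # synthetic walk

def sma(series, n, i):
    # Simple moving average of the n points ending at index i.
    return sum(series[i - n + 1 : i + 1]) / n

pnl, position = 0.0, 0
for i in range(20, len(prices) - 1):
    position = 1 if sma(prices, 5, i) > sma(prices, 20, i) else -1  # long/short
    pnl += position * (prices[i + 1] - prices[i])

print(f"Toy crossover P&L over {len(prices)} synthetic prices: {pnl:+.2f}")

Being able to state, run and refine an idea this quickly, then scale the same test across a full tick history, is the competitive advantage the answer points to.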


Lovas: Big Data ultimately defines the end game, that Holy Grail for
profitability. Firms should not lose sight of that. The Big Data dump
and the solutions to manage and analyze it are the fuel that drives the
engine of the trade life cycle. That includes the profitability profile of
new models, optimizing existing models, re-balancing portfolios and
managing the fluid nature of transaction costs. They all depend on Big
Data solutions to provide accurate, clean data across a firm's tradable
markets. Firms need a clear understanding that Big Data is pervasive
across the engine of the trading business to ensure success.
Kwan: Strategy, Strategy, Strategy. Have you really figured out
how to make trading decisions off social media and tweets? Maybe.
Historically, successful firms have been able to find patterns in the
market. I believe the game is the same, but the data set is much more
complex. Firms have to be able to adapt to new methods of pattern
finding. A big part of that is infrastructure, of which a very important
piece is a new class of advanced visualization tools.
Brown: We see a number of areas that are not fully appreciated
when embarking on Big Data projects, particularly in the analysis
of unstructured data. Oftentimes, we see clients attempt to analyze text themselves, believing they can keep control over the secret sauce. While the motivations are understandable, it is a very difficult proposition on which to execute successfully, and one can conjure up the phrase, "kids, don't try this at home." Challenges range from the difficulty a portfolio manager faces in vetting a qualified linguistic analyst team, to not knowing what you have or don't have until the project has failed or finished.
We see a lot of false starts and abandoned projects in this space. In some cases that is due to unsuccessful language processing; in others, firms have trouble getting those techniques into production with the necessary fault-tolerant, fully resilient infrastructure to handle such information at the speed needed for financial markets.
We believe the right system should be flexible enough to enable users
to do what they want, but robust enough for them to simply want
to focus on the higher value-added activities like interpreting the
analytics for their investment or trading strategies. When it comes to
unstructured/text analysis, Thomson Reuters News Analytics offers a
great mix at both ends of the spectrum and everywhere in between.
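
A small example shows why do-it-yourself text analysis goes wrong so easily: a naive keyword scorer, of the kind a first in-house attempt often produces, has no sense of negation or context. The word lists and headlines here are invented for illustration.

# Bag-of-keywords sentiment: the approach looks plausible and fails quietly.
POSITIVE = {"beat", "growth", "upgrade", "profit"}
NEGATIVE = {"miss", "loss", "downgrade", "lawsuit"}

def naive_sentiment(headline):
    words = headline.lower().replace(",", "").split()
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

for h in ["Acme posts surprise profit, analysts upgrade",
          "Acme fails to deliver profit growth"]:
    print(f"{naive_sentiment(h):+d}  {h}")

# The second headline is bad news but scores +2: "profit" and "growth"
# hit the keyword list while "fails" matches nothing.

Handling negation, sarcasm, entity disambiguation and timeliness at market speed is where such home-grown efforts typically stall.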
Smith: I mentioned a few of these before: skills, data management, and so forth. Looking at the data management layer, Big Data might shift thinking from a physical orientation to a logical orientation, where things are relevant for just a period of time. Additionally, it might change thinking in the data quality area, from things needing to be 100 percent accurate to being directionally correct. This also highlights the complementary nature of the technology, where it could front-end traditional tools.
Chirlian: Big Data is revolutionizing the way business is conducted. It isn't enough that enterprises invest in new technologies for managing
and analyzing data. They must now be able to arm their business users
with easy, anytime access to the data they want and enable them to
analyze this data interactively. The analytics-driven businesses of the
future are those that understand this end-to-end analytic process and
put in place a well-integrated technology solution empowering the
businesses to be confident in their decisions.
