
A-Team Group

June 2009 Issue 07

Reference Data Review presents

Impact of Derivatives on Reference Data Management


They may be complex and burdened with a bad reputation at the moment, but derivatives are here to stay. Although Bank for International Settlements figures indicate that derivatives trading is down for the first time in 10 years, the asset class has been strongly defended by the banking and brokerage community over the last few months. The industry is, however, on course for a significant overhaul of the regulatory regime governing the OTC derivatives market, both in Europe and the US. This, of course, means that the post-trade processing of these instruments is set for big changes. Credit default swaps (CDSs) are the first of the credit derivatives to be ushered onto central clearing counterparties in a bid to reduce counterparty risk, but they will likely not be the last. Moreover, the market is also awaiting the introduction of an alternative standard to the current five-character Options Price Reporting Authority (Opra) codes next year. Earlier this year, the Options Clearing Corporation (OCC) was named as the operator of the new options symbology system, which has been estimated to cost the industry around US$250 million to introduce. All of these changes are likely to have a significant impact on the data management systems for these complex instruments, requiring the introduction of new processes and procedures. A challenge indeed for the vendor community.

Sponsored By:

An A-TEAMGROUP Publication

www.a-teamgroup.com

Derivatives Data Management - A-Team Group

Angels or Demons?
By Virginie O'Shea, Editor, A-Team Group

The last two years have been a rollercoaster ride for the derivatives markets. The fallout from the financial crisis resulted in the demonisation of the asset class, which was cited by many as one of the catalysts for the market troubles and the fall of financial institutions such as Lehman Brothers. What once were celebrated as innovative new financial products have now been branded as hazardous to a financial institution's health. The sector is also facing a barrage of regulation and intense scrutiny from the market.

According to the Bank for International Settlements (BIS), there has been a significant move away from the derivatives markets in terms of trading. Figures for the second half of 2008 indicate that the derivatives markets shrank for the first time in 10 years, as investors moved away from trading assets that they considered to be too risky for their balance sheets. The amount of outstanding derivatives contracts linked to bonds, currencies, commodities, stocks and interest rates fell by 13.4% to US$592 trillion in the second half of last year, according to the BIS figures.

However, despite the dip in trading, some corners of the industry have staunchly defended the derivatives markets. The Wholesale Market Brokers Association (WMBA), for one, is critical of the negative perception that OTC derivatives markets have garnered as a result of the crisis. It has warned regulators that overly restrictive regulation of the sector could be harmful to the financial markets at large. "Whilst the objective of making markets more secure is supported by all market participants, and certainly WMBA members, the unintended consequences of poorly thought through policy decisions would have a serious impact on the real economy," says David Clark, chairman of the association.

The WMBA has displayed particular concern about regulators' attempts to move all OTC business onto exchanges, although it is willing to concede that the introduction of central counterparties (CCPs) for derivatives clearing will reduce counterparty risk in the market. "The WMBA wishes to warn again that forcing OTC products onto exchanges would significantly reduce liquidity in financial markets, resulting in increased risks and costs for end users as their ability to hedge their exposures would be handicapped," explains Clark. The association welcomes CCPs for credit default swaps (CDSs) and other financial products that are suitable or relevant for clearing, but feels the US government's approach to the market is unduly harsh. The association points to the fact that most of the severe losses suffered by banks during the crisis occurred in the structured credit markets and not in the OTC CDS market. The OTC world, in the WMBA's view, should not be driven out of business. The association has also expressed concern that US policymakers have not acknowledged that making markets more secure can be achieved through the clearing of products, both OTC and exchange traded, through recognised CCPs. "The implication being drawn by some market participants and commentators is that the only way of achieving regulators' ambitions is to coerce OTC products onto exchanges," the association said in a statement earlier in 2009.

The US government is considering taking a harder line because of the spectre of counterparty risk, which the fall of Lehman Brothers last year threw into the spotlight. Markets froze and liquidity dried up because investors were concerned about whether their deals would be completed, and derivatives are at the heart of this confusion. Risk exposure and counterparty data tracking is especially difficult when dealing with derivatives that are structured in a highly complex manner. The Obama administration and Treasury secretary Timothy Geithner have therefore proposed that firms should centrally clear derivatives trades and that these trades should also be recorded to enable supervisory authorities to prevent market abuse. These firms will also be encouraged to move these trades onto exchanges where possible via the imposition of higher charges for OTC trades. The rules on capital adequacy for these instruments will also be tightened and higher reserves will be required, if the proposals are passed.

Securities and Exchange Commission chairman Mary Schapiro has also suggested that regulators should use the Financial Industry Regulatory Authority's (Finra) system for bond price reporting, Trace, as a model for transparency and reporting requirements for OTC derivatives. The Trace system, which has been in operation since 2005, provides access to trading information on corporate bonds to anyone with internet access. "I think it's something we'll look at very closely as a potential model," said Schapiro. Regardless of Trace's future role, however, it seems that OTC derivatives regulation is on the cards in a big way. "We need better broader authority, better information, and we need a better commitment of supervisory authorities to enforce those laws," said Geithner during the release of the proposals.

Given the increased costs and potential limitations on product innovation that such requirements would involve, it is not surprising that the industry is fighting back. The WMBA is, however, less critical of the approach being adopted in Europe. "The WMBA believes that European initiatives indicate a firmer grasp of the essential role of the clearing house, and understanding of the transparency and post trade security inherent in the activities of banks, and WMBA members that use platforms that are MiFID compliant in an already regulated environment," explains Clark. The European approach, thus far, has been to encourage the use of CCPs and to initiate a debate about the potential provision of more data to the regulatory community about derivatives products. Committee of European Securities Regulators (CESR) chairman Eddy Wymeersch has indicated that the regulator is looking at these information requirements at the moment. He has stated that these instruments can remain off regulated markets but must use central counterparties (CCPs) to reduce risk. "The issue of CCPs is definitely on the map in the near future for CESR," he said.

It is not just the regulators that are effecting a revamp of industry practices, however. The International Swaps and Derivatives Association (ISDA) has also developed a new standard protocol for dealing with CDSs, which was introduced into the market on 8 April. The ISDA "big bang", as it was dubbed, changed the pricing practices for single name CDS contracts. The changes were aimed at reducing systemic risk by introducing a standardised pricing system for CDSs and making clear what procedures must be adopted should a default occur for a CDS contract. The new convention means that investment grade names trade with a fixed coupon of 100bps and high yield names trade with a fixed coupon of 500bps. According to Robert Pickel, head of ISDA, the big bang protocol provides a framework for the industry by which it can standardise the traditionally opaque credit derivatives. It is hoped that the changes will also assist in the move towards central clearing for CDSs with the advent of CCPs such as IntercontinentalExchange's ICE Trust.

Moreover, the industry is also awaiting the introduction of an alternative standard to the Options Price Reporting Authority (Opra) code in 2010. Earlier this year, industry participants finally selected a suitable candidate to operate the new symbology allocation system. There were ongoing discussions throughout 2008 about the lack of agreement on the basics surrounding the introduction of a new code, but it seems that some progress has been made, including the selection of the Options Clearing Corporation (OCC) as the operator of the system.
The OCC was selected by market participants as the most suitable candidate to maintain a centralised database of securities symbols, although other players in the market were considered, including the Depository Trust & Clearing Corporation (DTCC), the Financial Industry Regulatory Authority (Finra), the Issuer Advisory Group (IAG) and SFB Market Systems (SFBMS). The OCC is the clearing house that has led the Options Symbology Initiative (OSI) since 2005 and was likely chosen because of this long association with the market.

Last year, the Financial Information Forum (FIF) highlighted in a report that the cost of the introduction of the new symbology is likely to be high. Mary Lou Von Kaenel, managing director, management consulting at New York-based consultancy Jordan & Jordan, which chairs the FIF group, explained at the release that the industry will spend an estimated US$250 million in preparation for the new options symbology. Respondents to the survey on which the report was based included 46 of the FIF's members, representing 26 broker-dealers, 12 service bureaus, seven market data vendors and one options market. General industry discontent with a lack of standardisation was prevalent in the survey and respondents also indicated they are aware of the high cost of implementation. "The other interesting, although not surprising, finding of the survey is the estimated cost to the industry of US$250 million, which does not include buy side or custodian implementation costs," she added. "In this market environment, with no new revenue that can be attributed to the expense, cost has become a significant challenge for some firms." The most costly aspects of the changes were highlighted by Von Kaenel: "Looking at it from the customer perspective, FIF members focused on customer confirmations and account statements generated by the back office. At the front end, there are more complex elements to consider, such as ease of order entry or avoiding input errors."

Currently, most firms are only able to store nine or 12 digit identification codes and, in order to comply with OSI, they will have to create proprietary identifiers for listed contracts for purposes including client reporting and processing. The FIF survey asked respondents to evaluate the various approaches to creating the nine character dummy codes, including the offerings from Standard & Poor's, Symbol Management Clearing, Interactive Data and an open source code, Mark 1 Algorithm. According to the survey, 39% of respondents would use codes created by the Cusip Service Bureau, 7% would use Interactive Data's solution and 2% would use Symbol Management Clearing. None of the respondents selected Mark 1 Algorithm, but 17% indicated they would use their current internal identifiers, 7% were uncertain and 24% declined to comment. This is indicative of the lack of standardisation across the industry with regards to tackling the OSI, says the FIF.

In order to meet these requirements, Standard & Poor's Cusip Global Services (CGS) business has recently announced a partnership with UK-based futures and options specialist FOW Tradedata to develop a new Cusip identification system for listed equity options in the US. The vendor will launch the new service by the end of June, says Matthew Bastian, director of product development at S&P. "The feedback we have received from the industry is that a common nine to 12 character identifier is a tremendous step forward in standardisation, and is an essential complement to the OCC's planned 21 character OSI code," he explains. The service is therefore aimed at supporting the industry in moving to these standards and will encompass approximately one million option Cusips with accompanying ISINs and related data elements. Market participants who wish to receive the Cusip Options Service will be able to do so directly from CGS or via a properly licensed vendor, says Bastian. FOW Tradedata's Xymbology product, which maps option contracts to market data and proprietary vendor codes, will also contain option Cusips and ISINs. Regardless of which vendor's offering they choose, firms cannot afford to rest on their laurels as the first major milestone of the OSI comes in September 2009, with the commencement of industry testing.
"To be prepared for testing, firms must have completed their internal programming changes to allow for several months of internal testing to ensure there is no unintended impact of changes on adjacent systems," said Von Kaenel. It seems that derivatives data will be providing challenges for some time to come.
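The 21-character OSI identifier was still being finalised when this report went to press; as later standardised, it concatenates a space-padded six-character root symbol, a six-digit yymmdd expiry, a call/put flag and an eight-digit strike (five integer and three decimal digits). A minimal sketch of that layout, assuming the finalised field widths:

```python
from datetime import date

def build_osi(root: str, expiry: date, right: str, strike: float) -> str:
    """Assemble a 21-character OSI option symbol.

    Layout: 6-char root (space-padded) + yymmdd expiry + C/P flag
    + 8-digit strike (5 integer digits, 3 decimal digits, no point).
    """
    if right not in ("C", "P"):
        raise ValueError("right must be 'C' or 'P'")
    strike_mils = round(strike * 1000)   # e.g. 19.50 -> 19500
    sym = f"{root:<6}{expiry:%y%m%d}{right}{strike_mils:08d}"
    assert len(sym) == 21
    return sym

# A $19.50 put on a hypothetical root "SPX", expiring 22 November 2014:
print(build_osi("SPX", date(2014, 11, 22), "P", 19.50))
# -> "SPX   141122P00019500"
```

The fixed-width layout is what lets every firm in the chain parse the same identifier without the proprietary mapping tables the FIF survey describes.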

Derivatives Data Management - SIX Telekurs

Impact of Derivatives on Reference Data Management


By Richard Newbury, market development manager at SIX Telekurs

MiFID reporting requirements brought the standard of reference data on derivatives to the front of the back office's mind in late 2007. For post-trade reporting purposes, the Committee of European Securities Regulators (CESR) proposed asking all derivatives exchanges to have ISINs issued for each delivery/strike of each future or option. After that idea was firmly rejected by the exchanges, the regulators and industry worked together and the idea of an Alternative Instrument Identifier (AII) was born. The AII would take the existing reference data (the venue's Market Identification Code (MIC), the product code, a marker indicating the derivative type, a put or call identifier, the expiry or delivery date and the strike price) to uniquely identify a derivative instrument. At 43 characters, it is not the snappiest of identifiers. Luckily, regulators are not expecting to receive the data as a single identifier, but thereby hangs another problem: what if some of the underlying reference data is incorrect or missing? How many cancellation reports will it take until the report is right? And perhaps more importantly, given the potential time lags, what value is that level of correction really going to provide to the European regulators?

Another issue with the AII is that it only applies to 11 trading venues in the European Union. The Options Clearing Corporation (OCC) and the Options Price Reporting Authority (Opra) both have different requirements for an identifier, again built on reference data for the derivative itself and the trading venue. Given this, the challenge for banks, clearing houses and regulators is that they potentially need to have three different projects for three different identifiers based on similar reference data. There is a real possibility of the existence of different mappings between different banks. This would result in a regulatory authority receiving a different identifier from bank A to the one received from bank B, removing any chance of direct comparison.

Another issue, in some way related to the issue of poor reference data for derivatives, is the difficulty in collecting this data. Data vendors do their best to carry relevant and comprehensive derivatives data. Currently (end April 2009), SIX Telekurs carries almost 1.25 million futures and options, nearly 1 million hybrid instruments and about 800 thousand warrants in our database. Our army of data experts around the world works valiantly to make sure that the data is clean and useful to our customers, but with this amount of data and with the rate of issuance of new instruments, we clearly rely on a great deal of highly complex automation. The 11 exchanges in the MiFID zone provide data in different formats and, in some cases, different formats for different derivatives types. It is obvious here that a single standard would help along the entire value chain, but a plethora of standards are in use and more have been proposed. A standardised data model is not available and it is this that feeds into the variety of different data formats that exist today. Creating a single standard would be nirvana. Although regulatory attention seems to be on the rise, evidenced by the release in March 2009 of a whitepaper on this subject by the European Central Bank's Francis Gross, we must be realistic and recognise that the uptake of data standards, even those invented by the industry itself, remains low. Even widely accepted standards take a long time to develop and adopt, leaving everyone with a problem in the meantime.
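The six AII components Newbury lists can be pictured as a simple fixed-width concatenation. A sketch, with one loud caveat: the field widths below are illustrative only, chosen so the fields sum to the 43 characters cited above, and the sample Eurex product code is hypothetical; the authoritative layout is defined in the CESR MiFID guidance:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AII:
    mic: str            # ISO 10383 Market Identifier Code of the venue
    product_code: str   # the venue's own product code
    deriv_type: str     # "O" for option, "F" for future
    put_call: str       # "P" put, "C" call
    expiry: date        # expiry or delivery date
    strike: float       # strike price

    def render(self) -> str:
        # Illustrative widths: 4 + 12 + 1 + 1 + 8 + 17 = 43 characters.
        return (f"{self.mic:<4}"
                f"{self.product_code:<12}"
                f"{self.deriv_type}"
                f"{self.put_call}"
                f"{self.expiry:%Y%m%d}"
                f"{round(self.strike * 1000):017d}")

# A hypothetical call option on a Eurex-listed product:
aii = AII("XEUR", "OGBL", "O", "C", date(2009, 12, 18), 118.5)
print(len(aii.render()))   # -> 43
```

Because regulators receive the components separately rather than as one string, a single mistyped field silently changes the identity of the whole instrument, which is exactly the cancellation-report problem described above.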

The volume of derivatives has grown at a steady rate over a number of years and their growth and use has outstripped the knowledge of some of the operations staff who handle data in banks. Knowledge is always an issue, especially as finding time to devote to training is more difficult as volumes grow and workloads increase. Again, banks need to be able to rely on the quality of data they receive and the systems they employ.

We have already seen how the sourcing of data is difficult. But systemic issues hamper the operational and business staff alike within a bank. In the same way that the human components of systems have struggled to keep pace, IT systems have struggled too. The number of ticks, the number of instruments and the fact that most systems were built to deal with equities and vanilla bonds inundates day-to-day operations and hamstrings developers, who have to squeeze new instrument structures into systems that simply weren't designed for them. The situation can be further amplified when a receiving system that has been altered to accept derivatives then feeds into an internal distribution system that is unable to accept these structures, leading to data fudges that can further compromise the standard of reference data used throughout the business.

So, there are many issues to be resolved. As with most issues in our industry, there is no single answer to realising improvements and there are many touch points that need attention. In the meantime, using a stable data vendor committed to open standards, continually training staff and making sure that systems infrastructure is suitable will all contribute to effectively managing the mass of data produced by the world's hectic derivatives trading.

In challenging times you need something to hold on to.


With information on over five million instruments, SIX Telekurs' reference database will never let you down.

www.six-telekurs.com

Derivatives Data Management - London Stock Exchange

Uniquely Identifying the World's Exchange Traded Derivatives


By Mark Husler, head of data and software business development, London Stock Exchange

The London Stock Exchange (the Exchange) is the UK's national numbering agency and has 30 years of experience in providing the industry with timely and accurate global reference data. To this end, the Exchange's own SEDOL Masterfile assists firms with their unique instrument identification by supplying unique, market level, global security identifiers designed to lower costs, streamline post-trade processing and settlement, as well as minimise the risk of cross border trade failures. However, the Exchange's offering is not limited to the equities and fixed income world. In January 2009, the SEDOL Masterfile allocated SEDOL codes, and ISIN codes for UK markets, to over two million exchange traded derivatives (ETDs). The derivatives markets can therefore now experience the same benefits enjoyed by SEDOL code users in the equity and fixed income markets.

Dealing with derivative reference data is no mean feat, as it makes up 40% of all financial information and is considered to be the building block for the automation of financial transactions. Typically, in the post trade process, the documentation department needs to use a number of derivative attributes to identify the security. The lack of a common identifier across global exchanges, vendors, front and back offices creates inefficiencies in the documentation process and hinders STP. According to the International Swaps and Derivatives Association (ISDA) 2008 Operations Benchmark survey, mismatched identifiers and associated reference data issues are the most common errors in trade documentation. Moreover, the growth in the volume and complexity of derivatives has added further complication to data management. According to the Futures Industry Association (FIA), global exchange traded derivative contracts increased by 28% in 2007 and, with rising volumes, the risks related to identification issues significantly increased.

The challenges that derivatives pose to the market include ensuring the continued suitability of existing systems, the availability of skilled resources and the sourcing of quality and timely data to ensure clients have the information they need for accurate, up-to-the-minute pricing. Consolidation in the market and increasing competition from multiple exchanges offering competing derivative products on the same underlying securities will magnify these challenges.

To better tackle the issues surrounding the derivatives data management challenge, SEDOL Masterfile has migrated to the UnaVista reconciliation service platform, providing customers with a new website and a global derivatives package. This enhancement enabled the allocation of two million SEDOL codes to identify unique global exchange traded derivatives sourced from over 80 global exchanges. The Exchange's service aims to help customers improve efficiencies in the documentation process and minimise the risks and costs associated with late settlement and trade failures. In order to achieve this goal, every derivative contract is identified at the exchange level by using the standard SEDOL seven-character alphanumeric format, providing a common global identifier for all derivatives. Derivative SEDOL codes cover over 80 global markets, including around 30,000 issuers, and are linked to the underlying issuer via the ISIN codes provided. These codes also incorporate GB ISINs for all derivatives on UK-based markets including Liffe, EDX and the London Metal Exchange (LME). The service provides additional reference data required by regulations such as MiFID, for example the Classification of Financial Instruments (CFI) code.

SEDOL codes are incorporated into most major data vendor solutions worldwide, but there is no substitute for collecting data directly from the source. As such, ETD SEDOL codes are available in a downloadable format and through a web services API, allowing participants to access data on a daily basis to automate their reference data systems. Derivative SEDOL codes allow users to streamline data feeds and trade with confidence with the thousands of other SEDOL and ISIN users across all time zones. They also enable users' front, middle and back offices to communicate seamlessly with each other, reducing the dependence on concatenating multiple reference data attributes. Earlier identification of new issues provides more time for instrument set-up, decreasing the risk of unwelcome dummy securities or codes being introduced and identification codes differing in front and back offices. Using SEDOL codes to identify derivatives enables potential discrepancies to be identified before trade inception, which minimises the need to repair trade failures manually. Also, the Securities Masterfile database is cross referenced to alternative industry identifiers, thereby removing the overheads associated with maintaining direct connectivity to multiple reference data suppliers. Worldwide coverage and open standards ensure derivative SEDOL codes are used by most global institutions and data vendors, helping to assist in cross border communications and automate trade processing.
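The seven-character SEDOL format mentioned above carries its own integrity check: the seventh character is a weighted check digit over the first six, which lets receiving systems catch most transcription errors before a trade is booked. A sketch of the published algorithm (weights 1, 3, 1, 7, 3, 9; digits keep their value, letters map A=10 through Z=35):

```python
# Check-digit calculation for a seven-character SEDOL.
WEIGHTS = (1, 3, 1, 7, 3, 9)

def char_value(c: str) -> int:
    # Digits keep their face value; letters map A=10 ... Z=35
    # (vowels are never issued in real SEDOLs).
    return int(c) if c.isdigit() else ord(c.upper()) - ord("A") + 10

def sedol_check_digit(stem: str) -> str:
    """Return the seventh (check) character for a six-character stem."""
    total = sum(char_value(c) * w for c, w in zip(stem, WEIGHTS))
    return str((10 - total % 10) % 10)

def full_sedol(stem: str) -> str:
    return stem + sedol_check_digit(stem)

print(full_sedol("B0YBKJ"))   # -> "B0YBKJ7"
```

The same routine run on receipt of a feed flags any code whose check digit does not match, which is the kind of pre-trade-inception discrepancy detection the article describes.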

SEDOL Masterfile: two million uniquely identified global derivatives now live

Derivative SEDOL
SEDOL Masterfile recently migrated to the UnaVista reconciliation service platform, providing customers with an enhanced website and global derivatives package. The allocation of two million SEDOL and ISIN codes to uniquely identify global exchange traded derivatives on over 80 global exchanges allows you to:
- trade with confidence with other SEDOL and ISIN users across all time zones
- enable seamless communication across your front, middle and back offices
- revolutionise your post-trade processing.
For further information about Derivative SEDOL, or to arrange a demonstration of our hosted reference data reconciliation and cross-reference services, please contact us on +44 (0)20 7797 3009 or email sedol@londonstockexchange.com. www.londonstockexchange.com/sedol
© May 2009 London Stock Exchange plc. London Stock Exchange, SEDOL Masterfile and the coat of arms device are registered trademarks of London Stock Exchange plc.

Derivatives Data Management - Xenomorph

Time to Embrace, Reject, or Ignore the Spreadsheet?


Brian Sentance, CEO, Xenomorph

What is Data Management in financial markets? The simplistic, but unsatisfying, answer is probably "anything you want it to be, so long as you can associate data with it". For many participants it concerns the management and distribution of real-time data; for others, the management of security terms and conditions, the management of counterparty and customer data, corporate events, security transactions and their positions, or the management of prices and valuations. Focussing on the derivatives industry, the answer to my initial question is probably all of the above, as no other area is so data intensive, requiring all types and categorisations of data to be delivered and linked together in a consistent, high quality and flexible manner. Given this diversity and complexity of data requirements, it is perhaps unsurprising that the spreadsheet, aka Microsoft Excel, is still the leading platform for derivatives data management in financial markets.

Centralising not Siloing


One of the great ironies of data management marketing is its desire to impress the importance of centralised data management, and moving away from isolated data silos, whilst at the same time presenting its product offerings to the industry in a manner that is itself siloed around specific types of data (e.g. reference data, market data, counterparty data etc). If a financial product is constituted from a variety of data types, curves and pricing models, how can it be automatically validated as fit for purpose if these constituents are located in separate, isolated systems? Put another way, derivative data management systems need to be business and product focussed rather than technology and data-type focussed. This is where the spreadsheet currently proves to be a vital tool for business users: it does not differentiate by data type; it copes with a high degree of data complexity; it is a fantastic integration tool for pulling data and analytics together; and it is ultimately easy enough for users to develop business solutions within the rapid timeframes they require.

The diversity and complexity of derivatives data management mentioned previously is also the reason why most of the data management vendor community chooses to ignore the problem of spreadsheet management of derivatives data. How many centralised data management implementations are deemed a success by proud vendors and IT departments, whilst ignoring the reality that front-office staff blithely generate a whole separate (and to a great extent duplicate) world of product and market data in desktop spreadsheets? Centralised, transparent data management it most certainly is not, and it is one factor why we all still see issues coming out in the press around front-office staff producing misleading derivative valuation numbers.

Of course, some of the issues in derivatives data management are more cultural than technical in nature. Derivatives are complex products and as such the knowledge of what makes derivatives data fit for purpose sits with the front-office traders. Back office/operations staff have their view on the processing data they need, whereas IT staff are often more focussed on technology rather than information management. These siloed participants in the generation and consumption of derivatives data should be working closely together to maximise the efficiency and effectiveness of the data management processes being operated. At larger institutions it is rare to find these departments strongly aligned and, combined with a lack of ownership of data management, this can lead to poor data quality and much higher operational costs.

Resorting to Spreadsheets
So without a single owner, and without front-office knowledge and involvement, traders will resort to using spreadsheets and the core data management systems will be ignored, increasing operational risk and missing a real opportunity to have the people who know most about the data involved in improving its quality. So should we continue to ignore the usage of spreadsheets in derivatives data management? I think this approach is to bury our heads in the sand and hope that the issue goes away. Should we try to remove spreadsheets from derivatives data management? No, whilst attractive to many in removing the operational risks of spreadsheet usage, I think this approach is impractical and ignores the very positive benefits of spreadsheet usage. So should we endorse the benefits of spreadsheet usage? In my view we should, taking the principles of simplicity, flexibility and ease of use of spreadsheets with a view to combining them with the control and transparency sought from data management. It might pain us all to admit it, but derivatives data management still has a lot to learn from the spreadsheet.


Xenomorph's TimeScape data and analytics management solution has been designed to support derivatives and structured products data with ease. TimeScape provides out-of-the-box support for mainstream and niche derivative data providers, automated data validation and data cleansing, support for complex data, integrated historic volatility surfaces, curves and spread curves, easy integration of zero curve and pricing models, and a powerful analytic framework for reporting and decision support. Email info@xenomorph.com to find out how TimeScape can automate and accelerate the management of your derivatives data.

Xenomorph. Data and Analytics Management.


Call: Europe +44 (0)20 7614 8600 | North America +1 888 936 6457 Email: info@xenomorph.com | Web: www.xenomorph.com

Derivatives Data Management - Roundtable

Mark Husler, head of data and software business development, London Stock Exchange

Our panel of data experts discuss the current market conditions, the future focus of the industry and how to tackle the tricky area of derivatives data management.

Reference Data Review Panel Debate: Derivatives Data Management


What is driving investment in derivatives data and/or data management systems in the current market?

Husler: The complications raised by growth in the volume, range and complexity of derivatives and the proliferation of regulation are placing ever-increasing demands on data management systems. It is widely agreed that volumes will increase significantly in the coming years, based on an increasing belief in the efficiency of including derivatives in trading strategies, either in conjunction with cash products, or exclusively.

David Lecompte, market development manager, SIX Telekurs

Lecompte: In a regulatory environment that is moving towards more control of all systemic risk elements, the need for improved data management solutions is more of a hot topic than ever. There are multiple causes of the current financial crisis and data management is probably not the main one. The period we are going through has its roots in a mixture of global imbalances, interest rate policies and, more generally speaking, regulations and governance. It is currently our political leaders' responsibility to lay the foundations for the sustainable development of capital markets. That said, the recent trend we have seen pushing data management debates higher up the banks' hierarchy certainly shows this topic has been underestimated in the past. In the current economic context, data is of paramount importance: it is used to evaluate assets, to feed risk management systems, and is disclosed to investors and demanded by regulators. More specifically in our industry, investment is awarded to solutions that can help mitigate risk, improve operational efficiency and, if possible, provide transparency regarding cost structures. For SIX Telekurs, information is everything: it is simply impossible to build reliable risk management systems on unreliable reference data.

Sentance: Risk and regulation, set in the context of the current financial crisis, are the current drivers behind investment in derivatives data management. Data quality and auditability remain important issues for regulators in assessing the overall risk management capability of an institution. Counterparty risk has understandably become a key driver behind the need for accurate counterparty data. New regulatory initiatives, such as those around liquidity risk and improved scenario management, are presenting new challenges and opportunities. Greater granularity of data will be required at a centralised, firm-wide level. Regulation, cost pressures and the need for greater transparency also present new opportunities for data management to gain top management buy-in, but firm-wide projects will continue to rub up against the tactical, more siloed needs of individual business units.

Brian Sentance, CEO, Xenomorph

What are the major challenges to be tackled in this area of the market?

Husler: The growth in the volume and complexity of derivatives means that the challenges are likely to be ensuring the continued suitability of existing systems, the availability of skilled resources and the sourcing of quality and timely data to ensure clients have the information they need for accurate, up-to-the-minute pricing. Consolidation in the market and increasing competition from multiple exchanges offering competing derivative products on the same underlying securities will magnify these challenges.

Lecompte: Volumes and complexity are two major challenges in derivatives data management. If we consider our own database only, traded futures and options account for more than a quarter of the five million financial instruments we hold. And on an annual basis, the highest growth rate in the number of instruments is for exchange-traded derivatives, structured products and warrants. There are specific issues when dealing with derivatives from a data processing point of view. For exchange-traded derivatives, although processing can be automated for the creation and maintenance of such instruments, it still requires a high degree of involvement from the back office teams in order to set up new series of instruments. The core of the data challenge is to cope with ever-increasing volumes and a need for efficient data updates. When you need to refresh prices in a snapshot mode for 100,000 derivatives at a pre-defined point in time, you obviously expect reliable systems and precise timestamps from your data provider. Our Intraday Pricing Service (IPS) caters particularly well to this kind of requirement. For more complex structured products, the challenge is different and lies in the capacity to capture terms and conditions so that they can be disseminated to downstream systems. For example, at SIX Telekurs we have always worked on coding complex structured products' redemption conditions. It requires highly skilled operations teams, but the benefits are worth the effort. Processing derivatives and complex financial instruments is a challenge, but it is the necessary foundation upon which sound risk management practices can be built.

Sentance: I think the challenges are both cultural and technological. The cultural challenge is that each party has a different viewpoint on, and different needs of, the data. This is not surprising given the number of parties involved in generating, managing and consuming data. For instance, operations staff are typically focused on data relating to trade processing; front office and risk staff are more focused on pricing and sensitivity data; and technology staff might be more interested in the management of a database, rather than the management of the data contained in it. Furthermore, it may also be the case that no single, clear owner of the data management process has been designated. In data management, or EDM, we often talk about technical architecture, but I think investing in understanding and managing the people architecture of a data management project is vital to its success. From a technological point of view, I believe that there will be an increasing convergence of the silos within the data management industry itself: users need systems that combine reference data, product data, counterparty data, market data and derived data. Risk and regulatory reporting will drive this more integrated view of all the data to be analysed and processed. This does not necessarily lead to the need for a single system that does all, but rather means that all these different types of data must be knitted together in an easy-to-understand form for end users such as risk managers. I also believe that the principles of centralised data management (consistency, transparency) will have to extend to the management of pricing model analytics and zero curve calculators. Why create a great data management infrastructure if you then rely upon pricing or valuation numbers provided from a desktop spreadsheet? Put another way, the investment in the quality of instrument data is wasted if the process is not carried through to the valuation numbers derived from this data. In my view, this disconnection in the management of derivatives needs to be corrected by bringing data and analytics closer together.
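Lecompte's snapshot-mode refresh, taking the latest reliable price for each instrument at or before a pre-defined cut-off time, can be sketched in a few lines. This is an illustration of the general idea only, not a description of how the IPS service actually works; the instrument names, tick values and cut-off are all invented.

```python
# Hypothetical sketch of a snapshot refresh: for each instrument, keep the
# latest tick at or before the snapshot cut-off timestamp.
from datetime import datetime

# (instrument, price, timestamp) ticks, as they might arrive from a feed.
TICKS = [
    ("DAX-FUT-SEP09", 4950.5, datetime(2009, 6, 15, 10, 59, 58)),
    ("DAX-FUT-SEP09", 4951.0, datetime(2009, 6, 15, 11, 0, 2)),  # after cut-off
    ("ESTX50-OPT-C2500", 112.3, datetime(2009, 6, 15, 10, 58, 41)),
    ("ESTX50-OPT-C2500", 112.6, datetime(2009, 6, 15, 10, 59, 59)),
]

def snapshot(ticks, cutoff):
    """Latest price per instrument at or before the cut-off timestamp."""
    latest = {}
    for instrument, price, ts in ticks:
        if ts <= cutoff:
            prev = latest.get(instrument)
            if prev is None or ts > prev[1]:
                latest[instrument] = (price, ts)
    return {name: price for name, (price, ts) in latest.items()}

snap = snapshot(TICKS, datetime(2009, 6, 15, 11, 0, 0))
print(snap)
```

At the scale Lecompte describes (100,000 instruments), the same cut-off logic applies; the engineering challenge is doing it reliably with precise, consistent timestamps across the whole universe at once.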

How does your offering compare/differentiate itself from everything else on the market?

Lecompte: SIX Telekurs is a global financial information provider and, in saying that, we mean that our aim is to deliver sustainable solutions to our clients in the field of data management. More precisely, thanks to our expertise, we can contribute to operational efficiency and compliance with regulations. Having operations in all major financial centres around the world also means we are close enough to our customers to ensure that our product offering adapts to specific business needs and relevant regulatory environments. Right now, what we hear from our customers is a need for increased timeliness in the delivery of reference data. This goes for derivatives as well as plain vanilla underlying instruments. To address this business issue we will introduce an add-on to our reference data feed, VDF. VDF Pulse will deliver basic reference and cross-reference data on new institutions, instruments and listings every 15 minutes, in other words, as soon as we capture the data in our database. We believe this will enable our customers to satisfy their needs across their organisation. More specifically, it will contribute to narrowing the information gap between front and back office areas.

Husler: At present we are the only vendor to offer an open-standard, unique identifier widely distributed by all major global vendors, covering all instrument types, including ETDs.

Sentance: In the derivatives data management area, I would say that ease of use and the customisability of the data model is a key differentiator. Our clients find it easy to add new asset classes, add new fields and support complex data structures without having to return to us to make the customisation each time. This increases efficiency and reduces the operational running costs of the system. Operational risk is also reduced, since front office staff can be reassured that the new data management system has the flexibility to adapt to their needs, reducing the need for spreadsheet-based management of data. Combined with this ease of use, our TimeScape product supports simple through to very complex data types (matrices, curves, baskets) and can optionally store them on both a historic and multi-sourced basis. One of the main reasons spreadsheets are still used extensively to manage derivatives data is the flexibility they offer, and in this regard we have extended the complex data types TimeScape supports to include spreadsheet-like data and calculations.
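The idea of storing complex data types such as curves on a historic and multi-sourced basis can be illustrated with a toy data model. To be clear, everything below is a hypothetical sketch of the general concept, not TimeScape's or any other vendor's actual API: the class, method names and source-preference logic are all invented for illustration.

```python
# Toy illustration of multi-sourced, historic storage of a complex data type
# (a volatility curve). A hypothetical sketch, not any vendor's API.
from datetime import date

class CurveStore:
    """Store values keyed by (instrument, field), tagged with source and as-of date."""
    def __init__(self):
        self._data = {}

    def put(self, instrument, field, source, as_of, value):
        self._data.setdefault((instrument, field), []).append((as_of, source, value))

    def get(self, instrument, field, as_of, preferred_sources):
        """Latest value at or before as_of, trying sources in preference order."""
        history = self._data.get((instrument, field), [])
        for source in preferred_sources:
            candidates = [(d, v) for d, s, v in history if s == source and d <= as_of]
            if candidates:
                # Return the value carried by the latest as-of date.
                return max(candidates, key=lambda dv: dv[0])[1]
        return None

store = CurveStore()
# A vol curve is a complex value: a list of (expiry_years, implied_vol) points.
store.put("DAX-OPT", "vol_curve", "vendor_a", date(2009, 6, 12), [(0.25, 0.32), (1.0, 0.28)])
store.put("DAX-OPT", "vol_curve", "vendor_b", date(2009, 6, 15), [(0.25, 0.33), (1.0, 0.29)])

curve = store.get("DAX-OPT", "vol_curve", date(2009, 6, 15), ["vendor_b", "vendor_a"])
```

The design choice worth noting is that history and source are part of the key, not overwritten in place, which is what makes auditability and source-preference fallback possible.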


What are the major trends in the market that are likely to impact your offering in the future? The increasing complexity of derivatives, for example?

Lecompte: Regulation and pressure on costs are without a doubt two major business drivers for investment in data management. For instance, financial institutions will have to develop an in-house capacity to understand and evaluate all the assets they hold, and possibly compare their findings with external providers' data. The big players will have no problem putting together the right resources to perform those tasks; smaller players might want to rely on a third party for valuation services. Concerning costs, many companies right now are embarking on cost control exercises. The short-term effect might be a decrease in new projects. But in the long term, greater focus on data management at financial institutions is leading us towards building stronger relationships with our customers, working together to develop solutions and delivering more training and advice. This period is an opportunity for providers like SIX Telekurs to show the real value we can offer with a fully coded reference data feed like VDF. As for the complexity of financial instruments, there is a trend towards increased data exchange across organisations. This calls for more automation and more standardisation, and should continue to be a source of data management projects. Overall there is a need for more information. But this means better quality information, more standardised information favouring STP and more value-added information.

At SIX Telekurs we maintain one of the biggest reference databases, with more than five million financial instruments. This includes derivatives as well as equities, bonds, funds, forex and also companies. We pride ourselves on providing quality data. In order to satisfy our customers' needs in asset valuations, we have chosen to broaden our offer in that field. Our Fair Value Pricing Service now provides price updates four times a day for about 90,000 bonds in 11 different currencies.

Husler: The main trend will be the sheer growth in volume. There are expectations that the cost efficiency and control provided by exchanges mean that the volume of exchange-traded derivatives will increase at the expense of OTC derivatives. Having said that, our next challenge is to proactively offer Sedol codes for OTC derivatives.

Sentance: The regulators' decision on whether OTC derivatives are to be centrally cleared, or perhaps put on exchange, is vital to how things will develop in the future. Product standardisation inevitably leads to higher transaction volumes and hence more data to be managed and analysed. In contrast, product innovation and customisation will continue to present challenges in the management of unusual or non-standard data. I personally do not believe that complex derivatives will disappear, just that they will be done in smaller size, and there will be more regulatory pressure to quickly move standardised products through central clearing and on to exchange trading. Maybe we now need a new verb for this, but 'vanilla-ise' doesn't work for me! At a recent update meeting of the EDM Council it was also mentioned that the regulators are considering the creation of a data utility to sit between the exchanges and the distributors and consumers of data. Standardisation efforts like this will bring fundamental change but, as ever, only time will tell what will work.

How is the vendor community evolving in this area? What can we expect the landscape to look like in the next five years?

Lecompte: The growth in derivatives has led to an increase in the number of technology providers in the derivatives valuation and data management space. If we focus only on the asset valuation area, there are multiple actors in this field: traditional data vendors, software providers, research companies, risk specialists, marketplaces and banks' securities services. In a way it is good to see that financial institutions now have access to a wide choice of business partners to fulfil their valuation duties. Now, the essential part here is to assess which partners can offer reliable data, secure processes, transparent methodologies and the financial soundness to work with a sustainable view. Indeed, there is a new market space for providing valuation services and, to some extent, data management solutions, but our customers will always favour long-term partnerships, commitment and independence. Forecasting is a very difficult exercise right now, but there is room for both global data vendors and more specialised niche players.

Husler: The vendor community has long recognised the advantage of ISIN and Sedol codes as cross-platform, multi-asset identifiers, enjoyed in particular by the sell side and increasingly used by the buy side. To this extent, all the major vendors carry the ISIN and Sedol codes and we expect this to continue. In the next five years it is possible that the number of different identification codes will be reduced to just a handful, helping clients to reduce costs, increase efficiency and minimise trade failures through misidentification.

Sentance: The landscape for data management will obviously depend on how the financial markets develop in the next few years and, given the number of regulatory initiatives proposed at the moment, it will be an interesting time. Despite this uncertainty, I think that certain aspects of data management will become commoditised as they become more standardised and, as a result, the data management vendors will end up moving further up the food chain with the centralised management of more complex business objects, analytics and processes. It is certain that there will be more standardisation of data, driven by both the regulators and the industry (take for example current initiatives on standardised business entity identifiers), increased volumes of data and a more integrated approach to risk management, with all that this entails for data quality and auditability. Data management systems will become real-time and not just batch, facilitated by grid, distributed caching and cloud computing. To summarise, I think the current crisis presents a real opportunity to get the technical and human infrastructure in place to manage data and analytics more effectively than ever before.
