The past 12 months have seen a slew of regulatory measures designed to mitigate the complexity and lack of visibility that were major contributors to the financial crisis. A key area of focus is Europe, where the European Market Infrastructure Regulation (EMIR) is enforcing the use of Central Counterparty Clearing Houses (CCPs) for processing derivatives post trade.
At the same time, the Basel III Capital Requirements Directive (CRD IV) substantially increases the capital and collateral requirements for OTC derivatives in an attempt to shift trading onto exchanges and CCPs, thereby reducing both the likelihood and the impact of default for any trades that remain bilateral.
On the other side of the Atlantic, it is perhaps no surprise that the all-encompassing Dodd-Frank Act has something to say on the matter. It will enforce reporting and capital requirements similar to those of Basel III/CRD IV, and make central clearing for standardized derivative trades mandatory by the end of 2012.
The Dodd-Frank era will see increased regulatory oversight alongside higher capital and collateral charges for OTC trades – although there will be charge exemptions for firms that can demonstrate that derivatives were used purely for non-speculative hedging purposes.
The watchwords are transparency and liquidity. On the understanding that sunlight is the best disinfectant, the new rules are intended to strengthen the essential infrastructure supporting global financial markets and better position institutions to foresee, withstand and avoid financial shocks. After all, no one wants to be the next Lehman Brothers, no one wants to be exposed to the country that defaults on its sovereign debt, and no one can be completely confident that the already shaky markets would withstand another shock of that magnitude.
The likely impact
It should be clear that if derivative trades leave the OTC, call-round or upstairs markets, there will be a compliance cost for individual firms. There will be no more back-room deals with sketchy, ad-hoc or idiosyncratic reporting. As with any regulatory regime, the new demands will increase the volume of data that must be held and made readily available, whether that is information on collateral obligations and liquidity, the chain of counterparties behind any given position, or demonstrable proof that a given trade is part of a hedging strategy rather than speculative activity.
That has obvious implications for the way in which data is managed within the institution. The impact became even clearer in August, when the International Organization of Securities Commissions (IOSCO) and the Committee on Payment and Settlement Systems (CPSS) issued a joint template for new requirements for data reporting and aggregation of OTC derivative trades. The proposed legislation, which would come into effect as early as 2012, received backing from the International Swaps and Derivatives Association (ISDA), which, in an open letter, reiterated the need for global trade repositories (TRs) to ensure maximum visibility.
TRs are recognized throughout the financial regulatory community for their ability to bring transparency to previously opaque markets. The Dodd-Frank Act has identified repositories as one of the "three pillars" of its new infrastructure requirements, and the standards proposed by IOSCO and CPSS will reinforce the ability of TRs to provide regulators with the tools necessary for analyzing and assessing systemic risk.
In this environment, voluntary reporting is consigned to history. The need to centrally collect and report data held in TRs has huge implications on a firm’s data management infrastructure as well as its governance processes. The impact is particularly significant as the majority of current systems were not built to cope with the onslaught of requests for greater transparency the markets are currently witnessing.
The necessary response
Financial institutions will require some form of interface between their own data management system and that of the TR, and it will need to be an automated interface, given the well-known perils and risks associated with manual processing.
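The shape of such an automated interface can be sketched in a few lines. Everything here is illustrative: the `TradeReport` structure, its field names and the JSON serialization are assumptions for the sketch, not any repository's actual schema or API.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical trade record; the fields are illustrative, not a TR's real schema.
@dataclass
class TradeReport:
    trade_id: str
    product: str
    notional: float
    currency: str
    counterparty_lei: str

def to_submission_payload(report: TradeReport) -> str:
    """Serialize a trade report deterministically for automated submission."""
    return json.dumps(asdict(report), sort_keys=True)

# In practice the payload would be transmitted to the TR's endpoint, with the
# same code path used for every trade - no manual rekeying anywhere.
payload = to_submission_payload(TradeReport(
    trade_id="T-0001", product="IRS", notional=10_000_000.0,
    currency="USD", counterparty_lei="5493001KJTIIGC8Y1R12"))
```

The point of the sketch is that the mapping from internal data model to repository message is code, not a spreadsheet: it runs the same way for every trade and can be validated before anything leaves the firm.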
They will also need to adopt a standardized classification of derivative products. Without standardization, it is difficult – if not impossible – to collateralize against them accurately. With most exchange-traded assets, creating a risk-weighted calculation is relatively straightforward. With something as exotic as a complex option or as intangible as a future, the risk profile is far from clear, and it is therefore very hard to assess what kind of capital should be held against it.
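What a standardized classification buys can be shown in miniature. The taxonomy codes and risk weights below are invented for the sketch – they do not come from any regulatory standard – but the mechanism is the point: once a product maps to a standard classification, the capital calculation becomes a lookup rather than a judgment call.

```python
# Illustrative taxonomy: product code -> (asset class, product type).
# Codes and weights are invented for this sketch, not regulatory values.
TAXONOMY = {
    "interest_rate_swap": ("Rates", "Swap"),
    "equity_option":      ("Equity", "Option"),
    "fx_forward":         ("FX", "Forward"),
}

RISK_WEIGHTS = {"Swap": 0.01, "Option": 0.08, "Forward": 0.02}

def capital_charge(product: str, notional: float) -> float:
    """Resolve the standardized classification, then apply its risk weight."""
    asset_class, product_type = TAXONOMY[product]  # fails loudly if unclassified
    return notional * RISK_WEIGHTS[product_type]
```

An unclassified product raises an error rather than slipping through uncollateralized, which is exactly the discipline the standardization effort is meant to impose.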
Holding these classifications will change the data model at most firms. For any institution with an inflexible data management system – a sizable majority – integrating new classifications, determining which algorithms to use with which classifications, and deciding how risk calculations are to be performed will put the system under almost unbearable strain.
Bringing OTC derivatives in from the cold also highlights the problem of the proliferation of legal entities in the marketplace. It is estimated that, before its collapse, Lehman Brothers alone comprised more than 2,500 separate entities. Mapping total global exposure to all counterparties and all guarantors across all instruments, regions and desks into a single report can take days in an environment where even a second's delay can have a serious effect on the outcome of any execution.
Firms need some way to manage this tension between the need for speedy execution and the necessary pre-trade risk checks that eliminate costly breaches of client mandates and regulatory requirements. The LEI – or legal entity identifier – is rightly being touted as the solution to this problem, and it will certainly enhance visibility and transparency of a highly interconnected global market.
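Once every counterparty carries a standard identifier, that aggregation collapses into a simple keyed roll-up. The sketch below assumes an in-memory list of positions and an invented entity-to-parent mapping; the LEIs are sample strings, not real identifiers.

```python
from collections import defaultdict

# Hypothetical positions booked on different desks; LEIs are sample strings.
positions = [
    {"desk": "NY-Rates",   "counterparty_lei": "LEI-AAA", "exposure": 5.0},
    {"desk": "LDN-Credit", "counterparty_lei": "LEI-BBB", "exposure": 3.0},
    {"desk": "HK-FX",      "counterparty_lei": "LEI-AAA", "exposure": 2.0},
]

# Mapping each entity to its ultimate parent lets exposure roll up group-wide,
# so the 2,500-entity problem reduces to one lookup per position.
ULTIMATE_PARENT = {"LEI-AAA": "LEI-GROUP1", "LEI-BBB": "LEI-GROUP1"}

def exposure_by_parent(positions):
    """Sum exposure per ultimate parent entity across all desks and regions."""
    totals = defaultdict(float)
    for p in positions:
        totals[ULTIMATE_PARENT[p["counterparty_lei"]]] += p["exposure"]
    return dict(totals)
```

The hard part in practice is not the roll-up itself but maintaining the identifier and hierarchy data – which is precisely why a consistent, market-wide LEI matters.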
Weighing up the costs
From a data management perspective, whether the need is for standardized classifications of derivatives or of legal entities, the huge data collection, validation and reporting requirements will strain many of the data management solutions and infrastructures currently in place. We estimate that the cost of changing even one attribute on a data file could be substantial once the change has been replicated throughout the organization and its multitude of data, technological, operational and functional silos.
But the latest swathe of regulations requires wholesale change, and updating every field could come with a significant bill simply because financial institutions lack flexible systems that can cope with rapid regulatory change.
The pace of regulatory change post-2008 has escalated to the point where the SEC is talking about 400 new rules regarding derivatives alone. We have entered an era of constant change, in which tinkering around the edges to accommodate today's requirements simply creates more problems six months down the track, when a new set of directives or guidelines comes online. This is a problem across the industry: the systems in place clearly were not designed to deal with the data volumes, the number of requests, or this pace of regulatory change. And although there is a cost involved in upgrading data management infrastructure, the opportunity cost of not doing so, and the associated risks, are overwhelming.
In this sense, investing in data management should be viewed as an investment in the risk infrastructure.
Facing the future
Regulator-ready data management is now critical. And by "regulator-ready," we mean data management solutions that can flex and scale to meet the individual firm's demands now, as well as in six months and in five years. A regulator-ready solution is one that can handle the volumes and provide accurate, accessible and actionable information wherever it is needed. It enables institutions to respond efficiently and easily to customer demands, new regulations and changing market landscapes.
But more than that, regulator-ready systems will provide complete and consolidated information about the entire firm’s positions, rather than the narrowly focused silos of mismatched data that have plagued the industry for too long. They also integrate seamlessly with risk systems, trading systems, accounting systems and others. There is no longer room to consider data management as a separate function that falls under the remit of IT or operations departments. It is instead a strategic imperative that drives every critical function of the financial institution.
Preparing for the current changes and the plethora of new demands on the horizon requires a fresh and strategic approach to data management. The changes to OTC derivatives markets are but the latest driver to update the approach to data. There will be many more to come.