
8 out of 10 Reference Data Users Can't Be Wrong, Can They?

The power of the reference data community – Data Quality

The force of the community data model applied to reference data is consistent and progressive, but at times supercharged in its reaction to fundamental market change. The real-time use of reference data in daily trading operations offers a dynamic test of data quality and is an essential part of an inclusive quality assurance process. In a community data model, quality assurance does not stop at the factory gate but is leveraged through the participation of users, in the collective acknowledgement that guaranteeing 100% accuracy would be commercially prohibitive for any vendor and, by extension, for the industry. It is not an abdication of a vendor’s responsibility to expect its user community to fine-tune data accuracy; it is a rational and modern approach to what is a mutually dependent ecosystem. Reference data quality is today very much a differentiator in the competitive data supply market, but in pure terms it should not be: by both name and nature, reference data should be standard in any use context, neutral in terms of conferring competitive advantage, and uniformly accessible, allowing firms to concentrate fully on the strategies that fulfil their prime directive of wealth creation.

The rise of crowdsourcing, whether for fundraising, opinion forming or problem solving, taps into the many-minds principle, which has a well-documented record of success. The evolution of crowdsourcing beyond content creation is content curation, which leads back to the question of where along this spectrum a vendor’s engagement should sit to optimise the benefit of its data community. The injection of community-washed data into a siloed database raises overall data quality by exposing differences or gaps in the silo data. A further benefit of the community data model is that participants share a symmetric exposure to data quality risk, with counterparties’ exposure mitigated by trade controls that intercede in transaction flow and flag a process issue. The feedback loop to the vendor data source is timely, and corrections are broadcast back to the community, now more often through regular calls to APIs and delta files highlighting data changes.
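By way of illustration only, a consumer-side correction loop driven by a delta file might look like the sketch below; the file layout, field names and local store are assumptions made for the example, not any particular vendor's format or API.

```python
import csv
from datetime import datetime, timezone

# Hypothetical local reference data store keyed by instrument identifier.
local_store = {
    "FUT-XYZ-DEC25": {"tick_size": "0.25", "contract_size": "1000", "currency": "USD"},
}

def apply_delta_file(path, store):
    """Apply a vendor delta file of reference data corrections to a local store.

    Assumes a CSV layout of instrument_id, field, new_value; real delta
    formats and field names vary by vendor.
    """
    applied = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            record = store.setdefault(row["instrument_id"], {})
            old_value = record.get(row["field"])
            record[row["field"]] = row["new_value"]
            # Keep a simple log of what changed, so downstream processes can
            # see which community corrections have been applied and when.
            applied.append({
                "instrument_id": row["instrument_id"],
                "field": row["field"],
                "old": old_value,
                "new": row["new_value"],
                "applied_at": datetime.now(timezone.utc).isoformat(),
            })
    return applied
```

The same pattern applies whether changes arrive as a daily file or as responses to regular API polls; the point is that a correction surfaced by one participant propagates to the whole community.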

Managing Partnerships and Strategic Alliances

Alliances and partnerships are nothing new, but in a collaborative context they have never been more active, often driven by the need for new technologies to find a home in markets facing strong headwinds. Other partnership drivers include a vendor’s strategic need to present its data to as wide an audience as possible, reaching users of otherwise closed systems who chose their supplier on product functionality. But as the data community at large well knows, it is not just form but content that decides the overall level of system performance within an organisation.

The most successful alliance partnerships are those where a mutual dependency exists, where products or services can only be delivered through collaboration, and where specific market needs are met or future market positions are being taken. In many respects, a tightly focused collaboration can produce more timely and direct results for customers than broader industry-wide collaboration, which is necessarily long term given the more diverse needs of its participant members, although the big solutions, when implemented, re-shape an industry and its dynamics. However, the low frequency of such solutions trails woefully behind issues that need fixing today, where the longer a fix is in place, the more de facto standards are created and the more processes become embedded.

Data Management and FinTech

With cycle times for new technology decreasing, enabling ever more sophisticated functionality at the same or reduced cost, there is little debate about where the focus should be for firms whose free capital is unnervingly thin after the non-discretionary spend driven by regulatory reporting and risk management: lower the operational cost base and invest in trading regimes that increase the returns from wealth creation. Run-the-bank budgets have been eclipsed by change-the-bank spending programmes, often at the expense of build-the-bank budgets, with only the retention of quality customers remaining core to stabilising the revenue that funds such change.

From a data management perspective, there is still the old chestnut to crack: how to ensure golden copy data persists throughout the trade lifecycle and across the many moving parts of a trading organisation. Analogous to a District General hospital, the investment over many years in legacy system maintenance, fixes and replacement migrations presents a genuine maze for golden copy data to navigate from end to end. Few opportunities exist to build on a greenfield site with a shiny new and complete trading system, so the reality still faced by firms is to optimise their overall process in a least-worst way, in the acknowledgement that it may actually lead to sub-optimisation, higher maintenance overheads and increased risk.

The management of golden copy data plays second fiddle, first to a front office chasing racier new tech to fulfil ever more complex trading strategies, and second to the high impact of the significantly increased capital provisions demanded by new global regulation. Collaboration can solve the inconsistency in data quality, this time by leveraging internal resources through data management systems that users can update and maintain via federated but connected access points across the enterprise. Management-driven system checks and balances provide an auditable method of control, consistent with maintaining the specified level of data quality.
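As a purely illustrative sketch of the kind of auditable, management-driven control described above, the record structure and validation rule below are assumptions made for the example, not a description of any specific data management product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class GoldenCopyRecord:
    """A simplified golden copy record with an embedded audit trail."""
    instrument_id: str
    attributes: dict
    audit_trail: list = field(default_factory=list)

# Hypothetical validation rule a data steward might configure centrally.
ALLOWED_CURRENCIES = {"USD", "EUR", "GBP", "JPY"}

def update_attribute(record, attribute, new_value, user, reason):
    """Apply a federated user update, enforcing checks and recording
    who changed what, when and why."""
    if attribute == "currency" and new_value not in ALLOWED_CURRENCIES:
        raise ValueError(f"{new_value} is not an approved currency code")
    old_value = record.attributes.get(attribute)
    record.attributes[attribute] = new_value
    record.audit_trail.append({
        "attribute": attribute,
        "old": old_value,
        "new": new_value,
        "user": user,
        "reason": reason,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
```

The design point is simply that every federated update passes the same checks and leaves the same audit entry, wherever in the enterprise it originates.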

Data Utilities

Data utilities have their roots in collaboration with a relatively small group of clients and a much larger group of participating vendors. The many-to-one flow of data feeds from the market into a specific client is a recurring IT integration and management overhead, as new feeds are regularly deployed and retired feeds turned off. There is latent demand for data management services to outsource the handling of multiple market data feeds as a repeating, non-value-adding process, one which absorbs IT hardware and skilled personnel and presents the same operational issues for most firms.
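One way to picture that overhead is the per-feed adapter code a firm maintains just to normalise each source into a common internal shape; the sketch below uses invented feed layouts purely for illustration.

```python
# Each market data feed arrives in its own layout, so a firm maintains one
# adapter per feed just to reach a common internal record. The field names
# and feed identifiers here are invented for the example.

def adapt_feed_a(raw):
    return {"symbol": raw["sym"], "price": float(raw["px"]), "source": "feed_a"}

def adapt_feed_b(raw):
    return {"symbol": raw["ticker"], "price": float(raw["last_price"]), "source": "feed_b"}

ADAPTERS = {"feed_a": adapt_feed_a, "feed_b": adapt_feed_b}

def normalise(feed_name, raw_message):
    """Route a raw feed message through the adapter registered for its source."""
    adapter = ADAPTERS.get(feed_name)
    if adapter is None:
        raise ValueError(f"No adapter registered for {feed_name}")
    return adapter(raw_message)
```

Onboarding a new feed means writing and supporting another adapter; retiring one means decommissioning it. That repeating, non-value-adding work is exactly what a utility can take on once for many firms.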

Cost Mutualisation

The regulatory capital burden across asset classes, reporting and risk functions, and the specialist capabilities firms need to be compliant, are opening up opportunities for niche providers, as firms realise they are better off buying or renting rather than sacrificing scarce capital for a lesser return on investment by building in-house. This environment is analogous to a General Practitioner recognising symptoms but referring a really sick patient to a specialist Consultant in a collaborative delivery of medical care, where the Consultant is better equipped to both diagnose and treat the underlying cause. Combine this attitudinal shift in procurement rationale with de rigueur cost mutualisation initiatives, and the outreach to the vendor community has to feature collaboration at its heart to deliver the kind of investment returns demanded. In addition, the desire to reduce fixed operational costs and substitute them with variable costs that can be scaled up or down with business demand is high on both CFO and COO agendas. Although this scalable business model is not new, it is gaining more serious attention, alongside a paradigm shift in firms’ expectations of how vendors should engineer data at source to avoid client firms’ downstream costs after delivery; the engagement model is, once again, based on the power of collaboration.

It looks like true industry collaboration is becoming a material facet of the permanent improvement of trading operations and the delivery of capital efficiencies at a time when it is most needed.

Contact us to find out more about our collaboration with industry partners, enabling us to deliver on-demand futures and options instrument data for over 100,000 contracts across 110+ exchanges.
