Asset Control's John Mitchell on the need for financial institutions to purge the toxic data in their systems.
From the early part of the last decade through to 2008, the
financial services industry was awash in profits, a party where
nearly everyone had their fill. Then in late 2008 the excess
finally caught up with the markets and a lengthy hangover
ensued. Four years later, the financial system still carries a
significant number of toxins in its assets, its operations and,
down at the core, its data.
In data management terms, toxicity can be described as the
vestigial data processes that have been built to support the
very complex and global businesses of trading and investment in
the last decade. The problem is that for many institutions the
sheer scale of data management has created monumental,
monolithic data management structures that focus on delivering
a single view of the truth when in fact multiple golden copies
may be required.
The alternative, of having data sourced individually by each
system, business unit or asset class, is no better. In light
of the substantial data management requirements financial
services firms now face, this approach leads to chaos,
duplication, inefficiency, opacity and higher costs.