Austria’s Alpine slopes tend to get most attention at the start of a year, but in central bank circles, all eyes in early 2015 are on the opposite end of the country. In Vienna, ambitious changes to the collection and interrogation of bank data by Austria’s central bank, the Oesterreichische Nationalbank (OeNB), are causing quite a stir.
The solution adopted by the country’s forward-thinking central bank and banking sector represents a new approach to regulatory reporting, leaving formatted templates to the annals of history.
The new methodology creates a software platform that bridges the gap between the IT systems of the OeNB and the banks. This allows critical information to be extracted from the sector at will by the central bank without increasing the administrative burden for the data providers.
It marks a significant shift in regulatory and statistical reporting, away from the archaic system of form-filling, to a future framework better able to cope with the growing demands of supervisors, including ad hoc requests that fall outside the regulatory reporting cycle.
Growing compliance burden on financial services industry
From conversations with our banking and insurance clients, we know how much pressure they are under from their growing compliance duties, which often force repeated data entry and have started to affect the efficiency of day-to-day operations.
Since the latest regulatory rules were introduced in the eurozone in 2014, all 8,000 or so EU banks are committed to reporting up to 700,000 data points(1) quarterly, in layer upon layer of different templates, in a digital language called XBRL. And this is just for prudential reporting; combined with statistical reporting, the total is far higher. How long before we reach 1,000,000 data points? Is that really sustainable?
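To make the template burden concrete, here is a minimal sketch of what a single reported data point looks like as an XBRL fact, built with Python’s standard library. The concept name (‘TotalExposure’), entity identifier, context and figures below are invented for illustration; real filings use concepts, contexts and units defined in the EBA’s taxonomies, and a single COREP or FINREP submission contains thousands of such facts.

```python
import xml.etree.ElementTree as ET

XBRLI = "http://www.xbrl.org/2003/instance"
ET.register_namespace("xbrli", XBRLI)

root = ET.Element(f"{{{XBRLI}}}xbrl")

# A context identifies the reporting entity and the period a fact covers.
context = ET.SubElement(root, f"{{{XBRLI}}}context", id="c_2014Q4")
entity = ET.SubElement(context, f"{{{XBRLI}}}entity")
ET.SubElement(entity, f"{{{XBRLI}}}identifier",
              scheme="http://example.org/banks").text = "DEMO-BANK"
period = ET.SubElement(context, f"{{{XBRLI}}}period")
ET.SubElement(period, f"{{{XBRLI}}}instant").text = "2014-12-31"

# A unit says what the number is measured in.
unit = ET.SubElement(root, f"{{{XBRLI}}}unit", id="u_EUR")
ET.SubElement(unit, f"{{{XBRLI}}}measure").text = "iso4217:EUR"

# One data point: an invented 'TotalExposure' concept tied to its
# context and unit. Every cell of every template becomes a fact like this.
fact = ET.SubElement(root, "TotalExposure",
                     contextRef="c_2014Q4", unitRef="u_EUR", decimals="0")
fact.text = "1630000"

print(ET.tostring(root, encoding="unicode"))
```

Each of the hundreds of templates multiplies facts like this across rows, columns and reporting dates, which is how the count climbs towards 700,000 data points per bank per quarter.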
Frustrations were apparent at the annual gathering of the financial services industry at Euro Finance Week in Frankfurt am Main in November 2014. Christian Clausen, who was president of the European Banking Federation (EBF) until the end of 2014, told the audience that it was time to ask whether we have gone too far, too fast: ‘It is time to recalibrate the regulations … banks need to be competitive and need to be able to lend.’(2)
Even though reporting timeframes have been squeezed from months to weeks to ensure a more timely view of financial risks, we do not believe that this is adequate to prevent another failure of a financial institution that is systemic to economic stability – precisely the end goal of the supervisors who beefed up the rulebook.
Could we be looking down the barrel of real-time reporting? This could mean tipping the balance too far the other way, with implications for the quality of information delivered to regulators as well as the risk of keeping to the form whilst losing the substance of reported data.
One thing is clear. Prescriptive templates no longer work in a fast changing digital world. Urgent debate is needed on how the world’s financial services industry could be better and less onerously supervised via a smarter approach to regulatory reporting.
IN 30 SECONDS
• Regulation is being overhauled to restore confidence in the financial system and avoid another bank bailout
• Greater harmonisation and standardisation of data are seen as key elements to more effectively assess systemic risk
• Regulators and industry must embrace new technology to create a different way of sharing information
• Supervisors and industry must rethink the reporting model in line with increasing compliance demands
• But the current template-driven regulatory reporting model means a mountain of paperwork for banks and insurers, adversely affecting day-to-day business and increasing inefficiency
Did Lehman collapse in a data void?
Since many of the world’s biggest economies were brought to the brink by the shocking collapse of Lehman Brothers in 2008(3), the focus has been on addressing the cracks in the global financial architecture.
Some of the deepest fissures were caused by gaps in data. Not being able to identify the scale of exposure to Lehman Brothers or its affiliated network, traders panicked and pulled out of positions that may well have been sound. Lehman Brothers wasn’t an isolated case.
All of a sudden, not being able to identify counterparty risk turned a bad situation into a catastrophic one. The crisis exposed the need for high-quality, comparable and timely data on the global financial network. Since then, policymakers, regulatory agencies and standard-setters in Europe have been collaborating to further harmonise and standardise supervisory reporting for banks and insurance companies.
A milestone in this journey was reached in November 2014(4), when the European Central Bank (ECB) took over the supervision of about 130 of the eurozone’s largest financial institutions under the Single Supervisory Mechanism (SSM).
In regulatory circles, this event has been hailed as a major breakthrough. Mario Draghi, ECB president, described the SSM as an ‘absolute necessity’ at the seventh ECB Statistics Conference in October 2014(5).
Outdated methodologies are not equipped to cope
The SSM is a step in the right direction, with common rules helping to monitor risk more effectively. The stress tests carried out in 2014 on 123 major European banks by the European Banking Authority (EBA)(6), the EU bank watchdog, were lauded as an example of what can be achieved using these common methodologies.
The exercise was designed to provide supervisors, market participants and institutions with consistent data to compare and contrast EU banks’ resilience under adverse market conditions. Stress tests are likely to become a fixture in the already demanding reporting landscape – with additional compliance work for banks under the watch of the EBA.
European insurers are also battling their own regulatory reporting demons with the EU’s pensions and insurance regulator, the European Insurance and Occupational Pensions Authority (EIOPA), conducting its own stress tests in 2014(7).
The EIOPA exercise took six months to complete; the regulator issued nine sets of questions, each with its own reporting template. With the new regulatory framework, Solvency II, set to be introduced in January 2016, even more scrutiny is on its way.
EIOPA is also preparing for the implementation of Pillar 3 of the Solvency II regime(8), where regulatory reporting is even more complex than the submissions required by the EBA. Asking for large numbers of so-called quantitative reporting templates (QRTs) will set a new benchmark in the volume of data that can be collected using outdated methodologies and technologies.
It is right that supervisors should be able to demand data more frequently and with more granularity from organisations that could represent systemic risk, but the current framework of reporting is becoming increasingly costly and time-consuming for data providers.
Austrian reporting model turns the tables on historic approach
So we come to the radical solution being adopted in Austria, where the regulator and the regulated joined forces to turn the tables on the template-driven model and use new technologies to create a new regulatory value chain(9). The initiative is based on greater harmonisation and integration of data within banks as well as greater integration of the IT systems of the supervisory authority and the supervised entities.
The way it works is through a buffer company, called Austrian Reporting Services GmbH (AuRep), which is co-owned by seven of the largest Austrian banking groups, representing 87% of the market(10). This allows cost-sharing of compliance, as well as standardisation of data collection.
AuRep runs on BearingPoint ABACUS, a common software platform, which works as the central interface between the banks and the OeNB(11). Granular bank data sets are captured automatically for supervisors to interrogate in whichever way they want, whilst the banks retain control over their commercially sensitive data, maintaining only the so-called ‘passive data interface’ on the AuRep platform.
The platform solves the problems with the status quo, which Dr Johannes Turner, Director of the Directorate General of Statistics at the OeNB, outlined at the Euro Finance Week conference in November 2014(12). ‘You take a simple, plain vanilla loan and you have to report it five times,’ he said. ‘Different departments within the bank will be required to provide the same data to the regulator at different times.’
Austria’s new framework has the potential to succeed in clearing the information bottleneck. It represents a paradigm shift in bank supervision and statistical data remittance, finally putting an end to the delays associated with requests and formatting, and allowing greater reconciliation between numbers collected for various purposes.
THE AUSTRIAN MODEL
• The Austrian model is a data-input approach: each regulated entity prepares its data in a standard format in a series of basic ‘datacubes’ prescribed by the national central bank, the Oesterreichische Nationalbank (OeNB), and defined by business type, such as mortgages or business loans.
• The granular data stored in the cubes is the data the supervisory authority prescribes as input, and it is kept separate from the commercially sensitive operations of the bank.
• Direct interrogation of the basic cubes by the supervisor would raise concerns over confidentiality, data protection and the closeness of the supervisor to the day-to-day operations of banks and other regulated entities.
• The Austrian reporting company (AuRep) is a buffer between the supervisor and the banks. The basic cubes are uploaded to AuRep and transformed into a series of smart cubes formatted to the OeNB’s remittance requirements.
• These smart cubes form a single interface with the supervisor. The data AuRep receives is in a standard format, so a change in required data needs only a single coordinated update for all members.
• Ad hoc data requests do not require the completion of multiple templates: the data is gathered from the uploaded basic datacubes and assembled into a smart cube by AuRep, which forms the supervisor’s dataset.
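The basic-cube-to-smart-cube step described above can be sketched as a simple aggregation over granular records. All field names, loan categories and the aggregation rule below are invented for illustration; the OeNB’s actual cube definitions are far richer than this.

```python
from collections import defaultdict

# A 'basic cube': granular, loan-level records in a standard format,
# prepared once by the reporting bank. (Illustrative records only.)
basic_cube = [
    {"loan_id": "L1", "type": "mortgage", "country": "AT", "exposure": 250_000},
    {"loan_id": "L2", "type": "mortgage", "country": "DE", "exposure": 180_000},
    {"loan_id": "L3", "type": "business", "country": "AT", "exposure": 1_200_000},
]

def to_smart_cube(rows, dimensions):
    """Aggregate granular rows along the dimensions the supervisor asks
    for, producing a 'smart cube' keyed by those dimensions."""
    cube = defaultdict(int)
    for row in rows:
        key = tuple(row[d] for d in dimensions)
        cube[key] += row["exposure"]
    return dict(cube)

# One remittance requirement: total exposure by loan type and country.
smart_cube = to_smart_cube(basic_cube, ["type", "country"])
print(smart_cube)
# {('mortgage', 'AT'): 250000, ('mortgage', 'DE'): 180000, ('business', 'AT'): 1200000}
```

An ad hoc request then needs only a different `dimensions` list – for example `to_smart_cube(basic_cube, ["country"])` – rather than a new template distributed to, and filled in by, every bank.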
Mario Draghi, ECB, said in a speech in 2014: ‘It is one thing to have information, which, like blood, flows through the veins of the system. It is another to ensure that everything beats at the same rhythm and all organs in the body get all they need from the same single flow.’(13)
With Italy having been a poster child for this ‘input approach’ for 30 years, and Austria now adding its own innovative solution, supervisory authority officials are getting enthused. With the ECB tasked with handling both the statistical and the supervisory data of the eurozone banking sector, there is a strong need for consistency, innovation and a smarter approach to regulatory reporting.
In an exclusive interview with the BearingPoint Institute(14), Dr. Johannes Turner said that the Austrian model ‘ensures consistent, and highly qualitative data’, whilst ‘reducing the amount of checking we have to do… The big win for the banks is that they are not burdened with the problem of completing templates on many different topics’.
In a sign that Mario Draghi is aware of the limitations of the template approach, he introduced the seventh ECB Statistics Conference in 2014 saying: ‘Data integration on the side of the ECB and the other authorities only comes at the end of a data-production process, the first input of which is in the internal systems of the banks.’(15)
THE SINGLE SUPERVISORY MECHANISM (SSM)
The SSM is the first – and most important – of the two ‘pillars’ of reform aimed at making huge government bailouts a thing of the past, at least among banks in the eurozone. Its main aims(18) are to:
• ensure the safety and soundness of the European banking system
• increase financial integration and stability
• ensure consistent supervision
It has been described as a Herculean task, requiring about 1,000 new staff at the ECB(19). The SSM, along with the Single Resolution Mechanism (SRM), will manage the processes around rescuing a troubled bank or winding up a failed one should the need arise. The SRM is expected to be introduced in 2016.
With the arrival of the SSM, the ECB combines supervisory and statistical data collection under one roof. This in turn could herald the introduction of a European input approach to regulatory reporting based on the prominent examples operated in Austria and Italy.
The ECB has already kickstarted the discussion with the so-called European Reporting Framework (ERF). The ERF consists of a Banking Data Dictionary (BDD), similar to the Austrian basic cube, and a Statistical Data Dictionary (SDD), harmonising reporting requirements from various domains on the output side.
Enthusiasm for change is tempered with caution
Amongst regulators there seems to be a general acknowledgement that better insight is needed, although there is not yet universal agreement about how this information should be gathered. Also speaking exclusively with the BearingPoint Institute(16), Patrick Hoedjes, Head of Oversight and Operations at EIOPA, agreed that transparency was ‘far from where it should be’.
He said: ‘We can see from the financial crisis how much impact the financial services sector has if it doesn’t perform well. It cuts deep into society, so we need to raise our game. We still don’t know where we would be if another Lehman Brothers happened. That has to be a key objective for 2020, and better data will help towards that.’(17)
For many regulators, the data input approach offers a way to increase consistency and quality of data as well as transparency, which is very much on their post-crisis agenda. Some regulators go even further.
In her concluding speech at the ECB Conference on Statistics in 2014, Danièle Nouy, Chair of the Supervisory Board of the SSM, said: ‘Integration, harmonisation and standardisation are necessary conditions, although not sufficient, for achieving a fully satisfactory degree of transparency for the banking system. We also need to properly disseminate and communicate the data. In that sense, creating a common repository (‘European Hub’) for publicly available data could be a relatively simple task with a very important and positive impact.’(20)
Ms Nouy also addressed the central preoccupation of regulators, policymakers and society: to help prevent future financial crises – or at least make them less likely. She highlighted the benefits that data input could bring: ‘I cannot promise that the ECB can once and for all eliminate the risk of another financial crisis. But the ECB is equipped to minimise this risk, and statistics play a crucial role here. Remember that the inability to correctly measure and analyse the risks associated [with] banking activity was one of the reasons [for] the current financial crisis. Developing and communicating accurate and timely statistics is essential for avoiding the repetition of this failure in the future.’(21)
However, for this model to work, buy-in must go beyond the central bankers. Wide cooperation would be needed from the market.
Incentives, including liberation from a labour- and time-intensive process of repeatedly reformatting data points, seem clear. However, discussions with industry bodies in the banking and insurance sector, and their comments at the Euro Finance Week conference in 2014, suggest that, whilst momentum for change is gathering, the mood is still cautious.
Speaking at the conference, Adam Farkas, Executive Director, European Banking Authority (EBA), said the Austrian model was producing ‘nice’ data, but cautioned that there was still more work to be done before regulators embraced this approach with confidence. ‘The compromise is there and the incentive is there but there is no detailed, instructive prescription to an individual bank as to how it should report.’
He added that the large-scale move to digital to produce granular data had to be driven by the banks. ‘Market players do not like regulators telling them what IT solutions to use.’
Also at the Euro Finance Week conference, Robert Priester, Deputy Chief Executive of the EBF, said that European banks are very interested in tackling the problems of an out-of-date and cumbersome reporting methodology that ‘is not working in the current state of IT systems’(22).
He suggested that this made the Austrian model worth exploring: ‘Within the EBF it has produced a very prominent echo,’ he said, but remained vague about his support. ‘We all agree on data integration,’ he added. ‘The question is how to do that.’
European banks and insurers can be proactive in initiating change:
• Define clear boundaries between data for internal management and data for supervisory reporting, exploiting synergies between banks’ internal risk management and supervision with respect to the required data
• Initiate national market discussions about creating an Austrian-style reporting company to collect the data and interface with the regulator
• Consider common, shared reporting software so supervisory changes are made once and automatically implemented among all participating entities. BearingPoint’s ABACUS/GMP platform provides one such solution
• Through representative bodies, explore the data protection and competition issues surrounding greater data sharing and establish clear policies
• Engage all key stakeholders, including regulators and governments, in discussions about greater data transparency and the contribution it can make to financial stability
Regulators must not delay in the creation of new fit-for-purpose reporting models:
• Establish common data standards as a precursor to moving from the template-driven data output model to a granular data input approach
• Initiate national market discussions about creating an Austrian-style reporting company to collect the data and interface with the local market
Where next for regulatory reporting?
It is clear the tectonic plates that have been shifting under the regulatory reporting landscape in Europe have not yet settled.
Only a few years ago, banks and insurance companies were obliged to report once a year, using paper forms with a six-month remittance period. In just a short time, the changes have been enormous. As demands for disaggregated and more complex data have risen, templates have increased from a handful to hundreds at a time, and fields for entry have grown, too.
As policymakers and regulators seek a more timely systemic risk assessment through reporting harmonisation and standardisation, they turn up the dial on reporting frequency. Time-to-report has been shortened from months to weeks and data requirements are increasingly granular and comparable.
Does this mean we are facing the prospect of real-time reporting? We don’t think so – at least not in the medium term. Against the imperative to build an up-to-date and accurate regulatory picture to assess systemic risks, a move closer to real-time reporting would require a complete overhaul of the regulatory framework. Finance and risk departments predominantly work on quarterly or month-end computations, and a move to real-time reporting would mean shifting them to weekly or day-end activities. Such an approach would not be economically viable, and it is also questionable whether regulators are equipped to deal with the consequences, which would see them forced to deal with mountains of raw data rather than the qualitative information banks and insurance companies provide today. The question remains as to the added value such an approach would bring.
Andreas Ittner, Vice Governor at the OeNB, told the audience at the ECB Statistics Conference in 2014 that the big challenge for supervisors in collecting supervisory statistics is how it fits together with real-time market data and early quarterly reporting by banking groups. He said: ‘Relevant supervisory reporting … has to come up with a means to stagger deadlines in order not to become irrelevant at the shorter end, whilst allowing for completeness and consistency at the longer.’(23)
To borrow an expression from Patrick Hoedjes, real-time reporting would not cater for completeness at the longer end because it would be the equivalent of a SWAT team raiding a dark room without knowing what they are looking for. Without predefining the substance and the goals of the data for which they are searching, the exercise loses value and the risk is simply that the burden slides from industry to the regulator, which doesn’t ultimately solve the problem.
On the other end of the spectrum, the document-orientated approach does not satisfy the requirements for relevance at the shorter end, and will hamper the drive for more up-to-date regulatory feeds. Old habits die hard; considerable investments have been poured into the current model over the past few years. Like running a second-hand car, there is a point in time when maintenance costs overtake residual values and the first serious fault can be a signal for buying a brand new vehicle.
The current regulatory reporting approach is making it harder to respond effectively with the tight data quality and frequency required to meet the goal of more stringent supervision: to prevent another global financial crisis. Going forward, regulators and industry must agree on a sensible ‘demarcation line’ in the supervisory and statistical data exchange, to reduce the reporting burden for industry whilst improving the transparency of the data in question.
Italy has been doing this successfully for decades under the radar, and now Austria is following suit, although it is early days. With the ECB now looking into the data input approach to manage the mammoth task of supervising the eurozone’s most important banks, it could be that the regulatory value chain in all member countries is ready to explore new and easier terrain.
• Tectonic plates are shifting in the regulatory reporting landscape. Regulators are moving closer toward regional standardisation and harmonisation of practices, to understand complex economic relations and developments in a more timely manner
• Supervisors are also demanding increased reporting frequencies and shorter times to report, as well as more granular information than in the past. This applies to insurance companies as well as banks
• Despite these huge changes and impending changes in compliance, little has changed in the methodology of regulatory reporting, which still relies on the document-oriented approach. This is intrinsically time-consuming, costly and complex
• There is wide recognition that this model is unsustainable, but old habits are difficult to break
• Austria is one country pioneering a new approach, where regulatory reporting will be based on granular data input
• In Austria, bank data will be automatically uploaded into datacubes housed by a company that acts as a buffer between the banks and the supervisor. In this way, the supervisor will immediately be able to stress-test without imposing any compliance demands on the industry
• The trend for more up-to-date regulatory data feeds will continue, but it is unlikely – and, under the current framework, neither necessary nor possible – that we will ever see real-time reporting
• The ideal is for statistical and supervisory data to be relevant at the short end but, just as important, complete at the long end
• The key to a successful regulatory supervision model of the future will be supervisors predefining the substance and the goals of the data in the first instance
1) ‘EU banks shrink assets by $1.1 trillion as capital ratios rise’, Bloomberg, New York, NY, USA, web, Ben Moshinsky, 17/12/03, http://bloom.bg/1AOcuye
2) ‘Welcome and opening keynote: Risk Management Konferenz II’, Euro Finance Week, web, conference, Christian Clausen, 18/11/14, programme, PDF, http://bit.
3) ‘6 years later, 7 lessons from Lehman’s collapse’, Time, New York, NY, USA, web, Ian Salisbury and Paul J. Lim, 15/09/14, http://ti.me/1L7llfJ
4) ‘Towards a banking union: the state of play from the ECB’s perspective’, European Central Bank, Frankfurt am Main, Germany, web, speech transcript, Jörg Asmussen at Banks in Transition: Handelsblatt Conference, Frankfurt am Main, Germany, 04/09/13, http://bit.ly/1L7ns39
5) ‘Towards the banking union: opportunities and challenges for statistics’, European Central Bank, Frankfurt am Main, Germany, web, speech transcript, Mario Draghi at 7th ECB Statistics Conference, Frankfurt am Main, Germany, 15/10/14, http://bit.ly/1A39e1v
6) ‘EBA publishes 2014 EU-wide stress test results’, European Banking Authority, London, UK, web, press release, Franca Rosa Congiu, 28/10/14, http://bit.ly/1ySB1N9
7) ‘EIOPA announces results of the EU-wide insurance stress test 2014’, EIOPA, Frankfurt am Main, Germany, web, press release, Anzhelika Mayer, 30/11/14, http://
8) ‘Solvency II Pillar 3: Unravelling the technology’, Actuarial Post, Tonbridge, Kent, UK, web, Marc Fakkel and Graham Robertson, 20/03/14, http://bit.ly/17eJuDW
9) Details from BearingPoint contributors; Austrian Reporting Services GmbH (AuRep) website under construction: http://www.aurep.at/
10) ‘ABACUS Solvency II’, BearingPoint, London, UK, web, landing page, Andreas Rindler and Matthias Hohne, http://bit.ly/17O6Jpf
11) ‘Digitalisation, standardisation and harmonisation of regulatory reporting in Europe: Risk Management Konferenz II’, Euro Finance Week, web, conference discussion, Dr Johannes Turner, 18/11/14, programme, PDF, http://bit.ly/1zlsGmn
12) ‘Towards the banking union: opportunities and challenges for statistics’, European Central Bank, Frankfurt am Main, Germany, web, speech transcript, Mario Draghi at 7th ECB Statistics Conference, Frankfurt am Main, Germany, 15/10/14, http://bit.ly/1A39e1v
13) ‘Directorate General for Financial Supervision and Regulation’, Banca d’Italia, Rome, Italy, web, landing page, http://bit.ly/1JnSyYL
14) ‘Towards the banking union: opportunities and challenges for statistics’, European Central Bank, Frankfurt am Main, Germany, web, speech transcript, Mario Draghi at 7th ECB Statistics Conference, Frankfurt am Main, Germany, 15/10/14, http://bit.ly/1A39e1v
15) ‘Towards a banking union: the state of play from the ECB’s perspective’, European Central Bank, Frankfurt am Main, Germany, web, speech transcript, Jörg Asmussen at Banks in Transition: Handel