A POV on Bank Stress Testing – CCAR & DFAST..

“The recession of 2007 to 2009 was still the most painful since the Depression. At its depths, $15 trillion in household wealth had disappeared, ravaging the pensions and college funds of Americans who had thought their money was in good hands. Nearly 9 million workers lost jobs; 9 million people slipped below the poverty line; 5 million homeowners lost homes.”
― Timothy F. Geithner, Former Secretary of the US Treasury – “Reflections on Financial Crises” – 2014

A Quick Introduction to Macroeconomic Stress Testing..

The concept of stress testing in banking is not entirely new. It has been practiced for years in global banks across specific business functions that deal with risk. The goal of these internal tests has been to assess firm-wide capital adequacy in periods of economic stress. However, the 2008 financial crisis clearly exposed how unprepared Bank Holding Companies (BHCs) were for the systemic risk brought on by severe macroeconomic distress. The current raft of regulator-driven stress tests is thus motivated by the taxpayer-funded bailouts of 2008. Back then, banks were neither adequately capitalized to cope with stressed economic conditions, nor were their market and credit risk losses across portfolios sustainable.

In 2009, SCAP (the Supervisory Capital Assessment Program) was conducted in the US as a stress testing exercise that only the 19 largest Bank Holding Companies had to undergo. The exercise focused not only on the quantity of capital available but also on its quality – the Tier 1 common capital held by the institution. The emphasis on Tier 1 common capital is important, as it gives an institution a higher loss-absorption capacity while minimizing losses to higher capital tiers. Tier 1 common capital can also be managed better during economic stress by adjusting dividends, share buybacks and related activities.

Though it was a one-off exercise, the SCAP was a stringent and rigorous test. The Fed reviewed the results of all 19 BHCs – some of which fell short and were required to raise additional capital.

Following this, the Dodd-Frank Act was signed into law in 2010. The Dodd-Frank Act also introduced its own stress test – DFAST (Dodd-Frank Act Stress Testing). DFAST requires BHCs with assets of $10 billion and above to run annual stress tests and to make the results public. The goals of these stress tests are multifold, but they are conducted primarily to assure the public and the regulators that BHCs have adequately capitalized their portfolios. BHCs are also required to present detailed capital plans to the Fed.

The SCAP’s successor, CCAR (the Comprehensive Capital Analysis and Review), was also introduced around that time. Depending on the overall risk profile of the institution, CCAR mandates several qualitative and quantitative metrics that BHCs need to report on and make public for a range of stressed macroeconomic scenarios.


Comprehensive Capital Analysis and Review (CCAR) is a regulatory framework introduced by the Federal Reserve in order to assess, regulate, and supervise large banks and financial institutions – collectively referred to in the framework as Bank Holding Companies (BHCs).
– (Wikipedia)

  • Every year, an increasing number of Tier 2 banks come under the CCAR mandate. CCAR requires specific BHCs to develop a set of internal macroeconomic scenarios or use those developed by the regulators. Regulators then receive the individual results of these scenario runs from firms across a nine-quarter time horizon. Regulators also develop their own systemic stress tests to verify whether a given BHC can withstand negative economic scenarios and continue its lending operations. CCAR coverage primarily includes retail banking operations, auto & home lending, trading, counterparty credit risk, AFS (Available For Sale)/HTM (Hold To Maturity) securities etc. CCAR covers all the major kinds of risk – market, credit, liquidity and operational risk.
CCAR kicked off global moves by regulators to enforce similar exercises on banks in their respective jurisdictions. The same evolution of firm-wide stress testing has been followed by other regulators around the world – for example, the EBA stress tests in the European Union and the Prudential Regulation Authority’s stress tests in the UK. Emerging markets such as India and China are also following this trend.

Similarities & Differences between CCAR and DFAST..

To restate – CCAR is an annual exercise by the Federal Reserve to assess whether the largest bank holding companies operating in the United States have sufficient capital to continue operations throughout times of economic and financial stress, and that they have robust, forward-looking capital-planning processes that account for their unique risks. As part of this exercise, the Federal Reserve evaluates institutions’ capital adequacy, internal capital adequacy assessment processes, and their individual plans to make capital distributions, such as dividend payments or stock repurchases. Dodd-Frank Act stress testing (DFAST) – an exercise similar to CCAR – is a forward-looking stress test supervised by the Federal Reserve that also covers smaller institutions not subject to CCAR, and helps assess whether institutions have sufficient capital to absorb losses and support operations during adverse economic conditions.

As part of the CCAR reporting guidelines, BHCs have to explicitly call out

  1. their sources of capital given their risk profile & breadth of operations,
  2. the internal policies & controls for measuring capital adequacy &
  3. any upcoming business decisions (share buybacks, dividends etc) that may impact their capital adequacy plans.

While CCAR and DFAST look very similar at a high level – they both mandate that banks conduct stress tests – they do differ in the details. Banks with assets between $10 billion and $50 billion are subject only to DFAST, while the larger BHCs are subject to both. During the planning horizon phase, CCAR allows the BHCs to use their own capital action assumptions, while DFAST enforces a standardized set of capital actions. The DFAST scenarios represent baseline, adverse and severely adverse conditions. DFAST is supervised by the Fed, the OCC (Office of the Comptroller of the Currency) and the FDIC.

Summary of DFAST and CCAR (Source: E&Y)

As can be seen from the above table, while DFAST is complementary to CCAR, both efforts are distinct testing exercises that rely on similar processes, data, supervisory exercises, and requirements. The Federal Reserve coordinates these processes to reduce duplicative requirements and to minimize regulatory burden. CCAR submissions are made annually, while the largest BHCs also run an additional mid-cycle company-run DFAST test; both sets of reports are required to incorporate Basel III capital ratios, with Tier 1 capital ratios calculated using existing rules.

In a Nutshell…

In CCAR (and DFAST), the Fed is essentially asking the BHCs the following questions –

(1) For your defined risk profile, define a process for understanding the exercise and mapping the key stakeholders who will carry it out.

(2) Please ensure that you use clean internal data to compute your exposures in the event of economic stress. The entire process of data sourcing, cleaning, computation, analytics & reporting needs to be auditable.

(3) What macroeconomic stress scenarios did you develop in working with key lines of business? What are the key historical assumptions in developing these? What are the key what-if scenarios that you have developed based on the stressed scenarios? The scenarios need to be auditable as well.

(4) We are then going to run our own macroeconomic numbers and scenarios, using our own exposure generators, on your raw data.

(5) We want to see how close both sets of numbers are.

Both CCAR and DFAST scenarios are expressed in stressed macroeconomic factors and financial indicators. The regulators typically provide these figures on a quarterly basis a few reporting periods in advance.

What are some examples of these scenarios?
  • Measures of Index Turbulence – e.g., in a certain quarter, the scenario might assume that the S&P 500 falls 30%, along with declines in key indices such as home prices, commercial property and other asset prices.
  • Measures of Economic Activity – e.g., a spike in the US unemployment rate, higher interest rates, increased inflation. What if unemployment rose to 14%? What would that do to a mortgage portfolio, where default rates would be expected to climb?
  • Measures of Interest Rate Turbulence – e.g., US Treasury yields, interest rates on US mortgages etc. (A minimal sketch of how such scenario paths might be represented appears after this list.)
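To make the scenario idea concrete, below is a minimal sketch (in Python, with entirely made-up numbers) of how a supervisory-style scenario could be represented as quarterly macroeconomic variable paths over the nine-quarter horizon. The variable names and values are illustrative assumptions, not actual Fed scenario data.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class MacroScenario:
    """A stressed macroeconomic scenario expressed as quarterly variable paths."""
    name: str
    # Each variable maps to nine quarterly values (the CCAR/DFAST planning horizon).
    paths: Dict[str, List[float]] = field(default_factory=dict)

# Hypothetical 'severely adverse'-style scenario - all values are made up for illustration.
severely_adverse = MacroScenario(
    name="severely_adverse",
    paths={
        "unemployment_rate_pct":        [5.0, 6.5, 8.0, 9.5, 10.5, 11.5, 12.5, 13.5, 14.0],
        "sp500_change_pct":             [-10, -15, -30, -25, -20, -15, -10, -5, 0],
        "ten_year_treasury_yield_pct":  [2.1, 1.8, 1.4, 1.0, 0.8, 0.8, 0.9, 1.0, 1.2],
        "house_price_index_change_pct": [-2, -5, -8, -10, -9, -7, -5, -3, -1],
    },
)

print(severely_adverse.paths["unemployment_rate_pct"][2])  # unemployment assumption in quarter 3
```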

Based on this information, banks then assess the impact of these economic scenarios, as reflected in market and credit losses to their portfolios. This helps them estimate how their capital base would behave under those conditions. These internal CCAR metrics are then sent over to the regulators. Every bank has its own models based on its own understanding of the business, and the Fed needs to review these as well for completeness and quality.
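As a very rough illustration of this internal assessment step, the sketch below applies a hypothetical unemployment-driven default-rate sensitivity to a toy mortgage book and recomputes a simple post-stress capital ratio. The sensitivity factor, loss severity and portfolio figures are assumptions invented for this example – real CCAR models are far richer.

```python
# Toy stress calculation: credit losses on a mortgage book under an unemployment shock.
# All figures and the linear PD sensitivity are illustrative assumptions.

baseline_unemployment_pct = 5.0
stressed_unemployment_pct = 14.0

portfolio_balance = 4_000_000_000      # $4bn mortgage book (hypothetical)
baseline_default_rate = 0.02           # 2% default rate at baseline (assumed)
pd_sensitivity_per_point = 0.004       # +0.4% default rate per point of unemployment (assumed)
loss_given_default = 0.40              # 40% loss severity (assumed)

tier1_common_capital = 500_000_000     # hypothetical capital base
risk_weighted_assets = 3_800_000_000   # hypothetical RWA

shock = stressed_unemployment_pct - baseline_unemployment_pct
stressed_default_rate = baseline_default_rate + pd_sensitivity_per_point * shock

stressed_loss = portfolio_balance * stressed_default_rate * loss_given_default
post_stress_ratio = (tier1_common_capital - stressed_loss) / risk_weighted_assets

print(f"Stressed default rate:           {stressed_default_rate:.1%}")
print(f"Stressed credit loss:            ${stressed_loss:,.0f}")
print(f"Post-stress Tier 1 common ratio: {post_stress_ratio:.2%}")
```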

The Fed uses the CCAR and DFAST results to evaluate capital adequacy and the quality of the capital adequacy assessment process, and then evaluates the BHCs’ plans to make capital distributions – dividends, share repurchases etc. – in the context of those results. The BHCs’ boards of directors are required to approve and sign off on these plans.

What do CCAR & DFAST entail for Banks?

Well, six important things –

    1. CCAR is fundamentally different from other umbrella risk types in that it has a strong external component in terms of reporting on internal bank data to the regulatory authorities. CCAR reporting is done by sending internal bank Book of Record Transaction (BORT) data from lending systems (with hundreds of manual adjustments) to the regulators, who run their own models on it to assess capital adequacy. Currently, most banks do some model reporting internally, based on canned CCAR algorithms in tools like SAS/Spark and computed for a few macroeconomic stress scenarios.
    2. Both CCAR and DFAST stress the same business processes, data resources and governance mechanisms. They are both a significant ask on the BHCs from the standpoint of planning, execution and governance. BHCs have found them daunting, and the new D-SIBs that enter the mandate each year are faced with implementing programs that require significant organizational and IT spend.
    3. Both CCAR and DFAST challenge the banks on data collection, quality, lineage and reporting. The Fed requires that data be accurate, comprehensive and clean; data quality is the single biggest challenge to stress test compliance. Banks need to work across a range of BORT (Book of Record Transaction) systems – core banking, lending portfolios, position data and any other data needed to accurately reflect the business. There is also a reconciliation process that is typically used to reconcile risk data with the GL (General Ledger). For instance, a BHC’s lending portfolio might show $4 billion based on the raw summary data but only around $3 billion once reconciliation adjustments are applied. If the regulator runs the aforesaid macroeconomic scenarios on the unreconciled $4 billion, the exposures are naturally off. (A toy reconciliation sketch follows this list.)
    4. Contrary to popular perception, the heavy lifting is typically not in creating and running the exposure calculations for stress testing. The creation of these is relatively straightforward. Banks historically have had their own analytics groups produce these macroeconomic models, and they already have dozens of libraries in place that can be modified to create the supervisory scenarios for CCAR/DFAST – baseline, adverse and severely adverse. The critical difference with stress testing is that siloed models and scenarios need to be unified along with the data.
    5. Model development in banks usually follows a well-defined lifecycle. Most liquidity assessment groups within banks currently have a good base of quants with a clean separation of job duties: for instance, while one group produces scenarios, others work on the exposures that feed into the liquidity engines. The teams running these liquidity assessments are good candidates to run the CCAR/DFAST models as well. The calculators themselves will need to be rewritten for Big Data using something like SAS/Spark.
    6. Transparency must be demonstrated down to the source data level. And banks need to be able to document all capital classification and computation rules to a sufficient degree to meet regulatory requirements during the auditing and review process.
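To illustrate the reconciliation point in item 3, the sketch below compares a raw lending-portfolio summary against general-ledger balances and derives the adjustments that must be applied before any stress numbers are computed. The account names, amounts and adjustment sizes are all hypothetical.

```python
import pandas as pd

# Hypothetical raw summary from a lending BORT system (balances in $).
raw_positions = pd.DataFrame({
    "account": ["mortgage", "auto", "credit_card", "commercial"],
    "balance": [2_000_000_000, 600_000_000, 400_000_000, 1_000_000_000],
})

# Hypothetical general-ledger balances for the same books.
gl_balances = pd.DataFrame({
    "account": ["mortgage", "auto", "credit_card", "commercial"],
    "gl_balance": [1_600_000_000, 550_000_000, 350_000_000, 500_000_000],
})

recon = raw_positions.merge(gl_balances, on="account")
recon["adjustment"] = recon["gl_balance"] - recon["balance"]

print(recon)
print(f"Raw total:        ${recon['balance'].sum():,.0f}")     # ~$4bn before adjustments
print(f"Reconciled total: ${recon['gl_balance'].sum():,.0f}")  # ~$3bn after adjustments
```

Running the scenarios against the unreconciled total would overstate exposures by roughly a billion dollars in this toy example, which is exactly the problem the reconciliation step exists to prevent.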

The Technology Implications of CCAR/DFAST..

It can clearly be seen that regulatory stress testing derives inputs from virtually every banking function. It should then come as no surprise that, from a technology point of view, there are several implications:

    • CCAR and DFAST impact a range of systems, processes and controls. The challenges that most banks have in integrating front-office trading desk data (position data, pricing data and reporting) with back-office risk & finance systems make the job of accurately reporting stress numbers all the more difficult. This forces most BHCs to resort to manual data operations, manual analytics and complicated reconciliation processes across the front, middle and back offices.
    • Beyond standardizing computation & reporting libraries, banks need common data storage for data sourced from a range of BORT systems.
    • Banks also need to standardize on data taxonomies across all of these systems.
    • To that end, banks need to stop creating more data silos across the Risk and Finance functions; as I have often advocated in this blog, a move to a Data Lake enabled architecture is an appropriate way of eliminating silos and the problem of unclean data, which is sure to invite regulatory sanction.
    • Banks need to focus on data cleanliness by setting appropriate governance and auditability policies
    • Move to a paradigm of bringing compute to large datasets instead of the other way around
    • Move towards in-memory analytics to transform, aggregate and analyze data in real time across many dimensions, to obtain an understanding of the bank’s risk profile at any given point in time (a small aggregation sketch follows this list)
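The sketch below shows the kind of multi-dimensional, in-memory aggregation the last bullet refers to, using pandas as a stand-in for whatever in-memory or Spark-based engine a bank actually deploys. The columns and figures are illustrative assumptions.

```python
import pandas as pd

# Hypothetical firm-wide exposure records pulled from a common data store.
exposures = pd.DataFrame({
    "line_of_business": ["retail", "retail", "capital_markets", "capital_markets", "commercial"],
    "risk_type":        ["credit", "credit", "market",          "counterparty",    "credit"],
    "region":           ["US",     "US",     "EMEA",            "US",              "US"],
    "exposure_usd":     [1.2e9,    0.8e9,    0.5e9,             0.3e9,             0.9e9],
})

# Aggregate exposures across several dimensions at once to get a firm-wide risk profile.
profile = (
    exposures
    .groupby(["risk_type", "line_of_business", "region"], as_index=False)["exposure_usd"]
    .sum()
    .sort_values("exposure_usd", ascending=False)
)
print(profile)
```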

A Reference Architecture for CCAR and DFAST..

 I recommend readers review the below post on FRTB Architecture as it contains core architectural and IT themes that are broadly applicable to CCAR and DFAST as well.

A Reference Architecture for the FRTB (Fundamental Review of the Trading Book)

Conclusion..

As can be seen from the above, both CCAR & DFAST require a holistic approach across the value chain (model development, data sourcing, reporting) and across the Risk, Finance and Treasury functions. Further, regulators are increasingly demanding an automated process across risk & capital calculations under various scenarios, using accurate and consistent data. The need of the hour for BHCs is to move to a common model for data storage, stress modeling and testing. Only by doing this can they ensure that the metrics and outputs of capital adequacy are produced accurately and in a timely manner, thus satisfying the regulatory mandate.

References –

[1] Federal Reserve CCAR Summary Instructions 2016

https://www.federalreserve.gov/newsevents/press/bcreg/bcreg20160128a1.pdf

Why the Insurance Industry Needs to Learn from Banking’s Risk Management Nightmares..


(Image Credit – ENC Consulting)

Why Systemic Financial Crises Are a Broad Failure of Risk Management…

Various posts in this blog have catalogued the practice of  risk management in the financial services industry. To recap briefly, the Great Financial Crisis (GFC) of 2008 was a systemic failure that brought about large scale banking losses across the globe. Considered by many economists to be the worst economic crisis since the Great Depression [1], it not only precipitated the collapse of large financial institutions across the globe but also triggered the onset of sovereign debt crises across Greece, Iceland et al.

Years of deregulation & securitization (a form of risk transfer) combined with expansionary monetary policy during the Greenspan years, in the United States, led to the unprecedented availability of easy consumer credit in lines such as mortgages, credit cards and auto. The loosening of lending standards led to the rise of Subprime Mortgages which were often underwritten using fraudulent practices. Investment Banks were only too happy to create mortgage backed securities (MBS) which were repackaged and sold across the globe to willing institutional investors. Misplaced financial incentives in banking were also a key cause of this mindless financial innovation.

The health of the entire global financial system thus rested on the ability of the US consumer to make regular payments on their debt obligations – especially on their mortgages. However, as artificially inflated housing prices began to decline in 2006 and the rate of refinancing dropped, foreclosures assumed mammoth proportions. Global investors thus began to suffer significant losses. The crisis assumed the form of a severe liquidity crunch, leading to a crisis of confidence among counterparties in the financial system.

Global and national regulatory authorities had to step in to conduct massive bailouts of banks. Even so, stock markets suffered severe losses as housing markets collapsed, causing a large crisis of confidence. Central banks and federal governments responded with massive monetary and fiscal stimulus, thus yet again crossing the line of moral hazard. Risk management practices in 2008 were clearly inadequate at multiple levels, from the department to the firm to the regulators. The point is well made that while the risks individual banks ran were seemingly rational at an individual level, taken as a whole the collective position was irrational and unsustainable. This failure to account for the complex global financial system was reflected across the chain of risk data aggregation, modeling and measurement.

 The Experience Shows That Risk Management Is A Complex Business & Technology Undertaking…

What makes Risk Management a complex job is the nature of Global Banking circa 2016.

Banks today are complex entities engaged in many kinds of activities. The major ones include –

  • Retail Banking – Providing cookie cutter financial services ranging from collecting customer deposits, providing consumer loans, issuing credit cards etc. A POV on Retail Banking at – http://www.vamsitalkstech.com/?p=2323 
  • Commercial Banking –  Banks provide companies with a range of products ranging from business loans, depository services to other financial investments.
  • Capital Markets  – Capital Markets groups provide underwriting services & trading services that engineer custom derivative trades for institutional clients (typically hedge funds, mutual funds, corporations, governments, high-net-worth individuals and trusts) as well as for their own treasury group. They may also do proprietary trading on the bank’s behalf for a profit – although it is this type of trading that the Volcker Rule is seeking to eliminate. A POV on Capital Markets at – http://www.vamsitalkstech.com/?p=2175
  • Wealth Management – Wealth Management groups provide personal investment management, financial advisory, and planning disciplines directly for the benefit of high-net-worth (HNWI) clients. A POV on Wealth Management at – http://www.vamsitalkstech.com/?p=1447

Firstly, banks have huge loan portfolios across all of the above areas (each with varying default rates), such as home mortgages, consumer credit and commercial loans. In the Capital Markets space, a bank’s book of financial assets gets more complex due to the web of counterparties across the globe and the range of complex assets such as derivatives. Complex assets mean complex mathematical models that calculate risk exposures across many kinds of risk. For the most part, these models did not take tail risk and wider systemic risk into account.
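At their simplest, the credit-risk models referred to above reduce to an expected-loss calculation per exposure – EL = PD × LGD × EAD – aggregated over the portfolio. The sketch below shows this with made-up loan segments and parameters; the tail-risk and systemic effects discussed in this section are exactly what such a simple formula misses.

```python
# Minimal expected-loss calculation per loan segment: EL = PD * LGD * EAD.
# Segment names and parameters are hypothetical.
segments = [
    # (name, probability of default, loss given default, exposure at default in $)
    ("prime_mortgage",    0.01, 0.35, 2_000_000_000),
    ("subprime_mortgage", 0.08, 0.45,   500_000_000),
    ("credit_card",       0.05, 0.80,   400_000_000),
    ("commercial_loans",  0.02, 0.50, 1_000_000_000),
]

total_el = 0.0
for name, pd_, lgd, ead in segments:
    el = pd_ * lgd * ead
    total_el += el
    print(f"{name:18s} expected loss: ${el:,.0f}")

print(f"{'portfolio':18s} expected loss: ${total_el:,.0f}")
```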

Secondly, markets tend to turn in unison during periods of (downward) volatility, which ends up endangering the entire system. Finally, complex and poorly understood financial instruments in the derivatives market made it easy for banks to take on highly leveraged positions, which placed their own firms and their counterparties at downside risk. The models were entirely dependent on well-behaved historical data and never modeled “black swan” events. In other words, while the math may have been complex, it never took sophisticated scenario analysis into account.

Regulatory guidelines ranging from Basel III to Dodd-Frank to MiFID II to the FRTB (the new kid on the regulatory block) have been put in place by international and national regulators post 2008. The overarching goal is to prevent a repeat of the GFC, in which taxpayers funded bailouts for firms whose managers profit immensely on the upside.

These regulatory mandates & pressures have begun driving Risk and Compliance expenditures to unprecedented levels. The Basel Committee guidelines on risk data reporting & aggregation (RDA), Dodd-Frank, the Volcker Rule, as well as regulatory capital adequacy legislation such as CCAR, are causing a retooling of existing risk regimens. The Volcker Rule prohibits banks from trading on their own account (proprietary trading) & greatly curtails their investments in hedge funds. The regulatory intent is to avoid banker speculation with retail funds, which are insured by the FDIC. Banks thus have to certify, across their large portfolios of positions, which trades have been entered into for speculative purposes versus hedging purposes.

The impact of the Volcker Rule has been to shrink margins in the Capital Markets space as business moves to a flow-based trading model that relies less on proprietary trading and more on managing trading for clients. At the same time, risk management is becoming more real-time in key areas such as market, credit and liquidity risk.

A POV on FRTB is at the below link.

A POV on the FRTB (Fundamental Review of the Trading Book)…

Interestingly enough, one of the key players in the GFC was AIG – an insurance company with a division, FP (Financial Products), that really operated like a hedge fund by insuring downside risk it never thought it would need to pay out on.

Which Leads Us to the Insurance Industry…

For most of their long existence, insurance companies were relatively boring – they essentially provided protection against adverse events such as loss of property, life & health risks. The consumer of insurance products is a policyholder who makes regular payments, called premiums, to cover themselves. The major lines of insurance business can be classified into life insurance, non-life insurance and health insurance. Non-life insurance is also termed P&C (Property and Casualty) insurance. As insurers collect premiums, they invest these funds in relatively safe assets such as corporate bonds.

Risks In the Insurance Industry & Solvency II…

While the business model in insurance is essentially inverted and more predictable compared to banking, insurers have to grapple with ensuring that enough reserves have been set aside for payouts on policyholder claims. It is very important for them to hold a diversified investment portfolio and to ensure that profitability does not suffer due to defaults on these investments. Firms thus need to make sure their investments are diverse, both from a sector and a geographical exposure standpoint.

Firms therefore need to continuously calculate and monitor their liquidity positions and risks. Further, insurers regularly enter into agreements with banks and reinsurance companies, which also exposes them to counterparty credit risk.
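One simple way an insurer might monitor the diversification discussed above is a concentration measure such as the Herfindahl-Hirschman index over sector or geographic weights. The sketch below is a toy version with made-up holdings; a real investment-risk function would track many more dimensions.

```python
# Toy concentration check for an insurer's investment portfolio.
# Holdings, sectors and regions are hypothetical.
holdings = {
    ("corporate_bonds", "US"):   400_000_000,
    ("corporate_bonds", "EU"):   250_000_000,
    ("sovereign_bonds", "US"):   200_000_000,
    ("equities",        "APAC"): 100_000_000,
    ("real_estate",     "US"):    50_000_000,
}

total = sum(holdings.values())

def herfindahl(weights):
    """Herfindahl-Hirschman index: sum of squared weights; 1.0 means fully concentrated."""
    return sum(w * w for w in weights)

# Aggregate weights by sector and by region, then compute the HHI for each dimension.
by_sector, by_region = {}, {}
for (sector, region), value in holdings.items():
    by_sector[sector] = by_sector.get(sector, 0.0) + value / total
    by_region[region] = by_region.get(region, 0.0) + value / total

print(f"Sector HHI: {herfindahl(by_sector.values()):.3f}")
print(f"Region HHI: {herfindahl(by_region.values()):.3f}")
```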

From a global standpoint, it is interesting that US-based insurance firms are largely regulated at the state level, while insurers in most other countries are regulated at the national level. The point is well made that insurance firms have had a culture of running a range of departmentalized analytics, as compared to the larger-scale analytics that the banks described above need to run.

In the European Union, all 28 member countries (including the United Kingdom) are expected to adhere to Solvency II [2] from 2016. Solvency II replaced the long-standing Solvency I regime.

Whereas Solvency I calculated capital only for underwriting risks, Solvency II is quite similar in spirit to Basel II and imposes guidelines requiring insurers to calculate capital for investment and operational risks as well.

Towards better Risk Management.. The Three Pillars of Solvency II

There are three pillars to Solvency II [2].

  • Pillar 1 sets out quantitative rules and is concerned with the calculation of capital requirements and the types of capital that are eligible.
  • Pillar 2 is concerned with the requirements for the overall insurer supervisory review process &  governance.
  • Pillar 3 focuses on disclosure and transparency requirements.

The three pillars are therefore analogous to the three pillars of Basel II.

Why Bad Data Practices will mean Poor Risk Management & higher Capital Requirements under Solvency II..

While a detailed discussion of Solvency II will follow in a later post, it imposes new  data aggregation, governance and measurement criteria on insurers –

  1. The need to identify, measure and offset risks across the enterprise and often in realtime
  2. Better governance of risks across not just historical data but also fresh data
  3. Running simulations that take in a wider scope of measures as opposed to a narrow spectrum of risks
  4. Timely and accurate Data Reporting

The same issues that hobble banks in the Data Landscape are sadly to be found in insurance as well.

The key challenges with current architectures –

  1. A high degree of data duplication from system to system leads to multiple inconsistencies at the summary as well as transaction levels. Because different groups perform different risk reporting functions (e.g. credit and market risk), the feeds, the ingestion pipelines and the calculators end up being duplicated as well.
  2. Traditional risk algorithms cannot scale with this explosion of data, nor with the heterogeneity inherent in reporting across the multiple kinds of risk needed for Solvency II. For example, certain kinds of credit risk analysis need access to years of historical data to estimate the probability of a counterparty defaulting and to obtain a statistical measure of it (see the sketch after this list). All of these analytics are highly computationally intensive.
  3. Risk model and analytic development needs to be standardized to reflect realities post Solvency II, which implies, from an analytics standpoint, running a large number of scenarios over a large volume of data. Most insurers will need to standardize their analytic libraries across their various LOBs. If insurers do not move to an optimized data architecture, they will incur tens of millions of dollars in additional hardware spend.
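As a small illustration of the historical-data point in item 2, the sketch below estimates an empirical one-year default probability from several years of hypothetical counterparty observations. At the scale of a real book, with millions of counterparty-years and many more attributes, this kind of computation is what drives the hardware and architecture spend described above.

```python
import random

random.seed(7)

# Hypothetical multi-year history: one record per counterparty per year,
# with a flag marking whether the counterparty defaulted that year.
history = [
    {"year": year, "counterparty_id": cid, "defaulted": random.random() < 0.03}
    for year in range(2009, 2016)
    for cid in range(10_000)
]

# Empirical one-year default probability across the full history.
defaults = sum(rec["defaulted"] for rec in history)
print(f"Observations: {len(history):,}, empirical 1-year PD: {defaults / len(history):.2%}")

# The same estimate broken out by year, to check for through-the-cycle variation.
for year in range(2009, 2016):
    obs = [rec for rec in history if rec["year"] == year]
    rate = sum(rec["defaulted"] for rec in obs) / len(obs)
    print(f"{year}: {rate:.2%}")
```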

Summary

We have briefly covered the origins of regulatory risk management in both banking and insurance. Though the respective business models vary across the two verticals, there is a good degree of harmonization in the regulatory progression. The question is whether insurers can learn from the bumpy experiences of their banking counterparts in the areas of risk data aggregation and measurement.

References..

[1] https://en.wikipedia.org/wiki/Financial_crisis_of_2007%E2%80%9308

[2] https://en.wikipedia.org/wiki/Solvency_II_Directive_2009