Risk accounting — a next generation risk management system for financial institutions

Part 1

Allan D. Grody, President, Financial Intergroup Holdings Ltd1

Peter J. Hughes, Managing Director, ARC Best Practices Ltd; Visiting Research Fellow, the York Management School, University of York

Steven Toms, Professor of Accounting and Finance; Head of the York Management School, University of York
Abstract
The financial crisis has awoken financial service organizations to the reality that when financial transactions enter their operating environments they trigger real-time risk exposures that can extend well beyond nominal transaction values, capital charges, and other measures deemed appropriate for preventing unexpected losses. Traditional accounting approaches have caused lagging measures of risk to be recorded in much of management's traditional performance and risk reporting systems. Conventional financial and risk management systems are failing management and their boards because they cannot measure, aggregate, and report risk exposures as they accumulate. In reaction to the current financial crisis the boards of many firms are assigning the additional task of overseeing management's risk policies and guidelines to audit committees. Accountants are also being asked to discuss the enterprise's key risk exposures with management, including those beyond financial reporting related risks. The aim of this paper is to consider whether a more comprehensive and timely measurement framework for risk exposure is now needed and to examine one possible approach. The paper introduces a common unit of exposure measurement for a diverse set of business risks and demonstrates how nominal transaction values and relevant quantitative and qualitative risk metrics can be mapped to each transaction and used to calculate a risk-adjusted transaction value. The combination of conventional risk measures derived from the capital conventions mandated by the Basel Committee on Banking Supervision and this proposed risk exposure measurement framework provides the basis for the system of risk accounting described in this paper.

1 Allan D. Grody was also Emeritus Adjunct Professor, Leonard N. Stern Graduate School, Risk Management Systems; Partner (retired) and Founder, Coopers & Lybrand's (now PwC's) Financial Industry Consulting Practice; and Founding Board Member, Journal of Risk Management in Financial Institutions.



Regulators have always fostered the expectation that capital is what sustains banks in periods of stress and prevents them from failing. However, given the recent failures, bailouts, and nationalizations of some of the world's leading financial institutions, we should perhaps view regulatory capital as the measure by which banks count down to failure rather than the system that proactively prevents it. So what offers a bank the greatest protection against failure if it is not capital? Quite simply, it is the risk culture embedded in its people and processes. At the core of any risk culture are 1) compensation incentives that balance risk and return, and short-term self-interest against long-term stakeholder goals, and 2) early warning systems that highlight growing exposures to risk. The Basel regulations for operational risk were designed to address such things as model risk, fraud, control weaknesses, faulty product structures, process and control risk, inappropriate sales to counterparties, and business practices that lead to faulty incentive compensation schemes. The risks inherent in the myriad of such business-level details went undetected, primarily because their implementation was left to last. Most firms had not implemented meaningful early warning systems for operational risk exposures. Indeed, implementation was actually pushed back by the industry's leading risk managers, with complacency demonstrated by their management and regulators. Stakeholders in the financial services industry have a right to expect that the new profession of risk management, and the risk managers who practice it, would, by their rapid elevation to the executive 'C suite' in most financial firms, facilitate a risk culture in these enterprises, and thereafter ensure that early warning systems are installed to highlight growing exposures to risk, with the final purpose of presenting reliable and meaningful assessments of future losses.
But this is what risk managers and conventional risk management systems have evidently not succeeded in doing. The current financial crisis can be linked to an inability to record and account for risk exposures in a timely manner. Indeed, recent failures of financial institutions provide some measure of the degree to which accumulating risk exposures escaped the exercise of business judgment simply because executive management, investors, auditors, and regulators were unaware of their existence on such a scale. The result is risk management systems, and consequently financial statements, that failed to report the life-threatening concentrations of risk exposures that had unknowingly accumulated in so many of our leading financial organizations around the globe. In recent testimony before Congress, Alan Greenspan acknowledged that he incorrectly assumed managers of financial firms were aligning their risk appetite to their shareholders' interests. What he did not say, but implied, was that executives had aligned their appetite for risk to their own individual interests and that self-imposed risk and accountability controls had failed regulators.

44 – The journal of financial transformation

Risk management experts have long been aware of the difficulties attached to the measurement and management of operational risks, particularly where this relates to the translation of operational metrics. The evolving risk measurement systems and generally accepted accounting conventions have lacked the ability to accommodate operational metrics in the risk reporting and aggregation methods used to oversee business performance. Such metrics have been presented of late as useful management tools in balanced scorecards, dashboards, six sigma measurements, and the like. These operating metrics have always been open to interpretation by management against the results of their institutions' performance and reporting systems, as they have no naturally occurring monetary basis from which they could be extrapolated or transformed into risk valuation measurements of any kind. This problem was to be confronted at the time operational risk was offered as the third leg of the Basel capital regime, following credit risk and market risk, in a framework in which the discovery of new techniques for measuring such risk would be incentivized with lower capital charges. However, whether for lack of will, inability to communicate across business silos, or preoccupation with the earlier Basel pronouncements on externally focused market and credit risk, the industry pushed back without really trying. For example, when considering the application of the 'use test' to operational risk, one such expert group commented: "Operational Risk however is very different. The nature of Operational Risk is such that the direct linkage of measurement to management is difficult. This is partly due to the inherent difficulties in assessing the Operational Risk positions that a firm faces and how to measure these, but also because the risk profile of a firm does not change quickly, nor can changes to this profile be identified over a short time frame"2.
Risk management experts were publicly airing their misgivings before the financial crisis broke. One example is the remarks presented in May 2007 by the Advanced Measurement Approach Group formed by leading U.S. banks under the auspices of the Risk Management Association. In response to the U.S. Joint Regulatory Agencies' proposals for the supervision of operational risk under Basel II3, this group was dismissive of banks' ability to discern direct relationships between a change in risk and future losses. Indeed, it dismissed Basel's requirement to produce management reports that signal the risk of future losses as not capable of being met at this point in time or in the near future, pointing out that "In many instances, operational risk factors that led to a particular event cannot be uniquely determined retrospectively, let alone detecting a change in factors that signals an increase in future losses". Later, commenting on the same issue when the financial crisis was well under way, this same group concluded that they

2 Operational risk corporate governance expert group, 2005, available at http://www.fsa.gov.uk/pubs/international/orsg_use_test.pdf (accessed April 9, 2009)
3 Federal Reserve, 2007, "Response by the Advanced Measurement Approach Group of the Risk Management Association to the proposed supervisory guidance for internal ratings-based systems for credit risk, advanced measurement approaches for operational risk, and the supervisory review process (pillar 2) related to Basel II implementation," p. 8


struggle with the concept of a unit of measure at the operating level with sufficient granularity to be meaningful to operational capital calculations, specifically in allowing for the determination of dependencies4. They also commented on the common method of performing a 'risk and control self assessment' (RCSA), which, while admittedly of limited use in capital estimation, is also inadequate to the task of supporting management's ability to assess operational risk; they concluded that RCSA's role in risk assessment should be reinforced over its use in capital estimation5. Regulators expected that the provisioning of capital for extreme losses would sustain financial enterprises in periods of stress. Did they truly believe these capital rules would prevent financial institutions from failing? To be fair, they did expect to see the coincident evolution of a risk culture within these institutions, along with the development of a risk exposure measurement system to capture the key operating metrics that could affect an institution's operational risk profile. Taken together, and with regulatory oversight, it was anticipated that the new risk regime would do just that — prevent failures, or at least give an early warning of pending doom. However, whether by abdication, by push back from the industry, or simply because there was not sufficient time to evolve in a natural way, we stopped the risk management process at capital provisioning. And we certainly failed in risk oversight. This paper proposes a new approach to risk measurement that, along with capital measurement and more rigorous oversight, will allow banks to manage risk more effectively.

Failing to manage risk
The current financial crisis and recent failures of financial institutions are all examples of exceptional and unmeasured accumulations of risk exposures that escaped the purview of management, investors, auditors, and regulators, who were unaware of their existence on such a scale. The result was a failure to accommodate appropriate unexpected loss scenarios in risk calculations, with model and liquidity risk the most prominent of these failures. Further, many of the more recent events are examples of risks that can be slotted into one or more of the business-level operational risk categories noted in the still unimplemented Basel operational risk framework6.

The largest failures of all were caused by model failures which then cascaded into liquidity failures. In general, the models' creators failed to update them to reflect marketplace changes created by the very products those models had enabled. Examples abound: Bear Stearns' collapse was initially caused by holding a mortgage portfolio of sub-prime debt, improperly rated as relatively risk free when the parameters of the model were no longer valid. The inputs to the models had moved away from the early benchmarks, which used past data from an earlier era when subprime debt was not prevalent and mortgage lending criteria were much more stringent. Countrywide, American Century, Wachovia, Merrill Lynch, Citibank, and Bank of America indulged in the miracle of risk modeling an ever eroding assemblage of NINJA mortgages (No Income, No Job or Assets) into off-balance-sheet repackaging vehicles. These investment trusts became the preferred mechanism to escape Basel capital requirements by risk-tranching the cash flows of individual mortgages, and later of other forms of assets. Later-stage securitized products used cash flows from credit card receivables, car loans, whole loans, and debt that had itself previously been brought into existence as a result of securitized and tranched assets. These products evolved into risk-adjusted return instruments with names like CLO, CDO, and CDO-squared. Indeed, the unmeasured and unreported risk exposures that contributed to the current financial crisis were a cocktail of all the principal categories of risk: credit, market, liquidity, and operational. This serves to heighten the awareness of financial institutions and their regulators to the need for the measurement and management of risk exposures in the aggregate rather than on a specific risk category or 'silo' basis, as described in an April 2008 paper issued by the Basel Committee on Banking Supervision7.

There is no shortage of evidence that firms recognize these threats and in response have elevated the monitoring of cross-enterprise risk exposures to the board level. For example, the Journal of Accountancy recently reported the results of an Ernst & Young survey8 that found that the boards of many firms are assigning the additional task of risk oversight, despite their already lengthy list of responsibilities, to audit committees. Not only are these committees being charged with overseeing management's risk policies and guidelines, they are also being asked to discuss the enterprise's key risk exposures with management, including those beyond financial reporting related risks. A 2006 survey of Fortune 100 companies9 conducted by the Conference Board Governance Center and Directors' Institute, McKinsey, and KPMG's Audit Committee Institute found that 71% place responsibility for reporting on risk to the board with the CFO. The COSO 'Enterprise risk management – integrated framework'10 provides a broader perspective in that it expects the entirety of enterprise risk management to be monitored through ongoing management activities, separate evaluations, or both. Irrespective of how risk monitoring accountabilities are assigned, if they are not underpinned by a consistent and replicable cross-enterprise risk exposure measurement framework that provides for the consolidation and aggregation of risk exposures, the task borders on the futile. Robert Rubin, a Citigroup

4 Risk Management Association, 2008, "Industry position paper – unit of measure and dependence"
5 Risk Management Association, 2008, "Industry position paper – business environment and internal control factors (BEICFs)"
6 Cummins, J. D., C. Lewis, and R. Wei, 2006, "The market value impact of operational risk events for U.S. banks and insurers," Journal of Banking and Finance, 30:10, 2605-2634
7 Basel Committee on Banking Supervision, 2008, "Cross-sectoral review of group-wide identification and management of risk concentrations"
8 Ernst & Young, 2008, "Global internal audit survey"
9 Brancato, C. K., M. Tonello, and E. Hexter, 2006, "The role of the U.S. corporate board of directors in enterprise risk management," The Conference Board
10 COSO – Committee of Sponsoring Organizations of the Treadway Commission, 2004, "Enterprise risk management — an integrated framework"



director and former Treasury secretary, recently told the Wall Street Journal that a “Board can’t run the risk book of a company (...) the Board as a whole is not going to have a granular knowledge of operations”11.

Basel II and the regulatory agenda
Recently, the Basel Committee on Banking Supervision (BCBS) has progressively extended requirements for quantifying and reporting financial risk12. It hopes to improve risk management by establishing operational risk as a separate category and publishing guidance for operational risk management. Operational risk is defined as the risk of loss resulting from inadequate or failed internal processes, people and systems, or from external events13. Research to date has considered the importance of operational risk in the financial marketplace, concluding that exposure is significant14. Within banking organizations, corporate-level risk has been allocated on a top-down basis; a survey by the BCBS found that, on average, banks had allocated approximately 15% of their capital for operational risk on this basis, adjusting for scale factors [Fontnouvelle et al. (2003)]. An important aspect of operational risk is fraud potential, and significant losses have resulted from well-publicized incidents. More recently, the sub-prime mortgage failures and the unprecedented leverage that had been allowed to accumulate in the financial system have triggered bankruptcies, bailouts, and nationalizations of financial institutions on an unprecedented scale. As prominent as the model failures were, the liquidity failures that resulted were even more significant, triggering cascading waves of collateral liquidations to meet margin and collateral calls and impacting the correlation of previously uncorrelated assets. Coupled with the lack of credibility of the value of balance sheet assets that had previously been marked to market and were now marked to suspect models, the flight to quality and known risks led to the abandonment of firms suspected of having failing balance sheets due to marked-down assets, newly described as "toxic" assets.
In response to BCBS inspired regulatory changes and these high profile cases of fraud and failure, a large body of academic literature has accumulated on the various aspects of operational risk modeling15. Specifically, a number of studies have examined the problems related to the quantification of operational risk and


11 Wall Street Journal, 2008, "Rubin, under fire, defends his role at Citi"
12 The BCBS is an international forum for cooperation and produces guidelines on banking supervisory matters (http://www.bis.org/bcbs/). See, for example: BCBS, 2001, "Working paper on the regulatory treatment of operational risk," and BCBS, 2003, "Sound practices for the management and supervision of operational risk," Basel Committee Publications, No. 96
13 BCBS, 2003, "Sound practices for the management and supervision of operational risk," Basel Committee Publications, No. 96, p. 2
14 For example, in 2001, operational risk was quantified at €2.5 bln and U.S.$6.8 bln in the annual reports of Deutsche Bank and JPMorgan Chase respectively. See Fontnouvelle, P., V. DeJesus-Rueff, J. Jordan, and E. Rosengren, 2003, "Using loss data to quantify operational risk," Federal Reserve Bank of Boston Working Paper
15 Cruz, M., 2002, Modeling, measuring and hedging operational risk, Wiley, Chichester; Cruz, M. (ed), 2004, Operational risk modeling and analysis: theory and practice, Risk Waters Group, London; and King, J., 2001, Measurement and modeling operational risk, Wiley

associated events and processes, for example legal risk, that might defy precise quantification16. Similar problems arise from detected frauds and errors, where infrequent high value occurrences produce an uneven pattern of loss history. Compared to credit and market risk, operational risk has a dramatically different distribution17, requiring different measurement and modeling approaches, characterized by assumptions about the statistical distribution of the loss history and calling on advanced mathematical techniques and theories18. The objective of such techniques is to produce both a consistent measure of risk exposure and robust estimates of value-at-risk (VaR). Such methods typify what is described in Basel II19 as an Advanced Measurement Approach (AMA)20. However, a consequence of attempts at modeling operational risk has been to create significant differences in terms of risk typologies, metrics, and mathematical analysis. According to a recent BCBS report, these differing methods are both impediments to the integration of enterprise risk management and a promise of new modeling and measurement techniques21.
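The loss-distribution style of AMA modeling described above can be illustrated with a small Monte Carlo sketch: a Poisson number of loss events per year, each with a heavy-tailed lognormal severity, from which the 99.9% one-year VaR is read off. All parameters here are invented for illustration and are not calibrated to any institution.

```python
import math
import random

random.seed(7)

# Assumed, uncalibrated parameters -- purely illustrative
EVENTS_PER_YEAR = 25     # Poisson frequency of operational loss events
MU, SIGMA = 10.0, 2.0    # lognormal severity parameters (heavy right tail)
N_YEARS = 20_000         # number of simulated years

def poisson(lam: float) -> int:
    """Draw a Poisson variate (Knuth's method, adequate for small lambda)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

# Aggregate annual loss: a random number of events per year,
# each with a lognormally distributed severity
annual_losses = []
for _ in range(N_YEARS):
    n_events = poisson(EVENTS_PER_YEAR)
    annual_losses.append(sum(random.lognormvariate(MU, SIGMA)
                             for _ in range(n_events)))

annual_losses.sort()
expected_loss = sum(annual_losses) / N_YEARS
var_999 = annual_losses[int(0.999 * N_YEARS) - 1]  # 99.9% one-year VaR
unexpected_loss = var_999 - expected_loss          # basis of the capital charge

print(f"expected annual loss: {expected_loss:,.0f}")
print(f"99.9% VaR:            {var_999:,.0f}")
print(f"unexpected loss:      {unexpected_loss:,.0f}")
```

The gap between the mean and the extreme quantile is what makes operational risk awkward to provision for: a handful of rare, very large severities dominates the 99.9% figure, which is why the text notes that such distributions demand different modeling approaches than credit or market risk.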

Risk management
Risk management has always been an intuitive management skill that was and is expected of all business managers. Business managers manage their revenues and costs through performance management systems; they manage risk through various operating metrics, balancing its impact using their experience and judgment. The problem with this approach is that it cannot be measured and aggregated in any systematic way. It is left to a wide range of relatively subjective analyses performed by: internal and external auditors around Sarbanes-Oxley, Committee of Sponsoring Organizations (COSO) reviews, and annual financial audits; cost analysis teams performing unit costing, business process reengineering, and Six Sigma exercises; and risk managers applying scorecards and risk and control self assessments. It was, and still is, wrongheaded to believe that a historical, mathematically modeled view of past losses, manifest in capital provisioning, would prevent too much risk from being taken. Financial transactions entered into in real time have the potential to cascade risk exposures far beyond their notional values, and certainly far beyond the capital provisioned from past loss events. The industry

16 Chavez-Demoulin, V., P. Embrechts, and J. Neslehova, 2006, “Quantitative models for operational risk: extremes, dependence and aggregation,” Journal of Banking and Finance, 30:10, 2635-2658 17 Nocco, B., and R. Stultz, 2006, “Enterprise risk management: theory and practice,” Journal of Applied Corporate Finance, 18:4, 8-20 18 For example, Chavez-Demoulin et al. (2006) and Allen, L. and T. Bali, 2007, “Cyclicality in catastrophic and operational risk measurements,” Journal of Banking and Finance, 31:4, 1191-1235 19 Basel II, agreed in 2004, is a BCBS Framework for minimum capital adequacy now being implemented by national supervisory authorities. 20 Under AMA, banks must integrate internal data with relevant external loss data, account for stress scenarios, and model the factors which reflect the business environment and the internal control system 21 Basel Committee on Banking Supervision, 2008, “Cross-sectoral review of group-wide identification and management of risk concentrations”


has not yet found a way to identify operational exposures and put a consistent and comparable value on them. Operational risk, in all its diversity and complexity, is thought not to be measurable. In the absence of a direct exposure measurement metric the industry has looked to loss history as the only objective source of information on operational risks. So what would be an approach to observing the risk of loss in an operating environment?

Figure 1 – Examples of mapping causes of losses to risk mitigating activities
People (manual processes): staffing levels, overtime hours, repair rates, staff training
Data (reference data): product ID, business entity ID, traded date, value date
Software (automated processes): model risk assessment, security and control, back-up and recovery, business applications
All three pillars map to exposure to risk, expressed in risk units (RU)

Contrary to conventional thinking, operational risk can be measured. Consider the diversity of the human condition represented in a FICO score for measuring retail credit, the diversity of corporate cultures distilled into credit rating categories, or the complexity of trading strategies across multiple geographies and products synthesized into a market value-at-risk calculation. An answer to measuring operational risk is found in the evolution of FICO scores and credit ratings. Credit reporting was born more than 100 years ago, when small retail merchants banded together to trade financial information about their customers. Lenders eventually began to standardize how they made credit decisions by using a point system that scored the different variables on a consumer's credit report. Credit granting took a huge leap forward when statistical models were built that considered numerous variables, and combinations of variables, around these point systems. Today, credit analysis uses a well-defined set of inputs from the historical set of key risk indicators accumulated from many years of refining intuition into predictors of loss. On the commercial side of credit ratings we find a similar history and methodology at the major credit rating agencies. Their methods, also refined over a century, distill commercial credit scores into A-B-C rating systems where, for example, a confidence level between 99.96 and 99.98 percent has been calibrated as equivalent to the insolvency rate expected for an AA credit rating. We start to solve the problem of determining such a metric for measuring the operational risk of loss by returning to the roots of the operational risk capital charge: the measure of the potential for losses derived from processing transactions, for that is, in the main, what financial institutions do.

We then make the observation that all operational processes in a financial institution are driven by transactions interacting with human, automated, and data-dependent activities. Thereafter we dissect each of these pillars into a finite number of subcomponents of standardized activities that reflect key risk indicators known intuitively by business managers to cause losses (Figure 1). This is a critical observation in that each of these pillars of activities represents actionable elements in a transactional process. This is important if risk measurement systems are to be able to support management decisions to mitigate risks before they become

losses and capital charges. We perform this analysis by using the enterprise's personnel and documentation in a structured process that allows for the understanding of the exposures inherent in the operating environment in which the business exists, and translating this knowledge into risk weights. We then use these values for the calculation of a forward-looking measure of risk exposure, a scaled inherent risk value, and a risk-mitigating best-practice control value. A set of standardized risk metrics is then calculated representing inherent risk, risk mitigation effectiveness, and residual risk. These risk metrics, applied at the transaction level, can then be aggregated to provide departmental, divisional, subsidiary, and group-wide views, and views by categories such as product, geography, business unit, and risk type. This method of calculating risk exposure provides a view of residual risk that is dynamically updated when changes in causal factors occur. In this way the potential for statistical correlation of measurements of exposure to risk with loss history is created which, over time, will cause the risk metrics generated through this new method to become inherently predictive. This is quite different from, but complementary to, the backward-looking capital calculations that financial institutions rely upon today to gauge the largest unexpected loss that may occur within a given confidence level and time horizon. More importantly, it is built from the ground up, allowing the intellectual property of operating management to be embedded in the very fabric of the risk measurement system. Institutionalizing such knowledge into the operational risk activity creates credibility and actionability — the most critical components in enabling a risk culture to evolve and continual risk mitigation to be its outcome. Without a measure of risk exposure, and a dynamic mechanism for seeing it build up, we cannot take preventive actions.
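A minimal sketch of the calculation just described might look as follows for a single transaction: inherent risk in risk units (RUs) as the product of a product risk weighting and a value band weighting, reduced to residual risk by a risk mitigation index (RMI). Every table, weighting, and RMI score below is an invented placeholder, not the authors' calibrated values.

```python
# Hypothetical product risk weightings per risk type (illustrative only)
PRODUCT_RISK_WEIGHTS = {
    "interest_rate_swap": {"processing": 6, "market": 8, "credit": 5,
                           "liquidity": 4, "interest_rate": 9},
    "retail_mortgage":    {"processing": 3, "market": 2, "credit": 7,
                           "liquidity": 5, "interest_rate": 6},
}

# Hypothetical value bands: (upper bound of nominal value, weighting)
VALUE_BANDS = [
    (1_000_000, 1),
    (10_000_000, 2),
    (100_000_000, 4),
    (float("inf"), 8),
]

def value_band_weight(nominal_value: float) -> int:
    """Look up the value band weighting for a transaction's nominal value."""
    for upper, weight in VALUE_BANDS:
        if nominal_value <= upper:
            return weight

def inherent_risk(product: str, nominal_value: float) -> dict:
    """Inherent risk in RUs per risk type: product risk weighting
    multiplied by the value band weighting."""
    band = value_band_weight(nominal_value)
    return {risk: w * band
            for risk, w in PRODUCT_RISK_WEIGHTS[product].items()}

def residual_risk(inherent: dict, rmi: dict) -> dict:
    """Residual risk: the share of inherent risk left unmitigated.
    rmi holds a risk mitigation index in [0, 1] per risk type, scored
    against best-practice templates (1.0 = fully mitigated)."""
    return {risk: ru * (1.0 - rmi.get(risk, 0.0))
            for risk, ru in inherent.items()}

# One transaction: a $25m interest rate swap with assumed RMI scores
ir = inherent_risk("interest_rate_swap", 25_000_000)
res = residual_risk(ir, {"processing": 0.7, "market": 0.8, "credit": 0.6,
                         "liquidity": 0.5, "interest_rate": 0.75})
```

Because the RUs are dimensionless and additive, the per-transaction `res` dictionaries can simply be summed across any reporting dimension, which is what makes the departmental, divisional, and group-wide views described above possible.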
The purpose of the product-based approach to risk weighting transactions in the risk accounting method described in this paper and outlined in Figure 2 is to assign ex-ante values to risky processes which can subsequently be correlated with loss history events and, in turn, economic capital as they evolve. Information feedback loops can be developed to provide management with near real-time risk exposure and risk management



Figure 2 – Risk accounting overview
The overview depicts the following processing flow:
1. Daily inputs are received from product systems, categorized by product type and G/L account code.
2. Daily transactions are categorized by product type: transaction-based products (daily transaction count x average value), trading-based products (daily aggregate buys, sells, and hedges), and portfolio-based products (daily change in portfolio value).
3. Daily transaction values are accumulated by product type according to the criteria established for transaction-based, trading-based, and portfolio-based products.
4. Product risk weightings are appended to the transactions from a product risk table according to the risk types triggered by the product (processing, market, credit, liquidity, and interest rate risk).
5. Value band weightings are appended to the transactions from a value table.
6. Inherent risk, by product and risk type in risk units, is calculated by multiplying the cumulative product risk weightings by the value band weightings.
7. The risk mitigation effectiveness of risk management and of systems and controls is scored against best practice scoring templates by risk type, and risk mitigation indexes (RMIs) are calculated for each product and risk type.
8. Residual risk, by product and risk type in risk units, is calculated by applying the RMI to the inherent risks calculated above.

data. In complementary fashion, such an approach will help build more robust, comparable, and therefore consistent estimates of VaR. Prior work22 has demonstrated that a common measurement framework, connecting operational metrics to risk metrics, will assist the development of better systems to account for all the dimensions of risk, including those captured in expected losses (capital reserves), unexpected losses captured in capital charges, and those yet to be captured by the measurement of exposures to potential losses. This latter dimension, a prospective measure of "loss potential", is best captured by the proposed introduction of a new unit of measurement for risk exposures and a methodology to map operating metrics to it, in a proposed system of risk accounting, the subject of this paper.
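The roll-up of transaction-level risk units into the departmental, geographic, and risk-type views that such feedback loops would report on can be sketched in a few lines. The records, department names, and RU figures below are hypothetical.

```python
from collections import defaultdict

# Hypothetical transaction-level residual-risk records, in risk units (RUs);
# the dept/product/geo tags are illustrative, not from the paper
records = [
    {"dept": "rates desk", "product": "swap",     "geo": "US",
     "risk_type": "market", "rus": 6.4},
    {"dept": "rates desk", "product": "swap",     "geo": "UK",
     "risk_type": "credit", "rus": 12.8},
    {"dept": "mortgages",  "product": "mortgage", "geo": "US",
     "risk_type": "credit", "rus": 5.6},
]

def aggregate(records: list, dimension: str) -> dict:
    """Roll transaction-level risk units up one reporting dimension
    (dept, product, geo, or risk_type) by simple summation."""
    totals = defaultdict(float)
    for record in records:
        totals[record[dimension]] += record["rus"]
    return dict(totals)

by_dept = aggregate(records, "dept")       # departmental view
by_geo = aggregate(records, "geo")         # geographic view
by_risk = aggregate(records, "risk_type")  # risk-type view
```

Because every record carries the same dimensionless unit, the same summation serves any slice of the book, which is precisely the consolidation property the paper argues conventional silo-based risk reporting lacks.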

Current state of risk management
Today, best practice for mapping an organization's granular knowledge of its operating environment to its risk management systems is, in the main, a continual (typically annual or quarterly) people-intensive risk assessment process. Questionnaires are used by risk managers to facilitate meetings with operating management and the management group at the top of each of the business silos. Questions and discussions are focused on the status of key risks and controls and the range of expected losses,

48 – The 

 journal of financial transformation

estimating their magnitude and frequency within the timeframes required for input to the capital models of the firm, including the largest expected loss usually at the 99.9% confidence level (a 1 in 1000 year occurrence). Past losses are viewed in context, projects that are in place to manage risk are assessed, and new targets for further risk mitigation are planned. All of these discussions and projections of future losses (really ‘guesstimates’) are summarized and subjected to a number of iterative review sessions until the capital number for operational risk for the firm is agreed and each business silo is comfortable with its own allocation of the top of the house number. The reporting of all these review sessions, loss projections, and risk mitigation projects are formalized in a ‘risk and control self assessment’ (RCSA) system characterized by the reporting of items such as counts of loss events, dates of audits and audit ratings, historical losses per activity of a particular business silo, capital assigned to each department, and a color coded scheme indicating progress in risk mitigation projects. The senior management and the board are presented with a filtered view of all of these reports, highlighting the few key projects and high priority risks determined quite subjectively by the risk management officer after input from the key risk management staff and a review of the RCSA reports. 22 Grody, A., and P. Hughes, 2008, “Financial services in crisis: operational risk management to the rescue,” Journal of Risk Management in Financial Institutions, 2:1, 47- 56


In this RCSA approach neither senior management nor the board has the ability to observe operational level risk metrics in any granular manner, or in the aggregate, or be able to drill down to the details of the operational risk status or issues being presented. If one were to undertake such a task it would require a review of each of the reports at the departmental level to interpret them, which requires granular knowledge of the activities of each of the business units. This, in turn, would require interaction with departmental personnel in combination with internal audit, risk management, and, perhaps, the business process reengineering team, to assess the interpretations being presented. In fact, this process does occur, typically on a retrospective basis when a significant loss occurs.

Our proposed method of risk accounting is offered as a substitute for this backward-looking approach, providing a prospective method to observe risk exposures at both an aggregate and a granular level, with the ability to drill down to the root causes of any observed increase in risk exposures. Actions can then be taken both to examine the effects of risk mitigation projects underway and to initiate new projects before exposures turn into losses.

The proposed method of risk accounting is directed at transactions to which risk weightings and scaled values are assigned. By engaging with the business line managers across the entire enterprise, both the historical and current knowledge of the operating metrics used at the business level are interpreted into the risk metrics of the proposed risk accounting system. A method to achieve this has already been published and piloted in a number of institutions23. In developing the risk accounting system an organization deploys its risk management team in each operating department to interact with operating personnel. Together they develop risk scores that represent the department's exposure to risk and the risk mitigation effectiveness of each of the business processes that comprise their operating environment. Risk scores and/or risk weightings are determined for each business process based on three sets of standardized tables and templates relating to the risk drivers present in all business processes: 'exposure,' 'value,' and 'risk mitigation.' The resulting risk scores and weightings are applied in a scorecard where operational metrics are computed, consolidated, and aggregated.

Summary of risk accounting
Following testimony given before the U.S. House Oversight Committee in November 2008, Prof. Andrew Lo commented in an interview with the Wall Street Journal: "The very fact that so many smart and experienced corporate leaders were all led astray suggests that the crisis can't be blamed on the mistakes of a few greedy CEOs. In my view, there's something fundamentally wrong with current corporate-governance structures and the language of corporate management. We just don't have the proper lexicon to have a meaningful discussion about the kinds of risks that typical corporations face today, and we need to create a new field of 'risk accounting' to address this gap in GAAP."24

Financial institutions need to find a way of accounting for and reporting consolidated and aggregated cross-enterprise risk exposures as they accumulate. The challenge they face is analogous to the one they faced a generation or more ago as businesses evolved from legal entity-based profit centers within sovereign states into globalized lines of business. At that time financial controllers had to learn how to tag transactions with business unit, unit cost, market segment, product, and customer codes to drive cross-enterprise management performance analysis and reporting. The new challenge is to learn how to tag those same transactions with risk-weighted exposure measures and risk-weighted financial values to produce a risk exposure metric that is additive, and to do it within a framework that can actually track the value of risk mitigation efforts and drive cross-enterprise risk analysis and reporting.

Recognizing that risk exposures are first triggered upon transactions entering the operating environment, it follows that risk exposure measurement for risk accounting and reporting purposes must be transaction-based and occur at a financial institution's transaction gateway, at precisely the same points at which the financial (general ledger) and management accounting interfaces are positioned. It is upon these basic premises that the approach to risk accounting described in this paper is constructed. Risk accounting represents an extension of financial reporting to embrace a new risk metric, 'exposure to risk.' It links changes in the reporting of traditional value-at-risk measures to changes in business activity and the reporting of operational and performance metrics in order to make them more effective, timely, and more meaningful to stakeholders. In this formulation of the enterprise's overall risk, the sum of the diversified effects of operational, market, credit, and liquidity risk capital (VaR) — a measure of the potential magnitude of future losses — is combined with a current and dynamically changing measure of risk exposure denominated in a new unit of measure, the risk unit (RU). The RU is a mechanism to translate all manner of diversified internal processes described under the general term operational (business) activities into a common risk measurement framework, in much the same way as all manner of externally focused market and credit risks have been mapped into a common risk measurement framework, value-at-risk (VaR), using stochastic calculus. While separate and distinct in terms of managements' and regulators' use (capital is forward looking; exposure is immediate and actionable), VaR and RUs are complementary measures and may be correlated. Even so, RUs measure aspects not captured by VaR and may provide a better substitute for measuring the elements that are. If true, enterprise risk can be better captured by RUs.

23 Hughes, P., 2007, "Operational risk: the direct measurement of exposure and risk in bank operations," Journal of Risk Management in Financial Institutions, 1:1, 25-43
24 Wall Street Journal, 2009, "Understanding our blind spots," p. R2


It is also possible for VaR and RUs to be correlated over time by assigning a monetary value to the RUs using a scaling factor associated with the financial dimension of the enterprise. For example, in a top-of-the-house view of enterprise risk, the standard (Pearson) correlation formula is:

R = [nΣ(VaR×RU) − (ΣVaR)(ΣRU)] / √{[nΣVaR² − (ΣVaR)²] × [nΣRU² − (ΣRU)²]}

The introduction of a risk exposure metric, the risk unit, is necessitated by business managers' need to report in quantitative terms how the risks they manage are impacted by operational factors, i.e., high transaction counts, non-reconciled position values, failure counts and values of undelivered securities, overtime hours, absenteeism rates, systems downtime, number of unauthorized systems accesses, number of password changes per employee, number of internal non-client accounts opened, and a myriad of other business-level metrics. The RMA25 has documented nearly 2,000 such key risk indicators (KRIs) for financial organizations. These KRIs supplement the accounting records and are a major part of the performance evaluation framework available to management and the board. Some of them find their way into the annual report as commentary and footnotes, and some are used by security analysts and external auditors to further assess the performance prospects of the organization. Some of these operating metrics become de facto industry best practice benchmarks against which firms gauge their performance in addition to the GAAP performance results published in the audited financial statements. They are, however, unavailable in aggregated form for executive management in a manner that equates changes in these operating factors to real-time or near real-time measures of risk exposures and, in turn, to operational loss predictions and capital requirements, which is the desired result of the method of risk accounting proposed in this paper.
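To make the formula concrete, the following sketch computes the sample correlation between a daily VaR series and a daily RU-based exposure series. The function name and the sample figures are our own illustrations, not data from the paper; the arithmetic follows the correlation formula quoted above.

```python
# Sketch (illustrative only): the sample Pearson correlation R between
# paired daily observations of VaR and RU-denominated exposure,
# computed with the summation form of the formula quoted in the text.
import math

def correlation(var_series, ru_series):
    """Pearson correlation R between two equally long series."""
    n = len(var_series)
    assert n == len(ru_series) and n > 1
    s_v = sum(var_series)
    s_r = sum(ru_series)
    s_vr = sum(v * r for v, r in zip(var_series, ru_series))
    s_v2 = sum(v * v for v in var_series)
    s_r2 = sum(r * r for r in ru_series)
    num = n * s_vr - s_v * s_r
    den = math.sqrt((n * s_v2 - s_v ** 2) * (n * s_r2 - s_r ** 2))
    return num / den

# Hypothetical daily figures: VaR in $m, exposure in RUs
var_daily = [12.0, 13.5, 11.8, 14.2, 15.0]
ru_daily = [4100, 4350, 4000, 4500, 4700]
print(round(correlation(var_daily, ru_daily), 3))
```

A value of R near +1 over a sustained period would support the paper's conjecture that the two measures are complementary and correlated.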

The proposed method of risk accounting
The method of operational risk measurement from which the method of risk accounting has been derived has been described in prior papers, such as Hughes (2007) mentioned above, and in a pending patent, and has been demonstrated through its diverse application in the financial services industry. Its application to the business processes that comprise financial operating environments produces comprehensive quantitative and qualitative management information concerning the risks of a financial enterprise, denominated in risk units (RUs). In this way, the risks inherent in financial operating environments can be represented by three standardized, interrelated, and additive risk metrics:
■■ Inherent risk — is a representation of the risk-weighted size of a financial operating environment expressed in RUs.
■■ Risk mitigation index (RMI) — is a dynamic measure, on a scale of 1 to 100, of the effectiveness of the risk management systems and controls that mitigate risk, where 100 represents best practice.
■■ Residual risk — is expressed in RUs and is the inherent risk less the risk mitigation effects of risk management systems and controls as represented by the RMI.

The RMI can be adjusted immediately when changes in the underlying causal factors occur, thereby making residual risk a dynamic measure of a financial enterprise's exposure to risk. In this way the potential for statistical correlation of measurements of exposure to risk and loss history is created which, over time, will cause the risk metrics generated through the method described in this paper to become inherently predictive.

The application of the method in financial institutions to date has focused primarily on transaction processing, which financial firms typically refer to as 'operations.' Its primary aim has been to provide operations managers and their stakeholders with quantitative risk-based management information to complement the qualitative management information relating to the status of risks and controls that exists in the form of audit reports, key risk indicators (KRIs), and risk and control self-assessments (RCSAs). This paper considers whether the method of operational risk measurement developed for transaction processing environments can be extended into an enterprise-level quantitative and qualitative risk management system.

The solution described in this paper proposes a method of identifying and codifying risk information that is appended to transactions to drive cross-enterprise risk reporting. Such a solution is analogous to the work undertaken by financial controllers over a generation ago when they learnt how to codify management information (customer, product, market segment, cost center, unit cost etc.) and append it to transactions in order to drive cross-enterprise management accounting and reporting. Hence the method proposed in this paper is characterized as a next generation risk accounting and reporting system.

The method constitutes a consistent and replicable means of converting notional transaction values into risk-weighted transaction values denominated in RUs, representing inherent and residual risks. Thereafter, cross-enterprise risk reporting follows well-established management accounting and reporting lines and principles. It is important to note that the application of the method as a next generation risk accounting system is an area of ongoing research. Whereas certain tables and templates described below, and their related risk weightings, have been proven through more than a decade of application in diverse financial operating environments and are subject to ongoing domain expert validation, others exist only conceptually and have not yet been subjected to field testing and expert validation. As researchers we intend to undertake such tests and simulations in the near future.

25 RMA – Risk Management Association, http://www.rmahq.org/RMA/
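The arithmetic linking the three metrics can be sketched as follows. The linear relationship residual = inherent × (1 − RMI/100) is our inference from the sample figures quoted later in the paper (inherent 4,200 RUs, RMI 56.4, residual roughly 1,830 RUs); it is not a formula the authors state explicitly.

```python
# Sketch of the inferred relationship between the three risk metrics:
# residual risk (in RUs) is inherent risk reduced by the mitigation
# effect expressed by the RMI (100 = best practice, full mitigation).

def residual_risk(inherent_rus: float, rmi: float) -> float:
    assert 0.0 <= rmi <= 100.0
    return inherent_rus * (1.0 - rmi / 100.0)

# Sample figures from the paper's CDO product summary (Figure A1):
# the result lands close to the quoted 1,830 RUs.
print(round(residual_risk(4200, 56.4)))
```

Because both inherent and residual risks are additive RU quantities, this relationship would also allow the RMI to be recomputed at any level of aggregation.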


Product risk table — market risk (extract). Each risk criterion lists descriptions with their weightings, which are used in the calculation of inherent risk.

Availability and reliability of market prices:
■■ Active market prices (2)
■■ Inactive but observable market prices (5)
■■ Unobservable prices that need judgment (8)
■■ No prices, but economic or other assumptions (demographic, holistic etc.) are required (10)

Period the product has been actively traded in the business:
■■ More than 3 years (2)
■■ Between 1 year and 3 years (4)
■■ Between 4 months and 1 year (6)
■■ Between 1 month and 3 months (8)
■■ Less than 1 month (10)

If the product is model dependent for pricing or valuation purposes, the extent to which the model is used across the industry:
■■ Vetted through independent audit process and in general use (2)
■■ Used by many trading institutions (4)
■■ Used by some reputable trading institutions (6)
■■ Used by few trading institutions (8)
■■ No other known users (10)

The manner in which the product is traded:
■■ Electronic (2)
■■ Hybrid (electronic + floor/voice-based) (4)
■■ Floor/voice-based (6)
■■ Over-the-counter (OTC) (10)
■■ Other (10)

Note: The above table is an extract of some of the criteria used to determine inherent market risks.
Figure 3 – Product risk table

Inherent risk is the cumulative value in RUs of the individual risk-weightings of each product type multiplied by a value weighting. The product's risk weightings are derived from a product risk table; an extract from this table relating to market risk is shown in Figure 3. The value weighting is obtained by accumulating daily transaction notional values relative to each product accounted for in the general ledger using the following criteria:
■■ Transaction-based products — daily transaction count multiplied by average transaction value.
■■ Trading-based products — daily aggregate buys, sells, and hedges.
■■ Portfolio-based products — daily change in portfolio value.
The resulting values are then processed through the value table shown in Figure 4 to obtain a value band weighting. The value table presents a logarithmic expression of the relationship between transaction values and risk. In general, operational sophistication and the effectiveness of risk mitigation increase as transaction volumes increase, primarily due to enhanced automation. The net result is that the rate at which operational risk exposure is created decelerates relative to the rate at which transaction volumes increase. The measurement approach therefore recognizes this relationship and progressively reduces the rate at which risk exposure is valued as the transaction volumes and values accepted for processing increase. The inherent risk by product is calculated in RUs on a daily basis.
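The daily calculation described above can be sketched as follows. The function names are ours, and the band weightings in the lookup table are illustrative placeholders: the paper's Figure 4 gives the actual logarithmic mapping of value bands ($100 thousand to $1 trillion) onto weightings of 0 to 200.

```python
# Sketch (names and band weightings are illustrative assumptions):
# daily inherent risk in RUs = sum of a product's risk weightings from
# the product risk table, multiplied by a value band weighting looked
# up from accumulated daily transaction values on a logarithmic scale.
import bisect

# Illustrative value table: (upper bound of band in $, band weighting),
# spanning roughly the $100 thousand .. $1 trillion range of Figure 4.
VALUE_BANDS = [
    (1e5, 25), (1e6, 50), (1e7, 75), (1e8, 100),
    (1e9, 125), (1e10, 150), (1e11, 175), (1e12, 200),
]

def value_band_weighting(daily_value: float) -> int:
    """Find the weighting of the smallest band covering daily_value."""
    bounds = [b for b, _ in VALUE_BANDS]
    i = min(bisect.bisect_left(bounds, daily_value), len(VALUE_BANDS) - 1)
    return VALUE_BANDS[i][1]

def inherent_risk_rus(product_risk_weightings, daily_value):
    """Daily inherent risk in RUs for one product."""
    return sum(product_risk_weightings) * value_band_weighting(daily_value)

# Hypothetical CDO: market-risk weightings 8, 10, 6, 10 drawn from the
# Figure 3 criteria, against $250m of accumulated daily value
print(inherent_risk_rus([8, 10, 6, 10], 2.5e8))
```

The logarithmic band structure is what makes the RU charge per dollar decline as volumes grow, mirroring the deceleration of operational risk described in the text.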

The method in overview
The method operates on three underlying premises. First, inherent risk is triggered when transactions are accepted into an operating environment for processing; the degree of such risk is a function of a transaction's notional value and the particular types of risk inherent in the product to which the transaction relates. Second, a primary aim of an operating environment is to mitigate the risks that are inherent in the products and related transactions accepted for processing. Third, the rate at which operational risk exposure is created decelerates relative to the rate at which transaction volumes increase (mainly due to further automation). The method, presented in overview in Figure 2, presupposes that the risk characteristics of product types, the daily values associated with them, and the risk mitigation effectiveness of operating environments can be represented through scores and risk-weightings derived from the tables and templates described in this paper and converted into standardized quantitative risk metrics expressed as inherent and residual RUs.

The operating environment
The operating environment for which an RMI is to be calculated is first defined. Typically this is the total enterprise or a business division thereof, for example, an investment bank or investment banking division. Such an operating environment incorporates the business components that are required to achieve business self-sufficiency and includes sales and marketing, operations, information technology, treasury, risk management, finance, internal audit, etc. The risk measurement method recognizes that business components can be deconstructed into business processes and that each business process is comprised of manual and automated activities interacting with data to achieve one or more operating objectives. Thus, the risk mitigation effectiveness of an operating environment is related to two attributes. First, its ability to ensure that the transactions it accepts for processing are properly approved and processed in a complete, accurate, timely, and secure manner (processing risks).

Figure 4 – Value table: value bands from $100 thousand to $1 trillion mapped, on a logarithmic scale, to value band weightings between 0 and 200

Model management
Relates to the management of the product's models used for risk pricing, valuation, value-at-risk (VaR) calculations, and capital adequacy. Best practice score = 100.

Best practice statements (deductible points in brackets):
■■ Responsibility for the management of the product's model is assigned to an independent risk control unit (100)
■■ The product's position is valued at least daily by marking-to-market at readily available close-out prices that are sourced independently in a process under the direct management and control of the independent risk control unit (100) [note 1]
■■ The independent risk control unit conducts a regular back-testing programme of the product's model (80) [note 2]
■■ The independent risk control unit conducts the initial and on-going validation of the internal model (80) [note 3]
■■ The independent risk control unit produces and analyzes daily reports produced by the product's model, including an evaluation of the relationship between measures of risk exposure and trading limits (70)
■■ The product's model is subject to a routine and rigorous programme of stress testing (60) [note 4]
■■ Risk factors incorporated into the product's pricing model are also incorporated into the product's value-at-risk (VaR) model (50)
■■ An independent review of the model is carried out regularly (at least once a year) by the bank's own internal audit function (40) [note 5]
■■ Daily reports prepared by the independent risk control unit are reviewed by senior management (30) [note 6]
■■ The product's trading and exposure limits are related to the model in a manner that is consistent over time and that is well understood by both traders and senior management (25)

Guidance notes:
1. If there are no readily available 'active' market prices for the product the statement is not applicable, so no points are deductible.
2. Back-testing at a minimum must include a comparison of the risk measure generated by the model against actual daily changes in portfolio value over longer periods of time, as well as hypothetical changes based on static positions.
3. Validation at a minimum must include ensuring that any assumptions made within the internal model are appropriate and do not underestimate risk.
4. Guidelines for stress testing are set out in the Basel Committee on Banking Supervision's November 2005 "Amendment to the Capital Accord to Incorporate Market Risks," Part B.5.
5. Internal audit's review must be conducted by suitably qualified individuals and at a minimum must include the verification of: the consistency, timeliness, and reliability of data sources used to run internal models and the independence of such data sources; the accuracy and appropriateness of volatility and correlation assumptions; the accuracy of valuation and risk transformation calculations; and the verification of the model's accuracy through frequent back-testing.
6. Senior management means individuals with sufficient seniority and authority that they can enforce both reductions of positions taken by individual traders and reductions in the bank's overall risk exposure.

Figure 5 – Best practice scoring template — model management


Second, risks are quantified to an acceptable degree of precision and are properly reported and applied in, for example, product pricing, economic capital calculations, and allocations and for determining capital adequacy (quantification risks). It follows, therefore, that an RMI is required relative to processing risks — each business process, reference data source, and business information system — and quantification risks — each major risk category including credit risk, market risk, operational risk, liquidity risk, and interest rate (non-trading) risk. To this end, the method provides for the calculation of an RMI at the business process level in a way that the associated inherent and residual risks in RUs can be consolidated and aggregated through to the enterprise level and at multiple intermediate levels including by organizational unit, product, risk type, and geography.

The product summary A sample product summary is shown in Figure A1 in the Appendix relative to a Collateralized Debt Obligation — CDO (all data is fictitious and is presented for illustration purposes only). The business components ‘trade confirmation and matching,’ ‘data risk,’ and ‘market risk’ are supported by detailed scorecards and related calculations presented in Figures A2, A3, and A4 in the Appendix, respectively. The product summary is developed by identifying the business components that are on the product’s end-to-end transaction processing cycle’s critical path and the reference data sources and business information systems they interact with. The business components that process the product’s transactions (transaction processing risk) are deconstructed to identify the individual processes, as illustrated in Figure A2, at which level the RMI and inherent and residual risks in RUs are calculated. The sample product summary in Figure A1 shows the total exposure to operational risks generated by CDOs on a particular day. The total inherent risk is 4,200 RUs, residual risk is 1,830 RUs, and the RMI is 56.4. The amount of inherent risk applied to each business component and risk category within processing risks is the same (1,350 RUs). The repetition of the inherent risk recognizes that each component handles the respective product-related transactions as an organizationally segregated operations unit and, consequently, independently exposes the full amount of inherent risk. The business components listed under ‘transaction processing risk’ in Figure  A1 show the relevant organizational unit — IB front office, operations, and finance — with an indication about whether processing is centralized or decentralized. The inherent and residual risks presented in the product summary are additive and, consequently, the totality of products handled by each centralized or decentralized component can be consolidated and aggregated and the RMI recalculated. 
In this way, inherent and residual risks can be summed for all the products handled by each centralized or decentralized business component to produce total risk metrics by business component, or hierarchically from the product (lowest) through to the total enterprise (highest) level.
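The aggregation just described can be sketched as follows. The recomputation of the RMI from summed RUs, RMI = 100 × (1 − residual/inherent), is our inference, but it is consistent with the sample CDO figures quoted above (4,200 inherent RUs, 1,830 residual RUs, RMI 56.4).

```python
# Sketch (inferred, not the authors' stated formula): inherent and
# residual RUs are additive, so they can be summed across products or
# components and the aggregate RMI recomputed from the totals.

def aggregate(metrics):
    """metrics: iterable of (inherent_rus, residual_rus) pairs."""
    inherent = sum(i for i, _ in metrics)
    residual = sum(r for _, r in metrics)
    rmi = 100.0 * (1.0 - residual / inherent) if inherent else 100.0
    return inherent, residual, round(rmi, 1)

# Hypothetical per-product figures for one business component
print(aggregate([(4200, 1830), (2500, 900), (1300, 650)]))
```

The same function applies unchanged at every level of the hierarchy, which is what makes product, component, and enterprise views mutually consistent.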

The scorecards
The sample scorecards presented in Figures A2, A3, and A4 show the calculation of RMIs and of inherent and residual risks in RUs relative to transaction processing, reference data sources, and market risk, respectively. Typically, risk metrics for transaction processing related business components are calculated at the more granular 'process' level while others are calculated at the business component level. This recognizes that transaction handling processes are designed to process and control transactions that are value-bearing, whereas processes that maintain reference data sources and business information systems, and risk quantification processes, are generally not value-bearing. Consequently, transaction processing components have a greater need for granular risk assessment, analysis, and management information. However, inasmuch as non-transaction processing components are also comprised of manual and automated processes interacting with data, risk metrics can optionally be calculated for them at the process level as well. The sample scorecard shown in Figure A2 relates to the business component 'trade confirmation and matching' and illustrates the calculation of risk metrics relating to CDOs on a particular day. The process risk-weightings shown in the scorecard are derived from an activity table, a catalogue of pre-identified processing activities to each of which a fixed risk weighting has been assigned. This risk weighting represents the relative immediacy and likelihood of financial loss in the event of process failure. For example, a process that involves the release of funds has a higher risk weighting (higher loss immediacy) than a process that involves the matching of trade confirmations (lower loss immediacy). 
Such processing objectives or activities are collectively referred to as ‘activity types’ and include the following (this is not an exhaustive list): prepare, capture, and control transactions; process transactions; transaction (deal) confirmation; release value items; prepare and issue reports; independent verification and validation; and determine and control cash positions. Current versions of the method use a catalogue comprised of 34 activity types and associated risk weightings. The actual activities of each process are mapped to the activity types in the catalogue and where there is a match the applicable activities and risk-weightings are extracted and applied in the respective scorecard. The inherent risk on a given day (1,350 RUs in Figure A2) is distributed to individual processes in proportion to their total activity risk weightings.
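The proportional distribution described above can be sketched as follows; the function name and process weightings are our own illustrations, not values from the 34-entry activity-type catalogue.

```python
# Sketch: distribute a day's inherent risk (in RUs) across the
# processes of a business component in proportion to each process's
# total activity risk weighting from the activity-type catalogue.

def distribute_inherent(total_rus, process_weightings):
    """process_weightings: {process name: total activity risk weighting}."""
    total_w = sum(process_weightings.values())
    return {p: total_rus * w / total_w for p, w in process_weightings.items()}

# Hypothetical processes within 'trade confirmation and matching',
# sharing the 1,350 RUs of inherent risk cited for Figure A2
shares = distribute_inherent(1350, {
    "capture and control transactions": 30,
    "match trade confirmations": 20,
    "release value items": 50,
})
print(shares)
```

Because the shares always sum to the component's inherent risk, per-process metrics roll back up without loss or double counting.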

Best practice scoring templates
The risk mitigation effectiveness of each process is then determined and scored by reference to best practice scoring templates (Figures 5 and 6), which can be one of two types:

Type 1 – benchmark data — type 1 scoring templates include ‘execution’ and ‘people’. In these templates benchmark data are presented that delineate scores in fixed intervals between zero and 100. Appropriately graduated benchmarks and/or status descriptions are assigned to each score interval. Scores are determined for each Type 1 template by identifying the benchmark and/or status description that best matches the current status or condition of the element being scored. Type 2 – best practice statements — type 2 scoring templates include ‘business recovery,’ ‘model management,’ and ‘data quality management.’ In these templates best practice statements are presented and a value between zero and 100 (deductible points) is

Execution Levels of automation versus manual workarounds; levels of repair rates; and the stability of core application(s).

People Stress, accountability, experience, depth of cover, and availability of staff

Level of automation or STP rate:

Average levels of overtime hours per person per month over last 3 months:

• • • • •

100% score 100 (best practice) 75% score 75 50% score 50 25% score 25 0% score zero

Average percentage of input rejection/ repair: • • • • •

0% score 100 (best practice) 5% score 75 10% score 50 25% score 25 50% score zero

Number of core system failures in year: • None score 100 (best practice) • 1 score 75 • 2 score 50 • 4 score 25 • > 12 score zero

• • • • •

20 or less score 100 (Best practice) 30 score 75 40 score 50 60 score 25 80 or more score zero

Percentage of temporary and new staff to total existing staff: • • • • •

0% score 100 (Best practice) 20% score 75 40% score 50 60% score 25 80% or more score zero

Percentage of activities/controls that can be performed by alternate staff: • • • • •

100% score 100 (Best practice) 90% score 75 75% score 50 50% score 25 25% or less score zero

Data quality management Faulty data is identified, researched, and eliminated in an acceptable timeframe

Business recovery Continuation of operations at an alternative site in a timeframe that is acceptable

Best practice score 100

Best practice score 100

Deduct following scores from best practice score if statement does not apply:

Deduct following scores from best practice score if statement does not apply:

• Business critical data elements validated to at least one independent source or imported through an approved source (100) • Expert resources positioned and empowered to enhance data through appropriate research (100) • Independent quality assurance applied to expert data enhancements (75) • Audit trail available for data validation provenance (50) • Defined and monitored process to escalate recurring issues (25) • Defined and monitored process to provide feedback to supplier source of recurring issues (25) • Automated controls within core application (25) • Data formatting standards exist for each of the defined data elements (10)

• Recovery or reactivation at alternative site in acceptable timeframe (100) • Formal business recovery plan (100) • End-to-end disaster simulation (75) • Plan complete and comprehensive (30) • Supervisory review of plan (20) • Key employees fully briefed (15) • Key employees active participation in disaster simulation (10) • Business recovery specialist review of plan (10) • Key employees’ contact details current (5) • Notification test performed (5) • Key employees ready access to offsite copy of plan (5)

Figure 6 – Sample best practice scoring templates (summarized)

53

Risk accounting — a next generation risk management system for financial institutions

assigned to each statement representing its relative risk mitigation impact. Each template is assigned a starting score of 100, and for each best practice statement that does not apply to the current status or condition the deductible points assigned to that statement are deducted from the starting score. The lowest possible score is zero.

Figure 5 shows a detailed model management scoring template that is aligned to the qualitative standards issued by the Basel Committee on Banking Supervision26 and relates to the market risk quantification scorecard shown in Figure A4. A further six summarized examples of best practice scoring templates relating to transaction processing and reference data sources are shown in Figure 6, where the deductible points in Type 2 templates for each best practice statement are shown in brackets.

The scoring templates are structured such that only one score, within a reasonable tolerance, is applicable to the status or condition of the process or component being scored. This characterizes the RMI as a true measurement metric as opposed to an assessment metric, reducing subjectivity in the measurement process. Where there is more than one subcategory within a primary category in a Type 1 template, a lower score displaces a higher score, because the condition that gave rise to the higher score is invariably impacted negatively by the condition represented by the lower score. For example, in the primary category 'execution,' a straight-through-processing (STP) rate of 100% is of limited value if the underlying business information system is highly unstable, characterized by 12 or more failures in a year.

In stress conditions it is assumed that the degree of reliance placed on each risk category in the prevention of operational failure differs. The method recognizes this differentiation by assigning category weightings to each of the primary risk categories. The category weightings for transaction processing, reference data sources, and market risk quantification are shown in Figures A2, A3, and A4. Note that category weightings are not necessarily consistent from scorecard to scorecard: control evaluation, for example, has a category weighting of 10 for transaction processing and 3 for reference data sources. This is consistent with the earlier discussion that transaction processing cycles are designed for value-bearing processes and, consequently, internal controls have greater risk mitigation significance there than in processes designed to maintain reference data, which are not value-bearing.

The calculations
The best practice scoring templates shown in Figures 5 and 6 are scored whereby each score represents the actual status relative to best practices. Scores are updated upon changes or dynamically through automated interfaces (e.g., people scores via the human resources system). Scores are then blended with two other weightings: the category weightings, on a scale of 1 to 10, shown in Figures A2, A3, and A4, which are calibrated according to the relative risk mitigation impact of each risk category; and the inherent risk, representing risk-weighted business processes or components. From these inputs risk metrics are calculated using the formulae below, where W = weightings, S = scores, VT = value table (Figure 4), PRT = product risk table (Figure 3), and BPST = best practice scoring templates (Figures 5 and 6):

Inherent risk RUs: InhRU = PRT_W × VT_W
Risk mitigation index: RMI = [Σ(BPST_S × BPST_W × InhRU) × 100] ÷ [Σ(100 × BPST_W × InhRU)]
Residual risk RUs: ResRU = [(100 − RMI) ÷ 100] × InhRU

26 Basel Committee on Banking Supervision, 2005, "Amendment to the Capital Accord to incorporate market risks," pp. 36-37
27 Akerlof, G. A., 1970, "The market for 'lemons': quality uncertainty and the market mechanism," Quarterly Journal of Economics, 84:3, 488-500
28 Brown, S. J., W. N. Goetzmann, B. Liang, and C. Schwarz, 2008, "Mandatory disclosure
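As a minimal sketch (the function and variable names are ours, not the paper's), the three formulae translate directly into code. The usage line checks the 'Product and service pricing' row of Figure A1, where an RMI of 63.5 on 1,350 inherent risk units implies a residual of 493 RUs.

```python
# Illustrative sketch of the three RU formulae; names are ours, not the method's.

def inherent_rus(prt_weight, vt_weight):
    # InhRU = PRT_W x VT_W (product risk table weight x value table weight)
    return prt_weight * vt_weight

def risk_mitigation_index(rows):
    # rows: iterable of (score, category_weight, inherent_rus) triples
    # RMI = [sum(S x W x InhRU) x 100] / [sum(100 x W x InhRU)]
    actual = sum(s * w * ru for s, w, ru in rows)
    best_practice = sum(100 * w * ru for _, w, ru in rows)
    return actual * 100 / best_practice

def residual_rus(rmi, inh_ru):
    # ResRU = [(100 - RMI) / 100] x InhRU
    return (100 - rmi) / 100 * inh_ru

# Figure A1, 'Product and service pricing': RMI 63.5 on 1,350 inherent RUs
print(round(residual_rus(63.5, 1_350)))  # → 493
```

Note that the RMI is a weighted average of template scores: a single row scored at 60 yields an RMI of exactly 60 regardless of its weights, and weights only matter when scores differ across rows.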

Conclusion
The method described in outline in this paper addresses the recent academic literature and the regulatory agenda in bank risk reporting. It does so by offering, in conjunction with current top-down practices, a bottom-up transactional method that yields tractable managerial information alongside established methods, and an extension of current financial reporting through additions to the underlying accounting system.

If techniques can be applied for the better management of risk factors, as described in our view of a risk accounting method and system, their disclosure and audit should add value from the perspective of the stakeholder community. Investors potentially face a 'market for lemons' problem,27 in which they have difficulty discerning effective management processes from the ineffective. Such problems may be compounded insofar as specialist and technical disclosures have no information content for outside investors.28 There is thus a quality signaling rationale for disclosures that effectively convey the truth of superior processes to non-specialist investors.29 At the same time, the process and the information generated by the risk accounting method outlined in this paper can both be subject to audit and external scrutiny, and correlated with actual loss experience over time, adding to their consistency and credibility.

To the extent that our method of risk accounting is successful, there is reassurance for regulators and a 'better markets' solution which, in the face of the current financial meltdown, is surely needed. The authors suggest that new directions are possible, and hope that this proposed method, even as a minimalist contribution, will stimulate others toward further research into these new directions.

and operational risk: evidence from hedge fund registration," Journal of Finance, 63:6, 2785-2815
29 Toms, S., 2002, "Firm resources, quality signals and the determinants of corporate environmental reputation: some UK evidence," British Accounting Review, 34, 257-282


Appendix

Collateralized Debt Obligations (CDOs)    Date: DD/MM/YYYY

Product summary                    | Inherent risk (risk units) | Risk mitigation index (RMI) | Residual risk (risk units) | Performed by    | Processing
Processing risk                    |        |      |        |                 |
Transaction processing risk        |        |      |        |                 |
Product and service pricing        | 1,350  | 63.5 | 493    | IB front office | Decentralized
Deal structuring                   | 1,350  | 55.2 | 605    | IB front office | Decentralized
Order management                   | 1,350  | 68.2 | 429    | IB front office | Decentralized
Pre-trade validation               | 1,350  | 62.3 | 509    | IB front office | Decentralized
Quote management                   | 1,350  | 73.4 | 359    | IB front office | Decentralized
Trade execution and capture        | 1,350  | 44.9 | 744    | IB front office | Decentralized
Cash management                    | 1,350  | 52.3 | 644    | Operations      | Centralized
Trade confirmation and matching*   | 1,350  | 60.0 | 540    | Operations      | Centralized
Position control and amendments    | 1,350  | 60.2 | 537    | Operations      | Centralized
Transaction reporting              | 1,350  | 63.2 | 497    | Operations      | Centralized
Credit limit monitoring            | 1,350  | 45.0 | 743    | Operations      | Centralized
Trading limit monitoring           | 1,350  | 62.4 | 508    | Operations      | Centralized
Trade settlements                  | 1,350  | 63.4 | 494    | Operations      | Centralized
Nostro reconcilement               | 1,350  | 72.8 | 367    | Operations      | Centralized
Trading account reconciliations    | 1,350  | 66.7 | 450    | Operations      | Centralized
G/L proofs and substantiation      | 1,350  | 73.3 | 360    | Operations      | Centralized
Management reporting               | 1,350  | 64.2 | 483    | Finance         | Centralized
Regulatory and external reporting  | 1,350  | 64.2 | 483    | Finance         | Centralized
Control totals                     | 24,300 | 62.0 | 9,245  |                 |
Transaction processing risk        | 1,350  | 62.0 | 514    |                 |
Data risk*                         |        |      |        |                 |
Client and counterparty            | 1,350  | 79.2 | 281    |                 |
Market data                        | 1,350  | 52.9 | 636    |                 |
Products and instruments           | 1,350  | 68.2 | 429    |                 |
Corporate events                   | 1,350  | 43.3 | 765    |                 |
Control totals                     | 5,400  | 60.9 | 2,111  |                 |
Data risk                          | 1,350  | 60.9 | 528    |                 |
Business information systems risk  |        |      |        |                 |
Integrated trading system          | 1,350  | 78.9 | 285    |                 |
Funds transfer system              | 1,350  | 65.4 | 467    |                 |
Global nostros system              | 1,350  | 65.0 | 473    |                 |
Global ledger system               | 1,350  | 82.3 | 239    |                 |
Funding and liquidity system       | 1,350  | 69.4 | 413    |                 |
Control totals                     | 6,750  | 72.2 | 1,877  |                 |
Business information systems risk  | 1,350  | 72.2 | 375    |                 |
Control totals                     | 36,450 | 63.7 | 13,233 |                 |
Total processing risk metrics      | 1,350  | 63.7 | 490    |                 |
Quantification risk                |        |      |        |                 |
Credit risk                        | 900    | 65.3 | 312    |                 |
Market risk*                       | 1,350  | 43.9 | 758    |                 |
Liquidity risk                     | 600    | 55.0 | 270    |                 |
Total quantification risk metrics  | 2,850  | 53.0 | 1,340  |                 |
Product operational risk           | 4,200  | 56.4 | 1,830  |                 |
Interest rate (non-trading) risk   | Trading product – IRRBB is N/A

(* scorecards for these rows are shown in Figures A2, A3, and A4)

Figure A1 – Sample product summary
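A consequence of the residual risk formula ResRU = [(100 − RMI) ÷ 100] × InhRU is that each control-total RMI in Figure A1 is implied by the RU totals alone. An illustrative check in Python (our sketch, not part of the method) on the transaction processing control totals:

```python
# The control-total RMI follows from total residual vs. total inherent RUs,
# since ResRU = (100 - RMI)/100 x InhRU. Totals are from Figure A1.
inherent_total = 24_300   # transaction processing control total, inherent RUs
residual_total = 9_245    # transaction processing control total, residual RUs
implied_rmi = (1 - residual_total / inherent_total) * 100
print(round(implied_rmi, 1))  # → 62.0, matching the reported control-total RMI
```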


Component: Trade confirmation and matching (inherent risk 1,350 risk units; date DD/MM/YYYY)

[Scorecard grid summarized; cell-level scores not reproduced.] Activities performed by two teams are scored against weighted risk categories (execution, people, management oversight, logical access management, physical access, policies and procedures, control evaluation, business recovery, and risk culture/management), with each activity carrying its own inherent risk weighting. Team A (control and distribute value-bearing instructions; prepare, capture, and control transactions; issue and match deal confirmations; prepare and issue internal reports; general administration) has a score aggregate of 2,710,500 against a best practice aggregate of 5,400,000: RMI 50.2 on 900 inherent RUs, residual risk 448 RUs. Team B (independent verification/validation; resolution of client-initiated queries; process transactions) has a score aggregate of 2,151,000 against 2,700,000: RMI 79.7 on 450 inherent RUs, residual risk 92 RUs. The component score aggregate of 4,861,500 against a best practice aggregate of 8,100,000 gives a component RMI of 60.0 and residual risk of 540 RUs.

Figure A2 – Sample transaction processing business component scorecard and calculations

Component: Reference data services (inherent risk 1,350 risk units; date DD/MM/YYYY)

[Scorecard grid summarized; cell-level category scores not reproduced.] Each reference data domain is scored against eleven weighted risk categories (weightings 10, 10, 5, 5, 4, 4, 4, 4, 4, 3, 3), including data management, quality management, people, risk culture, uniqueness, vendor data services, policies and procedures, and management information:

Reference data domain         | Inherent risk (RUs) | RMI  | Residual risk (RUs) | Score aggregate | Best practice aggregate
Client and counterparty data  | 1,350 | 79.2 | 281 | 5,987,250  | 7,560,000
Market data                   | 1,350 | 52.9 | 636 | 3,996,000  | 7,560,000
Products and instruments data | 1,350 | 68.2 | 429 | 5,157,000  | 7,560,000
Corporate events data         | 1,350 | 43.3 | 765 | 3,273,750  | 7,560,000
Component                     | 1,350 | 60.9 | 528 | 18,414,000 | 30,240,000

Figure A3 – Sample reference data source business component scorecard and calculation
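The RMI in each scorecard row is the actual score aggregate divided by the best practice aggregate, multiplied by 100. An illustrative check in Python (our sketch) against the client and counterparty data row of Figure A3:

```python
# RMI = actual score aggregate / best practice aggregate x 100
# Aggregates are from the Figure A3 client and counterparty data row.
actual_aggregate = 5_987_250
best_practice_aggregate = 7_560_000
rmi = actual_aggregate / best_practice_aggregate * 100
print(round(rmi, 1))  # → 79.2, matching the reported row RMI
```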

Component: Market risk (product: collateralized debt obligations)

[Scorecard grid summarized; cell-level category scores not reproduced.] Each product's market risk quantification process is scored against weighted risk categories including model management, policies and procedures, logical access management, control evaluation, execution, business recovery, management oversight, and people:

Product                                | Inherent risk (RUs) | RMI  | Residual risk (RUs) | Score aggregate | Best practice aggregate
Collateralized debt obligations (CDOs) | 1,350 | 43.8 | 758   | 2,011,500 | 4,590,000
Product 2                              | 875   | 61.2 | 340   | 1,820,000 | 2,975,000
Product 3                              | 950   | 59.4 | 386   | 1,919,000 | 3,230,000
Product 4                              | 1,105 | 27.6 | 800   | 1,038,700 | 3,757,000
Component                              | 4,280 | 46.7 | 2,283 | 6,789,200 | 14,552,000

Figure A4 – Sample market risk quantification business component scorecard and calculation
