ERM Flashcards — Part 4
Risk Assessment
Aug 8, 2017 · 17 min read
Final versions: [Pt.1] [Pt.2] [Pt.3] [Pt.4] [Pt.5] [Pt.6]
- Black Monday: October 1987, S&P 500 index fell by over 20%.
- Actual financial data versus normal distribution: More narrowly peaked with fatter tails than the normal distribution (leptokurtic). Returns are also heteroscedastic (volatility varies over time).
- Two ways of defining “extreme value”: 1. max value in a set of n losses (block maximum); 2. all losses exceeding a certain threshold.
- Two main results of extreme value theory: 1. distribution of block maxima is approximately described by the GEV family of distributions if n is sufficiently large; 2. tail of the distribution above a threshold u can be approximated for large values of u by the GPD.
- Three parameters of the GEV family: 1. alpha (location, shifts the distribution); 2. beta (scale, stretches it); 3. gamma (shape, governs tail behavior).
- Three distributions in the GEV family: 1. Gumbel (gamma=0); 2. Weibull (gamma<0); 3. Frechet (gamma>0).
- Gumbel GEV characteristics: Tail falls exponentially.
- Weibull GEV characteristics: Finite upper bound (absolute max), can fit reinsured losses and natural phenomena such as temperature, wind speed, age of a human population.
- Frechet GEV characteristics: Have a lower bound and heavy tails (with infinite variance if gamma>0.5) that follow a power law. This is typically the distribution used to model extreme financial loss events.
- Underlying distributions for Weibull GEV: Underlying distributions with finite upper limits: 1. beta; 2. uniform; 3. triangular.
- Underlying distributions for Gumbel GEV: Light-tailed distributions with finite moments of all orders: 1. chi-squared; 2. exponential; 3. gamma; 4. log-normal; 5. normal; 6. Weibull.
- Underlying distributions for Frechet GEV: Heavy-tailed distributions whose higher moments can be infinite: 1. Burr; 2. F; 3. log-gamma; 4. Pareto; 5. t.
- Return level approach for GEV: Select the max observation in each block.
- Return period approach for GEV: Count the observations in each block that exceed some set level.
- Trade-off considerations for block size selection in return level approach for GEV: Granularity versus variance of estimates. A larger number of (smaller) blocks means fewer observations per block, so each maximum carries less information about genuinely extreme values, but the greater number of maxima lowers the variance of parameter estimates. Fewer (larger) blocks give more information about extreme values but greater variance in parameter estimates.
- Two key limitations of the GEV approach: 1. a lot of data is ignored; 2. choice of block size is subjective.
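A minimal Python sketch of the block-maximum approach, using simulated heavy-tailed losses (a t-distribution stand-in, not real market data) and scipy's `genextreme`, whose shape parameter `c` equals minus the gamma convention used above:

```python
# Block-maxima sketch: split a loss series into blocks, take each block's
# maximum, and fit a GEV distribution. Data are simulated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
losses = rng.standard_t(df=4, size=250 * 20)  # ~20 "years" of daily losses

block_size = 250  # one block per year of trading days
maxima = losses.reshape(-1, block_size).max(axis=1)

# scipy's genextreme uses shape c = -gamma, so gamma > 0 (Frechet,
# heavy tails) corresponds to a fitted c < 0.
c, loc, scale = stats.genextreme.fit(maxima)
print(f"gamma = {-c:.3f}, alpha (location) = {loc:.3f}, beta (scale) = {scale:.3f}")
```

With only 20 block maxima the estimates are noisy, which illustrates the data-hungriness limitation noted above.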
- Parameters for GPD: 1. beta (scale); 2. gamma (shape).
- Characteristics of the GPD (for different gammas): 1. always has a finite lower bound; 2. has a finite upper bound when gamma<0; 3. equivalent to the Pareto distribution when gamma>0 (and to the exponential when gamma=0).
- Asymptotic property for GEV/GPD: When the maxima of a distribution converge to a GEV distribution, the excess distribution converges to a GPD with the same shape parameter.
- Memory-less property of exponential distribution: Expected future waiting time for an event to occur is independent of the waiting time already elapsed.
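Formally, for an exponentially distributed waiting time T with rate lambda:

```latex
P(T > s + t \mid T > s)
  = \frac{P(T > s + t)}{P(T > s)}
  = \frac{e^{-\lambda(s+t)}}{e^{-\lambda s}}
  = e^{-\lambda t}
  = P(T > t)
```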
- Selecting threshold for GPD: Determine the lowest threshold above which the mean excess function is linear in u. Typically around 90–95th percentile of complete distribution.
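A minimal sketch of this threshold diagnostic in Python, again on simulated losses; `mean_excess` is a helper defined here, and scipy's `genpareto` shape parameter corresponds directly to gamma:

```python
# Inspect the empirical mean excess function over candidate thresholds,
# then fit a GPD to the exceedances. Data are simulated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
losses = rng.standard_t(df=4, size=10_000)

def mean_excess(data, u):
    """Average amount by which observations exceed the threshold u."""
    return (data[data > u] - u).mean()

# Pick the lowest u above which e(u) looks linear (here, ~90-95th percentile).
for q in (0.80, 0.85, 0.90, 0.95):
    u = np.quantile(losses, q)
    print(f"u = {u:.3f} (q = {q:.0%}): e(u) = {mean_excess(losses, u):.3f}")

u = np.quantile(losses, 0.95)
exceedances = losses[losses > u] - u
# floc=0 fixes the location so only gamma (shape) and beta (scale) are fitted.
gamma, _, beta = stats.genpareto.fit(exceedances, floc=0)
print(f"gamma (shape) = {gamma:.3f}, beta (scale) = {beta:.3f}")
```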
- Stochastic uncertainty: Uncertainty arising from the randomness of a finite set of observations.
- Common errors in the use of models: 1. inappropriate projection of past trends into future (errors in historical data, incomplete data, heterogeneity in data); 2. inappropriate underlying distribution (insufficient data, not investigating range of alternative distributions); 3. number of parameters incorrect (over-simplified, does not follow principle of parsimony).
- Ways to allow for parameter uncertainty through dynamic simulations: 1. use a multivariate normal distribution for the parameters (if a covariance matrix is available); 2. use a joint distribution for the parameters (approximated by repeatedly re-fitting the parameters to bootstrapped data sets drawn from the original data).
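A minimal sketch of the second approach, assuming normally distributed returns purely for illustration: resample the original data with replacement, re-fit, and use the resulting cloud of parameter estimates as an empirical joint distribution:

```python
# Bootstrap approximation of the joint distribution of fitted parameters.
import numpy as np

rng = np.random.default_rng(1)
returns = rng.normal(0.05, 0.15, size=120)  # stand-in for historical data

fits = []
for _ in range(5_000):
    sample = rng.choice(returns, size=returns.size, replace=True)
    fits.append((sample.mean(), sample.std(ddof=1)))

fits = np.array(fits)
# Each simulation run can now draw (mu, sigma) jointly from `fits`
# instead of treating the fitted parameters as known constants.
print("mean of mu estimates:", fits[:, 0].mean())
print("empirical parameter covariance:\n", np.cov(fits.T))
```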
- Primary objective of building an ERM model: Enable the actuary or risk manager to give the organization appropriate advice on managing risks in a financially sound way.
- Use of model points: Policies are grouped so that those in a group are expected to produce similar results when the model is run. It is then sufficient to run a single representative policy from each group (the model point) through the model and scale up to derive results for the total set of policies.
- Requirements for actuarial models: 1. model must be valid, sufficiently rigorous for its purposes; 2. adequately documented; 3. model points should reflect distribution of business being modeled; 4. components of model allow for all features of the business that could affect advice being given; 5. input parameter values should be appropriate to the business; 6. workings of the model should be easy to appreciate and communicate; 7. outputs should be capable of independent verification; 8. capable of subsequent development; 9. not overly complex.
- Requirements for actuarial models for ERM: Generic requirements plus 1. model should be amenable to an analysis of the impact of parameter uncertainty; 2. simulations should exhibit behaviors consistent with past events; 3. shortcomings of the model should be clearly stated; 4. reflect dynamics of the organization, now and in the future; 5. be comprehensive across all important well-defined risks; 6. produce outcomes that are balanced.
- Sweeting’s model requirements: 1. be appropriate for purpose; 2. be robust; 3. be as simple as possible, while meeting its purpose; 4. be developed over time; 5. sometimes be avoided when more importance is attached to other activities.
- Purposes of ERM models: 1. price products or services; 2. assess economic value of company; 3. estimate possible volatility of future profits and earnings; 4. determine capital adequacy requirements; 5. project future capital or solvency position; 6. assess effect of risk management and mitigation techniques on profits and capital requirements; 7. assess effect of strategic decisions; 8. evaluate projects.
- Steps in developing and applying model: 1. specify purpose; 2. collect data, group and modify as necessary; 3. choose model form, identify parameters; 4. select appropriate time period; 5. estimate parameters and correlations; 6. ensure acceptable goodness of fit; 7. ensure model is able to project all required outputs; 8. run model using estimated variables or stochastic simulations; 9. output results in appropriate format; 10. perform sensitivity tests.
- Considerations in the discussion of model outputs: 1. corporate risk policy; 2. corporate risk appetite; 3. corporate risk preferences; 4. risk preferences and other investment opportunities of company owners; 5. qualitative factors, judgment and intuition.
- Key characteristics of a time series for returns on individual equities: 1. returns are rarely independent and identically distributed; 2. volatility appears to vary over time; 3. extreme returns appear in clusters; 4. return series are leptokurtic (heavy-tailed) and non-symmetric.
- Key characteristics of a time series for returns on equity portfolios: 1. correlations exist between returns of different portfolios at the same point in time; 2. correlations between different series vary over time; 3. multivariate returns data show little evidence of cross-correlation (between time periods); 4. multivariate series of absolute or squared returns do show strong evidence of cross-correlation; 5. extreme returns in one series often coincide with extreme returns in several other series.
- Characteristics of equity returns over a long time period: 1. volatility clustering is less marked; 2. returns appear to be more iid and less heavy-tailed (expected from the Central Limit Theorem).
- Excess kurtosis of daily financial data: Kurtosis measures “peakedness” of a distribution. The kurtosis for daily financial data appears to be higher than that of a normal distribution, which means more of the variance is due to infrequent extreme variations.
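A quick check in Python: scipy's `kurtosis` reports excess kurtosis (normal = 0) by default, and a Student's t stand-in for heavy-tailed daily returns shows the effect described above:

```python
# Excess kurtosis of simulated "daily returns"; the t-distribution is
# only an illustrative stand-in for heavy-tailed financial data.
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(5)
normal_returns = rng.normal(size=100_000)
heavy_returns = rng.standard_t(df=5, size=100_000)

print(f"normal:  {kurtosis(normal_returns):+.2f}")  # ~ 0
print(f"t(df=5): {kurtosis(heavy_returns):+.2f}")   # ~ +6 (= 6 / (df - 4))
```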
- Linked variables considered in a factor-based approach to modeling corporate bond yields: 1. risk-free yield; 2. coupon rates; 3. credit spread.
- Steps to model portfolio returns using multivariate normal distribution: 1. decide on frequency of calculation (daily, weekly); 2. decide on timeframe of historical data to use (volume versus relevance); 3. for each asset class, choose a total return index; 4. calculate log-returns; 5. calculate average returns and variance of each asset class and covariances between classes; 6. simulate series of returns with the same characteristics based on a multivariate normal distribution.
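A minimal sketch of steps 4–6 in Python; the price history is simulated here as a stand-in for the chosen total return indices:

```python
# Estimate means/covariances from log-returns, then simulate correlated
# return series from a multivariate normal.
import numpy as np

rng = np.random.default_rng(7)
n_days, n_assets = 1_000, 3
# Stand-in for total return index levels (step 3).
prices = np.cumprod(1 + rng.normal(0.0003, 0.01, (n_days, n_assets)), axis=0)

log_returns = np.diff(np.log(prices), axis=0)           # step 4
mu = log_returns.mean(axis=0)                           # step 5: means
cov = np.cov(log_returns, rowvar=False)                 # step 5: covariances

simulated = rng.multivariate_normal(mu, cov, size=250)  # step 6
print(simulated.shape)  # 250 simulated periods x 3 asset classes
```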
- Reason for usefulness of PCA for modeling bond returns with various durations: Changes in bond yields can be explained largely by shifts in just a couple of factors.
- Steps to model portfolio returns using PCA: #1–5 same as for using multivariate normal distribution; 6. derive matrix of deviations from average returns; 7. derive principal components that explain a sufficiently high proportion of the deviations from average past returns; 8. project this number of independent, normally distributed random variables using the associated eigenvalues as variances; 9. weight projected series of deviations with eigenvectors; 10. add these weighted projected deviations to expected returns for each asset class.
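A minimal sketch of steps 6–10, using simulated log-returns on bonds of three durations as a stand-in for real index data:

```python
# PCA on deviations from average returns, projection of independent
# normals scaled by eigenvalues, and recombination via eigenvectors.
import numpy as np

rng = np.random.default_rng(8)
corr = np.array([[1.0, 0.9, 0.8],
                 [0.9, 1.0, 0.9],
                 [0.8, 0.9, 1.0]])
log_returns = rng.multivariate_normal([0.0002, 0.0003, 0.0004],
                                      1e-4 * corr, size=1_000)

mu = log_returns.mean(axis=0)
deviations = log_returns - mu                      # step 6

eigenvalues, eigenvectors = np.linalg.eigh(np.cov(deviations, rowvar=False))
order = np.argsort(eigenvalues)[::-1]              # largest first
eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]

k = 2                                              # step 7
print(f"variance explained: {eigenvalues[:k].sum() / eigenvalues.sum():.1%}")

# Steps 8-9: independent N(0, eigenvalue) projections, weighted by eigenvectors.
z = rng.normal(size=(250, k)) * np.sqrt(eigenvalues[:k])
simulated = mu + z @ eigenvectors[:, :k].T         # step 10
```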
- Term premium: Part of the risk premium that is a function of term. Will vary from market to market and investor to investor.
- Factors reflected by credit spread: 1. expected probability of default and expected loss given default; 2. risk premium (uncertainty) attached to risk of default and LGD; 3. liquidity premium.
- Three common measures of credit spread: 1. nominal spread; 2. static spread; 3. option-adjusted spread.
- Nominal measure of credit spread: Difference between the gross redemption yields of risky and risk-free bonds.
- Static measure of credit spread: Addition to the risk-free rate at which discounted cashflows from a risky bond will equate to its price.
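As a sketch, the static spread can be found with a root solver; the price, coupon schedule, and flat risk-free rate below are illustrative assumptions:

```python
# Solve for the flat spread s over the risk-free rate that reprices
# a risky 5-year, 5%-coupon bond.
from scipy.optimize import brentq

price = 92.0                      # observed price of the risky bond
cashflows = [(t, 5.0) for t in range(1, 5)] + [(5, 105.0)]
risk_free = 0.03                  # flat risk-free rate, for simplicity

def pv(spread):
    return sum(cf / (1 + risk_free + spread) ** t for t, cf in cashflows)

static_spread = brentq(lambda s: pv(s) - price, 0.0, 0.20)
print(f"static spread = {static_spread:.4%}")
```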
- Option-adjusted measure of credit spread: Static measure plus stochastic modeling to allow for any options embedded in the bond.
- Reasons why observed market credit spreads tend to be higher than can be justified by actual historic bond defaults: 1. higher volatility of returns relative to risk-free asset (credit beta); 2. higher uncertainty of potential future returns; 3. greater skewness of potential future returns on corporate debt (more significant downside); 4. lower liquidity of corporate debt; 5. lower marketability of corporate debt (and higher costs of trade); 6. differences in taxation.
- Features of a good benchmark for market risk: 1. unambiguous; 2. investable and trackable; 3. measurable on a reasonably frequent basis; 4. appropriate (to investor’s objectives); 5. reflective of current investment opinion; 6. specified in advance; Optional 7. contain high proportion of assets held in portfolio; 8. have a similar investment style to portfolio; 9. low turnover of constituents; 10. investable position sizes; 11. behaves like portfolio; 12. low correlation between risky-benchmark and market-benchmark; 13. variability of portfolio relative to benchmark is lower than portfolio relative to market.
- Strategic risk for market risk: Risk of poor performance of benchmark used to judge manager’s performance, relative to liability-based benchmark.
- Active risk: Risk of poor performance of manager’s actual portfolio relative to strategic benchmark.
- Active return: Difference between return on actual portfolio and return on benchmark.
- Brennan-Schwartz model: 1. changes in short-term rates vary in line with the steepness of the yield curve (the differential between long- and short-term rates), for some alpha and beta; 2. volatility of short-term rates varies in proportion to the most recent level of short-term rates; 3. changes in long-term rates vary in proportion to the square of the level of long-term rates, and are influenced by short-term rates through the product term; 4. volatility of long-term rates varies in proportion to the most recent level of long-term rates.
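A hedged Euler-scheme sketch consistent with this description; the parameters alpha, beta, a, b and the volatilities are illustrative, and the precise drift specification in Brennan and Schwartz's original paper differs in detail:

```python
# Euler discretization of a two-factor (short rate r, long rate l) model:
# short-rate drift driven by the long-short differential, long-rate drift
# with squared and product terms, volatilities proportional to rate levels.
import numpy as np

rng = np.random.default_rng(3)
alpha, beta = 0.0, 0.5          # short-rate drift: alpha + beta * (l - r)
a, b = -0.5, 0.5                # long-rate drift: a*l^2 + b*r*l
sigma_r, sigma_l = 0.10, 0.05   # proportional volatilities
dt, n_steps = 1 / 252, 252

r, l = 0.02, 0.04               # initial short and long rates
for _ in range(n_steps):
    dz_r, dz_l = rng.normal(scale=np.sqrt(dt), size=2)
    r += (alpha + beta * (l - r)) * dt + sigma_r * r * dz_r
    l += (a * l**2 + b * r * l) * dt + sigma_l * l * dz_l

print(f"simulated short rate: {r:.4f}, long rate: {l:.4f}")
```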
- Two approaches to modeling contagion risk: 1. consider contagion as a feedback risk (some serial correlation exists that can be modeled), though in theory this should be eliminated through arbitrage and can thus be ignored; 2. fit a t-copula with situation-dependent correlations.
- Approaches to modeling market returns: 1. historical simulations (bootstrapping); 2. forward-looking data-based approaches (multivariate normal distribution); 3. forward-looking factor-based approaches (PCA); If data is sufficient 4. multivariate distribution other than normal; 5. combining non-normal marginal distributions using a copula.
- Events that could be counted as default: 1. a payment due is missed; 2. a financial ratio moves above or below a certain level; 3. legal proceedings start against the credit issuer; 4. present value of assets falls below that of liabilities.
- Three components of default risk assessment: 1. probability of default; 2. loss on default; 3. level and nature of interactions between various credit exposures and other risks in portfolio.
- Two methods of strengthening recoveries: 1. requiring collateral; 2. requiring third party guarantees.
- Actions taken to minimize credit risk: 1. diversify credit exposure across a number of counterparties; 2. monitor exposures regularly; 3. take immediate action when default occurs.
- Four sources of information used to assess credit risk: 1. credit issuer (rating agencies); 2. counterparty; 3. publicly available data; 4. proprietary databases.
- Factors for qualitative credit models: 1. nature of contractual obligation; 2. level and nature of any security; 3. nature of the borrower; 4. economic indicators; 5. financial ratios; 6. face-to-face meetings with credit issuer and/or counterparty; 7. how risks change over time.
- Key advantage of qualitative credit models: A wide range of subjective factors can be incorporated into the assessment.
- Disadvantages of qualitative credit models: 1. excessive subjectivity; 2. lack of consistency between ratings; 3. meaning of subjective ratings may change over economic cycle; 4. ratings may fail to respond to changes in the economy or circumstances of the counterparty.
- Examples of quantitative credit models: 1. credit-scoring; 2. structural (firm value); 3. reduced-form; 4. credit-portfolio; 5. credit-exposure.
- Credit scoring models: Forecast the likelihood of a counterparty defaulting at a particular point in time given certain fundamental information about the counterparty (empirical models, expert models).
- Structural models (firm-value): Estimate the likelihood of default using market information such as the company’s share price and volatility of its share price (Merton, KMV).
- Reduced form models: Model default as a statistical process that typically depends on economic variables (credit migration models).
- Credit portfolio models: Estimate credit exposure across several counterparties, may allow for diversification effects of uncorrelated creditors (multivariate structural, multivariate credit migration).
- Credit exposure models: Estimate credit exposure for more complex circumstances, like when derivatives and guarantees are present (Monte Carlo).
- Common difficulties of quantitative credit models: 1. lack of publicly available data on default experience; 2. skewness of the distribution of credit losses; 3. correlation of defaults between different counterparties; 4. model risk.
- Merton model: Structural model using option pricing theory along with equity share price volatility to derive information on the value of company assets (and hence debt). Equity shares are treated as a call option on the company's assets; the value of debt then equals the value of a risk-free bond less the value of a put option on the company's assets.
- Key advantage of using the Merton model for credit risk: Allows us to estimate an appropriate credit spread for a bond, even when the bond is unquoted.
- Disadvantages of using the Merton model for credit risk: Assumes 1. markets are frictionless; 2. risk-free rate is deterministic; 3. asset value follows a log-normal random walk with fixed rate of growth and fixed volatility; 4. asset value is an observable traded security; 5. bond is a zero-coupon bond with only one default opportunity; 6. default results in liquidation. Also, results can be affected by market sentiment.
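A minimal sketch of the Merton decomposition using Black-Scholes formulas; the asset value, face value, volatility, and rate are illustrative assumptions:

```python
# Equity as a call on firm assets; risky debt as a risk-free zero-coupon
# bond minus a put on the assets.
import numpy as np
from scipy.stats import norm

V, K = 120.0, 100.0    # asset value and face value of zero-coupon debt
r, sigma, T = 0.03, 0.25, 1.0

d1 = (np.log(V / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
d2 = d1 - sigma * np.sqrt(T)

equity = V * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)  # call value
put = K * np.exp(-r * T) * norm.cdf(-d2) - V * norm.cdf(-d1)
debt = K * np.exp(-r * T) - put                                # = V - equity

# Credit spread implied by the value of the risky debt:
spread = -np.log(debt / K) / T - r
print(f"equity = {equity:.2f}, debt = {debt:.2f}, spread = {spread:.4%}")
```

The implied spread is what makes the model useful for unquoted bonds, as noted above.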
- Distance to default: Used in the KMV model, it is the number of standard deviations that the company assets have to fall in value before breaching a threshold.
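One common formulation over horizon T (a simplified version, (V_0 - K) / (V_0 sigma_V), is also seen in practice):

```latex
\text{DD} = \frac{\ln(V_0 / K) + \left(\mu - \tfrac{1}{2}\sigma_V^2\right) T}{\sigma_V \sqrt{T}}
```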
- Advantages of the KMV model over the Merton model: 1. coupon-paying bonds can be modeled; 2. more complex liability structures can be accommodated (uses average coupon and overall gearing level); 3. starting asset values are not assumed to be observable and are instead derived from the value of the company's equity shares.
- Credit migration modeling process: 1. historical data is used to determine the probability that a company with a given rating at the start of the year will have each possible rating at the end of the year (recorded in rating transition probability matrices); 2. matrices are applied repeatedly to a counterparty's current rating to estimate the likelihood of each possible rating in each future year; 3. using the probability of default for a company of a given rating, the model estimates the chance of default in each future year.
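A minimal sketch with a toy three-state rating system (A, B, and default); the transition matrix is illustrative, not calibrated:

```python
# Apply a one-year rating transition matrix repeatedly to estimate
# cumulative default probabilities (default is an absorbing state).
import numpy as np

P = np.array([
    [0.90, 0.08, 0.02],   # from A: stay A, downgrade to B, default
    [0.10, 0.80, 0.10],   # from B
    [0.00, 0.00, 1.00],   # default is absorbing
])

state = np.array([0.0, 1.0, 0.0])  # counterparty currently rated B
for year in range(1, 6):
    state = state @ P              # step 2: apply the matrix repeatedly
    print(f"year {year}: P(default by now) = {state[2]:.3f}")
```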
- Two advantages of the credit migration approach: 1. volatile equity markets should not overly impact results; 2. model does not rely on publicly traded share information.
- Assumptions of the credit migration approach: 1. credit migration process follows a time-homogeneous Markov chain; 2. there exists a credit rating that reflects the company’s default likelihood through the business cycle.
- Disadvantages of the credit migration approach: 1. incorrect time-homogeneity assumption (recently downgraded company is more likely to be downgraded again); 2. assumes default probabilities for each rating in each future year can be estimated; 3. assumes likelihood of default can be determined solely by credit rating; 4. low number of distinct credit ratings results in a low level of granularity in default estimates; 5. rankings of organizations by different credit rating agencies do not always coincide; 6. not all companies have obtained a credit rating; 7. ratings are sometimes unavailable (withdrawn).
- CreditMetrics approach: Estimates the variance of a bond's future value in one year's time by combining default probabilities from credit migration models with estimated future values at each rating.
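A minimal numeric sketch of the idea, with illustrative migration probabilities and end-of-year values for a bond currently rated B:

```python
# Combine one-year migration probabilities with estimated bond values in
# each end-of-year rating to get the mean and variance of the bond's value.
import numpy as np

# End-of-year ratings for a bond currently rated B: A, B, D (default).
probs = np.array([0.10, 0.80, 0.10])
values = np.array([104.0, 100.0, 45.0])   # value in one year, incl. recovery

mean = probs @ values
variance = probs @ (values - mean) ** 2
print(f"E[V] = {mean:.2f}, SD[V] = {np.sqrt(variance):.2f}")
```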
- Assumptions of the multivariate implementation of CreditMetrics: 1. each credit rating has an associated probability of default; 2. change in rating is a function of a change in the value of an organization’s assets and the volatility of the value of those assets; 3. value of assets behaves log-normally; 4. correlation between asset values can be estimated from the correlation between corresponding equity values; 5. equity returns can be modeled using country-specific indices and independent firm-specific volatility.
- Examples of credit portfolio models: 1. multivariate structural; 2. multivariate credit-migration (or financial) models; 3. econometric or actuarial models; 4. common shock models; 5. time-until-default (or survival) models.
- Multivariate structural models: Multivariate KMV, model asset values by using correlation matrices or copulas.
- Multivariate credit migration models: Multivariate CreditMetrics, assumes equity returns can be modeled using country-specific indices and firm specific volatility.
- Econometric or actuarial models for credit risk: Estimates default rate of firms using external or empirical data.
- Common shock models for credit risk: Determine probability of no defaults by assuming each bond defaults in line with a Poisson process, and considering shocks that cause the default of one or more of the bonds in the portfolio.
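In the simplest specification, where bond i defaults at the first jump of its own Poisson process (rate lambda_i) or of an independent market-wide shock process (rate lambda_c) affecting all bonds, the probability of no defaults by time t is:

```latex
P(\text{no defaults by } t) = \exp\!\left(-\Big(\lambda_c + \sum_i \lambda_i\Big)\, t\right)
```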
- Time until default models for credit risk: Survival CDFs are linked by a suitably parameterized copula function to estimate the aggregate default rate for the bond portfolio.
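A minimal sketch of the survival-copula idea, assuming exponential marginal default times linked by a Gaussian copula (a t-copula would be handled analogously); the hazard rates and correlation are illustrative:

```python
# Simulate correlated default times: correlated normals -> uniforms via
# the normal CDF (the copula) -> exponential default times via inversion.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(11)
lambdas = np.array([0.02, 0.03, 0.05])         # illustrative hazard rates
corr = np.full((3, 3), 0.4) + 0.6 * np.eye(3)  # pairwise correlation 0.4

n_sims = 100_000
z = rng.multivariate_normal(np.zeros(3), corr, size=n_sims)
u = norm.cdf(z)                                # correlated uniforms
default_times = -np.log(1 - u) / lambdas       # inverse exponential CDF

horizon = 5.0
any_default = (default_times < horizon).any(axis=1).mean()
print(f"P(at least one default within {horizon:.0f}y) = {any_default:.3f}")
```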
- Two common measures of recovery: 1. price after default (short term measure); 2. ultimate recovery (larger, long term measure).
- Factors affecting the likely loss on default: 1. seniority of debt; 2. availability of collateral; 3. nature of industry; 4. point in economic cycle; 5. legal jurisdiction; 6. rights and actions of other creditors.
- Seven reasons for increased interest in active management of operational risks: 1. advent of ERM; 2. introduction of new regulatory capital requirements; 3. increasing emphasis on sophisticated quantitative models for other types of risk; 4. operational risks have no inherent upside potential; 5. main driver behind many cases of major financial disasters; 6. inter-linked with credit and market risk and is particularly important to minimize during already stressed market conditions; 7. may otherwise be treated differently in different areas of the company.
- Benefits of effective operational risk management: 1. minimizes impact of reputational damage (distinctive benefit); 2. minimizes day-to-day losses and reduces potential for more extreme incidents; 3. improves company’s ability to meet its business objectives (reduces time spent on crisis management); 4. strengthens overall ERM process and framework.
- Initial analysis of publicly available data on operational risks: 1. distribution of operational losses is skewed to the right (large number of small losses, few large losses); 2. loss severities have a heavy tailed distribution; 3. losses occur randomly in time; 4. loss frequency may vary considerably over time.
- Features of operational risk data that make applying statistical methods difficult: 1. low volume of data; 2. data may have an underlying cyclical component and/or depend on current economic conditions.
- Bottom up model to assess operational risks: Estimates operational risk capital by starting the analysis at a detailed, granular level (e.g. individual business units or processes) and then aggregating the results to obtain an overall assessment.
- Top down model to assess operational risks: Use readily-available data and fairly simple calculations to give a general picture of operational risks.
- Steps in a scenario analysis used to assess operational risks: 1. group risk exposures into broad categories; 2. for each group, develop a plausible adverse scenario; 3. for each scenario, calculate the consequences of the risk event; 4. calculate total costs for all scenarios; 5. assessments of frequency and severity can be displayed on a risk map.
- Benefits of using scenario analysis to assess operational risk: 1. capture opinions, concerns and experience of risk managers; 2. not rely heavily on availability, accuracy and relevance of historical data; 3. provide opportunity to identify hard to predict, high impact events (black swans); 4. identify and improve understanding of cause and effect relationships; 5. reduce risk reward arbitrage opportunities.
- Main advantage of bottom up models for operational risks: Give a more robust picture of a company's overall risk profile.
- Limitations of bottom up models for operational risks: 1. difficult to break down reported aggregate losses into their constituent components; 2. may be little robust internal historic data, especially for low probability and high severity events; 3. differences between companies mean that application of external data is difficult.
- Basel AMA operational risk assessment: Under the Basel advanced measurement approach, operational risk is assessed using internal models and scenario analysis, with a 1-year holding period and a 99.9% confidence level.
- Three areas requiring credible data to apply the AMA approach to operational risks: 1. internal data on repetitive, high-frequency losses over a three-to-five-year period; 2. external data on non-repetitive, low-frequency losses; 3. suitable stress scenarios to consider.
- Advantage of the implied capital model for operational risks: Simple and forward looking.
- Limitations of the implied capital model for operational risks: 1. total risk capital needs to be estimated; 2. inter-relationships between different types of risk are ignored; 3. does not capture cause and effect scenarios.
- Advantage of income volatility model over implied capital model for operational risks: There is better data availability for total income volatility than for total risk capital needed.
- Limitations of the income volatility model for operational risks: 1. ignores rapid evolution of companies and industries; 2. does not capture the softer measures of risk (opportunity cost, reputation) by focusing on income rather than value.
- Advantage of CAPM over the income volatility model for operational risks: Includes both aggregate effect of specific risk events and the softer issues.
- Limitations of CAPM for operational risks: 1. no information is provided on losses due to specific risks; 2. level of operational risk capital is unaffected by any controls put in place (little motivation to improve risk management process); 3. tail end risks are not thoroughly accounted for; 4. does not help anticipate incidents.
- Stages in a comprehensive operational risk management process: 1. risk policy and organization; 2. risk identification and assessment; 3. capital allocation and performance measurement; 4. risk mitigation and control; 5. risk transfer and finance.
- Six components of a comprehensive operational risk management policy: 1. principles for operational risk management; 2. definitions and taxonomy for operational risk; 3. objectives and goals of operational risk management; 4. operational risk management processes and tools; 5. organizational structure as it applies to operational risk management; 6. roles and responsibilities of different business areas.
- Purpose of loss incident database for operational risks: Help company learn lessons from past loss incidents, analyze trends and support analysis of root causes and mitigation strategies.
- Controls self assessment: Internal analysis of key risks and their controls and management implications.
- Examples of top down models for operational risks: 1. implied capital; 2. income volatility; 3. economic pricing models (CAPM); 4. analogue models (use data from similar companies).
- Factor based approach for operational risks: Assume losses are related to the volume of transactions and apply a weighting to the actual or expected volume of transactions (Basel indicator and standardized approaches).
- Funding liquidity risk: Risk of money markets not being able to supply funding to a business when required.
- Market liquidity risk: Risk of a lack of capacity in the market to handle asset transactions at the time a deal is required.
- Reasons why quantitative techniques usually are not applied to liquidity risk: 1. historic data on liquidity crises is limited; 2. the degree and nature of every organization's exposure to liquidity risk is different.
- Factors included in cash inflows: 1. revenues/income generated by assets; 2. proceeds from sale of assets; 3. drawings upon sources of liquidity (issue of new debt or equity).
- Seven specific scenarios that should be considered to assess liquidity risk: 1. rising interest rates; 2. ratings downgrade; 3. large operational loss; 4. large single insurance claim occurrence; 5. loss of control over a key distribution channel; 6. impaired capital markets; 7. sudden termination of a large reinsurance contract.
- Level risk (or underwriting risk) in demographic risk: Risk that the particular underlying population’s loss costs are not as expected over the immediate future.
- Reserving risk in demographic risk: 1. volatility risk (finite policies); 2. catastrophe risk; 3. trend (or cycle) risk (future long term changes).
- Way to best model the risk of something sudden and temporary: Scenario analysis. More complex dependencies can be modeled by copulas.
- Key distinction between life and non-life insurance risk: Non-life policies may experience more than one claim and move through different states over the lifetime of the policy. Severity may also need to be modeled.
- Main methods of assessing liquidity risk: 1. scenario analysis (when and why expected cash outflows might exceed inflows); 2. stress testing (examine effect on liquidity of an extreme event).
