
How To Generate Large Covariance Matrices

The generation of large, positive semi-definite covariance matrices that properly reflect market conditions has challenged finance practitioners for several years. Since the 1996 amendment to the 1988 Basel Accord, which outlined the principles of internal models for the calculation of market-risk capital, generating the covariance matrices needed to calculate firm-wide Value at Risk estimates has been a major problem. In very large portfolios a risk factor model may be employed, but a covariance matrix is still required for all the risk factors of the portfolio.

Of the three methods in standard use (the covariance method for linear portfolios, Monte Carlo simulation and historical simulation), only historical simulation requires no covariance matrix. And even with historical simulation, covariance matrices are normally used in stress testing and scenario analysis. The need for large covariance matrices is not confined to the middle office of a large investment bank: traders in the front office also require these matrices to price and hedge complex option portfolios. Large covariance matrices also play a major role in investment analysis, because portfolio risk is determined by the covariance matrix of all the assets in the portfolio.

The RiskMetrics Methodology

J.P. Morgan and Reuters have risen to this challenge by providing the RiskMetrics data sets. J.P. Morgan launched the first version of RiskMetrics in October 1994 and over the next two years made several improvements to the methodology and the data. The data consist of three large covariance matrices of the returns to major foreign exchange rates, money market rates, equity indices, bonds and some key commodities. The statistical methodology for calculating these covariance matrices, and a description of how the data should be implemented in Value at Risk models, is given in the RiskMetrics technical document.

There are some disadvantages with these data. One problem with the two exponentially weighted matrices is that the same smoothing constant has to be used for all assets, otherwise the matrix would not be positive semi-definite. The smoothing constant in an exponentially weighted moving average corresponds to the persistence in volatility and correlation estimates; it has been set at 0.94 for the daily matrix and 0.97 for the monthly matrix for every market return. However, the volatility and correlation persistence characteristics are quite different in different markets, and they also change over time. For example, in some of the major equity markets GARCH models indicate that volatility has become less persistent during the last few years, and current daily data on the Standard & Poor's 500 indicate a volatility persistence of less than 0.9.
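As a rough numerical illustration (not the RiskMetrics implementation itself), the sketch below runs the standard EWMA variance recursion with the two smoothing constants quoted above. The simulated return series and the function name are assumptions made for the example.

```python
import numpy as np

def ewma_variance(returns, lam=0.94):
    # RiskMetrics-style recursion:
    #   sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_{t-1}**2,
    # seeded here with the squared first return.
    sigma2 = np.empty_like(returns)
    sigma2[0] = returns[0] ** 2
    for t in range(1, len(returns)):
        sigma2[t] = lam * sigma2[t - 1] + (1 - lam) * returns[t - 1] ** 2
    return sigma2

# Simulated returns, for illustration only.
rng = np.random.default_rng(0)
r = rng.normal(0.0, 0.01, size=500)
daily_vol = np.sqrt(ewma_variance(r, lam=0.94))    # RiskMetrics daily setting
monthly_vol = np.sqrt(ewma_variance(r, lam=0.97))  # RiskMetrics monthly setting
```

The closer the smoothing constant is to one, the longer a shock persists in the estimate, which is the sense in which the monthly setting of 0.97 is more persistent than the daily 0.94.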

A problem with the equally weighted matrix is that it suffers from ghost effects of extreme events that occurred at any time during the past 250 days. Thus a market crash, such as the one following Sept. 11, will have exactly the same effect on volatility and correlation estimates at the beginning of September 2002 as it did at the end of September 2001.
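The ghost effect is easy to reproduce numerically. In this small sketch the data are simulated and the crash return is an assumption; the point is only the shape of the resulting estimate.

```python
import numpy as np

# Equally weighted 250-day variance around a single extreme return.
rng = np.random.default_rng(1)
r = rng.normal(0.0, 0.01, size=600)
r[100] = -0.20  # hypothetical crash-day return

window = 250
var_eq = np.array([r[t - window:t].var() for t in range(window, len(r))])
# The estimate jumps when the crash enters the window and stays elevated,
# however calm the market is in the meantime, until the crash return drops
# out of the window 250 days later, at which point it falls abruptly.
```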

The Orthogonal Alternative

Principal component analysis (PCA) may be used to construct large covariance matrices that have many desirable characteristics. The method is computationally very simple: it takes the univariate volatilities of the first few principal components of a system of risk factors, together with the factor weights matrix of the principal components representation, and produces a full covariance matrix V = ADA' for the original system. Here the matrix A is a matrix of re-scaled factor weights and the diagonal matrix D holds either generalized autoregressive conditional heteroscedasticity (GARCH) or exponentially weighted moving average (EWMA) variances of the principal components.

The key to the success of the orthogonal method is to choose a reduced set of principal components: if there are n returns in the full covariance matrix, only k << n principal components are used. The higher principal components are not included because they are assumed to pick up only unnecessary noise in the system; correlation estimates in particular therefore become more stable over time. Another key to successful calibration of the model is to divide the data into relatively highly correlated blocks before the principal component analysis is applied. This would normally entail dividing the risk factors into country and product categories, and/or industry or credit categories. A full description of the method is beyond the scope of this article; further details may be found in the references, and a minimal sketch of the mechanics is given below.
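The sketch below shows the mechanics under simple assumptions: EWMA rather than GARCH variances for the components, a single correlated block, simulated data and k = 3. The function name and parameter choices are mine; this is an illustration, not a calibrated implementation.

```python
import numpy as np

def orthogonal_ewma_covariance(returns, k, lam=0.94):
    # Standardize the returns and extract the k largest principal components.
    X = returns - returns.mean(axis=0)
    std = X.std(axis=0)
    Z = X / std
    eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
    W = eigvec[:, np.argsort(eigval)[::-1][:k]]   # (n, k) factor weights
    P = Z @ W                                     # principal component series

    # EWMA variance of each retained component.
    d = np.empty(k)
    for i in range(k):
        s2 = P[0, i] ** 2
        for x in P[1:, i]:
            s2 = lam * s2 + (1 - lam) * x ** 2
        d[i] = s2

    # Rescale the weights and assemble V = A D A': all n(n+1)/2 variances
    # and covariances come from just k univariate EWMA variances.
    A = W * std[:, None]
    return A @ np.diag(d) @ A.T

# Simulated block of 30 correlated returns, for illustration only.
rng = np.random.default_rng(2)
common = rng.normal(0.0, 0.01, size=(500, 1))
returns = common + rng.normal(0.0, 0.003, size=(500, 30))
V = orthogonal_ewma_covariance(returns, k=3)
assert np.all(np.linalg.eigvalsh(V) >= -1e-12)   # positive semi-definite
```

Because V has the form ADA' with a non-negative diagonal D, it is positive semi-definite by construction.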


Advantages

There are many. The full covariance matrix is always positive semi-definite. Very few constraints are imposed on the movements in volatility and correlation: in particular, it is not necessary to impose the constraint that all volatility and correlation estimates have the same reaction and persistence to market shocks. Instead, the characteristics of an asset or risk factor are determined by its correlation with the rest of the system.

The computational burden is much lighter; all of the n(n+1)/2 variances and covariances are simple linear transformations of just k EWMA or GARCH variances, and in many systems k will be of the order of two or three. For example, a system of 400 risk factors has 400 × 401 / 2 = 80,200 distinct variances and covariances, yet all of them follow from updating just those few component variances.

The method allows one to implement multivariate GARCH for large systems, something that has previously been beyond the scope of any multivariate GARCH calibration.

If the method is applied with GARCH, rather than EWMA, variances for the principal components, one can generate covariance matrix term structures that are mean-reverting. That is, there is no need to apply the square-root-of-time rule and assume that volatilities and correlations are constant over all risk horizons. Instead, the usual GARCH analytic formulae for the term structure of volatility and correlation are applied, so that the h-day covariance matrix forecast converges to the long-term average as h increases.
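As an illustration of the mean reversion, the standard GARCH(1,1) analytic forecast can be coded as follows; the parameter values are hypothetical and chosen only for the example.

```python
import numpy as np

def garch_variance_forecast(omega, alpha, beta, sigma2_next, h):
    # j-step-ahead GARCH(1,1) forecast, for j = 1..h:
    #   E[sigma2_{t+j}] = long_run + (alpha + beta)**(j-1) * (sigma2_next - long_run)
    long_run = omega / (1.0 - alpha - beta)
    j = np.arange(1, h + 1)
    return long_run + (alpha + beta) ** (j - 1) * (sigma2_next - long_run)

# Hypothetical parameters with persistence alpha + beta = 0.95.
daily = garch_variance_forecast(omega=2e-6, alpha=0.05, beta=0.90,
                                sigma2_next=4e-4, h=250)
h_day_var = daily.cumsum()  # variance over an h-day horizon
# Annualized volatility term structure: converges to the long-run level
# as h grows, rather than staying flat as the square-root-of-time rule
# would imply.
term_vol = np.sqrt(h_day_var * 252 / np.arange(1, len(daily) + 1))
```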

Data may be difficult to obtain directly, particularly on new issues or on financial assets that are not heavily traded. When data on some of the variables in the system are sparse or unreliable, direct estimation of their volatilities and correlations may be very difficult. However, provided there is sufficient information to infer their factor weights in the principal component representation, their volatilities and correlations may be obtained using the orthogonal method.
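A minimal sketch of that last point, with made-up numbers: given the inferred (rescaled) factor weights of a thinly traded asset on the retained components, its variance and its covariance with any other asset follow from the same diagonal matrix D of component variances.

```python
import numpy as np

# Hypothetical numbers: D holds the variances of the k = 3 retained
# components, a_new the inferred factor weights of a thinly traded
# asset, a_old those of an asset with good data.
D = np.diag([4e-4, 1e-4, 5e-5])
a_new = np.array([0.8, 0.3, 0.1])
a_old = np.array([0.7, -0.2, 0.4])

var_new = a_new @ D @ a_new        # variance of the thinly traded asset
cov_new_old = a_new @ D @ a_old    # its covariance with the other asset
corr = cov_new_old / np.sqrt(var_new * (a_old @ D @ a_old))
```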

By using only the first few principal components to represent the system, the correlation estimates become more stable and less influenced by variation that would be better ascribed to noise in the data.

I predict that this method will soon become the industry standard for generating large, positive semi-definite covariance matrices. The model can be implemented very easily, and commercial software for orthogonal GARCH has been available from Algorithmics since 1997.

This week's Learning Curve was written by Professor Carol Alexander, chair of risk management and director of research at the ISMA Centre, University of Reading, U.K.

References

* Alexander, C. and A. Chibumba (1996) "Multivariate Orthogonal Factor GARCH," University of Sussex Discussion Papers in Mathematics.

* Alexander, C. (2000) "Orthogonal Methods for Generating Large Positive Semi-Definite Covariance Matrices," ISMA Centre Discussion Papers in Finance 2000-06, available from www.ismacentre.rdg.ac.uk.

* Alexander, C. (2001) Market Models: A Guide to Financial Data Analysis, John Wiley & Sons, www.wiley.co.uk/marketmodels.

* Alexander, C. (2001) "Orthogonal GARCH," in C. Alexander (ed.), Mastering Risk Volume 2, Financial Times Prentice Hall, pp. 21-38, www.financialminds.com.
