
Optimum-Complexity Portfolios: How To Invest In Turbulent Times


By ONTONIX QUANTITATIVE COMPLEXITY MANAGEMENT

Research confirms that high portfolio complexity negatively impacts mid- and long-term expected returns. This is because high complexity is a formidable source of fragility and, hence, vulnerability. In a turbulent economy, highly vulnerable portfolios and financial products are more exposed and more volatile. For this reason, a new portfolio design strategy based on complexity has recently been developed. In this paper we discuss balanced portfolios only, but the procedure described herein may easily be extended.

The uniqueness of complexity-based portfolio design lies in the fact that it takes into account, for the first time, the fundamental characteristic of our global economy: its complexity. Conventional approaches adopt simple statistical techniques to compute the so-called beta coefficient, a measure of the systematic risk of a portfolio or a security relative to the market as a whole. Complexity, on the other hand, provides information on how chaotic or predictable the evolution of a security or a portfolio is, as well as invaluable information on the structure of the interactions between portfolio components. Such information is crucial when it comes to designing and managing portfolios in highly turbulent regimes dominated by shocks, extreme events and sustained instability.

Complexity-based portfolio design works as follows. A pool of securities is defined, such as an index or a group of indices. Let us suppose, without loss of generality, that this pool is the DJIA (the Dow 30), composed of 30 stocks. The pool has a particular measure of complexity, which can be computed using the QCM (Quantitative Complexity Management) tool Ontonet™. The goal is to define a subset of these 30 stocks, say 15, such that its complexity is as low as possible. An example of the so-called Complexity Map of the DJIA, obtained on a particular date, is illustrated below.

[Figure: Complexity Map of the DJIA on a particular date]

The above map illustrates the structure of interaction between the thirty stocks; the size of each node on the diagonal is proportional to that stock's contribution to the total complexity of the DJIA. Clearly, by virtue of the dynamic nature of stock markets, the Complexity Map changes over time and should be recomputed on a regular basis, even in real time.

The selection of low-complexity portfolios, i.e. subsets of stocks belonging to a given basket, is an extremely difficult problem when approached from a classical optimization perspective. One way to define the problem is in combinatorial terms: given a set of N securities, select a subset of M such that its complexity is minimal. In the case of the DJIA, with N=30 and M=15, the number of combinations to analyse is C(N,M) = 155,117,520. For the NASDAQ 100, with N=100 and M=25, we obtain C(N,M) ≈ 2.425×10²³. For a large basket of, say, N=1000 securities and a portfolio size of M=25, the count grows to C(N,M) ≈ 4.764×10⁴⁹. Such cases are clearly beyond the reach of the most powerful supercomputers.
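These counts are easy to verify; the short Python check below reproduces the figures quoted above.

```python
from math import comb

# Number of distinct M-stock portfolios that can be drawn from a basket of N securities
for name, n, m in [("DJIA", 30, 15), ("NASDAQ 100", 100, 25), ("1000-stock basket", 1000, 25)]:
    print(f"{name:18s} C({n},{m}) = {comb(n, m):.3e}")

# DJIA               C(30,15)   = 1.551e+08   (155,117,520)
# NASDAQ 100         C(100,25)  = 2.425e+23
# 1000-stock basket  C(1000,25) = 4.764e+49
```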

Evidently, conventional optimization of portfolio complexity is impractical, and a different methodology must be devised. Such an alternative exists and is called Portfolio Complexity Profiling (PCP). It extracts low-complexity portfolios even from huge stock ensembles in minutes on a PC. It works as follows: once the Complexity Map has been obtained, one selects the stocks with the smallest contribution to the overall complexity (i.e. the smallest nodes) and incorporates them into a portfolio. This does not guarantee the absolute minimum of complexity, but it gets very close. The methodology may easily be applied to cases in which portfolios are built from baskets comprising thousands of securities.
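The selection step itself is simple once each stock's contribution to overall complexity is known. The sketch below assumes such per-stock contributions are supplied by a complexity engine (Ontonet™ in the article); the helper name `select_low_complexity` and the numerical values are purely illustrative.

```python
from typing import Dict, List

def select_low_complexity(contributions: Dict[str, float], m: int) -> List[str]:
    """Keep the m stocks with the smallest contribution to total basket
    complexity (the smallest nodes on the Complexity Map)."""
    ranked = sorted(contributions, key=contributions.get)   # ascending by contribution
    return ranked[:m]

# Illustrative contributions in cbits (not real data)
contrib = {"AAPL": 2.1, "MSFT": 1.7, "KO": 0.6, "PG": 0.5, "JPM": 1.9, "MMM": 0.9}
lc = select_low_complexity(contrib, m=3)
hc = [t for t in contrib if t not in lc]
print("LC portfolio:", lc)   # ['PG', 'KO', 'MMM']
print("HC portfolio:", hc)   # the discarded, high-complexity stocks
```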

In the examples illustrated in the present paper the following assumptions have been made:

  • Analysis periods range from 2000–2013 to 2003–2013.
  • Portfolios are re-designed every 100 trading days (see the back-test sketch after this list).
  • Portfolios are chosen based on Complexity Profiling of the entire index.
  • The stocks that are incorporated into a portfolio form the ‘LC portfolio’, while those that are discarded form the ‘HC portfolio’, where ‘LC’ and ‘HC’ stand for ‘Low Complexity’ and ‘High Complexity’ respectively.
  • Complexity is computed based on the evolution of the daily adjusted closing prices of stocks.
  • Portfolios are balanced.
  • Costs of purchasing and/or selling stocks have not been accounted for.
  • Performance is compared with that of balanced portfolios built on the entire indices.
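
A minimal back-test skeleton consistent with the assumptions above might look as follows. It is a sketch only: the price table, the selector function and the interpretation of ‘balanced’ as equal weights are assumptions made for illustration, and transaction costs are ignored as stated.

```python
import pandas as pd

def backtest_balanced(prices: pd.DataFrame, pick, m: int, rebalance_every: int = 100) -> pd.Series:
    """Re-select m stocks every `rebalance_every` trading days using `pick(history, m)`
    and hold them with equal weights; returns the cumulative wealth curve."""
    daily_ret = prices.pct_change().fillna(0.0)
    holdings = pick(prices.iloc[:rebalance_every], m)         # initial selection
    wealth, value = [], 1.0
    for t in range(rebalance_every, len(prices)):
        if t % rebalance_every == 0:                          # portfolio re-design date
            holdings = pick(prices.iloc[:t], m)
        value *= 1.0 + daily_ret.iloc[t][holdings].mean()     # equal-weight daily return
        wealth.append(value)
    return pd.Series(wealth, index=prices.index[rebalance_every:])

# Usage sketch (hypothetical): `prices` holds daily adjusted closes of the 30 DJIA stocks
# lc_curve   = backtest_balanced(prices, pick=lc_selector, m=15)                  # LC portfolio
# full_curve = backtest_balanced(prices, pick=lambda p, m: list(p.columns), m=30) # full index
```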

It has been found that, with the above portfolio design strategy, the LC portfolios generally perform better than portfolios based on the entire index, while the HC portfolios generally offer inferior performance. Examples of High and Low Complexity sub-sets of the DJIA are illustrated below.

[Figure: Complexity Map of the discarded (High Complexity) sub-set of the DJIA]

The complexity of the first portfolio, comprising the stocks which have been discarded, is C = 24.2 cbits. The Low Complexity portfolio (sub-set) is indicated below and has a complexity of C = 16.8 cbits.

[Figure: Complexity Map of the Low Complexity sub-set of the DJIA]

The performance of Low- and High-Complexity portfolios based on the DJIA, the S&P 500, the NASDAQ 100 and the EURO STOXX 50 is indicated in the following pages. Performance is measured against corresponding balanced portfolios built on all the components of the various indices. In other words, the same initial amount is invested in, say, all 30 stocks of the DJIA and in a Low Complexity portfolio of 15 stocks. In the case of the full 30-stock portfolio, no modifications are made during the back-testing period, while the LC portfolio is recomputed every 100 trading days.

LC portfolios are designated as follows: LC-Ind-X, where ‘Ind’ stands for the index upon which the portfolio is built and ‘X’ represents the percentage of stocks selected from that index. For example, LC-DJIA-50 is built from 50% of the DJIA components, i.e. 15 of its 30 stocks.

LC-DJIA-50 portfolio vs. full DJIA-based balanced portfolio

[Figures: performance of the LC-DJIA-50 portfolio vs. the full DJIA-based balanced portfolio over the back-test period]

During most of the analysis period, Low Complexity portfolios perform far better than the underlying index-based portfolios, especially after 2008, i.e. in conditions of high turbulence and market volatility.

Explained in simple terms, complexity is a new and modern measure of volatility. It takes into account not just the dynamics of each index component but also all the interactions between stocks and the topology of the structure of those interactions. This is done without resorting to conventional statistical methods, which prove unreliable in highly unstable and non-stationary regimes. Take, for example, the concept of correlation. It is a key instrument in the hands of analysts, used in everything from plain statistics to Monte Carlo simulation and portfolio design based on the Markowitz approach. Correlations are computed from covariances and standard deviations. Standard deviations measure dispersion around the mean but do not account for the actual distribution of the data. More advanced measures of correlation, based on entropy, have been devised in order to provide a more realistic measure, called generalized correlation.
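The exact definition of generalized correlation used by Ontonix is not given here, but a common entropy-based construction estimates the mutual information I(x, y) and maps it to a correlation-like scale via sqrt(1 − exp(−2I)); the sketch below uses that mapping as an assumption, with scikit-learn's mutual information estimator.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def linear_correlation(x: np.ndarray, y: np.ndarray) -> float:
    """Plain Pearson correlation, computed from covariance and standard deviations."""
    return float(np.corrcoef(x, y)[0, 1])

def generalized_correlation(x: np.ndarray, y: np.ndarray) -> float:
    """Entropy-based 'generalized correlation': estimate mutual information I(x, y)
    and map it to [0, 1) via sqrt(1 - exp(-2*I)).  This mapping is an assumption;
    Ontonix's own definition may differ."""
    mi = mutual_info_regression(x.reshape(-1, 1), y, random_state=0)[0]
    return float(np.sqrt(1.0 - np.exp(-2.0 * mi)))

# Usage sketch on two (hypothetical) daily-return series x and y:
#   linear_correlation(x, y)       -> can be inflated by cluster structure, e.g. 0.92
#   generalized_correlation(x, y)  -> typically a lower, more conservative value
```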

Linear correlations often lead to excessively optimistic results. An example is illustrated below: the linear correlation coefficient is 0.92, while the generalized one is 0.76, a full 16 percentage points lower.

[Figure: scatter plot of clustered data for which the linear correlation is 0.92 and the generalized correlation is 0.76]

Linear (Pearson’s) correlation must be applied with caution and only to data which is relatively well-behaved. In the example above, linear correlation yields an overly optimistic value of 92%, while generalized correlation yields a more realistic 76%. In this case, linear correlation neglects the fact that there are two clusters in the data. Modern Portfolio Theory, according to which optimum portfolios are obtained by solving the following quadratic problem:

minimize wᵀ∑w   subject to   wᵀμ = μₚ   and   wᵀ1 = 1

(w: portfolio weights, μ: expected returns, μₚ: target portfolio return, 1: vector of ones)

relies heavily on correlations which are ‘contained’ in the covariance matrix ∑. In large portfolios there may be tens of thousands of correlations which are all essentially too optimistic (too high). But correlations and variances are also present in the VIX index. The VIX is a measure of expected volatility calculated as 100 times the square root of the expected 30-day variance of the S&P 500 rate of return. The variance is annualized and the VIX expresses volatility in percentage points.  Handle with care.
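
For reference, the quadratic problem above can be solved numerically in a few lines. The sketch below uses toy expected returns and a toy covariance matrix (both assumptions) and SciPy's constrained minimizer; in practice ∑ would be estimated from the return history, which is exactly where the optimistic correlations enter.

```python
import numpy as np
from scipy.optimize import minimize

# Toy inputs (assumptions): expected returns mu and covariance matrix Sigma
mu    = np.array([0.08, 0.10, 0.12])
Sigma = np.array([[0.10, 0.03, 0.02],
                  [0.03, 0.12, 0.04],
                  [0.02, 0.04, 0.15]])
target = 0.10                                    # required portfolio return

n = len(mu)
res = minimize(
    lambda w: w @ Sigma @ w,                     # objective: portfolio variance w'∑w
    x0=np.full(n, 1.0 / n),
    constraints=[
        {"type": "eq", "fun": lambda w: w.sum() - 1.0},    # fully invested
        {"type": "eq", "fun": lambda w: w @ mu - target},  # hit the return target
    ],
    bounds=[(0.0, 1.0)] * n,                     # long-only weights
)
print("optimal weights:", np.round(res.x, 3))
```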


Courtesy of Ontonix QCM Blog

 
