The Kelly Criterion in Applied Portfolio Selection – Part 2
Previous blog post on the Kelly Criterion
As pointed out in a previous blog post, the Kelly Criterion is an interesting option for deciding on position sizing in portfolio selection. While the previous post looked at single stocks, today I will show how to optimize position sizes for a portfolio of multiple stocks.
The core function
At the core of my portfolio optimization is this function:
opt_portfolio <- function(shares, dpf, maxshare) {
  # calculate the portfolio's return vector
  exp_returns <- dpf %*% shares
  obj <- -sum(log(1 + exp_returns))
  weight.penalty <- 1000 * (1 - sum(shares))^2
  # max share penalty:
  maxpen <- sum(shares[shares > maxshare]) * 1000
  return(obj + weight.penalty + maxpen)
}
The function has three parameters. shares is the vector of position sizes (one for each stock) that will be estimated by the optimizer. dpf is a matrix with one column per stock, where each row contains the daily stock price movements, e.g. diff(stock)/lag(stock). The maxshare option can be used to restrict the position size of any single stock to a maximum.
exp_returns <- dpf %*% shares calculates the daily (or weekly) portfolio returns. obj is the Kelly Criterion: the negative sum of the log portfolio returns. The higher the volatility, the larger obj becomes; since we are going to minimize the function, low values, i.e. low volatility, are preferred. The next line is a trick to restrict the optimizer to position sizes that sum to 1 (100%). If the sum of all position sizes is 1, weight.penalty is 0 (the minimum possible value); if the sum deviates from 1, weight.penalty increases quickly. The second penalty term works similarly: if a position is sized above the maximum allowed share, the penalty kicks in. Finally, the function returns the value that the optimizer will try to minimize.
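To see how the two penalty terms behave, here is a minimal sketch with made-up weekly returns for two hypothetical stocks (dpf_toy and all numbers are illustrative assumptions, not data from the post):

# made-up return data: one low-volatility and one high-volatility stock
set.seed(1)
dpf_toy <- cbind(rnorm(100, 0.001, 0.02),
                 rnorm(100, 0.001, 0.06))

# shares sum to 1 and respect maxshare: only the Kelly objective counts
opt_portfolio(c(0.5, 0.5), dpf_toy, maxshare = 1)

# shares sum to 1.2: the weight penalty adds 1000 * (1 - 1.2)^2 = 40
opt_portfolio(c(0.7, 0.5), dpf_toy, maxshare = 1)

# one position above maxshare = 0.6: the max-share penalty adds 0.7 * 1000
opt_portfolio(c(0.7, 0.3), dpf_toy, maxshare = 0.6)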
The wrapper function
To feed stock data into this function, I wrote a wrapper function that prepares the matrix of returns and calls optim().
library(quantmod)
library(dplyr)

opt_portfolio_wrapper <- function(stocks, r = rep(0.03, length(stocks)),
                                  maxshare = 1, daily = FALSE, short = FALSE) {
  # test which stocks are already in the workspace, else download data
  for(stock in stocks) {
    if(!exists(stock)) getSymbols(stock, from = "1970-01-01", env = parent.frame(2))
  }
  # transform return vector from yearly to daily or weekly
  if(daily == TRUE) {
    r <- (1 + r)^(1/250) - 1
  } else {
    r <- (1 + r)^(1/52) - 1
  }
  portfolio <- NULL
  for(stock in stocks) {
    if(daily) assign(stock, get(stock)[, 6])
    if(!daily) assign(stock, to.weekly(get(stock))[, 6])
  }
  # merge all stocks together
  portfolio <- Reduce(function(...) merge(..., all = TRUE), mget(stocks))
  # build returns by diff()/lag()
  d.portfolio <- data.frame(na.omit(diff(portfolio)/stats::lag(portfolio)))
  # center around the mean to eliminate past performance as an information for portfolio selection
  d.portfolio.future <- scale(d.portfolio, scale = F)
  # add the (personally) expected return for each stock
  for(i in 1:ncol(d.portfolio.future)) {
    d.portfolio.future[, i] <- d.portfolio.future[, i] + r[i]
  }
  # define lower and upper bounds for the parameters: 0 to 1, or -1 to 1 if short positions are allowed
  lower <- rep(ifelse(short == TRUE, -1, 0), length(stocks))
  upper <- rep(1, length(stocks))
  # starting values for the optimizer
  start <- rep(1/length(stocks), length(stocks))
  res <- optim(start, opt_portfolio, dpf = d.portfolio.future,
               maxshare = maxshare, method = "Nelder-Mead")
  Portfolio <- data.frame("Share" = res$par, "Stock" = stocks) %>% arrange(desc(Share))
  return(Portfolio)
}
Most of the function is commented between the lines. I introduced two options: daily and short. By default, short positions are not allowed and the returns are calculated on a weekly, not a daily, basis. One thing that is important to me: I do not use the stock returns as they are, but center them around the mean return that I expect. If I did not do this, the algorithm would always pick the historically best-performing stock(s). However, past performance is not a good predictor of future performance, and one has to do one's homework and build one's own (and realistic) expectations.
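The centering step can be illustrated in isolation. This is a minimal sketch with a made-up return series (hist_returns and its parameters are assumptions, not real data): scale(..., scale = FALSE) removes the historical mean, and adding the expected weekly return puts my own assumption in its place.

# hypothetical weekly returns and a 3% p.a. expectation expressed weekly
set.seed(2)
hist_returns <- rnorm(100, mean = 0.005, sd = 0.03)
r_weekly <- (1 + 0.03)^(1/52) - 1

# keep the volatility, replace the historical mean with my own expectation
future_returns <- scale(hist_returns, scale = FALSE) + r_weekly
round(c(historical = mean(hist_returns), assumed = mean(future_returns)), 5)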
Testing
To test the function I select some random stocks (not a recommendation to trade these stocks).
rstocks <- c("AAPL", "GOOG", "AMZN", "GIS")
opt_portfolio_wrapper(rstocks)
#        Share Stock
# 1 0.80851815   GIS
# 2 0.09979312  AAPL
# 3 0.06343397  GOOG
# 4 0.02829082  AMZN
This tells the following story: assuming all stocks have the same expected yearly return of 3%, the best long-term growth rate of wealth (the Kelly Criterion) is achieved by investing mostly in the stock with the lowest volatility (here GIS). At the same time this is a hint that the assumption of equal expected returns for all stocks is too strong. One could now play with the function to find out how high the expected return of a risky stock has to be for it to be included in an existing portfolio.
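For example, one could raise the assumed yearly return of AMZN step by step and watch the share it gets. This is only a sketch; it assumes the wrapper and the data downloads above work as shown:

# rstocks is c("AAPL", "GOOG", "AMZN", "GIS"), so AMZN is the third entry of r
for(r_amzn in seq(0.03, 0.08, by = 0.01)) {
  res <- opt_portfolio_wrapper(rstocks, r = c(0.03, 0.03, r_amzn, 0.03))
  cat("expected yearly return AMZN:", r_amzn,
      "-> share:", round(res$Share[res$Stock == "AMZN"], 3), "\n")
}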
Summary
I personally use this tool as a rough orientation for how stocks from my watchlist would contribute to the risk-return profile of my portfolio. I usually use the maxshare option and rather "flat", moderate return expectations, because the Kelly Criterion strongly favors stocks with better return expectations. E.g. if I add 1% of additional expected return for GOOG, the numbers change drastically:
opt_portfolio_wrapper(rstocks, r = c(0.03, 0.04, 0.03, 0.03))
#          Share Stock
# 1  0.749115293   GIS
# 2  0.200639803  GOOG
# 3  0.051410711  AAPL
# 4 -0.001125542  AMZN
I find the tool useful because it keeps me from overbetting on very volatile stocks and gives hints as to if and when risky stocks are worth betting on.
Other
There are a few additional points worth mentioning:
- While opt_portfolio_wrapper() gives the shares that achieve the Kelly-optimal portfolio selection, it does not tell how much one should bet on the whole portfolio. This can be approximated by dividing the expected portfolio return by the annualized variance (a rough sketch follows after this list). One could also adapt the kelly() function from the previous blog post to get a number that incorporates the fact that the return distribution is fat-tailed and non-normal.
- I excluded some of the options that my function originally had for the sake of readability. Options that I found interesting are:
- transaction costs
- "fixed shares" (e.g. my pension account is invested in index funds and bonds; when picking stocks, I might want to incorporate the fact that I am already exposed to these securities)
- a from option for getSymbols(), in case the historical returns should be restricted to a certain date range
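As a rough sketch of the approximation mentioned in the first point above: divide the expected yearly portfolio return by the annualized variance of the portfolio returns. The helper below is purely illustrative; kelly_fraction, weights and returns_weekly are made-up names, and the wrapper as written does not return the weekly return matrix or the raw weights, so they would have to be exposed first.

# hypothetical helper: expected yearly portfolio return / annualized variance
# weights must be in the same column order as the weekly return matrix
kelly_fraction <- function(weights, returns_weekly) {
  port_returns <- as.matrix(returns_weekly) %*% weights
  exp_return_yearly <- (1 + mean(port_returns))^52 - 1
  var_yearly <- var(as.numeric(port_returns)) * 52
  exp_return_yearly / var_yearly
}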