This post will cover two things: first, a function to convert daily returns into a table of monthly returns, complete with drawdowns and annual returns; second, an interview I had with David Lincoln (now on YouTube) about the events of Feb. 5, 2018, and my philosophy on volatility trading.
So, to start off with, below is a function I wrote that's supposed to mimic PerformanceAnalytics's table.CalendarReturns. What table.CalendarReturns is supposed to do is create a month-by-year table of monthly returns, with months across and years down. However, it never seemed to give me the output I was expecting, so I went and wrote another function.
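For reference, the stock PerformanceAnalytics call that my function mimics looks roughly like this (a sketch, not my method; table.CalendarReturns expects monthly returns and reports in percent by default, with digits and as.perc arguments to control formatting):

```r
require(PerformanceAnalytics)

# assumes dailyRets is an xts of daily returns (a stand-in name for illustration)
monthlyRets <- apply.monthly(dailyRets, Return.cumulative)
table.CalendarReturns(monthlyRets)  # months across, years down
```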
Here’s the code for the function:
require(data.table)
require(PerformanceAnalytics)
require(scales)
require(Quandl)

# helper functions
pastePerc <- function(x) {return(paste0(comma(x),"%"))}
rowGsub <- function(x) {x <- gsub("NA%", "NA", x); x}

calendarReturnTable <- function(rets, digits = 3, percent = FALSE) {

  # get maximum drawdown using daily returns
  dds <- apply.yearly(rets, maxDrawdown)

  # get monthly returns
  rets <- apply.monthly(rets, Return.cumulative)

  # convert to data frame with year, month, and monthly return value
  dfRets <- cbind(year(index(rets)), month(index(rets)), coredata(rets))

  # convert to data table and reshape into year x month table
  dfRets <- data.frame(dfRets)
  colnames(dfRets) <- c("Year", "Month", "Value")

  monthNames <- c("Jan", "Feb", "Mar", "Apr", "May", "Jun",
                  "Jul", "Aug", "Sep", "Oct", "Nov", "Dec")
  for(i in 1:length(monthNames)) {
    dfRets$Month[dfRets$Month==i] <- monthNames[i]
  }
  dfRets <- data.table(dfRets)
  dfRets <- data.table::dcast(dfRets, Year~Month)

  # create row names and rearrange table in month order
  dfRets <- data.frame(dfRets)
  yearNames <- dfRets$Year
  rownames(dfRets) <- yearNames; dfRets$Year <- NULL
  dfRets <- dfRets[,monthNames]

  # append yearly returns and drawdowns
  yearlyRets <- apply.yearly(rets, Return.cumulative)
  dfRets$Annual <- yearlyRets
  dfRets$DD <- dds

  # convert to percentage
  if(percent) {
    dfRets <- dfRets * 100
  }

  # round for formatting
  dfRets <- apply(dfRets, 2, round, digits)

  # paste the percentage sign
  if(percent) {
    dfRets <- apply(dfRets, 2, pastePerc)
    dfRets <- apply(dfRets, 2, rowGsub)
    dfRets <- data.frame(dfRets)
    rownames(dfRets) <- yearNames
  }
  return(dfRets)
}
Here's what the output looks like:
spy <- Quandl("EOD/SPY", type='xts', start_date='1990-01-01')
spyRets <- Return.calculate(spy$Adj_Close)
calendarReturnTable(spyRets, percent = FALSE)

        Jan    Feb    Mar    Apr    May    Jun    Jul    Aug    Sep    Oct    Nov    Dec Annual    DD
1993  0.000  0.011  0.022 -0.026  0.027  0.004 -0.005  0.038 -0.007  0.020 -0.011  0.012  0.087 0.047
1994  0.035 -0.029 -0.042  0.011  0.016 -0.023  0.032  0.038 -0.025  0.028 -0.040  0.007  0.004 0.085
1995  0.034  0.041  0.028  0.030  0.040  0.020  0.032  0.004  0.042 -0.003  0.044  0.016  0.380 0.026
1996  0.036  0.003  0.017  0.011  0.023  0.009 -0.045  0.019  0.056  0.032  0.073 -0.024  0.225 0.076
1997  0.062  0.010 -0.044  0.063  0.063  0.041  0.079 -0.052  0.048 -0.025  0.039  0.019  0.335 0.112
1998  0.013  0.069  0.049  0.013 -0.021  0.043 -0.014 -0.141  0.064  0.081  0.056  0.065  0.287 0.190
1999  0.035 -0.032  0.042  0.038 -0.023  0.055 -0.031 -0.005 -0.022  0.064  0.017  0.057  0.204 0.117
2000 -0.050 -0.015  0.097 -0.035 -0.016  0.020 -0.016  0.065 -0.055 -0.005 -0.075 -0.005 -0.097 0.171
2001  0.044 -0.095 -0.056  0.085 -0.006 -0.024 -0.010 -0.059 -0.082  0.013  0.078  0.006 -0.118 0.288
2002 -0.010 -0.018  0.033 -0.058 -0.006 -0.074 -0.079  0.007 -0.105  0.082  0.062 -0.057 -0.216 0.330
2003 -0.025 -0.013  0.002  0.085  0.055  0.011  0.018  0.021 -0.011  0.054  0.011  0.050  0.282 0.137
2004  0.020  0.014 -0.013 -0.019  0.017  0.018 -0.032  0.002  0.010  0.013  0.045  0.030  0.107 0.075
2005 -0.022  0.021 -0.018 -0.019  0.032  0.002  0.038 -0.009  0.008 -0.024  0.044 -0.002  0.048 0.070
2006  0.024  0.006  0.017  0.013 -0.030  0.003  0.004  0.022  0.027  0.032  0.020  0.013  0.158 0.076
2007  0.015 -0.020  0.012  0.044  0.034 -0.015 -0.031  0.013  0.039  0.014 -0.039 -0.011  0.051 0.099
2008 -0.060 -0.026 -0.009  0.048  0.015 -0.084 -0.009  0.015 -0.094 -0.165 -0.070  0.010 -0.368 0.476
2009 -0.082 -0.107  0.083  0.099  0.058 -0.001  0.075  0.037  0.035 -0.019  0.062  0.019  0.264 0.271
2010 -0.036  0.031  0.061  0.015 -0.079 -0.052  0.068 -0.045  0.090  0.038  0.000  0.067  0.151 0.157
2011  0.023  0.035  0.000  0.029 -0.011 -0.017 -0.020 -0.055 -0.069  0.109 -0.004  0.010  0.019 0.186
2012  0.046  0.043  0.032 -0.007 -0.060  0.041  0.012  0.025  0.025 -0.018  0.006  0.009  0.160 0.097
2013  0.051  0.013  0.038  0.019  0.024 -0.013  0.052 -0.030  0.032  0.046  0.030  0.026  0.323 0.056
2014 -0.035  0.046  0.008  0.007  0.023  0.021 -0.013  0.039 -0.014  0.024  0.027 -0.003  0.135 0.073
2015 -0.030  0.056 -0.016  0.010  0.013 -0.020  0.023 -0.061 -0.025  0.085  0.004 -0.017  0.013 0.119
2016 -0.050 -0.001  0.067  0.004  0.017  0.003  0.036  0.001  0.000 -0.017  0.037  0.020  0.120 0.103
2017  0.018  0.039  0.001  0.010  0.014  0.006  0.021  0.003  0.020  0.024  0.031  0.012  0.217 0.026
2018  0.056 -0.031     NA     NA     NA     NA     NA     NA     NA     NA     NA     NA  0.023 0.101
And with percentage formatting:
calendarReturnTable(spyRets, percent = TRUE)

Using 'Value' as value column. Use 'value.var' to override

         Jan      Feb     Mar     Apr     May     Jun     Jul      Aug      Sep      Oct     Nov     Dec   Annual      DD
1993  0.000%   1.067%  2.241% -2.559%  2.697%  0.367% -0.486%   3.833%  -0.726%   1.973% -1.067%  1.224%   8.713%  4.674%
1994  3.488%  -2.916% -4.190%  1.121%  1.594% -2.288%  3.233%   3.812%  -2.521%   2.843% -3.982%  0.724%   0.402%  8.537%
1995  3.361%   4.081%  2.784%  2.962%  3.967%  2.021%  3.217%   0.445%   4.238%  -0.294%  4.448%  1.573%  38.046%  2.595%
1996  3.558%   0.319%  1.722%  1.087%  2.270%  0.878% -4.494%   1.926%   5.585%   3.233%  7.300% -2.381%  22.489%  7.629%
1997  6.179%   0.957% -4.414%  6.260%  6.321%  4.112%  7.926%  -5.180%   4.808%  -2.450%  3.870%  1.910%  33.478% 11.203%
1998  1.288%   6.929%  4.876%  1.279% -2.077%  4.259% -1.351% -14.118%   6.362%   8.108%  5.568%  6.541%  28.688% 19.030%
1999  3.523%  -3.207%  4.151%  3.797% -2.287%  5.538% -3.102%  -0.518%  -2.237%   6.408%  1.665%  5.709%  20.388% 11.699%
2000 -4.979%  -1.523%  9.690% -3.512% -1.572%  1.970% -1.570%   6.534%  -5.481%  -0.468% -7.465% -0.516%  -9.730% 17.120%
2001  4.446%  -9.539% -5.599%  8.544% -0.561% -2.383% -1.020%  -5.933%  -8.159%   1.302%  7.798%  0.562% -11.752% 28.808%
2002 -0.980%  -1.794%  3.324% -5.816% -0.593% -7.376% -7.882%   0.680% -10.485%   8.228%  6.168% -5.663% -21.588% 32.968%
2003 -2.459%  -1.348%  0.206%  8.461%  5.484%  1.066%  1.803%   2.063%  -1.089%   5.353%  1.092%  5.033%  28.176% 13.725%
2004  1.977%   1.357% -1.320% -1.892%  1.712%  1.849% -3.222%   0.244%   1.002%   1.288%  4.451%  3.015%  10.704%  7.526%
2005 -2.242%   2.090% -1.828% -1.874%  3.222%  0.150%  3.826%  -0.937%   0.800%  -2.365%  4.395% -0.190%   4.827%  6.956%
2006  2.401%   0.573%  1.650%  1.263% -3.012%  0.264%  0.448%   2.182%   2.699%   3.152%  1.989%  1.337%  15.847%  7.593%
2007  1.504%  -1.962%  1.160%  4.430%  3.392% -1.464% -3.131%   1.283%   3.870%   1.357% -3.873% -1.133%   5.136%  9.925%
2008 -6.046%  -2.584% -0.903%  4.766%  1.512% -8.350% -0.899%   1.545%  -9.437% -16.519% -6.961%  0.983% -36.807% 47.592%
2009 -8.211% -10.745%  8.348%  9.935%  5.845% -0.068%  7.461%   3.694%   3.545%  -1.923%  6.161%  1.907%  26.364% 27.132%
2010 -3.634%   3.119%  6.090%  1.547% -7.945% -5.175%  6.830%  -4.498%   8.955%   3.820%  0.000%  6.685%  15.057% 15.700%
2011  2.330%   3.474%  0.010%  2.896% -1.121% -1.688% -2.000%  -5.498%  -6.945%  10.915% -0.406%  1.044%   1.888% 18.609%
2012  4.637%   4.341%  3.216% -0.668% -6.006%  4.053%  1.183%   2.505%   2.535%  -1.820%  0.566%  0.900%  15.991%  9.687%
2013  5.119%   1.276%  3.798%  1.921%  2.361% -1.336%  5.168%  -2.999%   3.168%   4.631%  2.964%  2.589%  32.307%  5.552%
2014 -3.525%   4.552%  0.831%  0.695%  2.321%  2.064% -1.344%   3.946%  -1.379%   2.355%  2.747% -0.256%  13.462%  7.273%
2015 -2.963%   5.620% -1.574%  0.983%  1.286% -2.029%  2.259%  -6.095%  -2.543%   8.506%  0.366% -1.718%   1.252% 11.910%
2016 -4.979%  -0.083%  6.724%  0.394%  1.701%  0.350%  3.647%   0.120%   0.008%  -1.734%  3.684%  2.028%  12.001% 10.306%
2017  1.789%   3.929%  0.126%  0.993%  1.411%  0.637%  2.055%   0.292%   2.014%   2.356%  3.057%  1.209%  21.700%  2.609%
2018  5.636%  -3.118%      NA      NA      NA      NA      NA       NA       NA       NA      NA      NA   2.342% 10.102%
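As an aside, the "Using 'Value' as value column" message in the console output comes from data.table's dcast guessing which column holds the values to spread. It's harmless, but if you'd rather silence it, naming the value column explicitly inside the function body should do the trick (a minor tweak, not something the table's numbers depend on):

```r
# explicit value.var suppresses the "Using 'Value' as value column" message
dfRets <- data.table::dcast(dfRets, Year ~ Month, value.var = "Value")
```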
That covers the function. Now, onto volatility trading. Dodging the February short-volatility meltdown has, in my opinion, been one of the best out-of-sample validations of my volatility trading research. My subscriber numbers bear this out: I've received 12 new subscribers this month, as individuals interested in the volatility trading space have gained a newfound respect for the risk management my system uses. After all, it's the down months that vindicate systematic traders like myself who do not employ leverage in the up times. Those interested in following my trades can subscribe here. Furthermore, I recently had the chance to speak with David Lincoln about my background and philosophy on trading in general, and trading volatility in particular. Those interested can view the interview here.
Thanks for reading.
NOTE: I am currently interested in networking, full-time positions related to my skill set, and long-term consulting projects. Those interested in discussing professional opportunities can find me on LinkedIn after writing a note expressing their interest.