es() allows selecting between AIC (Akaike Information Criterion), AICc (Akaike Information Criterion corrected) and BIC (Bayesian Information Criterion, also known as Schwarz IC). The most basic information criterion is AIC. It is calculated for a chosen model using the formula:
\begin{equation} \label{eq:AIC}
\text{AIC} = -2 \ell \left(\theta, \hat{\sigma}^2 | Y \right) + 2k,
\end{equation}
where \(\ell\left(\theta, \hat{\sigma}^2 | Y \right)\) is the log-likelihood of the model and \(k\) is the number of its parameters. Without going too much into details, the model with the smallest AIC is considered to be the closest to the true model. Obviously, an IC cannot be calculated without fitting a model, which implies that a human being needs to form a pool of models, fit each of them to the data, calculate an information criterion for each of them and after that select the one model that has the lowest IC value. There are 30 ETS models, so this procedure may take some time, or even too much time if we deal with large samples. So what can be done in order to increase the speed?
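To make the brute-force procedure concrete, here is a minimal sketch, assuming the smooth and Mcomp packages are installed; the pool below is a small hand-picked subset of the 30 models, chosen purely for illustration:

library(smooth)
library(Mcomp)
# A hand-picked pool of candidate ETS models (illustrative, not the full set of 30)
pool <- c("ANN", "AAN", "AAdN", "ANA", "AAA", "MAM")
# Fit each candidate to the series and collect its AIC
fits <- lapply(pool, function(m) es(M3$N2568$x, model=m, h=18, silent=TRUE))
ICs <- setNames(sapply(fits, AIC), pool)
ICs                      # AIC of every candidate
fits[[which.min(ICs)]]   # the model considered closest to the true one

With the full pool of 30 models this loop is exactly the procedure that becomes slow on large samples.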
For example, for the time series N2568 from M3, asking es() to select all three components automatically with model="ZZZ", we will have:
es(M3$N2568$x, "ZZZ", h=18)
Which results in:
Forming the pool of models based on... ANN, ANA, ANM, AAM, Estimation progress: 100%... Done!
Time elapsed: 2.6 seconds
Model estimated: ETS(MMdM)
Persistence vector g:
alpha  beta gamma
0.020 0.020 0.001
Damping parameter: 0.965
Initial values were optimised.
19 parameters were estimated in the process
Residuals standard deviation: 0.065
Cost function type: MSE; Cost function value: 169626
Information criteria:
     AIC     AICc      BIC
1763.990 1771.907 1816.309
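The criteria in the last two lines of this printout can also be retrieved programmatically once the result is stored; a short sketch, where the object name ourModel is arbitrary and the ICs element is assumed from the smooth documentation:

ourModel <- es(M3$N2568$x, "ZZZ", h=18, silent=TRUE)
AIC(ourModel)    # Akaike Information Criterion
BIC(ourModel)    # Bayesian Information Criterion (Schwarz IC)
ourModel$ICs     # the criteria printed in the summary above (assumed element name)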