Estimating GARCH parameters from S&P500 index data in Nspire

Yes, this could easily be done in Excel today on a modern PC, so why abuse a calculator that is probably running on 3.3V? But dare I ask, why not? Calculators today, especially top-of-the-line products like the Nspire, are almost on par in computing power with the PCs of my college days. It is a fun exercise in itself to program these little machines to perform calculations so complex that owners of early-generation calculators could never have dreamed of them.


One of the uses of GARCH in finance is the pricing of options. The gist of the model is that returns follow a normal distribution whose volatility changes over time. As with most models, good parameter estimates are crucial to the model's usefulness, and GARCH is no exception. To see how well the Nspire handles the parameter estimation, which requires processing arrays of numerical data, logarithms, and optimization, a list of consecutive daily S&P500 index values is loaded into the Spreadsheet application on the Nspire, and the continuously compounded returns are then calculated in the next column by the formula

r_t = ln(S_t / S_(t-1))

where S_t denotes the price of the underlying asset at time t.
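As a cross-check outside the calculator, a minimal Python sketch of the same return calculation (using hypothetical price values rather than the actual S&P500 data set) could look like this:

```python
import numpy as np

# Hypothetical consecutive daily S&P500 closing levels, oldest first
prices = np.array([4500.0, 4523.5, 4510.2, 4531.8, 4528.0])

# Continuously compounded return: r_t = ln(S_t / S_(t-1))
log_return = np.log(prices[1:] / prices[:-1])
print(log_return)
```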

There are 1000 data points in this test set, but the Nspire seems to handle it decently in the Spreadsheet application, with help from the “Fill” function that works just like in Excel, so the row index is worked out automatically (t and t-1 correspond to the row index). My only complaint is that, unlike Excel, the Nspire has nothing similar to the “Shift+End+Arrow” key feature that instantly jumps to the last data row with selection.

The simplest form of GARCH, GARCH(1,1), is the target of this parameter estimation test. The three parameters in the model, namely alpha, beta, and omega, are estimated by the maximum likelihood method, a popular technique. In this particular case, i.e. when p=1 and q=1, the GARCH model can be represented by

σ²_(t+1) = ω + α·r²_t + β·σ²_t

The coefficients alpha and beta measure the persistence of volatility. The closer their sum is to 1, the more persistent the volatility; the closer to zero, the quicker the reversion to the long-term variance. A function is written on the Nspire to do this calculation, taking the log returns of the 1000 S&P data points stored in the Spreadsheet application as the variable log_return. The actual maximization of the likelihood is done via the Nelder-Mead algorithm, using the following log-likelihood kernel with the constant (2π) term removed for calculation speed, which leaves the location of the maximum unchanged.

Σ ( −ln(σ²_t) − r²_t / σ²_t ), t = 1..n
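For anyone who wants to see the whole estimation outside the calculator, here is a minimal sketch in Python of the same idea: build the GARCH(1,1) variance recursion, form the log-likelihood kernel above, and hand its negative to a Nelder-Mead minimizer. The starting values, the sample-variance initialization of the first σ², and the random placeholder data are my own assumptions, not the actual Nspire code.

```python
import numpy as np
from scipy.optimize import minimize

def garch11_neg_loglik(params, returns):
    """Negative GARCH(1,1) log-likelihood kernel:
    sum_t( -ln(sigma_t^2) - r_t^2 / sigma_t^2 ), constant (2*pi) term dropped."""
    omega, alpha, beta = params
    # Nelder-Mead is unconstrained, so reject invalid parameter sets explicitly
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return np.inf
    n = len(returns)
    sigma2 = np.empty(n)
    sigma2[0] = np.var(returns)  # assumption: start the recursion at the sample variance
    for t in range(1, n):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    loglik = np.sum(-np.log(sigma2) - returns ** 2 / sigma2)
    return -loglik  # minimizing the negative maximizes the likelihood

# Placeholder for the log_return column exported from the Spreadsheet application
log_return = np.random.default_rng(0).normal(0, 0.01, 1000)

result = minimize(garch11_neg_loglik, x0=[1e-6, 0.1, 0.8],
                  args=(log_return,), method="Nelder-Mead")
omega, alpha, beta = result.x
print("omega:", omega, "alpha:", alpha, "beta:", beta,
      "persistence:", alpha + beta)
```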


It took the real Nspire more than two hours to finish the computation. I think I am going back to Excel 😉 Nevertheless, doing all this again from scratch on the Nspire is a really good refresher on how these algorithms, models, and formulae work.


