Category Archives: finance

Monte Carlo methods in TensorFlow

Markov Chain Monte Carlo (MCMC) is a method for sampling from a probability distribution when direct sampling is not feasible.

The TensorFlow Probability package includes a sample that runs Hamiltonian MCMC, a variant that uses input from Hamiltonian dynamics to avoid slow exploration of the state space.
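To make the idea concrete, here is a plain NumPy sketch of the simplest MCMC variant, a random-walk Metropolis sampler; the target density and step size are illustrative choices, not taken from the TFP sample.

```python
import numpy as np

def metropolis_sample(log_prob, x0, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis: propose x' = x + step*N(0,1) and accept
    with probability min(1, p(x')/p(x))."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_prob(x0)
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.standard_normal()
        lp_new = log_prob(proposal)
        if np.log(rng.random()) < lp_new - lp:   # Metropolis acceptance test
            x, lp = proposal, lp_new
        samples[i] = x
    return samples

# Target: a standard normal; only the unnormalized log-density is needed.
chain = metropolis_sample(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
```

Hamiltonian MCMC replaces this blind random-walk proposal with trajectories that follow the gradient of the log-density, which is why it explores the state space much faster.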

While running the samples, Python reported errors on importing tensorflow_probability:

The problem is resolved by uninstalling and re-installing the tensorflow-estimator package:

pip uninstall tensorflow_estimator
pip install tensorflow_estimator

Finally, the samples ran fine with the expected results.


The results from the sample run.

Binomial Tree methods for European options using GPU

Binomial methods are versatile in pricing options, being suitable for American, European, and Asian styles. For a European call option with maturity T, strike price K, spot price S, volatility σ, and risk-free rate r, the option value is the discounted risk-neutral expectation of the terminal payoff max(S_T − K, 0) over the tree:


For a put option, the payoff term inside the max function becomes max(K − S_T, 0).

The stock’s up factor, down factor, and the risk-neutral probability of an up move are given respectively by u = e^(σ√Δt), d = e^(−σ√Δt) = 1/u, and p = (e^(rΔt) − d) / (u − d).


One of the CUDA samples from Nvidia implements the binomial model on the GPU.
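For a cross-check of the method on the CPU, a minimal Python sketch of the Cox-Ross-Rubinstein binomial tree might look like the following; the parameter values in the example are illustrative.

```python
import math

def crr_price(S, K, T, r, sigma, steps, call=True):
    """Cox-Ross-Rubinstein binomial tree for a European option."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))      # up factor
    d = 1.0 / u                              # down factor
    p = (math.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    disc = math.exp(-r * dt)                 # one-step discount factor
    # Terminal payoffs: node i has i up moves and steps - i down moves.
    values = [max((S * u ** i * d ** (steps - i)) - K, 0.0) if call
              else max(K - (S * u ** i * d ** (steps - i)), 0.0)
              for i in range(steps + 1)]
    # Backward induction: discounted risk-neutral expectation at each node.
    for _ in range(steps):
        values = [disc * (p * values[i + 1] + (1.0 - p) * values[i])
                  for i in range(len(values) - 1)]
    return values[0]

# Example: at-the-money one-year call, sigma = 20%, r = 5%, 500 steps
price = crr_price(100, 100, 1.0, 0.05, 0.2, 500)
```

With enough steps the tree price converges to the Black-Scholes value (about 10.45 for these parameters).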



Test driving Nvidia RTX 2060 with TensorFlow and VS2017

My new laptop sporting the new Nvidia GeForce RTX 2060 has finally arrived. It is time to check out the muscle of this little beast with the toolset I’m familiar with.

On the hardware side, the laptop has an i7-8750 CPU and 16 GB of RAM, with a Turing-architecture-based GeForce RTX 2060.


The laptop came with full drivers installed. Nevertheless, I downloaded the latest drivers and CUDA for the most up-to-date experience. The software includes the Nvidia GeForce drivers, Visual Studio Express 2017, the CUDA Toolkit, and TensorFlow.


Be careful when trying all these bleeding-edge technologies: not only is TensorFlow 2.0 currently in alpha, but compatibility issues may also haunt you, as with the previous 1.x TensorFlow releases on CUDA 10.1. I had to fall back to CUDA 10.0 to keep TF happy (although one can always take the compile-from-source approach).


And here are my favorite n-body and Mandelbrot simulations, along with the Black-Scholes sample in CUDA. The diagnostic tool in Visual Studio gives a nice real-time profiling interface with graphs.


Finally for this test drive: TensorFlow with GPU. The installation was smooth until I tried to verify TF on the GPU. After several failed attempts, I realized that CUDA 10.1 might not be compatible with the TF version installed. There are a couple of suggested solutions out there, including downgrading to CUDA 9, but since my GPU is of the Turing series this is not an option. TF has actually supported CUDA 10 since v1.13, so I finally decided to fall back from CUDA 10.1 to 10.0, and it worked!


Visualizing Volatility Sensitivity in Delta hedged gains with TI Nspire

The TI Nspire calculator is a great platform for visualizing data via interactive graphs. Built-in facilities like the input slider for adjusting variable values allow dynamic visualization of complex equations, such as the volatility sensitivity of delta-hedged gains used in financial investment. Since this strategy involves a single call option, the volatility exposure equals the vega of the option.

The following setup on the Nspire provided the functions to calculate the vega values.

This spreadsheet input screen stores the spot prices and the calculated Black Scholes vega values.

Finally, the data-plotting screen completes the graph of volatility sensitivity in delta-hedged gains. An additional slider control can easily be added to adjust an offset variable, so as to visualize scenarios under different spot prices.
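For cross-checking the calculator's output: the Black-Scholes vega has the closed form S·φ(d1)·√T, which is easy to sketch in Python. The parameters in the example are illustrative, not the ones used on the Nspire.

```python
import math

def norm_pdf(x):
    """Standard normal density phi(x)."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def bs_vega(S, K, T, r, sigma):
    """Vega = S * phi(d1) * sqrt(T): sensitivity of the price to volatility."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    return S * norm_pdf(d1) * math.sqrt(T)

# Example: at-the-money one-year call, sigma = 20%, r = 5%
vega = bs_vega(100, 100, 1.0, 0.05, 0.2)
```

A quick sanity check is to compare the closed-form vega against a finite difference of the call price in σ.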

Implied volatility in R

The R package RND computes the implied volatility of a call option. Sample usage is given below.


The implied volatility based on the Black-Scholes model differs from realized volatility in that the latter is a retrospective estimate of price variability, while the former provides insight into the market's expectation of the future.

Realized volatility can be derived from more traditional approaches like the standard deviation and GARCH models. Implied volatility, on the other hand, must be found numerically, because the Black-Scholes formula cannot be solved for σ in closed form in terms of the other parameters. A previous installment provides more mathematical details on the TI Nspire.
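Since the formula cannot be inverted in closed form, any numerical root-finder will do. Below is a minimal bisection sketch in Python; it illustrates the idea and is not the RND package's implementation.

```python
import math

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def implied_vol(price, S, K, T, r, lo=1e-4, hi=5.0):
    """Bisection on sigma: the call price is increasing in volatility,
    so the bracket [lo, hi] can be halved until it pins down sigma."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Feeding a Black-Scholes price back through the solver recovers the volatility that produced it.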

GARCH model in R

A much more practical approach than calculating GARCH parameters on a calculator is to do it in R. Not only are packages readily available, but retrieving financial data for experimentation is also a piece of cake, as built-in facilities offer convenient access to historical data.

To use GARCH in R, the library must be installed first.


To test the library, data are imported using the tseries package.



A plot of the log return.




Before running the GARCH model, a QQ plot is reviewed.



Finally, the GARCH model is created using the command below.



Density plot.




With trace = FALSE, a clean model summary can be printed after fitting.


Black-Scholes formula in TI-84 Solver

I came across a ten-year-old article from TI on working with the Black-Scholes pricing model on the TI-84. In it, a couple of examples utilize various features of the TI-84 to work with the equation for pricing a European call option under the Black-Scholes model, one of these methods being the Solver.

Re-entering the equation each time is quite cumbersome, and it is not obvious how to archive the Solver equation for later use. After a couple of tries, it became clear that the Solver wouldn’t work with a function stored in a String at all. Fortunately, an alternative method was found to persist the equation for later use.

The trick is to make use of the following built-in functions available on the TI-84:

  • String>Equ()
  • expr()

By making use of these two functions, the Solver can properly handle the Black-Scholes equation stored in a String. First the equation is stored in a String; then, using the String>Equ() function, it persists in one of the equation variables. In this form, the Solver is happy to work with it in its entirety, meaning all variables are considered. The equation stored in Str0 is converted to the function Y0 and is then processed properly by the Solver, as shown in the screens below. For persistence, this can be done in a program that includes the definition of the Black-Scholes formula itself.

Real estate refinancing – example from HP 12C to TI Nspire

From the “HP 12C Platinum Solutions Handbook”, an example is given on calculating the refinancing of an existing mortgage (on page 7). Since the HP 12C is a special breed specializing in financial calculations, many of the steps are optimized and differ from using the financial functions available on other higher-end calculators like the TI Nspire. In the following re-work of the same example, the Finance Solver is called from within the Calculator page and the Vars are recalled for the calculations.


Monthly payment on existing mortgage received by lender calculation.

Monthly payment on the new mortgage calculation.

Net monthly payment to the lender, and present value of the net monthly payment calculation.
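The monthly-payment figures above come from the standard annuity formula PMT = P·i / (1 − (1 + i)^(−n)). A quick Python sketch follows; the loan figures below are hypothetical, not the ones from the handbook.

```python
def monthly_payment(principal, annual_rate, years):
    """Level monthly payment of an amortizing loan:
    PMT = P * i / (1 - (1 + i) ** (-n)), with i the monthly rate."""
    i = annual_rate / 12.0      # monthly interest rate
    n = years * 12              # total number of monthly payments
    return principal * i / (1.0 - (1.0 + i) ** (-n))

# Hypothetical loan: $100,000 at 6% annual interest over 30 years
pmt = monthly_payment(100_000, 0.06, 30)
```

The net-payment comparison in the example is then just the difference between this figure for the old and new loan terms.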

Estimating GARCH parameters from S&P500 index data in Nspire

Yes, this could easily be done in Excel today on a modern PC, so why abuse a calculator that is probably running on 3.3 V? But dare I ask, why not? Calculators today, especially top-of-the-line products like the Nspire, are almost on par in computing power with the PCs from my college days. It is fun in itself to program these little machines for calculations so complex that owners of early-generation calculators could never have dreamed of them.


One of the uses of GARCH in finance is the pricing of options. The gist of this model is that returns follow a normal distribution with volatility dependent on time. As with most models, good parameters are crucial to the model's capability, and GARCH is no exception. To try out how well the Nspire handles the parameter estimation, which requires processing arrays of numerical data, logarithms, and optimization, a list of consecutive daily S&P 500 index values is loaded into the Spreadsheet application, and the continuously compounded returns are then calculated in the next column by the formula

R_t = ln(S_t / S_(t-1))

where S_t denotes the price of the underlying asset at time t.
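The same column of returns takes one line in Python; the price figures here are made-up placeholders, not the actual S&P 500 data.

```python
import math

# Hypothetical index levels (placeholders, not the actual S&P 500 data)
prices = [2700.06, 2695.81, 2706.53, 2681.66]

# Continuously compounded return: R_t = ln(S_t / S_(t-1))
log_returns = [math.log(s1 / s0) for s0, s1 in zip(prices, prices[1:])]
```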

There are 1000 data points in this test set, but the Nspire seems to handle them decently in the Spreadsheet application, with help from the “Fill” function, which works exactly as in Excel so that the row indices are worked out automatically (t and t−1 correspond to row indices). My only complaint is that, unlike Excel, the Nspire has nothing similar to the “Shift+End+Arrow” key feature that instantly jumps to the last data row with selection.

The simplest form of GARCH, GARCH(1,1), is targeted in this test for parameter estimation. The model's three parameters, namely alpha, beta, and omega, are to be estimated with the popular maximum-likelihood method. In this particular case, i.e. when p = 1 and q = 1, the GARCH model can be represented by

σ_(t+1)² = ω + α·r_t² + β·σ_t²

The coefficients α and β measure the persistence of volatility: the closer their sum is to 1, the more persistent the volatility; conversely, a sum towards zero means a quicker reversion to the long-term variance. A function is written on the Nspire to do this calculation, taking as input the list of log returns of the 1000 S&P 500 data points stored in the Spreadsheet application as the variable log_return. The actual maximization of the likelihood is done via the Nelder-Mead algorithm, using the following kernel of the log-likelihood, with the constant (2π) terms dropped for calculation speed without affecting the location of the maximum:

Σ ( −ln(σ_t²) − r_t²/σ_t² ), t = 1..n
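The recursion and the likelihood kernel translate directly into Python; this is a sketch of the objective that an optimizer such as Nelder-Mead would maximize, not the actual Nspire program.

```python
import math

def garch11_sigma2(returns, omega, alpha, beta, sigma2_0):
    """GARCH(1,1) variance recursion:
    sigma_(t+1)^2 = omega + alpha * r_t^2 + beta * sigma_t^2."""
    s2 = [sigma2_0]
    for r in returns:
        s2.append(omega + alpha * r * r + beta * s2[-1])
    return s2

def garch11_loglik(returns, omega, alpha, beta, sigma2_0):
    """Log-likelihood kernel with constants dropped:
    sum over t of -ln(sigma_t^2) - r_t^2 / sigma_t^2."""
    s2 = garch11_sigma2(returns, omega, alpha, beta, sigma2_0)
    return sum(-math.log(s) - r * r / s for r, s in zip(returns, s2))
```

An optimizer then searches over (ω, α, β) for the parameter set maximizing this sum, subject to ω > 0, α, β ≥ 0, and α + β < 1.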


It took the real Nspire more than two hours to finish the computation. I think I am going back to Excel 😉 Nevertheless, doing all this again from scratch on the Nspire is really a good refresher on how these algorithms, models, and formulae work.