The autocorrelation function at lag k is the ratio of the kth sample autocovariance to the sample variance, i.e., r_k = c_k / c_0, where c_k is the lag-k sample autocovariance and c_0 is the sample variance.

A plot of r_k against lag k (the correlogram) is examined for discernible patterns, relationships, and magnitudes (e.g., values close to zero).
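As a quick illustration, here is a minimal Python sketch of the same calculation; the function name `acf` and the demo series are mine, not from the original post.

```python
# Sample autocorrelation r_k = c_k / c_0, using the biased (divide-by-n)
# autocovariance estimator so that |r_k| <= 1.
import numpy as np

def acf(x, max_lag):
    """Return r_k for k = 0..max_lag."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    c0 = np.sum((x - xbar) ** 2) / n            # lag-0 autocovariance = sample variance
    r = []
    for k in range(max_lag + 1):
        ck = np.sum((x[: n - k] - xbar) * (x[k:] - xbar)) / n
        r.append(ck / c0)
    return np.array(r)

# A strongly trending series shows autocorrelations near 1 at small lags.
print(acf(np.arange(20.0), 3))
```

Plotting these values against k gives exactly the correlogram described above.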


The gamma function is not built in on the TI Nspire. Nevertheless it can easily be defined and then used to visualise the gamma probability density function.

It is the World Cup season. There are many prediction models, and one of the most widely used statistical techniques is the Poisson distribution: P(X = k) = λ^k e^(−λ) / k!, where λ is the expected number of goals.

The historic match results are available in the public domain, for example at http://eloratings.net/. The data are analysed to obtain an index referred to as attack strength, or goal expectancy. This can be further elaborated into more refined figures like home-team and away-team expectancy.

By using a matrix of scores, usually from 0 to 9, all possible outcomes under 10 goals per team are enumerated with their respective probabilities. An example matrix with scores from 0 to 2 will look like the one below:
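A minimal Python sketch of such a 0-to-2 score matrix follows. The goal expectancies (1.6 home, 1.1 away) are made-up illustrative numbers, not figures derived from eloratings.net, and the scores are assumed independent as in the basic Poisson model:

```python
# Score matrix: entry [h][a] is P(home scores h) * P(away scores a).
import math

def poisson(k, lam):
    """Poisson probability P(X = k) with mean lam."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

home_exp, away_exp = 1.6, 1.1          # illustrative goal expectancies
matrix = [[poisson(h, home_exp) * poisson(a, away_exp)
           for a in range(3)]
          for h in range(3)]

for h, row in enumerate(matrix):
    print(h, ["%.3f" % p for p in row])
```

Extending both ranges to `range(10)` gives the full under-10-goals matrix described above.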

These are the basics of predicting soccer quantitatively with the Poisson distribution. There are of course many other prediction methods and models, including organic ones like asking the famous octopus Paul 🙂

Logistic regression is a very powerful tool for classification and prediction, and it works very well on linearly separable problems. This installment recaps its practical implementation, from the traditional maximum-likelihood perspective to the more machine-learning-flavoured neural network approach, and from handheld calculator to GPU cores.

The heart of the logistic regression model is the logistic function. It takes in any real value and returns a value in the range from 0 to 1, which is ideal for a binary classifier. The following is a graph of this function.
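In code the function is one line; this Python sketch (the name `sigmoid` is mine) shows the squashing behaviour at a few points:

```python
# The logistic (sigmoid) function maps any real x to the interval (0, 1).
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0))     # 0.5, the midpoint of the curve
print(sigmoid(6))     # close to 1
print(sigmoid(-6))    # close to 0
```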

In the TI Nspire calculator, logistic regression is provided as a built-in function but is limited to a single variable. For multi-variable problems, custom programming is required to apply optimization techniques that determine the coefficients of the regression model. One such approach, shown below, uses the Nelder-Mead method on the TI Nspire.

Suppose in a data set from university admission records there are four attributes (independent variables: *SAT score, GPA, Interview score, Aptitude score*) and one outcome (“*Admission*”) as the dependent variable.

Through the use of a Nelder-Mead program, the logistic function is first defined as **l**. It takes the regression coefficients and the values of the independent variables as its arguments.
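The same idea can be sketched in Python, with SciPy's Nelder-Mead standing in for the calculator program. The admission data below is synthetic (generated from made-up coefficients), since the original data set is not reproduced here:

```python
# Fit a four-variable logistic model by minimizing the negative
# log-likelihood with the Nelder-Mead method.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 4))                      # SAT, GPA, interview, aptitude (standardized)
true_b = np.array([0.5, 1.0, -0.5, 0.8, 1.2])    # intercept + 4 coefficients (made up)
p = 1 / (1 + np.exp(-(true_b[0] + X @ true_b[1:])))
y = rng.random(n) < p                            # admission outcome as 0/1

def neg_log_likelihood(b):
    z = b[0] + X @ b[1:]
    # -log L = sum log(1 + e^z) - sum_{y=1} z, written in a numerically safe form
    return np.sum(np.logaddexp(0, z)) - np.sum(z[y])

res = minimize(neg_log_likelihood, np.zeros(5), method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-8, "fatol": 1e-8})
print(res.x)   # fitted intercept and coefficients
```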

In R, a sophisticated statistical package, the calculation is much simpler. For the sample case above, it takes just a few lines of commands to invoke the built-in logistic model.

Apart from the traditional methods, modern advances in computing paradigms have made it possible to solve these problems much more efficiently with neural networks coupled with specialized hardware such as GPUs, especially on huge volumes of data. The Python library Theano supports and enriches these calculations through optimization and symbolic expression evaluation; it also features compiler support for CUDA and integrates a computer algebra system into Python.

One of the examples that comes with the Theano documentation uses logistic regression to showcase various Theano features. It first initializes a random set of data as the sample inputs and outcomes using **numpy.random**. Then the regression model is created by defining the expressions required for the logistic model, including the logistic function and the likelihood function. Lastly, the training function compiled from these expressions is run repeatedly to fit the coefficients.
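The structure of that example can be paraphrased in plain NumPy (so it runs without Theano installed); the sizes and learning rate here are illustrative, not the documentation's exact values:

```python
# Random sample data, a logistic model, and gradient steps on the
# mean cross-entropy cost -- the same shape as the Theano docs example.
import numpy as np

rng = np.random.default_rng(42)
N, feats = 400, 10
x = rng.normal(size=(N, feats))        # random inputs, as with numpy.random in the docs
y = rng.integers(0, 2, size=N)         # random 0/1 outcomes

w = np.zeros(feats)
b = 0.0
lr = 0.1
for _ in range(1000):                  # the loop Theano's compiled train() would run
    p = 1 / (1 + np.exp(-(x @ w + b))) # logistic function
    grad_w = x.T @ (p - y) / N         # gradient of mean cross-entropy
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

pred = (1 / (1 + np.exp(-(x @ w + b)))) > 0.5
print("training accuracy:", np.mean(pred == y))
```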

A nice feature of Theano is the pretty printing of the expression model in a tree-like text format. It is a feels-like-home reminiscence of my days reading SQL query plans to tune database queries.

The TI Nspire calculator provides a rich set of common units covering area, length, mass, and so on. Units start with an underscore in Nspire; for example, kg is represented as _kg. Users are free to create their own units. On the desktop version of the Nspire software, a shortcut for the conversion symbol (►) is “@>”.

Recently, over a conversation with a friend living overseas, we were curious about the lowest price we could get for a cut of meat in our own places. I got 12 per 500 g on discount a few days ago. He gets 7.8 per 1 lb at best.

Some mental calculation is needed since we use different units, but I decided to fire up the Nspire on this inequality to see what would happen:

Alas, it doesn’t work. Obviously I was expecting a boolean. The more verbose inequality with some pre-calculation didn’t work either.

That’s where I realized the Nspire might not be handling units in equations the way we expected. An easy fix, of course, is to multiply both sides by a common unit (e.g. _kg), but that pretty much defeats the whole point of the simplicity of calculations of this kind.
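For the record, the comparison itself is quick to check in Python. As in the original inequality, the prices are in our respective local currencies, so this only compares the magnitudes per kilogram:

```python
# Normalize both prices to "per kg" and compare.
GRAMS_PER_POUND = 453.59237

mine = 12 / 0.5                          # 12 per 500 g  -> per kg
his = 7.8 / (GRAMS_PER_POUND / 1000)     # 7.8 per 1 lb  -> per kg

print(round(mine, 2), round(his, 2))     # 24.0 vs ≈17.2 per kg
print(mine < his)                        # False: his works out cheaper per kg
```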

In this installment the Nelder-Mead method is used to train a simple neural network on the XOR problem. The network consists of two inputs, one output, and two hidden layers, and is fully connected. In mainstream practice, back propagation and evolutionary algorithms are much more popular for training neural networks on real-world problems. Nelder-Mead is used here just out of curiosity, to see how this general optimization routine performs in a neural network setting on the TI Nspire.

The sigmoid function is declared as a TI Nspire function.

For the XOR problem, the inputs are defined as two lists, and the expected output in another.

The activation functions for each neuron are declared.

To train the network, the sum of squared errors is used as the objective function for the Nelder-Mead algorithm to minimize. Random numbers are used as the initial parameters.

Finally, the resulting weights and biases are obtained by running the Nelder-Mead program.
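The same experiment can be sketched in Python, with SciPy's Nelder-Mead standing in for the calculator program. A common 2-2-1 layout (one hidden layer of two sigmoid units) is assumed here; the exact topology on the calculator may differ:

```python
# Train a small sigmoid network on XOR by minimizing the sum of
# squared errors with Nelder-Mead.
import numpy as np
from scipy.optimize import minimize

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0, 1, 1, 0], dtype=float)     # expected XOR outputs

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def forward(p):
    # p packs 9 parameters: hidden weights (2x2), hidden biases (2),
    # output weights (2), output bias (1)
    w1 = p[0:4].reshape(2, 2); b1 = p[4:6]
    w2 = p[6:8];               b2 = p[8]
    h = sigmoid(X @ w1 + b1)
    return sigmoid(h @ w2 + b2)

def sse(p):
    return np.sum((forward(p) - t) ** 2)

rng = np.random.default_rng(1)
p0 = rng.normal(scale=2.0, size=9)          # random initial parameters
res = minimize(sse, p0, method="Nelder-Mead",
               options={"maxiter": 20000, "fatol": 1e-10, "xatol": 1e-10})
print("final SSE:", res.fun)
print("outputs:", np.round(forward(res.x), 3))
```

Depending on the random start, Nelder-Mead can stall in a local minimum on this surface, so a few restarts may be needed, which matches the "just out of curiosity" spirit of the calculator experiment.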

The graph below compares the performance of the Nelder-Mead-trained XOR network against the expected values.

The FFT is not available as a built-in function on the TI Nspire, but it is trivial to write a program for this calculation. Instead of the standard TI Basic, Lua scripting is attempted this time. Unlike in TI Basic programs, variables are not shared directly, and things get complex when working with lists and matrices. However, there are utility functions in Nspire's Lua scripting that make it possible to exchange data with the Calculator page. The example below shows the FFT results from the Lua script in the Table page.
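To show how small such a program is, here is a radix-2 Cooley-Tukey FFT sketched in Python rather than Lua; a Lua version for the Nspire would follow the same recursive structure:

```python
# Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two.
import cmath

def fft(x):
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])          # FFT of even-indexed samples
    odd = fft(x[1::2])           # FFT of odd-indexed samples
    out = [0] * n
    for k in range(n // 2):
        tw = cmath.exp(-2j * cmath.pi * k / n) * odd[k]   # twiddle factor
        out[k] = even[k] + tw
        out[k + n // 2] = even[k] - tw
    return out

print(fft([1, 1, 1, 1, 0, 0, 0, 0]))
```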

The HP Prime provides a built-in function for the FFT.