
Project in history – Going wireless for a Tamiya kit

This project was completed around 2011. It may look ancient now that everyone is crazy about drones, but it brought me so much fun that I am still fond of the memories today and would like to share them here.

The goal back then was simple: a wired Tamiya kit + Arduino + Bluetooth + Android phone = wireless remote-controlled bulldozer. The idea was to use the wireless and accelerometer capabilities of an Android phone to control the kit. As far as I can recall there was no such term as STEM back then, but this sort of project can be seen in today's syllabuses on that exciting subject.

The name Tamiya must ring a bell for everyone who has gotten their feet wet in the remote-control modelling world. Tamiya makes great kits at very reasonable prices for beginners to intermediates. I was lucky enough to own and build one of their classics – a bulldozer kit – many years ago, and it remains one of the best parts of my memories, although that bulldozer itself is nowhere to be seen now.

Shown below is pretty much the same kit, which I came across from Tamiya around 2011 as a Christmas gift from my wife.

The finished mod: a Bluetooth-enabled Tamiya kit that can be controlled from an Android phone.

The Tamiya kit came with a perfectly good set of wired on-off switches to control the kit for forward-backward-left-right maneuvers.

The classic kit I had decades ago (also from Tamiya) had a wooden base and metal gears. These have since been replaced with plastic, which is easier to work with, but honestly I still miss the wood and metal. By the way, no power drill is needed for the classic wooden kit; I built mine comfortably with a screwdriver and brute force.

The core that made the jump from wired to wireless possible is an Arduino Mega board, mounted with ease to the white baseboard on the Tamiya kit.

The communication module is an HC-05 Bluetooth board, sitting alongside the H-bridge (SN754410) on an Arduino-compatible extension board. The design is fairly standard: separate the control circuit from the power circuit, which has to drive two 3V motors whose power requirements the Arduino board cannot handle.


Prototype in testing.

A variation of the configuration mounts a mobile phone on the kit to stream real-time video over WiFi, with control of the kit via Bluetooth from the viewing PC. It was a very popular mod back then, and the experience of driving a Mars rover at home is simply fantastic.

Visualizing WordPress stats with Excel Power Map

Power Map is a feature available in recent releases of Microsoft Excel that provides easy-to-use visualization of data in geographical form, as well as temporal animation.

As a sample exercise to understand where this blog's readers came from this year, the WordPress stats data for 2016 are imported into Excel Power Map. WordPress already offers a graphical presentation of visitor origin on a nice world map; to animate the data month by month across a whole year, Excel Power Map is a nice, easy-to-use tool that needs just a little extra effort.

The statistics page should be familiar to every WordPress user. Download the data for each month as CSV and store the files on a local drive.

Now prepare the data by consolidating it all into a single Excel worksheet. I used good old awk, since the source files are plain-text CSV, to save some copy-and-paste effort. For the animation to work, a new column indicating the time of each record (e.g. data for USA in January 2016) has to be added so that Power Map can handle the temporal attribute, and this column cannot be a formula.
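The consolidation step can be sketched as below. The file names and column layout are assumptions based on the WordPress stats export, not my original script; the key point is appending a literal date value to every row.

```shell
# Sample per-month WordPress "countries" exports (layout assumed)
printf 'Country,Views\nUnited States,10\n' > countries-2016-01.csv
printf 'Country,Views\nJapan,7\n'          > countries-2016-02.csv

# Consolidate into one CSV, appending a literal date column (e.g. 2016-01-01)
# so Power Map can use it as the temporal attribute.
for m in 01 02; do
  awk -v d="2016-$m-01" -F',' 'NR > 1 {print $0 "," d}' "countries-2016-$m.csv"
done > consolidated.csv

cat consolidated.csv
# United States,10,2016-01-01
# Japan,7,2016-02-01
```

The resulting file is then opened in Excel as the source worksheet.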

When the source data are ready, select the whole region and click the 3D Map button under the Insert menu.

The Power Map UI is then displayed. A globe is shown by default, but since I prefer a flat map to 3D, I clicked the Flat Map button.

The Power Map user interface works pretty much like a pivot table: you drop dimensions onto the pre-defined attribute panes on the right-hand side to tell Power Map how you would like the map to present your data. To create an animation, just drag the date column to the "Time" section and Power Map will render accordingly.

Now that everything is ready, the animation plays nicely, and Power Map can also export the animation as a video file.

GARCH model in R

A much more practical approach than calculating GARCH parameters on a calculator is to do it in R. Not only are packages readily available, retrieving financial data for experimentation is also a piece of cake, as the built-in facilities offer convenient access to historical data.

To use GARCH in R, the library must be installed first.
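The original snippet did not survive, but with the tseries package (one common choice) the setup looks like this:

```r
# install once, then load; tseries provides garch() and get.hist.quote()
install.packages("tseries")
library(tseries)
```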


To test the library, data are imported using the tseries package.
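A typical import with get.hist.quote() is sketched below; the instrument and date range are placeholders, not the original choices, and the function pulled from Yahoo Finance at the time of writing.

```r
# daily closing prices for an example index
prices <- get.hist.quote(instrument = "^gspc", start = "2005-01-01",
                         end = "2010-12-31", quote = "Close")
ret <- diff(log(prices))        # log returns
plot(ret, main = "Log returns") # the plot shown below
```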



A plot of the log return.




Before running the GARCH model, a QQ plot is reviewed.
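A minimal way to produce the QQ plot against the normal distribution (a sketch, assuming the log-return series from above):

```r
# fat tails relative to the reference line are what motivate a GARCH fit
qqnorm(as.numeric(ret))
qqline(as.numeric(ret))
```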



Finally, the GARCH model is created using the command below.
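Assuming the standard garch() function from tseries, a GARCH(1,1) fit on the log returns looks like:

```r
# GARCH(1,1); trace = FALSE suppresses the optimizer's iteration output
fit <- garch(as.numeric(ret), order = c(1, 1), trace = FALSE)
```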



Density plot.




With trace=FALSE, a clean model can be printed after the fit.
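For instance, assuming the fit object from the tseries garch() call:

```r
fit           # prints the fitted coefficients a0, a1, b1
summary(fit)  # adds Jarque-Bera and Box-Ljung diagnostics
```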


Adding Google authenticator support to RFID password keeper

The latest upgrade to my DIY gadget, which keeps passwords and logs in to Windows with a swipe of an RFID card, is support for Google Authenticator.

In brief, the gadget is an Atmel-based micro-controller connected to an RFID card reader, packaged in the form factor of a name card holder. With a USB connection to any Windows-based PC, all I need to do to log in is wave my card.

Although it supports only static passwords, this gadget has served me well over the years. Recently I have found myself relying more on dynamic authentication, such as the one-time passwords provided by YubiKey and Google Authenticator.

These are proven technologies for multi-factor authentication, and I trust them with many of my Amazon AWS-based Linux hosts.

Even though Google Authenticator is already very user friendly via its Android app, using it means pulling the phone out of my pocket, starting the app, reading the six-digit code, and typing it in as quickly as possible.

To make life easier, I recently upgraded the RFID gadget to support Google Authenticator one-time passwords. The programming side of the upgrade is easy, as the algorithm is open (RFC 6238) and there are a handful of libraries available. The obvious hurdle to implementing TOTP on this gadget is the lack of a real-time clock (RTC) for the micro-controller to compute the required authentication code. Although most RTC modules are compact these days, fitting one more PCB into this already cramped gadget is not easy.
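The algorithm itself is small; here is a sketch of RFC 6238 TOTP in Python (not the micro-controller firmware, just the same computation for illustration). The 30-second step and 6-digit output are the common defaults.

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, then
    RFC 4226 dynamic truncation to a fixed number of decimal digits."""
    counter = struct.pack(">Q", unix_time // step)
    mac = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# RFC 6238 reference secret; at T = 59 s the 6-digit SHA-1 code is 287082
print(totp(b"12345678901234567890", 59))  # -> 287082
```

This is exactly why the current Unix time has to reach the gadget somehow: without it, `unix_time // step` cannot be computed.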

So for now I have settled on an alternative: since the micro-controller supports serial communication, the PC host itself can provide the time source via the simple PowerShell script below:

$utctime = [int][double]::Parse((Get-Date -Date (Get-Date).ToUniversalTime() -UFormat %s))
$port = New-Object System.IO.Ports.SerialPort COM7,9600,None,8,1
$port.Open(); $port.WriteLine($utctime)
$port.Close()

Just run this script to feed the timestamp, then swipe the RFID card as usual when the SSH prompt asks for the Google Authenticator verification code. I am happy with this upgrade.


Analysis of traffic CCTV camera images with GPU deep learning on the Amazon cloud

In this installment, modern computing technologies, including the Internet, artificial intelligence, deep learning, GPUs, and the cloud, are put to work on a practical problem.

Traffic jams are very common in metropolitan cities, and the authorities responsible for road traffic management often install CCTV for monitoring. Camera data published on the Internet and available for big data analysis is one such example. The following are two sample images from the same camera, one showing heavy traffic and the other very light traffic.

Determining these two distinct conditions from the images above is a piece of cake for modern pattern recognition technology. The frontier in this field is, no doubt, deep learning networks running on GPUs, and Amazon AWS provides GPU-equipped computing resources for exactly this kind of task.

The first step is to fire up a GPU instance on Amazon AWS. The following AMI is selected; note that HVM is required for GPU VMs.

The next step is to choose a GPU instance type. As the problem is simple enough, a g2.2xlarge suffices.

After logging in to the terminal, set up CUDA 7, cuDNN, Caffe, and DIGITS. The steps can be found in their respective official documentation. A device query test confirmed successful installation of CUDA. The whole process may take an hour if everything is installed from scratch; there may be pre-built images out there.

Note that an account with the NVIDIA Accelerated Computing Developer Program may be required to download some of these packages. A make test confirmed the complete setup of Caffe.

Finally, after installing DIGITS, the login page defaults to port 5000. In the AWS console, a network rule can easily be set up to open this port. Alternatively, for a more secure connection, an SSH tunnel can be used instead; I ran mine on port 8500.
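The tunnel is a standard SSH port forward; the key file and host name below are placeholders, not the actual instance details:

```shell
# forward local port 8500 to the DIGITS web UI (port 5000) on the instance,
# then browse to http://localhost:8500
ssh -i mykey.pem -L 8500:localhost:5000 ubuntu@ec2-instance-public-dns
```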

Now it is time to start training, beginning with a new image classification dataset. As stated above, the source images come from traffic cameras installed at strategic points of the road network, which publish JPG images on the web for public access, refreshed every 2 minutes. A simple shell script is prepared to fetch the images and build the dataset. Below is the screen where DIGITS configures the classification training.
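The fetch script itself can be sketched as follows; the camera URL and file-name scheme are placeholders (the real public endpoint is not reproduced here), and the polling loop matches the camera's 2-minute refresh.

```shell
# Hypothetical camera endpoint; substitute the real public image URL.
CAM_URL="http://example.com/traffic/cam42.jpg"

snapshot_name() {               # timestamped file name for each fetch
  printf 'cam_%s.jpg' "$1"
}

fetch_once() {                  # grab the current frame into a dated file
  curl -fsS "$CAM_URL" -o "$(snapshot_name "$(date -u +%Y%m%d%H%M%S)")"
}

# the camera refreshes every 2 minutes, so poll on the same cadence:
# while true; do fetch_once; sleep 120; done
echo "$(snapshot_name 20160101120000)"
```

The downloaded frames are then sorted into "jam" and "free" folders to form the two classes for DIGITS.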

Since our sample dataset is small, the training completes in no time.


Next, a model is defined. GoogLeNet is selected in this example.

Model training in progress. The charts update in real time.


When the model is ready, some tests can be carried out. In this example, the model is trained to determine whether the camera image indicates a traffic jam or not.

A traffic jam sample. Prediction: free=-64.23%, jam=89.32%

The opposite. Prediction: free=116.62%, jam=-133.61%

With the Amazon cloud, the ability to deploy cutting-edge AI technology on GPUs is no longer limited to researchers or the resource-rich. The general public can now benefit from these easily accessible computing resources to explore the limitless possibilities of the big data era.