Monthly Archives: May 2019

Profiling machine learning applications in TensorFlow

TensorFlow provides a timeline package, which can be imported from tensorflow.python.client:

from tensorflow.python.client import timeline

This is useful for profiling TensorFlow applications, with graphical visualization similar to the graphs generated by the CUDA Visual Profiler. With a small tweak to the machine learning code, TensorFlow applications can store and report performance metrics of the learning process.
[Screenshot: tfprofile3]

The design of the timeline package makes it easy to add profiling by simply adding the code below.

run_options = tf.RunOptions(trace_level=tf.RunOptions.FULL_TRACE)  # capture a full execution trace
run_metadata = tf.RunMetadata()  # holds the collected step statistics

The model also needs to be compiled with the profiling options:

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'],
              options=run_options,
              run_metadata=run_metadata)

With the sample MNIST digits classifier for TensorFlow, the output shows the Keras history being saved; it can later be retrieved to generate reports.
[Screenshot: tfprofile2]
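For reference, below is a minimal sketch of how the collected run metadata can be written out as a Chrome trace file after training; the file name timeline.json is just an example.

# Assumes the model was compiled with options/run_metadata as above
# and has been trained with model.fit(...)
trace = timeline.Timeline(step_stats=run_metadata.step_stats)
with open('timeline.json', 'w') as f:
    # generate_chrome_trace_format() returns JSON readable by chrome://tracing
    f.write(trace.generate_chrome_trace_format())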

Finally, using the Chrome tracing page (chrome://tracing/), the performance metrics persisted to the file system can be opened for verification.
[Screenshot: tfprofile1]

 


TensorFlow and Keras on RTX2060 for pattern recognition

The MNIST database is a catalog of handwritten digits for image processing. With TensorFlow and Keras, training a neural network classifier using the Nvidia RTX 2060 GPU is a walk in the park.
[Screenshot: mnist2]

Using the default import of the MNIST dataset via tf.keras, which comprises 60,000 handwritten digit images of 28 x 28 pixels, training a neural network to classify them can be accomplished in a matter of seconds, depending on the desired accuracy. The same training on an ordinary CPU is not as quick as on the GPU because of architectural differences. In this sample run, the digit “eight” is correctly identified by the neural network.
[Screenshot: mnist4]
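For context, here is a minimal sketch of such a classifier along the lines of the standard tf.keras MNIST tutorial; the layer sizes and epoch count are illustrative.

import tensorflow as tf

# Load the MNIST digits (60,000 training images of 28 x 28 pixels)
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small fully connected classifier
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test)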

A simple comparison of training results for the MNIST database on my RTX 2060 with varying numbers of training samples shows slight differences in the final accuracy.
[Screenshot: mnist1]
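A comparison like this can be scripted with a simple loop; build_model below is a hypothetical helper returning a freshly compiled copy of the model from the sketch above, and the subset sizes are only examples.

# Illustrative only: train on subsets of different sizes and compare test accuracy
for n in (10000, 30000, 60000):
    model = build_model()  # hypothetical helper: returns a fresh, compiled model
    model.fit(x_train[:n], y_train[:n], epochs=5, verbose=0)
    _, acc = model.evaluate(x_test, y_test, verbose=0)
    print('samples:', n, 'test accuracy:', acc)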

 

More test driving of TensorFlow with Nvidia RTX 2060

By following the TensorFlow guide, it is easy to see how TensorFlow harnesses the power of my new Nvidia RTX 2060.

The first one is image recognition. Similar to the technique used in a previous installment on neural network training with traffic images captured from CCTV, a sample data set of labelled images of fashion objects is learned using TensorFlow. In that previous installment, the Amazon Web Services cloud with a K520 GPU instance was used for the model training. In this blog post, the training actually takes place on the Nvidia RTX 2060.

[Screenshot: tfimg1]
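A minimal sketch of that training, following the basic classification example in the TensorFlow guide; the layer sizes and epoch count here are illustrative.

import tensorflow as tf

# Fashion-MNIST: 60,000 training images of clothing items in 10 classes
(train_images, train_labels), (test_images, test_labels) = \
    tf.keras.datasets.fashion_mnist.load_data()
train_images, test_images = train_images / 255.0, test_images / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(train_images, train_labels, epochs=10)
model.evaluate(test_images, test_labels)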

Another classic sample is regression analysis, using the Auto MPG data set. With a few lines of code, TensorFlow cleans up the data set to remove unsuitable values and converts categorical values to numeric ones for the model.

[Screenshot: tfimg2]
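The clean-up step looks roughly like the snippet below, following the TensorFlow regression tutorial; the download URL and column names are taken from that tutorial and may change over time.

import pandas as pd

# Auto MPG data set as used in the TensorFlow regression tutorial
url = 'http://archive.ics.uci.edu/ml/machine-learning-databases/auto-mpg/auto-mpg.data'
columns = ['MPG', 'Cylinders', 'Displacement', 'Horsepower',
           'Weight', 'Acceleration', 'Model Year', 'Origin']
dataset = pd.read_csv(url, names=columns, na_values='?',
                      comment='\t', sep=' ', skipinitialspace=True)

# Remove rows with unsuitable (missing) values
dataset = dataset.dropna()

# Convert the categorical 'Origin' column into numeric one-hot columns
origin = dataset.pop('Origin')
dataset['USA'] = (origin == 1) * 1.0
dataset['Europe'] = (origin == 2) * 1.0
dataset['Japan'] = (origin == 3) * 1.0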

Monte Carlo methods in TensorFlow

Markov chain Monte Carlo (MCMC) is a method for sampling from a probability distribution when direct sampling is not feasible.

The Monte Carlo implementation in the TensorFlow Probability package includes samples for running Hamiltonian MCMC, a variant that uses Hamiltonian dynamics to avoid slow exploration of the state space.
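As a rough illustration of the API, here is a minimal Hamiltonian MCMC sketch with tensorflow_probability; the standard normal target, step size and number of leapfrog steps are arbitrary choices, and in graph mode (TensorFlow 1.x) the sampling ops would still need to be evaluated in a session.

import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# Target: a standard normal distribution to sample from
target_log_prob = lambda x: tfd.Normal(loc=0., scale=1.).log_prob(x)

# Hamiltonian Monte Carlo transition kernel
hmc = tfp.mcmc.HamiltonianMonteCarlo(
    target_log_prob_fn=target_log_prob,
    step_size=0.1,
    num_leapfrog_steps=3)

# Draw a chain of samples after a burn-in period
samples, kernel_results = tfp.mcmc.sample_chain(
    num_results=1000,
    num_burnin_steps=500,
    current_state=tf.zeros([]),
    kernel=hmc)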

While running the samples, my Python installation reported errors on importing tensorflow_probability:
[Screenshot: tfp1]

The problem is resolved by uninstalling and re-installing the tensorflow-estimator package:


pip uninstall tensorflow_estimator
pip install tensorflow_estimator
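
A quick way to confirm the import works again after the re-install:

import tensorflow_probability as tfp
print(tfp.__version__)  # should now import without errors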

Finally, the samples run fine with the expected results.

[Screenshot: tfp3]

The results from the sample run:
[Screenshots: tfp7, tfp4]