Martin Heller
Contributor

How to use TensorFlow in your browser

Feature | Sep 02, 2020 | 8 mins
Analytics | Deep Learning | Development Tools

Take advantage of TensorFlow.js to develop and train machine learning models in JavaScript and deploy them in a browser or on Node.js

[Image: neural network. Credit: KTSimage / Getty Images]

While you can train simple neural networks on relatively small amounts of training data with TensorFlow, deep neural networks with large training datasets really need CUDA-capable Nvidia GPUs, Google TPUs, or FPGAs for acceleration. The alternative has, until recently, been to train on clusters of CPUs for weeks.

One of the innovations introduced with TensorFlow 2.0 is a JavaScript implementation, TensorFlow.js. I wouldn’t have expected that to improve training or inference speed, but it does, given its support for all GPUs (not just CUDA-capable GPUs) via the WebGL API.


What is TensorFlow.js?

TensorFlow.js is a library for developing and training machine learning models in JavaScript, and deploying them in a browser or on Node.js. You can use existing models, convert Python TensorFlow models, use transfer learning to retrain existing models with your own data, and develop models from scratch.
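Getting started is lightweight: in a browser you can load the library from a CDN script tag, and in Node.js or a bundler-based web app you install the @tensorflow/tfjs package from NPM. The snippet below is a minimal sanity check, assuming the standard package name and API:

// npm install @tensorflow/tfjs   (or load tf.min.js from a CDN <script> tag)
import * as tf from '@tensorflow/tfjs';

// Create a small tensor and print it to confirm the library is working.
const t = tf.tensor2d([[1, 2], [3, 4]]);
t.print();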

TensorFlow.js back ends

TensorFlow.js supports multiple back ends for execution, although only one can be active at a time. The TensorFlow.js Node.js environment supports using an installed build of Python/C TensorFlow as a back end, which may in turn use the machine’s available hardware acceleration, for example CUDA. There is also a JavaScript-based back end for Node.js, but its capabilities are limited.
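As a sketch of the Node.js case (assuming the published @tensorflow/tfjs-node package), binding the native TensorFlow back end is just a matter of importing the right package before you build any models:

// Importing tfjs-node registers the native TensorFlow C binding as the back end.
// Use @tensorflow/tfjs-node-gpu instead if you have a CUDA-capable GPU configured.
const tf = require('@tensorflow/tfjs-node');

console.log(tf.getBackend()); // typically 'tensorflow' when the native binding loads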

In the browser, TensorFlow.js has several back ends with different characteristics. The WebGL back end provides GPU support using WebGL textures for storage and WebGL shaders for execution, and can be up to 100x faster than the plain CPU back end. WebGL does not require CUDA, so it can take advantage of whatever GPU is present.

The WebAssembly (WASM) TensorFlow.js back end for the browser uses the XNNPACK library for optimized CPU implementation of neural network operators. The WASM back end is generally much faster (10x to 30x) than the JavaScript CPU back end, but is usually slower than the WebGL back end except for very small models. Your mileage may vary, so test both the WASM and WebGL back ends for your own models on your own hardware.
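Switching back ends for such a test is straightforward; the sketch below assumes the published @tensorflow/tfjs-backend-wasm package and the standard tf.setBackend() and tf.ready() calls, and uses a throwaway matrix multiplication as the workload:

import * as tf from '@tensorflow/tfjs';
// Importing the package registers the WASM back end with TensorFlow.js.
import '@tensorflow/tfjs-backend-wasm';

async function compareBackends() {
  for (const name of ['wasm', 'webgl']) {
    await tf.setBackend(name);
    await tf.ready();
    const x = tf.randomNormal([1000, 1000]);
    const start = performance.now();
    x.matMul(x).dataSync(); // force execution and read back the result
    console.log(`${tf.getBackend()}: ${(performance.now() - start).toFixed(1)} ms`);
  }
}

compareBackends();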

TensorFlow.js models and layers

TensorFlow.js supports two APIs for building neural network models. One is the Layers API, which is essentially the same as the Keras API in TensorFlow 2. The other is the Core API, which is essentially direct manipulation of tensors.

Like Keras, the TensorFlow.js Layers API has two ways to create a model: sequential and functional. The sequential API is a linear stack of layers, implemented with a layer list (as shown below) or with the model.add() method:

const model = tf.sequential({
  layers: [
    tf.layers.dense({inputShape: [784], units: 32, activation: 'relu'}),
    tf.layers.dense({units: 10, activation: 'softmax'}),
  ]
});
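The equivalent incremental form, using model.add() as mentioned above, looks like this:

// Same two-layer network, built one layer at a time with model.add().
const model = tf.sequential();
model.add(tf.layers.dense({inputShape: [784], units: 32, activation: 'relu'}));
model.add(tf.layers.dense({units: 10, activation: 'softmax'}));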

The functional API uses the tf.model() API and can create arbitrary DAG (directed acyclic graph) networks:

// Create an arbitrary graph of layers, by connecting them
// via the apply() method.
const input = tf.input({shape: [784]});
const dense1 = tf.layers.dense({units: 32, activation: 'relu'}).apply(input);
const dense2 = tf.layers.dense({units: 10, activation: 'softmax'}).apply(dense1);
const model = tf.model({inputs: input, outputs: dense2});

The Core API can accomplish the same goals with different code, and with a less intuitive tie to layers. The model below may look like basic tensor operations, but it creates the same network as the two previous formulations. Note the use of relu() and softmax(), which are both neural network operations, in the model() function below.

// The weights and biases for the two dense layers.
const w1 = tf.variable(tf.randomNormal([784, 32]));
const b1 = tf.variable(tf.randomNormal([32]));
const w2 = tf.variable(tf.randomNormal([32, 10]));
const b2 = tf.variable(tf.randomNormal([10]));

function model(x) {
  return x.matMul(w1).add(b1).relu().matMul(w2).add(b2).softmax();
}
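However you define it, running the model is just a matter of feeding it a tensor of the right shape; the random input below is purely illustrative:

// Run the Core API model on a batch of one random 784-element input.
const x = tf.randomNormal([1, 784]);
const probabilities = model(x); // shape [1, 10], one probability per class
probabilities.print();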

Pre-built TensorFlow.js models

There are over a dozen pre-built TensorFlow.js models documented, available in the repository, and hosted on NPM (for use in Node.js) and unpkg (for use in a browser). You can use these models as supplied or for transfer learning. With a little work, you can also use them as building blocks for other models.
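As an illustration of how little code a prepackaged model needs (this sketch assumes the published @tensorflow-models/mobilenet package and a placeholder image element ID; check the package README for the current API), classifying an image with MobileNet takes a few lines:

import * as mobilenet from '@tensorflow-models/mobilenet';

async function classifyImage(imgElement) {
  // Load the pre-trained MobileNet model, then classify an <img> element.
  const model = await mobilenet.load();
  const predictions = await model.classify(imgElement);
  console.log(predictions); // [{className, probability}, ...]
}

classifyImage(document.getElementById('my-image')); // 'my-image' is a placeholder ID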

Several of these models use a device’s camera in real time, for example handpose:

[Screenshot: TensorFlow.js handpose demo. Credit: IDG]

Handpose can detect palms and track hand-skeleton fingers.
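A hedged sketch of driving handpose from a video element (using the published @tensorflow-models/handpose package; field names as documented there) looks like this:

import * as handpose from '@tensorflow-models/handpose';

async function trackHands(videoElement) {
  const model = await handpose.load();
  // Each prediction contains 3D landmarks for the 21 keypoints of a detected hand.
  const predictions = await model.estimateHands(videoElement);
  if (predictions.length > 0) {
    console.log(predictions[0].landmarks);
  }
}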

The tfjs-models repository serves as a convenient index into most of the prepackaged TensorFlow.js models.

What is ml5.js?

ml5.js is an open source, friendly, high-level interface to TensorFlow.js developed primarily at NYU. ml5.js provides immediate access in the browser to pre-trained models for detecting human poses, generating text, styling one image with another, composing music, detecting pitch, finding common English-language word relationships, and much more. While TensorFlow.js is aimed primarily at data scientists and developers, ml5.js aims to support broader public understanding of machine learning, and to foster deeper engagement with ethical computing, responsible data collection, and accessibility and diversity of people and perspectives in technology and the arts.

Most of the examples in ml5.js depend on TensorFlow.js models. They have been packaged as web pages that you can run as is, or edit, for example to use different images.
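For a sense of how compact the ml5.js interface is, here is a hedged sketch of image classification with its MobileNet wrapper (ml5.imageClassifier, as documented on the ml5.js site); the element ID is a placeholder:

// ml5.js wraps the TensorFlow.js MobileNet model in a callback-style API.
const classifier = ml5.imageClassifier('MobileNet', () => {
  console.log('Model loaded');
});

classifier.classify(document.getElementById('image'), (err, results) => {
  if (err) {
    console.error(err);
    return;
  }
  console.log(results); // [{label, confidence}, ...]
});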

[Screenshot: TensorFlow.js PoseNet demo. Credit: IDG]

PoseNet can perform real-time pose estimation in the browser, from images or a video feed.

Demo: Iris classification with TensorFlow.js

The famous Iris discrimination dataset, originated by R.A. Fisher in 1936 to illustrate linear discriminant analysis, is still used as a test case for statistical and machine learning classification methods. It uses four features, the length and width of the flower sepals and petals, to classify three species of Iris, with 50 samples of each species. (Fisher’s original paper was published in the Annals of Eugenics, which says more about science in 1936 than it does about the data or the statistics.)

If you perform cluster analysis on this data, two of the species will share one cluster, with the third (I. setosa) in a separate cluster. On the other hand, principal component analysis can separate all three species fairly well.

The TensorFlow.js sample fits the Iris data with two fully-connected (dense) neural network layers, as shown in the code extract below.

// Define the topology of the model: two dense layers.
const model = tf.sequential();
model.add(tf.layers.dense(
    {units: 10, activation: 'sigmoid', inputShape: [xTrain.shape[1]]}));
model.add(tf.layers.dense({units: 3, activation: 'softmax'}));
model.summary();

const optimizer = tf.train.adam(params.learningRate);
model.compile({
  optimizer: optimizer,
  loss: 'categoricalCrossentropy',
  metrics: ['accuracy'],
});
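The sample then trains with the standard model.fit() call. A condensed sketch (inside an async function; the actual example also wires up visualization callbacks, and xTrain/yTrain and xTest/yTest are the tensors the sample prepares) looks like this:

// Train for a fixed number of epochs, validating against the held-out test set.
const history = await model.fit(xTrain, yTrain, {
  epochs: params.epochs,
  validationData: [xTest, yTest],
  callbacks: {
    onEpochEnd: (epoch, logs) => {
      console.log(`Epoch ${epoch}: loss = ${logs.loss.toFixed(4)}`);
    },
  },
});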

As you can see in the screenshot below, this model does a decent job of classifying the three species. If you play around with the parameters, however, you’ll discover that some confusion between two of the species (the ones in the same cluster) reappears if you iterate for more than 40 epochs.

[Screenshot: TensorFlow.js Iris training. Credit: IDG]

Model training for the Iris dataset using a two-dense-layer neural network model.

Converting Python TensorFlow models to JavaScript

The TensorFlow.js repository includes a converter for saved TensorFlow and Keras models. It supports three formats: SavedModel (the default for TensorFlow), HDF5 (the default for Keras), and TensorFlow Hub. You can use the converter for saved models from the standard repositories, models you’ve trained yourself, and models you’ve found elsewhere.

There are actually two steps to the conversion. The first step is to convert the existing model to model.json and binary weight files. The second step is to use an API to load the model into TensorFlow.js, either tf.loadGraphModel for converted TensorFlow and TensorFlow Hub models, or tf.loadLayersModel for converted Keras models.
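As a concrete sketch of those two steps (the converter ships in the tensorflowjs pip package; the file paths here are placeholders, and the loads run inside an async function):

// Step 1 (command line, after `pip install tensorflowjs`):
//   tensorflowjs_converter --input_format=keras my_model.h5 tfjs_model/
//
// Step 2 (browser or Node.js): load the converted model.json plus weight shards.
const layersModel = await tf.loadLayersModel('tfjs_model/model.json');

// For a converted TensorFlow SavedModel or TensorFlow Hub module,
// use tf.loadGraphModel() instead:
const graphModel = await tf.loadGraphModel('tfjs_graph_model/model.json');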

Using transfer learning

TensorFlow.js supports transfer learning in essentially the same way as TensorFlow. The documentation supplies codelabs for customizing MobileNet to recognize your own images and for customizing a speech command recognition model to recognize your own sound classes. Essentially, what you are doing in each of these codelabs is adding a small custom classifier on top of the trained model, and training that; a condensed sketch of the pattern follows.
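In this sketch, the hosted MobileNet URL and the internal layer name follow the public TensorFlow.js webcam codelab, NUM_CLASSES stands in for however many classes your own data defines, and the code runs inside an async function; treat it as an outline rather than drop-in code:

// Load a pre-trained MobileNet and truncate it at an internal activation layer.
const mobilenet = await tf.loadLayersModel(
    'https://storage.googleapis.com/tfjs-models/tfjs/mobilenet_v1_0.25_224/model.json');
const bottleneck = mobilenet.getLayer('conv_pw_13_relu');
const featureExtractor = tf.model({inputs: mobilenet.inputs, outputs: bottleneck.output});

// Add a small trainable classifier head on top of the frozen feature extractor.
const head = tf.sequential({
  layers: [
    tf.layers.flatten({inputShape: featureExtractor.outputs[0].shape.slice(1)}),
    tf.layers.dense({units: 100, activation: 'relu'}),
    tf.layers.dense({units: NUM_CLASSES, activation: 'softmax'}),
  ],
});
head.compile({optimizer: tf.train.adam(0.0001), loss: 'categoricalCrossentropy'});

// Training then calls head.fit() on activations from featureExtractor.predict(),
// leaving the MobileNet weights untouched.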

Overall, TensorFlow.js can do almost anything that TensorFlow can do. However, given that the target environments for TensorFlow.js (garden-variety gaming GPUs) typically have less GPU memory than the big Nvidia server GPUs used for TensorFlow deep learning training, you may have to reduce the size of your model to make it run in a browser. The conversion utility does some of this for you, but you may have to remove layers manually and reduce the batch sizes for your training.

Martin Heller
Contributor

Martin Heller is a contributing editor and reviewer for InfoWorld. Formerly a web and Windows programming consultant, he developed databases, software, and websites from his office in Andover, Massachusetts, from 1986 to 2010. More recently, he has served as VP of technology and education at Alpha Software and chairman and CEO at Tubifi. Disclosure: He also writes for Hewlett-Packard’s TechBeacon marketing website.
