TensorFlow on Raspberry Pi: Just in Time for Pi Day!

This work was truly a team effort, so please check out the credits of the repo and give everyone there a warm e-hug.

TensorFlow gets smaller as it gets bigger

Earlier today, I released instructions for compiling TensorFlow on the Raspberry Pi 3, as well as a pre-built Python wheel that can be used to install it directly. I’m hoping that this will enable some really cool projects involving both portable GPIO device-based learning and experimentation with TensorFlow’s distributed runtime. This has been an effort that has gone on since TensorFlow was open-sourced, and I’m really happy to be part of the group of people that made it happen.

What’s in the Repo

There are two main attractions to the repository: a pre-built Python wheel that can be used to easily install TensorFlow on a Raspberry Pi 3, and a step-by-step guide to building TensorFlow yourself.

Why Bother?

Several people have asked questions along the lines of: “Why would you want to run TensorFlow on a Raspberry Pi? Its compute power is minuscule.”

The first, quick answer: you probably don’t want to train your sophisticated models on a Raspberry Pi. Instead, train the model on a computer with more processing power (both CPU and GPU), and then move that pre-trained model onto the Pi for real-time use.
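To make that workflow concrete, here is a minimal sketch in plain Python/NumPy (not TensorFlow's actual API): the "training machine" saves a set of weights to a file, and the "Pi" only loads them and runs the forward pass. All names here (`model.npz`, `predict`) are hypothetical, chosen for illustration.

```python
import numpy as np

# On the beefy training machine: train a model (faked here with
# random weights) and save the learned parameters to disk.
rng = np.random.default_rng(0)
weights = {"W": rng.standard_normal((4, 2)), "b": np.zeros(2)}
np.savez("model.npz", **weights)

# On the Raspberry Pi: no training at all -- just load the saved
# parameters and run inference in real time.
def predict(x, params):
    """One dense layer followed by a sigmoid activation."""
    logits = x @ params["W"] + params["b"]
    return 1.0 / (1.0 + np.exp(-logits))

params = np.load("model.npz")
x = rng.standard_normal((1, 4))   # stand-in for a live sensor reading
probs = predict(x, params)
print(probs.shape)
```

The expensive part (training) happens once, off-device; the Pi only pays for a couple of matrix multiplies per reading, which it can comfortably afford.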

The second, more verbose answer:

With so much focus on the insane amount of computing power some companies are using to create breakthroughs in machine learning, such as Google’s AlphaGo recently beating Go champion Lee Sedol, it’s easy to fall into the mindset that the only worthwhile machine learning problems require hundreds of GPUs. In truth, many applications sit at the opposite end of the spectrum: embedded devices with limited memory and processing power can also take advantage of machine learning. Unfortunately, you can’t keep throwing more hardware into a device smaller than your hand. We don’t want our smart health monitors to require an internet connection just to detect anomalies; they should be able to use a model built into the device. The hope is that a widely cross-compatible and powerful framework will lower the barrier to making ML-capable devices and make installing pre-trained models less of a headache on limited hardware.

Plus, having access to GPIO sensors and other devices could enable some really cool prototypes for machine learning that incorporate realtime data from their surroundings. Good stuff all around!

So check it out, let me know what works (and what doesn’t), and let’s keep making TensorFlow a kick-ass framework with a kick-ass community.

Read More

TensorFlow Serving: TensorFlow in Production

More good news!

Google announced today that they’re releasing TensorFlow Serving, a way to maintain machine learning models that are defined and trained in TensorFlow.

In a typical production setting, you want a way to swap in new models, or possibly train a model online. You also need this process to be seamless, or you risk losing service for a period of time. TensorFlow Serving offers a framework that manages the behind-the-scenes work of this process, so that the user can spend more time trying out new models. It’s in its infancy right now, but I’m excited to try it out soon.

Read More

TensorFlow White Paper Notes – On GitHub

Hey all!

A lot of content in a short post: I just released my TensorFlow white paper notes on GitHub. It’s been a much larger project than I originally thought, but I hope it will be useful to people. The notes go from the very beginning of the paper all the way through the conclusions, highlighting the important information from each section.

Here are the top features of the notes:

  • Notes broken down and organized section by section, as well as subsection by subsection
  • Relevant links to documentation, resources, and references throughout
  • SVG versions of figures/graphs from the paper

If you haven’t checked out the TensorFlow white paper yet, I highly recommend it. There’s no replacement for reading the original, but I hope these notes are a worthy supplement alongside the actual paper.

Next up: Using TensorFlow on transformed Santa Monica Parking data

Read More

TensorFlow: Google’s Latest Machine Learning Software is Open Sourced!

Yes. Another ML Library.

But this one is different! The hacker community is wickedly excited about it, and you should be too! TensorFlow was released by Google today, and it looks to be a really exciting step forward for open source machine learning, and perhaps for the entire computational mathematics community.

What is it? From TensorFlow’s introduction, it is “an open source library for numerical computation using data flow graphs”!

What is “an open source library for numerical computation using data flow graphs”?

Sounds like a mouthful, but “data flow graphs” are just a more general term for the kind of modeling neural networks use. And the library is described that way for a reason: TensorFlow is designed not only to provide flexible, highly optimized neural networks, but to perform any sort of computation that is organized in a similar graph-like structure.

<aside>

More on data flow graphs

These graphs are composed of two primary components, nodes and edges.

[Figure: a data flow graph illustrating the difference between edges and nodes. Original chart property of Google.]

Nodes are the squares, circles, or ellipses on charts such as the one above. They represent any sort of mathematical operation or function. In a neural network, these are your activation functions (like the sigmoid function).

Edges are the connections between the nodes. As you can see, they are directional: data flows from the output of one node into the input of the next node (or several nodes) through these edges. Edges represent the “tensors”, or multi-dimensional arrays, that carry the output of one node to the inputs of the next.

Compare that with a typical neural network model, and you can see how a neural network is just a specialized version of a data flow graph. Back to TensorFlow!

</aside>
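The aside above can be sketched in a few lines of plain Python. This is a toy evaluator, not TensorFlow's actual API: each `Node` is an operation, and the `inputs` references play the role of edges carrying values between nodes.

```python
import math

class Node:
    """One operation in a toy data flow graph."""
    def __init__(self, op, *inputs):
        self.op = op          # the mathematical operation this node performs
        self.inputs = inputs  # incoming edges: nodes whose outputs we consume

    def evaluate(self):
        # Pull values along the incoming edges, then apply the operation.
        return self.op(*(n.evaluate() for n in self.inputs))

def const(value):
    # A node with no inputs that simply emits a constant.
    return Node(lambda: value)

# Build the graph for sigmoid(x * w + b): a one-neuron "network"
# expressed as a plain data flow graph.
x, w, b = const(2.0), const(0.5), const(-1.0)
mul = Node(lambda a, c: a * c, x, w)
add = Node(lambda a, c: a + c, mul, b)
out = Node(lambda a: 1.0 / (1.0 + math.exp(-a)), add)

print(out.evaluate())  # sigmoid(2.0 * 0.5 - 1.0) = sigmoid(0.0) = 0.5
```

Nothing here is specific to neural networks: swap the sigmoid node for any other function and the same graph machinery still works, which is exactly the generality TensorFlow is aiming for.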

So what exactly is there to get excited about with TensorFlow? There are a jillion machine learning libraries out there, so how does this one stick out from the crowd (other than being created by Google)? Well, a fair amount, actually. Here are some of the things I’m most excited about:

  • Easier Transition from Research to Production: Something that was always troublesome in machine learning, especially with neural networks, was taking a model crafted in research and applying it in a real production setting. Much research is done using Python, R, or MATLAB (with accompanying libraries), which allows for faster iteration through the design and testing phase. Before, that code would hardly be touched once the model moved to production, as it needed to be reimplemented in a faster language, such as C++ or Java. Because of the way TensorFlow is designed, we should be able to take what we have and bring it directly to production with minimal, if any, code changes.

  • Flexibility: This is both a great thing and something to keep in mind. TensorFlow is not a neural network library: it is a data flow graph library. This makes it capable of handling much more nuanced and hand-modeled graphs, but it also requires more finagling. While it doesn’t appear too difficult to create a simple neural network now, I expect that higher-level libraries will be built on top of TensorFlow to make it extremely easy.

  • Automatic CPU/GPU Integration: This might be the most exciting one for me. GPUs, or graphics processing units, have enabled much faster learning (especially neural networks), and taking advantage of them is crucial in having the power to create robust models. The problem, however, is that most machine learning libraries out there don’t have GPU support, and those that do are either hard to use or are much less flexible. For example, scikit-learn, one of the most popular libraries for machine learning, while extremely useful for testing out ideas, has no plans for GPU support in the near future. TensorFlow promises to bring both flexibility and power by taking advantage of all of your computing resources.

I’ll be digging into this more over the next few weeks! Check out Google Research’s blog post if you’re interested in reading more about it.

Read More