Live Coding: TensorFlow vs. PerceptiLabs: Who has more Flower Power

Ever wondered how PerceptiLabs' visual approach compares to writing TensorFlow code? Recently, Robert Lundberg, CTO and Co-Founder of PerceptiLabs, gave a live coding demonstration to show just that, using TensorFlow's flower classification model.

TensorFlow, the most popular machine learning (ML) framework today, provides a tutorial on how to classify different types of flowers. However, as many of us have experienced, creating an ML model using raw TensorFlow code can be a laborious and time-consuming process.

For starters, you need both programming knowledge and knowledge of the framework itself to get up and running. On top of this, it's very difficult to visualize the model, especially as it increases in size and complexity, and you have to run (train) the model before you can output results.

These issues become apparent as you try to recreate and experiment with TensorFlow's flower image classification tutorial. To begin with, the model chains together multiple convolutional layers, each generating varying levels of feature maps, so it's difficult to visualize feature extraction when tuning the model. There is also the issue of having to rely on third-party modules to visualize how the model performs. Iterating involves training the whole model, viewing the results (most of which are console output), and then hunting for the lines of code to tweak, before doing it all over again.
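To give a sense of the code involved, here is a minimal sketch of the kind of model the tutorial builds: a stack of convolutional and pooling layers feeding dense layers. The layer sizes and the 180x180 input resolution follow the tutorial's defaults for its five flower classes; treat this as an illustration rather than a verbatim copy of the tutorial.

```python
import tensorflow as tf
from tensorflow.keras import layers

num_classes = 5  # the tutorial's dataset: daisy, dandelion, roses, sunflowers, tulips

# Each Conv2D layer extracts progressively higher-level feature maps,
# which is exactly the part that's hard to visualize from code alone.
model = tf.keras.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=(180, 180, 3)),
    layers.Conv2D(16, 3, padding="same", activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, padding="same", activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, padding="same", activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(num_classes),  # raw logits, one per flower class
])

model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
```

Even in this compact form, inspecting what any one convolutional layer has learned means writing extra visualization code or pulling in third-party tooling, and every tweak means another full training run.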

PerceptiLabs abstracts away common patterns of TensorFlow code as visual components, so that users can drag, drop, and connect them into an ML model while still having access to the underlying TensorFlow code. So when it comes to the flower image classification model, you can literally see how a flower is transformed into feature maps and correlated with its label, as you build, tune, and train the model.

Curious as to how PerceptiLabs stacks up against raw TensorFlow code? See for yourself by watching the video here:

Here’s some additional information to check out on how PerceptiLabs, the GUI and visual API for TensorFlow, makes your life easier: