In this post, I demonstrate how to build Tensorflow 1.5 from source on macOS 10.13 High Sierra. To verify that the build really works, I show how to change some Tensorflow Python code, do an incremental build, and observe the change in action via a simple test program.

Installing Pre-Requisites

Tensorflow Source Code

You will most likely want to fork the main Tensorflow git repository on GitHub and clone your own fork to your local computer. The following command clones our Knowm fork of Tensorflow, which contains a custom build script not present in the main Tensorflow project.
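For example (the fork URL below follows GitHub's usual owner/repo pattern and is an assumption on our part):

```shell
# Clone the Knowm fork of Tensorflow (URL assumed) and enter the source tree
git clone https://github.com/knowm/tensorflow.git
cd tensorflow
```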

Xcode

Install Xcode via the App Store. I tried to avoid this step by installing only the command line tools with xcode-select --install, but found out by trial and error that the full Xcode installation is indeed needed.

Bazel

Bazel is a software build and dependency-management tool similar to Ant and Maven. Installation instructions are here.
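The linked instructions are authoritative; on macOS a common route (our suggestion, not from the original instructions) is Homebrew:

```shell
# Install Bazel via Homebrew (assumes Homebrew is already installed).
# Each Tensorflow release expects a compatible Bazel version, so check
# the official instructions if the build complains about the version.
brew install bazel
bazel version   # verify the installation
```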

Custom Build Script

We need a custom build_tf.sh because we didn’t install Python the recommended way using VirtualEnv. The inspiration for our build file is here. It detects all of the CPU’s features and applies the relevant ones when building TF, which makes TF faster because it can use specialized CPU instructions (such as SSE4.2 or AVX) if your computer has them.

Here is our build_tf.sh:
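The script listing did not survive in this copy of the post; the sketch below is reconstructed from the five steps explained later in this post. The CPU-feature detection (the sysctl keys and the flag mapping) is an assumption modeled on the build script linked above:

```shell
#!/usr/bin/env bash
# build_tf.sh (sketch): reconstructed from the steps described in this
# post; the feature-detection details are assumptions.

# Detect CPU features on macOS and map them to gcc-style -m flags so
# Bazel compiles TF with the instructions this machine supports.
COPT=""
for feature in $(sysctl -n machdep.cpu.features machdep.cpu.leaf7_features); do
  case "$feature" in
    SSE4.2) COPT="$COPT --copt=-msse4.2" ;;
    AVX1.0) COPT="$COPT --copt=-mavx" ;;
    AVX2)   COPT="$COPT --copt=-mavx2" ;;
    FMA)    COPT="$COPT --copt=-mfma" ;;
  esac
done

bazel clean
#./configure   # uncomment and run the first time to configure Tensorflow

# -k: keep going past individual failures; only changed sources rebuild.
bazel build -c opt $COPT -k //tensorflow/tools/pip_package:build_pip_package

# Package the build into a Python wheel, then install it.
bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
pip3 install --upgrade /tmp/tensorflow_pkg/$(ls /tmp/tensorflow_pkg/ | grep tensorflow)
```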

Notice the commented-out line #./configure, which you’ll need to run the first time to configure Tensorflow. Make sure the file is executable (chmod +x build_tf.sh), then run it with ./build_tf.sh.

When asked for the path to Python, answer: /usr/local/bin/python3

The first build might take over two hours!

Getting Tensorboard to Also Work After Building from Source

This references an issue I opened (https://github.com/tensorflow/tensorboard/issues/812) about the error ImportError: cannot import name 'run_main' when running Tensorboard. If you run into this issue, just run pip3 install tb-nightly.

Run an Example

Now that Tensorflow has been built from source, you should run an example from the main Tensorflow project to verify that everything works:
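The original example listing was not preserved in this copy of the post; as a minimal smoke test (our suggestion), import the freshly installed package and print its version:

```shell
# Should report 1.5.x if the wheel was built and installed correctly
python3 -c "import tensorflow as tf; print(tf.__version__)"
```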

Congratulations!!

Incrementally Building Tensorflow

For developing Tensorflow, we obviously don’t want to rebuild the entire source tree from scratch every single time a change is made. Instead, we want to build TF incrementally after each change.

To accomplish this, we first need to understand what our build file build_tf.sh does, in particular its last few steps:

bazel clean

This deletes all the build artifacts.

./configure

This runs an interactive prompt that asks a series of questions to configure the build.

bazel build -c opt $COPT -k //tensorflow/tools/pip_package:build_pip_package

This does the compiling. It is smart enough to know to only build source that has been modified since the last build.

bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg

This creates a Python wheel file in /tmp/tensorflow_pkg.

pip3 install --upgrade /tmp/tensorflow_pkg/$(ls /tmp/tensorflow_pkg/ | grep tensorflow)

This installs the wheel, so that the freshly built Tensorflow can be used from Python via import tensorflow.

For an incremental build we only want to run the last three of these five commands, so we can simply comment out the first two lines of the script. The incremental build then takes less than two minutes.
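Concretely, the incremental version of build_tf.sh (first two commands commented out) looks like this:

```shell
# Incremental build: skip cleaning and re-configuring so Bazel can
# reuse everything that has not changed since the last build.
#bazel clean
#./configure
bazel build -c opt $COPT -k //tensorflow/tools/pip_package:build_pip_package
bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
pip3 install --upgrade /tmp/tensorflow_pkg/$(ls /tmp/tensorflow_pkg/ | grep tensorflow)
```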

Hacking Tensorflow at the Python Level

Let’s see if we can change some code in TF itself in a trivial way, recompile, and re-run a simple hello world program such as hellotf.py from our HelloTensorflow project, to verify that our incremental build setup is working as expected. We’ll use the following simple TF program:

When run, it prints the value of the constant.

What if we change the tf.constant code to append the string _hack onto the end of the input constant?

In constant_op.py (source here), line 212, we can modify it like this:
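The exact diff was not preserved in this copy of the post. The stand-alone sketch below mimics the effect of the change (appending "_hack" to every string passed to constant()) without requiring a Tensorflow checkout; the function name and structure here are illustrative, not TF’s internals:

```python
# Stand-in for the patched constant(): append "_hack" to the end of
# any string value, as our modified constant_op.py does.
def patched_constant(value):
    if isinstance(value, str):
        value = value + "_hack"
    return value

print(patched_constant("Hello, Tensorflow!"))  # Hello, Tensorflow!_hack
```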

After re-building and re-running hellotf.py, the printed constant now ends in _hack, which confirms that our incremental build setup is indeed working!
