Note: this is an updated post for Tensorflow 1.6.0


I had a tough time figuring out how to prepare a Tensorflow library package that you can use in your own CMake projects. Tensorflow uses Bazel as its build system and does not yet provide a build target for creating the necessary C++ headers. Below I describe the steps I took to make it work for Tensorflow 1.6.0 on macOS. Note that for other Tensorflow versions the procedure, especially finding the correct headers, might differ.


If you just want to get started and don’t want to do all these steps yourself, grab the zip file below and have fun!

Follow the steps below if you want to compile the C++ library and scrape all necessary headers yourself.



Download the tensorflow repo, check out the stable release branch and run configure:

$ git clone
$ cd tensorflow
$ git checkout r1.6
$ ./configure

Build the library. You can choose between different optimisation settings.

No optimisations, runs on many machines:

$ bazel build --config=monolithic //

Choose specific optimisations (pick which cpu extensions you want to support), e.g.:

$ bazel build -c opt --copt=-mavx --copt=-mavx2 --copt=-mfma  --copt=-msse4.2 \
--config=monolithic //

The default choice (--config=opt) is the worst choice if you want to ship the final program: it optimises for the CPU of the build machine (-march=native), so the binary may not run on other processors.

Compilation takes quite a long time…

Library files

When done, install the header and library files in e.g. ~/libs, but you can choose another location like /usr/local if you prefer installing system wide. First fix the library's install name, so that the dynamic linker looks for it relative to the binary that loads it (@loader_path). From the tensorflow source folder, run the commands:

$ cd bazel-bin/tensorflow/
$ install_name_tool -id "@loader_path/"

Then, back in the source folder, install the library where you want, e.g.

$ cp bazel-bin/tensorflow/ ~/libs/tensorflow/lib/


or, for a system-wide install:

$ cp bazel-bin/tensorflow/ /usr/local/lib/

Header files

Finding the needed header files is a bit tricky. Copy the following folders from the tensorflow source directory to your libs header folder:

Tensorflow header files

# DST is where the headers will be installed, e.g.:
DST=~/libs/tensorflow/include
mkdir -p $DST/tensorflow

cp -RL ./tensorflow/core $DST/tensorflow
cp -RL ./tensorflow/cc $DST/tensorflow

mkdir -p $DST/third_party
cp -RL ./third_party/eigen3 $DST/third_party

#rm -rf $DST/third_party/eigen3/unsupported
cp -RLf ./bazel-tensorflow/external/eigen_archive/unsupported $DST

cp -RL ./bazel-genfiles/tensorflow/cc $DST/tensorflow
cp -RL ./bazel-genfiles/tensorflow/core $DST/tensorflow

cp -RL ./bazel-tensorflow/external/eigen_archive/Eigen $DST/Eigen

nsync header and protobuf files

Get the sources for nsync, Eigen and Google protobuf by running

$ tensorflow/contrib/makefile/

The downloaded files can be found in

$ tensorflow/contrib/makefile/downloads/


Simple C++ test

Let’s try it all out. Create a project folder and copy the libs folder into your project. This allows you to use different versions of Tensorflow in different projects.


#include <iostream>
#include <vector>
#include "tensorflow/cc/client/client_session.h"
#include "tensorflow/cc/ops/standard_ops.h"

int main() {

    using namespace tensorflow;
    using namespace tensorflow::ops;
    Scope root = Scope::NewRootScope();

    auto A = Const(root, {{1.f, 2.f}, {3.f, 4.f}});
    auto b = Const(root, {{5.f, 6.f}});
    auto x = MatMul(root.WithOpName("v"), A, b, MatMul::TransposeB(true));
    std::vector<Tensor> outputs;

    std::unique_ptr<ClientSession> session = std::make_unique<ClientSession>(root);
    TF_CHECK_OK(session->Run({x}, &outputs));
    std::cout << outputs[0].matrix<float>() << std::endl;
    return 0;
}



The CMakeLists.txt for the project:

cmake_minimum_required(VERSION 3.9)
project(tftest)

# std::make_unique in main.cpp needs C++14
set(CMAKE_CXX_STANDARD 14)

# adjust if you installed the headers/library elsewhere
include_directories(libs/tensorflow/include)
link_directories(libs/tensorflow/lib)

add_executable(tftest main.cpp)
target_link_libraries(tftest tensorflow_cc)

If all steps were successful, building and running the test should print the result of A·bᵀ:

$ tftest
17
39