TensorFlow Serving exposes a gRPC service for serving general TensorFlow models, so we can implement a C++ gRPC client to send predict requests.
You need Bazel to build the client, and inception_client.cc in the TensorFlow Serving repository is a good reference.
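For reference, sparse_predict_client.cc could follow the same pattern as inception_client.cc. The following is only a minimal sketch under assumptions that are not part of this project: the model server listens on localhost:9000, the model is exported under the name "sparse", and it expects a single float input tensor named "keys"; adjust these to match your own export.

// sparse_predict_client.cc: minimal PredictionService client (sketch).
#include <iostream>
#include <memory>

#include "grpcpp/grpcpp.h"  // older gRPC releases use the "grpc++/..." headers instead
#include "tensorflow/core/framework/tensor.pb.h"
#include "tensorflow_serving/apis/prediction_service.grpc.pb.h"

int main(int argc, char** argv) {
  // Server address, model name and input name below are assumptions;
  // change them to match your deployment and exported signature.
  auto channel = grpc::CreateChannel("localhost:9000",
                                     grpc::InsecureChannelCredentials());
  std::unique_ptr<tensorflow::serving::PredictionService::Stub> stub =
      tensorflow::serving::PredictionService::NewStub(channel);

  tensorflow::serving::PredictRequest request;
  request.mutable_model_spec()->set_name("sparse");  // hypothetical model name

  // Build a 1-element float tensor for the input named "keys" (hypothetical).
  tensorflow::TensorProto input;
  input.set_dtype(tensorflow::DT_FLOAT);
  input.mutable_tensor_shape()->add_dim()->set_size(1);
  input.add_float_val(1.0f);
  (*request.mutable_inputs())["keys"] = input;

  tensorflow::serving::PredictResponse response;
  grpc::ClientContext context;
  grpc::Status status = stub->Predict(&context, request, &response);
  if (!status.ok()) {
    std::cerr << "Predict RPC failed: " << status.error_message() << std::endl;
    return 1;
  }
  std::cout << response.DebugString() << std::endl;
  return 0;
}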
Add the build rule to tensorflow_serving/example/BUILD and copy sparse_predict_client.cc into the example directory.
cc_binary(
    name = "sparse_predict_client",
    srcs = [
        "sparse_predict_client.cc",
    ],
    deps = [
        "//tensorflow_serving/apis:prediction_service_proto",
    ],
)
Compile the gRPC client.
bazel build //tensorflow_serving/example:sparse_predict_client
Run the predict client.
bazel-bin/tensorflow_serving/example/sparse_predict_client
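Note that the client expects a TensorFlow Serving model server to already be running at the address compiled into the client (localhost:9000 in the sketch above). A typical way to start one, where the model name and base path are placeholders for your own export, is:
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_name=sparse --model_base_path=/tmp/sparse_model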