
Hi Yiran,

  1. Yes, I have submitted a patch, and it has been accepted into TensorFlow Serving to allow serving multiple versions of a model (https://github.com/tensorflow/serving/pull/217, https://github.com/tensorflow/serving/pull/235). This is available in the master branch; you can turn on multi-version serving with the flag model_version_policy=ALL. You can specify which version of the model to hit from your gRPC client using the version attribute, as specified here: https://github.com/tensorflow/serving/blob/master/tensorflow_serving/apis/model.proto
  2. The previous release would attempt to load the model and fail if the variables don't exist. When that happens, you will see the error logged; however, existing served models are not affected. SavedModel is the new interface format for exporting models in TensorFlow, and it is also the currently recommended format to use.
  3. Yes, I’ve been following the issue myself too. ML and TensorFlow are pretty exciting indeed. :)

Out of curiosity, what type of problem space are you looking at tackling now with machine learning and deep neural networks?

Regards

Wai
