TensorFlow: How to freeze a model and serve it with a python API
Morgan

Hello,

Thanks for your great tutorial.

I'm working with the InceptionResNet V2 model. After I froze it, the model started giving incorrect predictions, and every run returns different results: the logits and predictions change each time.

input_node = graph.get_tensor_by_name('prefix/batch:0')
output_node = graph.get_tensor_by_name('prefix/InceptionResnetV2/Logits/Predictions:0')
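For reference, here is a minimal sketch of how the frozen graph is loaded and queried, following the load_graph helper from the article; the .pb filename and the images batch are placeholder assumptions:

import tensorflow as tf

def load_graph(frozen_graph_filename):
    # Read the serialized GraphDef and import it under the "prefix" scope,
    # which is why the tensor names below start with "prefix/".
    with tf.gfile.GFile(frozen_graph_filename, "rb") as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name="prefix")
    return graph

graph = load_graph("frozen_inception_resnet_v2.pb")  # placeholder filename
input_node = graph.get_tensor_by_name("prefix/batch:0")
output_node = graph.get_tensor_by_name("prefix/InceptionResnetV2/Logits/Predictions:0")

with tf.Session(graph=graph) as sess:
    # `images` is a placeholder for a preprocessed batch shaped to match the
    # "batch" input tensor; a truly frozen graph should return the same
    # predictions for the same batch on repeated runs.
    predictions = sess.run(output_node, feed_dict={input_node: images})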