Creating Custom Model For Android Using TensorFlow
Amit Shekhar

Hi,

Nice blog, really useful one. I tried the example, but when I load the model and call inferenceInterface.readNodeFloat(outputName, outputs); I get the following error:

E/TensorFlowInferenceInterface(8184): Failed to run TensorFlow session: java.lang.IllegalArgumentException: No OpKernel was registered to support Op 'Switch' with these attrs. Registered devices: [CPU], Registered kernels:
E/TensorFlowInferenceInterface(8184): device='GPU'; T in [DT_STRING]
E/TensorFlowInferenceInterface(8184): device='GPU'; T in [DT_BOOL]
E/TensorFlowInferenceInterface(8184): device='GPU'; T in [DT_INT32]
E/TensorFlowInferenceInterface(8184): device='GPU'; T in [DT_FLOAT]
E/TensorFlowInferenceInterface(8184): device='CPU'; T in [DT_FLOAT]
E/TensorFlowInferenceInterface(8184): device='CPU'; T in [DT_INT32]
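
For context, here is a minimal sketch of the call sequence that line comes from, assuming the older contrib TensorFlowInferenceInterface API used in this post (initializeTensorFlow / fillNodeFloat / runInference / readNodeFloat). The model path, node names, and array sizes are placeholders, not the ones from the article.

// Sketch only: MODEL_FILE, INPUT_NODE, OUTPUT_NODE, and the sizes below are hypothetical.
// Assumes the pre-1.x org.tensorflow.contrib.android API that this post is based on.
import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

public class SampleInference {
    private static final String MODEL_FILE = "file:///android_asset/model.pb"; // placeholder path
    private static final String INPUT_NODE = "input";   // placeholder input node name
    private static final String OUTPUT_NODE = "output"; // placeholder output node name

    private final TensorFlowInferenceInterface inferenceInterface =
            new TensorFlowInferenceInterface();

    public float[] run(android.content.res.AssetManager assets, float[] input) {
        // Load the frozen graph from the app's assets.
        inferenceInterface.initializeTensorFlow(assets, MODEL_FILE);

        // Feed the input tensor (the dims here are placeholders).
        inferenceInterface.fillNodeFloat(INPUT_NODE, new int[] {1, input.length}, input);

        // Run the graph up to the output node.
        inferenceInterface.runInference(new String[] {OUTPUT_NODE});

        // Read the result; this is the call that fails with the 'Switch' error above.
        float[] outputs = new float[2]; // placeholder output size
        inferenceInterface.readNodeFloat(OUTPUT_NODE, outputs);
        return outputs;
    }
}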
