The Art of Loading Custom Objects and Functions in Keras

in the simplest possible way

Siladittya Manna
The Owl
3 min read · Aug 28, 2023


In two earlier articles, we saw how to save custom objects by registering them to a global list.

Disclaimer: All the codes in the articles mentioned above and in this article were done in TFv2.12 and Keras-2.12.0 in a Kaggle Notebook environment. Updated frameworks may behave differently.
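As a quick refresher, registration looks like this. `scaled_mse` below is a made-up example function for illustration, not one from the articles above:

```python
import tensorflow as tf

# Start from a clean registry so stale entries from earlier runs
# do not interfere with saving.
tf.keras.saving.get_custom_objects().clear()

# The default package is "Custom", so this function is registered
# under the key "Custom>scaled_mse" in the global registry.
@tf.keras.saving.register_keras_serializable(name="scaled_mse")
def scaled_mse(y_true, y_pred):
    return 2.0 * tf.reduce_mean(tf.square(y_true - y_pred))

print(tf.keras.saving.get_custom_objects().keys())
```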

But, will we face any issues if we try to load our model in an entirely different script or notebook? Or resume training in the same notebook later?

Suppose you have used two custom objects in your model: a custom loss object and a custom learning rate scheduler. Also suppose you have used tf.keras.optimizers.Adam as the optimizer. I noticed during my implementation that if you use model.save('model_name.keras') to save your model, the optimizer is treated as a custom object, registered under the key "Custom>Adam" following the registry's package>name naming convention. (I will probably publish a separate article explaining that, but for now, let us go with it.)

If the model is saved using model.save('model_name.keras'), then it needs to be loaded with keras.models.load_model(...). Before loading the model, the custom loss object and the custom learning rate scheduler need to be defined in the training notebook or script. Since tf.keras.optimizers.Adam is already a built-in object, it need not be defined again. Instead, we can pass it as a custom object to the keras.models.load_model function, like this:

model = keras.models.load_model('path/to/saved/model', custom_objects={"Custom>Adam": tf.keras.optimizers.Adam})

Since the other custom objects were already registered when saving the model, they need not be passed into the custom_objects dictionary.
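For a registered class to survive this round trip, it needs a get_config method that returns its constructor arguments. Here is a minimal sketch, assuming a weighted binary cross-entropy with per-class weights; the exact implementation in the earlier articles may differ:

```python
import tensorflow as tf

@tf.keras.saving.register_keras_serializable(name="WeightedBinaryCrossentropy")
class WeightedBinaryCrossentropy(tf.keras.losses.Loss):
    def __init__(self, weights=(1.0, 1.0), name="weighted_bce", **kwargs):
        super().__init__(name=name, **kwargs)
        self.class_weights = list(weights)

    def call(self, y_true, y_pred):
        # Clip predictions to avoid log(0).
        eps = tf.keras.backend.epsilon()
        y_pred = tf.clip_by_value(y_pred, eps, 1.0 - eps)
        # Weight the negative and positive terms separately.
        loss = -(self.class_weights[1] * y_true * tf.math.log(y_pred)
                 + self.class_weights[0] * (1.0 - y_true) * tf.math.log(1.0 - y_pred))
        return tf.reduce_mean(loss, axis=-1)

    def get_config(self):
        # Whatever is returned here is what load_model uses to rebuild the object.
        config = super().get_config()
        config.update({"weights": self.class_weights})
        return config
```

Because the decorator registers the class at definition time, simply re-running this definition before calling load_model is what makes loading work; no custom_objects entry is needed for it.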

The final structure goes as follows:

For Training

#### Training
tf.keras.saving.get_custom_objects().clear()

#### Custom Loss Function
@tf.keras.saving.register_keras_serializable(name="weighted_binary_crossentropy")
def weighted_binary_crossentropy(target, output, weights):
    ...

@tf.keras.saving.register_keras_serializable(name="WeightedBinaryCrossentropy")
class WeightedBinaryCrossentropy:
    def __init__(
    ...

#### Custom LR Scheduler
@tf.keras.saving.register_keras_serializable(name="Cosine Decay")
class CosineDecay:
    def __init__(
    ...

### Build the Model
model = build_model()
cosine_decay = CosineDecay()
optimizer = tf.keras.optimizers.Adam(learning_rate=cosine_decay)
loss = WeightedBinaryCrossentropy(weights=[...])
### Compiling the Model
model.compile(...)
### Training the Model
model.fit()
### Saving the Model
model.save('path/to/model')

For Inference

### Inference
tf.keras.saving.get_custom_objects().clear()

#### Custom Loss Function
@tf.keras.saving.register_keras_serializable(name="weighted_binary_crossentropy")
def weighted_binary_crossentropy(target, output, weights):
    ...

@tf.keras.saving.register_keras_serializable(name="WeightedBinaryCrossentropy")
class WeightedBinaryCrossentropy:
    def __init__(
    ...

#### Custom LR Scheduler
@tf.keras.saving.register_keras_serializable(name="Cosine Decay")
class CosineDecay:
    def __init__(
    ...

model = tf.keras.models.load_model('path/to/model', custom_objects={"Custom>Adam": tf.keras.optimizers.Adam})

If any model from other packages, like keras_cv, is used, then it needs to be passed into the custom_objects dictionary too. I faced some serialization issues with custom modules when saving a model that used the ResNet backbone from keras_cv.

However, the SavedModel format behaves differently from the high-level .keras format. Quoting the TF documentation:

The key difference between high-level .keras/HDF5 formats and the low-level SavedModel format is that the .keras/HDF5 formats use object configs to save the model architecture, while SavedModel saves the execution graph. Thus, SavedModels are able to save custom objects like subclassed models and custom layers without requiring the original code. However, debugging low-level SavedModels can be more difficult as a result, and we recommend using the high-level .keras format instead due to its name-based, Keras-native nature.

To save the model in the SavedModel format, we can use the same method, model.save('saved_folder/modelname'), and load with tf.keras.models.load_model. This is a workaround when using custom objects, since it saves the whole graph. In the next post, we will explore in detail the differences between the SavedModel and .keras/h5 formats.
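As a sketch of the two save paths, using a throwaway one-layer model (any compiled model would do):

```python
import tensorflow as tf

# A throwaway model just to demonstrate the two formats.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer="adam", loss="mse")

# A path ending in .keras selects the high-level, config-based format.
model.save("model.keras")

# A plain directory path selects the low-level SavedModel format,
# which stores the execution graph itself.
model.save("saved_folder/modelname")

restored = tf.keras.models.load_model("saved_folder/modelname")
```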

Clap and share if you liked the article. Follow for more.
