Export a SavedModel in TensorFlow

Last Updated : 08 Jun, 2023

In TensorFlow, a SavedModel is a serialized format for storing a complete TensorFlow program, including the trained weights and the computation graph. The tf.saved_model.save() function exports a trained model and its associated variables to disk in this format. The exported model can then be used for deployment and inference, transfer learning, or other purposes.

The following steps show how to export a SavedModel in TensorFlow, using a Softmax Regression model trained on the MNIST dataset as the running example.

Step 1: 

Create a Softmax Regression model and train it using the MNIST dataset.

Python

# Import the necessary libraries
import tensorflow as tf
  
# Load MNIST dataset
mnist = tf.keras.datasets.mnist
(x_train, y_train),(x_test, y_test) = mnist.load_data()
  
# Normalize the data
x_train, x_test = x_train / 255.0, x_test / 255.0
  
# Define the Softmax Regression model
model = tf.keras.models.Sequential([
  tf.keras.layers.Flatten(input_shape=(28, 28)),
  tf.keras.layers.Dense(10, activation='softmax')
])
  
# Compile the model
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
  
# Train the model
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))


Output:

Epoch 1/5
1875/1875 [==============================] - 3s 1ms/step - loss: 0.4726 - accuracy: 0.8764 - val_loss: 0.3060 - val_accuracy: 0.9154
Epoch 2/5
1875/1875 [==============================] - 2s 1ms/step - loss: 0.3038 - accuracy: 0.9158 - val_loss: 0.2808 - val_accuracy: 0.9222
Epoch 3/5
1875/1875 [==============================] - 3s 1ms/step - loss: 0.2831 - accuracy: 0.9209 - val_loss: 0.2758 - val_accuracy: 0.9235
Epoch 4/5
1875/1875 [==============================] - 3s 1ms/step - loss: 0.2734 - accuracy: 0.9229 - val_loss: 0.2688 - val_accuracy: 0.9256
Epoch 5/5
1875/1875 [==============================] - 2s 1ms/step - loss: 0.2661 - accuracy: 0.9257 - val_loss: 0.2675 - val_accuracy: 0.9267

Step 2:

Evaluate the model accuracy:

Python3

loss, accuracy = model.evaluate(x_test, y_test)
print('The accuracy of the model is :', accuracy)


Output:

313/313 [==============================] - 0s 1ms/step - loss: 0.2675 - accuracy: 0.9267
The accuracy of the model is : 0.9266999959945679

Step 3: 

Save the model using the tf.saved_model.save function.

Python

# Save the model
tf.saved_model.save(model, 'softmax_regression_model')
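
In the TensorFlow 2.x releases this article targets, calling Keras's own model.save() with a plain directory path writes the same SavedModel format by default, so the line below is an equivalent way to export this model (a minimal sketch; the directory name is just an example):

Python

# Keras's model.save() also writes the SavedModel format by default in TF 2.x
model.save('softmax_regression_model_keras')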


Step 4: 

Verify that the model has been saved successfully by inspecting the contents of the SavedModel directory.

Python

# Inspect the contents of the SavedModel directory
!ls softmax_regression_model


Output:

assets    fingerprint.pb    saved_model.pb    variables

The saved_model.pb file stores the serialized computation graph and its signatures, the variables folder holds the trained weights, the assets folder holds any extra files the graph needs (such as vocabulary files), and fingerprint.pb contains a fingerprint that identifies this exported model.
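
You can also inspect the exported tag-sets, signatures, and tensor shapes with the saved_model_cli utility that ships with TensorFlow (a quick sketch, assuming the tool is available on your PATH):

Python

# Show all tag-sets, signatures, inputs and outputs of the exported model
!saved_model_cli show --dir softmax_regression_model --all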

Step 5:

 Optionally, you can add version numbers to the SavedModel directory to keep track of different versions of the model.

Python

# Save the model with version number
tf.saved_model.save(model, 'softmax_regression_model/1')


Step 6: 

Verify that the versioned model has been saved successfully.

Python

# Inspect the contents of the SavedModel directory with version number
!ls softmax_regression_model/1


Output:

assets    fingerprint.pb    saved_model.pb    variables

The contents of the versioned directory are the same as those of the original directory. Numbered subdirectories like this are the layout TensorFlow Serving expects when it scans a model's base path for versions to load.

Step 7:

Let's load the model using TensorFlow Hub:

Python3

# Import the necessary libraries
import tensorflow_hub as hub
import numpy as np

# Load the SavedModel as a Keras layer
layer = hub.KerasLayer("softmax_regression_model/")

# Create a custom input of shape (1, 28, 28)
Input = np.linspace(0, 1, 784).reshape(28, 28)
Input = tf.expand_dims(Input, axis=0)

print(layer(Input))
layer.trainable = True
print(layer.trainable_weights)


Output:

tf.Tensor(
[[2.9126030e-16 5.6060693e-19 5.5827466e-03 6.9878437e-03 2.6784150e-19
  9.8738652e-01 1.5881575e-17 4.2873238e-05 3.4423101e-15 5.1269178e-10]], shape=(1, 10), dtype=float32)
[<tf.Variable 'dense/kernel:0' shape=(784, 10) dtype=float32, numpy=
array([[-0.02441423,  0.04307493, -0.00332485, ...,  0.03308415,
         0.07554208, -0.00533313],
       [ 0.01357713, -0.00179666,  0.08344419, ..., -0.07203246,
         0.04122115, -0.08428719],
       [-0.06156562, -0.08187176, -0.06699241, ..., -0.05588092,
        -0.04787212, -0.05234763],
       ...,
       [-0.02957132,  0.07599085, -0.07504247, ...,  0.03821079,
         0.04094885, -0.03252703],
       [ 0.01038309, -0.01061375,  0.03050219, ..., -0.02180967,
         0.07942755,  0.01886416],
       [-0.02210779,  0.06026974, -0.00152316, ..., -0.02114672,
        -0.06010536, -0.05444371]], dtype=float32)>, <tf.Variable 'dense/bias:0' shape=(10,) dtype=float32, numpy=
array([-0.37059996,  0.48123115,  0.09208864, -0.29430756,  0.09193437,
        0.9103967 , -0.08532815,  0.5119505 , -1.0492268 , -0.1949552 ],
      dtype=float32)>]
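
Besides hub.KerasLayer, the exported directory can also be loaded back with the core tf.saved_model.load() API. The short sketch below reuses the Input tensor from the previous step and casts it to float32, since the automatically generated serving signature expects float32 inputs:

Python

# Load the SavedModel directly with the core TensorFlow API
loaded = tf.saved_model.load('softmax_regression_model')

# A Keras model exported this way gets an automatically generated serving signature
print(list(loaded.signatures.keys()))   # typically ['serving_default']

# Call the signature; the result is a dictionary of named output tensors
infer = loaded.signatures['serving_default']
print(infer(tf.cast(Input, tf.float32)))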

Here are some common scenarios related to exporting a SavedModel in TensorFlow:

  • Exporting a SavedModel with custom signatures: You can specify custom signatures that define the inputs and outputs of the model. This is useful when serving the model through different interfaces, such as REST APIs (see the first sketch after this list).
  • Exporting a SavedModel for TensorFlow Serving: TensorFlow Serving is a system for serving machine learning models in production environments. To export a model for TensorFlow Serving, include a serving_default signature that specifies the model's inputs and outputs, and place the SavedModel in a numbered version subdirectory as in Step 5.
  • Exporting a SavedModel for TensorFlow Lite: TensorFlow Lite is a lightweight framework for deploying machine learning models on mobile and embedded devices; a SavedModel can be converted to the TensorFlow Lite format with the TFLite converter (see the conversion sketch after this list).
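
As a sketch of the first two points, the snippet below exports the model again with an explicitly named serving signature; the function name serve_fn, the input name image, and the version directory softmax_regression_model/2 are illustrative choices rather than part of the steps above:

Python

# Wrap the model call in a tf.function with a fixed input signature
@tf.function(input_signature=[tf.TensorSpec(shape=[None, 28, 28],
                                            dtype=tf.float32, name='image')])
def serve_fn(image):
    # Returning a dictionary gives the output tensor a readable name
    return {'probabilities': model(image)}

# Export with an explicit serving_default signature
# (the key TensorFlow Serving looks up by default)
tf.saved_model.save(model, 'softmax_regression_model/2',
                    signatures={'serving_default': serve_fn})

For the third point, a SavedModel can be converted to the TensorFlow Lite format with tf.lite.TFLiteConverter. The sketch below assumes the directory exported in Step 3 still exists and writes the converted model to an example filename:

Python

# Convert the SavedModel exported in Step 3 to a TensorFlow Lite flatbuffer
converter = tf.lite.TFLiteConverter.from_saved_model('softmax_regression_model')
tflite_model = converter.convert()

# Write the converted model to disk
with open('softmax_regression_model.tflite', 'wb') as f:
    f.write(tflite_model)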

