Using frontend.from_tensorflow to load a Keras model with custom layers

I am trying to load a Keras model that contains custom layers into TVM. As expected, frontend.from_keras doesn’t work, because the frontend doesn’t know how to convert the custom layers. So my thought was that I could use the more general frontend.from_tensorflow instead (since, under the hood, the Keras model is just a TensorFlow graph). Here is a simplified example showing what I’m trying to do:

import tensorflow as tf
from tvm.relay.frontend import from_tensorflow, from_keras
from tvm.relay.frontend.tensorflow_parser import TFParser

class MyLayer(tf.keras.layers.Layer):
    def call(self, inputs):
        return inputs + 1

inp = tf.keras.Input((1,))
out = MyLayer()(inp)
model = tf.keras.Model(inp, out)

Note that in my actual project MyLayer is quite complicated, which is why I want to use the automated frontend.from_tensorflow parsing rather than rewriting the layer in Relay by hand.

This is what I’m trying to do:

tf.saved_model.save(model, "my_model")
graphdef = TFParser("my_model").parse()

mod, params = from_tensorflow(graphdef, shape={"input": (1, 1)})

but it fails with the error:

Exception: Function not found - __inference_signature_wrapper_93

So two questions:

  1. Is there a better way I should be using to load a Keras model with custom layers?
  2. If not, how can I fix that error?

I encountered a similar problem using the MB-MelGAN model from https://github.com/TensorSpeech/TensorFlowTTS :

Exception: Function not found - __inference_signature_wrapper_6729

I don’t think the current Relay frontend from_tensorflow supports the SavedModel format produced by tf.saved_model.save. We need to convert the tf.keras model into a frozen graph (.pb) manually, and then pass that to relay.frontend.from_tensorflow. Correct me if I am wrong!
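For reference, here is a minimal sketch of the freezing step for the toy model from the original post, assuming TF 2.x. It wraps the model in a tf.function, gets a concrete function, and uses convert_variables_to_constants_v2 to inline all variables and functions into a single GraphDef, which avoids the function-library lookup that triggers the __inference_signature_wrapper error. (The layer and variable names are illustrative; the final from_tensorflow call is commented out since it needs a TVM install.)

```python
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2,
)

# Toy model with a custom layer, as in the original post.
class MyLayer(tf.keras.layers.Layer):
    def call(self, inputs):
        return inputs + 1

inp = tf.keras.Input((1,))
out = MyLayer()(inp)
model = tf.keras.Model(inp, out)

# Trace the model into a single concrete function, then freeze its
# variables into constants so the result is one self-contained GraphDef
# with no references to an external function library.
concrete = tf.function(model).get_concrete_function(
    tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype)
)
frozen = convert_variables_to_constants_v2(concrete)
graph_def = frozen.graph.as_graph_def()

# The frozen input tensor name (e.g. "x:0") tells you what key to use
# in the shape dict when calling the Relay frontend:
input_name = frozen.inputs[0].name.split(":")[0]

# With TVM installed, this GraphDef can then be imported, e.g.:
# from tvm.relay.frontend import from_tensorflow
# mod, params = from_tensorflow(graph_def, shape={input_name: (1, 1)})
```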