I am trying to load a Keras model that contains custom layers into TVM. As expected, `frontend.from_keras` doesn't work, because the frontend doesn't know what to do with the custom layers. So my thought was that I could instead use the more general `frontend.from_tensorflow`, since under the hood the Keras model is just a TensorFlow graph. Here is a simplified example of what I'm trying to do:

```
import tensorflow as tf
from tvm.relay.frontend import from_tensorflow, from_keras
from tvm.relay.frontend.tensorflow_parser import TFParser

class MyLayer(tf.keras.layers.Layer):
    def call(self, inputs):
        return inputs + 1

inp = tf.keras.Input((1,))
out = MyLayer()(inp)
model = tf.keras.Model(inp, out)
```

Note that in my actual project `MyLayer` is quite complicated, which is why I want to use the automated `frontend.from_tensorflow` parsing rather than rewriting the layer in Relay by hand.

This is what I'm trying to do:

```
tf.saved_model.save(model, "my_model")
graphdef = TFParser("my_model").parse()
mod, params = from_tensorflow(graphdef, shape={"input": (1, 1)})
```

but it gives the error:

```
Exception: Function not found - __inference_signature_wrapper_93
```
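One thing I've been experimenting with, which may or may not be the right approach, is skipping `TFParser` entirely and freezing the model into a self-contained `GraphDef` myself using TF2's `convert_variables_to_constants_v2`. My understanding is that this inlines the variables so the graph no longer calls the `__inference_*` wrapper functions, but I'm not certain that's what the error is about. A sketch (the input name `x` comes from my `tf.function` wrapper, it's not anything TVM-specific):

```
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2,
)

# Same toy model as above
class MyLayer(tf.keras.layers.Layer):
    def call(self, inputs):
        return inputs + 1

inp = tf.keras.Input((1,))
out = MyLayer()(inp)
model = tf.keras.Model(inp, out)

# Trace the model into a single ConcreteFunction...
concrete = tf.function(lambda x: model(x)).get_concrete_function(
    tf.TensorSpec((1, 1), tf.float32)
)

# ...and inline all variables as constants, aiming for a
# self-contained GraphDef without the __inference_* wrappers
frozen = convert_variables_to_constants_v2(concrete)
graphdef = frozen.graph.as_graph_def()

# graphdef would then (in theory) go straight to the frontend:
# mod, params = from_tensorflow(graphdef, shape={"x": (1, 1)})
```

But I don't know whether that's the intended route, hence the questions below.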

So two questions:

- Is there a better way I should be using to load a Keras model with custom layers?
- If not, how can I fix that error?