SoFunction
Updated on 2024-11-13

PyTorch model to ONNX model example

As shown below:

import io
import torch
import torch.onnx
from models.C3AEModel import PlainC3AENetCBAM

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

def test():
  model = PlainC3AENetCBAM()

  pthfile = r'/home/joy/Projects/models/emotion/'
  loaded_model = torch.load(pthfile, map_location='cpu')
  # try:
  #   loaded_model.eval()
  # except AttributeError as error:
  #   print(error)

  model.load_state_dict(loaded_model['state_dict'])
  # model = model.to(device)

  # data layout: NCHW
  dummy_input1 = torch.randn(1, 3, 64, 64)
  # dummy_input2 = torch.randn(1, 3, 64, 64)
  # dummy_input3 = torch.randn(1, 3, 64, 64)
  input_names = ["actual_input_1"]
  output_names = ["output1"]
  # torch.onnx.export(model, (dummy_input1, dummy_input2, dummy_input3), "", verbose=True, input_names=input_names, output_names=output_names)
  torch.onnx.export(model, dummy_input1, "C3AE_emotion.onnx", verbose=True, input_names=input_names, output_names=output_names)

if __name__ == "__main__":
  test()

Directly replace PlainC3AENetCBAM with the model you need to convert, then update pthfile and the output ONNX file name, and run the script.

Note: dummy_input2 and dummy_input3, commented out in the code above, show how the call changes for models with multiple inputs.

Summary of problems encountered in the conversion process

RuntimeError: Failed to export an ONNX attribute, since it's not constant, please try to make things (e.g., kernel size) static if possible

This RuntimeError was encountered during the conversion process.

Following the error traceback, open /home/joy/.tensorflow/venv/lib/python3.6/site-packages/torch/onnx/symbolic_helper.py; after adding a print at the location the traceback points to, you can identify which op is causing the failure.

Example:

Add a print of the value that failed to parse, for example (the variable name is illustrative; print whatever value the failing branch in symbolic_helper.py receives):

print(v.node())

The output message is as follows:

%124 : Long() = onnx::Gather[axis=0](%122, %121), scope: PlainC3AENetCBAM/Bottleneck[cbam]/CBAM[cbam]/ChannelGate[ChannelGate] # /home/joy/Projects/models/emotion/WhatsTheemotion/models/:46:0

The reason is that a size fetched at runtime, e.g. via tensor.size(1), is traced as a graph op (the onnx::Gather above) rather than a constant, which PyTorch's ONNX exporter cannot use as an attribute; it needs to be replaced with a hard-coded constant.

That is all for this example of converting a PyTorch model to an ONNX model.