5.2. FP32 Model Generation

Compiler   Function                                              Use instruction
--------   ---------------------------------------------------   ---------------
bmnetc     Compile the Caffe model                               BMNETC
bmnett     Compile the TensorFlow model                          BMNETT
bmnetm     Compile the MXNet model                               BMNETM
bmnetp     Compile the PyTorch model                             BMNETP
bmnetd     Compile the Darknet model                             BMNETD
bmneto     Compile the ONNX model                                BMNETO
bmpaddle   Compile the PaddlePaddle model                        BMPADDLE
bmnetu     Compile a customized Unified Framework (UFW) model    BMNETU

Executing source 'envsetup.sh' in the Docker environment automatically installs the conversion tools listed above and sets the relevant environment variables for the current terminal.

Attention

For some models, such as Paddle-OCR-detection or other models containing many summations or divisions, enabling the comparison option during conversion can cause the comparison result to exceed the allowable error threshold as rounding errors accumulate, which interrupts the model conversion. Some models also involve sorting: although the numerical error is small, it can still change the sort order, causing alignment errors and interrupting the conversion. In these cases, the cmp parameter can be turned off so that the conversion runs without data comparison, and the accuracy of the converted model can then be verified at the business level after the conversion completes.
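As a minimal sketch of disabling the comparison, the following uses the bmnetp Python interface; the exact keyword set may differ between the compilers listed above, and the model path, input shape, and network name here are illustrative assumptions:

    import bmnetp  # installed into the environment by envsetup.sh

    # Compile a traced PyTorch model without layer-by-layer data
    # comparison; cmp=False skips the check that accumulated rounding
    # error would otherwise trip.
    bmnetp.compile(
        model="./det_model_jit.pt",    # illustrative model path
        outdir="./compilation",        # illustrative output directory
        target="BM1684",               # target chip
        shapes=[[1, 3, 640, 640]],     # illustrative input shape
        net_name="ocr_det",            # illustrative network name
        cmp=False,                     # disable data comparison
    )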

Note

PyTorch model conversion considerations

  1. What is JIT (torch.jit): JIT (Just-In-Time) is a set of compilation tools designed to bridge the gap between PyTorch research and production. It allows the creation of models that can run without relying on the Python interpreter and that can be optimized more aggressively.

  2. Relationship between JIT and BMNETP: BMNETP accepts only PyTorch JIT models.

  3. How to get a JIT model: given an existing PyTorch Python model (whose base class is torch.nn.Module), call torch.jit.trace(python_model, torch.rand(input_shape)).save('jit_model'); see the sketch after this list.

  4. Why not use torch.jit.script to get the JIT model: BMNETP does not support JIT models that contain control-flow operations (such as if statements or loops) or inplace operations (such as the copy_ function), and torch.jit.script can produce such models. torch.jit.trace cannot: tracing records only the operations performed on tensors, never control flow.

  5. Why not a GPU model: BMNETP does not support GPU models in the compilation process.

  6. How to convert a GPU model into a CPU model: when loading PyTorch's Python model, use the map_location argument: torch.load(python_model, map_location='cpu'); see the sketch after this list.
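Putting items 3 and 6 together, the following is a minimal sketch of producing a BMNETP-compatible JIT model from a GPU-trained PyTorch model; the file names and input shape are illustrative assumptions:

    import torch

    # Load a model that was saved on GPU, remapping its tensors to CPU.
    model = torch.load("model_gpu.pth", map_location="cpu")  # illustrative path
    model.eval()  # fix dropout/batch-norm behavior before tracing

    # Tracing records only the tensor operations executed on the example
    # input; no control flow is captured, which is what BMNETP requires.
    example_input = torch.rand(1, 3, 224, 224)  # illustrative shape
    torch.jit.trace(model, example_input).save("model_jit.pt")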

Darknet model conversion considerations

The batch/subdivisions value in the cfg file must be larger than the batch size of the input shape set in the conversion script.
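For example, assuming batch/subdivisions denotes the ratio of the two cfg fields, the illustrative values below give 64/16 = 4, so the batch size in the conversion script's input shape must be smaller than 4:

    # Darknet cfg excerpt (illustrative values)
    batch=64
    subdivisions=16
    # batch/subdivisions = 4, so an input shape such as [1, 3, 416, 416]
    # (batch size 1) satisfies the constraint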