Error when using a TensorRT engine for inference
After building a YOLOv5s model and trying to run it through the TensorRT inference service, the following error occurs:
[TensorRT] ERROR: ../rtSafe/coreReadArchive.cpp (38) - Serialization Error in verifyHeader: 0 (Version tag does not match)
[TensorRT] ERROR: INVALID_STATE: std::exception
[TensorRT] ERROR: INVALID_CONFIG: Deserialize the cuda engine failed.
Cause: the TensorRT version used to build (serialize) the engine is different from the TensorRT version used at inference time. A serialized engine is only valid for the exact TensorRT version that produced it, so the two versions must match (or the engine must be rebuilt with the runtime's version).
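A minimal sketch of the guard this implies: record the TensorRT version when you serialize the engine, and refuse to deserialize under a different version. The helper name `check_trt_version` below is hypothetical, not part of the TensorRT API; in real code the runtime version comes from `tensorrt.__version__`.

```python
def check_trt_version(build_version: str, runtime_version: str) -> bool:
    """Serialized engines are only portable across identical TensorRT
    versions, so compare major.minor.patch of both sides."""
    return build_version.split(".")[:3] == runtime_version.split(".")[:3]

# Hypothetical example versions: an engine built with TensorRT 7.2.3
# cannot be deserialized by a TensorRT 8.0.1 runtime.
print(check_trt_version("7.2.3.4", "8.0.1.6"))  # -> False
print(check_trt_version("7.2.3.4", "7.2.3.4"))  # -> True
```

In practice, the simplest fix is to rebuild the engine on the machine (and TensorRT version) that will actually run inference, rather than copying a serialized engine between environments.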
After making the versions consistent, the service runs normally and inference is very fast.