OpenVINO Version
Master (commit 287ab98)
Operating System
Android System
Device used for inference
CPU
Framework
ONNX
Model used
mobelinet-v3-tf
Issue description
LD_LIBRARY_PATH=/data/local/tmp ./data/local/tmp/benchmark_app -d CPU -m /data/local/tmp/mobelinet-v3-tf/v3-small_224_1.0_float.xml -hint throughput
Step-by-step reproduction
Following https://github.com/openvinotoolkit/openvino/blob/master/docs/dev/build_android.md, I built OpenVINO for Android for the x86_64 ABI with the ONNX frontend and benchmark_app enabled, using the latest OpenVINO master baseline (commit 287ab98).
But when I run the mobelinet-v3-tf example mentioned in the official document above, the error below occurs.
Could the OpenVINO team give some tips about it?
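For reference, a minimal sketch of the Android x86_64 configure/build step along the lines of build_android.md; the NDK path, platform level, and exact CMake option values here are assumptions and should be checked against that document:

```sh
# Rough sketch of an Android x86_64 build of OpenVINO with samples
# (benchmark_app) enabled. The NDK location, platform level, and source
# directory name are assumptions, not taken from this report.
export ANDROID_NDK=$HOME/Android/Sdk/ndk/26.1.10909125   # assumed NDK location
cmake -S openvino -B build-android \
  -DCMAKE_TOOLCHAIN_FILE=$ANDROID_NDK/build/cmake/android.toolchain.cmake \
  -DANDROID_ABI=x86_64 \
  -DANDROID_PLATFORM=30 \
  -DANDROID_STL=c++_shared \
  -DENABLE_SAMPLES=ON
cmake --build build-android --parallel
```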
Relevant log output
[Step 1/11] Parsing and validating input arguments
[ INFO ] Parsing input parameters
[Step 2/11] Loading OpenVINO Runtime
[ INFO ] OpenVINO:
[ INFO ] Build ................................. 2025.0.0-17426-287ab9883ac
[ INFO ]
[ INFO ] Device info:
[ INFO ] CPU
[ INFO ] Build ................................. 2025.0.0-17426-287ab9883ac
[ INFO ]
[ INFO ]
[Step 3/11] Setting device configuration
[Step 4/11] Reading model files
[ INFO ] Loading model files
[ INFO ] Read model took 33.87 ms
[ INFO ] Original model I/O parameters:
[ INFO ] Network inputs:
[ INFO ] input:0 (node: input) : f32 / [...] / [1,224,224,3]
[ INFO ] Network outputs:
[ INFO ] MobilenetV3/Predictions/Softmax:0 (node: MobilenetV3/Predictions/Softmax) : f32 / [...] / [1,1001]
[Step 5/11] Resizing model to match image sizes and given batch
[Step 6/11] Configuring input of the model
[ INFO ] Model batch size: 1
[ INFO ] Network inputs:
[ INFO ] input:0 (node: input) : u8 / [N,H,W,C] / [1,224,224,3]
[ INFO ] Network outputs:
[ INFO ] MobilenetV3/Predictions/Softmax:0 (node: MobilenetV3/Predictions/Softmax) : f32 / [...] / [1,1001]
[Step 7/11] Loading the model to the device
[ ERROR ] Exception from src/inference/src/cpp/core.cpp:107:
Exception from src/inference/src/dev/plugin.cpp:53:
Check 'jitter != jitters.end()' failed at src/common/snippets/src/lowered/target_machine.cpp:19:
Supported precisions set is not available for Convert operation.
### Issue submission checklist
- [X] I'm reporting an issue. It's not a question.
- [X] I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found a solution.
- [X] There is reproducer code and related data files such as images, videos, models, etc.
The error message shows that the exception was thrown from the graph compiler "Snippets".
As a temporary workaround, I recommend disabling Snippets tokenization. To do that:
1. Create a config file config.json with the content { "CPU" : {"SNIPPETS_MODE" : "DISABLE"} }
2. Add the -load_config config.json option to your benchmark_app command line:
LD_LIBRARY_PATH=/data/local/tmp ./data/local/tmp/benchmark_app -d CPU -m /data/local/tmp/mobelinet-v3-tf/v3-small_224_1.0_float.xml -hint throughput -load_config config.json
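For convenience, the two steps above as a single shell sketch run from a host with adb; the adb push/shell wrapping is an assumption about the Android setup, while the config content and benchmark_app options are the ones given above:

```sh
# Create the config that disables Snippets tokenization on CPU.
cat > config.json <<'EOF'
{ "CPU" : {"SNIPPETS_MODE" : "DISABLE"} }
EOF

# Push it next to benchmark_app on the device (paths follow the command above).
adb push config.json /data/local/tmp/config.json

# Re-run benchmark_app with the config loaded.
adb shell 'LD_LIBRARY_PATH=/data/local/tmp /data/local/tmp/benchmark_app \
  -d CPU -m /data/local/tmp/mobelinet-v3-tf/v3-small_224_1.0_float.xml \
  -hint throughput -load_config /data/local/tmp/config.json'
```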
I believe this should temporarily work around the problem.
@chenhu-wang May I ask you to take a look at the exception in Snippets? It looks like the Convert op on the model input was not transformed to the Snippets dialect (ConvertTruncation or ConvertSaturation). Thank you in advance!
@starlitsky2010 Could you please provide the CPU info you used and the full command line to reproduce? I see you set a layout, but it is not fully displayed in the picture. Thanks!
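As a hypothetical way to gather the requested CPU details from the Android x86_64 target (not part of the original thread, assumes adb access):

```sh
# Report the CPU model of the Android target from the host.
adb shell cat /proc/cpuinfo | grep -m1 'model name'
```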