Inception3 not supported? #70
It was my mistake, the options were invalid. But I fixed the options and now I get an error:
I downloaded the model from Firebase to be sure it was not broken on the way, and it is working in a Python script on my Mac but not in the plugin.
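For reference, this is roughly the kind of Python check described above. It is a minimal sketch, assuming the downloaded file is saved as `model.tflite` and the stock `tensorflow` package is installed (both are my assumptions, not details from this thread):

```python
import numpy as np
import tensorflow as tf

# Load the downloaded model; allocate_tensors() is where an incompatible
# model would typically fail in the Python runtime as well.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

# Run one dummy inference to confirm the model executes end to end.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
print("Output shape:", interpreter.get_tensor(output_details[0]["index"]).shape)
```

If this runs cleanly while the plugin still rejects the model, the model file itself is fine and the mismatch is on the Android runtime side.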
Thank you for reporting the detailed issue!
I am trying to dig into that, but I don't see where this exception is coming from.
I'll check this issue within a week. But it may take time.
Sad to hear it. Your plugin is the only one I found for custom models; I will try to find the issue and fork it in the worst case.
Did you run your example recently? Gradle fails for me without explanation.
I corrected another input/output format config, and this is what I am getting:
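As an aside, the input/output options discussed in this thread have to match the tensor shapes and types baked into the `.tflite` file. A quick, hedged way to read those out in Python (the file name is a placeholder, not from the thread):

```python
import tensorflow as tf

# Print the shapes and dtypes the model actually expects, to compare against
# the input/output options configured in the plugin.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

for detail in interpreter.get_input_details():
    print("input ", detail["name"], detail["shape"], detail["dtype"])
for detail in interpreter.get_output_details():
    print("output", detail["name"], detail["shape"], detail["dtype"])
```

For a stock InceptionV3 classifier this typically reports a [1, 299, 299, 3] float32 input, but the exact values should be taken from the model file rather than assumed.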
Hi,
So it seems your example works fine; the issue is with more complex, optimized models.
I'll resolve #54 in the next release. Thanks.
Do you know what the problem is?
Hi, did you manage to solve it? I have the same problem directly on Android; it is supposed to be solved by upgrading to the newer TensorFlow Lite that was recently released.
I uploaded a custom TensorFlow Lite model which is based on InceptionV3,
and the plugin shows this error:
The model is INCOMPATIBLE. It may contain unrecognized custom ops, or not FlatBuffer format: java.lang.IllegalArgumentException: Internal error: Cannot create interpreter: Didn't find op for builtin opcode 'CONV_2D' version '2'
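The "Didn't find op for builtin opcode 'CONV_2D' version '2'" message usually means the model was produced by a newer converter than the TensorFlow Lite runtime bundled with the plugin supports, so the runtime has no kernel registered for that op version. Besides upgrading the runtime as mentioned above, one workaround is to re-convert the model and sanity-check it with the same TensorFlow install. A hedged sketch (the Keras InceptionV3 base and the file name are my assumptions, not the reporter's actual model):

```python
import tensorflow as tf

# Re-convert an InceptionV3-based Keras model to TFLite with the same TensorFlow
# install that will sanity-check it, so converter and interpreter agree on
# builtin op versions. Model and file names are placeholders.
model = tf.keras.applications.InceptionV3(weights="imagenet")
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_bytes = converter.convert()

with open("inception_v3.tflite", "wb") as f:
    f.write(tflite_bytes)

# This only proves compatibility with this TF version's interpreter; an older
# runtime bundled in the Android plugin may still reject newer op versions.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
print("Model parses with this TensorFlow version's interpreter.")
```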