-
I did a quick search for PyTorch and YOLOv5 => YOLOv5 🚀 in PyTorch
Question to @NiklasGustafsson and @dsyme => is there a need for TorchSharp to support loading, e.g., yolov5.onnx?
-
@dstampher -- we're working on it, but as a starting point, please see the examples under the src/ folder. Right now, the TorchSharp APIs are focused on providing an end-to-end experience in .NET, starting with data pre-processing, then training, evaluation, and finally inferencing.

That said, there is a way to export PyTorch model weights and load them in TorchSharp, but you still have to replicate the model code in order for the weights to make any sense. This is described in saveload.md. ML.NET, which has product support, offers more in terms of loading existing models for inferencing.

TorchSharp is available on NuGet, together with the supporting native-code redistributable packages libtorch-XXX (the name depends on what environment you are using -- Windows, Linux, macOS -- and whether you have an Nvidia GPU). You want version 0.93.1 of TorchSharp and 1.9.0.11 of libtorch-XXX.
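For reference, here is a minimal sketch of that weight-loading workflow, assuming the model has been re-declared in TorchSharp and its weights were exported from the equivalent PyTorch model with the exportsd.py helper described in saveload.md. The layer sizes and the file name are placeholders:

```csharp
// Sketch only: re-declare the model in TorchSharp, then load weights that were
// exported from the equivalent PyTorch model (see saveload.md in the TorchSharp repo).
// The layer sizes and the "model_weights.dat" file name below are placeholders.
using TorchSharp;
using static TorchSharp.torch.nn;

var model = Sequential(
    ("lin1", Linear(100, 10)),
    ("relu1", ReLU()),
    ("lin2", Linear(10, 5)));

// Load the weights produced on the Python side by the exportsd.py helper.
model.load("model_weights.dat");
model.eval();   // switch to inference mode, like PyTorch's eval()
```

The key point is that the C# model definition must match the PyTorch one layer-for-layer; only the weights travel across.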
-
@GeorgeS2019 -- as for loading ONNX models, that's something we will get to sometime in the medium-term future; I expect it to be post-v1.0. Given that loading ONNX models for inferencing is already available to .NET users via ML.NET, I expect TorchSharp's support to focus first on transfer-learning scenarios, to the extent that the two are distinct.
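For anyone landing here, a minimal sketch of the ML.NET route mentioned above, assuming the Microsoft.ML and Microsoft.ML.OnnxTransformer NuGet packages. The column names ("images", "output") and the tensor size are placeholders and must match the actual names and shapes in the .onnx graph:

```csharp
// Sketch only: scoring an exported .onnx model from .NET with ML.NET.
// Column names and the vector size are placeholders for whatever the ONNX graph declares.
using Microsoft.ML;
using Microsoft.ML.Data;

public class OnnxInput
{
    [VectorType(3 * 640 * 640)]        // placeholder shape for a 640x640 RGB image
    [ColumnName("images")]             // must match the ONNX graph's input name
    public float[] Images { get; set; }
}

public class OnnxOutput
{
    [ColumnName("output")]             // must match the ONNX graph's output name
    public float[] Output { get; set; }
}

public static class OnnxScoringSketch
{
    public static float[] Score(string onnxPath, float[] pixels)
    {
        var mlContext = new MLContext();

        // Pipeline that does nothing but apply the ONNX model.
        var pipeline = mlContext.Transforms.ApplyOnnxModel(
            modelFile: onnxPath,
            outputColumnNames: new[] { "output" },
            inputColumnNames: new[] { "images" });

        // Fit on an empty view of the input schema, then score single inputs.
        var emptyData = mlContext.Data.LoadFromEnumerable(new OnnxInput[] { });
        var model = pipeline.Fit(emptyData);
        var engine = mlContext.Model.CreatePredictionEngine<OnnxInput, OnnxOutput>(model);

        return engine.Predict(new OnnxInput { Images = pixels }).Output;
    }
}
```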
-
@GeorgeS2019 Thanks for linking me to that YOLOv5 C# repo; that might work for my use case.
-
There is pretty much zero documentation for this repo, and I have no idea how to use it. I tried googling the repo and didn't find anything helpful that answers my question.