What are the plans for PyTorch 2.0? #900
Replies: 14 comments
-
We are still figuring out what it will take to fully support PT 2.0, and haven't arrived at a conclusion yet.
-
Good to know. Do you know if it's feasible to run CUDA-enabled PyTorch within a Python interpreter embedded in a .NET runtime? That seems like a reasonable temporary workaround for simple workflows.
-
I don't know, but I don't see why it wouldn't be possible. It would be subject to the limitations of running Python from .NET, but I don't see why CUDA would be special. That doesn't mean it works, just that I'm not aware of any reason it wouldn't.
-
I was just looking at that, but couldn't easily determine whether it supports CPython packages that use the C ABI. But since you suggest it as a possibility, I'll just fire it up and find out the hard way 🙂
-
@JustinWick I know for sure it does. My TensorFlow binding is based on it.
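For anyone wanting to try that route, here is a minimal sketch of what it might look like with Python.NET (pythonnet), assuming a local CPython environment that already has CUDA-enabled PyTorch installed; the shared-library name and the exact conversion calls vary with the pythonnet version and platform:

```csharp
using Python.Runtime;

class EmbeddedTorchDemo
{
    static void Main()
    {
        // Point pythonnet at the CPython shared library whose site-packages contains torch.
        // The file name is environment-specific (e.g. libpython3.9.so on Linux);
        // setting it explicitly is required in pythonnet 3.x.
        Runtime.PythonDLL = "python39.dll";

        PythonEngine.Initialize();
        using (Py.GIL())
        {
            dynamic torch = Py.Import("torch");

            // Ask the embedded interpreter whether CUDA is visible.
            bool cudaAvailable = (bool)torch.cuda.is_available();
            System.Console.WriteLine($"CUDA available: {cudaAvailable}");

            if (cudaAvailable)
            {
                // The same calls you would write in Python, just issued from C#.
                dynamic x = torch.ones(2, 3).to("cuda");
                System.Console.WriteLine(x);
            }
        }
        PythonEngine.Shutdown();
    }
}
```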
-
Oh, nice! I use TensorFlow.NET, but it's good to know about yours as well. And thanks for the advice; I don't know if we'll end up using it in production, but for benchmarking, etc., it will probably be necessary. Besides, it would be nice to be able to run our analysis scripts inside .NET.
-
A comparison of different approaches to bridging PyTorch (Python) and .NET: https://github.com/zxgamekingdom/PytorchDeploy
Summary: Although Torch.NET is versioned as 1.0.0, its PyTorch API coverage is significantly LOWER than that of TorchSharp 0.99.2. More importantly, many more .NET users are quality-checking and testing TorchSharp's broad coverage every day.
-
We are now at another crossroads. The indecision and the highly controversial feedback of 2+ years ago almost prevented what TorchSharp has achieved today by adopting PyTorch-like syntax. Now we need to decide whether the overhead of going from TorchSharp to ML.NET without the ONNX consumption path is necessary. Is TorchScript (JIT) export the only feasible path from PyTorch to TorchSharp? More importantly, considering Microsoft's recent multi-billion-dollar investment in ChatGPT, the .NET community needs something that lets it keep up with Microsoft's latest commitment to ChatGPT. Is the current .NET approach (over the next 6 months to 1 year) through ONNX Runtime, ONNX Extensions, and TorchSharp sufficient to address this new possibility? The .NET community needs support to help make ChatGPT better through, e.g., C#, and not just PyTorch.
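To make the TorchScript question concrete: the route being discussed would look roughly like the sketch below, assuming a model already exported from Python with torch.jit.script(model).save("model.pt") and a recent TorchSharp that exposes torch.jit.load (the exact overloads and generic signature may differ between releases):

```csharp
using TorchSharp;
using static TorchSharp.torch;

class ScriptModuleDemo
{
    static void Main()
    {
        // Load a module that was scripted/traced and saved from Python.
        var model = torch.jit.load<Tensor, Tensor>("model.pt");
        model.eval();

        // Run inference on a dummy input; the shape depends on the exported model.
        using var input = torch.randn(1, 3, 224, 224);
        using var output = model.forward(input);

        System.Console.WriteLine(string.Join(",", output.shape));
    }
}
```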
-
As always, @GeorgeS2019, your input is appreciated. Given the resources available, TorchSharp will remain focused on being a wrapper around native libtorch, and solutions based on the torch runtime will take priority. Anyone is welcome to contribute an ONNX export capability to TorchSharp if that is a priority for them. Personally, I think getting M1 support (I don't have an M1 Mac, so I can't test it) feels more important.

As to your second point, I don't understand how TorchSharp relates to ChatGPT. OpenAI has a published HTTP-based API, you can access it from .NET languages via HTTP requests, and a few people have developed unofficial bindings (see https://www.nuget.org/packages?q=OpenAI). .NET is not, as far as I can see, at a disadvantage when it comes to OpenAI model access.
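For reference, a bare-bones sketch of such an HTTP call from C#, using nothing but HttpClient; the endpoint, model name, and request shape below follow OpenAI's public documentation at the time of writing and may well change:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class OpenAIHttpDemo
{
    static async Task Main()
    {
        // The API key comes from your OpenAI account; read from an environment variable here.
        var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY");

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", apiKey);

        // Minimal completions request; adjust the model and parameters as needed.
        var json = @"{ ""model"": ""text-davinci-003"", ""prompt"": ""Hello from .NET"", ""max_tokens"": 32 }";
        using var response = await client.PostAsync(
            "https://api.openai.com/v1/completions",
            new StringContent(json, Encoding.UTF8, "application/json"));

        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```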
-
I had an offline discussion with @natke about why the existing […] is simply NOT ENOUGH for the expected "Intelligence" that could come from GPT variants like ChatGPT.
-
@GeorgeS2019 -- you may be absolutely right, but it's not as obvious to me as it is to you, so it would help me understand if you elaborated a little bit rather than just asserting it.
-
BTW, maybe it's worth starting a new discussion thread on the implications of ChatGPT and how that relates to TorchSharp. This issue was started as a question about PyTorch 2.0, and I suspect many will miss the ChatGPT angle just looking at the topic in the issues list.
-
Does TorchSharp plan to support PyTorch 2? If so, is there documentation of the path forward to that support?