[experimental][FP16] Add native __half support for sum_functor #1655
base: release/2.4
Conversation
Jenkins build for 82d9a8816220c4582c87e348a7acb66bda51682e commit finished as FAILURE
Is there any perf benefit from this change? If it is an experimental change, we need to push it to the rocm6.3_internal_testing branch and even raise it upstream to get the opinion of upstream reviewers.
I would also like to know more about the motivation for this change. We usually accumulate to wider types; would accumulating in half lead to overflow?
@@ -172,7 +172,7 @@ template <
    typename GeneralDispatcher>
static void reduce_dispatch(TensorIterator& iter, GeneralDispatcher op) {
  if (iter.dtype() == kHalf) {
    return OpFunctor<at::Half, float>{}(iter);
You need to ifdef this also.
Good catch -- Thank you!
During the cmake step, set the environment variable `PYTORCH_REDUCESUM_ENABLE_NATIVE_HALF=1`. This enables experimental FP16 support for `sum_functor`: operator+ will use `__half` types directly instead of applying `static_cast<float>` to its arguments. Note: only additions are affected by these changes.
Force-pushed from 82d9a88 to 2bfb4fc.
Applied feedback from @doru1004 and changed the EnVar name to be more expressive.
Jenkins build for 2bfb4fcaf342a40c5df330fbb47fd7ae00647389 commit finished as FAILURE. Detected error during Pytorch building.