
[torchlib] Fix aten_mean_dim #1962

Merged: 1 commit merged into main on Nov 22, 2024
Conversation

justinchuby (Collaborator)

Fix aten_mean_dim: when the rank of dim is not known, the conditional picks the wrong branch, because the truthiness of a value is always True.
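
A hypothetical sketch of the failure mode (names and structure invented for illustration; this is not the actual torchlib code): during tracing, dim is a graph value object rather than a concrete Python value, so branching on its truthiness always takes the same path; the fix is to branch on an explicit shape/rank check instead.

class GraphValue:
    """Stand-in for a traced value whose shape may be unknown (None)."""
    def __init__(self, shape=None):
        self.shape = shape

def mean_dim_buggy(dim: GraphValue) -> str:
    # Bug pattern: a GraphValue object is always truthy, so this branch is
    # taken even when the rank of dim is unknown.
    if dim:
        return "treat dim as a 1-D axes tensor"
    return "reduce over all axes"

def mean_dim_fixed(dim: GraphValue) -> str:
    # Fix sketch: decide based on the known rank, not on truthiness.
    if dim.shape is not None and len(dim.shape) == 0:
        return "unsqueeze the scalar dim into a 1-D axes tensor"
    return "use dim as the axes tensor directly"

print(mean_dim_buggy(GraphValue(shape=None)))  # always takes the first branch
print(mean_dim_fixed(GraphValue(shape=None)))  # falls through to the safe default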

@justinchuby added the labels "topic: torch_lib" (Related to the torch/aten function lib in development) and "merge at lgtm" (Reviewers can merge when they approve) on Nov 22, 2024
@justinchuby enabled auto-merge (squash) on November 22, 2024 at 22:28
@justinchuby (Collaborator, Author)

Validated with

import torch
import transformers

# A deliberately tiny Mistral config so export and validation run quickly;
# MistralRMSNorm's hidden_states.pow(2).mean(-1, keepdim=True) should
# exercise aten::mean.dim.
config = transformers.MistralConfig(
    hidden_size=32,
    num_hidden_layers=2,
    vocab_size=99,
    intermediate_size=16,
    max_position_embeddings=512,
    num_attention_heads=2,
    num_key_value_heads=2,
    sliding_window=4096,
)
# Use the eager attention implementation.
config._attn_implementation = "eager"
model = transformers.MistralModel(config)

# Dummy inputs: batch size 1, sequence length 30.
shape = (1, 30)
input_ids = torch.randint(0, 99, shape).to(torch.int64)
attention_mask = torch.ones(shape)

# Run the eager model and the exported program on the same inputs.
expected = model(input_ids, attention_mask)
ep = torch.export.export(model, (input_ids, attention_mask))
mod = ep.module()
got = mod(input_ids, attention_mask)

# Convert the ExportedProgram to ONNX, save it, and check that the ONNX
# program's outputs match the torch outputs.
onnx_program = torch.onnx.export(ep)
onnx_program.save("mistral.onnx")
import torch.onnx._internal.exporter._testing as onnx_testing
onnx_testing.assert_onnx_program(onnx_program)


codecov bot commented Nov 22, 2024

❌ 12 Tests Failed:

Tests completed: 12381 | Failed: 12 | Passed: 12369 | Skipped: 1201
View the full list of 3 ❄️ flaky tests
tests.eager_mode_test.TestEagerModeArguments_0_reference_runtime::test_function_input_and_attribute_by_kwargs_out_of_order

Flake rate in main: 39.69% (Passed 9322 times, Failed 6136 times)

Stack Traces | 0.002s run time
..../test_torch_nightly/lib/python3.12.../reference/ops/_op.py:91: in run
    res = self._run(x, y)
..../test_torch_nightly/lib/python3.12.../reference/ops/_op.py:139: in _run
    res = (convert_from_ml_dtypes(res[0]),)
..../test_torch_nightly/lib/python3.12.../onnx/reference/custom_element_types.py:50: in convert_from_ml_dtypes
    return array.view(dtype=dtype)
E   ValueError: Changing the dtype of a 0d array is only supported if the itemsize is unchanged

The above exception was the direct cause of the following exception:
tests/eager_mode_test.py:115: in test_function_input_and_attribute_by_kwargs_out_of_order
    self.assertEqual(add_with_alpha(alpha=3.0, other=2.0, this=1.0), 7.0)
onnxscript/values.py:576: in __call__
    return evaluator.default().eval_function(self, args, kwargs)
onnxscript/evaluator.py:307: in eval_function
    result = function.function(*adapted_args, **adapted_kwargs)
tests/eager_mode_test.py:59: in add_with_alpha
    other = op.Mul(other, alpha)
.../onnx_opset/_impl/opset14.py:696: in Mul
    return op(*self._prepare_inputs(schema, A, B))
onnxscript/values.py:304: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:524: in _eval
    result = session.run(None, session_run_input)
..../test_torch_nightly/lib/python3.12.../onnx/reference/reference_evaluator.py:599: in run
    outputs = node.run(*inputs, **linked_attributes)
..../test_torch_nightly/lib/python3.12.../reference/ops/_op.py:114: in run
    res = OpRunBinary.run(self, x, y)
..../test_torch_nightly/lib/python3.12.../reference/ops/_op.py:93: in run
    raise TypeError(
E   TypeError: Issues with types <class 'numpy.ndarray'>, <class 'numpy.ndarray'> (binary operator 'Mul').
tests.eager_mode_test.TestEagerModeArguments_0_reference_runtime::test_function_some_input_by_kwargs

Flake rate in main: 39.69% (Passed 9322 times, Failed 6136 times)

Stack Traces | 0.002s run time
..../test_torch_nightly/lib/python3.12.../reference/ops/_op.py:91: in run
    res = self._run(x, y)
..../test_torch_nightly/lib/python3.12.../reference/ops/_op.py:139: in _run
    res = (convert_from_ml_dtypes(res[0]),)
..../test_torch_nightly/lib/python3.12.../onnx/reference/custom_element_types.py:50: in convert_from_ml_dtypes
    return array.view(dtype=dtype)
E   ValueError: Changing the dtype of a 0d array is only supported if the itemsize is unchanged

The above exception was the direct cause of the following exception:
tests/eager_mode_test.py:106: in test_function_some_input_by_kwargs
    self.assertEqual(add_with_alpha(1.0, other=2.0), 3.0)
onnxscript/values.py:576: in __call__
    return evaluator.default().eval_function(self, args, kwargs)
onnxscript/evaluator.py:307: in eval_function
    result = function.function(*adapted_args, **adapted_kwargs)
tests/eager_mode_test.py:59: in add_with_alpha
    other = op.Mul(other, alpha)
.../onnx_opset/_impl/opset14.py:696: in Mul
    return op(*self._prepare_inputs(schema, A, B))
onnxscript/values.py:304: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:524: in _eval
    result = session.run(None, session_run_input)
..../test_torch_nightly/lib/python3.12.../onnx/reference/reference_evaluator.py:599: in run
    outputs = node.run(*inputs, **linked_attributes)
..../test_torch_nightly/lib/python3.12.../reference/ops/_op.py:114: in run
    res = OpRunBinary.run(self, x, y)
..../test_torch_nightly/lib/python3.12.../reference/ops/_op.py:93: in run
    raise TypeError(
E   TypeError: Issues with types <class 'numpy.ndarray'>, <class 'numpy.ndarray'> (binary operator 'Mul').
tests.eager_mode_test.TestEagerModeArguments_0_reference_runtime::test_function_attribute_by_positional_args

Flake rate in main: 39.69% (Passed 9322 times, Failed 6136 times)

Stack Traces | 0.002s run time
..../test_torch_nightly/lib/python3.12.../reference/ops/_op.py:91: in run
    res = self._run(x, y)
..../test_torch_nightly/lib/python3.12.../reference/ops/_op.py:139: in _run
    res = (convert_from_ml_dtypes(res[0]),)
..../test_torch_nightly/lib/python3.12.../onnx/reference/custom_element_types.py:50: in convert_from_ml_dtypes
    return array.view(dtype=dtype)
E   ValueError: Changing the dtype of a 0d array is only supported if the itemsize is unchanged

The above exception was the direct cause of the following exception:
tests/eager_mode_test.py:112: in test_function_attribute_by_positional_args
    self.assertEqual(add_with_alpha(1.0, 2.0, 3.0), 7.0)
onnxscript/values.py:576: in __call__
    return evaluator.default().eval_function(self, args, kwargs)
onnxscript/evaluator.py:307: in eval_function
    result = function.function(*adapted_args, **adapted_kwargs)
tests/eager_mode_test.py:59: in add_with_alpha
    other = op.Mul(other, alpha)
.../onnx_opset/_impl/opset14.py:696: in Mul
    return op(*self._prepare_inputs(schema, A, B))
onnxscript/values.py:304: in __call__
    return evaluator.default().eval(schema, args, kwargs)
onnxscript/evaluator.py:194: in eval
    outputs = self._eval(schema, inputs, attributes, closure)
onnxscript/evaluator.py:524: in _eval
    result = session.run(None, session_run_input)
..../test_torch_nightly/lib/python3.12.../onnx/reference/reference_evaluator.py:599: in run
    outputs = node.run(*inputs, **linked_attributes)
..../test_torch_nightly/lib/python3.12.../reference/ops/_op.py:114: in run
    res = OpRunBinary.run(self, x, y)
..../test_torch_nightly/lib/python3.12.../reference/ops/_op.py:93: in run
    raise TypeError(
E   TypeError: Issues with types <class 'numpy.ndarray'>, <class 'numpy.ndarray'> (binary operator 'Mul').
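
For context on the root ValueError in these flaky traces: NumPy does not allow a 0-d array to be reinterpreted through .view() with a dtype of a different itemsize. A minimal stand-alone reproduction (hypothetical, not taken from the test suite):

import numpy as np

scalar = np.array(1.0, dtype=np.float32)  # 0-d array, itemsize 4
try:
    scalar.view(np.float64)  # itemsize 8, so the view is rejected for a 0-d array
except ValueError as e:
    print(e)  # Changing the dtype of a 0d array is only supported if the itemsize is unchanged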


@justinchuby merged commit 99b3d26 into main on Nov 22, 2024
26 of 41 checks passed
@justinchuby deleted the justinchu/aten_mean_dim branch on November 22, 2024 at 22:47
Labels: merge at lgtm, topic: torch_lib