
Op info test for logspace .. masked.amax #7507

Closed
qihqi opened this issue Jun 25, 2024 · 7 comments · Fixed by #8236, #8254 or #8262

qihqi (Collaborator) commented Jun 25, 2024

Fix the op info tests for logspace .. masked.amax:

  1. Find lines 153 to 157 of test_ops.py and remove
     logspace .. masked.amax from skip_list.
  2. Run the op info test with pytest test/test_ops.py.
  3. Fix the failure.
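As a rough illustration of step 1 (the actual names and layout of test_ops.py may differ; this skip_list is hypothetical), the skip list is typically just a collection of op names, and the step amounts to deleting the entries for the ops this issue covers so that the op info tests actually exercise them:

```python
# Hypothetical sketch of a skip_list like the one in test/test_ops.py;
# the real file's contents and structure may differ.
skip_list = {
    "logspace",    # entries to remove: the ops this issue covers
    "lu",
    "lu_solve",
    "lu_unpack",
    "nanmedian",   # unrelated entries stay untouched
}

# Removing the covered ops re-enables their op info tests.
for op in ("logspace", "lu", "lu_solve", "lu_unpack"):
    skip_list.discard(op)

print(sorted(skip_list))
```

After this, `pytest test/test_ops.py` will run the previously skipped tests and surface the real failures to fix.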

Please refer to this guide on how to fix such failures:

Also refer to these PRs:

qihqi added the good first issue (Good for newcomers) and torchxla2 labels on Jun 25, 2024
ManfeiBai self-assigned this on Aug 26, 2024
ManfeiBai removed their assignment on Sep 24, 2024
ManfeiBai (Collaborator) commented:

The actual work will be "logspace", "lu", "lu_solve", "lu_unpack".

ManfeiBai (Collaborator) commented:

Hi @barney-s, thanks for the contribution! This would be a good first issue; please feel free to reply to this issue so that we can assign it to you.

barney-s (Contributor) commented:

Please assign it to me.

barney-s (Contributor) commented Oct 4, 2024

Looking at logspace:

logspace decomposes to linspace. We will fix linspace (#7505) first and then revisit this.

Script:

```python
import torch
import torch_xla2

env = torch_xla2.default_env()
env.config.debug_print_each_op = True
env.config.debug_accuracy_for_each_op = True

with env:
  y = torch.tensor([1, 5, 10])  # note: unused in this repro
  print(torch.logspace(start=-10, end=10, steps=5))
```

Output:

```
FUNCTION: tensor
/usr/local/google/home/barni/workspace/pytorch/xla/experimental/torch_xla2/torch_xla2/ops/jtorch.py:48: UserWarning: Explicitly requested dtype <class 'jax.numpy.int64'> requested in array is not available, and will be truncated to dtype int32. To enable more dtypes, set the jax_enable_x64 configuration option or the JAX_ENABLE_X64 shell environment variable. See https://github.com/google/jax#current-gotchas for more.
  return jnp.array(
FUNCTION: logspace
 DISPATCH: aten::logspace
  FUNCTION: linspace
   DISPATCH: aten::linspace
    FUNCTION: arange
     DISPATCH: aten::arange.start_step
    FUNCTION: lt
     DISPATCH: aten::lt.Scalar
```
barney-s (Contributor) commented Oct 9, 2024

#8236

ManfeiBai (Collaborator) commented:

Reopening, since we still have lu_unpack left.

barney-s (Contributor) commented:

#8262 for lu_unpack
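For reference, here is a minimal NumPy sketch of what lu_unpack has to do, recovering P, L, and U from LAPACK-style packed factors and 1-based pivots (an illustration of torch.lu_unpack's semantics, not the torch_xla2 code, and the small worked example below is constructed by hand):

```python
import numpy as np

def lu_unpack(lu, pivots):
    """Recover P, L, U from LAPACK-style packed LU factors.

    lu:     (n, n) array holding L (strictly lower part, unit diagonal
            implied) and U (upper part) together, as getrf produces.
    pivots: 1-based row-swap indices of length n.
    Returns P, L, U such that P @ L @ U reconstructs the original A
    (torch.lu_unpack's convention).
    """
    n = lu.shape[0]
    L = np.tril(lu, k=-1) + np.eye(n)
    U = np.triu(lu)
    # Replay the row swaps on the identity to rebuild the permutation
    # applied during factorization, then invert it via transpose.
    P = np.eye(n)
    for i, p in enumerate(pivots):
        P[[i, p - 1]] = P[[p - 1, i]]
    return P.T, L, U

# Hand-built packed factors of A = [[4, 3], [6, 3]] with partial
# pivoting: step 0 swaps rows 0 and 1, giving multiplier 4/6 = 2/3.
A = np.array([[4.0, 3.0], [6.0, 3.0]])
lu = np.array([[6.0, 3.0], [2.0 / 3.0, 1.0]])
pivots = [2, 2]
P, L, U = lu_unpack(lu, pivots)
```

A correctness check is simply that `P @ L @ U` reproduces `A`, which is also the invariant the op info test verifies against PyTorch's reference output.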
