For planner to use cpu for search random generator (#2484)
Summary:

torch.rand() uses the current default device. If torch.device has been
globally set to 'meta', this breaks the planner code, so force the
device to CPU instead. This keeps the same seeded random number
algorithm as before (so no changes to existing plans). We only generate
a handful of random numbers, so performance is not a concern.

Example breakage:
https://fb.workplace.com/groups/260102303573409/posts/507200532196917/
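
Below is a minimal repro sketch (not from the commit), assuming PyTorch 2.x where torch.device("meta") can be used as a context manager to change the default device; the name gen stands in for the planner's self.gen:

```python
import torch

# Hypothetical sketch: the planner's seeded generator lives on CPU.
gen = torch.Generator(device="cpu")
gen.manual_seed(0)

with torch.device("meta"):  # default device is now 'meta'
    try:
        # Without an explicit device, the output tensor follows the default
        # device, and this call breaks under a 'meta' default.
        torch.rand(1, generator=gen).item()
    except Exception as err:
        print(f"unpinned call failed: {type(err).__name__}")

    # Pinning device="cpu" keeps the draw on CPU regardless of the default.
    u = torch.rand(1, generator=gen, device="cpu").item()
    print(f"pinned call succeeded: {u:.6f}")
```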

Differential Revision: D64260019
Damian Reeves authored and facebook-github-bot committed Oct 16, 2024
1 parent afd5726 commit 36df728
Showing 1 changed file with 1 addition and 1 deletion.
torchrec/distributed/planner/utils.py (2 changes: 1 addition & 1 deletion)
@@ -216,7 +216,7 @@ def clamp(self, x: float) -> float:

     def uniform(self, A: float, B: float) -> float:
         "Return a random uniform position in range [A,B]."
-        u = torch.rand(1, generator=self.gen).item()
+        u = torch.rand(1, generator=self.gen, device="cpu").item()
         return A + (B - A) * u

     def next(self, fy: float) -> Optional[float]:
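
As a hypothetical follow-up check (not part of the commit), the pinned call draws from the same seeded CPU sequence whether or not a non-CPU default device is active, which is why existing plans are unaffected:

```python
import torch

def draw(seed: int, n: int = 5) -> list:
    # Mirrors the patched uniform(): a seeded CPU generator with device="cpu".
    gen = torch.Generator(device="cpu")
    gen.manual_seed(seed)
    return [torch.rand(1, generator=gen, device="cpu").item() for _ in range(n)]

baseline = draw(1234)            # values under the normal CPU default device
with torch.device("meta"):
    under_meta = draw(1234)      # identical values even under a 'meta' default

assert baseline == under_meta
```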
