Commit

follow the alphafold convention of inner attention gating
lucidrains committed May 28, 2024
1 parent 39cf9fc commit 769fd14
Showing 2 changed files with 2 additions and 1 deletion.
1 change: 1 addition & 0 deletions alphafold3_pytorch/alphafold3.py
@@ -1355,6 +1355,7 @@ def __init__(
     linear_attn = TaylorSeriesLinearAttn(
         dim = dim,
         prenorm = True,
+        gate_value_heads = True,
         **linear_attn_kwargs
     )

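For context, the new gate_value_heads = True flag is passed through to TaylorSeriesLinearAttn and, per the commit message, follows AlphaFold's convention of gating each attention head's values. The snippet below is a hypothetical, simplified sketch of that gating idea using ordinary softmax attention, not the actual TaylorSeriesLinearAttn implementation: each head's aggregated values are scaled by a sigmoid gate computed from the input before the final output projection.

import torch
from torch import nn

class GatedAttention(nn.Module):
    # Illustrative only: per-head value gating in the AlphaFold style,
    # not the library code this commit actually configures.
    def __init__(self, dim, heads = 8, dim_head = 64):
        super().__init__()
        inner_dim = heads * dim_head
        self.heads = heads
        self.scale = dim_head ** -0.5
        self.to_qkv = nn.Linear(dim, inner_dim * 3, bias = False)
        self.to_gates = nn.Linear(dim, heads)   # one gate logit per head
        self.to_out = nn.Linear(inner_dim, dim, bias = False)

    def forward(self, x):
        b, n, _ = x.shape
        h = self.heads
        q, k, v = self.to_qkv(x).chunk(3, dim = -1)
        q, k, v = (t.reshape(b, n, h, -1).transpose(1, 2) for t in (q, k, v))

        attn = (q @ k.transpose(-1, -2) * self.scale).softmax(dim = -1)
        out = attn @ v                                     # (b, h, n, dim_head)

        # gate each head's aggregated values with sigmoid(Linear(x))
        gates = self.to_gates(x).sigmoid()                 # (b, n, h)
        out = out * gates.transpose(1, 2).unsqueeze(-1)    # broadcast over dim_head

        out = out.transpose(1, 2).reshape(b, n, -1)
        return self.to_out(out)

# usage
attn = GatedAttention(dim = 384)
tokens = torch.randn(1, 16, 384)
out = attn(tokens)   # (1, 16, 384)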
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "alphafold3-pytorch"
version = "0.0.52"
version = "0.0.53"
description = "Alphafold 3 - Pytorch"
authors = [
{ name = "Phil Wang", email = "[email protected]" }

