Multi-head attention mechanism (multi-head attention).
lizhenping/multi-head-self-attention
About
Tests a multi-head attention mechanism on the STS dataset, built with PyTorch and torchtext. The code is concise and well suited for newcomers who want to see how multi-head attention works without the many layers a full Transformer involves: just multi-head attention plus a single linear layer, as sketched below.
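Since the repository's source is not reproduced on this page, the following is a minimal PyTorch sketch of the architecture the description names: multi-head self-attention over token embeddings followed by one linear layer. The class names, dimensions, and the mean-pooled scoring head are illustrative assumptions, not the repository's actual code.

```python
# Minimal sketch (assumed, not the repository's code) of "multi-head
# attention + one layer linear" as described in the About text.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadSelfAttention(nn.Module):
    """Multi-head self-attention: project to Q/K/V, attend per head, recombine."""

    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must divide evenly into heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # One fused projection produces queries, keys, and values together.
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)
        self.out = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        batch, seq_len, embed_dim = x.shape
        qkv = self.qkv(x).view(batch, seq_len, 3, self.num_heads, self.head_dim)
        # Each of q, k, v: (batch, num_heads, seq_len, head_dim)
        q, k, v = qkv.permute(2, 0, 3, 1, 4)
        # Scaled dot-product attention, computed independently per head.
        scores = q @ k.transpose(-2, -1) / (self.head_dim ** 0.5)
        weights = F.softmax(scores, dim=-1)
        context = weights @ v  # (batch, num_heads, seq_len, head_dim)
        # Merge the heads back into embed_dim and apply the output projection.
        context = context.transpose(1, 2).reshape(batch, seq_len, embed_dim)
        return self.out(context)


class AttentionSimilarityModel(nn.Module):
    """Hypothetical STS-style scorer: embeddings -> attention -> one linear layer."""

    def __init__(self, vocab_size: int, embed_dim: int = 128, num_heads: int = 4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.attention = MultiHeadSelfAttention(embed_dim, num_heads)
        self.linear = nn.Linear(embed_dim, 1)  # the "one layer linear"

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len) integer ids from a torchtext-style vocabulary
        h = self.attention(self.embedding(tokens))
        # Mean-pool over the sequence, then score with the single linear layer.
        return self.linear(h.mean(dim=1)).squeeze(-1)


if __name__ == "__main__":
    model = AttentionSimilarityModel(vocab_size=10_000)
    ids = torch.randint(0, 10_000, (2, 16))  # two toy sentences of 16 tokens
    print(model(ids).shape)  # torch.Size([2]) — one score per sentence
```

Keeping the model to a single attention block and one linear head makes every tensor shape easy to trace by hand, which is the pedagogical point the description emphasizes.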