
lizhenping/multi-head-self-attention


Summary

Introduction

The multi-head attention mechanism.
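Roughly, multi-head attention projects the input into queries, keys, and values, runs scaled dot-product attention independently in several heads, and merges the per-head results back into one representation. A minimal PyTorch sketch is below; the module and its names (`SimpleMultiHeadAttention`, `q_proj`, ...) are illustrative and are not taken from this repository's code.

```python
# A minimal sketch of multi-head self-attention; names are illustrative only.
import math
import torch
import torch.nn as nn


class SimpleMultiHeadAttention(nn.Module):
    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must be divisible by num_heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # One linear projection each for queries, keys, values, and the output.
        self.q_proj = nn.Linear(embed_dim, embed_dim)
        self.k_proj = nn.Linear(embed_dim, embed_dim)
        self.v_proj = nn.Linear(embed_dim, embed_dim)
        self.out_proj = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        batch, seq_len, embed_dim = x.shape

        def split_heads(t: torch.Tensor) -> torch.Tensor:
            # (batch, seq_len, embed_dim) -> (batch, num_heads, seq_len, head_dim)
            return t.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)

        q = split_heads(self.q_proj(x))
        k = split_heads(self.k_proj(x))
        v = split_heads(self.v_proj(x))

        # Scaled dot-product attention, computed independently in each head.
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.head_dim)
        weights = scores.softmax(dim=-1)
        context = weights @ v  # (batch, num_heads, seq_len, head_dim)

        # Merge the heads back into one embedding dimension and project out.
        context = context.transpose(1, 2).reshape(batch, seq_len, embed_dim)
        return self.out_proj(context)
```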

About

Multi-head attention tested on the STS dataset, implemented with PyTorch and torchtext. The code is concise and well suited for newcomers who want to see how multi-head attention works; unlike a full Transformer, which involves many layers, the model is just multi-head attention + one linear layer.
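One plausible reading of this "multi-head attention + one linear layer" setup for STS-style sentence-pair scoring is sketched below: self-attention over each sentence, mean pooling, and a single linear layer that maps the pooled pair to one similarity score. The pooling, pairing strategy, hyperparameters, and all names (`AttentionSTSModel`, `encode`, ...) are assumptions for illustration, not the repository's actual code.

```python
# A sketch of multi-head attention + one linear layer for STS regression.
# Architecture details here are assumptions, not the repository's code.
import torch
import torch.nn as nn


class AttentionSTSModel(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 128, num_heads: int = 4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # PyTorch's built-in multi-head attention, used in self-attention mode.
        self.attention = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        # The single linear layer: maps pooled pair features to one similarity score.
        self.score = nn.Linear(2 * embed_dim, 1)

    def encode(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len) integer ids -> (batch, embed_dim) pooled vector
        x = self.embedding(tokens)
        attended, _ = self.attention(x, x, x)  # self-attention over the sentence
        return attended.mean(dim=1)            # mean-pool over the sequence

    def forward(self, sent1: torch.Tensor, sent2: torch.Tensor) -> torch.Tensor:
        pooled = torch.cat([self.encode(sent1), self.encode(sent2)], dim=-1)
        return self.score(pooled).squeeze(-1)  # (batch,) similarity scores


# Usage with dummy ids; a real run would feed token ids built with torchtext.
model = AttentionSTSModel(vocab_size=10_000)
s1 = torch.randint(1, 10_000, (8, 20))
s2 = torch.randint(1, 10_000, (8, 20))
scores = model(s1, s2)                           # shape: (8,)
loss = nn.MSELoss()(scores, torch.rand(8) * 5)   # STS gold scores lie in [0, 5]
```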
