
dimitry12/attention-is-all-you-need-single-file

About

Single-file implementation of Attention Is All You Need

Why

  • It's a flexible, interpretable architecture that is more performant than RNNs and applicable to POMDPs; a sketch of its attention core follows this list.
  • It's a simple implementation to play with my Titan V and see how much I can get out of mixed-precision training and matmul-friendly tensor cores with a matmul-heavy architecture; see the training-loop sketch after this list.
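
For context, here is a minimal sketch of the scaled dot-product attention at the heart of the paper. The function name, tensor shapes, and masking convention are illustrative assumptions, not taken from this repository's code:

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.size(-1)
    # Two batched matmuls dominate the cost; this is what makes the
    # architecture a good fit for fp16 tensor cores.
    scores = torch.matmul(q, k.transpose(-2, -1)) / d_k ** 0.5
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    return torch.matmul(weights, v), weights

# Illustrative shapes: (batch, heads, seq_len, d_k)
q = k = v = torch.randn(2, 8, 16, 64)
out, attn = scaled_dot_product_attention(q, k, v)
```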
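
And a minimal sketch of the mixed-precision training pattern mentioned above, using PyTorch's torch.cuda.amp. The tiny model and random data are hypothetical stand-ins to show the autocast/GradScaler pattern, not the repository's actual training loop:

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Linear(512, 512).to(device)          # hypothetical toy model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

x = torch.randn(64, 512, device=device)
target = torch.randn(64, 512, device=device)

for step in range(10):
    optimizer.zero_grad()
    # autocast runs the matmul-heavy ops in fp16, where tensor cores apply
    with torch.cuda.amp.autocast(enabled=(device == "cuda")):
        loss = nn.functional.mse_loss(model(x), target)
    # GradScaler keeps small fp16 gradients from underflowing
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
```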
