attention: Self-Attention Algorithm

Helper functions and demonstration vignettes of increasing depth on how to construct the Self-Attention algorithm. The implementation is based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>, Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) <https://web.stanford.edu/~jurafsky/slp3/> "Speech and Language Processing (3rd ed.)", and Alex Graves (2020) <https://www.youtube.com/watch?v=AIiwuClvH6k> "Attention and Memory in Deep Learning".
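As an illustration of the kind of computation the package's vignettes walk through, below is a minimal sketch of scaled dot-product self-attention in base R. The softmax() helper, the toy 3-token input, and the random weight matrices are assumptions made for this example only; they are not functions or objects exported by the attention package.

    # Minimal sketch of scaled dot-product self-attention (Vaswani et al., 2017)
    softmax <- function(x) {
      e <- exp(x - max(x))   # subtract the max for numerical stability
      e / sum(e)
    }

    set.seed(1)
    X   <- matrix(rnorm(3 * 4), nrow = 3)   # 3 tokens, embedding size 4
    W_Q <- matrix(rnorm(4 * 4), nrow = 4)   # query projection weights
    W_K <- matrix(rnorm(4 * 4), nrow = 4)   # key projection weights
    W_V <- matrix(rnorm(4 * 4), nrow = 4)   # value projection weights

    Q <- X %*% W_Q
    K <- X %*% W_K
    V <- X %*% W_V

    scores  <- Q %*% t(K) / sqrt(ncol(K))   # scaled dot products, 3 x 3
    weights <- t(apply(scores, 1, softmax)) # row-wise softmax of the scores
    output  <- weights %*% V                # attention output, 3 x 4

Each row of weights sums to 1 and gives the proportions with which the corresponding token attends to every token, so output is a context-weighted combination of the value vectors.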

Version: 0.4.0
Suggests: covr, knitr, rmarkdown, testthat (≥ 3.0.0)
Published: 2023-11-10
DOI: 10.32614/CRAN.package.attention
Author: Bastiaan Quast [aut, cre]
Maintainer: Bastiaan Quast <bquast at gmail.com>
License: GPL (≥ 3)
NeedsCompilation: no
Materials: README, NEWS
CRAN checks: attention results

Documentation:

Reference manual: attention.html, attention.pdf
Vignettes: Complete Self-Attention from Scratch (source, R code)
Simple Self-Attention from Scratch (source, R code)

Downloads:

Package source: attention_0.4.0.tar.gz
Windows binaries: r-devel: attention_0.4.0.zip, r-release: attention_0.4.0.zip, r-oldrel: attention_0.4.0.zip
macOS binaries: r-release (arm64): attention_0.4.0.tgz, r-oldrel (arm64): attention_0.4.0.tgz, r-release (x86_64): attention_0.4.0.tgz, r-oldrel (x86_64): attention_0.4.0.tgz
Old sources: attention archive

Reverse dependencies:

Reverse imports: rnn, transformer

Linking:

Please use the canonical form https://CRAN.R-project.org/package=attention to link to this page.

