This is the first part of the Transformer series. Here, I build an intuitive understanding of the self-attention mechanism in transformer networks.
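As a rough companion to the video (not taken from it), here is a minimal NumPy sketch of the scaled dot-product self-attention defined in the paper; the projection matrices Wq, Wk, Wv, the dimensions, and the toy inputs are all illustrative assumptions:

import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # Project the input embeddings into queries, keys, and values.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Compare each query against every key; scale by sqrt(d_k) as in the paper.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Softmax over keys turns each row of scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each token's output is the attention-weighted sum of the values.
    return weights @ V

# Toy example (illustrative): 4 tokens, model dimension 8.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 8): one context vector per token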
[Paper] Attention Is All You Need: papers.nips.cc/paper/7181-attention-is-all-you-nee…
Other Resources:
Video Lecture on Word2Vec: Lecture 2 | Word Vector Representations: w...
Great article on Word2Vec: jalammar.github.io/illustrated-word2vec/