
Paper Reading Group: Attention Is All You Need

Watch ad-free on Taro: https://www.jointaro.com/lesson/p701C...

Generative AI, and large language models in particular, has taken the world by storm. But these models wouldn't exist today without a key paper published by Google Brain: "Attention Is All You Need." The 10-page paper, published in 2017, introduced the Transformer - a critical innovation in neural networks.
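
Not part of the video itself, but if you'd like a concrete feel for the paper's central idea before the session, here is a minimal NumPy sketch of scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The shapes and toy values below are purely illustrative.

```python
# Minimal sketch of the paper's core operation: scaled dot-product attention.
# Illustrative only - shapes, names, and toy data are not from the video.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V over a batch of sequences."""
    d_k = Q.shape[-1]
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)            # (batch, len_q, len_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)     # softmax over keys
    return weights @ V                                           # (batch, len_q, d_v)

# Toy usage: one sequence of 4 tokens with dimension 8 (self-attention, so Q = K = V).
rng = np.random.default_rng(0)
Q = K = V = rng.normal(size=(1, 4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (1, 4, 8)
```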

This session:

A deep dive and interactive discussion of the paper. We recommend reading (or at least skimming) it first: https://arxiv.org/pdf/1706.03762.pdf
A look at what came before the paper (neural networks, embeddings), and what came after (BERT, GPT-2, and ChatGPT).
All software engineers welcome - no AI/ML background necessary!

Did you know?

The paper's authors have all gone on to found or join billion-dollar AI companies, including Cohere and Character.ai.
Since 2017, the paper has been cited over 86,000 times.
The Transformer architecture is the T in "GPT".

Your host:

Charlie Guo is a seasoned founder and author with over 15 years of software experience and more than a decade in Silicon Valley. He is an alum of both Stanford and Y Combinator, and currently publishes Artificial Ignorance, an AI-focused newsletter for engineers and founders.

Charlie: / charlierguo
