
GELU activation function in 💯 lines of PyTorch code | Machine Learning

Machine Learning: Implementation of the paper "Gaussian Error Linear Units (GELUs)" in 100 lines of PyTorch code.

Link to the paper: arxiv.org/abs/1606.08415
GitHub: github.com/MaximeVandegar/Papers-in-100-Lines-of-C…
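As a quick sketch of the math the paper (and the linked repo) implement: GELU weights the input by the standard normal CDF, GELU(x) = x · Φ(x), and the paper also gives a tanh-based approximation. The function names below are illustrative, not taken from the repository:

```python
import math

def gelu_exact(x):
    # Exact GELU from the paper: x * Phi(x), where Phi is the
    # standard normal CDF, expressed via the error function.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x):
    # tanh approximation from the paper:
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + math.tanh(
        math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))
```

In PyTorch the same function is available as `torch.nn.functional.gelu` (with `approximate="tanh"` selecting the approximation).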

-----------------------------------------------------------------------------------------------------
CONTACT: papers.100.lines@gmail.com
#python #gelu #neuralnetworks #machinelearning #artificialintelligence #deeplearning #data #bigdata #supervisedlearning #research #activationfunction #relu #activation #function
