
Evolution of AGI and NLP Algorithms Using Latent Variables: Future of AI (Part 3 of 3)

We investigate the evolution of current natural language processing (NLP) algorithms. Latent variables were the key to moving from simple term frequency times inverse document frequency (TF-IDF) to Word2Vec and Doc2Vec, and were likewise the key to moving from k-means clustering to the latent Dirichlet allocation (LDA) topic-modeling algorithm. The next major evolution was spurred by the introduction of transformers, which made possible both BERT and the various large language models (LLMs), including the GPT series and ChatGPT. The introduction of LLMs has brought increased awareness of the alignment problem. While real intelligence is not likely to emerge from transformer-based methods, it is likely to emerge from variational-inference approaches, especially those using active inference.
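To make the starting point of this evolution concrete, here is a minimal sketch of TF-IDF weighting using only the Python standard library. The function name, tokenization, and the log(N/df) form of IDF are illustrative choices, not from the video; production systems typically use a library such as scikit-learn.

```python
import math
from collections import Counter

def tf_idf(docs):
    """Return one {term: tf-idf weight} dict per document.

    tf  = raw count of the term in the document
    idf = log(N / df), where df is the number of documents
          containing the term (so terms in every document get weight 0)
    """
    n_docs = len(docs)
    tokenized = [doc.lower().split() for doc in docs]

    # Document frequency: in how many documents does each term occur?
    df = Counter()
    for tokens in tokenized:
        df.update(set(tokens))

    weights = []
    for tokens in tokenized:
        tf = Counter(tokens)
        weights.append(
            {t: tf[t] * math.log(n_docs / df[t]) for t in tf}
        )
    return weights

# Example: "the" occurs in both documents, so its weight is zero;
# "cat" occurs in only one, so it gets a positive weight.
w = tf_idf(["the cat sat", "the dog ran"])
```

Note how a shared term like "the" is scored zero: TF-IDF captures term distinctiveness, but each term is still an independent dimension, which is exactly the limitation that latent-variable methods such as Word2Vec address.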

Please OPT-IN with Themesis on the About page to get word AS SOON as new YouTube videos, blogs, and short courses are released:

Opt-In HERE: www.themesis.com/themesis/

Visit Themesis for the associated blogpost (good links!):
https://themesis.com/2023/05/22/evolu...

Subscribe to the Themesis YouTube channel easily - click this link: https://www.youtube.com/@themesisinc....
