AILinkDeepTech
Mixture of Experts (MoE) Coding | MoE Code Implementation | Mixture of Experts Model
3 months ago - 7:04
Modular
The Future of Compute Portability
2 days ago - 16:33
PyTorch
DeepSpeed – Efficient Training Scalability for Deep Learning Models - Olatunji Ruwase, Snowflake
1 day ago - 18:40
PyTorch
Multimodal Open Source at Kyutai, From Online Demos to On-Device - Alexandre Défossez
1 day ago - 21:26
PyTorch
Best Practices for Open Multilingual LLM Evaluation - Catherine Arnett, EleutherAI
1 day ago - 15:45
PyTorch
Lightning Thunder: Supercharged PyTorch for Modern Hardware - Luca Antiga, Lightning AI
1 day ago - 17:27
PyTorch
Harnessing Common Crawl for AI and ML Applications - Pedro Ortiz Suarez, Common Crawl
1 day ago - 14:23
PyTorch
Real-World Robotics as the Next Frontier for AI? - Pierre Rouanet, Pollen Robotics
1 day ago - 16:04
Tech Deiyo
Unlock Free AI Power: Test & Compare Top LLM Models with LM Arena!
2 days ago - 12:53
GitHub Daily Trend
GitHub - datawhalechina/self-llm: "The Open-Source LLM Usage Guide", tailored for Chinese beginners: quick fine-tuning (full-parameter/LoRA) and deployment of domestic and international open-source LLMs in a Linux environment / multi...
2 days ago - 3:54
nanohubtechtalks
Benchmarking Universal Machine Learning Force Fields with CHIPS-FF
15 hours ago - 1:04:51
2FM RC
Furitek Micro Python Pro! Tuning at your fingertips?
2 days ago - 22:33