
Argmax

Vahe Hagopian, Taka Hasegawa, Farrukh Rahman

Available episodes

5 of 17
  • Mixture of Experts
    In this episode we talk about the paper "Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer, Azalia Mirhoseini, Krzysztof Maziarz, Andy Davis, Quoc Le, Geoffrey Hinton, and Jeff Dean.
    54:46
  • LoRA
    We talk about Low-Rank Adaptation (LoRA) for fine-tuning Transformers. We are also on YouTube now! Check out the video here: https://youtu.be/lLzHr0VFi3Y
    1:02:56
  • 15: InstructGPT
    In this episode we discuss the paper "Training language models to follow instructions with human feedback" by Ouyang et al. (2022). We discuss the RLHF paradigm and how important RL is to tuning GPT.
    57:27
  • 14: Whisper
    This week we talk about Whisper, a weakly supervised speech recognition model.
    49:14
  • 13: AlphaTensor
    We talk about AlphaTensor, and how researchers were able to find a new algorithm for matrix multiplication.
    49:05
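The LoRA episode above covers low-rank fine-tuning. As a rough illustration of the core idea (a minimal NumPy sketch, not the podcast's or the paper's code; all shapes and hyperparameter values here are made up), a frozen weight matrix W is adapted by learning only a low-rank update B @ A:

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r = 64, 64, 4  # illustrative dimensions; rank r << d

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable, small random init
B = np.zeros((d_out, r))                    # trainable, zero init
alpha = 8.0                                 # scaling hyperparameter

def lora_forward(x):
    # Only A and B would receive gradients during fine-tuning; W stays frozen.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
# With B initialized to zero, the adapted layer initially matches the frozen layer.
assert np.allclose(lora_forward(x), W @ x)
# Far fewer trainable parameters than updating W itself:
print(A.size + B.size, "vs", W.size)
```

Because only A and B are trained, the number of tunable parameters drops from d_out * d_in to r * (d_out + d_in), which is the main appeal of the method.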


About Argmax

A show where three machine learning enthusiasts talk about recent papers and developments in machine learning. Watch our video on YouTube https://www.youtube.com/@argmaxfm
