
801: Merged LLMs Are Smaller And More Capable, with Arcee AI's Mark McQuade and Charles Goddard

1:17:05
 
Content provided by Jon Krohn. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and delivered directly by Jon Krohn or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://no.player.fm/legal.

Merged LLMs are the future, and in this episode Jon Krohn explores how with Mark McQuade and Charles Goddard of Arcee AI. Learn how to combine multiple LLMs without adding bulk, train more efficiently, and dive into different expert approaches. Discover how smaller models can outperform larger ones and leverage open-source projects for big enterprise wins. This episode is packed with must-know insights for data scientists and ML engineers. Don’t miss out!

Interested in sponsoring a SuperDataScience Podcast episode? Email natalie@superdatascience.com for sponsorship information.

In this episode you will learn:

• Explanation of Charles' job title: Chief of Frontier Research [03:31]

• Model merging: combining multiple LLMs without increasing model size [04:43] (see the merge sketch after this list)

• Using MergeKit for model merging [14:49]

• Evolutionary model merging: using evolutionary algorithms to search for strong merge configurations [22:55]

• Commercial applications and success stories [28:10]

• Comparison of Mixture of Experts (MoE) vs. Mixture of Agents [37:57]

• The Spectrum project: efficient training by targeting specific modules [54:28] (see the freezing sketch below)

• Future of Small Language Models (SLMs) and their advantages [01:01:22]
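
To make the core idea concrete, here is a minimal sketch of the simplest merge discussed above: a linear weight average of two checkpoints that share an architecture. This is not MergeKit's API, just the underlying arithmetic; MergeKit implements this and more capable methods such as SLERP and TIES, and is typically driven by a YAML config via its mergekit-yaml command. The model names below are placeholders.

```python
# Minimal sketch of a linear merge (weight averaging): combining two LLMs
# without increasing parameter count. Model names are placeholders for any
# two checkpoints that share an architecture.
import torch
from transformers import AutoModelForCausalLM

def linear_merge(model_a, model_b, alpha: float = 0.5) -> dict:
    """Interpolate the parameters of two same-architecture models."""
    state_a, state_b = model_a.state_dict(), model_b.state_dict()
    with torch.no_grad():
        return {name: alpha * state_a[name] + (1.0 - alpha) * state_b[name]
                for name in state_a}

model_a = AutoModelForCausalLM.from_pretrained("org/checkpoint-a")  # placeholder
model_b = AutoModelForCausalLM.from_pretrained("org/checkpoint-b")  # placeholder

# Reuse one model as the container for the merged weights; the result is
# the same size as either parent.
model_a.load_state_dict(linear_merge(model_a, model_b, alpha=0.5))
model_a.save_pretrained("./merged-model")
```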

Additional materials: www.superdatascience.com/801
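
And as a rough illustration of the Spectrum idea of training only selected modules, the hypothetical sketch below freezes everything except a hand-picked set of modules. The target list is invented for illustration; Spectrum itself ranks modules by a signal-to-noise measure to decide which ones to train.

```python
# Hypothetical sketch of Spectrum-style selective training: freeze every
# parameter, then unfreeze only the modules flagged as worth training.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("org/checkpoint-a")  # placeholder

# Invented example: suppose an analysis pass flagged these module names.
target_modules = ["layers.10.self_attn", "layers.11.mlp"]

for name, param in model.named_parameters():
    # Train a parameter only if it belongs to a targeted module.
    param.requires_grad = any(fragment in name for fragment in target_modules)

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"Trainable parameters: {trainable:,}")
```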
