
#179 New Trends in Machine Translation with Large Language Models by Longyue Wang

37:13
 
Content provided by Slator. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and delivered directly by Slator or their podcast platform partner. If you believe someone is using your copyrighted work without permission, you can follow the process outlined here: https://no.player.fm/legal.

Joining SlatorPod this week is Longyue Wang, a Research Scientist at Tencent AI Lab, where he is involved in the research and practical applications of machine translation (MT) and natural language processing (NLP).
Longyue expands on Tencent’s approach to language technology, where MT is integrated into Tencent Translate (TranSmart). He highlights how Chinese-to-English MT has made significant advances thanks to improvements in technology and data size, while translating from Chinese into non-English languages has remained more challenging.
Longyue’s recent research explores the impact of large language models (LLMs) on MT, demonstrating their strength in tasks like document-level translation. He emphasizes that GPT-4 outperformed traditional MT engines in translating literary texts such as web novels.
Longyue discusses various promising research directions for MT using LLMs, including stylized MT, interactive MT, translation memory-based MT, and a new evaluation paradigm. His research suggests LLMs can enhance personalized MT, adapting translations to users' preferences.
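
To make the translation-memory idea concrete, below is a minimal Python sketch of how fuzzy TM matches could be folded into an LLM prompt as in-context examples. It is an illustration only, assuming a generic chat-style model; the TRANSLATION_MEMORY entries and helper functions are hypothetical and not taken from Tencent's TranSmart or the episode.

# Hypothetical sketch of translation-memory-based MT with an LLM:
# retrieve fuzzy TM matches for the source sentence and include them
# in the prompt as in-context examples.
from difflib import SequenceMatcher

# Hypothetical translation memory: (source, target) pairs.
TRANSLATION_MEMORY = [
    ("点击按钮保存文件。", "Click the button to save the file."),
    ("保存文件前请检查设置。", "Check the settings before saving the file."),
]

def fuzzy_matches(source: str, k: int = 2, threshold: float = 0.4):
    """Return up to k TM entries whose source side is similar to the input."""
    scored = [
        (SequenceMatcher(None, source, tm_src).ratio(), tm_src, tm_tgt)
        for tm_src, tm_tgt in TRANSLATION_MEMORY
    ]
    scored.sort(reverse=True)
    return [(s, t) for score, s, t in scored[:k] if score >= threshold]

def build_prompt(source: str) -> str:
    """Assemble a prompt that shows TM matches as examples, then asks for
    a translation of the new sentence."""
    lines = ["Translate from Chinese to English.",
             "Here are similar past translations:"]
    for tm_src, tm_tgt in fuzzy_matches(source):
        lines.append(f"Chinese: {tm_src}\nEnglish: {tm_tgt}")
    lines.append(f"Now translate:\nChinese: {source}\nEnglish:")
    return "\n\n".join(lines)

if __name__ == "__main__":
    # The prompt would be sent to any chat-style LLM; printing it here
    # keeps the sketch self-contained.
    print(build_prompt("点击按钮导出文件。"))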
Longyue also sheds light on how Chinese researchers are focusing on building Chinese-centric MT engines, directly translating from Chinese to other languages. There's an effort to reduce reliance on English as a pivot language.
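
For readers unfamiliar with pivot translation, the short sketch below contrasts the two routes mentioned here: translating through English versus translating directly from Chinese. The translate() helper is a hypothetical placeholder for an MT engine or LLM call, not an actual API.

# Sketch contrasting pivot translation (Chinese -> English -> German) with
# direct Chinese-centric translation (Chinese -> German). The translate()
# helper is hypothetical; in practice it would wrap a real MT engine or LLM.

def translate(text: str, src: str, tgt: str) -> str:
    """Placeholder for a real MT call; returns a tagged string for illustration."""
    return f"[{src}->{tgt}] {text}"

def pivot_translate(text: str, src: str, tgt: str, pivot: str = "en") -> str:
    """Two hops through a pivot language; errors can compound across hops."""
    intermediate = translate(text, src, pivot)
    return translate(intermediate, pivot, tgt)

def direct_translate(text: str, src: str, tgt: str) -> str:
    """Single hop, the Chinese-centric setup discussed in the episode."""
    return translate(text, src, tgt)

if __name__ == "__main__":
    sentence = "今天的会议改到下午三点。"
    print(pivot_translate(sentence, "zh", "de"))   # zh -> en -> de
    print(direct_translate(sentence, "zh", "de"))  # zh -> de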
Looking ahead, Longyue's research will address challenges related to LLMs, including hallucination and the timeliness of information.


Chapters

1. Intro (00:00:00)

2. What is Tencent? (00:01:29)

3. Professional Background and Interest in MT and NLP (00:03:44)

4. Tencent's Interest in Language Technology (00:06:03)

5. Perception of Language Technology in China (00:08:42)

6. MT Quality for Chinese (00:12:01)

7. ChatGPT's Translation Capabilities (00:16:45)

8. Interesting Directions for MT Using LLMs (00:20:06)

9. Translation Memory-Based MT (00:22:51)

10. Interactive MT (00:24:05)

11. Using ChatGPT to Evaluate Translation (00:25:56)

12. Personalized MT and Multi-Modal MT (00:27:57)

13. The Focus of China-Based Research (00:30:35)

14. Future Research Initiatives (00:33:55)
