
Content provided by Podcamp Media and Dusty Weis. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Podcamp Media and Dusty Weis or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://no.player.fm/legal.

41. Deepfakes: How Communicators Must Prepare Now for this Imminent Reputation Threat

48:56
 


It is now only a matter of time until someone attacks your reputation with a deepfake, according to the experts.

So-called deepfake technology, which can synthesize audio and video of things that never happened, has arrived en masse.

And, while these tools for generating potential disinformation were previously only available to trained experts and big institutions, recent advances in artificial intelligence technology mean that ANYONE can create fake videos... nearly instantly, with little to no training, for FREE.

Accordingly, experts like Dr. Hany Farid from UC-Berkeley say deepfakes are suddenly being used to wage disinformation campaigns every day.

So in this episode, Dr. Farid cites some examples of how deepfake technology is being used to attack important people and institutions, and lays out strategies that strategic communicators can use to try and protect their clients and employers.

We talk to Francesca Panetta and Halsey Burgund, the Emmy-winning film directors who used a viral deepfake of President Richard Nixon to try to warn society about the growing threat, and learn some shocking facts about the technology.

And we meet Noelle Martin, a lawyer, researcher and activist from Australia whose reputation has been targeted with deepfake pornography. Noelle tells us about her efforts to create legal recourse for the non-consenting victims of deepfake porn and her battle to reclaim her reputation.

Because deepfake technology no longer poses a reputation threat "sometime in the next few years."

It poses a threat RIGHT NOW.

Subscribe to the Podcamp Media e-newsletter for more updates on the world of strategic communication.

Learn more about your ad choices. Visit megaphone.fm/adchoices


58 episodes




 
