If you’re like me, then lately you’ve scrolled past something on social media and thought, “Wait, was that real?” Deepfakes are everywhere, and they’re getting a lot more convincing.
That brings me to my Decoder guest today: Gaurav Misra, the CEO of Captions. You may not have heard of Captions yet, but you’ve probably seen a video that was generated using its AI models. The company’s Mirage Studio platform lets anyone generate AI versions of real people, and the results are alarmingly realistic.
Captions just put out a blog post titled, “We Build Synthetic Humans. Here’s What’s Keeping Us Up at Night.” It’s a good overview of the state of deepfakes and where they’re headed.
Because Gaurav runs a company that builds deepfake technology, I wanted to know what specifically keeps him up at night, which you’ll hear us get into. I’m generally more optimistic about the long-term impacts of AI than a lot of people, but as you’ll hear in this conversation, I’m a lot more nervous about this topic.
Ultimately, I came away from this episode unsettled by the fact that the deepfakes of today are the least believable they’ll ever be, we are not ready, and the companies building this tech are racing ahead anyway.
If you’d like to read more on what we talked about in this episode, check out the links below:
Questions or comments about this episode? Hit us up at decoder@theverge.com. We really do read every email!