Have you ever seen this video?
Here is another video of Will Smith eating spaghetti.
Jaggy, weird and hilarious.
Less than a year ago, this was what AI-generated videos looked like, and anyone could tell immediately. People used to look at this video and reassure themselves that AI-generated video still had a long way to go before producing anything believable; that despite all the crazy progress we'd made with ChatGPT and other chatbots, and despite all the stunningly realistic images from Midjourney and DALL-E, generating video with AI seemed too complex, like a far-future thing.
And here are AI-generated videos now:
Pretty believable, right?
These are some recent videos OpenAI published alongside their new product, Sora. There are obviously some weird details in both of these videos, but they are all pretty minor and hard to spot at first glance. Some videos, as OpenAI itself has shown, have strange physics, objects popping out of nowhere, deformed hands, and so on. You can still feel something is off and not entirely authentic.
But I highly doubt this is the best we can get from AI-generated video. Give these models enough time, and soon we won't be able to tell which is which anymore.
So what does this mean for everybody?
To me, this product is enormously promising, but it also carries some frightening implications once it's released to the public.
First, this product is a major disruption to the stock video industry. Everyone, including content creators, educators, students, etc., can now create videos themselves, from something generic like this (also created by Sora):
to something oddly specific (say, Will Smith eating spaghetti) for their own use, for free or for a small subscription fee.
Second, despite safeguard measures, I believe that as soon as the model is public, communities will form to try to bypass those measures. It happened with AI chatbots; it happened with AI image generators; it will happen with AI video generators.
The consequences of those safeguards being bypassed would range from harmless, silly videos of celebrities to something as disgusting as non-consensual pornography of real girls and women, or propaganda videos pushing whatever political agenda. Remember those Taylor Swift pictures from just a couple of weeks ago?
However, I'm hopeful that, given the serious threat this tool poses and the scandals that have already occurred, companies like OpenAI will be more cautious before releasing these AI models to the public.
But isn't it weird that Karpathy, a founding member of OpenAI, left the company just a couple of days before they announced Sora?
Read more:
Sora (openai.com)
OpenAI introduces Sora, its text-to-video AI model - The Verge