OpenAI to Release Sora to the Public in 2024: Could Be Available in Just Months

OpenAI recently introduced Sora, its text-to-video tool, which can produce lifelike 1080p videos. For now, the tool is being tested by a select group of filmmakers and creators to identify and address weaknesses before a public release. In an interview with the Wall Street Journal, OpenAI chief technology officer Mira Murati said the plan is to make Sora publicly available before 2025, possibly within just a few months.

Advancements in Video Generation

OpenAI's objective with Sora is to give creators a versatile tool for editing and content creation that serves their creative needs.

Safety and Future Prospects

Unlike the AI-generated video of Will Smith eating spaghetti that circulated a year ago, content generated by Sora is described by the OpenAI CTO as "hyperrealistic," apart from some anomalies in how hands and fingers are rendered. The company is adamant about ensuring the tool is safe before its public release and plans to watermark Sora-generated videos, much as text-to-image tools already do.

Impact on Creators and Data Usage

Murati emphasized that the text-to-video model is intended to complement creators' work rather than replace creators, serving as a tool to enhance creativity. On training data, she said Sora was trained on publicly available and licensed data, possibly including content from platforms such as YouTube, Facebook, and Instagram.

Sora's clips currently have no audio, though OpenAI intends to add it in the future, possibly in an enhanced version released under a different name. As with DALL-E, users may have to pay to access and use the model, and OpenAI is working toward an economically viable pricing model shaped by input from contributors in the film industry and beyond.
