OpenAI puts Sora AI video generator on hold after testers leak access

Testers say they were protesting ‘art washing’

OpenAI has temporarily paused access to its highly anticipated video generation tool, Sora, after a group of artists leaked access to the tool to protest the company's alleged exploitative practices.

The artists claim they were used as largely unpaid labour to test and refine the AI model, receiving little in return.

On Tuesday, the group created a public webpage that used the tool's API to give anyone access to Sora.

In an open letter, the artists expressed their frustration with OpenAI's approach, claiming they were "lured into 'art washing'" to promote the company's image.

They say they were tasked with extensive bug testing, feedback provision and experimental work, while receiving minimal benefits.

The artists also criticised the company's process for selecting artists to showcase their work, arguing that it prioritised PR and marketing over genuine artistic expression.

"Hundreds of artists provide unpaid labour through bug testing, feedback and experimental work for the program for a company valued at $150B," the letter said.

“While hundreds contribute for free, a select few will be chosen through a competition to have their Sora-created films screened — offering minimal compensation which pales in comparison to the substantial PR and marketing value OpenAI receives.”

AI-generated videos created using Sora began circulating online following the leak.

An OpenAI spokesperson said the company is investigating the situation, adding that participation in the early access programme is voluntary.

"Hundreds of artists in our alpha have shaped Sora's development, helping prioritise new features and safeguards. Participation is voluntary, with no obligation to provide feedback or use the tool," they said.

Unveiled in February 2024, Sora is OpenAI's cutting-edge AI model designed for generating short video clips of up to one minute from natural language prompts. Users can describe scenes in detail, specifying objects, interactions and environmental elements, which the model interprets to create video content.

The technology leverages OpenAI's transformer neural network architecture and employs a diffusion-based approach, starting with noise-filled frames and refining them incrementally.
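The "start from noise, refine incrementally" idea can be sketched with a toy loop. This is a minimal illustration only, not OpenAI's actual pipeline: the linear blend below stands in for a learned denoising network, and every function name and constant here is invented for the example.

```python
import numpy as np

def toy_reverse_diffusion(target, steps=50, seed=0):
    """Iteratively refine a pure-noise 'frame' toward a clean target frame.

    A real diffusion model would *predict* the denoised frame with a neural
    network; here we simply use the known target as the prediction.
    """
    rng = np.random.default_rng(seed)
    frame = rng.standard_normal(target.shape)  # step 0: pure noise
    for t in range(steps):
        predicted = target                      # stand-in for the model's output
        alpha = (t + 1) / steps                 # schedule: noise shrinks over time
        noise = rng.standard_normal(target.shape) * (1 - alpha)
        # Blend the current frame toward the prediction, re-injecting
        # a diminishing amount of noise each step.
        frame = 0.8 * frame + 0.2 * predicted + 0.05 * noise
    return frame

target = np.ones((4, 4))                # a trivial "clean" 4x4 frame
result = toy_reverse_diffusion(target)
error = float(np.abs(result - target).mean())
```

After enough steps the noise contribution decays and the frame converges toward the target, which is the essence of the incremental refinement described above.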

The tool is currently in a private alpha phase, accessible only to select artists and cybersecurity experts to provide feedback on its creative and security capabilities.

OpenAI has not announced a release date but has hinted at plans to address known limitations, such as challenges in simulating complex physics and interpreting spatial details in prompts.

While the leaked access has been disabled, the incident has sparked a debate within the AI industry on the ethical implications of using outside testers (a practice known as "red teaming").

While the practice is common, it has drawn criticism from experts who argue that it can stifle independent research, reduce transparency and limit accountability.

The artists involved in the leak say they are not opposed to AI technology itself but rather the exploitative practices employed by companies like OpenAI.

They have called on the company to address their concerns and implement fairer compensation and recognition for creative professionals who contribute to the development of AI tools.