New Sci-Fi Shorts Prompted Into Existence With AI

Charlie Fink
4 min read · Aug 25, 2023


Milan Cucuk made the short video 1000 Miles, Part 1, in eight hours from his own script, using the AI tools Midjourney (for image generation), Runway (for video generation), and ElevenLabs (for voice generation). The exceptionally fast cutting of the visuals is a good match for its tight storytelling, a requirement for a medium in which the maximum length of a shot is four seconds.

1000 Miles is the story of Axel, a down-on-his-luck space trash collector who abandons his post to rescue a girl whose coordinates are unknown. Part 1 and Part 2 are each edited from dozens of shots, drawn from over 1,000 video generations, into episodes of two minutes and thirty seconds. Part 2 is better than Part 1, in my opinion; it also took three times longer to produce (three days).

As the second part of 1000 Miles comes to an end, Axel is low on oxygen, losing his mind, perhaps dying. It's said that people who, like him, are afraid of heights or claustrophobic go insane up there. His deterioration makes you wonder whether the girl on the radio is even real.

Cucuk is working on Part 3, which will be released on Monday, August 7th. “These videos were produced over the last 6 weeks and are unique in that they use AI voices, my original script, and are telling a story versus a standalone clip or trailer, but I am also discovering the story, visually, as we go along,” said Cucuk in an interview Monday. “In Part 1, over 100 prompts were used with image references,” he explained. “I was still getting used to using Runway and did a lot of testing to see if I could maintain more control of the video generation. Part 2 didn’t use any prompts in Runway, only reference images I created in Midjourney. In Midjourney, I used around 40 prompts to generate the images I needed to use as reference pictures in Runway.”

Salt Lake City-based AI filmmaker Milan Cucuk. MILAN CUCUK

Both videos were produced using ElevenLabs for the voices, Midjourney for the images and storyboarding, and Runway Gen-2 for the video generations, with color correction and sound mixing done in Adobe Premiere Pro. Cucuk licensed the music and wrote the scripts himself. He has written several screenplays and a novel, and he does not use ChatGPT or any AI tools in his creative writing. Cucuk thinks “AI filmmaking will never replace traditional filmmaking. AI films and traditional films are two separate art forms. AI film is in its infancy. We are in the camera obscura era of this type of art and technology.”

Cucuk graduated from the University of Utah in 2020 with a BFA in Film and Media Arts. He dreams of making movies, but that goal seemed a long way off while he worked a day job as a senior UX/UI designer. He has kept his cinematic dreams alive over the years by writing and directing short films and by volunteering at the Sundance Film Festival from 2007 to 2016. After discovering Midjourney last year, Cucuk shifted his attention to producing AI cinema. Since then, he estimates he has spent over 1,200 hours studying AI and getting involved in the GenAI creator community. He is currently working on two collaborative projects with other creators from around the world.

Concept art for “1000 Miles” Part 2. MILAN CUCUK

Since Part 2 was released on July 14th, 1000 Miles has been invited to the Paris Short Film Festival and the Athens International Monthly Art Film Festival. Cucuk says he was contacted by Alejandro Matamala Ortiz, one of the co-founders of Runway, who asked to include 1000 Miles in the company’s upcoming showcase.

1000 Miles and the other emerging cinematic AI shorts demonstrate the disruptive potential of AI-enabled content production. Anyone can now make the greatest game, movie, or comic book universe of all time, with no art or coding skills, no actors, and no studio or distributor.

Someone with a good story, vision, and some prompting skills can summon the power of a Hollywood production, which even today costs hundreds of thousands of dollars an hour. The next Star Wars movie may be made in the cloud by a kid in Africa or Asia, imagining their own version of Tatooine. Because they are made of everything ever made, the resulting AI images will feel more authentic than a movie ever could.

Originally published on Forbes.com on August 1st, 2023.


Written by Charlie Fink

Consultant, Columnist, Author, Adjunct, Covering AI, XR, Metaverse for Forbes
