AI Camera Control Keeps Getting Better

28 Years Later Uses iPhone, UFC's Crazy Immersive Fight


Welcome to VP Land!

Sorry for the silence last week - we were out in NY, shooting some new videos for the channel.

Spoiler: related to this past podcast episode.

And check out the behind-the-scenes studio tour of Vu if you missed it last week.

In this edition we've got a bunch of new AI tool updates, the UFC's crazy immersive fight at the Sphere, and more.

Let’s jump into it!

Runway’s Gen-3 Video to Video, Kling AI’s Motion Brush, and More AI Updates

Here is a roundup of a few of the top AI updates on our radar.

Runway Gen-3: AI with Camera Control & Actor Performances

Runway launched video-to-video generation with their newest Gen-3 Alpha model.

So far, this has proven to be the best way to keep full camera control and capture real actor performances while still tapping into AI's ability to change character looks and locations.

Check out this side-by-side demo along with more demos here, here, and this great one that maintained the actor's performance while turning him into a wizard.

Hailuo AI - China’s New Gen AI Competitor

Source: el.cine on X

The new AI model Hailuo AI is developed by MiniMax, a rapidly growing Chinese AI startup. MiniMax is backed by major investors such as Alibaba and Tencent, and it has already raised $600 million.

Their text-to-video generator has demonstrated some impressive, realistic results.

Check out the video above and some more demos here.

Kling AI Gets a Motion Brush and Updated Model

Source: el.cine on X

Kling AI has added a motion brush for more precise control over image-to-video generations. They've also released an improved model, version 1.5, with impressive results.

Important to note: currently the motion brush is only available in version 1 of the model, not 1.5.

More AI Updates

  • Diffusion models, traditionally used for AI image generation, have demonstrated the ability to function as game engines by predicting and rendering frames of the classic game DOOM at 20 fps without relying on a traditional game engine.

  • Andreas Jansson releases their open-source FLUX fine-tuner built on Ostris' ai-toolkit, enabling automatic captioning, LoRA inference, and Hugging Face upload.

  • Purz.xyz integrates ComfyUI for fast, efficient video upscaling using the FL Upscale Model and FL KSampler Plus nodes, with customizable options for high-quality results.

  • Synthetic training data, generated by the model itself, can enhance the accuracy, diversity, and style of fine-tuned Flux models for image generation (see the quick sketch after this list).

  • Behind-the-scenes look at the AI-animated short film ÌRÀWÒ by John The Artist.
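
To make that synthetic-data idea a bit more concrete, here's a minimal, hedged sketch of what a generation pass with FLUX could look like using Hugging Face's diffusers library. The prompts, style string, and file paths are made up for illustration - this isn't the exact workflow from the linked post, just the general pattern of generating images and saving their prompts as captions for a later LoRA fine-tune (e.g. with ai-toolkit).

# Hedged sketch: generate style-consistent images with FLUX via diffusers and
# save each prompt as a sidecar .txt caption, the pairing most LoRA trainers
# (including ai-toolkit) expect. Prompts, paths, and style are placeholders.
import os
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")

os.makedirs("dataset", exist_ok=True)

style = "hand-painted watercolor, soft edges, muted palette"  # target look (made up)
subjects = [
    "a lighthouse at dusk",
    "a street market in the rain",
    "a portrait of an astronaut",
]

for i, subject in enumerate(subjects):
    prompt = f"{subject}, {style}"
    image = pipe(prompt, num_inference_steps=28, guidance_scale=3.5).images[0]
    image.save(f"dataset/synthetic_{i:03d}.png")
    # Caption file lets the fine-tuner pair each image with its prompt.
    with open(f"dataset/synthetic_{i:03d}.txt", "w") as f:
        f.write(prompt)

From there, the image/caption pairs can be pointed at whichever LoRA trainer you're already using.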

SPONSOR MESSAGE

The Daily Newsletter for Intellectually Curious Readers

  • We scour 100+ sources daily

  • Read by CEOs, scientists, business owners and more

  • 3.5 million subscribers

UFC 306 at the Sphere: A Tech-Powered Spectacle

UFC just raised the bar with one of the most mind-bending, visually enticing live sports events we've ever seen.

Noche UFC, a Mexican Independence Day-themed fight card at the Sphere in Las Vegas, showcased groundbreaking technology and immersive experiences, setting a new standard for sports entertainment.

The Big Picture

The event combined cutting-edge audiovisual technology with innovative storytelling to create an unprecedented fan experience.

A six-part movie, "For Mexico, For All Time," was developed for the show by independent artists and integrated throughout the event.

Check out more of the videos displayed on the Sphere.

28 Years Later Goes Back to Its Roots and Shoots on iPhone - But Is It Using the iPhone to Its Advantage?

News broke that Danny Boyle's upcoming 28 Years Later was shot on an iPhone. This checks out - the first film, 28 Days Later, was shot on the Canon XL1, a prosumer-level Mini DV camcorder (and the holy grail of Mini DV cameras for every kid in high school 👀).

But, unsurprisingly, the internet started losing its mind at this behind-the-scenes pic showing an iPhone rigged into a traditional cinema production kit, which still costs hundreds of thousands of dollars.

This feels like a throwback to last year when Apple revealed they shot their keynote on an iPhone, but they also used professional lighting and rigging.

Nothing new here - throwback to my post from 10 years ago, when Apple first started filming spots on the iPhone: shooting on a phone doesn't mean all the other tools needed for crafting high-quality images get thrown out the window.

However…

My bigger question, seeing the iPhone rigged with big, heavy lenses on a big, heavy O’Connor head, is: did they really use it to its advantage?

My favorite shot-on-iPhone film is Soderbergh's High Flying Bird, shot on the iPhone 8 with Filmic even though the X was available. Beyond just filming on the iPhone, they kept the entire production light and nimble since the camera was so tiny.

Here's a quote from an interview I did for Filmmaker Magazine with Filmic's creator Neill Barham on the rising use of iPhones in filmmaking back in 2019:

Filmmaker: The thing I found interesting when looking at the production of High Flying Bird was not only were they filming on an iPhone 8 but that all of the camera support was kept very consumer level. They used a Moondog anamorphic lens, Beastgrip Pro, and Osmo Mobile 2 stabilizer, all very affordable and accessible products.

Neill: I think one of the most amusing iPhone filmmaking images of the past year is the one of Soderbergh in the gym on High Flying Bird. He has two actors sitting on a bench in a basketball gymnasium and the tripod that he’s using looks like any $79 to $129 soccer dad tripod. And you’re like, man, you’ve got to have a Miller or Sachtler tripod sitting in your closet that’s worth $1,500, so I think that’s like a beautiful moment.

Obviously, we’re going off one still BTS photo - it will be interesting to see if the iPhone got used in more unique ways once the film and more BTS info come out.

🗣️ Francis Ford Coppola's vision for 'Megalopolis' included a unique audience-participation feature, where viewers could ask Adam Driver's character questions and receive relevant pre-filmed responses, though the idea was ultimately not implemented. Too bad - could've yelled out questions while zip-lining through the theater.

🗄️ Lionsgate partners with AI firm Runway to train an exclusive model on its film and TV library, aiming to enhance content creation while reducing costs.

🎥 Explore the gallery of videos shot with the Blackmagic URSA Cine 12K, showcasing its ability to capture resolutions up to 12K for both foreground and background elements.

👁 RED launches the Denz Premium Optics Eyepiece, an upgrade for the RED Compact EVF, designed to enhance viewfinder image quality and assist with camera operation, priced at $4,250.

🎙 Yellowtec's new mic stand, the mika Table Stand, features sturdy construction and an ergonomic design that ensure reliable support and optimal microphone positioning for clear audio capture.

📆 Upcoming Events


  • September 30 to October 3 - Unreal Fest Seattle 2024

View the full event calendar and submit your own events here.

Thanks for reading VP Land!

Have a link to share or a story idea? Send it here.

Interested in reaching media industry professionals? Advertise with us.
