Vision Pro Updates: Unity, VisionOS, and VPG
🌐 Motion sickness patents, Qualcomm XR, and more
Welcome to VP Land - your daily update for all things virtual production & AI in video.
In this issue we’re covering a grab-bag of updates about Apple’s Vision Pro. Let me know what you think about the coverage - too much? Too little? Just hit reply.
In this issue:
🥽 Unity starts rolling out their Vision Pro toolkit
👫 Vision Pro gets its own SILO
🏢 The “Alaian” initiative for XR startups
And more! Let’s dive in.
PRINCIPAL PHOTOGRAPHY
Time to start building for the Vision Pro with Unity
Unity, the game engine developer, is launching a beta program for creating apps designed to run on Apple's Vision Pro headset.
The program, the result of a collaboration between Unity and Apple, aims to streamline the creation of immersive experiences by pairing Unity's PolySpatial technology with Apple's RealityKit.
The Unity platform is poised to make Vision Pro development more accessible, especially for games, thanks to its extensive library of tools, plugins, and resources, and it could reduce the amount of work needed compared to building directly with SwiftUI.
With the launch of the beta program, Unity is adapting its subscription model so its developers can design experiences specifically for Apple's new mixed-reality headset.
Unity's PolySpatial technology, when combined with Apple's RealityKit app rendering, can ensure a consistent aesthetic across games and other visionOS apps, improving user experiences on the Vision Pro.
PolySpatial lets apps run in visionOS's Shared Space by translating Unity's materials, shaders, and other assets to RealityKit, which is integral to the Shared Space. There are some limitations, however: developers will need to make tweaks and build new shaders for their apps to run on Vision Pro.
According to Ars Technica, visionOS doesn't give apps direct access to the cameras, in the interest of privacy, so developers can't bypass RealityKit.
Mike Rockwell, VP of the Vision Products Group at Apple (more on that below), lauded the Unity developer community's ability to build impressive 3D experiences. He also stressed that Unity-based apps would run natively on Vision Pro, granting access to its unique features like low latency pass-through and high-resolution rendering.
Unity's Beta program is currently open for registration.
Sponsor Message
Looking to revolutionize your advertising strategy? Discover Adspective, the innovative solution that automatically adds advertising content to your existing photos and videos.
With Adspective, you can breathe new life into your visual content, making your ads more engaging and effective than ever before.
But that's not all. Adspective also offers new cost-effective ways to publish ads, making it a game-changer for businesses of all sizes. Whether you're a startup on a tight budget or a large corporation looking to maximize your advertising ROI, Adspective is the solution you've been waiting for.
SECOND UNIT
Vision Pro Gets Its Own Corporate SILO
Source: Bloomberg
Apple has split from its usual org structure by creating a dedicated group just focused on the Vision Pro (and future mixed reality headsets), according to Bloomberg.
Traditionally, Apple's structure doesn't have dedicated divisions for specific products, such as the iPhone or iPad. Instead, it operates under a "functional" structure where different departments like software engineering, hardware development, machine learning, and design all collaborate on various products.
With VPG, Apple has formed a standalone unit with its own software and hardware engineering teams and other specific departments, all focused on the development of the Vision Pro headset.
This shift suggests that the Vision Pro's unique and complex nature required a dedicated, specialized team, contrasting with the integrated, cross-functional approach usually applied to Apple's product development.
Fighting VR Sickness
Apple has been granted a patent for a system aimed at reducing the nausea sometimes experienced with VR and AR due to a phenomenon called "oculovestibular mismatch."
This occurs when the motion seen in the headset doesn't match the motion felt by the body, such as when a user is in a moving vehicle.
Apple's system uses two motion sensors, one in the headset and one in the vehicle, to distinguish the user's movements from the vehicle's.
By doing so, it provides a more comfortable VR or AR experience by minimizing the discrepancy between visual inputs and physical sensation.
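The core idea, subtracting the vehicle's measured motion from the headset's, can be sketched in a few lines. This is an illustration of the concept only, not Apple's implementation; the function name and sample readings are hypothetical.

```python
def head_motion_relative_to_vehicle(headset_gyro, vehicle_gyro):
    """Subtract the vehicle's angular velocity (rad/s, per axis) from the
    headset's reading to isolate how the user actually moved their head."""
    return tuple(h - v for h, v in zip(headset_gyro, vehicle_gyro))

# Example: the car turns at 0.2 rad/s while the user also turns their head.
# The headset IMU measures both motions combined; the vehicle IMU measures
# only the car's turn, so the difference is the head motion the user felt.
headset = (0.5, 0.2, 0.0)   # head turn + car turn, combined
vehicle = (0.0, 0.2, 0.0)   # car turn alone
print(head_motion_relative_to_vehicle(headset, vehicle))  # (0.5, 0.0, 0.0)
```

Rendering from the relative motion means the visuals no longer swing with every bump or turn the body already feels, which is exactly the mismatch the patent targets.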
I personally hope they figure out a way to minimize motion sickness if you're driving in VR while sitting in your chair. I've gotten some hardcore motion sickness doing this in the Quest 2.
AKS
Eight international telecom operators have joined forces with Qualcomm to launch "Alaian", a global initiative aimed at creating and supporting startups focused on XR.
Kimmo Kaunela created a brutalist office space, which is now free on the Unreal Marketplace.
Schedule posts and track your social media with Metricool*
Meta pushes their AR glasses release to 2027
British Film Commission launched a UK Virtual Production Directory
* Sponsored Link
ABBY SINGER
Thanks for reading VP Land!
Have a link to share or a story idea? Send it here.
Interested in reaching media industry professionals? Advertise with us.
What did you think of this email?