Meta has released the official Meta Avatars SDK, allowing all Unity developers on Quest 1/2, Rift, and all Windows-based VR platforms to start developing with and integrating the Meta avatar systems and mechanics.
What are Meta Avatars?
In April 2021, Meta announced the new avatar style and demonstrated how it could be integrated into the metaverse with the release of Meta avatars within Epic Roller Coasters, PokerStars VR, GOLF+, Synth Riders, and ForeVR Bowl. The Meta Avatars SDK adds social presence to your experience with expressive and customizable avatars, which come with hundreds of different outfits and physical features. Since the avatars' release in April, Meta has made a series of updates to both overall asset quality and user choices, and it is continuing to expand its options with the key message of providing full cultural representation within the metaverse.
Users can configure their Meta Avatars via an editor included in the Oculus platform, so that all apps can represent them in a consistent and authentic way. This spares users from constantly re-creating their virtual selves and encourages a single, consistent virtual persona across the metaverse.
What is the Meta Avatar SDK?
The SDK is much more than the Meta avatar style. Meta Avatars are built with developer needs in mind to power all kinds of experiences in VR. Meta Avatars interpret hand tracking, controller tracking, and audio input to drive highly expressive avatar models. Custom IK algorithms help ensure that the avatar's body positions and facial expressions are as realistic as possible, and these algorithms will improve over time. Developers also have the ability to override the body positioning and facial expression data the system provides, allowing them to build completely unique Meta Avatar mechanics within their applications.
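The override pattern described above can be illustrated with a small, language-agnostic sketch: the runtime asks a delegate for the final pose each frame, and an app-supplied delegate can replace just the joints it wants to control while passing the tracked/IK result through for everything else. All names here (`BodyPose`, `TrackingDelegate`, `WaveOverride`) are hypothetical and are not the SDK's actual API.

```python
from dataclasses import dataclass

@dataclass
class BodyPose:
    """Simplified pose record: joint states keyed by joint name (hypothetical)."""
    joints: dict

class TrackingDelegate:
    """Default behaviour: pass through the pose estimated from tracking/IK."""
    def get_pose(self, tracked_pose: BodyPose) -> BodyPose:
        return tracked_pose

class WaveOverride(TrackingDelegate):
    """App-specific override: force the right arm into a wave, keep the rest."""
    def get_pose(self, tracked_pose: BodyPose) -> BodyPose:
        joints = dict(tracked_pose.joints)
        joints["right_arm"] = "wave"  # replace only the joints the app controls
        return BodyPose(joints)

def drive_avatar(delegate: TrackingDelegate, tracked_pose: BodyPose) -> BodyPose:
    # Each frame, the runtime hands the tracked pose to the active delegate
    # and renders whatever pose the delegate returns.
    return delegate.get_pose(tracked_pose)
```

The design point is that the default path (tracking plus IK) and the app's override share one interface, so an experience can swap between them at any time without touching the rendering side.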
The intention of Meta Avatars is to enrich social interaction, and optimizing for multi-user experiences has been a fundamental goal. The system estimates the avatar's pose locally and allows it to be streamed to other participants using your choice of networking stack. This approach reduces the computational effort required and accommodates the need to override the system's default avatar facial expressions and body positions in a manner that is authoritative in the experience. It lets developers create natural multi-user experiences while maintaining performance.
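The "estimate locally, stream to peers" approach can be sketched as a transport-agnostic encode/decode pair: the local client serializes one pose snapshot per frame, hands the bytes to whatever networking stack the app already uses, and each remote peer decodes and applies the snapshot instead of re-running tracking and IK. This is a minimal illustration of the pattern, not the SDK's actual recording/streaming API; the function names and packet format are assumptions.

```python
import json
import zlib

def encode_pose_packet(avatar_id: str, frame: int, joints: dict) -> bytes:
    """Serialize and compress one locally estimated pose snapshot.

    The caller passes the resulting bytes to its own networking stack
    (the SDK leaves the transport up to the developer).
    """
    payload = json.dumps(
        {"id": avatar_id, "frame": frame, "joints": joints},
        sort_keys=True,
    ).encode("utf-8")
    return zlib.compress(payload)

def decode_pose_packet(packet: bytes) -> dict:
    """Inverse of encode_pose_packet, run on each remote participant."""
    return json.loads(zlib.decompress(packet).decode("utf-8"))
```

Because remote peers only decompress and apply snapshots, the expensive tracking and IK work runs once per avatar (on its owner's device) rather than once per avatar per viewer, which is what keeps multi-user scenes cheap.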
What’s Included in the SDK?
The Meta Avatars Unity plugin includes prefabs, scripts, and assets that allow convenient access to all available functionality, along with several samples, including:
- Single avatar mirror scene
- Custom hand poses
- Loading custom avatars via Meta’s web API
- Network loopback
- Gaze tracking
The SDK contains 32 sample avatars that are freely available for use both during development and in shipping products. These sample avatars are representative of the quality of the custom avatars users can create, and they can also serve as presets for NPCs, as stand-ins for users who don't want to customize their own avatar, or to represent users on platforms without a Meta login, such as SteamVR.