
Getting Started With Presence Platform Interaction SDK: Hand Tracking

by Art Sh, July 31st, 2023

Too Long; Didn't Read

Presence Platform is designed to create a sense of shared presence and interaction in virtual reality environments. It enables people to connect, interact, and engage with each other in immersive virtual spaces using VR headsets. Hand tracking is one of the basic requirements for VR game development; we will set up, build, and launch a hand-tracking, game-like experience.

Having successfully created our first Unity project targeting the Quest 2 device in the previous post, we will now get insight into one of the most powerful sets of capabilities Meta provides via the Presence Platform.

What is Meta’s Presence Platform?

The Presence Platform is designed to create a sense of shared presence and interaction in virtual reality environments. It enables people to connect, interact, and engage with each other in immersive virtual spaces using VR headsets.


Features and capabilities of Meta's Presence Platform include:


  1. Avatar System: The Presence Platform allows users to create and customize their digital avatars, which represent them in the virtual world. These avatars can mimic users' real-life movements, expressions, and gestures, enhancing the feeling of presence and social interaction.
  2. Social Interaction: Users can meet and interact with friends and other people in shared virtual environments. They can engage in various activities together, such as playing games, attending virtual events, watching videos, and more.
  3. Spatial Audio: The platform incorporates spatial audio, which means that the sound in the virtual environment is location-based. This creates a more realistic and immersive audio experience, as users can hear sounds coming from specific directions, just like in the real world.
  4. Hand Tracking: The Presence Platform supports hand tracking technology, enabling users to use their hands and fingers directly in VR without the need for controllers. This makes interactions more natural and intuitive.
  5. Cross-Platform Support: The platform is designed to work across different Oculus VR headsets, allowing users with different devices to join and interact with each other seamlessly.
  6. Content Creation Tools: For developers, the Presence Platform provides tools and APIs to create and publish VR applications, games, and experiences, enabling a thriving ecosystem of virtual content.


Starting with hand tracking

As you can see, the Presence Platform is a comprehensive set of sub-systems and features, and I will cover each of them in detail. To get started, let’s begin with hand tracking, one of the basic requirements for VR game development: we will set up, build, and launch a hand-tracking, game-like experience.
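
Before diving into the sample, it helps to see what hand tracking looks like at the code level. Below is a minimal sketch, assuming the OVRHand component that ships with the Oculus Integration package; the class name PinchLogger and the _hand field are just illustrative, not part of the sample. It simply logs when the index finger is pinching.

using UnityEngine;

// Minimal sketch (not from the FirstHand sample): reads raw hand-tracking data
// through the OVRHand component from the Oculus Integration package.
// Assign _hand in the Inspector, e.g. an OVRHandPrefab under your OVRCameraRig.
public class PinchLogger : MonoBehaviour
{
    [SerializeField] OVRHand _hand;

    void Update()
    {
        // Only trust the data when the headset can actually see the hand.
        if (_hand == null || !_hand.IsTracked)
            return;

        // An index-finger pinch is the most common "click" gesture with hand tracking.
        if (_hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            float strength = _hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
            Debug.Log($"Index pinch, strength {strength:F2}");
        }
    }
}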




A perfect starting point is the Unity-FirstHand experience provided by oculus-samples.

Again, please refer to the previous post on setting up the development environment, and make sure the various dependencies are installed.


Ensure you have Git LFS installed by running this command:
git lfs install


Then, clone the repo using the "Code" button (opening it in GitHub Desktop) or by running the following command:
git clone https://github.com/oculus-samples/Unity-FirstHand.git


All of the actual project files are in Assets → Project. This folder includes all scripts and assets needed to run the sample, excluding the Interaction SDK itself. The project includes v41 of the Oculus SDK, including the Interaction SDK. You can find the Interaction SDK in Assets/Oculus/Interaction.


FirstHand in Unity Editor


After installing all required dependencies and configuring the build to run on a Quest device, you will see something similar to the above in your editor.


Build and run

Go to File → Build Settings if you are using a Mac. If you followed the instructions from my previous post and connected your device, below is what you should see in Build Settings.

Build and run settings

Click Build and Run, allow Unity a few minutes to build, and wait for the message that your app is being deployed to the connected Quest device.


Running experience

Customization and making changes

I strongly suggest you play with this example and try to customize its components and scripts to learn how it works internally. To do so, head to the Project section of your Unity editor and expand the project directory hierarchy.


We will be customizing LazerProjectile:


Project structure

public class LazerProjectile : ActiveStateObserver
{
    [SerializeField] Transform _target;
    [SerializeField] GameObject _effect;
    [SerializeField] float _rayCastDelay = 0.5f;
    [SerializeField] float _fadeOutTime = 0.1f;
    [SerializeField] float _delayBetweenShots = 0.5f;
    [SerializeField] AudioTrigger _chargeUp;
    [SerializeField, Optional] AudioTrigger _chargeUpComplete;

    protected override void Update()
    {
        base.Update();

        // Ease the laser's aim target toward a point 20 m in front of the projectile every frame.
        Vector3 endPos = Vector3.Lerp(_target.position, transform.position + transform.forward * 20, 5 * Time.deltaTime);
        _target.position = endPos;
    }

    //......
}
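
The Update override above eases the laser’s aim target toward a point 20 m in front of the projectile each frame. Lerp driven by 5 * Time.deltaTime is a common quick smoothing idiom (technically frame-rate dependent); here is the same idea in isolation, with illustrative names that are not part of the sample:

using UnityEngine;

// Illustrative only: the same frame-by-frame Lerp smoothing idea used in LazerProjectile.Update.
public class SmoothFollow : MonoBehaviour
{
    [SerializeField] Transform _target;     // object we drag toward the aim point
    [SerializeField] float _smoothing = 5f; // higher = snappier; note the frame-rate dependence

    void Update()
    {
        // Aim point 20 m straight ahead of this transform.
        Vector3 aimPoint = transform.position + transform.forward * 20f;

        // Move a fraction of the remaining distance each frame (an exponential ease-out).
        _target.position = Vector3.Lerp(_target.position, aimPoint, _smoothing * Time.deltaTime);
    }
}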


Let’s increase the ray cast delay and test the app on the headset by replacing

float _rayCastDelay = 0.1f;

with

float _rayCastDelay = 0.5f;
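
In case you are wondering what such a delay typically gates: the actual logic lives in the elided part of LazerProjectile, but the usual pattern a field like _rayCastDelay drives is a short "charge up" wait before the ray is cast. Here is a hypothetical, stripped-down sketch; DelayedRaycaster and Fire are illustrative names, not from the sample.

using System.Collections;
using UnityEngine;

// Hypothetical illustration only: not the FirstHand implementation.
// Shows the general "wait, then raycast" pattern a delay field usually gates.
public class DelayedRaycaster : MonoBehaviour
{
    [SerializeField] float _rayCastDelay = 0.5f; // seconds to wait before the ray is cast

    public void Fire()
    {
        StartCoroutine(FireAfterDelay());
    }

    IEnumerator FireAfterDelay()
    {
        // A larger delay means a longer wind-up before the hit is resolved.
        yield return new WaitForSeconds(_rayCastDelay);

        if (Physics.Raycast(transform.position, transform.forward, out RaycastHit hit, 20f))
        {
            Debug.Log($"Laser hit {hit.collider.name}");
        }
    }
}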


Here is how it works!

Demo


Try it yourself and let me know what you want to build using hand tracking.




