Unity Interaction SDK
1. SDK Release Notes
First-time readers should start with the V1.0 release content.
V1.5
Bluetooth Pico Collection Edition has been renamed to Pico Collection Edition (a new device connection method has been added)
Added controller application samples
Updated the data receiving and send-back plugin
Added calibration data recording
Modified the icons of core interaction scripts
Changed the UI language to English
V1.4.3
Optimized thumb motion performance (only in Bluetooth Pico Collection Edition)
V1.4.2
Fixed the mesh-clipping (model penetration) issue that could occur on the hand model in VR (only in Bluetooth Pico Collection Edition).
V1.4.1
Fixed the behavior of physical hands under the various ObjectFollowType options, and fixed the behavior of the original non-physical hands in the Natural interaction mode
Fixed the incorrect Active Range value on the HandInteractor script in the example prefab
The example UI now requires both hands to be detected as connected before the process can continue, and adds joystick range calibration after the gesture calibration (only in Bluetooth Pico Collection Edition).
Optimized the performance of the Bluetooth glove version (only in Bluetooth Pico Collection Edition).
(Important) Updated to support Pico SDK 3.1 (only in Bluetooth Pico Collection Edition)
V1.4
Added interaction between physical hands and objects with physical properties
Added physics options and parameter adjustments for interactable objects
Fixed the issue of RegularShapeFit function triggering incorrectly
Added a height adjustment option to the VR Base UI (only in Bluetooth Pico Collection Edition)
V1.3.3
Fixed an issue where UnityEvent listeners on the InteractableObject script were not invoked.
V1.3.2
Fixed a bug that caused the program to crash shortly after normal operation. (Only in Bluetooth Pico Collection Edition)
V1.3.1
Fixed a bug in V1.3 where an interaction could not be triggered.
V1.3
Replaced the left and right hand models in the SDK with the new standard posture, and added yaw adjustment to all joints of the virtual hand's index finger and little finger.
InteractableObject script iteration: added UnityEvent listeners for the start and end of object interaction, added an interaction plan switching feature and interface, added an interlock interface, and changed the gesture detection button logic.
HandInteractor script iteration: added a method to force-change the current interaction gesture.
The RegularShapeFit script adds a SpecialCube option to self-adapt to interaction at any angle on a cube.
The HandController plugin has been updated to support HandDriver 2.1.2.
Added a glove connection selection page to the VR Base UI.
Fixed several interaction bugs
V1.2
Adjusted the initial posture of the left and right hand models in the SDK.
V1.1
Added Bluetooth glove algorithm adaptation.
V1.0
Initial release; see the following sections for instructions.
1.1. Update details
V1.5 Example Scene - Controller Sample
Starting from V1.5, the original Bluetooth Pico Collection Edition has been renamed to Pico Collection Edition. While retaining the Bluetooth connection functionality (future updates will primarily focus on 2.4G), it now also includes 2.4G Android Service connectivity. For documentation on the Android Service, please refer to: UDEXREAL VR Headset Android Glove Service Manual
In Pico Collection Edition, the VR Base scene has been removed. In its place, two new example scenes have been added: VR Base (BLE) for the original Bluetooth connection, and VR Base (Android Service) for the Android Service connection.
In the example scene [VR Base (BLE)], a ground plane has been added as a motion reference.

After finishing calibration, the newly added VRControllerSample script provides a set of VR movement examples. The left joystick directly controls the movement of the character (camera and hand model), while pushing the right joystick forward enables teleportation. The parabolic direction angle is controlled by the right hand posture, and when the joystick is released, the character will teleport to the position of the blue footprint icon. Pressing the A button on the right controller will return the character to the position before the last teleportation.

Pressing the B button on the left controller opens an additional control panel with the following functions from top to bottom: calibration evaluation, recalibration, and exit game. Pressing the same B button again closes the panel.

In the example scene [VR Base (Android Service)], a ground plane has also been added. Unlike the previous version, the app no longer has the calibration or calibration evaluation functions; these have been migrated to the Android service app. Press Start on the main panel UI to start the game, and the panel will close. The rest of the controller operations are the same as in the BLE scene, except that the additional panel only has an Exit Game button.
In the pure interactive version, there is a new example scene [9. Controller Sample]. A new ControllerSample script has been added on the Hand root; the left joystick controls the movement of the car in the scene, while the right joystick controls its rotation.

The script contains eight UnityEvents that can be bound to corresponding buttons on the controller to trigger events when pressed. The Menu button is triggered when both A and B are pressed simultaneously, and neither A nor B will trigger individually at that time. The Interact Interval sets the response interval to prevent repeated or overlapping triggers. Once a controller button is pressed and activated, no other buttons will trigger within the specified interval. For detailed logic, please refer to the script code. Control Obj is an example toy car. The Joystick Value below displays the joystick's real-time values.

In both versions of the SDK (pure interaction and Pico collection), the HandDriver under LeftHand and RightHand in Hand is the script that receives data. This script has been updated to the latest version of the Unity plugin. Reference documentation: UDEXREAL HandDriver Unity Plugin Tutorial
The technical role of HandDriver in the Android Service scene of the Pico Collection Edition differs slightly from that in the pure interactive version. The former is used to receive Android Service data while the VR application is running, while the latter is used to receive data streamed from the HandDriver software during PC development and to send back vibration feedback commands. In the application switch settings section of the HandDriver script, the first option should be selected by default. The second option should be selected in PC application scenes to receive data, and the third option should be selected when receiving Android Service data on VR devices. Please note that options two and three cannot be selected simultaneously; choose according to the application.

In the Bluetooth connection and Android service scene of the Pico Collection Edition, this option represents the primary difference in application settings. For Bluetooth connection, only the first option needs to be selected. To receive Android services, both the first and third options must be selected.
Additionally, the vibration feedback previously implemented in Bluetooth connections was achieved by calling methods within the Bluetooth SDK. In the Android service scene, this method has been moved to the HandDriver script. After selecting “UsingAndroidService,” simply call the ServiceVibrationControl() method.
void ServiceVibrationControl(int VirbatorIndex = 1, int DurationSecond = 20, int Strength = 10)
Interface function: sends vibration feedback to the Android Service.
Parameter description:

| Parameter | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| VirbatorIndex | int | No | 1 | Hardware vibrator position: 1 is below the button, 2 is below the joystick, 3 enables both positions simultaneously. |
| DurationSecond | int | No | 20 | Vibration duration in units of 10 ms; the default value of 20 corresponds to 200 ms. |
| Strength | int | No | 10 | Vibration amplitude level, selectable range: 4 to 10. |
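For a concrete picture of how this call might be wired up, here is a minimal sketch. It assumes the HandDriver component on the hand exposes ServiceVibrationControl publicly and that UsingAndroidService is already selected; the class and field names are illustrative only.

```csharp
using UnityEngine;

// Minimal sketch, not SDK-provided code: triggering glove vibration through the
// Android Service. Assumes ServiceVibrationControl() is publicly accessible on
// the HandDriver component and that "UsingAndroidService" is checked on it.
public class VibrationFeedbackSample : MonoBehaviour
{
    [SerializeField] private HandDriver rightHandDriver; // HandDriver on RightHand (assign in Inspector)

    // Hook this up to an interaction event (for example via a UnityEvent) to give haptic feedback.
    public void PulseOnGrab()
    {
        // Vibrator below the button (1), 200 ms duration (20 x 10 ms), strength level 8.
        rightHandDriver.ServiceVibrationControl(1, 20, 8);
    }
}
```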
The Pico Collection Edition now includes a feature for recording calibration data in Bluetooth connections. This feature is automatically enabled within the SDK core. For detailed implementation and effects, please refer to VR Base (BLE). Within the same application, the last calibration data (including hand motion calibration and joystick calibration) will be permanently retained. When reopening the application and connecting the gloves, the recorded calibration data can be directly applied.
When calibration data is already recorded, a “Skip” button will appear in the bottom-right corner of the UI panel while the gloves are connected. Clicking this button will close the UI panel and skip the initial calibration step. If calibration needs to be repeated during the application, simply select “Repeat Calibration” on the additional panel.

In both versions of the SDK, the icons of scripts that are not recommended to be modified have been replaced with the UDEXREAL logo for easy identification.

For broader applicability, all descriptions on the UI in the example scenarios have been replaced with English.
V1.4.3
Thumb performance has been optimized compared with the previous version and now better matches the actual motion capture.
V1.4.2
Updated the method of applying rotation angles to joint points in the Bluetooth version, eliminating the mesh-clipping issue that could occur during hand movements in VR.
V1.4.1
With the physical hand updated in V1.4 and the original non-physical hand, all ObjectFollowType options on the InteractableObject script now interact normally. This includes the physical hand adapting to Instant, Slow Move, Chasing, and No Pose, and the non-physical hand adapting to Natural.

In the non-physical Hand prefab of V1.4, the Active Range value on the HandInteractor script had become 0, which caused interactions in the sample scenes to fail. It has now been restored to the initial default value.
In the UI example of the VR Base example scene, when you reach the device connection information page, you need to wait until both the left and right message bars are highlighted before clicking the Continue button below to proceed to the calibration process.

In addition, after completing the original five-finger separation calibration actions, you will enter the newly added joystick range calibration page. Rotate the left and right joysticks (separately or simultaneously) around their maximum range for two to three full circles, then click the Complete Calibration button.

In addition, V1.4 and earlier versions were developed against Pico SDK 2.3.0. Starting from V1.4.1, the latest Pico Unity SDK 3.1.0 and higher versions are supported. Official SDK download link: https://developer-cn.picoxr.com/resources/#sdk
The specific change is the removal of the old adaptation of SpatialLocationSetting_L and SpatialLocationSetting_R, which were used to read the position and posture of the PICO body tracker and associate it with hand positioning. The script for this function has been replaced with MotionTracker.cs, which is still located on TrackedLeftHand and TrackedRightHand in XR Origin.

The script has an enumeration item that controls the left and right hands. The Target is the root node of the hand that needs to be associated for actual tracking. When it is not a physical hand, it is the hand root nodes LeftHand and RightHand on the Hand prefab. When it is a physical hand prefab, it is the left and right hand navigation points LeftHandNavigator and RightHandNavigator.

From this point on, the old Pico SDK 2.3.0 is no longer supported; please keep up with new version iterations when developing.
V1.4 Example Scene - Physical Nature Simulation [8.Natural Interact]
In the new example scene [8.Natural Interact], natural interactions that conform to real-world physics simulations are introduced.

The hand model in the scene uses a new prefab, PhysicalHands.

Compared to the previous version of the Hands prefab, the physical hand includes LeftHand and RightHand as the hand model body, while LeftHandNavigator and RightHandNavigator are two empty objects that only serve as positioning for the left and right hand models.
At the same level as HandInteractor, both hands now have a PhysicalHand.cs script that controls the physics implementation.

Navigator is the left/right hand navigation anchor, TraceToloranceDist is the maximum distance the hand may be separated from the navigation anchor, and Velocity Limitation is the speed limit at which the model follows the navigation anchor.
LeftHandNavigator and RightHandNavigator are the real positioning points. In the example, under normal circumstances a physical hand cannot pass through a desktop with colliders from a given direction. However, if the distance between the positioning point and the hand model exceeds TraceToloranceDist, the model will be forcibly moved to the positioning point.

In practical applications, it is only necessary to control the navigation points corresponding to the left and right hands. At the same time, there are also adapted collision bodies on the palm and fingers, which can achieve physical effects between the two hands and the environment.

The interactive object SampleStick in the scene is the same prefab as in other scenes, and it also produces real collision effects when interacting with the physical hands.
It has no Rigidbody, so it is not affected by gravity. The capsule object SamplePhysicalCapsule in the scene has a Rigidbody with gravity and the new physics adaptation.
A new EnablePhysicalProperties parameter has been added to InteractableObject.cs for interactable objects with a Rigidbody. (Note: checking this option will automatically turn off the IsKinematic option on the object's Rigidbody, as kinematic bodies interfere with normal interaction.)

When the option is enabled, an additional EnableObjectMotionTrace option appears for drawing the object's motion trajectory. After it is checked, a LineRenderer is automatically mounted on the object, and the motion trajectory is drawn in orange after the object finishes interacting. TraceVisibleSecond controls how long the trajectory remains, in seconds. The trajectory disappears when the object is interacted with again or when the time is up.


Deselecting EnableObjectMotionTrace or EnablePhysicalProperties removes the LineRenderer component from the object.
The original Active Interaction Plans were renamed as Enable Interaction Plans, with the same functions.

Added a Natural option to ObjectFollowType: the hand moves to the object, and after the interaction state is reached, the hand returns to the real tracking point. This option is applied to the SamplePhysicalCapsule objects.

The effect of RegularShapeFit has been improved without adding new configuration options. In physics-simulated interaction, regularly shaped objects always need to be grasped at the most suitable position on the hand (with a single fixed gesture, the hand could end up under the surface or colliding with other objects). Occasionally mesh clipping may still occur during interaction; this will be optimized as much as possible in subsequent iterations.
The interaction effect is as follows:
Press the red button to reset all capsules.
Tip: when adding physics to interactable objects, if gravity is not needed, it is not recommended to add a Rigidbody and EnablePhysicalProperties does not need to be checked. If Rigidbody gravity is introduced and physical interaction is required, EnablePhysicalProperties must be checked.
To ensure smooth operation of the physical system, please set the Fixed Timestep within Time in Project Settings to 0.01 (or less, depending on device performance).
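If you prefer to enforce this from code rather than editing Project Settings by hand, a minimal sketch using Unity's standard Time API is shown below; the component and field names are illustrative.

```csharp
using UnityEngine;

// Optional helper: sets the physics timestep at startup instead of editing
// Project Settings -> Time manually. 0.01 s matches the value recommended above;
// lower it further only if the target device can sustain the extra physics updates.
public class PhysicsTimestepSetup : MonoBehaviour
{
    [SerializeField] private float fixedTimestep = 0.01f;

    private void Awake()
    {
        Time.fixedDeltaTime = fixedTimestep;
    }
}
```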

A button for switching between sitting and standing posture has been added to the lower left corner of the initial UI panel in the VR Base scene. It allows the UI, viewpoint, and hands to be at a more appropriate height in the scene when entering the game while actually sitting or standing.

V1.3.3
Fixed the issue of invalid events for WhenObjInteract and WhenObjDisable on InteractableObject scripts in V1.3.
V1.3.2
Fixed the issue in V1.3 where the program would crash after calibrating the gloves and running normally for about two minutes.
V1.3.1
Fixed the bug in V1.3 where "interaction with components inside an object" could not be triggered.
V1.3 Example Scene - Dynamic Interactions [7. Dynamic Interactions]
The hand models in the SDK have been replaced with the new standard model. The axial configuration of the left and right hands is now the same, and rotation angles are consistent for the same hand performance (for example, the yaw angle of both index fingers spreading outward is negative). Now, when switching a gesture between the left and right hand from the HandInteraction script, the posture of the two hands remains consistent. At the same time, yaw adjustment has been added to the second and third joints of the virtual hand's index finger and little finger. This is mainly for better hand appearance when setting fixed gestures, such as making a fist, and does not affect normal hand movement.

Added UnityEvent listeners on Interactable Object for the start of interaction and the end of interaction.
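These events can be bound in the Inspector or subscribed to from code. The sketch below assumes they are the public, parameterless UnityEvents named WhenObjInteract and WhenObjDisable referenced in the V1.3.3 notes; verify the exact names and signatures against InteractableObject.cs in your SDK version.

```csharp
using UnityEngine;

// Illustrative sketch: reacting to the start and end of an object's interaction.
// Assumes InteractableObject exposes public, parameterless UnityEvents named
// WhenObjInteract and WhenObjDisable (as referenced in the V1.3.3 notes).
[RequireComponent(typeof(InteractableObject))]
public class ObjectInteractionLogger : MonoBehaviour
{
    private InteractableObject interactable;

    private void OnEnable()
    {
        interactable = GetComponent<InteractableObject>();
        interactable.WhenObjInteract.AddListener(OnInteractStart);
        interactable.WhenObjDisable.AddListener(OnInteractEnd);
    }

    private void OnDisable()
    {
        interactable.WhenObjInteract.RemoveListener(OnInteractStart);
        interactable.WhenObjDisable.RemoveListener(OnInteractEnd);
    }

    private void OnInteractStart() => Debug.Log($"{name}: interaction started");
    private void OnInteractEnd() => Debug.Log($"{name}: interaction ended");
}
```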

In the new example Scene [7.Dynamic Interactions], a new interaction composition scheme feature is adopted.

The dumbbell object SampleBell on the left contains two Interaction Plan interaction schemes. To enable this function, you need to check Active Interaction Plans.

During normal operation, the interaction gesture is selected from the already configured Hand Interactions. If the gesture needs to change, you can call the overloaded interface method SwitchHandInteractioinPlan in InteractableObject.cs to switch plans, passing the target plan's index or name as a parameter. A single InteractionPlan element consists of the plan name and the set of interaction gestures in that plan. When switching plans, Hand Interactions is automatically replaced with the Hand Interactions of the new plan, achieving the effect of dynamically changing interaction gestures. Note that ObjectFollowType cannot be switched dynamically.
public void SwitchHandInteractioinPlan(int index) {...}
public void SwitchHandInteractioinPlan(string name) {...}
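A short usage sketch of these overloads is shown below; the object reference, the triggering method, and the plan name "Assembled" are illustrative placeholders, not part of the SDK.

```csharp
using UnityEngine;

// Illustrative sketch: switching a SampleDumbbell-style object to another
// interaction plan, either by index or by plan name.
public class PlanSwitchSample : MonoBehaviour
{
    [SerializeField] private InteractableObject dumbbell; // assign the interactable in the Inspector

    // Call this when, for example, the dumbbell has been assembled.
    public void OnDumbbellAssembled()
    {
        dumbbell.SwitchHandInteractioinPlan(1);              // switch by plan index
        // or equivalently, by the plan name configured in the Interaction Plan list:
        // dumbbell.SwitchHandInteractioinPlan("Assembled");
    }
}
```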
The dumbbell object SampleDumbbell on the right also contains two interaction schemes. If Hand Interactions are empty at the beginning, no interaction can be performed. The corresponding logic is that the dumbbell needs to be assembled and balanced before it can be used.

Please note that the AutoCheckPose button still adds all gestures inside the object to Hand Interactions, but it is not called automatically and has no effect at runtime. The initial Hand Interactions list and the interaction plan configuration need to be set up in the script yourself.
In this example scene, the dumbbell can only be interacted with normally after a dumbbell plate has been installed, and the hand holding the dumbbell plate is forcibly switched to the Operation Inside Object type gesture in SampleBell interaction plan 2.

The interface for forcing a gesture is in the HandInteractor.cs script. Note that the forced gesture must have the same relative position as the previous gesture on the same interactable object; only the finger expression or the finger trigger logic (such as interactive use of the object) may differ.
public void ForceActive(InteractableObject newObj, HandInteraction newInteraction){...}
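A minimal usage sketch of this interface is given below; the serialized references are placeholders to be assigned in the Inspector, and the actual example logic lives in BellSample.cs.

```csharp
using UnityEngine;

// Illustrative sketch: forcing the hand that currently holds the plate to switch
// to an Operation Inside Object gesture on the bell, similar to what BellSample.cs
// does in the example scene.
public class ForceGestureSample : MonoBehaviour
{
    [SerializeField] private HandInteractor holdingHand;   // the hand currently interacting
    [SerializeField] private InteractableObject bell;       // e.g. SampleBell
    [SerializeField] private HandInteraction insideGesture; // the Operation Inside Object gesture

    public void SwitchToInsideGesture()
    {
        // The forced gesture must keep the same relative position as the previous
        // gesture on the same object; only finger expression/trigger logic may differ.
        holdingHand.ForceActive(bell, insideGesture);
    }
}
```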
At the same time, the original dumbbell plate still has two Operation Inside Object interaction gestures, but interaction with the dumbbell body is required first.

The BellSample.cs script on SampleBell contains an example of this dynamic operation, which can be read for code research.
In addition, an interlock interface has been added to the InteractableObject script, which can be used to control whether an object can be interacted with.
public void SetActiveLock(bool value) {...}
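A small usage sketch follows; note that this document does not state whether passing true locks or unlocks the object, so verify the semantics against BellSample.cs before relying on it.

```csharp
using UnityEngine;

// Illustrative sketch: using the interlock interface to gate interaction on an
// assembly condition (e.g. both dumbbell plates installed).
public class InterlockSample : MonoBehaviour
{
    [SerializeField] private InteractableObject dumbbell;

    public void SetAssembled(bool assembled)
    {
        // Whether 'true' means "locked" or "interactable" is not specified in this
        // document; check the example scene's BellSample.cs for the intended polarity.
        dumbbell.SetActiveLock(assembled);
    }
}
```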
Added the dice object SampleDice in the original example scene [2.Multiple Interaction].

The updated RegularShapeFit settings are adapted to this object.

Similar to the effect of Special Sphere, this cube only needs to provide a standard left-right hand interaction gesture, and can automatically adapt to the interaction position that best matches the current angle of the cube during runtime. With 4 possibilities for each face and 6 faces for the cube, there are a total of 24 interaction states for a single hand.

Note that shapes other than cubes cannot keep the same interaction gesture on every face, so this option cannot be applied to them.
The HandDriver script mounted on the root joints of the left and right hand models under Hand has been modified to adapt to HandDriver 2.1.2 in the PC editor. For convenient testing and development on the PC, it is recommended to update to this V1.3 version. If you need to keep the old version temporarily, you can check the "old plugin" option on the streaming settings page of HandDriver, and the old plugin can still drive the hand model normally.
The UI in the VR Base scene has added settings for adapting to multiple pairs of gloves. If the program detects multiple pairs of gloves during actual operation in the VR device, you can click to highlight the gloves that need to be connected (distinguished by left and right hands and name). After both hands are selected, a continue button will appear in the lower left corner, which can enter the connection detection and calibration page.

Fixed bugs where objects may trigger interaction outside the interaction range.
V1.2
There was a slight difference (knuckle angles) between the initial postures of the left and right hand models in V1.0: the same joint angle produced different results on the two hands, so the same gesture action had to be adjusted separately for each hand. This has been resolved in V1.3.
V1.1
The new version of the Bluetooth glove adaptation algorithm has been updated. For details, please contact the company's relevant business manager.
2. Pre-configuration process
Unity 2021 LTS and above is supported. The SDK is still under long-term iteration; it is recommended to develop in a new empty project or a backed-up project to avoid irreversible losses caused by unknown errors. (The following part describes the V1.0 release version.)
First, import the UDE_PicoBased_Interaction & BluetoothSDK_V x.x or UDE_InteractionSDK_V x.x unitypackage file into the project.
Please note that the former includes the Bluetooth and Pico SDK configuration, so all of the configuration steps below are required. If you have the pure interactive version (the latter), some of the following steps should be skipped; these steps are marked in the text, so please pay attention.
2.1. Bluetooth Pico Collection Edition

2.2. Pure interactive version

2.3. Error solving
Several errors may appear after import; please follow the steps below to resolve them one by one.
Install the XR InteractionToolkit in the Package Manager.

(Pure interactive version: please skip this step) Obtain the Pico SDK folder and import the package.
Please note that SDK V1.4.1 has been updated to support the latest Pico SDK 3.1.0 and above. Please obtain the latest Pico SDK from the official website; the import method is the same. Download link: https://developer-cn.picoxr.com/resources/#sdk


In Player Settings, set Player -> Configuration -> Api Compatibility Level to .NET Framework; the project may then restart automatically. If it does not restart automatically, please restart it manually.

(Pure interactive version: please skip this step) Finally, before actual development, switch the Unity build platform to Android.

If everything goes well, all error messages should now be resolved.
3. Detailed explanation of interactive classification
After the SDK has been imported without errors, several example scenes can be found under Assets -> SDK -> Interaction SDK -> Sample Scenes, corresponding to the main types of interactable objects and VR integration scenes. The following sections introduce the interaction features of the SDK, starting with the general basics and then going through the example scenes.
Please note that using the SDK and adjusting parameters has been designed to avoid code-level changes as much as possible; in regular development it is not necessary to understand the internal code logic. Please read the documentation carefully, and if you encounter any unsolvable problems or bugs, contact the developer promptly.
3.1. Interactable Object
The InteractableObject script needs to be mounted on interactable objects.

Hand Interactions contains all the preset fixed interaction gestures for the object, each corresponding to a left- or right-hand gesture. The Auto Check Hand button below reads all fixed gestures (Hand Interaction scripts) under the object; this reading is also performed automatically at runtime. Add Virtual Hand adds a default gesture to the object outside of play mode.




The Interact Priority parameter is the object's interaction priority; its type is int and its range is the natural numbers. When multiple objects are within the interaction range, only the object with the smallest priority number is interacted with. If priorities are equal, the object detected first in the range is used.
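The selection rule can be restated by the illustrative sketch below; this is not the SDK's internal code, only a restatement of the rule (smallest priority number wins, ties keep the object detected first).

```csharp
using System;
using System.Collections.Generic;

// Illustrative only: the priority rule described above, not the SDK's internal code.
public static class PrioritySelectionSketch
{
    // 'detectedInOrder' holds the objects in the order they entered the interaction range.
    public static T PickTarget<T>(IReadOnlyList<T> detectedInOrder, Func<T, int> interactPriority)
        where T : class
    {
        T best = null;
        int bestPriority = int.MaxValue;
        foreach (var candidate in detectedInOrder)
        {
            int priority = interactPriority(candidate);
            if (priority < bestPriority) // strictly smaller: equal priorities keep the earlier object
            {
                best = candidate;
                bestPriority = priority;
            }
        }
        return best;
    }
}
```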

Object Follow Type sets the interaction behavior of the object. Instant means the object is teleported directly to the exact interaction position on the hand and will not shift until the interaction ends. Slow Move means the object moves smoothly to the interaction position. Chasing means the object keeps moving toward the configured hand interaction position with a lag and is never fixed to the hand. No Pose means no interaction gesture; when the object has no available gestures this option is selected by default, and while the object is being interacted with it stays relatively still with respect to the hand, moving together with it.

When selecting Slow Move and Chasing, you can further set Tracking Time, which refers to the time from interaction triggering to the object reaching the fixed position of the gesture. The recommended default value is 0.3 seconds.

The optional setting folded at the bottom is Inside Operation Part, the sub-object within the object that can be interacted with separately. It is empty by default and can be used for some special interactions.

When this option is not empty, you can set the Moving Time, which is the time required for the hand to move to the special component. The recommended default value is 0.3 seconds.

3.2. Virtual Gesture Hand Interaction
As mentioned earlier, clicking the Add Virtual Hand button in Interactable Object will generate a virtual hand under the sub-object of the currently interactive object. Double-clicking on the virtual hand object in Hierarchy will show the blue virtual hand entity in the scene, which can freely move and rotate the virtual hand relative to the object. The actual performance of the hand during interaction is consistent with the shape of the virtual hand, and the actual position of the object is the same as during the virtual hand grip interaction. The main script for the virtual hand is HandInteraction.

Relative To at the top refers to the interactable object to which the gesture belongs and is acquired automatically by default. Handedness selects the left or right hand.
Fingers Freedom sets, from top to bottom, how each of the five fingers behaves during interaction. Fixed means the finger holds a fixed gesture posture while in the interaction state; Free means the finger can move freely with no gesture; Flexible means the gesture can be triggered flexibly: the finger moves freely outside the trigger judgment range and takes the configured fixed gesture within it; Operation is a special function usable after interaction, specifying that the finger's movement can operate movable structures on the object such as buttons and triggers.

For fingers not set to Free, after clicking the Visibility toggle in the upper right corner of the Scene editor, you can adjust the circles at the virtual finger joints in the scene to set the bending posture of the virtual hand's fingers for the interaction gesture. You can also set the position and rotation of the virtual hand to achieve the desired interaction appearance.


For fixed-gesture interaction, the HandInteractionSetting script on the virtual hand sets the conditions that trigger the gesture.

There are two Interact Types: Any Finger and All Fingers, meaning any one of the five fingers or all of them. Active Intense represents the trigger strength (curvature) required of the selected fingers, with 0 representing natural extension and 1 representing a curled grip.


If you check the Separate Finger checkbox at the bottom, you can configure each of the five fingers individually; checking Active means that finger participates. The trigger logic then combines the Interact Type with the selected fingers and their respective Active Intense values to make an overall judgment.

The Operation Inside Object checkbox needs to correspond to the Inside Operation Part on the Interactable Object. Specifically, when the component's position does not move during interaction, the hand moves to the specified component and uses the virtual gesture. This is explained in detail later in the text.

3.3. Hand Interactor Fundamentals
The HandInteractor script mounted on the root nodes of the left and right hands can obtain the UnityEvent of the hand at the beginning and end of the interaction, corresponding to WhenHandInteract and WhenHandDisable.

When the interaction starts, the Console will also output the Log of the interaction information, including the hand that triggered the interaction, the name of the interaction object, and the name of the interaction gesture.
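As with the object-side events, these can also be subscribed to from code. The sketch below assumes WhenHandInteract and WhenHandDisable are public, parameterless UnityEvents on HandInteractor; check the script for the exact signatures.

```csharp
using UnityEngine;

// Illustrative sketch: listening to a hand's interaction start/end from code.
public class HandInteractionLogger : MonoBehaviour
{
    [SerializeField] private HandInteractor rightHand; // HandInteractor on the RightHand root node

    private void OnEnable()
    {
        rightHand.WhenHandInteract.AddListener(OnHandInteract);
        rightHand.WhenHandDisable.AddListener(OnHandRelease);
    }

    private void OnDisable()
    {
        rightHand.WhenHandInteract.RemoveListener(OnHandInteract);
        rightHand.WhenHandDisable.RemoveListener(OnHandRelease);
    }

    private void OnHandInteract() => Debug.Log("Right hand started an interaction");
    private void OnHandRelease() => Debug.Log("Right hand ended its interaction");
}
```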

The default interactive trigger range of the hand is a collision ball with a radius of 0.08, which can be adjusted through the ActiveRange parameter.


Shortcut Key Code is the shortcut setting. Pressing the configured shortcut at runtime triggers the Create Realtime Hand Interaction button at the bottom; the button can also be clicked directly with the mouse. Its effect is to generate a valid fixed-gesture prefab while the hand is within the effective range of an interactable object at runtime. By default the prefab is stored in the SDK's TempVirtual folder.


The virtual hand generated during operation is exactly the same as the hand action and position when the button is triggered, but it will only be saved as a prefab in the local folder. If the name of the generated virtual hand prefab is not changed, the newly generated virtual hand prefab will overwrite the old one.

To use this prefab, drag it under the root node of the object you selected (if it is the same object, the prefab can be shared) as a child object. Then, to prevent the prefab from overriding the gesture, right-click the prefab and select Prefab -> Unpack Completely. Finally, in the Hand Interaction script, assign the virtual hand's parent object to Relative To, select the finger joints that need to be fixed, adjust them, and use the gesture.
3.4. Example Scene - Basic Interaction [0. Basic Interaction]
Four sample objects demonstrating the Object Follow Type options are provided in this sample scene. They are described from left to right.

The example left-handed gesture on the SampleCube block object is shown in the following figure.

The actual interaction performance is as follows. The Object Follow Type of the block is Instant, so it will be teleported to the correct interaction position. The right-hand interaction logic is similar.

The fixed posture of the left and right hands on the SampleCylinder cylinder is different.


Object Follow Type is Slow Move, Tracking Time is set to 0.3s. The interaction effect is as follows:

Please note that under the Slow Move setting, the movement of an object is a continuous process. If a hand action is judged to be an interaction but is quickly released and cancelled, the object may blink (teleport). This is because the object's movement time (for example the default 0.3 s) is longer than the total duration of an interaction that starts and immediately ends. After the interaction ends, the object returns to the position and rotation it had before the interaction started.
There are two ways to avoid this: avoid operations that trigger and cancel an interaction instantaneously, or set the Tracking Time small enough to be shorter than such an instantaneous interaction. Please configure it as needed; we will strive to optimize this gap in subsequent iterations.
The above issue has been optimized in V1.3.
SampleChasedObject is an irregular object that also has fixed gestures for both hands. Since it is set to continuously follow the hand's movement, the interaction gestures are not fixed to the object's surface.

Object Follow Type is Chasing, Tracking Time is set to 0.3s. The interaction effect is as follows:

SampleSphere ball objects do not have any fixed gestures. During runtime, the Object Follow Type will be automatically set to No Pose. As long as the interactive trigger setting is met, the ball object will move with the hand. The interaction effect is as follows:

3.5. Example Scene - Interaction Priority [1. Interact Priority]
Three example objects with different interaction priorities are provided in this example Scene.

The Interact Priority of SampleStick is set to 0, the highest level. The Interact Priority of all SampleCube cubes is set to 1, and the Interact Priority of the SampleCylinder cylinders is set to 2. The resulting behavior is that when several objects are within the interaction range, only the object with the highest priority (level 0) can be interacted with first; interaction is deferred according to priority. After SampleStick is taken out, the SampleCylinder cylinders within the range still cannot be interacted with, because their priority is lower than that of all SampleCube cubes.
The effect demonstration is as follows:

3.6. Example Scene - Multi-Gesture Interactions [2. Multiple Interactions]
In this example Scene, four objects with multiple gestures or regular interactions are provided.

When there is only one fixed gesture for each hand on an interactive object, the corresponding hand posture will be selected for interaction between the left and right hands. If there are multiple fixed gestures, the distance between the current hand's index finger tip and the virtual hand's index finger tip will be calculated, and then all virtual hands will be traversed in order of distance from near to far, and the virtual hand posture that first meets the triggering interaction condition will be selected as the interactive gesture.
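The selection order can be restated as the illustrative sketch below; it is not the SDK's internal implementation, and MeetsTriggerCondition stands in for the HandInteractionSetting check.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using UnityEngine;

// Illustrative only: the multi-gesture selection order described above.
public static class GestureSelectionSketch
{
    public static HandInteraction Pick(
        Vector3 currentIndexTip,
        IEnumerable<(HandInteraction gesture, Vector3 virtualIndexTip)> candidates,
        Func<HandInteraction, bool> meetsTriggerCondition)
    {
        // Traverse virtual hands from the nearest index fingertip to the farthest...
        foreach (var candidate in candidates.OrderBy(c => Vector3.Distance(currentIndexTip, c.virtualIndexTip)))
        {
            // ...and take the first one whose trigger condition is satisfied.
            if (meetsTriggerCondition(candidate.gesture))
                return candidate.gesture;
        }
        return null; // no gesture qualifies, so no interaction starts
    }
}
```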
There are several interactive gestures on the SampleMultiShape object on the left side of the center, one on each of the four end objects. When the hand meets the interaction conditions with the object, the interaction judgment will start from the virtual hand closest to the object.






The interaction effect is as follows:

The SampleStick on the center right is an example of interactive rotation judgment.
When there are several virtual hands on an object, and the positions of two virtual hands are very close, when interacting, it will be judged to choose the one with the smaller angle of rotation for the object first. (The tolerance for distance has not been clearly calculated for the time being.)
For example, on SampleStick, each hand has four gestures. Take the left hand as an example.




At the two endpoints of the stick there are two gestures each, with the thumb web (the "tiger's mouth") facing inward and outward respectively. The virtual hand positions at the same endpoint are very close, so during interaction the gesture that requires the smaller rotation of the object (the more convenient one) is chosen.
The interaction effect is as follows:

The leftmost SampleComlexStick object is an example of regular-shape interaction.
All virtual hands on this object need an additional RegularShapeFit script.

After mounting this script, you can also adjust the interactive range of regular objects in the scene by clicking the Visibility setting in the upper right corner of the scene editor.


Taking the object's PoseLeft as an example, the vertical red line at the center of the cylinder represents the interaction range on the object. By adjusting the Transform coordinate arrows at both ends, the direction and size of the range can be adjusted.

The TargetObject mounted on the script is the adapted regular shape object, which is the main part of the thin cylinder.

StartPoint and EndPoint are the local coordinates representing the starting and ending points of the range on the object. They will automatically change when adjusting the coordinate arrow, or you can manually input the coordinates to configure them.

Draw Radius controls the radius of the red cylinder drawn in the scene, which only has a visual effect and has no effect on the function.
The effect of this setting is that the virtual hand can adapt 360 degrees at any position along the red line of the cylinder; in practice, the cylinder can be grasped at any position and angle. The interaction position is the best-fit position, and the object will not move or rotate additionally. When setting up, the virtual hand only needs to be placed anywhere within the effective interaction range of the object with the correct pose.

Multiple Self-Adaptation interactions can be set on the same object based on its shape. During judgment, if two Self-Adaptation ranges overlap and belong to the same hand, the one that comes first in the HandInteractions list of the InteractableObject is selected.
The interaction effect is as follows:

For the RegularSphere Earth model on the right, check Special Sphere on the RegularShapeFit script of the virtual hand.

SpecialSphere is set up for spherical objects, and can self-adapt the interaction position at any angle when interacting with the sphere.

At the same time, both the Start Point and End Point need to be set to (0,0,0), and then the virtual hand can be adapted to a regular interactive gesture at any angle.
The interaction effect is as follows:

3.7. Example Scene - Interaction Use [3. Interact Use]
In this example Scene, three objects with additional functionality are provided for interaction.

The SampleGun gun model on the left, taking the right hand as an example, has two virtual hands, PoseRight and UsePoseRight, corresponding to the starting action and the maximum interaction action respectively. Note that the two gestures must have exactly the same position and rotation and, except for the fingers participating in the interaction, the same finger posture, to avoid gesture jumping or deviation during interaction.


The right hand action in the upper and lower pictures corresponds to the initial action of pulling the trigger and the finger action when the trigger is pulled to the bottom position. It can be seen that the two virtual hands only differ in performance with the index finger.
A FingerUseInteraction script needs to be added to the PoseRight virtual hand, to which the starting action belongs.

Prepose Interaction needs to reference the virtual hand of the interaction-use action.

Trigger selects the movable part of the object during interactive use.

Axis is the local-coordinate direction along which the movable part travels.

Release Threshold and Active Threshold are the thresholds for judging release and trigger activation during interaction. Taking this object as an example, when the interaction degree reaches 0.7 or more it is judged as a shot; the interaction degree must then return to 0.5 or less before the next shot can be judged. In other words, keeping the trigger pulled will not fire repeatedly.
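The threshold pair behaves like a simple hysteresis; the sketch below restates that logic for clarity and is not the SDK's FingerUseInteraction code.

```csharp
// Illustrative only: the Active/Release threshold hysteresis described above.
// 'progress' is the current interaction degree (0 = released, 1 = fully pulled).
public class TriggerHysteresisSketch
{
    public float ActiveThreshold = 0.7f;   // fire when progress rises to this value or above
    public float ReleaseThreshold = 0.5f;  // re-arm only after progress falls back to this value or below

    private bool armed = true;

    // Returns true exactly once per pull; holding the trigger down does not re-fire.
    public bool Update(float progress)
    {
        if (armed && progress >= ActiveThreshold)
        {
            armed = false;
            return true; // e.g. fire the shot
        }
        if (!armed && progress <= ReleaseThreshold)
        {
            armed = true; // trigger has returned far enough to allow the next shot
        }
        return false;
    }
}
```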

Trigger Process Length is the distance the triggered part can move. You need to adjust the part's position in the scene and record the value for adaptation. Likewise, it is best to tune the finger pose of the UsePose action to match the triggered part at its extreme position.

When Use Active is a UnityEvent that is invoked when the object's interactive use is triggered. The SampleGun shooting script called here is just a simple example; in actual development you should write it according to each object's interaction logic and the desired behavior.

In addition, the finger that needs to interact should be set to Operation on the corresponding virtual hand, and the UsePose virtual hand does not need a HandInteractionSetting script: the trigger judgment is determined by the HandInteractionSetting configuration on the Pose virtual hand. HandInteractions also automatically filters out the associated UsePose.


The interaction effect is as follows:

The SampleGripper object in the center provides an example of multi-finger interaction. This object can be triggered by any one or more fingers except the thumb, and the trigger's interaction value is determined by the finger with the greatest curvature.

When setting fixed gestures, the initial and deepest interaction states of all fingers should still be taken as the standard. In actual use, the bending and interaction degree of the four fingers can vary.


The interaction effect is as follows:

The SampleCounter interactive object on the right is similar in logic to SampleGun, with only one finger participating in interactive use. On this object, the button is pressed by the curvature of the thumb, and each press increases the displayed number by one.


The interaction performance is as follows:

3.8. Example Scene - Interact Inside Object [4. Interact Inside Object]
The scene provides examples of bows, arrows, and door objects.

The SampleBow bow object is designed as a left-handed bow, so the interaction logic only allows the left hand to hold the bow and the right hand to pull the string. In the InteractableObject script on the object's root node, the Inside Operation Part in the Optional section at the bottom is assigned to the Handle, the part used for pulling the string inside the bow, and the Moving Time is set to the default 0.3 s.

The virtual hand with fixed gestures on the left hand of the bow is a regular interaction.

The Inside Operation Part inside the object can move independently and needs to stay relatively stationary with respect to the virtual hand during interaction. For example, when pulling the string, the main body of the bow does not move, but the position of the string trigger changes, so the virtual hand needs to be placed on the movable component. This way, the correct interaction position is obtained even while the Handle is being pulled.



On the virtual hand of this inner movable component, the Operation Inside Object option of the HandInteractionSetting script needs to be checked.

The most obvious manifestation is that the object as a whole does not move, but the hand moves onto the object or moves internal components to positions within the interactive range.
The performance of this item is as follows. Moving Time represents the time it takes for the hand to move to the specified position and return to the actual position.

The BowActiveSample script on SampleBow and the ArrowTipSample script on SampleArrow are two interaction control scripts written specifically for the bow-and-arrow interaction. They are for reference only and are not recommended for direct reuse; you can study their internal logic to develop similar interactions. They will not be described in detail here.
The usage method of the example bow and arrow is to hold the bow in the left hand and place the arrow near the trigger position with the right hand. The arrow will be automatically loaded onto the bow. Then, the right hand triggers the interaction and pulls the bow. After releasing the hand, the shooting process can be completed.
The complete bow and arrow demonstration is as follows. The target is only for demonstration purposes and has not been added to the example scene.
There are four component interaction gestures on SampleDoor, which are the left and right hands on the door handles on both sides.

Here, the door handle is made into a complete whole, so all four virtual hands are placed inside the door handle object.


Similarly, the Operation Inside Object of the HandInteractionSetting of the four virtual hands needs to be checked. The Inside Operation Part on the Interactable Object is set to the door handle position.
DoorActiveSample is an interaction control script written specifically for the door interaction, for reference only.
The door's interaction pattern is that the door can only be pushed open after the handle has been pressed all the way down. Once the door has been pushed a little, the handle no longer needs to be pressed. When closing the door, the handle must again be pressed all the way down before the door can be fully closed.
The interaction effect is as follows:

3.9. Sample Scene - Interactive Objects [5. Finger Press]
3.9.1. Hand Interactive FingerPressSelector
This function relies on the FingerPressSelector script on the hand root node (the same level as HandInteractor). Checked finger end points can participate in contact surface interaction, while unchecked ones cannot.

The FingerPress scripts on the Tip objects under the left and right hand sub-nodes hold the configuration for each of the five fingertips.


Taking the left index finger as an example, the number of judgments per frame, the reference coordinates of the trigger point, and the radius of the trigger sphere can be set on the FingerPressInteractor script. After enabling Toggle Visibility in the scene, you can see the size and position of the detection sphere.

Several examples of finger press triggers are provided in the Finger Press scene.

The leftmost SampleButton object has a circular interactive surface. After enabling Toggle Visibility on the object's root node, you can see that the trigger surface works downward from the button surface.

The FingerPressInteractable script is mounted on the object's root node.

The parameters from top to bottom are:
Hover Detect Distance: finger hover detection distance.
Hover Ignore Distance: finger hover ignore distance (must be greater than or equal to the detection distance).
Surface Ext Hover Detect: additional extended hover detection distance of the contact surface.
Surface Ext Hover Ignore: additional extended hover ignore distance of the contact surface.
Interact Disable Distance: distance at which the interaction terminates (measured backward from the interactive surface).
Surface Ext Interact Disable: additional extended interaction-termination distance of the contact surface.
The actual effect is best felt by debugging during use.
ButtonPos on the sample button object is the actual trigger of the button; the FingerPressController script needs to be mounted on it.

Button Base Transform is set to the Transform of the Surface object (described below).

There are two display types for the Quad: Pure Color and Sprite. When Pure Color is selected, the four colors below correspond to Normal Color, Hover Color, Select Color, and Disabled Color. Correspondingly, when Sprite is selected, the four images below are the normal, hover, select, and disabled sprites.

Renderer is the Mesh Renderer of the object.
In addition, you can also listen for triggers through OnHover and OnSelect Unity Events.
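These can be bound in the Inspector or subscribed to from code; the sketch below assumes OnHover and OnSelect are public, parameterless UnityEvents on FingerPressController, so verify the exact names and signatures in the script.

```csharp
using UnityEngine;

// Illustrative sketch: reacting to button hover and press from code.
public class ButtonPressLogger : MonoBehaviour
{
    [SerializeField] private FingerPressController button; // e.g. ButtonPos on SampleButton

    private void OnEnable()
    {
        button.OnHover.AddListener(HandleHover);
        button.OnSelect.AddListener(HandleSelect);
    }

    private void OnDisable()
    {
        button.OnHover.RemoveListener(HandleHover);
        button.OnSelect.RemoveListener(HandleSelect);
    }

    private void HandleHover() => Debug.Log("Finger is hovering over the button");
    private void HandleSelect() => Debug.Log("Button pressed");
}
```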
The Surface object inside the button, at the same level as the actual trigger component, is the surface at the bottom of the trigger travel. After turning on Toggle Visibility, you can see the green grid surface at the bottom of the trigger. Since the trigger surface is circular, the visualized grid is circular and of the same size.

The InteractableSurface script needs to be mounted on the Surface object.

Facing represents the normal direction of the contact surface that receives the trigger. As can be seen from the local position of the green grid in the figure above, Backward is the negative Z direction, which is the side of the button that receives the press; Forward points in the positive Z direction.

Double Sided controls whether both sides of the surface can be triggered.

There are three Surface Types: square, circular, and infinite (no fixed shape).

When selecting a square face, you can set the size of the face.

The circular radius can be set for the circular surface.

The infinite surface generates a visualization grid based on the current Scene view position.

Note: taking the button object as an example, ButtonPos, which mounts the FingerPressController, has the actual Button model as a child object. This is because the world coordinates of the object's pivot are used as the starting point of the trigger, and the world coordinates of the Surface object are the end position of the movable contact surface. For larger button objects whose pivot is at the geometric center, the pressing distance would otherwise be only half, so an additional parent node is needed as the coordinate reference.
The interaction performance is as follows:

The SamplePad object is a panel-type contact surface, and three types are displayed in the scene. SamplePad is a solid color panel, SamplePadWithSprite is a panel that uses sprite diagrams, and there is also a SamplePadUI in the UI component for the panel used on the UI.

The setting of the solid color panel is shown in the figure below. Under this setting, there is no need to pay attention to the Sprite in the Quad sub-object.

The settings of the panel that uses a sprite are as follows. Note that the sprite does not need the original Mesh Renderer, so to avoid it being blocked by the solid-color panel outside of play mode, you can uncheck the Mesh Renderer checkbox below. This setting is optional and is unchecked automatically at runtime.

Unlike the UI panel described next, the Sprite sub-object in SamplePadWithSprite uses a regular 3D Transform, and the component that holds the sprite is a Sprite Renderer. The Sprite referenced in the component is only used for visual display outside of play mode; at runtime the image is determined by the settings in the FingerPressController.

SamplePadUI is placed under the UI hierarchy, and the Render Mode of UI Canvas must be World Space. This means controlling the contact surface interaction object on the UI panel within the world space.

SamplePadUI uses a Rect Transform on its Sprite sub-object, and the component that renders the image is the UI Image. If a Sprite Renderer is used on a UI panel, or an Image is used on a regular object, the image cannot be loaded, so choose between these two interactive panels according to the actual application scene.

The interaction performance of these three types of Pad is similar, and the settings on Pure Color or Sprite make the performance different.


3.10. Example Scene - Ray UI Interaction [6. Ray UI Interaction]
The Ray UI Interaction scene provides examples of ray usage, adapted from the XR Interaction Toolkit.

Please note that an important prerequisite for ray interaction is to have a Camera with the Tag set to MainCamera in the scene.
3.10.1. Hand Interactive Complementary Ray Controller
The RayController script mounted on the roots of the left and right hands uses finger gestures to simulate the controller Trigger logic for triggering ray clicks.

Active Ray controls whether the ray is turned on.
Ray Active Type offers Any Finger and All Fingers. The configuration logic is the same as for grip interaction: the ray can be triggered when any selected finger bends to a certain degree, or only when a combination of fingers does. The default is to trigger with the index finger only.
After the ray is turned on, it will only appear when the UI interface is detected on the path, and it will not be displayed at other times, nor will it affect the normal use of any other interaction.

Please note that when running the sample scene in the Editor, the Game window must be visible; with only the Scene window open the ray will not load properly. The windows can be arranged as shown in the following figure.

Taking the left hand as an example, the LeftHand RayController object contains various configurations of rays and ray cross anchor objects, which can be changed according to one's own needs.

The ray function is modified from the solution provided by the XR Interaction Toolkit. For a more detailed interpretation, you can search for relevant information yourself.
SampleUI is an example UI object, which differs from normal UI in that the Render Mode in Canvas needs to be set to World Space, indicating that the UI is placed in the scene rather than directly rendered on the camera. In addition, the Graphic Raycaster script on the UI needs to be replaced with the Tracked Device Graphic Raycaster script, which is provided by the XR Interaction Toolkit.

All components on the UI panel do not need special settings, just create them like a regular GUI. The example UI has three example components: buttons, toggle keys, and sliders.

The interaction effect is as follows:
3.11. Summary
3.11.1. Basic structure of LeftHand and RightHand:
The hand root node is mounted with three key scripts: HandInteractor (controls basic interactions), FingerPressSelector (controls the fingers that enable contact surface interactions), and RayController (controls ray interactions). Hand movements must be performed on this root node.
Joint root node, containing the Hand Driver that controls joint movement (can be ignored when not running on PC).
Mesh Renderer Component
DetectArea detects the interaction range (controlled on the HandInteractor script).
FingerPress fingertip configuration (whether the finger triggers is controlled on the FingerPressSelector script)
LeftHand/RightHand RayController controls ray interaction (controlled on RayController script)
3.11.2. Basic structure of interactive objects:
The object root node needs to mount the InteractableObject script, and the Scale needs to be the default (1, 1, 1), which can include custom collision structures and self-written interaction logic scripts.
The actual model of the object in VisualPart generally includes colliders, Mesh Filters, Mesh Renderers, or other multi-layer structures. There is no requirement for Scale, but the position and rotation need to be default (0,0,0). If the root node does not have a complete collider, it needs to be fully set in this part.
(When components interact within an object) Internal components with separate movement logic.
Several virtual hand objects with HandInteraction and HandInteractionSetting scripts mounted, with Operation Inside Object checked.
(When there is gesture interaction) Virtual hand objects with HandInteraction and HandInteractionSetting scripts mounted; the number is unlimited, and Scale needs to be the default (1, 1, 1). For interactive use, the FingerUseInteraction script also needs to be mounted; for regular-shape Self-Adaptation, the RegularShapeFit script needs to be mounted.
The empty object generated at runtime represents the position of the virtual finger tip. (No need to pay attention)
(When used interactively) The virtual hand with the HandInteraction script represents the action when the interaction is fully triggered and does not need a HandInteractionSetting script. It needs to be referenced by the Prepose Interaction on the corresponding FingerUseInteraction.
(When there is no gesture interaction) Only a game object with the HandInteractionSetting script is needed to control the trigger judgment of the interaction.
The empty object CalculateGameobj generated at runtime is used to calculate the interaction position. (No need to pay attention)
3.11.3. Structural specifications for interactive objects on contact surfaces.
Object root node, need to mount FingerPressInteractable script.
Movable-part parent node: its original coordinates are the trigger's initial position, and the FingerPressController script needs to be mounted on it; the Button Base Transform on the script should be set to the Transform of the Surface object. The actual model of the object (Mesh Renderer) does not have to be on this object.
If there is no Mesh Renderer on the parent node, it must be present on a child object and assigned to the FingerPressController's Renderer.
If it is a Quad panel type, a sprite component is required: a Sprite Renderer outside the UI, or an Image inside the UI.
The Surface object needs the InteractableSurface script mounted; its position is the end position that the movable part can reach. The shape and size of the plane affect the trigger judgment and behavior.
The structural specification for ray UI interaction is described in Section 3.10 above.
4. Full-process interactive solution
4.1. Pico Swift + UDE Interaction SDK + UDE BleSDK
The current fully connected workflow is the Pico all-in-one solution: Pico's Swift provides spatial positioning for the hands, and glove data is transferred to the Pico device via a Bluetooth connection. You need to integrate Pico's SDK and UDE's Bluetooth SDK (the Bluetooth SDK is included in the unitypackage).
There is a sample scene, VR Base, in the Sample Scenes folder of the Interaction SDK.
This scene contains an XR Origin compatible with the Pico SDK, as well as the scripts for Swift hand positioning. After packaging this sample scene for Pico, the glove information reading and calibration process can be completed by clicking buttons on the panel UI. (The following demonstration was recorded in the PC Editor and only shows the operation flow.)

Bluetooth SDK documentation used for the connection: UDEXREAL Unity Android SDK