@zcomponent/three-webxr

This package adds support for building immersive virtual reality, mixed reality and augmented reality experiences to the Mattercraft 3D content creation platform for the web. It builds upon the WebXR specification, supporting a wide range of headsets and handheld devices and greatly simplifying the process of building such content for distribution over the web.

This package supports a number of immersive user experience paradigms, including:

  • fully immersive virtual reality experiences, where the user's real environment is replaced with a virtual world; and,
  • mixed reality experiences, where the experience takes place in the user's real world but with virtual content added to the environment.

Device Support

In general, content built with this package should function on devices and browsers that support the WebXR specification. We've tested and optimised for the following devices:

Meta Quest 1 / 2 / 3 / Pro

Content built with Mattercraft and this package works well through the Quest browser, with great support for the two controllers, hand tracking and camera passthrough (for mixed reality experiences).

While Quest doesn't feature a built-in QR code scanner, it's possible to launch WebXR experiences on the device in a number of ways:

  • Typing a web address into the web browser's address bar
  • Using Quest's Web Launch feature, where end users tap a link on their phone or laptop/desktop and 'send' the link to their Quest device
  • Clicking 'Launch on Meta Quest' in Mattercraft's 'Live Preview' feature

Magic Leap 1 / 2

Content built with Mattercraft and this package works out of the box with the browser built into the Magic Leap operating system.

Users can launch WebXR experiences by scanning QR codes with Magic Leap's built-in 'QR Reader' app, or with the QR button on the browser's toolbar. This includes project trigger QR codes generated by Zapworks, and the QR code provided by Mattercraft's Live Preview feature.

Google Chrome on Android

The Chrome browser on Android has support for the WebXR specification.

Experiences using the 'VR' mode are presented as two side-by-side stereo views suitable for use in 'Google Cardboard' style headsets. User interaction in this mode is primarily by means of the direction that the user is looking, known as 'gaze' input.

Experiences using the 'AR' mode are presented as handheld augmented reality, with interaction in the form of the user tapping on the screen, known as 'screen' input.

Getting Started

Mattercraft includes a number of template projects that are a great starting point for your next AR / VR / MR project. Just select an appropriate template in Mattercraft after creating a new project.

Alternatively, this package includes a number of components designed to help you add XR support to an existing project very quickly. In each case it's possible to completely modify and configure the setup to meet your requirements. To get started, right-click on the root group in your scene's Hierarchy and add a 'rig' from the 'AR / VR Rigs' menu. The options include:

  • XR Rig VR: for a fully immersive VR experience, including support for controllers and user teleportation around the environment.
  • XR Rig VR Passthrough: for a mixed reality experience where the user's real-world environment is shown, including support for controllers and user teleportation.
  • XR Rig AR: for a mixed reality experience where the user's real-world environment is shown, including support for controllers.
  • XR Rig AR User Placement: for a mixed reality experience where the user's real-world environment is shown, including support for controllers and the ability for the user to choose the origin location of their experience.

User Input & Controllers

This package provides a number of mechanisms for allowing user interaction in your experience. These include:

  • Pointer emulation, where users can point at and click on objects using the device controllers, or (for some devices) hand tracking gestures. The package emits the same pointer* and click events that mouse or touch screen input does in non-XR experiences.
  • Tracked controllers, where content can be attached to the handheld controllers supported by many devices.
  • Controller events, such as when the user presses buttons on the controllers.
  • Hand tracking, for devices that support it, where the user can point and interact with their hands.
  • Hand gesture events, for devices that support hand tracking, where events are emitted for gestures such as 'clenched' and 'pointing'.

Pointer Emulation

When building non-XR interactive projects in Mattercraft, it's common to use the browser events associated with the mouse or finger touch (e.g. click, pointerup, pointerdown) to respond to user interaction. Since browsers don't generate these events during XR sessions, the XRManager component provides 'pointer emulation', where XR input events (such as the movement and button presses of handheld controllers) are translated into pointer events based on the direction of the underlying input device. This allows you to build experiences using Mattercraft's pointer events and have users point at and click on objects with their controllers or hands.
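
As a rough sketch, handling an emulated click from script might look like the following. The addEventListener call and the `button3D` reference are assumptions made for illustration; use whichever event registration mechanism your scripts already use for mouse and touch input, since the emulated events arrive through the same route.

// Illustrative sketch: `button3D` is a hypothetical reference to an
// interactive node in your scene. During an XR session this click may come
// from a controller trigger, a hand pinch, gaze + tap, or a direct screen
// tap, depending on the device.
button3D.addEventListener('click', () => {
  console.log('Button activated via emulated pointer');
});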

The exact mechanism of this user interaction varies depending on the type of device the user has, and the types of input supported by that device.

For headsets with tracked controllers, such as Meta Quest and Magic Leap, users point at objects by moving the controllers, and 'click' on them by pressing the trigger on the controller.

For headsets that support hand tracking, such as the Meta Quest, users point at objects by moving their hands, and 'click' on them with a pinching gesture between their thumb and forefinger.

For headsets without controllers or hand tracking, such as Google Chrome for Android running in the 'Google Cardboard' VR mode, users point at objects by turning their head towards them, and 'click' on them by tapping anywhere on the device screen.

For handheld AR, such as Google Chrome for Android running in the AR mode, users can tap on objects directly on the screen.

By default, the XRManager component enables pointer emulation for all of these input types; however, it's possible to configure this behavior, or to disable pointer emulation entirely, by setting properties on the component.

In some instances you may wish for a specific controller not to have an emulated pointer - for example if the user is holding an object or tool with that controller. You can do this by setting the suppressPointerEmulation property of the associated XRController component to true.
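
For example, a minimal sketch of toggling this from script. The `toolController` reference is hypothetical, and the observable-style `value` access is an assumption made by analogy with the XRContext offsets shown later; in practice you can also set the property in the node's properties panel.

// Hypothetical reference to the XRController node holding a tool; how you
// obtain node references depends on your project's setup.
const toolController = this.refs.toolController;

// Suppress the emulated pointer while the user holds the tool...
toolController.suppressPointerEmulation.value = true;

// ...and restore it once the tool is put down.
toolController.suppressPointerEmulation.value = false;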

Tracked Controllers and Hands

Many headsets support one or two controllers. These are typically held by users in their hands, have a tracked location in 3D space, and feature one or more buttons. The XRController component lets you work with these controllers in the following ways:

  • to show a 3D representation of the controller in the right place in 3D space,
  • to show a line in space in the direction that the controller is pointing,
  • to respond to events associated with controllers, such as button presses, or when controller tracking is lost or restored,
  • to attach 3D content to the controller, for example to let a user hold a tool in their hand.

Some devices that support hand tracking (such as the Meta Quest) represent the user's hands as tracked controllers. The XRController component works just the same in these cases; it simply shows a 3D jointed hand rather than a 3D model of the controller. If you'd like your XRController to only work with actual tracked controllers, set its allowHands property to false.

Controller Binding

Since the various headsets each support different numbers and types of controllers, every XRController instance provides a number of properties that let you configure which underlying input device it is paired (or 'bound') to. An XRController will bind to the first input device it finds that matches the values of its allow* properties.

In general there are two types of constraint: handedness, which is whether the controller is for the user's left or right hand (or indeed for either hand); and device type, which allows you to limit the XRController to any of physical controllers, tracked hands, screen inputs and gaze inputs.

In order to give users a consistent experience, regardless of the device they're on, we recommend always having two XRControllers in your project - a 'primary' controller with allowLeftHandControllers set to false, and a 'secondary' controller with both allowRightHandControllers and allowUnspecifiedHandControllers set to false. This setup ensures that devices with either one or two controllers have predictable behavior. In addition, unless you're specifically targeting a device with two controllers, it's best to ensure that the user experience is functional using just the 'primary' controller. The 'XR Rig' components include this setup by default.
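
To make that recommendation concrete, here is an illustrative sketch of the binding properties for the two controllers, written as script assignments (the node references and observable-style `value` access are assumptions; these properties would normally be set in the editor's properties panel):

// 'Primary' controller: binds to right-handed or unspecified-hand devices.
primaryController.allowLeftHandControllers.value = false;

// 'Secondary' controller: binds only to left-handed devices.
secondaryController.allowRightHandControllers.value = false;
secondaryController.allowUnspecifiedHandControllers.value = false;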

Attaching 3D Content

You can attach content to an XRController by placing items as children of it in the Hierarchy. There are two 'spaces' associated with tracked controllers:

  • grip space, where the origin appears in a location corresponding to the user 'holding' an object, and
  • target ray space, where the -Z axis is aligned with the direction the controller is pointing.

You can choose which space the children of an XRController should appear in with the space parameter. Note that on some devices (such as Meta Quest) there is no meaningful 'grip space' for hand tracking at this time.

To help position content in the correct location, you can use the designTimeModel property of an XRController. Switching it to different values allows you to preview, in the editor, how the content will appear for different types of controller.

Hand Gestures

It's possible to respond to user hand gestures using the XRHandGestureManager component (in the 'AR / VR Components' menu). The following gestures are supported:

  • 'clenched' where the fingers and thumb are closed towards the palm
  • 'pointing' where the index finger is extended with the remaining fingers closed towards the palm
  • 'thumbs up' where the thumb is extended with the fingers closed towards the palm
  • 'palm open' where all the fingers are extended
  • 'peace sign' where the index and middle fingers are extended with the remaining fingers closed towards the palm

You can respond to these events either by registering listeners to them from script, or by attaching action behaviors to the XRHandGestureManager node. The component also provides a number of properties for attaching timelines or states from the animation system to each of the gestures.
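
As an illustrative sketch of the script route (the `gestureManager` reference, the onGesture event name and the addListener registration style are all assumptions; check the component's type declarations for the events it actually exposes):

// Hypothetical: `gestureManager` references an XRHandGestureManager node and
// onGesture is an assumed event carrying the detected gesture's name.
gestureManager.onGesture.addListener(gesture => {
  if (gesture === 'thumbs up') {
    // e.g. advance to the next step of the experience
    console.log('Thumbs up detected');
  }
});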

User Movement

Considering how users move around in your virtual and mixed reality environments is an important part of building great content. This package includes support for a number of common movement mechanisms.

Here's a summary, with more information below:

Teleport

Users can choose a destination in the virtual environment to instantaneously jump to.

✓ Great for VR

✗ May not be right for some AR experiences

✗ May require instructions / tutorial

✓ Minimizes nausea

Turn

Users can turn instantly to the left or the right with the thumbstick/touchpad, without having to physically turn their heads/bodies.

✓ Great for VR

✗ May not be right for some AR experiences

✓ Intuitive user experience

✓ Minimizes nausea

Walk

Users can move smoothly forwards/backwards, and turn or strafe left/right, using the thumbstick/touchpad.

✓ Great for VR

✗ May not be right for some AR experiences

✓ Intuitive user experience

✗ Some users may experience nausea

User Placement

At the start of the experience, the user chooses the origin of the content in their environment by pointing the device or controller to a location on the ground and pressing the screen or controller trigger.

✗ May not be right for VR experiences

✓ Great for AR

✓ Intuitive user experience

✓ Minimizes nausea

Teleport

This mechanism allows users to move around the space by teleporting - they select a destination by holding forward on the thumbstick or touchpad and aiming the controller at a point on the ground. Upon releasing the thumbstick/touchpad, the user is instantaneously moved to their chosen location.

One benefit of this mechanism is that it minimizes the nausea that's felt by some users during continuous motion. It also allows users to travel larger distances quickly.

To support this form of movement, just add a 'Teleport Manager' component from 'AR / VR Movement' to your scene's Hierarchy.

The 'XR Rig VR' and 'XR Rig VR Passthrough' rigs already include an instance of 'Teleport Manager' that you can customize.

The Teleport Manager shows a white ring to help the user while they're choosing a destination. If you'd like to customize this, the ring can be disabled in the node properties and alternative content can be added as children in the Hierarchy.

The component's teleportingLayerClip and notTeleportingLayerClip properties allow you to associate timelines or states in Mattercraft's animation system with the different phases of the user experience. In addition, the onTeleportStart and onTeleportEnd events allow you to react to the changes in the phase from script or Action Behaviors.
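
For example, a sketch of reacting to these events from script. The onTeleportStart and onTeleportEnd names are those the component provides; the `teleportManager` reference and the addListener registration style are assumptions for illustration.

// Hypothetical: `teleportManager` references the Teleport Manager node.
teleportManager.onTeleportStart.addListener(() => {
  // e.g. dim the scene or play a sound while the user aims at a destination
  console.log('Teleport aiming started');
});

teleportManager.onTeleportEnd.addListener(() => {
  console.log('User teleported to their chosen destination');
});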

Turn

This mechanism allows users to turn instantly to the left or the right by pushing a controller thumbstick / touchpad. It's great for longer experiences where users may tire of having to regularly move their head/body while navigating a space, or where the user may be tethered with a cable that might wrap around them when turning. Users do not typically experience nausea when using this form of movement.

To support this form of movement, just add a 'Turn Manager' component from 'AR / VR Movement' to your scene's Hierarchy.

The 'XR Rig VR' and 'XR Rig VR Passthrough' rigs already include an instance of 'Turn Manager' that you can customize.

Walk

This mechanism allows users to move smoothly in the environment using the thumbstick/touchpad of a controller. It's an intuitive form of input as users may be familiar with similar forms of movement in computer games.

The forward/backward axis of the thumbstick/touchpad is mapped to movement in the direction the user is facing. The left/right axis can be mapped to either strafing (i.e. side stepping to the left or right) or smooth turning (i.e. pivoting) about the user's current location.

Some users may experience nausea when using this form of movement in an experience.

To support this form of movement, just add a 'Walk Manager' component from 'AR / VR Movement' to your scene's Hierarchy.

User Placement

This mechanism allows the user to choose the origin location for the experience in their real world environment by pointing the controller (or, for devices without controllers, the direction of the camera) to a position in space and pulling the trigger (or tapping the screen).

During placement mode, normal controller events and interactions are disabled by default.

To support this form of movement, just add a 'User Placement Manager' component from 'AR / VR Movement' to your scene's Hierarchy.

The 'XR Rig AR User Placement' rig already includes an instance of 'User Placement Manager' that you can customize.

The placingLayerClip and notPlacingLayerClip properties of the component allow you to associate these two modes to timelines or states in Mattercraft's animation system. In addition the onPlacementStart and onPlacementEnd events allow you to react to changes in the mode from script or Action Behaviors.

You can restart the placement mode by calling the component's restartPlacement function, either from script or with a 'Node Function' Action Behavior.
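
For example, a sketch of restarting placement when the user clicks a 'reposition' button. restartPlacement is the function the component provides; the `placementManager` and `repositionButton` references and the addEventListener registration style are assumptions for illustration.

// Hypothetical: `placementManager` references the User Placement Manager node
// and `repositionButton` references an interactive node in the scene.
repositionButton.addEventListener('click', () => {
  placementManager.restartPlacement();
});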

Teleport Action Behaviors

You may wish to instantly teleport the user to a different location in your scene, perhaps to move them to a new room or area, or to allow them to 'reset' their position. You can achieve this either from script (see the 'Custom Movement' section below), or using the included Action Behaviors.

The Teleport To Position action behavior allows you to teleport the user to a specified X, Y, and Z location in your scene in response to an event emitted by a node (e.g. clicking on a button).

The Teleport To Node action behavior allows you to teleport the user to a location specified by a node in your Hierarchy in response to an event emitted by a node (e.g. clicking on a button).

Custom Movement

All forms of user movement work by updating the offsetPosition and offsetQuaternion observables in XRContext. These variables determine the position/rotation offset of the scene's origin versus the origin of the real world environment reported by the device hardware. This makes it possible to implement your own forms of user movement in script.

Start by importing XRContext at the top of your script file:

import { XRContext } from '@zcomponent/three-webxr';

Then modify the offsets in the context as you wish:

const context = this.contextManager.get(XRContext);
context.offsetPosition.value = [0, 0, 0];
context.offsetQuaternion.value = [0, 0, 0, 1];
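
Building on the snippet above, here is a short illustrative example that turns the user 90 degrees about the vertical axis and nudges the scene origin. The values are arbitrary, and the sign convention of the offsets (whether a positive offset moves the scene or the user) is worth verifying in your own project:

// Shift the scene origin two metres along Z relative to the real-world origin.
context.offsetPosition.value = [0, 0, -2];

// Rotate 90 degrees about the vertical (Y) axis; quaternions are [x, y, z, w].
const halfAngle = Math.PI / 4; // half of the 90 degree turn
context.offsetQuaternion.value = [0, Math.sin(halfAngle), 0, Math.cos(halfAngle)];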

Since these offsets are held centrally in the XRContext, their values correctly influence the locations of any XRCamera or XRController nodes in your project, regardless of where they appear in your Hierarchy or in any subcomponents.

Getting Support

Head over to the Mattercraft Discord server to join the discussion: https://discord.gg/DhFGBVXqkp

Install

npm i @zcomponent/three-webxr

License

Proprietary