How Does It Work?

Our system relies on placing one device on the headset and one on the user's finger. Ultrasonic communication is used to track the hands in 3D, and a radio link is used to transfer rotational and button information. Each of our devices has its own rechargeable battery.

  1. RECEIVER - This device is the data hub that relays all the information to the display and is also the center of the tracked volume. It has two variants: one mounts on the side of the headset (specialized for AR) and one mounts flat in front of the headset (specialized for VR). The position of the hands is tracked with respect to the receiver, and the pose of the headset can be used to determine the world-space position of the hands (especially in VR applications), as shown in the sketch after this list.
  2. HAND DEVICE - This is the tracked device. It has two variants: one is mounted on the finger (specialized for AR), to keep the hands free to work; the other variant is a handheld controller with buttons (specialized for VR).
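To make the world-space step above concrete, here is a minimal sketch of composing the receiver-relative hand position with the headset pose. The types, names, and the mounting-offset parameter are illustrative assumptions, not the BreqLabs API; a real integration would typically use the engine's own transform types (e.g. Unity's Transform.TransformPoint).

```cpp
// Illustrative only: world-space hand position = headset pose applied to the
// (receiver mounting offset + receiver-relative hand position).
#include <cstdio>

struct Vec3 { float x, y, z; };
struct Quat { float w, x, y, z; };  // unit quaternion, headset orientation

// Rotate a vector by a unit quaternion: v' = v + w*t + cross(q_v, t), t = 2*cross(q_v, v).
Vec3 rotate(const Quat& q, const Vec3& v) {
    Vec3 t{2.0f * (q.y * v.z - q.z * v.y),
           2.0f * (q.z * v.x - q.x * v.z),
           2.0f * (q.x * v.y - q.y * v.x)};
    return Vec3{v.x + q.w * t.x + (q.y * t.z - q.z * t.y),
                v.y + q.w * t.y + (q.z * t.x - q.x * t.z),
                v.z + q.w * t.z + (q.x * t.y - q.y * t.x)};
}

Vec3 handWorldPosition(const Vec3& headsetPos, const Quat& headsetRot,
                       const Vec3& receiverOffset,        // where the receiver sits on the headset (assumed known)
                       const Vec3& handRelToReceiver) {   // position reported by the tracker
    Vec3 local{receiverOffset.x + handRelToReceiver.x,
               receiverOffset.y + handRelToReceiver.y,
               receiverOffset.z + handRelToReceiver.z};
    Vec3 r = rotate(headsetRot, local);
    return Vec3{headsetPos.x + r.x, headsetPos.y + r.y, headsetPos.z + r.z};
}

int main() {
    // Identity headset orientation at 1.6 m height: world position is simply the
    // receiver-relative position plus the receiver's mounting offset.
    Vec3 p = handWorldPosition(Vec3{0, 1.6f, 0}, Quat{1, 0, 0, 0},
                               Vec3{0, 0, 0.05f}, Vec3{0.1f, -0.2f, 0.4f});
    std::printf("hand in world space: (%.2f, %.2f, %.2f)\n", p.x, p.y, p.z);
    return 0;
}
```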

Work mobility

Our approach to hand tracking places the devices on the user rather than fixed in the room. This means that our 6 degrees-of-freedom (6DoF) tracking works from the inside out, so your hands are tracked and available wherever you go. The lightweight, low-cost, low-power nature of our system makes it uniquely well suited to equipping a crew of mobile users, in contrast to other 6DoF solutions.

Compatibility

The receiver communicates over USB with a Windows computer (either through a C++ API or through C# in Unity). The receiver can also connect via Bluetooth Low Energy to an Android or iOS device through Unity. Other communication interfaces (WebSocket, etc.) are available on request. Feel free to reach out to us (through our forum or contact form); our staff can also assist you in rapidly deploying new applications with hand interaction.
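For a sense of what an integration looks like, here is a hypothetical polling sketch. OpenReceiver, PollHandPose, and HandPose are placeholder names standing in for the vendor library, not the actual BreqLabs C++ API; the stubs exist only so the example compiles and runs.

```cpp
// Hypothetical integration sketch only: all names below are placeholders, not the
// actual BreqLabs C++ API. It shows the shape of a typical USB polling loop:
// open the receiver, poll the latest hand pose each frame, hand it to the app.
#include <cstdio>

struct HandPose {          // placeholder pose record
    float x, y, z;         // position relative to the receiver (meters)
    unsigned buttons;      // button bitmask (controller variant)
};

// --- stubs standing in for the vendor API, returning dummy data -------------
static bool OpenReceiver() { return true; }                 // e.g. open the USB device
static bool PollHandPose(HandPose* out) {                   // e.g. read the latest sample
    *out = HandPose{0.10f, -0.05f, 0.30f, 0};
    return true;
}
// -----------------------------------------------------------------------------

int main() {
    if (!OpenReceiver()) { std::fprintf(stderr, "receiver not found\n"); return 1; }
    for (int frame = 0; frame < 3; ++frame) {               // application main loop
        HandPose pose;
        if (PollHandPose(&pose))
            std::printf("hand at (%.2f, %.2f, %.2f) m\n", pose.x, pose.y, pose.z);
    }
    return 0;
}
```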

Specifications

Circuit board:

Emitter size: 592mm2 (37mm x 16mm)
Receiver size: 801mm2 (36mm x 16mm + 15mm x 15mm)

Devices:

Finger-device: 13cm3 (44mm x 20mm x 15mm)
7.9 grams (Including battery)
Controller: 380cm3 (150mm x 39mm x 65mm)
Receiver side-mounted (AR): 23cm3 (40mm x 24mm x 22mm)
16 grams (Including battery)
Receiver front-mounted (VR): 20cm3 (56mm x 22mm x 17mm)
16 grams (Including battery)

Battery included: Yes
Rechargeable: Yes, the battery recharges through USB

Power consumption (current / power):

Hand device:
  Only monitoring gestures: 9.6mA / 38mW
  Actively tracked with ultrasound: 14mA / 56mW
Receiver:
  Awaiting communication from transmitter: 34mA / 135mW
  Reporting over BLE: 39mA / 158mW


The receiver's power consumption can be further reduced on request.
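For rough planning, the figures above are consistent with a supply of about 4V (e.g. 38mW / 9.6mA ≈ 4.0V, a Li-ion cell). The sketch below turns the hand-device currents into runtime estimates; the 100mAh capacity is purely an illustrative assumption, not a published spec.

```cpp
// Rough runtime estimate from the current figures above. The battery capacity
// used here is an ASSUMPTION for illustration only, not a quoted specification.
#include <cstdio>

int main() {
    const double capacity_mAh = 100.0;   // assumed capacity, illustration only
    const double idle_mA      = 9.6;     // hand device, only monitoring gestures
    const double tracked_mA   = 14.0;    // hand device, actively tracked

    std::printf("gesture-only runtime : %.1f h\n", capacity_mAh / idle_mA);    // ~10.4 h
    std::printf("tracked runtime      : %.1f h\n", capacity_mAh / tracked_mA); // ~7.1 h
    return 0;
}
```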

Sampling rate: 0-90Hz
Computational load on main computer: none

The ultrasonic computation takes less than 6ms, followed by the USB transfer (~1ms) or BLE transfer (~10ms). An unsynchronized display may incur an additional delay corresponding to its refresh rate. 
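Reading those numbers end to end, the motion-to-data latency is roughly the ultrasonic computation plus the transfer, with up to one refresh period added by an unsynchronized display. The sketch below just sums the components quoted above; the 90Hz display refresh is an assumption for illustration.

```cpp
// Sums the latency components quoted in the text: <6 ms ultrasonic computation,
// ~1 ms USB or ~10 ms BLE transfer, plus up to one refresh period when the
// display is not synchronized (assumed 90 Hz here, ~11.1 ms).
#include <cstdio>

int main() {
    const double compute_ms = 6.0;           // upper bound from the text
    const double usb_ms     = 1.0;
    const double ble_ms     = 10.0;
    const double refresh_ms = 1000.0 / 90.0; // assumed 90 Hz display

    std::printf("USB, synced display  : ~%.0f ms\n", compute_ms + usb_ms);                    // ~7 ms
    std::printf("BLE, synced display  : ~%.0f ms\n", compute_ms + ble_ms);                    // ~16 ms
    std::printf("USB, unsynced display: up to ~%.0f ms\n", compute_ms + usb_ms + refresh_ms); // ~18 ms
    return 0;
}
```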

When the hand device is aimed at the receiver, it is tracked in almost a full 180° hemisphere in front of the receiver.

The accuracy is a function of how well the sound is received at the receiver. In the center of the tracking volume, the tracking resolution can reach 0.5mm.

A free AI-based training app for Android and iOS lets end users customize both their tracking area and their gestures. Free Unity-based projects are also included to enable rapid prototyping and deployment, lowering time to value.

Tracking modes

  1. MOUSE SUPPORT - By converting the 3D hand movement into 2D mouse movement, we provide a cross-platform way to interact with applications. This is the fastest way to add compatibility with any application that was not purpose-built for a specific headset (see the sketch after this list).
  2. 3D SUPPORT - 3D hand movement is the most versatile to interact with 3D models, to move, scale, rotate, design, etc.
  3. HOVER SUPPORT - For headsets that do not support absolute mouse movement, hover mode is designed to translate the physical world in front of the user into 2D or 3D zones of interactions that get highlighted when the hand enters them.
  4. GESTURES - Our hand devices have an artificial intelligence API that allows them to learn and recognize gestures. This is especially useful for our finger device: it lets users activate/deactivate tracking, select, and go back in the application without having to press any buttons.
  5. TILT - This mode is enabled on demand for applications requiring a joystick-like gradation of values based on the user's hand tilt and roll.
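As a concrete illustration of mouse support (mode 1), the sketch below maps the receiver-relative 3D hand position onto the screen by dropping the depth axis and scaling a fixed physical interaction window to pixels. The window size and resolution are illustrative assumptions; the shipped mapping may differ.

```cpp
// Illustrative 3D-to-2D mouse mapping (not the shipped algorithm): project the
// receiver-relative hand position onto the X/Y plane, scale an assumed physical
// interaction window to screen pixels, and clamp at the edges.
#include <algorithm>
#include <cstdio>

struct Vec3 { float x, y, z; };     // hand position relative to the receiver (m)
struct Cursor { int px, py; };      // screen coordinates in pixels

Cursor handToCursor(const Vec3& hand, int screenW, int screenH) {
    // Assumed interaction window: 0.40 m wide x 0.30 m tall, centered on the receiver.
    const float winW = 0.40f, winH = 0.30f;
    float u = (hand.x + winW / 2.0f) / winW;   // 0..1 left-to-right across the window
    float v = (winH / 2.0f - hand.y) / winH;   // 0..1 top-to-bottom (screen Y grows downward)
    u = std::clamp(u, 0.0f, 1.0f);
    v = std::clamp(v, 0.0f, 1.0f);
    return Cursor{static_cast<int>(u * (screenW - 1)), static_cast<int>(v * (screenH - 1))};
}

int main() {
    Cursor c = handToCursor(Vec3{0.05f, -0.02f, 0.30f}, 1920, 1080);
    std::printf("cursor at (%d, %d)\n", c.px, c.py);  // hand slightly right of and below center
    return 0;
}
```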

Why Ultrasound?

Camera tracking has significant weaknesses when it comes to wearable headsets. First, camera tracking does not work well in many real-world lighting conditions, including outdoors in direct sunlight or in locations with many reflections. Second, camera-based hand tracking is computationally intensive, which further shortens the battery life of untethered smart glasses. Last, camera-based systems often have very narrow fields of view (FOVs), which force users to move in unnatural ways; depending on an employee's range of motion, this may not even be possible. Overall, these drawbacks lead to inconsistent usability, lower productivity, and more device downtime.

The low cost per unit and small size of both the transmitter and receiver are a result of our proprietary 6 degrees-of-freedom tracking technology. Our technology has been internationally recognized and validated through our involvement in HTC's Vive X Batch 1, and we have been a provider of prototyping hardware for the advanced research departments of various Tier 1 OEMs.

BreqLabs

Address

112 College Street, Suite 411
Toronto, ON M5G 1L6

Contact us

Email: info@breqlabs.com
Contact support and sales