Invention Title:

CONTROLLER ENGAGEMENT DETECTION USING HYBRID SENSOR APPROACH

Publication number:

US20250377742

Publication date:
Section:

Physics

Class:

G06F3/038

Inventors:

Assignee:

Applicant:

Smart overview of the Invention

The patent application describes a method for enhancing user interaction with headsets used in virtual reality (VR), augmented reality (AR), and mixed reality (MR) environments. This is achieved by using a combination of image recognition and various sensors to detect when a user picks up and holds a controller, allowing the headset to switch its input mode accordingly. The approach aims to integrate both hand and controller inputs seamlessly, enhancing the user experience by providing flexibility and conserving power through automatic mode switching.

Hybrid Sensor Approach

The system employs multiple models to track hand and controller movements. A controller-tracking model uses video frames and motion data from inertial measurement units (IMUs) to keep track of the controller's position. Simultaneously, a hand-tracking model identifies and predicts the 3D poses of the user's hands. By evaluating the spatial relationship between hands and the controller, the system determines when a user is holding the controller, triggering a switch from hand mode to controller mode.
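The fusion of the two controller-position sources named above (camera frames and IMU motion data) can be sketched as a simple complementary blend. The application does not disclose a fusion algorithm; the weighting scheme, the `alpha` parameter, and the function names below are illustrative assumptions.

```python
def imu_dead_reckon(last_pos, velocity, dt):
    """Advance the last known controller position by the
    IMU-integrated velocity over a timestep dt (hypothetical helper)."""
    return tuple(p + v * dt for p, v in zip(last_pos, velocity))

def fuse_controller_position(camera_pos, imu_pos, alpha=0.8):
    """Blend a camera-derived position estimate with an IMU
    dead-reckoned one. alpha weights the camera estimate;
    (1 - alpha) weights the IMU prediction."""
    return tuple(alpha * c + (1 - alpha) * i
                 for c, i in zip(camera_pos, imu_pos))

# Camera observes the controller at (1.0, 1.2, 0.5) metres while the
# IMU predicts (1.02, 1.21, 0.48); the fused estimate sits between them.
fused = fuse_controller_position((1.0, 1.2, 0.5), (1.02, 1.21, 0.48))
print(fused)
```

In practice the camera estimate is usually trusted more when the controller is in view, while the IMU term keeps the track alive between frames or during occlusion; a higher `alpha` encodes that preference in this sketch.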

Handheld Model and Proximity Detection

A handheld model determines whether the user has picked up the controller. It analyzes the spatial relationship between the hands and the controller using data from both tracking models together with proximity sensors on the controller. When a hand comes within a predefined proximity threshold of the controller, the headset switches to controller mode. The threshold is measured from specific parts of the controller, such as its handle, so that the input mode switches only when the hand is positioned to grip the device.
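Measuring proximity against a specific part of the controller, such as the handle, can be sketched as a point-to-segment distance test. The 5 cm threshold, the segment representation of the handle, and all function names here are assumptions for illustration; the application does not specify these details.

```python
import math

HANDLE_THRESHOLD_M = 0.05  # hypothetical 5 cm proximity threshold

def point_segment_distance(p, a, b):
    """Distance from point p to the line segment a-b (3-tuples, metres).
    Modelling the handle as a segment lets proximity be measured against
    the grippable part rather than the controller's center."""
    abv = tuple(bb - aa for aa, bb in zip(a, b))
    apv = tuple(pp - aa for aa, pp in zip(a, p))
    denom = sum(c * c for c in abv)
    # Project p onto the segment, clamping to its endpoints.
    t = 0.0 if denom == 0 else max(0.0, min(1.0, sum(x * y for x, y in zip(apv, abv)) / denom))
    closest = tuple(aa + t * c for aa, c in zip(a, abv))
    return math.dist(p, closest)

def hand_near_handle(hand_pos, handle_start, handle_end):
    """True when the tracked hand is within the threshold of the handle."""
    return point_segment_distance(hand_pos, handle_start, handle_end) <= HANDLE_THRESHOLD_M
```

A hand hovering 10 cm above the handle would return False, while one 2 cm from it would return True, triggering the switch to controller mode.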

Machine Learning Integration

In some embodiments, a machine learning model is employed to enhance the accuracy of detecting whether a user is holding the controller. This model is trained to recognize patterns indicative of controller engagement, further improving the system's responsiveness and reliability. The integration of visual detection and motion data allows for real-time synchronization with the controller's movements, ensuring a seamless user experience.
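One way such a model could score engagement is a logistic classifier over hand-controller features. The application does not disclose the model architecture or its features; the feature set, weights, and threshold below are hand-set, purely illustrative assumptions.

```python
import math

# Hypothetical features: hand-to-controller distance, IMU motion energy,
# and how much of the handle the hand covers. Weights are hand-set for
# this sketch, not learned from data.
WEIGHTS = {"hand_distance_m": -40.0, "imu_motion": 3.0, "grip_coverage": 5.0}
BIAS = 2.0

def engagement_probability(features):
    """Logistic score: estimated P(user is holding the controller)."""
    z = BIAS + sum(WEIGHTS[k] * features[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def is_engaged(features, threshold=0.5):
    """Switch to controller mode when the score crosses the threshold."""
    return engagement_probability(features) >= threshold

# A close, moving, gripped controller scores high; a distant idle one low.
near = {"hand_distance_m": 0.02, "imu_motion": 0.8, "grip_coverage": 0.9}
far = {"hand_distance_m": 0.40, "imu_motion": 0.1, "grip_coverage": 0.0}
```

In a trained system the weights would be fit to labelled pickup events, and the probability could be smoothed over frames so the headset does not flicker between modes near the decision boundary.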

Advantages and Applications

The described system offers several benefits, including improved accuracy in detecting controller engagement and reduced power consumption through automatic input-mode switching, which eliminates the need for manual mode changes. By combining visual and sensor data, the system can predict and adapt to user interactions in real time. This technology is applicable across various XR environments, providing users with a more intuitive and immersive experience as they interact with virtual elements using either hands or controllers.