c# - How can I use Building Blocks' Real Hands to interact with Unity Canvas UI (dropdowns)? - Stack Overflow

I am currently developing an MR application for Meta Quest 3 in Unity. To utilize the passthrough functionality, I am using Building Blocks and relying on hand tracking instead of controllers.

Specifically, I intend to use the Real Hands component included in Building Blocks to interact with the default Canvas UI in Unity. For example, I aim to use hand gestures to touch and select items in a dropdown menu on the Canvas.

However, simply adding the Real Hands component to the scene does not generate the expected touch or click events on Unity's standard UI (Canvas), and thus the UI elements are not responding to the hand interactions.

Question: How can I implement a solution that uses data from Real Hands (such as hand position and gestures) to interact with Unity’s Canvas UI (specifically dropdown menus)? In particular, I would like to know how to convert the hand tracking data into equivalent mouse/touch events that can be processed by Unity's EventSystem.

What I've Tried:

I have placed the Real Hands component in the scene and successfully obtained hand movement and gesture data (e.g., detecting a pinch). However, this alone does not trigger touch or click events on the Canvas UI (such as dropdown menus), so the UI does not respond as intended. Any help would be greatly appreciated.
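Since Real Hands by itself only supplies tracking data, one way to bridge the gap described above is to translate a pinch into the pointer events Unity's `EventSystem` already understands. The following is a minimal sketch (not a definitive implementation): `GetFingertipWorldPos()` and `IsPinching()` are placeholder stubs for whatever the Real Hands / hand-tracking API exposes in your project, and the script assumes a world-space Canvas whose Event Camera is assigned to `eventCamera`.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical bridge: projects a fingertip into screen space, raycasts
// against the UI via the EventSystem, and emits pointer events on pinch.
public class HandUIPointer : MonoBehaviour
{
    public Camera eventCamera;      // the camera assigned to the world-space Canvas
    private GameObject lastTarget;
    private bool wasPinching;

    void Update()
    {
        Vector3 fingertip = GetFingertipWorldPos(); // placeholder: hand-tracking data
        bool pinching = IsPinching();               // placeholder: pinch detection

        // Convert the fingertip world position into a screen position so the
        // GraphicRaycaster on the Canvas can process it like a mouse cursor.
        var pointerData = new PointerEventData(EventSystem.current)
        {
            position = eventCamera.WorldToScreenPoint(fingertip)
        };

        var results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(pointerData, results);
        GameObject target = results.Count > 0 ? results[0].gameObject : null;

        if (pinching && !wasPinching && target != null)
        {
            // Pinch started: send pointer-down, then bubble a click up the
            // hierarchy. Dropdown implements IPointerClickHandler, which is
            // what opens its option list.
            ExecuteEvents.Execute(target, pointerData, ExecuteEvents.pointerDownHandler);
            ExecuteEvents.ExecuteHierarchy(target, pointerData, ExecuteEvents.pointerClickHandler);
        }
        else if (!pinching && wasPinching && lastTarget != null)
        {
            // Pinch released: complete the press on the previous target.
            ExecuteEvents.Execute(lastTarget, pointerData, ExecuteEvents.pointerUpHandler);
        }

        lastTarget = target;
        wasPinching = pinching;
    }

    // Placeholder stubs — replace with data from the Real Hands component.
    Vector3 GetFingertipWorldPos() => Vector3.zero;
    bool IsPinching() => false;
}
```

For hover highlighting you would additionally send `pointerEnterHandler`/`pointerExitHandler` when `target` changes between frames. Note also that Meta's Interaction SDK ships ray/poke interactors and a canvas input module intended for exactly this scenario, so checking whether your Building Blocks setup can use those may save you from maintaining a custom bridge.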
