During the winter school we had the opportunity to create a project based around the Leap Motion and an HTC Vive. The Vive was used to walk around in a limited space, and the Leap Motion provided the hand tracking for interaction. The idea was to be able to virtually interact with different machines and see the overall status of a room.
The repository for the project can be found here: https://github.com/TheCell/Blockwoche_AVR
The first thing we built was a menu attached to the hand that is displayed when you look at your palm. One challenge was to make all interactions possible with one hand only. To solve this, we implemented a pinch gesture, bringing the middle finger and thumb together, to detach the menu from your hand and attach it again.
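A minimal Unity sketch of that idea, assuming the palm position, palm normal and the thumb/middle-finger tip positions are fed in each frame from the Leap Motion rig (the actual Leap Motion API calls are left out): the menu is shown when the palm roughly faces the headset, and a middle-finger-to-thumb pinch toggles whether the menu stays attached to the hand.

```csharp
using UnityEngine;

// Sketch: shows a hand menu when the palm faces the headset and toggles
// attachment with a middle-finger/thumb pinch. The Vector3 inputs are
// assumed to come from the Leap Motion hand data each frame.
public class HandMenuController : MonoBehaviour
{
    public GameObject menu;                 // menu UI attached to the hand
    public Transform headset;               // the VR camera transform
    public float lookThreshold = 0.7f;      // how directly the palm must face the headset
    public float pinchDistance = 0.03f;     // metres between fingertips to count as a pinch

    private bool attachedToHand = true;
    private bool pinchedLastFrame = false;

    // Call once per frame with the current Leap Motion hand data.
    public void UpdateMenu(Vector3 palmPosition, Vector3 palmNormal,
                           Vector3 thumbTip, Vector3 middleTip)
    {
        // Palm check: does the palm normal roughly point towards the headset?
        Vector3 toHeadset = (headset.position - palmPosition).normalized;
        bool lookingAtPalm = Vector3.Dot(palmNormal, toHeadset) > lookThreshold;
        menu.SetActive(lookingAtPalm || !attachedToHand);

        // Pinch check: toggle attachment on the frame the pinch starts.
        bool pinched = Vector3.Distance(thumbTip, middleTip) < pinchDistance;
        if (pinched && !pinchedLastFrame)
        {
            attachedToHand = !attachedToHand;
        }
        pinchedLastFrame = pinched;

        // Follow the hand only while attached; otherwise stay where it was detached.
        if (attachedToHand)
        {
            menu.transform.position = palmPosition;
            // Orient the menu towards the headset (flip the direction if your
            // UI prefab faces the other way).
            menu.transform.rotation = Quaternion.LookRotation(palmPosition - headset.position);
        }
    }
}
```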
Room panel: We further developed a customizable panel that floats along with the person. It should stay out of the way during normal work but be available when needed.
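One way to get that "out of the way but nearby" behaviour, sketched here as a plain Unity script: the panel stays put for small movements and only glides to a new anchor point beside the headset once the player has moved too far away from it. The offset and thresholds are assumptions, not the project's actual values.

```csharp
using UnityEngine;

// Sketch: a panel that loosely follows the player. It stays where it is for
// small movements and only moves to a new anchor beside the headset once the
// player has walked too far away from it.
public class FollowPanel : MonoBehaviour
{
    public Transform headset;               // VR camera transform
    public Vector3 localOffset = new Vector3(0.4f, -0.1f, 0.6f); // beside and below eye level
    public float followDistance = 1.2f;     // start following beyond this distance
    public float smoothing = 2f;            // higher = snappier follow

    private bool following = false;

    void LateUpdate()
    {
        // Desired anchor: offset relative to where the headset is looking (yaw only).
        Quaternion yaw = Quaternion.Euler(0f, headset.eulerAngles.y, 0f);
        Vector3 target = headset.position + yaw * localOffset;

        float distance = Vector3.Distance(transform.position, target);
        if (distance > followDistance) following = true;   // player walked away
        if (distance < 0.05f) following = false;            // caught up, stop moving

        if (following)
        {
            transform.position = Vector3.Lerp(transform.position, target, smoothing * Time.deltaTime);
        }

        // Always face the player so the panel stays readable.
        transform.rotation = Quaternion.LookRotation(transform.position - headset.position);
    }
}
```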
We then created a panel intended for general options. In this scene we used it to turn the ambient lights on and off.
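The light option itself can be as simple as a toggle method hooked up to a button on the panel. A sketch, assuming the ambient lights are ordinary Unity Light components collected in a list (the actual project may wire this differently):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: a general options entry that switches the room's ambient lights
// on and off. Hook Toggle() up to a button or gesture on the panel.
public class AmbientLightOption : MonoBehaviour
{
    public List<Light> ambientLights = new List<Light>();

    private bool lightsOn = true;

    public void Toggle()
    {
        lightsOn = !lightsOn;
        foreach (Light light in ambientLights)
        {
            light.enabled = lightsOn;
        }
    }
}
```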
The last cube was meant to be machine-specific. It is sensitive to what is around it and displays different options for each machine.
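A sketch of how such a context-sensitive cube could work, assuming each machine carries a hypothetical Machine component describing its options: the cube scans its surroundings with an overlap sphere and picks up the closest machine it finds, so a menu script can build machine-specific entries from it.

```csharp
using UnityEngine;

// Hypothetical component describing a machine and the options it offers.
public class Machine : MonoBehaviour
{
    public string machineName;
    public string[] options;
}

// Sketch: a cube that is sensitive to its surroundings. Each frame it looks
// for the nearest Machine within a radius and exposes that machine so a menu
// can display its options.
public class MachineCube : MonoBehaviour
{
    public float sensingRadius = 1.5f;

    public Machine CurrentMachine { get; private set; }

    void Update()
    {
        CurrentMachine = null;
        float closest = float.MaxValue;

        // Find the closest machine within the sensing radius.
        foreach (Collider hit in Physics.OverlapSphere(transform.position, sensingRadius))
        {
            Machine machine = hit.GetComponentInParent<Machine>();
            if (machine == null) continue;

            float distance = Vector3.Distance(transform.position, machine.transform.position);
            if (distance < closest)
            {
                closest = distance;
                CurrentMachine = machine;
            }
        }
    }
}
```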