If you are just looking to have the buttons rotate with the user, you can edit the VRGUI system with the following steps:
At runtime, the system will create a VRGUI parent under
Hierarchy -> Equal Reality Tools -> Advanced GUIs -> VRGUI
This object is only created once it is needed, so you will need to have started a branching event before it exists.
However, if you create this object yourself, the system should just use the one you have created. It should just be an empty GameObject with two components:
- Auto Height Adjust.
Don't create buttons for it; the system handles them in a special way that I can go into detail on after I submit the next update.
If you create your own, you should be able to add a component to it called CopyTransform and have it copy the transform of the head object, which is equivalent to parenting it to the head. (Do not check the box to parent it to the target; it needs to be in the right place in the hierarchy to be recognised by the GUI system.)
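To illustrate what "equivalent to parenting it to the head" means, here is a minimal sketch of a copy-transform behaviour. This is not the actual Equal Reality CopyTransform component, just an assumed approximation of what it does based on the description above:

```csharp
using UnityEngine;

// Illustrative sketch only -- not the SDK's CopyTransform component.
// Copies the target's world transform every frame, which behaves like
// parenting this object to the target without moving it in the hierarchy.
public class CopyTransformSketch : MonoBehaviour
{
    public Transform target; // assign the head object here

    void LateUpdate()
    {
        if (target == null) return;

        // Match the target's world position and rotation each frame,
        // after the camera has updated for this frame.
        transform.position = target.position;
        transform.rotation = target.rotation;
    }
}
```

The point of this approach is that the VRGUI object stays where the GUI system expects it in the hierarchy while still moving with the head.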
This system described above is what is changing in the new update. The interfaces created will be more easily customizable and modular, so the answer above will be slightly different as of the next version, 0.017.004.
To complement this last answer, there were two other problems we needed to solve:
I wanted the buttons to track where I was looking.
The solution is to add a CopyTransform component to the VRGUI object that the previous answer suggested you make. (Parent it under Equal Reality Tools > Advanced GUIs in the scene hierarchy.)
Under Camera Rig > Camera (head) > Camera (eye), make an empty GameObject called HUDposition and move it within arm's reach on the Z axis. This will dictate how far away from you the buttons spawn.
On the CopyTransform component, copy the X and Y position, and the Y rotation.
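The steps above can be sketched in code to show what copying only those axes does. This is one plausible reading of the setup, not the SDK's implementation; the field name and the exact axis handling are assumptions:

```csharp
using UnityEngine;

// Illustrative sketch of the "copy X and Y position, and Y rotation" setup.
// Not the actual SDK component; names and details are assumptions.
public class FollowHudPositionSketch : MonoBehaviour
{
    public Transform hudPosition; // the HUDposition object under Camera (eye)

    void LateUpdate()
    {
        if (hudPosition == null) return;

        // Copy only the X and Y position: the GUI follows the user around
        // and matches their height, but keeps its own depth placement.
        Vector3 p = transform.position;
        transform.position = new Vector3(
            hudPosition.position.x, hudPosition.position.y, p.z);

        // Copy only the Y (yaw) rotation: the GUI turns with the user's
        // gaze but does not pitch or roll when they tilt their head.
        transform.rotation = Quaternion.Euler(
            0f, hudPosition.eulerAngles.y, 0f);
    }
}
```

Copying only the yaw is what keeps the buttons readable: they rotate to face you as you turn, without tumbling when you look up or down.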
The buttons in this version are not designed to be constantly localized to your head's position. However, the next version will have the option to customize the interface, which can support localizing the buttons' position.
A few things to be aware of, as traveling buttons come with a lot of potential issues:
- Position-localized buttons are very dependent on the user's arm length. If the user's arm length is not known accurately enough, it is easy to make the interface unusable or uncomfortable.
- Buttons stuck to a HUD or localized to the position of your head are difficult for many users to understand and use. This does not reflect how things behave in the natural world, and it becomes off-putting when the interface moves as you lean forward to reach it.
- Buttons can be harder to read, because you cannot reposition yourself to get a better angle on them.
Feel free to mark this as resolved after you test an SDK update that satisfies what you are after here.