The September newsletter from High Fidelity appeared at the end of that month, with Chris Collins highlighting some of the recent work, providing an update on particle effects, procedural textures and – most interestingly – avatar kinematics and in-world object manipulation using an avatar’s hands, driven by suitable controllers.
Procedural textures allow complex, algorithm-based textures to be created using tools such as ShaderToy and then used directly within High Fidelity. Brad Davis has created a video tutorial on procedural entities, which Chris references in the newsletter. The write-up also follows a short video released on the High Fidelity YouTube channel which briefly demonstrates procedural textures in HiFi.
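For those unfamiliar with the idea, a procedural texture is simply a colour computed per pixel from its coordinates (and, optionally, time), rather than read from an image file. ShaderToy-style textures are written in GLSL, but the principle can be sketched in a few lines of plain Python; the ripple-plus-checker pattern below is purely illustrative and not code from High Fidelity or ShaderToy.

```python
import math

def procedural_pixel(x, y, t=0.0):
    """Compute a colour purely from coordinates (0..1) and time,
    the way a fragment shader does -- no image data involved."""
    # Concentric ripples driven by distance from the centre.
    d = math.hypot(x - 0.5, y - 0.5)
    ripple = 0.5 + 0.5 * math.sin(40.0 * d - 4.0 * t)
    # A simple 8x8 checker pattern mixed in for variety.
    checker = (int(x * 8) + int(y * 8)) % 2
    v = 0.7 * ripple + 0.3 * checker
    return (v, v * 0.6, 1.0 - v)  # RGB, each channel in 0..1

def render(width=16, height=16, t=0.0):
    """Evaluate the 'shader' once per pixel of a small image."""
    return [[procedural_pixel((i + 0.5) / width, (j + 0.5) / height, t)
             for i in range(width)] for j in range(height)]

image = render()
```

Because everything is a function of position and time, the texture can be animated and re-rendered at any resolution with no stored image data – which is what makes the technique attractive for in-world use.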
However, it is the object manipulation that’s likely to get the most attention, together with avatar kinematics and attempts to imply a force when moving an object.
In terms of avatar kinematics, Chris notes:
In 2016, when the consumer versions of the HMDs are released, you are also going to be using a hand controller. It is therefore important that we can make your avatar body simulate correct movement with the hand data that we receive back from the controllers.
The results are shown in the newsletter in the form of some animated GIFs. In the first, Chris’ avatar is shown responding to a Hydra controller for hand movements and echoing his jaw movements. The second demonstrates object manipulation, with Chris’ avatar using its hand to pick up a block from an in-world game, echoing Chris’ motions using a hand-held controller.
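High Fidelity hasn’t published the details of its solver, but driving an avatar’s arm from a single hand position is classically done with inverse kinematics: given where the controller says the hand is, work backwards to plausible shoulder and elbow angles. A minimal two-bone analytic IK in 2D gives the flavour; the bone lengths and the whole setup here are illustrative assumptions, not HiFi’s actual implementation.

```python
import math

def two_bone_ik(target_x, target_y, upper=0.3, lower=0.25):
    """Analytic two-bone IK in 2D: given a hand target relative to the
    shoulder, return (shoulder_angle, elbow_angle) in radians.
    Bone lengths are illustrative placeholders."""
    d = math.hypot(target_x, target_y)
    # Clamp to the reachable range so the arm reaches toward
    # out-of-range targets instead of failing.
    d = max(abs(upper - lower), min(d, upper + lower))
    # Law of cosines gives the interior elbow angle (pi = straight arm).
    cos_elbow = (upper**2 + lower**2 - d**2) / (2 * upper * lower)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: direction to the target, offset by the triangle angle.
    cos_off = (upper**2 + d**2 - lower**2) / (2 * upper * d)
    shoulder = math.atan2(target_y, target_x) - math.acos(max(-1.0, min(1.0, cos_off)))
    return shoulder, elbow

def hand_position(shoulder, elbow, upper=0.3, lower=0.25):
    """Forward kinematics, used here to check the solver."""
    ex = upper * math.cos(shoulder)
    ey = upper * math.sin(shoulder)
    a = shoulder + (math.pi - elbow)  # lower-arm direction
    return ex + lower * math.cos(a), ey + lower * math.sin(a)
```

A full-body solution has to cope with joint limits, elbow roll and competing targets (head, hips, both hands), which is where the hard problems – and the occasional thumb-through-block glitch – come from.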
The animation in picking up the block may not be entirely accurate at this point in time – the block seems to travel through the avatar’s thumb as the wrist is rotated – but that isn’t what matters. The level of manipulation is impressive, and it’ll be interesting to see if this might be matched with things like feedback through a haptic-style device, so that users can really get a sense of manipulating objects.
The object manipulation element, together with attempts to imply a force when moving objects in-world, makes up a core part of the video accompanying the newsletter (which is embedded below). Again, this really is worth watching, as the results are both impressive and illustrate some of the problems High Fidelity are trying to solve in order to give virtual spaces greater fidelity.
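The newsletter doesn’t spell out how the implied force is modelled, but a common technique in VR object manipulation is to attach the held object to the controller with a damped spring, so heavier objects visibly lag behind the hand rather than snapping to it. The one-dimensional sketch below is an assumption-laden illustration of that idea, not High Fidelity’s code.

```python
def drag_step(pos, vel, target, mass=2.0, stiffness=60.0, damping=12.0, dt=1/90):
    """One semi-implicit Euler physics step dragging an object toward the
    controller position via a damped spring. Larger mass or lower
    stiffness makes the object feel heavier (it lags the hand more)."""
    force = stiffness * (target - pos) - damping * vel
    vel = vel + (force / mass) * dt
    pos = pos + vel * dt
    return pos, vel

# Move the hand target to 1.0; the object catches up over time.
pos, vel = 0.0, 0.0
for _ in range(600):  # ~6.7 seconds at 90 Hz
    pos, vel = drag_step(pos, vel, target=1.0)
```

Tuning the stiffness and damping per object is one way a sense of mass and resistance can be conveyed even without haptic hardware.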
Coupling object manipulation with implied force opens up a range of opportunities for things like in-world games, physical activities, puzzles, and so on. There’s also potential for learning and teaching, so it’ll be interesting to see how this aspect of the work develops.
The newsletter also promises that we’ll be seeing some further VR demo videos from High Fidelity in October, so keep an eye out for those as well.