2022 Puppetry project week #45 summary

Puppetry demonstration via Linden Lab – see below. Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

The following notes have been taken from chat logs and an audio recording of the Thursday, November 10th Puppetry Project meeting held at the Castelet Puppetry Theatre on Aditi. These meetings are generally held on alternate weeks to the Content Creation User Group (CCUG), on the same day / time (Thursdays at 13:00 SLT).

Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Project Summary

  • Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system.
    • Note that facial expressions and finger movements are not currently enabled.
    • Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which has yet to be addressed.
  • The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
  • Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
  • Nothing special beyond the project viewer is required to “see” Puppetry animations. However, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
  • There is now a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public-facing Kanban board listing public issues – those experiencing issues can also contact Wulf Linden.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Further Information

Meeting Notes

Viewer and Plug-in Updates

  • The puppetry team is working on updating the viewer and LEAP plug-in, and an update to the project viewer is likely to be released in week #46.
  • This viewer includes:
    • The ability to move the avatar pelvis.
    • Ability to stretch other bones – although this is awaiting testing at the time of writing. The reference frame scale is that of the normal puppetry targets, so the data would have to be scaled correctly; additional work is therefore required to provide a way for the plug-in to get the data necessary to know how to scale individual joint bones (e.g. change their parent-relative positions).
  • It still won’t be possible to clear puppetry target/config data; this remains on the team’s “to do” list.
  • Aura Linden noted that the new LEAP module initialises on-demand rather than via instantiation (as with puppetry). LL will provide demos of using the new module.

Kinect v2 Support

  • Simon Linden has been working on an experimental plug-in taking inputs from a Kinect v2 device.
  • He describes the code as being “pretty rough” and using only basic geometry, but it allows avatar elbows / arms to be moved around (a purely illustrative sketch follows this list).
  • This work in part utilises the data syntax described in OPEN-366 “Simplify Puppetry Configuration Through LEAP”, the new protocol proposed by Leviathan Linden as per previous meeting notes.
  • The code is not ready to be pushed to a public branch as yet, and doing so is somewhat dependent on feedback from developers / creators.
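
Simon’s code has not been published, so the following is purely illustrative: the “basic geometry” approach might amount to re-basing Kinect joint positions around the shoulders and scaling them into avatar-relative IK targets. The axis mapping, scale factor, and the send_ik_target() helper below are all assumptions, written in Python as with other LEAP plug-ins.

    # Purely illustrative - NOT Simon Linden's plug-in code. Maps Kinect v2
    # camera-space joints to puppetry IK targets using only basic geometry.
    # The axis mapping, scale factor and send_ik_target() helper are all
    # assumptions.

    def kinect_to_avatar_frame(pos, shoulder_centre, scale=0.5):
        """Convert a Kinect camera-space position (metres, Y up, Z out of the
        sensor) into a rough avatar-relative target (X forward, Z up)."""
        dx = pos[0] - shoulder_centre[0]
        dy = pos[1] - shoulder_centre[1]
        dz = pos[2] - shoulder_centre[2]
        return (-dz * scale, -dx * scale, dy * scale)

    def update_arms(joints, send_ik_target):
        """joints: dict of Kinect v2 joint name -> (x, y, z); send_ik_target:
        hypothetical helper that emits one puppetry IK target over LEAP."""
        centre = joints["SpineShoulder"]
        for kinect_joint, sl_joint in (("HandRight", "mWristRight"),
                                       ("HandLeft", "mWristLeft"),
                                       ("ElbowRight", "mElbowRight"),
                                       ("ElbowLeft", "mElbowLeft")):
            send_ik_target(sl_joint,
                           kinect_to_avatar_frame(joints[kinect_joint], centre))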

Avatar Constraints / Interactions

  • OPEN-368 “[Puppetry] [LEAP]: Location Constraints” – LL have indicated there is “much” within this Jira they would like to support “eventually”.
  • The feeling at the Lab is that constraints can “definitely” be improved – although what this may look like has yet to be properly determined. The general view is that there should be constraint data associated with a given skeleton, for example, so that a human-centric model is not simply imposed on the SL avatar.
  • A good portion of the meeting was given over to a general discussion of how best to handle puppetry and avatar animations – and the potential need to move away from canned animations towards a more direct means of avatar animation.
  • Avatar-object interactions are a potentially complex issue (e.g. how can an avatar accurately take and hold an in-world object – say, an apple – through puppetry? If the apple is a physical object, does it collide when held? Does it become an attachment? If the latter, how is this registered, and how is it properly released from the attachment system? etc.).
    • A suggestion for handling avatars holding objects is to have some form of temp-attach system, or to use a key frame motion (KFM) system to match the object’s position to the avatar’s hand, allowing the avatar to hold the object without directly “owning” it (thus also avoiding permission system issues).
  • Collisions also raise questions: avatar arms currently do not collide, and so would not under puppetry. So what about cases of simple interactions, such as flicking a light switch? These are not “proper” collisions per se, but are rather event-triggered; how can this be managed if there is no actual collision between the scripted object and an avatar’s arm / hand to trigger the associated event?

In Brief

  • It has been suggested that a version number be included in puppetry-related messaging, so that changed message formats are not read by viewer versions unable to parse them, thus reducing the risk of crashes during development / testing (a sketch follows this list).
  • It has been indicated that puppetry will eventually have LSL support for LEAP, although what form this will take and how the simulator will track things is still TBD, as animations are currently entirely viewer-side and untracked by the simulator.
  • There is concern that the potential of the puppetry project is not being fully understood by creators (and others), as it is being seen more as a “VR thing” than as a means to greatly improve avatar animations and their supporting systems / constraints, including the IK system.
  • How to manage network latency also formed a core discussion, together with making better use of the Havok physics sub-licence to allow the viewer to do a lot more of the work and simply stream the results through the simulator to other viewers.
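
On the first point above, the suggested version gating is simple to picture. A minimal sketch, assuming a hypothetical “version” field in each puppetry message – no actual field name or numbering scheme has been defined:

    # Hypothetical sketch of the suggested version gating; the "version"
    # field and the supported set are assumptions, not a defined format.
    SUPPORTED_PUPPETRY_VERSIONS = {1}

    def handle_puppetry_message(msg, apply_puppetry_data):
        """apply_puppetry_data: stand-in for the viewer's real handler."""
        if msg.get("version") not in SUPPORTED_PUPPETRY_VERSIONS:
            # Drop messages this viewer cannot parse, rather than risk a
            # crash during development / testing.
            return
        apply_puppetry_data(msg.get("data", {}))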

Date of Next Meeting

  • Thursday, December 8th, 2022, 13:00 SLT.

2022 Puppetry project week #43 summary

Puppetry demonstration via Linden Lab – see below. Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

The following notes have been taken from chat logs and an audio recording of the Thursday, October 27th Puppetry Project meeting held at the Castelet Puppetry Theatre on Aditi. These meetings are generally held on alternate weeks to the Content Creation User Group (CCUG), on the same day / time (Thursdays at 13:00 SLT).

Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Project Summary

  • Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system.
    • Note that facial expressions and finger movements are not currently enabled.
    • Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which has yet to be addressed.
  • The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
  • Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
  • Nothing special beyond the project viewer is required to “see” Puppetry animations. However, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
  • There is now a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public-facing Kanban board listing public issues – those experiencing issues can also contact Wulf Linden.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Further Information

Meeting Notes

Protocol Overhaul

At the previous meeting, Leviathan Linden noted the project team is going to overhaul the Puppetry/LEAP protocol. Since then:

OpenXR Support

Leviathan Linden asked for feedback on what the requested “OpenXR support” means to those requesting it – e.g.: is it to run an OpenXR app and have a VR experience in SL, or is it to run an OpenXR app as a plug-in to provide measurement input to Puppetry?

The general response was a mix of both:

  • To generally provide the means for “proper” hardware support for motion capture, such that puppetry isn’t just a “best guess” response via a webcam.
  • To allow for more accurate interactions between avatars and objects, eventually moving to full support for VR headsets and controllers (requiring the ability to interact with scripted devices – operating levers, controls, etc. – which could be correctly interpreted and acted upon by said scripts).

Currently, LL are more willing to consider OpenXR support as a part of the Puppetry work whilst regarding it as a potential step towards wider VR support in SL in the future.

Avatar Constraints / Interactions

The above question led to a broader discussion on avatar-to-avatar and avatar-to-object interactions starting with the avatar constraints / collision system.

  • As they are right now, avatar constraints and collisions within SL have been adequate for the platform, but lacking (collisions, for example, have no concept of an avatar’s arms / legs, limiting interactions between them and other objects).
  • OPEN-368 “[Puppetry] [LEAP]: Location Constraints” is a feature request outlining the benefits of overhauling the SL avatar constraints system to allow better interactions with objects, etc. This is currently open to those wishing to add further comments and feedback.
  • The question was raised as to how “fast” / reliable the required communications (including all the required bone interactions) could be made in order to ensure adequate / accurate response times with actions (e.g. so that when shaking hands, the hands of each avatar arrive at the same point at the same time, and are seen as shaking in both viewers).
  • Also discussed was determining how “reactions” might best be defined – could it be as “simple” as a pre-set animation?
  • One issue with this – interactions, OPEN-368, etc. – is that direct hooks from Puppetry to LSL had been seen as outside the scope of the project, simply because puppetry and the LEAP API are entirely viewer-side, while LSL is simulator-side. However, the discussion opened a debate on whether some means for this interaction should be provided, with two options being put forward:
    • Broadening the LEAP protocol, essentially using it to make the viewer scriptable with plug-ins that run on their own threads.
    • Providing a specific LSL function that would enable LSL to be able to communicate / interact with the LEAP protocol / JSON (as is the case with the RLV / RLVa APIs used by some third-party viewers).
    • Both of these approaches were seen as potentially “doable”, if beyond the intended scope of the puppetry project.
  • A further issue with interactions and bone tracking (which would be required for accurate avatar-based interactions) is that bone tracking via LSL is at best limited, if not non-existent; this raised the subject of possibly using attachment points as a proxy.
    • An additional problem here is whether or not it is possible to track the location of attachment points in 3D space relative to any animation the avatar is playing (e.g. if an animation causes the avatar to raise their arm, is it possible to check the position of the wrist point?). This is currently something of an unknown, as it would either:
      • Require the simulator to inject a lot of additional calculations for joint and attach positions;
      • Or require a new (optional) protocol where the viewer would simply supply its in-world positions at some frame rate – which would require some calculation overhead on the part of the viewer;
      • Or – given work is in hand to add the in-world camera position relative to the viewer, together with the avatar’s world orientation and look-at target – provide a straight dump of the animation mixdown together with the skeleton data, enabling the processing to be carried out in a module rather than the viewer.
  • As a result of these discussions, time has been requested to investigate the various options (which will likely include a determination of what, if anything, is to be included in the current project in terms of these additional capabilities).

Date of Next Meeting

  • Thursday, November 10th, 2022, 13:00 SLT.

2022 Puppetry project week #41 summary

Puppetry demonstration via Linden Lab – see below. Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

The following notes have been taken from chat logs and an audio recording of the Thursday, October 13th Puppetry Project meeting held at the Castelet Puppetry Theatre on Aditi. These meetings are generally held on alternate weeks to the Content Creation User Group (CCUG), on the same day / time (Thursdays at 13:00 SLT).

Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Project Summary

  • Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system.
    • Note that facial expressions and finger movements are not currently enabled.
    • Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which has yet to be addressed.
  • The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
  • Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
  • Nothing special beyond the project viewer is required to “see” Puppetry animations. However, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
  • There is now a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public-facing Kanban board listing public issues – those experiencing issues can also contact Wulf Linden.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Further Information

Meeting Notes

New Viewer Version – 6.6.3.575529 Dated October 12th

  • This viewer uses a different, more efficient data format for sending updates up to the region, and from the region to viewers (the general idea is illustrated after this list).
    • The new and old formats and viewers are not compatible; someone on the new project viewer will be unable to see puppetry rendered for someone using the older viewer version, and vice-versa.
    • It is hoped that severe breakages between viewer versions like this will be avoided going forward, but this change was deemed necessary.
  • This viewer also includes a crash (deadlock) fix, and puppetry animations should fade in / out when starting or explicitly stopping (animations may still stop abruptly should the LEAP plug-in crash, the data stream be lost, etc.).
  • Those self-compiling viewers with the puppetry code should ensure they are pulling the updated code from the 6.6.3.575529 (or later, as new versions appear) repositories.
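
LL has not published the wire format itself, but the general idea of a more compact encoding is easy to illustrate: quantising each quaternion component of a joint rotation from a float down to a 16-bit integer, for example, lets a full joint set fit within a single packet. The sketch below illustrates that technique only – it is not LL’s actual encoding:

    import struct

    # Illustration of quantised packing only - NOT LL's actual wire format,
    # which has not been published. One joint packs into 10 bytes: a 16-bit
    # joint id plus four 16-bit quaternion components.
    def pack_joint_rotations(rotations):
        """rotations: iterable of (joint_id, (x, y, z, w)), components in [-1, 1]."""
        out = bytearray()
        for joint_id, quat in rotations:
            clamped = (max(-1.0, min(1.0, c)) for c in quat)
            out += struct.pack("<H4h", joint_id,
                               *(int(round(c * 32767)) for c in clamped))
        return bytes(out)

    packet = pack_joint_rotations([(0, (0.0, 0.0, 0.0, 1.0)),
                                   (12, (0.5, 0.5, 0.5, 0.5))])
    assert len(packet) == 20  # two joints, 10 bytes each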

Protocol Overhaul

Leviathan Linden noted the project team is going to overhaul the Puppetry/LEAP protocol.

  • The intent is to replace all the current LEAP commands (“move”, “set_this”, “set_that”, etc.) with just two commands: “set” and “get” (sketched after this list).
  • On the “set” side:
    • It will be possible to set avatar joint transforms or specify IK targets, and also to set various configuration settings as necessary.
    • These set commands will be “incremental” in nature (so that changes can be made to reach the final state), and once set, they stay at the defined value until modified, cleared, or the plug-in “goes away”.
  • On the “get” side:
    • get_skeleton and any other get_foo commands (if used) will be replaced with {get: [skeleton, foo, …]}.
    • A message will be generated and sent back in response to the Get request, but the form of the message is still TBD.
  • Meanwhile, the viewer will only do IK for your own avatar, and will transmit the full parent-relative joint transforms of all puppeted joints through the server to other viewers; LL will also make it possible for a plug-in to simply supply full parent-relative joint transforms if desired (e.g. no IK, just play the data).
  • This overhaul will also provide:
    • A way to move the Pelvis. This will include both a pre-IK transform (which is just setting the Pelvis transform) and also a post-IK transform, in case the avatar is to be moved after setting all the joints.
    • A “terse” format for the LEAP/Puppetry protocol, simplifying some “set” commands to reduce data going over the LEAP data channel. It will be possible to mix these “terse” commands with long-form explicit commands.
  • Leviathan plans to break all of this work down into a set of Jira issues and place them on the kanban board for ease of viewing.
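
Put together, messages under the proposed scheme might look like the sketch below. Only the “set” / “get” command names and the {get: […]} list form are described above; the joint names are standard SL skeleton bones, but every payload field shown is a guess, and (as noted) the reply format is still TBD.

    # Hypothetical examples of the proposed two-command protocol; the
    # payload field names are assumptions.

    # Incremental "set": values persist until modified, cleared, or the
    # plug-in goes away. Joint transforms, IK targets and configuration
    # all travel through the one command.
    set_msg = {
        "set": {
            "mPelvis":    {"position": [0.0, 0.0, 0.1]},  # pre-IK pelvis move
            "mWristLeft": {"target":   [0.3, 0.2, 1.2]},  # IK target
            "config":     {"fade_time": 0.5},             # assumed setting
        }
    }

    # "get": get_skeleton and any other get_foo commands collapse into a list.
    get_msg = {"get": ["skeleton", "camera"]}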

The overall aim of this overhaul is to make the protocol more easily extensible in the future.

To the above, Simon Linden added:

The data stream is radically different than what we started with. Essentially your viewer will do the work for your avatar: send[ing] all data needed for your puppetry animations [so] the people seeing you just have to use those positions – no IK or significant processing. That should help out in the long run with crowds 

Example Script

Simon Linden has produced a simple example script that has been pushed to the LEAP repository:

  • It reads a JSON file and sends that puppetry data to the viewer.
  • Using it, it is possible to edit some values, save the JSON text file, and see the bones move in response.
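
As a rough idea of what such a plug-in involves: the sketch below reads a JSON file and forwards its contents to the viewer using LEAP’s length-prefixed LLSD framing. It is not Simon’s script (see the LEAP repository for that) – the “puppetry” pump name and the payload layout are assumptions, and it assumes the llsd Python package is installed.

    import json
    import sys

    import llsd  # Linden Lab's LLSD serialisation package (pip install llsd)

    def send(pump, data):
        """Frame a message for the viewer per the LEAP protocol: the length
        of the serialised LLSD, a colon, then the payload itself."""
        msg = llsd.format_notation({"pump": pump, "data": data})
        sys.stdout.buffer.write(str(len(msg)).encode() + b":" + msg)
        sys.stdout.buffer.flush()

    with open("pose.json") as f:  # the live-editable JSON pose file
        pose = json.load(f)

    send("puppetry", pose)        # the pump name here is an assumption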

In Brief

  • BUG-232764 “[PUPPETRY] [LEAP] Puppetry should be able to ‘Get’ and ‘Set’ avatar camera angle” has been raised to go with the protocol overhaul, and while it has yet to be formally accepted, has been viewed as a good idea by the Puppetry team.
  • Puppetry does not support physics feedback or collisions as yet, and work for it to do so is not on the short list of “things to do next”.
  • There is currently an issue of “near-clipping” when using puppetry in a first-person (e.g. Mouselook) view (so, for example, holding a hand up in front of your avatar’s face in Mouselook results in the hand being clipped and not fully rendering). This is believed to be an artefact of the viewer still rendering the head (even though it is unseen when in first-person view), and this interfering with rendering near-point objects like hands. The solution for this is still TBD.

Date of Next Meeting

  • Thursday, October 27th, 2022, 13:00 SLT.

2022 Puppetry Project weeks #36 and #38 summary

Puppetry demonstration via Linden Lab – see below. Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

The following notes have been taken from chat logs and audio recordings of the September 8th and September 22nd Puppetry Project meetings held at the Castelet Puppetry Theatre on Aditi. These meetings are:

  • Generally held on alternate weeks to the Content Creation User Group (CCUG), on the same day / time (Thursdays at 13:00 SLT).
  • A mixed Voice / text chat format – attendees are not obligated to use voice when asking questions, but will need to listen to voice to hear the entire meeting.

Notes in these summaries are not intended to be a full transcript of every meeting.

Project Summary

  • Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system.
    • Note that facial expressions and finger movements are not currently enabled.
    • Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which has yet to be addressed.
  • The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
  • Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
  • Nothing special beyond the project viewer is required to “see” Puppetry animations. However, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
  • There is now a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).

Further Information

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Summary of September 8th Meeting

Note: timing issues on my part meant I was unable to attend the first third of this meeting.

  • It is acknowledged that the current Puppetry viewer (viewer branch DRTVWR-558) is somewhat crashy and subject to some looping issues.
  • One aspect of Puppetry that should be highlighted is its ability to work alongside / in concert with existing SL animations – so you can be running a dance animation and still wave to a friend using puppeteering, without the two animations clashing.
  • It is acknowledged that to ensure some reasonable smoothness of movement and to prevent things like movement conflicts between joints, there will need to be a more formalised animation constraints system. The current plan is to make this configurable via XML.
  • It is also acknowledged that tracking in general needs to be tightened within the plug-in code.
  • Puppetry does not currently interact with the Havok physics system (puppetry is largely viewer-side; physics – with the exception of some special use sub-libraries – is largely simulator-side).
  • The protocols which are used server-side to support Puppetry are not set in stone at this point; cases which require additional messaging, etc. can be discussed with the Puppetry team members from the simulator / server side of LL (e.g. Rider and Simon Linden).
  • Direct avatar interactions (e.g. shaking / holding hands, swinging a tennis racket to strike a ball, etc.): the IK system could help enable this, but it would also require a lot more work on the avatar / world mapping system to be fully possible, and this work has yet to be tackled (if it is to be tackled as a part of this initial Puppetry work).
  • The project is, at this point, fairly open as to where it might go: these initial project meetings are geared towards developers who may be interested in contributing and pushing elements of the project forward (e.g. support for full body tracking, etc.). Obviously, at some point, constraints will be placed on what is to be initially delivered.

Plugins (Pros and Cons)

  • Requests were made for the Puppetry system to support OpenXR (as well as LEAP). It was indicated that OpenXR would be considered as a default if a suitable plug-in were to be developed and contributed to Linden Lab for proper vetting and formal inclusion in the viewer.
  • The fact that the Puppetry project is using plug-ins raised concerns over system security. Plug-ins are executable, and so if accepted to run, a malicious plug-in could do considerable harm to a person’s system.
    • LL is aware of this, and is actively trying to minimise risk as far as possible.
    • However, safety also lies with users – do not download viewers from unofficial sites / sites that cannot be trusted; do not accept and run plug-ins that are passed around through forums, etc.
  • The benefits of using plug-ins were summarised as:
    • Speed of internal development / testing: there is no need to run a complete viewer build process simply because a couple of lines of code have been changed in testing; only the plug-in needs to be updated.
    • Extensibility: plug-ins allow for more flexible support of additional creation tools, or for adding support for additional data formats (e.g. as with OpenXR) / hardware / programming languages (e.g. Python, C++, etc.).
    • Performance: using plug-ins allows additional processing such as webcam capture, processing, and translation to be handed off to processing threads separate from the viewer’s, preventing the viewer from losing performance by having to do the processing itself.
    • User assurance: removing things like the webcam controls to a plug-in that is not run by default as a de facto part of the viewer’s processing will (hopefully) remove fears about webcams somehow being used to “spy” on users.

Summary of September 22nd Meeting

  • It is hoped an updated version of the Puppetry Project Viewer will be available via the Alternate Viewers page in week #39 (commencing Monday, September 26th). This includes fixes and updates to the motion logic that should make avatar motion more predictable.
  • In terms of device support for puppeteering, any device that can be recognised as a joystick should be supportable within the Puppetry viewer (utilising the existing Joystick support options through Preferences) – although some refinement to the controls may be required via LL.
  • LSL support for puppeteering: nothing has been defined at present, but there are some ideas as to what might be needed / nice to have. It has been suggested LSL support is a subject for discussion at the next meeting.
  • Simon Linden has pushed a couple of capabilities:
    • A simple poser contained in a side branch of the LEAP repository. This reads a basic JSON file with bone positions (rotations) for all 133 bones in the avatar skeleton and sends it as LEAP data to the viewer for animating the avatar. This file can be live-edited, and is designed to help those working with puppeteering to experiment with it in an easy format – it will not be an end feature for the project (a conceptual example follows this list).
    • Added a further branch to the Puppetry viewer repository, called DRTVWR-558 Data Packing. This converts the data going from the viewer to the server and onwards into a more efficient format, allowing the full animation data set to be contained in a single packet for transmission.
      • However, this format is incompatible with the existing data format used within viewers built via DRTVWR-558; viewers built using the newer code will not be able to show puppeteering from viewers using the older format, and vice-versa.
      • Those involved in experimenting with Puppetry should therefore switch to the viewer using the updated data format, once this is made available through the Alternate Viewer page, as it will be replacing the current data format going forward.
  • Leviathan Linden has suggested that if LL can transmit all bone data in compressed format, then they may not need to send IK targets and have each viewer manage the IK for all avatars in a scene, but rather have the viewer run the IK for the user’s own avatar and then stream that avatar’s entire state, reducing the load on receiving viewers.
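
For reference, the poser’s input is conceptually just a map of bone names to rotations. The snippet below shows that general shape using real SL skeleton joint names; the exact field layout is an assumption, so check the LEAP repository side branch for the authoritative schema.

    import json

    # Conceptual shape only - the authoritative schema is in the LEAP
    # repository side branch. Quaternions here are (x, y, z, w).
    pose = {
        "mShoulderLeft": {"rotation": [0.0, 0.0, 0.707, 0.707]},
        "mElbowLeft":    {"rotation": [0.0, 0.3, 0.0, 0.954]},
        "mWristLeft":    {"rotation": [0.1, 0.0, 0.0, 0.995]},
    }

    with open("pose.json", "w") as f:
        json.dump(pose, f, indent=2)  # live-edit this file to move the bones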

Pelvis Movement / Full Body Tracking / OpenXR Support

  • There was initial discussion about supporting local joint offsets, and particularly offsetting the avatar pelvis to allow for subtle movements without actually moving the avatar.
    • This is somewhat similar to scripted animations, such as stands within an AO system – the avatar appears to step forward / back / walk in a circle, but it is not physically moving as far as the simulator is concerned; the motions are the result of the avatar pelvis being offset from its actual position as seen by the simulator, and the animations running based on that offset.
    • There was some initial confusion between this and physically moving the avatar; as such, it was suggested this be referred to as “pelvis movement”, rather than “offsetting joints / bones”.
  • Part of the reason for this discussion is because several non-Linden developers have been experimenting with partial and full-body tracking via OpenXR, and have found that not being able to move the pelvis within Puppetry can lead to issues of floating, etc., when an avatar kneels or crouches (as seen within existing SL animations) – the result of the legs being pulled up towards the pelvis, rather than the pelvis being moved towards the ground.
  • In addition, this work has noted:
    • If Second Life were to return the “full” appearance data for an avatar (i.e. after all mesh transforms, slider data, baked appearance information, etc., have been applied), rather than the “raw” skeletal appearance, better calculations could be made around the pelvis height from the floor.
    • The approach works equally well with partial body tracking via a Rift S headset, and full body tracking using Valve headsets and Kinect devices.
    • However, it currently uses Blender as a conduit for translating movement within an OpenXR rig to the Second Life puppeteering rig, and would benefit enormously from a dedicated OpenXR plug-in; the developers are willing to provide the data gathered from the work they have completed thus far to help facilitate this.
    • Separately to this, OPEN-363 “[Puppetry] [LEAP]: Add native OpenXR plugin” has been raised, but is (at the time of writing) awaiting review.
  • The above formed the nucleus of discussion for much of the meeting, with the ability to move the avatar pelvis now being seen as more of a priority requirement; Leviathan Linden indicated they will try to look specifically at this between now and the next meeting.

Date of Next Meeting