2023 SL Puppetry project week #28 summary

Puppetry demonstration via Linden Lab – see below. Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

The following notes have been taken from chat logs and audio recording of the Thursday, July 13th, 2023 Puppetry Project meetings. Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Project Overview

General Project Description as Originally Conceived

LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However since Puppetry’s announcement what I’ve been hearing from many SL Residents is: what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher fidelity collisions. 
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?

– Leviathan Linden

  • The project is rooted in the idea of “avatar expressiveness”, referenced in a February 2022 Lab Gab session with Philip Rosedale and Brad Oberwager, and officially introduced as Puppetry in August 2022 to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system.
  • Since that time the project has expanded in size, attempting to encompass improving SL’s (somewhat primitive) IK system; investigating and developing ideas for direct avatar-to-avatar and avatar-to-object interactions (“picking up” an apple; high-fives, etc.); providing enhanced LSL integration for animation control; broader hardware support; adoption of better animation standards, etc.
  • This has led to a change in approach for the project – see below for more.

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public facing Kanban board with public issues.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Resources

Change In Approach

  • The Puppetry User Group meetings have, until now, been held on Aditi (the Beta grid) at the Castelet Puppetry Theatre, commencing at 13:00 SLT, generally on alternate Thursdays to the Content Creation meetings, as listed on the Second Life Public Calendar.
  • As of July 13th, 2023, these meetings have been suspended until further notice.
  • This does not mean the project is being abandoned; it was noted during the meeting that, as several of those involved in the project attend other User Group meetings – notably, but not exclusively, the Tuesday Simulator User Group (SUG) meeting – discussions on Puppetry can continue within those meetings.
  • Explaining the decision, Simon Linden noted:
There’s definitely a lot of interested tech and possible features with [Puppetry]. [However] the original idea of doing real-time mocap on webcams was like opening Pandora’s box in terms of features and ideas, and also was a lot harder than we expected … in the end I think it’s better to work on some fundamental tech that can be used in a lot of other ways – like IK, streaming, figuring out how animation data can work with scripts, solving some challenges like just doing a decent hand-shake.

– Simon Linden, July 13th, 2023

  • Elements of Puppetry which have thus far been confirmed as continuing as WIP projects comprise (but are not necessarily limited to):
    • The real time animation streaming component of Puppetry (forming something of a hybrid between the LEAP <-> viewer work already undertaken, and Leviathan Linden’s work in streaming animation playback from one viewer, through the simulator and to other viewers without any direct interaction with the animations by the simulator).
    • IK Improvements and updates (see below).
    • Improved animation import support (see below).
  • There are also broader discussions going on in the Lab regarding possible further overhaul of the animation system.
  • Given the above decision regarding Puppetry meetings, this will be the last of my dedicated Puppetry summaries for the time being, but I will continue to report on Puppetry-related work as and when it is discussed at other meetings, such as SUG meetings and Content Creation User Group meetings.

Meeting Notes

Animation Import

  • One of the Puppetry expansions – improving / broadening animation import into Second Life – was spun off into its own project in June.
  • Notably with this work, LL is using Assimp (github: https://github.com/assimp/assimp), which supports some 30-40 animation formats (including the likes of .FBX and glTF), converting them to its own format for ease of import to multiple platforms.
  • The work is now in its own branch of the official viewer (DRTVWR-584, not currently ready for public consumption in any way).
  • This viewer uses the Assimp engine to read animation files, extracting data from them for preview and subsequent upload as an SL animation (see the sketch after this list). Animation imports it supports include:
    • BVH files, as per the current animation import within the viewer.
    • FBX format files.
    • Animations saved with the Mixamo skeleton, as supported by other tools.
  • The Mixamo element of the viewer is currently incomplete, but there is a focus on getting it wrapped up so the viewer can enter the project viewer pipeline for public testing at some point. When complete, it is hoped that importing an animation with a Mixamo skeleton from the likes of Blender or a tool like Rokoko Studio will work fairly seamlessly.
  • To help with imports, Aura Linden has included an option to scale motion on import, which might be further automated for improved ease-of-use among less experienced content creators.
    • If this process is automated, it will of course include a capability for manual override for those who are more experienced with animation creation and import.
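
For a flavour of what the viewer gets from Assimp, here is a minimal sketch using Assimp’s Python bindings (pyassimp) to load an animation file and walk its per-joint channels. The file name is invented, and the viewer uses the C++ Assimp library directly, so this illustrates the shape of the data rather than LL’s implementation; attribute names follow pyassimp’s conventions.

    import pyassimp

    # Load an animation file (hypothetical name). Assimp accepts .fbx,
    # .bvh, glTF and many other formats, normalising them all into the
    # same in-memory scene structure.
    with pyassimp.load("walk_cycle.fbx") as scene:
        print("animations found:", len(scene.animations))
        for anim in scene.animations:
            print("duration (ticks):", anim.duration,
                  "at", anim.tickspersecond, "ticks/sec")
            for channel in anim.channels:
                # Each channel keyframes one node (joint) with position /
                # rotation / scale tracks, which an importer can retarget
                # onto the SL skeleton.
                print("  joint:", channel.nodename,
                      "| position keys:", len(channel.positionkeys),
                      "| rotation keys:", len(channel.rotationkeys))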

Inverse Kinematics (IK) Updates

  • Leviathan Linden’s IK work has pretty much become a mini-project in its own right.
  • Most recently, he has been focused on implementing Forward And Backward Reaching Inverse Kinematics (FABRIK) – a widely used algorithm for computing new joint positions in a range of applications, including 3D modelling (a generic sketch follows this list).
  • This work has been in part a matter of trial-and-error; most recently, Leviathan has been fixing issues with where constraints are enforced in FABRIK as they affect the SL avatar, although he still has some more constraints to fix.
  • Fixing these issues has required additional visualisation / debugging tools, which he’s having to code for himself.
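
For those curious about the algorithm itself, below is a compact, unconstrained FABRIK solver in Python / numpy. It is a textbook sketch for illustration only – not Leviathan’s code, which differs precisely in how and where joint constraints are enforced.

    import numpy as np

    def fabrik(joints, target, tolerance=1e-3, max_iterations=20):
        """Solve an unconstrained IK chain with FABRIK.

        joints: (N, 3) array of joint positions, root first.
        target: (3,) desired end-effector position.
        Returns updated joint positions; the root stays fixed.
        """
        joints = joints.astype(float).copy()
        lengths = np.linalg.norm(np.diff(joints, axis=0), axis=1)
        root = joints[0].copy()

        # Unreachable target: straighten the chain toward it and stop.
        if np.linalg.norm(target - root) > lengths.sum():
            d = (target - root) / np.linalg.norm(target - root)
            for i in range(1, len(joints)):
                joints[i] = joints[i - 1] + d * lengths[i - 1]
            return joints

        for _ in range(max_iterations):
            # Backward pass: pin the end effector to the target, work in
            # toward the root, preserving bone lengths.
            joints[-1] = target
            for i in range(len(joints) - 2, -1, -1):
                d = joints[i] - joints[i + 1]
                joints[i] = joints[i + 1] + d / np.linalg.norm(d) * lengths[i]
            # Forward pass: re-pin the root, work back out to the tip.
            joints[0] = root
            for i in range(1, len(joints)):
                d = joints[i] - joints[i - 1]
                joints[i] = joints[i - 1] + d / np.linalg.norm(d) * lengths[i - 1]
            if np.linalg.norm(joints[-1] - target) < tolerance:
                break
        return joints

    # Example: a three-bone arm reaching for a nearby point.
    arm = np.array([[0.0, 0, 0], [0.3, 0, 0], [0.6, 0, 0], [0.9, 0, 0]])
    print(fabrik(arm, np.array([0.4, 0.5, 0.1])))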

Additional Notes

  • A further request was made for updating the Bento reference skeletons on the wiki, which are reported as being “broken”, per BUG-10981 “Test content for public Project Bento Testing wiki page” and this Content Creation Discord channel discussion. This will be chased internally at LL to see if action is being taken.

2023 SL Puppetry project week #23 summary

Puppetry demonstration via Linden Lab – see below. Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

The following notes have been taken from chat logs and audio recording of the Thursday, June 8th, 2023 Puppetry Project meetings. Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Meeting And Project Overview

  • The Puppetry User Group exists to provide an opportunity for discussion about the development of, and features for, the upcoming Second Life Puppetry project (see below for details), bugs, and feature ideas.
  • These meetings are conducted (as a rule):
    • On Aditi (the Beta grid) at the Castelet Puppetry Theatre, commencing at 13:00 SLT.
    • Those encountering issues in logging-in to Aditi should contact Second Life support for assistance.
    • Generally on alternate Thursdays to the Content Creation meetings.
    • Comprise a mix of text and voice – attendees can use text only, if preferred, but should enable voice in order to hear comments / responses given in voice.
  • These meetings are open to anyone with a concern / interest in the Puppetry project, and form one of a series of regular / semi-regular User Group meetings conducted by Linden Lab. Dates and times of all current meetings can be found on the Second Life Public Calendar, and descriptions of meetings are defined on the SL wiki.

General Project Description as Originally Conceived

LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However since Puppetry’s announcement what I’ve been hearing from many SL Residents is: what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher fidelity collisions. 
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?

– Leviathan Linden

  • Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system.
    • Note that facial expressions and finger movements are not currently enabled.
    • Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which has yet to be addressed.
  • The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
  • Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
  • Nothing beyond the project viewer is required to “see” Puppetry animations; however, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
  • There is a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).

Additional Work Not Originally In-Scope

  • Direct avatar / object / avatar-avatar interactions (“picking up” an apple; high-fives, etc.).
  • Animations streaming: allowing one viewer to run animations and have them sent via the simulator to all receiving viewers without any further processing of the animations by those viewers.
  • Enhanced LSL integration for animation control.
  • Adoption of better animation standards – possibly glTF.
  • Given the project is incorporating a lot of additional ideas, it is likely to evolve into a rolling development, with immediate targets for development / implementation decided as they are agreed upon, to be followed by future enhancements. As such, much of what goes into the meetings at present is general discussion and recommendations for consideration, rather than confirmed lines of development.

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public facing Kanban board with public issues.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Resources

Meeting Notes

  • The viewer remains at version 6.6.12.579958, issued on Thursday, May 11th.
    • This update includes access to Rider Linden’s experimental attachment point tracking & forwarding to the server feature.
    • It also includes various incremental improvements to handling puppetry, such as support for parsing binary LEAP data from the LEAP script.
  • Work has slowed a little due to Linden staff being out-of-office recently (hence why no meeting since May 11th), and personnel on the Puppetry project also working on other simulator projects.
    • This has notably impacted Leviathan Linden’s work on Inverse Kinematics (IK), which has had a knock-on impact slowing Rider Linden’s work on LSL support for driving puppetry.
    • However, progress has resumed on the IK work, although it is currently described as “not stable”, and problems remain to be solved in situations where a target position is too far away for a joint in the skeleton to reach, or where multiple joints (e.g. 5 or 6) are involved.
    • One issue that is proving difficult to handle is that the default avatar mesh joint weighting is incorrect along the forearm and wrist. What is required are two distinct joints at the forearm to bend the mesh correctly: a hinge at the elbow, plus a twist constraint along the forearm bone toward the wrist, rather than (as is currently the case) treating the wrist as a ball joint. This may be the subject of further internal discussion at LL as Leviathan gets more of the IK work nailed down; a sketch of this kind of twist / swing split follows this list.
  • WRT IK:
    • Leviathan is looking to solve the targeting issues first, then work back to ensure that there are no collisions between a limb and avatar body (e.g. reaching across the avatar’s body to pick something up, and the avatar’s elbow / part of the arm appears to go through the body).
    • Forward And Backward Reaching Inverse Kinematics (FABRIK) – a widely used algorithm for computing new joint positions in a range of applications, including 3D modelling – is the Lab’s route of choice; however, adapting to FABRIK is taking some trial and error.
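
As an aside on the forearm problem above, the usual fix is a “swing / twist” decomposition: split a rotation into the twist about the bone’s long axis (to be distributed along the forearm) and the remaining swing (kept at the joint). The sketch below shows the generic technique in Python / numpy; it is illustrative only, not viewer code.

    import numpy as np

    def quat_mul(a, b):
        # Hamilton product of two quaternions stored as (w, x, y, z).
        aw, av = a[0], a[1:]
        bw, bv = b[0], b[1:]
        return np.concatenate(([aw * bw - np.dot(av, bv)],
                               aw * bv + bw * av + np.cross(av, bv)))

    def swing_twist(q, axis):
        """Split unit quaternion q into swing * twist, where twist is the
        rotation about the unit vector `axis` (e.g. along the forearm)."""
        proj = np.dot(q[1:], axis) * axis   # vector part projected on axis
        twist = np.array([q[0], *proj])
        n = np.linalg.norm(twist)
        if n < 1e-9:
            # Pure 180-degree swing orthogonal to the axis: no twist part.
            twist = np.array([1.0, 0.0, 0.0, 0.0])
        else:
            twist /= n
        # Remove the twist from q; what remains is the swing.
        swing = quat_mul(q, np.array([twist[0], *(-twist[1:])]))
        return swing, twist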

Additional Notes

  • Aura Linden was present, but is now working on the Animation Importer project, which has been split off from the Puppetry project. Currently, animations can be imported from some tools / formats, but others aren’t working yet.
    • It was noted at the last meeting that for animation import, LL is looking towards using / supporting Assimp (github: https://github.com/assimp/assimp), which supports some 30-40 animation formats, converting them to its own format for ease of import to multiple platforms. Notably, it supports .FBX and glTF, so it fits with the Lab’s goal of utilising glTF for materials, mesh imports, etc.

Date of Next Meeting

  • Thursday, June 22nd, 2023, 13:00 SLT.

2023 SL Puppetry project week #19 summary

Puppetry demonstration via Linden Lab – see below. Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

Note: this has been updated to include comments made at the TPV Developer meeting on Friday, May 12th.

The following notes have been taken from chat logs and audio recording of the Thursday, May 11th, 2023 Puppetry Project meetings held at the Castelet Puppetry Theatre on Aditi. These meetings are generally held on alternate weeks to the Content Creation User Group (CCUG), on the same day / time (Thursdays at 13:00 SLT).

Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Project Summary

General Project Description as Originally Conceived

LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However since Puppetry’s announcement what I’ve been hearing from many SL Residents is: what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher fidelity collisions. 
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?

– Leviathan Linden

  • Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system.
    • Note that facial expressions and finger movements are not currently enabled.
    • Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which has yet to be addressed.
  • The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
  • Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
  • Nothing beyond the project viewer is required to “see” Puppetry animations; however, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
  • There is a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).

Additional Work Not Originally In-Scope

  • Direct avatar / object / avatar-avatar interactions (“picking up” an apple; high-fives, etc.).
  • Animations streaming: allowing one viewer to run animations and have them sent via the simulator to all receiving viewers without any further processing of the animations by those viewers.
  • Enhanced LSL integration for animation control.
  • Adoption of better animation standards – possibly glTF.
  • Given the project is incorporating a lot of additional ideas, it is likely to evolve into a rolling development, with immediate targets for development / implementation decided as they are agreed upon, to be followed by future enhancements. As such, much of what goes into the meetings at present is general discussion and recommendations for consideration, rather than confirmed lines of development.

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public facing Kanban board with public issues.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Further Information

Meeting Notes

Viewer Progress

  • An updated version of the viewer – version 6.6.12.579958 – was released on Thursday, May 11th.
    • This update includes access to Rider Linden’s experimental attachment point tracking & forwarding to the server feature.
    • It also includes various incremental improvements to handling puppetry, such as support for parsing binary LEAP data from the LEAP script.
  • Avatar attachment point tracking (per the TPVD meeting discussion on May 12th):
    • This allows the tracking of joints (using attachment points) using a script.
    • Using visible attachment points (i.e. those on the avatar, NOT any of the screen-based HUD attachment points) cuts down on the amount of data having to be handled at both ends.
    • The speed at which the attachment point movement is read back is such that it could not be exploited to create a copy of an animation with any real fidelity.
    • This is a deliberate move to ensure that animation creators are not left feeling uncomfortable about LSL animation tracking.
    • There are combined throttle / sleep time elements to tracking attachment points: the throttle limits the number of attachment points that can be tracked over a certain period of time; the script sleep time is designed to allow an animation to move those attachment points forward sufficiently before a further tracking record is made. Thus, it is next to impossible to track and record a coherent animation frame.
  • It was noted that previously, joint constraints had been hard coded in C++, but their configuration has been moved into a human-readable LLSD file which can be modified without rebuilding the viewer.
  • Two areas of focus going forward are:
    • Improving the Inverse Kinematics (IK) system within the viewer – something Leviathan Linden is already working on. This will include overall improvements to IK constraints as well as to positioning, with the existing source-code constraints replaced by a further config file – “constraints” here being in terms of joint rotation / movement.
    • Providing .FBX animation file import and Mixamo skeleton re-targeting.
  • The IK work is still being thrashed out (and subject to much more discussion at meetings), but is seen as a priority over other elements of work, such as the animation streaming idea Leviathan Linden had been working on. The hope is that improving IK will play into streaming and “live” animations a lot more robustly and smoothly. It is also seen as a foundational piece of work for further opening up puppetry and animation work.

General Notes

  • It was noted that for animation import, LL is looking towards using / supporting Assimp (github: https://github.com/assimp/assimp), which supports some 30-40 animation formats, converting them to its own format for ease of import to multiple platforms. Notably, it supports .FBX and glTF, so it fits with the Lab’s goal of utilising glTF for materials, mesh imports, etc.
  • [TPVD meeting, May 12th] This will not alter the existing internal format for animation. It is just to allow the import of other formats.
  • It is acknowledged that alongside of that, the Lab will require a retargeting system for animations, although what form this will take is still TBD.
  • The core of the meeting was a general discussion of things that might / could be done in the future, and which technologies LL might look towards.

Date of Next Meeting

  • Thursday, May 25th, 2023, 13:00 SLT.

2023 SL Puppetry project week #15 summary

Puppetry demonstration via Linden Lab – see below. Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

The following notes have been taken from chat logs and audio recording of the Thursday, April 13th, 2023 Puppetry Project meetings held at the Castelet Puppetry Theatre on Aditi. These meetings are generally held on alternate weeks to the Content Creation User Group (CCUG), on the same day / time (Thursdays at 13:00 SLT).

Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Project Summary

General Project Description as Originally Conceived

LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However since Puppetry’s announcement what I’ve been hearing from many SL Residents is: what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher fidelity collisions. 
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?

– Leviathan Linden

  • Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system.
    • Note that facial expressions and finger movements are not currently enabled.
    • Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which has yet to be addressed.
  • The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
  • Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
  • Nothing beyond the project viewer is required to “see” Puppetry animations; however, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
  • There is a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).

Additional Work Not Originally In-Scope

  • Direct avatar / object / avatar-avatar interactions (“picking up” an apple; high-fives, etc.).
  • Animations streaming: allowing one viewer to run animations and have them sent via the simulator to all receiving viewers without any further processing of the animations by those viewers.
  • Enhanced LSL integration for animation control.
  • Adoption of better animation standards – possibly glTF.
  • Given the project is incorporating a lot of additional ideas, it is likely to evolve into a rolling development, with immediate targets for development / implementation decided as they are agreed upon, to be followed by future enhancements. As such, much of what goes into the meetings at present is general discussion and recommendations for consideration, rather than confirmed lines of development.

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public facing Kanban board with public issues.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Further Information

Meeting Notes

Viewer Progress

  • An updated version of the project viewer is still “close” – it is currently awaiting clearance by QA.
  • This will include the attachment point tracking discussed in previous meetings (e.g., a step towards being able to “pick things up” in SL). The simulator support for this is already in place on the Aditi Puppetry regions.
  • However, when available, it will not include:
    • LSL animation control as yet (this has yet to be added to the simulator code anyway). Rider Linden believes he has a good protocol for single avatar animation, but would rather work on it some more.
    • Any IK improvements, as Leviathan Linden is still working on these.
    • Any extended LEAP API functionality (the added features for getting world position/orientation, lookat position/orientation, camera position/orientation/target). This will be coming in a future viewer update.
  • Another change in this release is the move to llsd_binary for the LEAP messaging protocol. This will be in the release notes, but to use it you will want to update to either 1.3.1 of llbase and/or 1.2.0 of llsd for Python scripts. Messages from LEAP scripts to the viewer will still work with the older Python libraries, but messages from the viewer to the script will not be parsed correctly.
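
For LEAP script authors, here is a minimal sketch of what the llsd_binary change looks like from the Python side, using the llsd package mentioned above. The length-prefix framing (byte count, colon, payload) reflects how LEAP messages are delimited on a plug-in’s stdin / stdout; the message contents themselves are invented for illustration.

    import sys
    import llsd

    def send(message: dict) -> None:
        # Serialise as binary LLSD (llsd >= 1.2.0 / llbase >= 1.3.1),
        # then frame it LEAP-style as "<byte-count>:<payload>".
        payload = llsd.format_binary(message)
        sys.stdout.buffer.write(b"%d:%s" % (len(payload), payload))
        sys.stdout.buffer.flush()

    def receive() -> dict:
        # Read the decimal byte count up to the ":" delimiter...
        length = b""
        while (ch := sys.stdin.buffer.read(1)) != b":":
            length += ch
        # ...then exactly that many bytes of LLSD. parse() sniffs the
        # binary header, which is why older llsd / llbase releases that
        # expect XML / notation fail on viewer-to-script messages.
        return llsd.parse(sys.stdin.buffer.read(int(length)))

    # Illustrative only, e.g.: send({"pump": "puppetry", "data": {}})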

Server-Side Work

  • The LSL function API has been published to the Content Creation Discord group (sorry, I’ve been asked by LL not to publish details on joining the server – if you are a content creator interested in joining it, please contact Vir Linden, or attend a Content Creation / Puppetry meeting and ask in person).
  • Getting attachment point positions has been given a throttle, in part to not make it trivial to use LSL to rip an animation, and in part to prevent the server from being overwhelmed. The latter rate of throttling is variable and can change as load increases / decreases. However, as Rider Linden noted, there will always be some delay and some disagreement about the actual position of the attachment point between LSL and all the observing viewers; as such, the function is not meant for high-fidelity use. Collision volumes on the attachment points would be a better solution in this respect, but that is functionality which is still down the line.
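
The simulator side is not public, but the behaviour described – a variable rate that tightens as load rises – suggests something like a token bucket. A hedged Python sketch of the idea:

    import time

    class VariableThrottle:
        """Token bucket with an adjustable refill rate (illustrative only)."""

        def __init__(self, rate_per_sec: float, burst: int):
            self.rate = rate_per_sec      # tokens added per second
            self.capacity = burst         # maximum stored tokens
            self.tokens = float(burst)
            self.last = time.monotonic()

        def set_rate(self, rate_per_sec: float) -> None:
            # The simulator would lower this as region load increases,
            # and raise it again as load eases.
            self.rate = rate_per_sec

        def allow(self) -> bool:
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1.0:
                self.tokens -= 1.0
                return True       # attachment point query proceeds
            return False          # over the throttle: deferred / dropped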

General Notes

  • Leviathan Linden’s work for streaming the full avatar animation state has stalled, due to it essentially hijacking the main puppetry data channel to send everything, even when not running a puppetry script, through LEAP. As such, Leviathan thinks it needs to be moved to its own experimental viewer.
  • Simon Linden’s work on allowing animation uploads of new/different formats has been decoupled from the Puppetry project’s codebase, and is now being built on the main viewer branch, allowing it to move forward without dependencies on Puppetry.
  • OpenXR support as a LEAP plug-in is still seen as desirable, since it would allow support for a broader range of devices. However, it is seen as a little more “down the road”, as there is some core infrastructure that needs to finish being vetted prior to work starting on this.

My thanks to Jenna Huntsman for the chat transcript from the meeting, and you can see her video recording of the session here.

Date of Next Meeting

  • Thursday, April 27th, 2023, 13:00 SLT.

2023 SL Puppetry project week #12 summary

Puppetry demonstration via Linden Lab – see below. Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

The following notes have been taken from chat logs and audio recording of the Thursday, March 23rd, 2023 Puppetry Project meetings held at the Castelet Puppetry Theatre on Aditi. These meetings are generally held on alternate weeks to the Content Creation User Group (CCUG), on the same day / time (Thursdays at 13:00 SLT).

Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Project Summary

General Project Description as Originally Conceived

LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However since Puppetry’s announcement what I’ve been hearing from many SL Residents is: what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher fidelity collisions. 
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?

– Leviathan Linden

  • Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system.
    • Note that facial expressions and finger movements are not currently enabled.
    • Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which has yet to be addressed.
  • The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
  • Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
  • Nothing beyond the project viewer is required to “see” Puppetry animations; however, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
  • There is a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).

Additional Work Not Originally In-Scope

  • Direct avatar / object / avatar-avatar interactions (“picking up” an apple; high-fives, etc.).
  • Animations streaming: allowing one viewer to run animations and have them sent via the simulator to all receiving viewers without any further processing of the animations by those viewers.
  • Enhanced LSL integration for animation control.
  • Adoption of better animation standards – possibly glTF.
  • Given the project is incorporating a lot of additional ideas, it is likely to evolve into a rolling development, with immediate targets for development / implementation decided as they are agreed upon, to be followed by future enhancements. As such, much of what goes into the meetings at present is general discussion and recommendations for consideration, rather than confirmed lines of development.

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public facing Kanban board with public issues.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Further Information

Meeting Notes

Viewer Progress

  • An updated version of the project viewer is still “close” – it did not make it past QA, but it is hoped it will be available soon.
  • Leviathan Linden is working on the IK system to try to make it robust enough to handle the kind of work Rider Linden is doing, but is not at the point of having anything ready for delivery into a viewer; the idea is to possibly have something ready by the viewer update after the one that is still waiting to be cleared for release.

Server-Side Work

  • Following the meeting, the current Puppetry test regions on Aditi were due to be updated with a simulator version which merges the server-side Puppetry code with the latest release of the simulator code.
  • Rider Linden is continuing to work on llSetAttachmentPoint, intended to allow avatars to “pick up” objects using puppetry. At the time of the meeting he was completing work on ANIM_POS/ROT_TARGET, which is intended to try to keep the attachment point directed at an object in-world (as opposed to a fixed location or a location relative to the avatar).
    • This uses a command to tell the viewer to carry out the necessary IK work to move an attachment point as close to a location as it can get, using the target object’s UUID / location as reported by the simulator.
    • The idea is that this functionality will work not only with hands / arms but also with other appendages (e.g. wings).
    • In theory, this should also allow avatars to touch attachment points on other avatars (e.g. holding hands); however, exactly how this works within the framework of the permissions system – in terms of others accepting / refusing any direct interaction through something like a dialogue request, as we see today – has yet to be worked out.
  • This led to a broader discussion on attaching objects to avatars, the core of which is summarised below.

Object Parenting vs. Temp Attachments

  • The idea of being able to use Puppetry to reach out and grasp a (suitably scripted) object in-world – for example, an apple, a bottle, or similar – raised questions on how the process will work.
    • Currently, “temp” attachments can be made to avatars (e.g. via an Experience), but this still requires the object being temporarily transferred to the avatar’s inventory (where it does not show up) and from there attached to the relevant attach point (e.g. a hand).
    • This is somewhat slow and cumbersome – particularly if you want to do something with the object (e.g. if it is a ball, throw it), as the object needs to be picked up, held, follow the throwing motion of the avatar’s arm, detach at the point of release, resume its status as a physical object in-world, have direction and velocity applied, and then move in the direction of the throw.
    • The suggestion was made that to simplify things, a concept of avatar-to-object parenting needs to be introduced to SL – so when the ball is picked up, it immediately becomes a child of that avatar – no need for the passing to inventory, attaching from there, detaching to inventory / deletion, etc., as seen with temp attachments.
  • Developing a hierarchy scheme for SL is already being mused over through the Content Creation User Group, so it was suggested this could help present a road to object parenting with avatars. However, as it will take some time for any hierarchy system to be developed and implemented – and given it falls outside of the Puppetry project – it might mean that existing mechanisms have to be accepted, even if they do place some limitations on what might be achieved until such time as a hierarchy system can be introduced.
  • As an alternative, it was suggested that the physics engine might be used to provide a degree of object parenting to an avatar:
    • The physics engine allows actions to be created, which are updated every sub-step; so in the case of a physical object, it should be possible to write an action that says, “Follow the hand of the avatar picking you up”.
    • Then, as long as the physics engine knows the position of the “holding” hand, the object could move with it; while there would be a small degree of physics lag, as long as the viewer knows to render the object at the avatar’s hand, rather than where the physics updates say the object is, this should not be visually noticeable (see the sketch after this list).
    • This approach would not require an explicit hierarchy system, but it would require the viewer to send updates on the avatar’s hand position to the simulator’s physics engine – which is available.
    • The idea is modelled on the old physics action that perhaps most famously allowed a beach ball to be grabbed and pushed / lifted up onto a table as part of the orientation process incoming new users once went through, and which can still be used today to push suitably scripted objects around.
    • If possible, this approach would also need some form of constraints and permissions (e.g. you don’t really want to make your entire building capable of being grabbed and shunted around willy-nilly).
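
To make the suggestion concrete, here is a rough sketch of such a per-sub-step “action” in Python. Every name is hypothetical – the simulator’s physics action API is not public – but it shows the shape of the idea: each sub-step, steer the held object toward the reported hand position instead of re-parenting it.

    import numpy as np

    class FollowHandAction:
        """Hypothetical physics action: run once per physics sub-step."""

        def __init__(self, obj, get_hand_position, stiffness=20.0):
            self.obj = obj                         # the grabbed object
            self.get_hand_position = get_hand_position  # fed by viewer updates
            self.stiffness = stiffness             # pull strength (1/s)

        def apply(self, dt: float) -> None:
            # Steer toward the hand rather than teleporting, so the object
            # remains a physical body; the small residual lag is hidden by
            # the viewer rendering the object at the hand.
            error = self.get_hand_position() - self.obj.position
            self.obj.velocity = error * self.stiffness
            self.obj.position += self.obj.velocity * dt

    # Usage sketch: created on grab (subject to constraints / permissions,
    # per the last bullet above), removed on release, at which point normal
    # physics (direction and velocity) resume.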

General Notes

  • There was a general conversation on Permissions within avatar-to-avatar interactions and where they need to fall.
    • As noted above, there will need to be some form of explicit granting of permission for Puppetry-generated hugs, handshakes, high-fives, etc.
    • However, concern was raised about permissions being needed for basic one-way interactions – such as pointing at someone. The concern here is that LookAt debug targets have become so conflated with ideas of “privacy” tools (“Stop looking at me, perv! You don’t have permission!”) that TPVs have surfaced the means to disable LookAts being sent by a viewer, so that people do not get shouted at if their LookAt cross-hairs happen to reside on another avatar. Thus, the worry was that if a similar kind of visual indicator is used for Puppetry, it might provoke a similar reaction.
    • The short answer to this was, no, it should not be an issue as avatar locations can already be obtained through LSL without the need for a visual indicator being generated by the viewer.

Date of Next Meeting

  • Thursday, April 13th, 2023, 13:00 SLT.

2023 SL Puppetry project week #10 summary

Puppetry demonstration via Linden Lab – see below. Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

The following notes have been taken from chat logs and audio recording of the Thursday, March 9th, 2023 Puppetry Project meetings held at the Castelet Puppetry Theatre on Aditi. These meetings are generally held on alternate weeks to the Content Creation User Group (CCUG), on the same day / time (Thursdays at 13:00 SLT).

Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Project Summary

General Project Description as Originally Conceived

LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However since Puppetry’s announcement what I’ve been hearing from many SL Residents is: what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher fidelity collisions. 
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?

– Leviathan Linden

  • Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system.
    • Note that facial expressions and finger movements are not currently enabled.
    • Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which has yet to be addressed.
  • The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
  • Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
  • Nothing beyond the project viewer is required to “see” Puppetry animations; however, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
  • There is a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).

Additional Work Not Originally In-Scope

  • Direct avatar / object / avatar-avatar interactions (“picking up” an apple; high-fives, etc.).
  • Animations streaming: allowing one viewer to run animations and have them sent via the simulator to all receiving viewers without any further processing of the animations by those viewers.
  • Enhanced LSL integration for animation control.
  • Adoption of better animation standards – possibly glTF.
  • Given the project is incorporating a lot of additional ideas, it is likely to evolve into a rolling development, with immediate targets for development / implementation decided as they are agreed upon, to be followed by future enhancements. As such, much of what goes into the meetings at present is general discussion and recommendations for consideration, rather than confirmed lines of development.

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public facing Kanban board with public issues.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Further Information

Meeting Notes

Viewer Progress

  • An updated version of the project viewer is due to be made available once it has cleared LL’s QA process. This includes:
    • Using the binary protocol for the LEAP module communication, with new logic which causes LEAP modules to only be loaded by the viewer when they are used.
    • The AgentIO LEAP module, which adds the ability to adjust the look-at target, viewer camera, and agent orientation.
    • Support for sending the joint positions of your avatar to the server, which are then available in LSL.
      • The code reports the post-animation location of attachment points, allowing the server to know where things like hands, wings, etc., are; this in turn allows LSL to query where an attachment point is in space and how it is rotated.
  • HOWEVER, the animation streaming code (see previous Puppetry meeting notes) will not be in the next viewer update.

Server-Side Work

  • The simulator code now has llGetAttachmentPointAnim() support, which should be recognised by the upcoming viewer update.
  • The Aditi puppetry regions are to be merged with the updated code so this can be tested.
  • While there has been some work completed on animation imports since the last meeting, there was nothing significant for LL to report on progress at this meeting.

General Notes

  • There is additional work going on to try to improve the IK system, with the aim of having the basics working better than is currently the case – better stability, etc. This work may appear in the viewer update after the one currently being prepared to go public.
  • Performance:
    • To prevent puppetry generating too much messaging traffic (UDP) between the viewer and simulator, a throttle is being worked on so that when the simulator is under a heavy load from multiple viewers running puppetry code, it can tell them all to tone down the volume of messages.
    • There will also be some switches and logic put into place that can be used when needed, helping to protect regions in case the load gets overwhelming.
    • A further suggestion was to ensure the simulator does not broadcast puppetry messages for avatars who are seated and not using the code (such as an audience at a performance), to further reduce the volume of messaging; this is viewed as a potentially good avenue of work to consider.
    • There is also a threshold in place – if an attachment point does not move beyond it, it is not considered as moved, which will hopefully also reduce the amount of messaging the simulator has to handle.
  • LSL Integration:
    • See: OPEN-375: “LSL Functions for reading avatar animation positions”.
    • This work is now paused. Rider Linden developed a proof of concept, but found that in order to better manipulate parameters within the constraints, a configuration file should be used. He is therefore refactoring the code to do this before proceeding further.
    • The configuration file will be called avatar_constraints.llsd and it will live alongside avatar_lad.xml in the character directory (a parsing sketch follows this list).
  • Questions were again raised on whether Puppetry is for VR / will enable the viewer to run VR.
    • It was again pointed out that while Puppetry lays more foundational work which could be leveraged for use with VR headsets, that is not the aim of the Puppetry project.
    • Providing VR headset support is a much broader issue, which would require the involvement of other teams from LL – Product, the Graphics Team, the viewer developers, etc.
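
Since avatar_constraints.llsd is plain LLSD, it can be read or tweaked with a few lines of Python and no viewer rebuild – which is the point of moving the constraints out of C++. Only the file name and location are confirmed above; the field names below are invented for illustration.

    import llsd

    # Parse the constraints file (location per the meeting notes).
    with open("character/avatar_constraints.llsd", "rb") as f:
        constraints = llsd.parse(f.read())

    # Assumed layout: a list of per-joint constraint maps. The real
    # schema may differ; this just shows the data is now editable
    # configuration rather than compiled-in C++.
    for joint in constraints:
        print(joint.get("joint_name"),
              "type:", joint.get("type"),        # e.g. hinge vs. ball
              "max_angle:", joint.get("max_angle"))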

Date of Next Meeting

  • Thursday, March 23rd, 2023, 13:00 SLT.