
The following notes have been taken from the chat log and audio recording of the Thursday, November 10th Puppetry Project meeting, held at the Castelet Puppetry Theatre on Aditi. These meetings are generally held on alternate weeks to the Content Creation User Group (CCUG), at the same day / time (Thursdays at 13:00 SLT).
Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.
Project Summary
- Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam, using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system (a minimal plug-in sketch follows at the end of this list).
- Note that facial expressions and finger movements are not currently enabled.
- Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth tracking through a webcam, which have yet to be addressed.
- The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
- Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
- Nothing beyond the project viewer is required to “see” Puppetry animations. However, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
- There is now a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).
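For those unfamiliar with how LEAP plug-ins drive Puppetry, the following is a minimal Python sketch of the general idea only: a plug-in streams length-prefixed LLSD notation messages to the viewer, each carrying a joint target. The pump name, joint name, and field layout used here are illustrative assumptions, not the confirmed puppetry schema – refer to the Puppetry wiki page (linked below) for the actual protocol.

```python
import math
import sys
import time

def leap_send(message: str) -> None:
    # LEAP frames each message as "<length>:<LLSD notation>" on stdout;
    # treat the exact framing and field names in this sketch as assumptions.
    sys.stdout.write(f"{len(message)}:{message}")
    sys.stdout.flush()

t = 0.0
while True:
    x = 0.2 * math.sin(t)  # side-to-side (2D) wave, metres – illustrative scale
    # 'puppetry' pump, 'mWristLeft' joint, and 'position' field are placeholders.
    leap_send(f"{{'pump':'puppetry','data':{{'mWristLeft':{{'position':[{x:.3f},0.35,0.10]}}}}}}")
    t += 0.1
    time.sleep(0.05)  # ~20 updates per second
```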
Bugs, Feature Requests and Code Submissions
- For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
- There is also a public-facing Kanban board of known issues – those experiencing issues can also contact Wulf Linden.
- Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:
- Discuss them with the Puppetry team and work with them to ensure a proper convergence of ideas.
- Be signed-up to the Lab’s contribution agreement in order for submitted code to be accepted for review / use:
- Contributor Agreement notes – SL wiki.
- Contributor Agreement FAQ – SL wiki.
- Code Contributor agreement – PDF form.
Further Information
- Introducing Second Life Puppetry – Linden Lab blog post (August 30th, 2022).
- Puppetry: How it Works – Second Life Knowledge Base.
- Second Life Puppetry wiki page.
- Second Life Public Calendar – meeting dates.
- Alternate Viewers page – for the latest version of the Puppetry viewer.
Meeting Notes
Viewer and Plug-in Updates
- The puppetry team is working on updating the viewer and LEAP plug-in, and an update to the project viewer is likely to be released in week #46.
- This viewer includes:
- The ability to move the avatar pelvis.
- The ability to stretch other bones – although this is awaiting testing at the time of writing. However, the reference frame scale is that of the normal puppetry targets, so the data has to be scaled correctly; additional work is therefore required to provide a way for the plug-in to obtain the data necessary to know how to scale individual joint bones (e.g. change their parent-relative positions) – see the sketch after this list.
- It still won’t be possible to clear puppetry target / config data, which remains on the team’s “to do” list.
- Aura Linden noted that the new LEAP module initialises on-demand rather than at instantiation (as with puppetry). LL will provide demos of using the new module.
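To visualise the bone-scaling problem described above, here is a hypothetical Python fragment: before a plug-in could stretch a bone, it would need per-joint reference lengths from the viewer – the missing data the team refers to. The joint names and reference lengths below are invented placeholders, not actual skeleton values.

```python
# Invented per-joint rest lengths (metres, parent-relative); in practice these
# would have to be supplied by the viewer – the missing piece noted above.
ASSUMED_REFERENCE_LENGTHS = {
    'mWristLeft': 0.25,
    'mElbowLeft': 0.28,
}

def scaled_offset(joint: str, raw_offset: list[float], stretch: float) -> list[float]:
    """Rescale a captured parent-relative offset so a stretch factor of 1.0
    reproduces the joint's assumed rest length in the puppetry frame."""
    ref = ASSUMED_REFERENCE_LENGTHS[joint]
    raw_len = sum(c * c for c in raw_offset) ** 0.5 or 1.0
    k = (ref * stretch) / raw_len
    return [c * k for c in raw_offset]

# e.g. stretch the left forearm 20% beyond its assumed rest length:
print(scaled_offset('mWristLeft', [0.10, 0.20, 0.05], 1.2))
```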
Kinect v2 Support
- Simon Linden has been working on an experimental plug-in taking inputs from a Kinect v2 device.
- He describes the code as being “pretty rough” and using only basic geometry, but it allows avatar elbows / arms to be moved around (a guessed-at sketch of this kind of geometry follows below).
- This work in part utilises the data syntax described in OPEN-366 “Simplify Puppetry Configuration Through LEAP”, the new protocol proposed by Leviathan Linden as per previous meeting notes.
- The code is not ready to be pushed to a public branch as yet, and doing so is somewhat dependent on feedback from developers / creators.
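As the Kinect code has not been published, the Python fragment below is purely a guess at the kind of “basic geometry” involved: re-expressing the sensor-space wrist position relative to the shoulder, then rescaling it from an assumed human arm length to an assumed avatar arm length. All joint names, lengths, and the source of the sensor readings are hypothetical.

```python
HUMAN_ARM_LENGTH = 0.60   # metres – assumed arm length as seen by the sensor
AVATAR_ARM_LENGTH = 0.55  # metres – assumed avatar arm length

def wrist_target(kinect_joints: dict) -> list[float]:
    """Express the wrist relative to the shoulder and rescale it to avatar
    proportions, so arms can be tracked without depth calibration."""
    shoulder = kinect_joints['ShoulderLeft']  # [x, y, z] in sensor space
    wrist = kinect_joints['WristLeft']
    offset = [w - s for w, s in zip(wrist, shoulder)]
    scale = AVATAR_ARM_LENGTH / HUMAN_ARM_LENGTH
    return [c * scale for c in offset]

# Example with made-up sensor readings:
print(wrist_target({'ShoulderLeft': [0.1, 1.4, 2.0], 'WristLeft': [0.5, 1.1, 1.9]}))
```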
Avatar Constraints / Interactions
- OPEN-368 “[Puppetry] [LEAP]: Location Constraints” – LL have indicated there is “much” within this Jira they would like to support “eventually”.
- The feeling at the Lab is that constraints can “definitely” be improved – although what this might look like has yet to be properly determined. However, the general feeling is that constraint data should be associated with a given skeleton, for example, so that a human-centric model is not simply imposed on the SL avatar (see the sketch at the end of this section).
- A good portion of the meeting was given over to a general discussion of how best to handle puppetry and avatar animations – and the potential need to move away from canned animations towards a more direct means of avatar animation.
- Avatar-object interactions are a potentially complex issue. For example, how can an avatar accurately take and hold an in-world object – say an apple – through puppetry? If the apple is a physical object, does it collide when held? Does it become an attachment? If the latter, how is this registered, and how is it properly released from the attachment system? Etc.
- A suggestion for handling avatars holding objects is to have some form of temp-attach system, or to use a key frame motion (KFM) system to match the object’s position to the avatar’s hand, allowing the avatar to hold the object without directly “owning” it (thus also avoiding permissions system issues).
- Collisions also raise questions: avatar arms currently do not collide, and so would not do so under puppetry. So what about simple interactions, such as flicking a light switch? These are not “proper” collisions per se, but are rather event-triggered; how can this be managed if there is no actual collision between the scripted object and an avatar’s arm / hand to trigger the associated event?
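By way of illustration only, per-skeleton constraint data as discussed above might look something like the following Python sketch. The skeleton types, joint names, and limit values are all invented; the actual design remains to be worked out under OPEN-368.

```python
# Invented per-skeleton joint limits – one possible shape for "constraint
# data associated with a given skeleton", rather than one human-centric set.
SKELETON_CONSTRAINTS = {
    'humanoid': {
        'mElbowLeft': {'axis': [0, 1, 0], 'min_deg': 0, 'max_deg': 150},
        'mKneeRight': {'axis': [1, 0, 0], 'min_deg': 0, 'max_deg': 140},
    },
    # A non-human rig could ship its own limits instead of inheriting human ones:
    'avian': {
        'mWingElbowLeft': {'axis': [0, 0, 1], 'min_deg': -30, 'max_deg': 170},
    },
}

def clamp_angle(skeleton: str, joint: str, angle_deg: float) -> float:
    """Clamp a requested joint angle to the limits of the active skeleton."""
    c = SKELETON_CONSTRAINTS[skeleton][joint]
    return max(c['min_deg'], min(c['max_deg'], angle_deg))

print(clamp_angle('humanoid', 'mElbowLeft', 175.0))  # -> 150.0
```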
In Brief
- It has been suggested that a version number be included in puppetry-related messaging, so that changed message formats are not read by viewer versions unable to parse them, thus reducing the risk of crashes during development / testing (see the sketch after this list).
- It has been indicated that puppetry will eventually have LSL support for LEAP, although what form this will take and how the simulator will track things is still TBD, as animations are currently entirely viewer-side and untracked by the simulator.
- There is concern that the potential of the puppetry project isn’t being fully understood by creators (and others), as it is being seen more as a “VR thing” than as a means to greatly improve avatar animations and their supporting systems / constraints, including the IK system.
- How to manage network latency also formed a core part of the discussion, together with making better use of the Havok physics sub-licence to allow the viewer to do a lot more of the work and simply stream the results through the simulator to other viewers.
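To illustrate the versioning suggestion, a receiver might simply drop any puppetry message whose version it does not recognise, rather than attempting (and possibly crashing on) an unfamiliar format. The field names in this Python sketch are placeholders, not part of any agreed protocol.

```python
SUPPORTED_VERSIONS = {1}  # versions this build knows how to parse

def handle_puppetry_message(msg: dict) -> None:
    if msg.get('version') not in SUPPORTED_VERSIONS:
        return  # silently ignore messages from newer / older test builds
    apply_joint_targets(msg['data'])

def apply_joint_targets(data: dict) -> None:
    # Hypothetical downstream handler, stubbed out for the sketch.
    for joint, target in data.items():
        print(f"apply {target} to {joint}")

handle_puppetry_message({'version': 1, 'data': {'mWristLeft': {'position': [0.1, 0.35, 0.1]}}})
handle_puppetry_message({'version': 2, 'data': {}})  # ignored without error
```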
Date of Next Meeting
- Thursday, December 8th, 2022, 13:00 SLT.