
The following notes were taken from the chat logs and audio recording of the Thursday, June 8th, 2023 Puppetry Project meeting. Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.
Meeting And Project Overview
- The Puppetry User Group exists to provide an opportunity for discussion about the development of, and features for, the upcoming Second Life Puppetry project (see below for details), bugs, and feature ideas.
- These meetings are conducted (as a rule):
- On Aditi (the Beta grid) at the Castelet Puppetry Theatre, commencing at 13:00 SLT.
- Those encountering issues in logging-in to Aditi should contact Second Life support for assistance.
- Generally on alternate Thursdays to the Content Creation meetings.
- Comprise a mix of text and voice – attendees can use text only, if preferred, but should enable Voice in order to hear comments / responses given in voice.
- These meetings are open to anyone with a concern / interest in the Puppetry project, and form one of a series of regular / semi-regular User Group meetings conducted by Linden Lab. Dates and times of all current meetings can be found on the Second Life Public Calendar, and descriptions of meetings are defined on the SL wiki.
General Project Description as Originally Conceived
LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However since Puppetry’s announcement what I’ve been hearing from many SL Residents is: what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher fidelity collisions.
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?
Leviathan Linden
- Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system.
- Note that facial expressions and finger movements are not currently enabled.
- Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which have yet to be addressed.
- The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
- Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
- No other special needs beyond the project viewer are required to “see” Puppetry animations. However, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
- There is a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).
Additional Work Not Originally In-Scope
- Direct avatar / object / avatar-avatar interactions (“picking up” an apple; high-fives, etc.).
- Animations streaming: allowing one viewer to run animations and have them sent via the simulator to all receiving viewers without any further processing of the animations by those viewers.
- Enhanced LSL integration for animation control.
- Adoption of better animation standards – possibly glTF.
- Given the project is incorporating a lot of additional ideas, it is likely to evolve into a rolling development, with immediate targets for development / implementation decided as they are agreed upon, to be followed by future enhancements. As such, much of what goes into the meetings at present is general discussion and recommendations for consideration, rather than confirmed lines of development.
Bugs, Feature Requests and Code Submissions
- For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
- There is also a public facing Kanban board with public issues.
- Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:
- Discuss them with the Puppetry team and work with them to ensure a proper convergence of ideas.
- Be signed-up to the Lab’s contribution agreement in order for submitted code to be accepted for review / use:
- Contributor Agreement notes – SL wiki.
- Contributor Agreement FAQ – SL wiki.
- Code Contributor agreement – PDF form.
Resources
- LEAP Source Code
- Viewer Source Code
- Alternate Viewers page – for the latest version of the Puppetry viewer.
- Documentation:
- Introducing Second Life Puppetry – Linden Lab blog post (August 30th, 2022).
- Second Life Puppetry wiki index page – note that more subjects / categories are / will be added over time.
Meeting Notes
- The viewer remains at version 6.6.12.579958, issued on Thursday, May 11th.
- This update includes access to Rider Linden’s experimental attachment point tracking & forwarding to the server feature.
- It also includes various incremental improvements to handling puppetry, such as support for parsing binary LEAP data from the LEAP script.
- Work has slowed a little due to Linden staff being out-of-office recently (hence why no meeting since May 11th), and personnel on the Puppetry project also working on other simulator projects.
- This has notably impacted Leviathan Linden’s work on Inverse Kinematics (IK), which has had a knock-on impact slowing Rider Linden’s work on LSL support for driving puppetry.
- However, progress has resumed on the IK work; while it is described as currently “not stable”, problems remain to be solved in situations where a target position is too far away for a joint in the skeleton to reach, or where multiple joints (e.g. 5 or 6) are involved.
- One issue that is proving difficult to handle is that the default avatar mesh joint weighting is incorrect along the forearm and wrist. What is required are two distinct joints at the forearm to bend the mesh correctly: a hinge at the elbow and a twist constraint along the forearm bone, toward the wrist, rather than (as is currently the case) treating the wrist as a ball joint. This may be the subject of further internal discussion at LL as Leviathan gets more of the IK work nailed down.
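A forearm twist constraint of the kind described above is commonly built on a “swing–twist” decomposition, which splits a joint rotation into the twist about the bone axis and the remaining swing, so the twist component can be distributed or limited separately. The sketch below is a generic illustration of that technique in Python; the function names and structure are my own assumptions, not the Lab’s code.

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def swing_twist(q, axis):
    """Split unit quaternion q into (swing, twist) with q = swing * twist,
    where twist rotates about the unit vector `axis` (e.g. the forearm bone)."""
    w, x, y, z = q
    # Project the rotation's vector part onto the twist axis.
    dot = x*axis[0] + y*axis[1] + z*axis[2]
    n = math.hypot(w, dot)
    if n < 1e-9:
        # Singular case (180-degree swing): twist is undefined, pick identity.
        twist = (1.0, 0.0, 0.0, 0.0)
    else:
        twist = (w/n, dot*axis[0]/n, dot*axis[1]/n, dot*axis[2]/n)
    # swing = q * conjugate(twist)
    tw, tx, ty, tz = twist
    swing = quat_mul(q, (tw, -tx, -ty, -tz))
    return swing, twist
```

With a decomposition like this, a rotation purely about the forearm axis comes back entirely as twist (swing is the identity), which is what lets the mesh skinning spread that twist along the forearm instead of bending everything at a ball-jointed wrist.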
- WRT IK:
- Leviathan is looking to solve the targeting issues first, then work back to ensure that there are no collisions between a limb and avatar body (e.g. reaching across the avatar’s body to pick something up, and the avatar’s elbow / part of the arm appears to go through the body).
- Forward And Backward Reaching Inverse Kinematics (FABRIK) – which is the fundamental algorithm for suggesting new joint positions in a range of applications, including 3D modelling – is the route of choice for the Lab; however, adopting FABRIK is taking some trial and error.
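For those unfamiliar with it, FABRIK works by alternating two passes along the joint chain: a forward pass that pins the end effector to the target and works back toward the root, and a backward pass that re-pins the root and works out to the tip, preserving bone lengths each time. The following is a minimal 2D sketch of that idea in Python; the joint representation and parameter names are my own assumptions, not the project’s implementation.

```python
import math

def fabrik(joints, target, tolerance=1e-3, max_iterations=20):
    """Move a chain of 2D joint positions (root first) toward `target`.
    Mutates and returns `joints`; bone lengths are preserved."""
    lengths = [math.dist(joints[i], joints[i + 1]) for i in range(len(joints) - 1)]
    root = joints[0]
    if math.dist(root, target) > sum(lengths):
        # Target unreachable: stretch the chain straight toward it.
        for i in range(len(joints) - 1):
            t = lengths[i] / math.dist(joints[i], target)
            joints[i + 1] = ((1 - t) * joints[i][0] + t * target[0],
                             (1 - t) * joints[i][1] + t * target[1])
        return joints
    for _ in range(max_iterations):
        # Forward pass: pin the end effector to the target, work back to the root.
        joints[-1] = target
        for i in range(len(joints) - 2, -1, -1):
            t = lengths[i] / math.dist(joints[i + 1], joints[i])
            joints[i] = ((1 - t) * joints[i + 1][0] + t * joints[i][0],
                         (1 - t) * joints[i + 1][1] + t * joints[i][1])
        # Backward pass: pin the root back in place, work out to the tip.
        joints[0] = root
        for i in range(len(joints) - 1):
            t = lengths[i] / math.dist(joints[i], joints[i + 1])
            joints[i + 1] = ((1 - t) * joints[i][0] + t * joints[i + 1][0],
                             (1 - t) * joints[i][1] + t * joints[i + 1][1])
        if math.dist(joints[-1], target) < tolerance:
            break
    return joints
```

The unreachable-target branch above is also where the “target too far away to reach” problem mentioned in the notes shows up: the basic algorithm simply straightens the chain, and any nicer behaviour (or joint limits, or avoiding self-collision) has to be layered on top.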
Additional Notes
- Aura Linden was present, but she is working on the animation importer project, which has been split off from the Puppetry project. Currently, animations can be imported from some tools / formats, but others aren’t working yet.
- It was noted at the last meeting that for animation import, LL is looking towards using / supporting Assimp (github: https://github.com/assimp/assimp), which supports some 30-40 animation formats, converting them to its own format for ease of import to multiple platforms. Notably, it supports .FBX and glTF, so it fits with the Lab’s goal of utilising glTF for materials, mesh imports, etc.
Date of Next Meeting
- Thursday, June 22nd, 2023, 13:00 SLT.