
The following notes were taken from the chat log and audio recording of the Thursday, January 12th, 2023 Puppetry Project meeting, held at the Castelet Puppetry Theatre on Aditi. These meetings are generally held on alternate weeks to the Content Creation User Group (CCUG) meetings, on the same day and at the same time (Thursdays at 13:00 SLT).
Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.
Project Summary
General description of the project and its inception:
LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as an official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However, since Puppetry’s announcement, what I’ve been hearing from many SL Residents is that what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher-fidelity collisions.
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?
Leviathan Linden
- Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system (a minimal LEAP plug-in sketch follows this list).
- Note that facial expressions and finger movements are not currently enabled.
- Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth tracking through a webcam, which has yet to be addressed.
- The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
- Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
- Nothing beyond the project viewer is required to “see” Puppetry animations. However, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
- This project is taking in a lot of additional ideas – animation standards, improving the current animation system, enabling truer avatar / avatar and avatar / object interactions – such that it is likely to evolve into a rolling development, with immediate targets for development / implementation as they are agreed upon, followed by future enhancements.
- As such, much of what goes into the meetings at present is general discussion and recommendations for consideration, rather than confirmed lines of development.
- There is a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).
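For those curious how a LEAP plug-in actually talks to the viewer, the sketch below shows the wire framing as I understand it: LEAP messages are length-prefixed, notation-serialised LLSD maps with “pump” and “data” keys, exchanged over the plug-in’s stdin / stdout. Note this is a minimal illustration only – the pump name and joint-data layout used here are my own assumptions, not the confirmed Puppetry interface; the real helper modules live in Linden Lab’s leap repository and the Puppetry wiki pages.

```python
#!/usr/bin/env python3
"""Minimal LEAP plug-in sketch (illustrative only, not the official module).

LEAP frames each message as "<length>:<payload>", where the payload is
notation-serialised LLSD with 'pump' and 'data' keys, exchanged over the
plug-in's stdin / stdout. The pump name and joint data below are assumed
for illustration only.
"""
import sys

def read_message():
    """Read one length-prefixed LEAP message from the viewer on stdin."""
    digits = []
    while True:
        ch = sys.stdin.read(1)
        if ch in ("", ":"):          # ':' ends the length prefix; '' is EOF
            break
        digits.append(ch)
    return sys.stdin.read(int("".join(digits) or "0"))

def send_message(pump, data):
    """Write one length-prefixed LEAP message to the viewer on stdout."""
    # Hand-built notation keeps the sketch dependency-free; a real plug-in
    # would serialise LLSD via the llsd / leap helper modules.
    payload = "{{'pump':'{}','data':{}}}".format(pump, data)
    sys.stdout.write("{}:{}".format(len(payload), payload))
    sys.stdout.flush()

# The viewer opens the session with an intro message naming our reply pump.
intro = read_message()

# Hypothetical example: offset the avatar's left wrist. Both the pump name
# "puppetry.controller" and the joint layout are assumptions.
send_message("puppetry.controller",
             "{'command':'move','move':{'mWristLeft':{'rotation':[0.0,0.0,0.1]}}}")
```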
Bugs, Feature Requests and Code Submissions
- For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
- There is also a public-facing Kanban board of issues – those experiencing problems can also contact Wulf Linden.
- Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:
- Discuss them with the Puppetry team and work with them to ensure a proper convergence of ideas.
- Be signed up to the Lab’s Contributor Agreement in order for submitted code to be accepted for review / use:
- Contributor Agreement notes – SL wiki.
- Contributor Agreement FAQ – SL wiki.
- Code Contributor agreement – PDF form.
Further Information
- Introducing Second Life Puppetry – Linden Lab blog post (August 30th, 2022).
- Puppetry: How it Works – Second Life Knowledge Base.
- Second Life Puppetry wiki page.
- Second Life Public Calendar – meeting dates.
- Alternate Viewers page – for the latest version of the Puppetry viewer.
Meeting Notes
LSL Integration
- See: OPEN-375: “LSL Functions for reading avatar animation positions”.
- Rider Linden is starting to look at LSL integration – the first step being to make the simulator aware of what is actually animating.
- Currently, the code he has developed lets the server know the position of an avatar’s attachment points; this sends details of 55 points (HUD points excepted). Attachment points have been selected over bones because the simulator already has a solid concept of attachment points, and this avoids complications with rigged meshes “doing their own thing” with bone positions.
- A concern with this is the number of updates being sent to the server for processing.
- One idea is to refine the code so that only the attachment points which change relative to the avatar centre (avatar frame / local position relative to the avatar) actually send information to the server, in order to reduce the number of updates being generated.
- Another idea might be to only send updates every n frames, rather than every frame. This would reduce the fidelity of movement, but could still provide sufficient data while reducing the load on the simulator, particularly where multiple avatars in a region are using Puppetry (both update-reduction ideas are sketched in the first code example at the end of these notes).
- This issue also relates to synchronising Puppetry actions across multiple viewers; a long-standing issue, given that animation playback is viewer-side and not genuinely synchronised across viewers (the resync function found in some TPVs only resyncs locally).
- All of the above led to a discussion of how best to allow LSL integration with animations while ensuring a reasonable transmission of results, together with decent synchronisation between the viewer and the simulator, whether by frame count or time stamp, so that results are predictable across multiple viewers (a timestamp-based sketch follows at the end of these notes).
- In addition, the discussion covered the advantage of enhancing Second Life to support procedural animations alongside the current canned animations (a toy illustration closes these notes).
- Rider is also looking into a script enhancement to register collisions.
- There was some conflating of ideas during the discussion – immediate first steps in opening Puppetry to LSL, and more far-reaching goals such as setting position, registering collisions (per the above), and defining better interpolation for positioning (e.g. as defined in the Khronos glTF specification) – which caused a degree of confusion.
- However, the openness towards making Puppetry a good foundation for future enhancement (such as moving more towards procedural animations, or enabling SL to support “industry standard” animation workflows to encourage animators onto the platform) remains, together with (hopefully) enabling more realistic avatar / avatar and avatar / object interactions.
- That said, Simon Linden did offer a note of caution to all discussing the work:
Not to pop the bubble, but everyone please keep in mind all the stuff we’ve talked about is experimental and really interesting. I have no idea what we can make into real features and what can work with crowds and all the other interesting problems to make it happen well – we’ll see what we all can do this year 🙂
– Simon Linden
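To make the update-reduction ideas above concrete, here is a minimal sketch in Python (purely illustrative – the production logic would live in the viewer’s C++, and all names and thresholds here are my own assumptions): attachment point positions are tracked relative to the avatar frame, an update is queued only when a point moves beyond a small tolerance, and the queue is flushed every n frames.

```python
import math

# Illustrative constants; real values would need tuning against server load.
MOVE_TOLERANCE = 0.01   # metres, in the avatar's local frame
FRAMES_PER_UPDATE = 4   # flush every n frames instead of every frame

class AttachmentPointReporter:
    """Sketch of both ideas above: report only attachment points that have
    moved relative to the avatar frame, and only every n frames."""

    def __init__(self):
        self.last_sent = {}   # point name -> last reported local position
        self.pending = {}     # point name -> newest local position
        self.frame_count = 0

    def observe(self, name, local_position):
        """Record a point's position, expressed relative to the avatar centre."""
        prev = self.last_sent.get(name)
        if prev is None or _distance(prev, local_position) > MOVE_TOLERANCE:
            self.pending[name] = local_position

    def tick(self, send_to_simulator):
        """Call once per frame; flushes the queue every FRAMES_PER_UPDATE frames."""
        self.frame_count += 1
        if self.frame_count % FRAMES_PER_UPDATE == 0 and self.pending:
            send_to_simulator(self.pending)
            self.last_sent.update(self.pending)
            self.pending = {}

def _distance(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
```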
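The frame count / time stamp question also lends itself to a sketch. One common approach (again an illustration, not the team’s confirmed design) is to stamp each update with the sender’s time and have every receiving viewer play it back at a fixed delay against that stamp, so all viewers apply the same update at effectively the same moment despite network jitter.

```python
import heapq
import itertools

PLAYBACK_DELAY = 0.2  # seconds of deliberate latency to absorb network jitter

class TimestampedPlayback:
    """Sketch of timestamp-based sync: buffer incoming puppetry updates and
    apply each one PLAYBACK_DELAY after its stamped send time, in order."""

    def __init__(self):
        self._queue = []               # min-heap of (timestamp, seq, update)
        self._seq = itertools.count()  # tie-breaker so updates never compare

    def receive(self, timestamp, update):
        heapq.heappush(self._queue, (timestamp, next(self._seq), update))

    def tick(self, now, apply_update):
        """Apply every buffered update whose stamped time is old enough."""
        while self._queue and self._queue[0][0] + PLAYBACK_DELAY <= now:
            _, _, update = heapq.heappop(self._queue)
            apply_update(update)
```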
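Finally, on procedural animations: where a canned animation plays back baked keyframes, a procedural animation computes joint state from time and inputs each frame. A toy illustration (the joint name and data layout are hypothetical, not SL’s skeleton API):

```python
import math

def breathing_pose(t):
    """Toy procedural animation: compute a chest offset from time alone,
    rather than reading keyframes from a canned animation asset."""
    breathe = 0.01 * math.sin(2.0 * math.pi * t / 4.0)  # 4-second cycle
    return {"mChest": {"position": (0.0, 0.0, breathe)}}
```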
Date of Next Meeting
- Thursday, January 26th, 2023, 13:00 SLT.