
The following notes were taken from chat logs and audio recordings of the Thursday, February 9th, 2023 Puppetry Project meeting, held at the Castelet Puppetry Theatre on Aditi. These meetings are generally held on alternate weeks to the Content Creation User Group (CCUG), on the same day and at the same time (Thursdays at 13:00 SLT).
Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.
Project Summary
General description of the project and its inception:
LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However since Puppetry’s announcement what I’ve been hearing from many SL Residents is: what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher fidelity collisions.
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?
Leviathan Linden
- Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system (a brief illustrative sketch of a LEAP-style plug-in is included at the end of this list).
- Note that facial expressions and finger movements are not currently enabled.
- Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth tracking through a webcam, which have yet to be addressed.
- The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
- Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
- Nothing beyond the project viewer is required to “see” Puppetry animations. However, using the capability to animate your own avatar and broadcast the results requires additional set-up – refer to the links below.
- This project is taking in a lot of additional ideas – animation standards, improving the current animation system, enabling truer avatar / avatar and avatar / object interactions – such that it is likely to evolve into a rolling development, with immediate targets for development / implementation as they are agreed upon, to be followed by future enhancements.
- As such, much of what goes into the meetings at present is general discussion and recommendations for consideration, rather than confirmed lines of development.
- There is a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).
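For those unfamiliar with LEAP, the sketch below (Python, purely illustrative) shows roughly how a plug-in might stream simple 2D joint targets to the viewer. It is not LL’s puppetry module: the length-prefixed LLSD notation framing reflects my reading of the LEAP documentation, while the pump name, joint name and data layout are invented placeholders.

```python
# Illustrative LEAP-style plug-in sketch (NOT the official puppetry module).
# Assumptions: LEAP messages are LLSD notation maps framed as "<length>:<payload>"
# on stdout; the pump name and data schema below are placeholders only, and the
# viewer's initial handshake on stdin is omitted for brevity.
import sys
import time

TARGET_PUMP = "puppetry.controller"   # placeholder pump name, for illustration

def llsd_notation(value):
    """Serialise a tiny subset of LLSD notation (maps, strings, ints, reals)."""
    if isinstance(value, dict):
        items = ",".join(f"'{k}':{llsd_notation(v)}" for k, v in value.items())
        return "{" + items + "}"
    if isinstance(value, str):
        return f"'{value}'"
    if isinstance(value, float):
        return f"r{value}"
    if isinstance(value, int):
        return f"i{value}"
    raise TypeError(f"unsupported type: {type(value)}")

def send(pump, data):
    """Frame one message as <length>:<payload> and write it to the viewer via stdout."""
    payload = llsd_notation({"pump": pump, "data": data})
    sys.stdout.write(f"{len(payload)}:{payload}")
    sys.stdout.flush()

def main():
    # Stream a (fake) 2D wrist target ~10 times per second; a real plug-in would
    # read these coordinates from a webcam-based tracker instead.
    t = 0.0
    while True:
        send(TARGET_PUMP, {"joint": "mWristRight", "x": 0.2, "y": 0.1 + 0.05 * (t % 1.0)})
        time.sleep(0.1)
        t += 0.1

if __name__ == "__main__":
    main()
```

Note the targets are x / y only, reflecting the 2D-plane limitation mentioned above.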
Bugs, Feature Requests and Code Submissions
- For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
- There is also a public-facing Kanban board listing public issues – those experiencing issues can also contact Wulf Linden.
- Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:
- Discuss them with the Puppetry team and work with them to ensure a proper convergence of ideas.
- Be signed up to the Lab’s contribution agreement in order for submitted code to be accepted for review / use:
- Contributor Agreement notes – SL wiki.
- Contributor Agreement FAQ – SL wiki.
- Code Contributor agreement – PDF form.
Further Information
- Introducing Second Life Puppetry – Linden Lab blog post (August 30th, 2022).
- Puppetry: How it Works – Second Life Knowledge Base.
- Second Life Puppetry wiki page.
- Second Life Public Calendar – meeting dates.
- Alternate Viewers page – for the latest version of the Puppetry viewer.
Meeting Notes
Animation Streaming
- Leviathan Linden has been experimenting with animation streaming over the viewer’s animation channel, such that whatever is sent from the controlling viewer is played directly by all receiving viewers without any further processing of the animations by those receiving viewers.
- This had been discussed in previous meetings as a potential means of lightening the load of animation data processing individual viewers would have to perform, reducing the potential performance impact in situations where animation synchronisation is important. It also lays something of a further foundation for more procedural-based animation processing, allowing viewers to work smarter – sending less data more frequently, which will in turn help enable synchronised animations through puppetry, such as spontaneously sharing a high five.
- This initial test, featuring just one viewer using puppetry and one receiving it, actually revealed a more noticeable lag in streaming compared to individual processing and playback of received animation data. It is not clear at this time whether this would worsen in situations where multiple puppetry animations are being streamed / received (a simplified sketch of the streaming idea is included at the end of this section).
- The videos were initially posted to the restricted-access Second Life Content Creation Discord server, and then combined into a single video by Kadah Coba, which is reproduced here as an animated GIF – my thanks to Kadah for the work in combining the videos.

- Leviathan notes this is a very quick and dirty test, requiring “some hackery” within the viewer’s animation code, but does not (as yet) require any changes to the server-side puppetry management code.
- Further refinement of the code is required, together with further testing to see if the approach can be smoothed / improved; as such, the work is not currently fit for integration into the Puppetry Project viewer, although it has been suggested it might be offered as a temporary, separate test viewer to allow broader testing.
- One potential issue is that the streaming is currently dependent on the performance of the originating viewer: if it is running at a low FPS, receiving viewers may see a “choppier” result, with the stream alternating between lagging and smooth animations.
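To make the distinction concrete, here is a very rough sketch (Python, illustration only; the actual work is in the viewer’s C++ animation code, and none of these names come from it) of the two approaches: each receiving viewer re-processing puppetry data itself, versus simply applying joint rotations already solved and streamed by the originating viewer.

```python
# Purely illustrative sketch of the two approaches discussed above; the actual
# viewer code is C++, and all names here are placeholders, not LL's API.
from dataclasses import dataclass

@dataclass
class JointRotation:
    joint: str          # e.g. "mShoulderRight" (placeholder)
    x: float            # quaternion components
    y: float
    z: float
    w: float

# --- Per-viewer processing (conceptually the current model): ------------------
# the originating viewer sends sparse puppetry targets and every receiving
# viewer runs its own IK solve, so results can drift out of sync.
def receive_targets_and_solve(targets, ik_solver):
    return ik_solver(targets)          # per-viewer work, per-viewer results

# --- Streaming (conceptually Leviathan's experiment): -------------------------
# the originating viewer solves once and streams the finished joint rotations
# on the animation channel; receivers just apply them, no further processing.
def apply_joint_rotation(rot: JointRotation):
    print(f"apply {rot.joint}: ({rot.x}, {rot.y}, {rot.z}, {rot.w})")

def receive_streamed_frame(frame):
    for rot in frame:                  # frame: list[JointRotation]
        apply_joint_rotation(rot)      # stand-in for the viewer's own playback

# Trade-off noted in the meeting: the receivers' smoothness is now tied to the
# originating viewer's frame rate, hence the "choppier" results seen in testing.
```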
LSL Integration
- See: OPEN-375: “LSL Functions for reading avatar animation positions”.
- Rider Linden has been working on an LSL API to control animation on an avatar.
Conceptually it would behave like LEAP from LSL. The simulator will send the animation instructions down to the targeted viewer which will perform the IK and animation, and then send the results back as though they were coming from any other LEAP plugin. Targeting objects should be possible with the API (although I’ll add a world position/rotation parameter so you don’t have to do the math yourself).
– Rider Linden
- This work is possibly best described as moving a step towards enabling an avatar using puppetry to reach out and pick up an apple, by allowing a script to position the avatar’s hand at the location of the apple, from where the user can use a supported capture tool to “pick up” the apple (a simplified sketch of the viewer-side IK step is included at the end of this section).
- The envisioned actions would be: the user moves their avatar’s arm towards the apple; the apple detects the collision between the avatar’s hand and itself, and attaches to the hand as if it had been directly picked up.
- Further work is required involving collisions between the apple and the avatar’s hand, so the apple knows it is being “grabbed”. This might be achieved by using an existing collision event as the trigger for attachment, or an entirely new event.
- One problem is to avoid having multiple extra collision objects bouncing around the physics engine for every single attachment point on an avatar (55 in total), which would add up, performance-wise, very quickly.
- One suggestion for mitigating this is that, as the region knows where your hand is (which is true with the attachment update stream), it could be possible to implement a new “grab” action that works in the physics simulation for picking up small objects; however, it would likely need some hint / magic on the viewer to render the object “at the hand” rather than “near the hand”.
- Beyond this, there is also additional work to allow avatar-to-avatar interactions via puppetry – such as the aforementioned high five – which involves addressing some permission issues.
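As a rough illustration of the viewer-side IK step Rider describes (the simulator sends an instruction such as “put the right hand at this position”, and the viewer works out the arm pose), the sketch below solves a simplified planar two-bone chain. It is purely illustrative: it is not LL’s solver, and the arm lengths, target position and frame of reference are invented for the example.

```python
# Minimal planar two-bone IK sketch (illustration only; not LL's solver).
# "Place the wrist at the apple": given a target in the shoulder's plane,
# compute shoulder and elbow bend angles using the law of cosines.
import math

UPPER_ARM = 0.26   # invented lengths, in metres
FOREARM   = 0.25

def solve_two_bone_ik(tx, ty, l1=UPPER_ARM, l2=FOREARM):
    """Return (shoulder_angle, elbow_angle) in radians for a planar two-bone chain."""
    # Clamp the target into the reachable annulus so acos() stays defined.
    d = min(max(math.hypot(tx, ty), abs(l1 - l2) + 1e-6), l1 + l2 - 1e-6)
    # Angle at the shoulder between the shoulder->target line and the upper arm.
    a = math.acos((l1 * l1 + d * d - l2 * l2) / (2.0 * l1 * d))
    # Interior angle at the elbow between upper arm and forearm.
    b = math.acos((l1 * l1 + l2 * l2 - d * d) / (2.0 * l1 * l2))
    shoulder = math.atan2(ty, tx) - a
    elbow = math.pi - b            # 0.0 means a straight arm
    return shoulder, elbow

def forward_check(shoulder, elbow, l1=UPPER_ARM, l2=FOREARM):
    """Forward kinematics: where does the wrist end up for these angles?"""
    ex, ey = l1 * math.cos(shoulder), l1 * math.sin(shoulder)
    wx = ex + l2 * math.cos(shoulder + elbow)
    wy = ey + l2 * math.sin(shoulder + elbow)
    return wx, wy

if __name__ == "__main__":
    # Hypothetical "apple" position, already converted into the shoulder's frame.
    shoulder, elbow = solve_two_bone_ik(0.30, 0.20)
    print("wrist lands at", forward_check(shoulder, elbow))  # approx (0.30, 0.20)
```

The scripted side of this (detecting the collision and attaching the apple, as per the bullets above) would live in LSL, using existing collision events and attach permissions or whatever new event the Lab settles on.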
In Brief
- Concern was raised that the emphasis on puppetry over traditional canned animation assets could make SL inaccessible for some (because of the need for additional motion capture hardware, a possible need for more powerful client computers, etc.). In response, those at the meeting pointed out:
- The approach being taken by the Lab is not new – it has been a common factor (albeit implemented in a variety of ways) within games for well over a decade, and is used in multi-player games without participants being “lagged” to the point where gameplay is broken.
- What is being proposed with Puppetry is not even “new” (in the broadest sense); rather, it is adding a further layer of animation capabilities to Second Life which can bring a greater sense of interactivity to the platform.
- In terms of hardware, it was further pointed out that while some at the meeting are using VR hardware – headsets and peripherals – all that is actually required to start leveraging the capabilities (as LL have demonstrated in the animated GIF forming the banner of this summary) is a basic webcam.
- In a more general conversation, it was pointed out by those at the meeting and the Lab engineers that:
- Whilst things like streaming puppetry animations may at times result in more visible lag / animation desynchronisation, it offers so much more in the way of avatar interaction with the world that it would be more than worthwhile.
- This work is purely about puppetry and interactivity; it does not actually alter the way more general animations – walking, standing, etc. – work, as the underpinning locomotion engine within the simulator and how the viewer calculates motion based on data from the simulator is not being altered.
- Instead, the LSL API (and the LEAP API?) will enable general avatar orientation and attachment point orientation / movement to ensure that the arm correctly reaches out to “grab” the apple mentioned above, by effectively running in conjunction with the locomotion engine.
Date of Next Meeting
- Thursday, February 23rd, 2023, 13:00 SLT.