2023 SL Puppetry project week #8 summary

Puppetry demonstration via Linden Lab – see below.  Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

The following notes have been taken from the chat log and audio recording of the Thursday, February 23rd, 2023 Puppetry Project meeting held at the Castelet Puppetry Theatre on Aditi. These meetings are generally held on alternate weeks to the Content Creation User Group (CCUG), on the same day / time (Thursdays at 13:00 SLT).

Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Project Summary

General Project Description as Originally Conceived

LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However since Puppetry’s announcement what I’ve been hearing from many SL Residents is: what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher fidelity collisions. 
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?

Leviathan Linden

  • Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam, and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system.
    • Note that facial expressions and finger movements are not currently enabled.
    • Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which have yet to be addressed.
  • The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
  • Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
  • Nothing beyond the project viewer is required to “see” Puppetry animations. However, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
  • There is a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).

Additional Work Not Originally In-Scope

  • Direct avatar / object / avatar-avatar interactions (“picking up” an apple; high-fives, etc.).
  • Animations streaming: allowing one viewer to run animations and have them sent via the simulator to all receiving viewers without any further processing of the animations by those viewers.
  • Enhanced LSL integration for animation control.
  • Adoption of better animation standards – possibly glTF.
  • Given the project is incorporating a lot of additional ideas, it is likely to evolve into a rolling development, with immediate targets for development / implementation decided as they are agreed upon, to be followed by future enhancements. As such, much of what goes into the meetings at present is general discussion and recommendations for consideration, rather than confirmed lines of development.

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public facing Kanban board with public issues.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Further Information

Meeting Notes

General Progress

  • LSL Integration:
    • See: OPEN-375: “LSL Functions for reading avatar animation positions”.
    • Rider Linden has not been able to get a lot done on the scripted control due to being out of the office. He does have the LSL function discussed in the last meeting correctly sending the necessary data down to the agent’s viewer.
    • He is now working on how to feed that into the IK, and has a general framework, although he notes progress has been slow.
  • Simon Linden has been working on animation importing. This is additional work in terms of the Puppetry project, but comes as a result of discussions at previous meetings.
    • He is looking to add additional .BVH support, and possibly .FBX (e.g. .FBX using some specific skeletons and settings; the goal is to be able to get data out of animation tools and into SL without requiring two years of Blender skills). Given the general move towards glTF, that format is seen as preferable (there is a possible appetite within LL for a re-write of the animation system, although it is not on the immediate horizon – or even a visible horizon at present).
    • Requests are still being made to allow animation priorities to be changed post-upload and edit animation values dynamically – it is not clear how much of this will be touched.
    • Changing the manner in which animation priorities currently work is not something LL are planning on touching.
    • Right now the messages that transmit what animations to play do not have a way to specify a priority, just the animation’s asset ID and the viewer will get the priority from the asset. This may change in the future, but the focus right now is on getting scripted animation control improved.
  • Leviathan Linden is continuing to work on animation streaming, but progress has been delayed due to bug hunting and fixing. However, he hopes to get the code into the Puppetry project viewer branch sooner rather than later. He has noted that streaming is very sensitive to bad framerate on the sender and on the simulator, which probably means that before animation streaming and/or puppetry could be “delivered”, some technical debt will need to be addressed, on the server at least.
  • The focus at the moment is on putting everything that has been worked on together and then making sure it all works within the viewer. After that comes the issue of making sure that things work between viewers (e.g. that 20 people running animation streaming in a scene does not result in viewers collapsing or being unable to play back all the streams), and ensuring the new capabilities play nicely with existing “canned animation” systems (e.g. dance machines, etc.).

In Brief

  • It’s been noted that moving the simulators to 64-bit is being worked on.

Date of Next Meeting

  • Thursday, March 9th, 2023, 13:00 SLT.

2023 SL SUG meetings week #8 summary

Moruya Sanctum, December 2022 – blog post

The following notes were taken from the Tuesday, February 21st, 2023 Simulator User Group (SUG) meeting. They form a summary of the items discussed and are not intended to be a full transcript. A video of the entire meeting is embedded at the end of the article for those wishing to review the meeting in full – my thanks to Pantera for recording it.

Server Deployments

  • There are no planned deployments for the week, so the various channels will just be restarted.
  • Release 578100, made to the BlueSteel RC channel, had to be rolled back post-deployment as a result of BUG-233402 “Second Life Server 2023-02-02.578100 – LSO scripts not running on_rez() event.”

Available Official Viewers

There have been no updates to the current crop of official viewers to mark the start of the week, leaving the pipelines as follows:

  • Release viewer: Maintenance Q(uality) viewer, version 6.6.9.577968 Thursday, February 2, 2023.
  • Release channel cohorts (please see my notes on manually installing RC viewer versions if you wish to install any release candidate(s) yourself).
  • Project viewers:
    • PBR Materials project viewer, version 7.0.0.578161, February 14, 2023. This viewer will only function on the following Aditi (beta grid) regions: Materials1; Materials Adult and Rumpus Room 1 through 4.
    • Puppetry project viewer, version 6.6.8.576972, December 8, 2022.

In Brief

  • Whilst not simulator-related, there have been some additional requests for improvements to the particle system – see: BUG-233438 “Larger particle size limits” and BUG-233439 “Per-generator particle limits”. It is possible these, and requests such as BUG-5307 “New Particle texture parameters (repeat/offset/rotation/animation)”, might receive some attention in the near future, although a request for more information on the first two has been made.
  • A possible reason why objects don’t always rez on log-in / following a teleport has been identified. Essentially, on arrival in a region, the viewer must inform the simulator of its camera placement and rotation, which it does via an AgentUpdate. However, this in turn requires the viewer to receive an ObjectUpdate confirming the avatar has arrived. Because of the delay between these two events, the interest list can start sending data before the camera position has been confirmed, only for the camera to “jump” once its position is confirmed. This leads to confusion as to the data the interest list needs to send, resulting in some data being missed, and thus objects failing to render. If this is correct, a better check on synchronisation between the viewer and simulator is needed before interest list information is sent.
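As a rough illustration of the kind of gating described above – holding interest list data back until the camera position has been confirmed – here is a minimal Python sketch. All names here are hypothetical; the actual viewer / simulator code is C++ and considerably more involved:

```python
# Hypothetical sketch: the simulator holds interest-list updates back
# until the viewer's AgentUpdate has confirmed the real camera position,
# rather than sending against a stale / default camera.

class RegionSession:
    def __init__(self):
        self.camera_confirmed = False
        self.pending = []   # interest-list updates held back
        self.sent = []      # updates actually sent to the viewer

    def queue_object_update(self, obj):
        # Only send once the camera position is known; otherwise queue.
        if self.camera_confirmed:
            self.sent.append(obj)
        else:
            self.pending.append(obj)

    def on_agent_update(self):
        # Viewer has reported its real camera placement: flush the queue.
        self.camera_confirmed = True
        self.sent.extend(self.pending)
        self.pending.clear()

session = RegionSession()
session.queue_object_update("tree")   # arrives before camera confirmation
session.on_agent_update()
session.queue_object_update("house")  # arrives after
print(session.sent)                   # → ['tree', 'house']
```

The point of the sketch is simply that nothing is transmitted against an unconfirmed camera, which is the synchronisation check the discussion above suggests is missing.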

2023 week 7: SL CCUG and TPVD meeting summaries: Mirrors!

Under the Northern Lights, December 2022 – blog post
The following notes were taken from:
  • My audio recording and chat log transcript of the Content Creation User Group (CCUG) meeting held on Thursday, February 16th 2023 at 13:00 SLT.
  • My chat transcript and the video recording of the Friday, February 17th TPV Developer’s meeting, recorded by Pantera Północy and embedded at the end of this article. My thanks, as always, to her for recording these meetings.
These meetings are for discussion of work related to content creation in Second Life, including current work, upcoming work, and requests or comments from the community, together with viewer development work. They are chaired by Vir Linden, and dates and times can be obtained from the SL Public Calendar. Notes:
  • These meetings are conducted in mixed voice and text chat. Participants can use either to make comments / ask questions or respond, but note that you will need Voice enabled to hear responses and comments from the Linden reps and others using it. If you have issues with hearing or following the voice discussions, please inform the Lindens at the meeting.
  • The following is a summary of the key topics discussed in the meeting, and is not intended to be a full transcript of all points raised.

Official Viewers Summary

Available Viewers

There have been no further updates to the currently available official viewers since the PBR Materials viewer was updated at the start of the week, as reported in my week #7 SUG meeting summary. Therefore the pipelines remain as follows:
  • Release viewer: Maintenance Q(uality) viewer, version 6.6.9.577968 Thursday, February 2, 2023.
  • Release channel cohorts (please see my notes on manually installing RC viewer versions if you wish to install any release candidate(s) yourself).
  • Project viewers:
    • PBR Materials project viewer, version 7.0.0.578161, February 14, 2023. This viewer will only function on the following Aditi (beta grid) regions: Materials1; Materials Adult and Rumpus Room 1 through 4.
    • Puppetry project viewer, version 6.6.8.576972, December 8, 2022.

General Viewer Notes

  • It is hoped that the Performance Floater RC viewer will be promoted to de facto release status within the week, which would allow all official viewers to leverage Visual Studio 2022 on Windows builds going forward.
  • There are some changes to be made on GitHub due to all the pull requests (PRs) going to branches which can change over time, causing issues as they do so. In the future, it is likely that PRs will go into the Main branch (which only changes on a per-release basis) and from there be moved into their intended branch.

CCUG – glTF Materials and Reflection Probes

Project Summary

  • To provide support for PBR materials using the core glTF 2.0 specification Section 3.9 and using mikkTSpace tangents, including the ability to have PBR Materials assets which can be applied to surfaces and also traded / sold.
  • To provide support for reflection probes and cubemap reflections.
  • The overall goal is to provide as much support for the glTF 2.0 specification as possible.
  • In the near-term, glTF materials assets are glTF scenes that don’t have any nodes / geometry; they only have the materials array, and there is only one material in that array.
    • It is currently too early to state how this might change when glTF support is expanded to include entire objects.
  • The project viewer is available via the Alternate Viewers page, but will only work on the following regions on Aditi (the Beta grid):  Materials1; Materials Adult and Rumpus Room 1 through 4.
  • Please also see previous CCUG meeting summaries for further background on this project.

Status

  • Viewer:
    • Work continues on bug fixes.
    • A major new bug is the discovery that the UV treatment is off-specification. This appears to be due to OpenGL putting the 0,0 coordinate in the lower-left corner of the image, rather than the top left as glTF specifies. This does mean that all PBR materials uploaded to Aditi (the beta grid) prior to the fix going into the viewer will effectively be “broken” post-fix.
    • The lighting model for water in the project viewer has been updated to use the glTF specification lighting model, so that reflection probes can be used to generate reflections on water. However, trying to adapt the “old” water shader to use the glTF lighting model is proving difficult, due to the “bonkers” way things like fresnel offset and scale have been implemented. This issue is to be addressed.
    • It is believed that most existing content should render reasonably faithfully under PBR / glTF, with the exception of the known issue of alpha blending on colour curves. Runitai Linden has a couple more ideas on how this might be improved but, overall, it might come down to having to explain that the colour space is changing for glTF, and that as a result some alpha blended content will need to be adjusted in order to render correctly.
    • As the Advanced Lighting Model (ALM) will be enabled all the time in the PBR viewer (the Forward renderer will be disabled), the viewer’s quality settings are being updated so that Shadows will be disabled by default across a much wider range of settings, as these are what cause the significant performance hit when ALM is enabled, rather than ALM itself (but Shadows can still obviously be manually enabled).
    • This viewer also causes instrumentation regressions within the Performance Floater viewer, which will likely be addressed when the code is ready to be merged with the release version of the viewer.
  • It is hoped that the simulator-side support can be deployed to an RC on the Main grid (Agni) in the near future in order to further advance viewer testing as that moves from project to RC status as well.
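The UV-origin mismatch noted above amounts to mirroring the V coordinate of every texture coordinate: OpenGL treats (0,0) as the bottom-left of a texture, while glTF specifies top-left. For normalised UVs the conversion is a one-liner, sketched here in Python as an illustration (this is the general transform, not the viewer’s actual fix):

```python
def flip_v(uv):
    """Convert a normalised (u, v) pair between top-left and
    bottom-left texture origins by mirroring the v axis; u is
    unchanged. Applying it twice returns the original value."""
    u, v = uv
    return (u, 1.0 - v)

print(flip_v((0.25, 0.0)))  # → (0.25, 1.0)
print(flip_v((0.5, 1.0)))   # → (0.5, 0.0)
```

Because the transform is its own inverse, content uploaded against the wrong origin can in principle be corrected by applying it once more – which is why pre-fix uploads render “broken” after the fix, not permanently lost.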

CCUG – Mirrors(!)

  • The “very next thing” LL plans to implement after PBR Materials reaches Release Candidate status is – mirrors!
  • These will be planar mirrors, so best suited to flat surfaces such as the face of a cube, rather than curved or spherical surfaces.
  • Mirrors will effectively be a real-time 1:1 rendering of what is seen within the scene being reflected, but with some limitations to cater for performance. The limitations / controls under discussion at the Lab include:
    • The mirror effect will only be generated in viewers that are very close to it.
    • Perhaps limiting the number of mirrors which can be active within a viewer to just one per scene (so if there are two mirrors close by your avatar, only one will be active at a time), or allowing the user to select the number of mirrors they wish to see “working” at any given time.
    • Adding a viewer Preferences option to enable / disable mirrors, depending on the user’s needs.
    • Nevertheless, even with precautions such as the above, there will be a performance impact in having real-time mirrors active in the viewer.
  • Mirrors will likely support control via LSL.
  • It is already being recommended that mirror surfaces are only used as mirrors, not as a means of generating “reflections” in general – which should be left to reflection probes / cube maps.
  • It is hoped that the mechanism for rendering reflections onto a mirror surface will use the same channels as reflection probes – so when a mirror is seen from a distance, it uses reflection rendering based on the local reflection probes; when approached, the probe rendering would fade out, and the real-time planar mirror reflection would fade in.
  • That said, precisely HOW real-time mirrors will work is still subject to discussion and planning: thus far, the focus has only been on ensuring the PBR work does not block opportunities for adding real-time reflections, and that the two will play nicely together when mirrors are developed.
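The distance-based cross-fade described above can be sketched as a simple blend weight: near the mirror, the real-time planar reflection dominates; beyond some distance, the probe-based reflection takes over, with a linear fade in between. The thresholds below are purely illustrative – LL has not published any numbers:

```python
def planar_mirror_weight(distance, near=2.0, far=10.0):
    """Blend factor for the real-time planar reflection: 1.0 when the
    camera is within `near` metres of the mirror, 0.0 beyond `far`,
    linearly faded in between. The probe-based reflection would be
    weighted by (1 - weight), so the two cross-fade smoothly."""
    t = (distance - near) / (far - near)
    return max(0.0, min(1.0, 1.0 - t))

print(planar_mirror_weight(1.0))   # → 1.0 (full planar mirror)
print(planar_mirror_weight(6.0))   # → 0.5 (mid cross-fade)
print(planar_mirror_weight(10.0))  # → 0.0 (probe reflection only)
```

This also illustrates why a per-viewer mirror cap is attractive: the expensive planar render only needs to run at all while the weight is non-zero.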

CCUG – Avatars / New Start Avatars / Ecosystem

  • A question was raised about the upcoming new mesh starter avatars previewed at SL19B in June 2022. These have yet to be released, and are not intended to compete with existing mesh avatars; LL hopes creators will help develop an ecosystem in support of the avatars as the devkits for them are released. There is no confirmed release date for the avatars.
  • The above led to a general discussion on the learning curves involved in getting to grips with avatar bodies and heads, trying to match heads to bodies, etc., the need for more discussions on avatar capabilities, and helping people understand the avatar content creation process so they can join the ecosystem.
  • There was agreement that more discussion is needed on avatar-related content creation; on real and perceived limitations of the avatar system – particularly rigging clothing and attachments and the reliance on additional toolsets (e.g. AvaStar, MayaStar, etc.); and on the supporting information available through the SL Wiki / Knowledge Base. See In Brief for more on discussions / potential new meetings.
  • There are internal discussions going on at the Lab concerning avatar physics, enabling the simulator to “know” more about the avatar and how it is being animated, having the simulator-side physics engine fully recognise the avatar body as a physical object (rather than just a simplified capsule), etc., via the likes of the Puppetry project and elsewhere, but solutions are still TBD.

In Brief

  • CCUG: Alpha blending issues on avatars – there was a general discussion on alpha stacking/ordering and blending issues, with Beq Janus’ blog post on the subject relating to avatars / outfits being referenced as a good primer on the issue and steps to mitigate problems.
  • TPVD: work is continuing on the Inventory thumbnails work, but nothing is ready for any form of public release.
  • TPVD: it has been suggested that LL might want to add code to the new Group Chat History functionality to indicate the end of historic Group chat within a Group chat tab / panel, as people appear to be getting confused as to why they are opening Group chat to find past conversations displayed (due to word about the new functionality taking time to spread).
  • TPVD: concern was raised that the allowance of lossless normal maps under PBR will lead to a lot of abuse, with people using the route to upload lossless textures as well, which it was feared would hit people’s VRAM. Runitai pointed out that lossless does not necessarily hit VRAM, but does impact caching and bandwidth. This sparked a general conversation on textures, resolution, quality, etc. However, the risk of people abusing the upload was acknowledged, and storage will be monitored for unexpected spikes in usage after the release of PBR.
  • TPVD: there was a discussion on viewer development in support of AAA game-style rendering; please refer to the video for details.
  • Both meetings: user on-boarding – at both the CCUG and the TPVD meeting it was suggested that there needs to be a regular user group meeting to discuss user on-boarding, engagement and retention and how to address these on an ongoing basis.
    • This led to a lengthy discussion on the issues of engagement and retention, which illustrated one of the core issues in just discussing it: everyone has a different opinion on what “the problem” is. Some see it as primarily an expense issue (the cost of creating a good-looking avatar); some see it as people being unable to find interesting things to do; some see it as being about performance / hardware / the overall appearance of SL.
    • The problem with the above (as demonstrated at the TPVD meeting particularly) is that it can lead to very siloed outlooks, where disagreements as to “the problem” become the focus of conversation, rather than agreement that all of these issues can play a role, and that solutions therefore need to be more “holistic” in nature, encompassing all of the perceived pain points.
    • It has been suggested that an upcoming CCUG or TPVD meeting could be utilised as a kick-off session for broader discussions about on-boarding, etc.

Next Meetings

  • CCUG: Thursday, March 2nd, 2023.
  • TPVD: Friday, March 16th, 2023.

2023 SL SUG meetings week #7 summary

The Arctic Sanctuary, December 2022 – blog post

The following notes were taken from the Tuesday, February 14th, 2023 Simulator User Group (SUG) meeting. They form a summary of the items discussed and are not intended to be a full transcript. A video of the entire meeting is embedded at the end of the article for those wishing to review the meeting in full – my thanks to Pantera for recording it.

Server Deployments

  • On Tuesday, February 14th, 2023, the simhosts on the Main SLS channel were restarted without any change to their simulator code, leaving them on release 577734.
  • On Wednesday, February 15th, 2023:
    • The majority of simhosts on the RC channel will be restarted without any version change.
    • The BlueSteel RC will be updated with release 578100, comprising:
      • New function llReplaceSubString(): find and replace instances of one substring with another string
      • New function llHMAC(): generate the HMAC hash of a message
      • New function llSignRSA(): generate an RSA signature of a message, given a private key
      • New function llVerifyRSA(): validate whether an RSA signature for a message is valid, given the public key
      • New parameters for llGetEnv(): “grid” and “region_rating”
      • Dozens of new parameters for llGetSimStats().

The BlueSteel deployment further includes a fix for BUG-233288 “Scripts do not operate properly under this new server version 577942”. This was the cause of the February 1st roll-back, as described in this official blog post.
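The string and HMAC additions in this release wrap standard primitives. As a rough illustration of the underlying operations, here is a Python sketch using only the standard library; the LSL signatures are as listed in the release notes above, while the hash algorithm shown here (SHA-256) is an assumption for illustration – consult the LSL wiki for what llHMAC actually accepts:

```python
import hashlib
import hmac

# llReplaceSubString-style behaviour: replace every instance of one
# substring with another string. Python's str.replace does the same.
result = "banana".replace("na", "NA")
print(result)  # → baNANA

# llHMAC-style keyed hash: an HMAC over a message with a shared
# secret, here using SHA-256 (assumed for illustration only).
digest = hmac.new(b"secret-key", b"message", hashlib.sha256).hexdigest()
print(len(digest))  # → 64 (hex characters for a 256-bit digest)
```

RSA signing and verification (llSignRSA / llVerifyRSA) follow the same pattern with asymmetric keys: the private key produces the signature, and anyone holding the public key can validate it.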

Available Official Viewers

On Monday, February 13th, 2023, the PBR Materials project viewer was updated to version 7.0.0.578161, February 14, 2023. This viewer will only function on the following Aditi (beta grid) regions: Materials1; Materials Adult and Rumpus Room 1 through 4.

The remaining official viewer pipelines stay as follows:

In Brief

  • The new Group Chat functionality currently available in the Maintenance R viewer (above) is also being picked up by some TPVs. The hope at LL is that once the capability has been available for a while, it will be improved (e.g. allowing Group owners to set their own parameters for it).
  • The code branch which provided the Group Chat history code also contains code which allows for text chat translation. If an external translator is correctly configured, the viewer can send a translation request on receipt of a foreign language group chat.  This has yet to be implemented.
  • LL have received requests from residents to be able to submit changes to the XML file used to provide the content for the hover tooltip when writing LSL scripts. This work may be carried out in the next quarter, and the file may be converted to JSON.
  • The SL wiki has been unavailable for some, apparently as a result of a CDN issue.
  • There was general discussion on LSL editing, key bindings, etc.

2023 SL Puppetry project week #6 summary

Puppetry demonstration via Linden Lab – see below.  Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

The following notes have been taken from chat logs and audio recording of the Thursday, February 9th, 2023 Puppetry Project meetings held at the Castelet Puppetry Theatre on Aditi. These meetings are generally held on alternate weeks to the Content Creation User Group (CCUG), on same day / time (Thursdays at 13:00 SLT).

Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Project Summary

General description of the project and its inception:

LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However since Puppetry’s announcement what I’ve been hearing from many SL Residents is: what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher fidelity collisions. 
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?

Leviathan Linden

  • Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam, and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system.
    • Note that facial expressions and finger movements are not currently enabled.
    • Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which have yet to be addressed.
  • The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
  • Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
  • Nothing beyond the project viewer is required to “see” Puppetry animations. However, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
  • This project is taking in a lot of additional ideas – animation standards, improving the current animation system, enabling truer avatar-avatar and avatar-object interactions – such that it is likely to evolve into a rolling development, with immediate targets for development / implementation decided as they are agreed upon, to be followed by future enhancements.
  • As such, much of what goes into the meetings at present is general discussion and recommendations for consideration, rather than confirmed lines of development.
  • There is a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public facing Kanban board with public issues – those experiencing issues can also contact Wulf Linden.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Further Information

Meeting Notes

Animation Streaming

  • Leviathan Linden has been experimenting with animation streaming over the viewer’s animation channel, such that whatever is sent from the controlling viewer is played directly by all receiving viewers without any further processing of the animations by those receiving viewers.
  • This had been discussed in previous meetings as a potential means of lightening the load of animation data processing individual viewers would have to perform, reducing the potential performance impact in situations where animation synchronisation is important. It also lays something of a further foundation for more procedural-based animation processing, allowing viewers to work smarter – sending less data more frequently, which will in turn help enable synchronised animations through puppetry, such as spontaneously sharing a high five.
  • This initial test, featuring just one viewer using puppetry and one receiving it, actually revealed a more noticeable lag in streaming compared to individual processing and playback of received animation data. It is not clear at this time whether this would worsen in situations where multiple puppetry animations are being streamed / received.
  • The videos were initially posted to the restricted-access Second Life Content Creation Discord server, and then combined into a single video by Kadah Coba, which is reproduced here as an animated GIF – my thanks to Kadah for the work in combining the videos.

Puppetry streaming test: top – animation played in one viewer (large image) with data sent for processing by a receiving viewer (inset). Bottom: the same animation played on the same viewer and then streamed to the receiving viewer (inset) and played on receipt without any additional animation processing.

  • Leviathan notes this is a very quick and dirty test, requiring “some hackery” within the viewer’s animation code, but does not (as yet) require any changes to the server-side puppetry management code.
  • Further refinement of the code is required, together with further testing to see if the approach can be smoothed / improved; as such, the work is not currently fit for integration into the Puppetry Project viewer, although it has been suggested it might be offered as a temporary, separate test viewer to allow broader testing.
  • One potential issue is that streaming is currently dependent on the reliability of the originating viewer: if it is running at a low FPS, receiving viewers may see a “choppier” result, with the stream alternating between lagging and smooth animations.

LSL Integration

  • See: OPEN-375: “LSL Functions for reading avatar animation positions”.
  • Rider Linden has been working on an LSL API to control animation on an avatar.
Conceptually it would behave like LEAP from LSL. The simulator will send the animation instructions down to the targeted viewer which will perform the IK and animation, and then send the results back as though they were coming from any other LEAP plugin. Targeting objects should be possible with the API (although I’ll add a world position/rotation parameter so you don’t have to do the math yourself).

– Rider Linden

  • This work is possibly best described as a step towards enabling an avatar using puppetry to reach out and pick up an apple: a script positions the avatar’s hand at the location of the apple, from where the user can use a supported capture tool to “pick up” the apple.
  • The envisioned sequence of actions would be: the user moves their avatar’s arm towards the apple; the apple detects the collision between the avatar’s hand and itself, and attaches to the hand as if it had been directly picked up.
  • Further work is required involving collisions between the apple and the avatar’s hand, so the apple knows it is being “grabbed”. This might be achieved by using an existing collision event as the trigger for attachment, or an entirely new event.
  • One problem is to avoid having multiple extra collision objects bouncing around the physics engine for every single attachment point on an avatar (55 in total), which would add up, performance-wise, very quickly.
    • One suggestion for mitigating this is that, as the region knows where your hand is (which is true with the attachment update stream), it could be possible to implement a new “grab” action that works in the physics simulation for picking up small objects; however, it would likely need some hint / magic on the viewer to render the object “at the hand” rather than “near the hand”.
  • Beyond this, there is also additional work to allow avatar-to-avatar interactions via puppetry – such as the aforementioned high five – which involves addressing some permission issues.
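Since the proposed LSL API would “behave like LEAP from LSL”, it may help to sketch the general shape of a LEAP exchange. LEAP messages are length-prefixed serialised payloads passed over the plugin’s stdin/stdout; the real protocol uses LLSD notation, but JSON stands in below so the sketch needs only the Python standard library, and the field names (“command”, “joint”, “target_position”) are entirely illustrative rather than the actual Puppetry schema:

```python
import json

def frame_leap_message(payload: dict) -> bytes:
    """Frame a message as <length>:<body>, the general LEAP framing style.
    JSON is used here as a stand-in for LLSD notation."""
    body = json.dumps(payload).encode("utf-8")
    return str(len(body)).encode("ascii") + b":" + body

# Hypothetical "reach for the apple" instruction: a joint name plus a
# target for the viewer-side IK to resolve. Values are made up.
msg = frame_leap_message({
    "command": "move",
    "joint": "mWristRight",
    "target_position": [0.42, -0.10, 1.05],  # illustrative coordinates
})
```

The point of the sketch is the flow, not the schema: the simulator (or a LEAP plugin) names a joint and a target, and the viewer performs the IK and animation, reporting the results back as though they came from any other LEAP plugin.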

In Brief

  • Concern was raised that emphasising puppetry over traditional canned animation assets could make SL inaccessible for some (because of the need for additional motion capture hardware, a possible need for more powerful client computers, etc.). In response, those at the meeting pointed out:
    • The approach being taken by the Lab is not new – it has been a common factor (albeit implemented in a variety of ways) within games for well over a decade, and is used in multi-player games without participants being “lagged” to the point where gameplay is broken.
    • What is being proposed with Puppetry is not even “new” (in the broadest sense); rather, it is adding a further layer of animation capabilities to Second Life which can bring a greater sense of interactivity to the platform.
    • In terms of hardware, it was further pointed out that while some at the meeting are using VR hardware – headsets and peripherals – all that is actually required to start leveraging the capabilities (as LL have demonstrated in the animated GIF forming the banner of this summary) is a basic webcam.
  • In a more general conversation, it was pointed out by those at the meeting and the Lab engineers that:
    • Whilst things like streaming puppetry animations may at times result in more visible lag / animation desynchronization, it offers so much more in the way of avatar interaction with the world that it would be more than worthwhile.
    • This work is purely about puppetry and interactivity; it does not actually alter the way more general animations – walking, standing, etc., work, as the underpinning locomotion engine within the simulator and how the viewer calculates motion based on data from the simulator is not being altered.
    • Instead, the LSL API (and the LEAP API?) will enable general avatar orientation and attachment point orientation / movement to ensure the arm correctly reaches out to “grab” the apple mentioned above, by effectively running in conjunction with the locomotion engine.
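The “arm reaches the apple” behaviour ultimately rests on inverse kinematics. For a flavour of the underlying maths, here is a minimal analytic two-bone IK solver in Python, reduced to a 2D plane; the viewer’s actual IK is a full 3D solver, and the lengths and coordinate convention here are purely illustrative:

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Analytic two-bone IK in a plane: given upper-arm length l1, forearm
    length l2, and a target (tx, ty) relative to the shoulder, return
    (shoulder_angle, elbow_bend) in radians placing the wrist on the target
    (clamped to maximum reach if the target is too far away)."""
    dist = math.hypot(tx, ty)
    dist = min(dist, l1 + l2 - 1e-6)  # can't reach beyond a straight arm
    # Law of cosines gives the interior elbow angle; bend = pi - interior.
    cos_elbow = (l1 ** 2 + l2 ** 2 - dist ** 2) / (2 * l1 * l2)
    elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder aims at the target, offset by the triangle's inner angle.
    cos_inner = (l1 ** 2 + dist ** 2 - l2 ** 2) / (2 * l1 * dist)
    shoulder = math.atan2(ty, tx) - math.acos(max(-1.0, min(1.0, cos_inner)))
    return shoulder, elbow
```

Given the two joint angles, forward kinematics (rotate each bone in turn) recovers the wrist position, which is how a viewer would then pose and render the arm each frame.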

Date of Next Meeting

  • Thursday, February 23rd, 2023, 13:00 SLT.

2023 SL SUG meetings week #6 summary

Angel Mist – The Cloud Garden, December 2022 – blog post

The following notes were taken from the Tuesday, February 7th, 2023 Simulator User Group (SUG) meeting. They form a summary of the items discussed and are not intended to be a full transcript. A video of the entire meeting is embedded at the end of the article for those wishing to review the meeting in full – my thanks to Pantera for recording it.

Server Deployments

  • On Tuesday, February 7th, 2023, the simhosts on the Main SLS channel were restarted without any change to their simulator code, leaving them on release 577734.
  • On Wednesday, February 8th, 2023, the RC channels will also be restarted without any version update.

Available Official Viewers

There have been no updates to the current list of available official viewers, leaving them as:

  • Release viewer: Maintenance Q(uality) viewer, version 6.6.9.577968, Thursday, February 2, 2023.
  • Release channel cohorts:
  • Project viewers:
    • PBR Materials project viewer, version 7.0.0.577997, February 2, 2023. This viewer will only function on the following Aditi (beta grid) regions: Materials1; Materials Adult and Rumpus Room 1 through 4.
    • Puppetry project viewer, version 6.6.8.576972, December 8, 2022.

In Brief

  • The announcement about the Group Chat History gave rise to a discussion on the capability and on making it more robust and deeper (e.g. by presenting more than just the last hour of group chat when used), together with general improvements to chat history management (timestamps, etc.). Please refer to the video for the full context.
  • BUG-229675 “Stopping llSetKeyframedMotion should always succeed and never shout an error” was again raised and noted as a not unreasonable request. Again, please refer to the video for further details.
  • Wednesday February 1st issues: a post-mortem on these was published on Thursday, February 2nd – please read it here for specifics.