2023 week #16: SL CCUG meeting summary (abbreviated)

OCWA Experience The Ocean, February 2023  – blog post

The following notes were taken from my audio recording and chat log transcript of the Content Creation User Group (CCUG) meeting held on Thursday, April 20th, 2023 at 13:00 SLT. 

These meetings are for discussion of work related to content creation in Second Life, including current work, upcoming work, and requests or comments from the community, together with viewer development work. They are usually chaired by Vir Linden, and dates and times can be obtained from the SL Public Calendar.

Notes:

  • These meetings are conducted in mixed voice and text chat. Participants can use either to make comments / ask or respond to comments, but note that you will need Voice to be enabled to hear responses and comments from the Linden reps and others using it. If you have issues with hearing or following the voice discussions, please inform the Lindens at the meeting.
  • The following is a summary of the key topics discussed in the meeting, and is not intended to be a full transcript of all points raised.

Additional note: unfortunately, my audio recording died whilst saving to disk, leaving me with just the first 10 minutes of audio from the meeting available for playback / summary. As responses to questions raised in text are given in Voice, it is impossible to provide any reasonable summary beyond the point at which the recording save failed, so this is a very foreshortened report, and not representative of the entire meeting.

Official Viewer Status

The Performance Floater / Auto FPS RC viewer updated to version  6.6.11.579629 on April 20th.

The rest of the official viewers remain as:

  • Release viewer: Maintenance R viewer, version 6.6.10.579060, dated March 28, promoted March 30th.
  • Release channel cohorts:
    • Maintenance T(ranslation) RC viewer, version 6.6.11.579154, April 6th.
    • Maintenance S, version 6.6.11.579153, March 31st.
  • Project viewers:
    • The PBR Materials / reflection probes project viewer, version 7.0.0.579401, April 10th.
    • Puppetry project viewer, version 6.6.8.576972, December 8, 2022.

The Performance Floater / Auto FPS viewer looks set to become the next viewer to be promoted to de facto release status in the coming week.

glTF Materials and Reflection Probes

Project Summary

  • To provide support for PBR materials using the core glTF 2.0 specification Section 3.9 and using mikkTSpace tangents, including the ability to have PBR Materials assets which can be applied to surfaces and also traded / sold.
  • To provide support for reflection probes and cubemap reflections.
  • The overall goal is to provide as much support for the glTF 2.0 specification as possible.
  • In the near-term, glTF materials assets are materials scenes that don’t have any nodes / geometry, they only have the materials array, and there is only one material in that array.
    • It is currently too early to state how this might change when glTF support is expanded to include entire objects.
  • The project viewer is available via the Alternate Viewers page, but will only work on the following regions on Aditi (the Beta grid): Materials1, Materials Adult, and Rumpus Room 1 through 4.
  • Please also see previous CCUG meeting summaries for further background on this project.
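As a concrete illustration of the "materials scene" shape described above, the sketch below builds a minimal materials-only glTF document in Python: no nodes or geometry, just a materials array containing exactly one material. The field names follow the core glTF 2.0 specification, but the values (and the material itself) are illustrative assumptions, not taken from LL's implementation.

```python
import json

# A sketch of a minimal "materials-only" glTF asset as described above:
# no nodes / geometry, only a materials array holding a single material.
# Values here are illustrative, not from Linden Lab.
material_asset = {
    "asset": {"version": "2.0"},
    "materials": [
        {
            "name": "example_pbr_material",
            "pbrMetallicRoughness": {
                "baseColorFactor": [1.0, 1.0, 1.0, 1.0],
                "metallicFactor": 0.0,
                "roughnessFactor": 0.8,
            },
        }
    ],
}

doc = json.dumps(material_asset, indent=2)
print(doc)
```

Everything in the spec beyond the asset header and the single material entry is simply absent, which is what distinguishes these near-term materials assets from full glTF object support later on.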

Status

  • Viewer:
    • The viewer is now very close to being promoted to Release Candidate status. Issues within the viewer build farm prevented it from getting fully cleared by QA, who are currently taking one more look at it.
    • It is believed that all the significant showstoppers thus far found have been dealt with (although more may show up as a result of it becoming available to a wider audience in RC).
    • The above should include fixes for both the issue of objects in the camera’s view failing to render unless occlusion culling is disabled, and the issue of some mesh items “exploding” in the viewer.
    • Tone mapping has been updated so there is no longer the ability to change / turn off tone mapping. This has been done in the name of “trying to keep things consistent” with older content that has tone mapping built-in. This does lead to a few edge cases (such as not being able to get totally pitch-black environments), but also fixes some issues around general exposure (e.g. preventing full bright objects changing brightness depending on camera distance).
    • It is likely that the work on exposure will eventually be fed into the snapshot tool, so photographers can adjust the exposure settings for their pictures.
    • There is still a collection of minor issues / bugs to be resolved – such as getting parity with some of the current sky settings – which will be dealt with as the viewer progresses through RC.
    • Those who do find significant issues in using the viewer in RC are asked to report them via a BUG report ASAP.
  • When the viewer does go to RC, it is likely the server-side support will be deployed to one (Preflight) or possibly two (Preflight and Snack) small simulator RC channels to allow for testing on Agni (the Main grid). Details of the available regions will be published in my project summaries as and when available.

Future glTF Work

  • There is an internal (to LL) design document in development which is intended to set out the next steps in the glTF work; however, this is not currently ready for public release.
  • Past indicators have been that it is possible the near-term work for glTF could include planar mirrors (with some controls around their use) and also glTF mesh uploads.

Next Meeting

  • Thursday, May 4th, 2023.

† The header images included in these summaries are not intended to represent anything discussed at the meetings; they are simply here to avoid a repeated image of a gathering of people every week. They are taken from my list of region visits, with a link to the post for those interested.

2023 SL Puppetry project week #15 summary

Puppetry demonstration via Linden Lab – see below. Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

The following notes have been taken from chat logs and audio recording of the Thursday, April 13th, 2023 Puppetry Project meeting held at the Castelet Puppetry Theatre on Aditi. These meetings are generally held on alternate weeks to the Content Creation User Group (CCUG), on the same day / time (Thursdays at 13:00 SLT).

Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Project Summary

General Project Description as Originally Conceived

LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However, since Puppetry’s announcement, what I’ve been hearing from many SL Residents is: what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher fidelity collisions.
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?

Leviathan Linden

  • Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the  LLSD Event API Plug-in (LEAP) system.
    • Note that facial expressions and finger movements are not currently enabled.
    • Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which have yet to be addressed.
  • The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
  • Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
  • Nothing beyond the project viewer is required to “see” Puppetry animations. However, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
  • There is a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).

Additional Work Not Originally In-Scope

  • Direct avatar / object / avatar-avatar interactions (“picking up” an apple; high-fives, etc.).
  • Animations streaming: allowing one viewer to run animations and have them sent via the simulator to all receiving viewers without any further processing of the animations by those viewers.
  • Enhanced LSL integration for animation control.
  • Adoption of better animation standards – possibly glTF.
  • Given the project is incorporating a lot of additional ideas, it is likely to evolve into a rolling development, with immediate targets for development / implementation decided as they are agreed upon, to be followed by future enhancements. As such, much of what goes into the meetings at present is general discussion and recommendations for consideration, rather than confirmed lines of development.

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public facing Kanban board with public issues.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Further Information

Meeting Notes

Viewer Progress

  • An updated version of the project viewer is still “close” – it is currently awaiting clearance by QA.
  • This will include the attachment point tracking discussed in previous meetings (e.g., a step towards being able to “pick things up” in SL). The simulator support for this is already in place on the Aditi Puppetry regions.
  • However, when available, it will not include:
    • LSL animation control (which has yet to be added to the simulator code anyway). Rider Linden believes he has a good protocol for single avatar animation, but would rather work on it some more.
    • Any IK improvements, as Leviathan Linden is still working on these.
    • Any extended LEAP API functionality (the added features for getting world position/orientation, lookat position/orientation, camera position/orientation/target). This will be coming in a future viewer update.
  • Also in this release will be the change to llsd_binary for the LEAP messaging protocol. This will be in the release notes, but to use it you will want to update to 1.3.1 of llbase and/or 1.2.0 of llsd for Python scripts. Messages from LEAP scripts to the viewer will still work with the older Python libraries, but messages from the viewer to the script will not be parsed correctly.

Server-Side Work

  • The LSL function API has been published to the Content Creation Discord group (sorry, I’ve been asked by LL not to publish details on joining the server – if you are a content creator interested in joining it, please contact Vir Linden, or attend a Content Creation / Puppetry meeting and ask in person).
  • Getting attachment point positions has been given a throttle, in part so as not to make it trivial to use LSL to rip an animation, and in part to prevent the server from being overwhelmed. The rate of throttling is variable, and can change as load increases / decreases. However, as Rider Linden noted, there will always be some delay and some disagreement about the actual position of the attachment point between LSL and all the observing viewers; as such, the function is not meant for high-fidelity use. Collision volumes on the attachment points will be a better solution in this respect, but that is functionality which is still down the line.
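A variable throttle that tightens as load increases can be pictured as a token bucket whose refill rate shrinks with simulator load. This is a conceptual sketch only – the class name, rates, and load model are my assumptions, not LL's server code.

```python
import time

class AdaptiveThrottle:
    """Illustrative token-bucket throttle whose allowed rate drops as
    reported load rises -- a sketch of variable throttling in general,
    not Linden Lab's actual implementation."""

    def __init__(self, base_rate=10.0, capacity=10.0):
        self.base_rate = base_rate   # queries/second permitted at zero load
        self.capacity = capacity     # burst allowance
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, load=0.0):
        """Return True if a query may proceed now. `load` is in [0, 1]."""
        now = time.monotonic()
        rate = self.base_rate * (1.0 - load)  # refill shrinks under load
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

Under this kind of scheme a script polling attachment positions fast enough to reconstruct an animation would quickly exhaust its tokens, while occasional queries pass unimpeded – which matches both stated aims of the throttle.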

General Notes

  • Leviathan Linden’s work on streaming the full avatar animation state has stalled, due to it essentially hijacking the main puppetry data channel to send everything through LEAP, even when not running a puppetry script. As such, Leviathan thinks it needs to be moved to its own experimental viewer.
  • Simon Linden’s work on allowing animation uploads of new/different formats has been decoupled from the Puppetry project’s codebase, and is now being built on the main viewer branch, allowing it to move forward without dependencies on Puppetry.
  • OpenXR support as a LEAP plug-in is still seen as desirable, since it would allow support for a broader range of devices. However, it is seen as a little more “down the road”, as there is some core infrastructure that needs to finish being vetted prior to work starting on this.

My thanks to Jenna Huntsman for the chat transcript from the meeting, and you can see her video recording of the session here.

Date of Next Meeting

  • Thursday, April 27th, 2023, 13:00 SLT.

2023 week 14: SL CCUG meeting summary

Perpetuity, February 2023 – blog post
The following notes were taken from my audio recording and chat log transcript of the Content Creation User Group (CCUG) meeting held on Thursday, April 6th, 2023 at 13:00 SLT.

These meetings are for discussion of work related to content creation in Second Life, including current work, upcoming work, and requests or comments from the community, together with viewer development work. They are usually chaired by Vir Linden, and dates and times can be obtained from the SL Public Calendar.

Notes:
  • These meetings are conducted in mixed voice and text chat. Participants can use either to make comments / ask or respond to comments, but note that you will need Voice to be enabled to hear responses and comments from the Linden reps and others using it. If you have issues with hearing or following the voice discussions, please inform the Lindens at the meeting.
  • The following is a summary of the key topics discussed in the meeting, and is not intended to be a full transcript of all points raised.
Additional note: unfortunately, physical world matters meant I missed the initial part of the meeting, and as it is held in voice, there is little by way of chat transcript to reflect initial discussions prior to my arrival.

Official Viewer Status

On April 6th:
  • Maintenance T(ranslation) RC viewer, version 6.6.11.579154, was issued.
  • The PBR Materials / reflection probes project viewer updated to version 7.0.0.579241.
The rest of the current official viewers remain as:
  • Release viewer: Maintenance R viewer, version 6.6.10.579060, dated March 28, promoted March 30th.
  • Release channel cohorts:
    • Maintenance S, version 6.6.11.579153, March 31st.
    • Performance Floater / Auto FPS RC viewer updated to version 6.6.11.579238, April 4th.
  • Project viewers:
    • Puppetry project viewer, version 6.6.8.576972, December 8, 2022.

glTF Materials and Reflection Probes

Project Summary

  • To provide support for PBR materials using the core glTF 2.0 specification Section 3.9 and using mikkTSpace tangents, including the ability to have PBR Materials assets which can be applied to surfaces and also traded / sold.
  • To provide support for reflection probes and cubemap reflections.
  • The overall goal is to provide as much support for the glTF 2.0 specification as possible.
  • In the near-term, glTF materials assets are materials scenes that don’t have any nodes / geometry, they only have the materials array, and there is only one material in that array.
    • It is currently too early to state how this might change when glTF support is expanded to include entire objects.
  • The project viewer is available via the Alternate Viewers page, but will only work on the following regions on Aditi (the Beta grid): Materials1, Materials Adult, and Rumpus Room 1 through 4.
  • Please also see previous CCUG meeting summaries for further background on this project.

Status

  • The viewer has been updated to maintain parity with the release viewer, and work continues to get the viewer to a position where it can move to RC status.
    • Once it does go to RC status, it is expected to remain there for “a few months”.
  • Currently, the viewer is at a point where creators who wish to make content using PBR tools such as Substance Painter can do so, working to the rule-of-thumb that if content looks the same in both Substance Painter and the glTF viewer, then all is well and good – BUT, if the SL version looks noticeably different in the viewer, then a bug report should be filed; the issue should not be worked around.
  • Getting the simulator support for glTF moved to Agni is now being considered.
  • With regards to Bakes on Mesh, glTF Materials work in a similar manner to the current materials – the result of the BoM process gets fed into the base colour (+ the emissive map) like it does with the diffuse map for materials at present.
    • This does not mean BoM is glTF materials enabled; that still requires an update to the Bake Service to support materials data.
    • Updating the Bake Service is still seen as a “high value” future project.
  • The midday position of the Sun has been adjusted so that it is no longer directly overhead, but is angled to appear as it would at a latitude of around 40ºN/S in spring.
Left: the glTF viewer repositions the midday Sun so it is in a similar position to how it would appear in the physical world at a latitude of around 40ºN/S in the spring, as opposed to being directly overhead, as seen in the image on the right. Credit: Runitai Linden
  • Automatic alpha masks are turned off in the PBR viewer, and are likely to remain this way unless a compelling reason emerges for this not to be the case. So the Develop(er) → Rendering → Automatic Alpha Masks option for deferred rendering is off (and the one for forward rendering removed, as the glTF viewer does not support forward rendering).

HDRi Sky Rendering

  • Getting parity with High Dynamic Range Imaging (HDRi) environment maps has meant the sky as rendered in the glTF viewer is essentially HDR with added dynamic exposure. Without this change, the sky was lighting everything as if it were a “giant blue wall” rather than a bright sky.
  • This has impacted EEP (the Environment Enhancement Project), and means that the sky can look over-exposed under some settings.
  • LL is trying to zero in on a set of sky parameters that is acceptable to most EEP settings. However, the issue is particularly noticeable for EEP settings which use “day for night” (e.g. they utilise dark sky tinting, etc., and replace the Sun texture with a planet or moon or some such), because the HDR rendering assumes that, as the Sun is “up”, brighter lighting should be used in the sky.
  • The choice here is:
    • Should the parameters be adjusted for uniformity (meaning some EEP settings will require adjustment), or
    • Should additional controls be supplied over the sky brightness, etc., to deal with EEP settings where the above issues occur?
  • The problems with this second approach are that:
    • It would “severely” fragment the expected colour space, leaving content creators having to work with multiple lighting models (e.g. as with working with ALM on or off at present).
    • It is akin to LL removing the ability to disable ALM in the PBR viewer and removing the older forward rendering code, only to then implement another “button” to alter the environment rendering, rather than keeping things uniform.
  • This topic has been the subject of heated debate within the Content Creation Discord channel.
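For readers unfamiliar with the mechanics, "HDR sky with dynamic exposure" generally means scaling scene brightness by the measured average luminance before compressing it through a tone-mapping curve. The sketch below uses the classic Reinhard operator purely to illustrate the general technique – the viewer's actual curve, constants, and function names are not public here, so everything in it should be read as an assumption.

```python
# Illustrative sketch of dynamic exposure + tone mapping for an HDR sky.
# The Reinhard operator and the 0.18 "middle grey" key are textbook
# defaults, not Linden Lab's actual pipeline.

def auto_exposure(avg_luminance, key=0.18):
    """Scale scene brightness so the average luminance maps near `key`."""
    return key / max(avg_luminance, 1e-4)

def reinhard(l):
    """Classic Reinhard operator: compresses HDR values into [0, 1)."""
    return l / (1.0 + l)

def tone_map(hdr_value, avg_luminance):
    return reinhard(hdr_value * auto_exposure(avg_luminance))
```

Under a scheme like this, a brighter sky drives the average luminance up and the exposure down, which is why a "day for night" setting that keeps the Sun "up" ends up looking over-exposed relative to the creator's intent: the exposure logic cannot tell that the scene was meant to be dark.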

In Brief

  • Priorities for graphics / content creation work after glTF Materials are currently planar mirrors and then glTF mesh imports.

Next Meeting

  • Thursday, April 20th, 2023.

2023 week 13: SL CCUG meeting summary

Celestial Glade, February 2023 – blog post
The following notes were taken from my audio recording and chat log transcript of the Content Creation User Group (CCUG) meeting held on Thursday, March 30th, 2023 at 13:00 SLT.

These meetings are for discussion of work related to content creation in Second Life, including current work, upcoming work, and requests or comments from the community, together with viewer development work. They are chaired by Vir Linden, and dates and times can be obtained from the SL Public Calendar.

Notes:
  • These meetings are conducted in mixed voice and text chat. Participants can use either to make comments / ask or respond to comments, but note that you will need Voice to be enabled to hear responses and comments from the Linden reps and others using it. If you have issues with hearing or following the voice discussions, please inform the Lindens at the meeting.
  • The following is a summary of the key topics discussed in the meeting, and is not intended to be a full transcript of all points raised.
Additional note: unfortunately, physical world matters meant I missed the initial part of the meeting, and as it is held in voice, there is little by way of chat transcript to reflect initial discussions prior to my arrival.

Official Viewer Status

  • On March 30th, the Maintenance R viewer, version 6.6.10.579060, was promoted to de facto release status.
  • On March 31st, the  Maintenance S RC viewer updated to version 6.6.11.579153, bringing it to parity with the above release viewer.
The rest of the current official viewers remain as:
  • Release channel cohorts:
    • Performance Floater / Auto FPS RC viewer updated to version 6.6.10.578172, February 21, 2023.
  • Project viewers:
    • PBR Materials project viewer, version 7.0.0.578921, March 23 – This viewer will only function on the following Aditi (Beta grid) regions: Materials1, Materials Adult, and Rumpus Room 1 through 4.
    • Puppetry project viewer, version 6.6.8.576972, December 8, 2022.

glTF Materials and Reflection Probes

Project Summary

  • To provide support for PBR materials using the core glTF 2.0 specification Section 3.9 and using mikkTSpace tangents, including the ability to have PBR Materials assets which can be applied to surfaces and also traded / sold.
  • To provide support for reflection probes and cubemap reflections.
  • The overall goal is to provide as much support for the glTF 2.0 specification as possible.
  • In the near-term, glTF materials assets are materials scenes that don’t have any nodes / geometry, they only have the materials array, and there is only one material in that array.
    • It is currently too early to state how this might change when glTF support is expanded to include entire objects.
  • The project viewer is available via the Alternate Viewers page, but will only work on the following regions on Aditi (the Beta grid): Materials1, Materials Adult, and Rumpus Room 1 through 4.
  • Please also see previous CCUG meeting summaries for further background on this project.

Status

  • The viewer is still being progressed, and will likely update to keep in line with the current release viewer.
    • There is still a bug with glow which needs to be addressed, but this is not seen as a significant issue in terms of fixing.
  • Information in support of PBR materials and reflection probes is in development. This includes new wiki pages and – at least for reflection probes, if not both – new tutorial videos.
    • A very much Work In Progress version of the wiki information can be found in: PBR Materials.
  • The methodology for modifying PBR materials via script is to have the materials contained within glTF assets which can be stored in inventory (containing an LLSD header and glTF JSON, with the texture / material UUID stored in the image URI field), with LSL APIs used to modify the parameters within the glTF file.
    • Manual modification of the parameters can be done via the viewer UI in a similar manner to manipulating textures, etc., currently offered by the viewer.
    • The LSL APIs will work in a similar manner to functions such as llSetPrimitiveParams, etc. The only thing that cannot be changed is the actual image data.
  • For testing, it was noted that “local materials” should allow materials to be tested directly from a user’s computer in a similar manner to local textures (and only visible in that user’s world view), with dynamic updating of the materials in the session as they are locally modified.
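To make the "modify parameters, but never the image data" rule concrete, here is a hedged Python sketch that edits a scalar factor inside a single-material glTF blob while leaving the texture reference (which, per the notes above, would carry the texture UUID in the image URI field) untouched. The function name, the sample asset, and its UUID are illustrative assumptions, not LL's actual LSL API or asset layout.

```python
import json

# Hypothetical single-material glTF asset; the image "uri" field stands in
# for where SL would store the texture UUID. Purely illustrative.
ASSET = json.dumps({
    "asset": {"version": "2.0"},
    "images": [{"uri": "00000000-0000-0000-0000-000000000000"}],
    "textures": [{"source": 0}],
    "materials": [{
        "pbrMetallicRoughness": {
            "baseColorTexture": {"index": 0},
            "roughnessFactor": 0.5,
        }
    }],
})

def set_roughness(gltf_json: str, roughness: float) -> str:
    """Edit an editable scalar parameter; never touch the image data / UUID."""
    doc = json.loads(gltf_json)
    pbr = doc["materials"][0].setdefault("pbrMetallicRoughness", {})
    pbr["roughnessFactor"] = roughness
    # The images array (and so the texture UUID) is deliberately left alone.
    return json.dumps(doc)
```

An LSL call in the style of llSetPrimitiveParams would, conceptually, perform this kind of in-place parameter rewrite on the material override, which is why the image data itself stays out of reach.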

In Brief

  • Requests have been made for a more visual coding capability in SL to help those who are not programmers / scripters, with capabilities similar to Blueprints in Unreal Engine. While such tools are acknowledged as being useful for putting snippets of code together, the notion of a visual coding system is seen by LL as potentially cumbersome for the results it would garner.
  • A very general discussion on games in SL.

Next Meeting

  • Thursday, April 6th, 2023.

2023 SL Puppetry project week #12 summary

Puppetry demonstration via Linden Lab – see below. Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

The following notes have been taken from chat logs and audio recording of the Thursday, March 23rd, 2023 Puppetry Project meeting held at the Castelet Puppetry Theatre on Aditi. These meetings are generally held on alternate weeks to the Content Creation User Group (CCUG), on the same day / time (Thursdays at 13:00 SLT).

Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Project Summary

General Project Description as Originally Conceived

LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However, since Puppetry’s announcement, what I’ve been hearing from many SL Residents is: what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher fidelity collisions.
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?

Leviathan Linden

  • Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the  LLSD Event API Plug-in (LEAP) system.
    • Note that facial expressions and finger movements are not currently enabled.
    • Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which have yet to be addressed.
  • The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
  • Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
  • Nothing beyond the project viewer is required to “see” Puppetry animations. However, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
  • There is a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).

Additional Work Not Originally In-Scope

  • Direct avatar / object / avatar-avatar interactions (“picking up” an apple; high-fives, etc.).
  • Animations streaming: allowing one viewer to run animations and have them sent via the simulator to all receiving viewers without any further processing of the animations by those viewers.
  • Enhanced LSL integration for animation control.
  • Adoption of better animation standards – possibly glTF.
  • Given the project is incorporating a lot of additional ideas, it is likely to evolve into a rolling development, with immediate targets for development / implementation decided as they are agreed upon, to be followed by future enhancements. As such, much of what goes into the meetings at present is general discussion and recommendations for consideration, rather than confirmed lines of development.

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public facing Kanban board with public issues.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Further Information

Meeting Notes

Viewer Progress

  • An updated version of the project viewer is still “close” – it did not make it past QA, but it is hoped it will be available soon.
  • Leviathan Linden is working on the IK system in order to try to make it more robust, to handle the kind of work Rider Linden is doing. He is not at the point of having anything ready for delivery into a viewer, although the idea is to possibly have something ready for the viewer update after the one that is still waiting to be cleared for release.

Server-Side Work

  • Following the meeting, the current Puppetry test regions on Aditi were due to be updated with a simulator version which merges the server-side Puppetry code with the latest release of the simulator code.
  • Rider Linden is continuing to work on llSetAttachmentPoint, intended to allow avatars to “pick up” objects using Puppetry. At the time of the meeting he was completing work on ANIM_POS/ROT_TARGET, which is intended to try to keep the attachment point directed at an object in-world (as opposed to a fixed location or a location relative to the avatar).
    • This uses a command to tell the viewer to carry out the necessary IK work to move an attachment point as close to a location as it can get, using the target object’s UUID / location as reported by the simulator.
    • The idea is that this functionality will work not only with hands / arms but also with other appendages (e.g. wings).
    • In theory, this should also allow avatars to touch attachment points on other avatars (e.g. holding hands). However, exactly how this works within the framework of the Permissions system – in terms of others accepting / refusing any direct interaction through something like a dialogue request, as we see today – has yet to be worked out.
  • This led to a broader discussion on attaching objects to avatars, the core of which is summarised below.

Object Parenting vs. Temp Attachments

  • The idea of being able to use Puppetry to reach out and grasp a (suitably scripted) object in-world – for example, an apple, a bottle or similar) raised questions on how the process will work.
    • Currently, “temp” attachment can be made to avatars (e.g. via an Experience), but this still actually requires the object being temporarily transfers to the avatar’s inventory (where it does not how up) and from there attached to the relevant attach point (e.g. a hand).
    • This is somewhat slow and cumbersome – particularly if you want to do something with the object (e.g. if it is a ball, throw it), as the object needs to be picked up, held, follow the throwing motion of the avatar’s arm, detach at the point of release, resume its status as a physical object in-world, have direct and velocity applied, and then move in the direction of the throw.
    • The suggestion was made that, to simplify things, a concept of avatar-to-object parenting needs to be introduced to SL – so when the ball is picked up, it immediately becomes a child of that avatar, with no need for the passing to inventory, attaching from there, detaching back to inventory / deletion, etc., as seen with temp attachments.
  • Developing a hierarchy scheme for SL is already being discussed through the Content Creation User Group, so it was suggested this could help present a road to object parenting with avatars. However, as it will take some time for any hierarchy system to be developed and implemented – and given it falls outside of the Puppetry project – it might mean that existing mechanisms have to be accepted, even if they place some limitations on what can be achieved until such time as a hierarchy system can be introduced.
  • As an alternative, it was suggested that the physics engine might be used to provide a degree of object parenting to an avatar:
    • The physics engine allows actions to be created, which are updated every sub-step; so in the case of a physical object, it should be possible to write an action that says, “Follow the hand of the avatar picking you up”.
    • Then, as long as the physics engine knows the position of the “holding” hand, the object could move with it; while there would be a small degree of physics lag, as long as the viewer knows to render the object at the avatar’s hand, rather than where the physics updates say the object is, this should not be visually noticeable.
    • This approach would not require an explicit hierarchy system, but it would require the viewer to send updates on the avatar’s hand position to the simulator’s physics engine – data which is available to it.
    • The idea is modelled on the old physics action that perhaps most famously allowed a beach ball to be grabbed and pushed / lifted up onto a table as part of the orientation process incoming new users used to go through, and which can still be used today to push suitably scripted objects around.
    • If possible, this approach would also need some form of constraints and permissions (e.g. you don’t really want to make your entire building capable of being grabbed and shunted around willy-nilly).
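The physics-action approach outlined above can be sketched in miniature. The class and names below are hypothetical – the simulator’s actual physics API is not public – but they illustrate an action, evaluated every sub-step, that pulls a held object toward the tracked hand position:

```python
class FollowHandAction:
    """Hypothetical per-sub-step physics action: each update closes a
    fraction of the gap between the held object and the avatar's hand,
    so the physics position converges on the hand over a few sub-steps."""

    def __init__(self, strength=0.5):
        self.strength = strength  # fraction of the gap closed per sub-step

    def update(self, object_pos, hand_pos):
        # Pull the object toward the hand; positions are (x, y, z) tuples.
        return tuple(o + (h - o) * self.strength
                     for o, h in zip(object_pos, hand_pos))


# Simulated physics sub-steps: the object homes in on the hand position
# reported by the viewer.
action = FollowHandAction(strength=0.5)
pos = (0.0, 0.0, 0.0)   # the physical object's current position
hand = (1.0, 1.0, 0.5)  # avatar hand position, as sent by the viewer
for _ in range(10):
    pos = action.update(pos, hand)
# `pos` now lags `hand` by only a tiny residual; the viewer would simply
# render the object at the hand, hiding that physics lag.
```

The “small degree of physics lag” mentioned in the discussion corresponds to the residual gap left after each sub-step; rendering the object at the hand position masks it.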

General Notes

  • There was a general conversation on Permissions within avatar-to-avatar interactions and where they need to fall.
    • As noted above, there will need to be some form of explicit granting of permission for Puppetry-generated hugs, handshakes, high-fives, etc.
    • However, concern was raised about permissions being needed for basic one-way interactions – such as pointing at someone. The concern here is that the LookAt debug target has become so conflated with ideas of “privacy” tools (“Stop looking at me, perv! You don’t have permission!”) that TPVs have surfaced the means to disable LookAts being sent by a viewer, so that people do not get shouted at if their LookAt cross-hairs happen to rest on another avatar. Thus, the worry was that if a similar kind of visual indicator is used for Puppetry, it might provoke a similar reaction.
    • The short answer to this was no, it should not be an issue, as avatar locations can already be obtained through LSL without the need for a visual indicator being generated by the viewer.

Date of Next Meeting

  • Thursday, April 13th, 2023, 13:00 SLT.

2023 week #11: SL TPV Developer meeting summary

The Great Library of Eruanna, January 2023 – blog post
The following notes were taken from my audio recording and chat log transcript of the TPV Developer (TPVD) meeting held on Friday, March 17th, 2023 at 13:00 SLT. These meetings are chaired by Vir Linden, and their dates and times can be obtained from the SL Public Calendar; also note that the following is a summary of the key topics discussed in the meeting and is not intended to be a full transcript of all points raised.

Official Viewers Status

  • Release viewer: Maintenance Q(uality) viewer, version 6.6.9.577968, dated Thursday, February 2.
  • Release channel cohorts (please see my notes on manually installing RC viewer versions if you wish to install any release candidate(s) yourself).
  • Project viewers:
    • PBR Materials project viewer, version 7.0.0.578792, March 15 – This viewer will only function on the following Aditi (beta grid) regions: Materials1; Materials Adult and Rumpus Room 1 through 4.
    • Puppetry project viewer, version 6.6.8.576972, December 8, 2022.

General Viewer Notes

  • The Performance Floater / Auto FPS viewer is still being worked on in the hope that performance can be further improved on lower-end systems.
  • Work is also being carried out to have the viewer work with a broader cross-section of translation tools.

Inventory Enhancement Project

  • The work to provide thumbnail images of folders and items in Inventory is progressing on the viewer side, but deployment will be dependent on both viewer availability (project / RC to release) and assorted back-end service and simulator updates to handle the new data.
  • Once the thumbnail preview work has been completed, it is possible the Lab will look to further enhancements to inventory management. One future enhancement under consideration is support for folders to be included in the Contents inventory of individual objects.

In Brief

  • See also my CCUG meeting summary, as this meeting crosses topics with that.
  • Estate Level Scripted Agent Controls (aka “Ban the Bots”): per my SUG meeting notes, there is a new simulator release due to be deployed which will provide estate / region holders and their managers with the ability to limit access to their regions by scripted agents (bots).
    • This work will initially be console-based in the viewer until UI updates can be made and a viewer update with them deployed.
    • However, a bug has been found in the simulator code, which is currently being worked on. The hope is this will be fixed without any delay to the code being rolled out.
  • It was reiterated that the next major new graphics project following glTF materials is likely to be support for Vulkan / MoltenVK as the graphics API of preference (Windows / Mac). There is no time frame for when this work will commence, though.
  • As I reported in November 2022, Atlassian has announced it will be restructuring how it licenses the Jira bug reporting product from 2024 onwards.
    • No decision has yet been taken on the direction LL will take as this change is made, but there are ongoing internal discussions on options.
    • However, at this point it appears as if whatever route LL decides to take, they will need to review how public issues are raised and passed to them, etc.

Next Meeting

  • Friday, April 14th, 2023.