2023 week #26: SL CCUG meeting summary

Pemberley – May 2023, blog post
The following notes were taken from my audio recording and chat log transcript of the Content Creation User Group (CCUG) meeting held on Thursday, June 29th, 2023 at 13:00 SLT.

These meetings are:

  • For discussion of work related to content creation in Second Life, including current work, upcoming work, and requests or comments from the community, together with viewer development work.
  • Usually chaired by Vir Linden, and dates and times can be obtained from the SL Public Calendar.
  • Conducted in mixed voice and text chat. Participants can use either when making comments or asking / responding to comments, but note that you will need Voice enabled to hear responses and comments from the Linden reps and others using it. If you have issues with hearing or following the voice discussions, please inform the Lindens at the meeting.

The following is a summary of the key topics discussed in the meeting, and is not intended to be a full transcript of all points raised.

Viewer News

  • Maintenance T RC viewer updated to version 6.6.13.580700 on Wednesday, June 28th.
  • The GLTF viewer updated to version 7.0.0.580717 on Tuesday, June 27th.

The rest of the official viewers currently in the pipeline remain as:

  • Release viewer: Maintenance S RC viewer, version 6.6.12.579987, dated May 11, promoted May 16.
  • Project viewers:

In addition:

  • Maintenance T likely remains the next RC viewer in line for promotion to de facto release status.
  • The Emoji project viewer is liable to see further improvements to the Emoji picker in the UI.

glTF Materials and Reflection Probes

Project Summary

  • To provide support for PBR materials using the core glTF 2.0 specification Section 3.9 and using mikkTSpace tangents, including the ability to have PBR Materials assets which can be applied to surfaces and also traded / sold.
  • There is a general introduction / overview / guide to authoring PBR Materials available via the Second Life Wiki.
  • For a list of tools and libraries that support GLTF, see https://github.khronos.org/glTF-Project-Explorer/
  • Substance Painter is also used as a guiding principle for how PBR materials should look in Second Life.
  • Up to four texture maps are supported for PBR Materials: the base colour (which includes the alpha); normal; metallic / roughness; and emissive, each with independent scaling.
  • Given the additional texture load, work has been put into improving texture handling within the PBR viewer.
  • In the near-term, glTF materials assets are materials-only scenes: they have no nodes / geometry, only the materials array, and there is only one material in that array (see the minimal sketch after this list).
  • The overall goal is to provide as much support for the glTF 2.0 specification as possible.
  • To provide support for reflection probes and cubemap reflections.
  • The viewer is available via the Alternate Viewers page.
  • Please also see previous CCUG meeting summaries for further background on this project.
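
As an aside for those unfamiliar with the format, the “materials-only scene” point above can be pictured with a minimal sketch of what such a glTF document looks like. This is only illustrative (the texture / image arrays referenced by the indices are omitted, and the values are placeholders), but it follows the glTF 2.0 metallic-roughness schema the project references:

```python
import json

# Minimal sketch of a "materials-only" glTF 2.0 document: no nodes or geometry,
# just a materials array containing a single PBR material. Values and texture
# indices are placeholders; the textures / images arrays they refer to are
# omitted here for brevity.
materials_only_gltf = {
    "asset": {"version": "2.0"},
    "materials": [
        {
            "name": "ExampleMaterial",
            "pbrMetallicRoughness": {
                "baseColorTexture": {"index": 0},          # base colour (includes alpha)
                "metallicRoughnessTexture": {"index": 1},  # metallic / roughness
                "metallicFactor": 1.0,
                "roughnessFactor": 1.0,
            },
            "normalTexture": {"index": 2},                 # normal map
            "emissiveTexture": {"index": 3},               # emissive map
            "emissiveFactor": [1.0, 1.0, 1.0],
            "alphaMode": "BLEND",
        }
    ],
}

print(json.dumps(materials_only_gltf, indent=2))
```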

Status

  • The new RC viewer includes the automatic adjustment to make skies “PBR-ified”: skies which do not have a reflection probe ambience set will automatically have it set to a default of 1 (superseding the Graphics Preferences checkbox which allowed this automatic application to be disabled). Probe ambience must now be set to 0 through the sky settings to deactivate HDR rendering – meaning no dynamic exposure (so the sky is not brightened, and objects with PBR materials will appear duller than intended) and tone mapping is turned off.
    • However, there is a recognised bug related to probe ambience (BUG-234060 “[PBR] Simulator Clamping Environment’s Reflection Probe Ambience to 1”), and it is hoped a fix for this will be in the next simulator build – although it may take a while for the build to make its way into a deployment.
  • The user-supplied PBR content created on Aditi is being transferred to the PBR regions on the Main grid in readiness for more widespread viewing / testing.
  • The LDPW will be producing a PBR demonstrator which will be available through the viewer library.
  • The simulator-side support for glTF Materials is due to be expanded with further simulator RC deployments, which will notably include some sandbox environments.
  • However, it should be noted that:
    • Any glTF / PBR materials content created within these environments which is taken and then rezzed in any region that is not Materials-enabled, will become “material-less” in a non-recoverable way, and will need to be recreated.
    • Because of the above point, until the glTF support is fully grid-wide, any attempt to put PBR-enabled goods on the Marketplace will be sanctioned.
  • General bug fixing on both the simulator and the viewer is continuing.

PBR Terrain

Materials applied to Second Life terrains. Credit: Linden Lab
  • Per past meeting notes, Cosmic Linden is prototyping the application of PBR materials on terrain (see this blog post for more).
  • Further work on bug fixing, including correcting an issue with the triplanar mapping used for terrain texture repeats, particularly on steep elevation changes (so as to avoid the all-too-familiar “stretching” seen with textures).
  • Important notes with this work:
    • It is not terrain painting. It is the application of PBR materials – terrain painting is described as “something that’s on the radar” at LL.
    • The work does not include support for displacement maps.
    • The work is currently only viewer-side, with no corresponding server-side support, the idea here being to prototype what might be achieved and testing approaches / results.

Level of Detail (LOD) Discussion

  • Level of detail (LOD) refers to the complexity of a 3D model representation. In short, the idea is to reduce the load on the rendering system by reducing the complexity of a 3D object based on various criteria (e.g. the distance of the object from the viewer / camera) and using various techniques.
  • Second Life uses the Discrete Levels of Detail (DLOD) method – discrete versions of the original model with decreasing levels of geometric detail replace the more complex model, selected by an algorithm primarily based on distance (a minimal sketch of this selection is given after this list). This has both positives and negatives, some rooted in poor modelling practices by some content creators, others inherent to flaws within SL itself.
  • In looking at glTF and mesh imports in the future,  LL is considering moving towards more automated and better optimised means of creating LODs to try to reduce some of the current issues.
  • One idea under consideration to achieve this is to leverage Simplygon (or see the Wikipedia entry), which, although a proprietary tool, is available as a plug-in for a number of 3D modelling tools (including Blender); the idea being that the glTF importer consumes whatever output is generated by Simplygon at the creator’s end of the workflow (it being noted that “simply” integrating Simplygon into SL isn’t feasible).
    • Such an approach would offer advantages of optimisation whilst leaving the current upload process in place for those wishing to continue to use it, together with the continued ability to manually create LODs.
  • Some concerns over this possible approach raised at the meeting were:
    • It does not address issues of avatar complexity, which is potentially the biggest viewer performance hit – although in fairness, avatar complexity is really an issue requiring its own focus / project.
    • The nature of some of the Simplygon licensing statements (concern over which might be a mix of genuine issues and a basic misunderstanding of the legal terms used with regard to “how the Internet works”, so to speak; an issue also seen back when LL changed some of the terminology within the SL TOS in 2013).
  • While there are other optimisation tools available, given the state of flux within the open-source environment, and given Simplygon is recognised as an “industry standard”, offering a means to take output from it and bring it into Second Life is currently seen as the best strategy by the Lab, subject to the concerns raised about the licensing requirements.
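
For those unfamiliar with DLOD, a minimal sketch of the distance-based selection mentioned above is given below. The thresholds and number of models here are arbitrary, and SL’s actual selection also weighs object size and the viewer’s LOD factor, which are ignored for simplicity:

```python
# Minimal sketch of Discrete Level of Detail (DLOD) selection by distance.
# Thresholds and model count are arbitrary; Second Life's own selection also
# factors in object size and the viewer's LOD factor, ignored here.

LOD_MODELS = ["high", "medium", "low", "lowest"]     # pre-built discrete versions of a model
LOD_DISTANCES = [10.0, 25.0, 60.0]                   # metres; illustrative switch points

def select_lod(distance_to_camera: float) -> str:
    """Return the discrete LOD model to render for a given camera distance."""
    for threshold, model in zip(LOD_DISTANCES, LOD_MODELS):
        if distance_to_camera < threshold:
            return model
    return LOD_MODELS[-1]                            # beyond the last threshold: lowest LOD

if __name__ == "__main__":
    for d in (5, 20, 50, 200):
        print(d, "m ->", select_lod(d))
```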

In Brief

  • There have been reports of Animesh objects changing / becoming stuck in their lowest LOD in various circumstances, including on region crossing (see BUG-233691 “Animesh re-renders at lowest LOD for extended interval after long-range llSetRegionPos” + listed duplicates). This is believed to be a bounding box issue, and a fix is being developed which should hopefully make it into the next update of the Maint T RC viewer.
  • For PBR, the question was raised about converting older content with the current Blinn-Phong materials to PBR where the base colour map is not available. The recommendation is that the two are kept separate: if the base colour map is not available for conversion, Blinn-Phong should continue to be used. No attempt on LL’s part will be made to try to combine the two (PBR / Blinn-Phong) through any kind of conversion tools, as the two are currently entirely separate, allowing creators / users to apply PBR to items which may already have Blinn-Phong and, if the results aren’t good, strip the PBR away and immediately revert to the Blinn-Phong materials. Any attempt to allow “blending” between the two approaches would immediately break this.
  • One consideration actively being worked on within the Lab is glTF licences and Second Life being able to recognise them. Licences tend to be built in to glTF-compliant 3D models, so as the end goal of the glTF project is to effectively be able to take a glTF model from (say) Sketchfab and drop it into SL, this should be done with respect for the object’s licensing (e.g. if it has a Creative Commons Non-Commercial licence, it can be imported into SL but not re-sold after the fact or placed on the SL Marketplace) – see the illustrative sketch after this list.
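
As a purely hypothetical illustration of the licensing point: glTF 2.0 documents can carry a free-text copyright notice in their asset block, and richer metadata via extensions (e.g. KHR_xmp_json_ld), so a pre-import check could start by reading those fields. How LL will actually surface or enforce licences is still to be determined:

```python
import json

def read_gltf_licence_hints(path: str) -> dict:
    """Pull the optional licence-related metadata out of a .gltf file.

    glTF 2.0 defines an optional free-text 'asset.copyright' field; richer,
    machine-readable licence data may appear via extensions such as
    KHR_xmp_json_ld. This is only a reading sketch, not LL's import pipeline,
    and it only handles the JSON (.gltf) form, not binary .glb files.
    """
    with open(path, "r", encoding="utf-8") as f:
        doc = json.load(f)
    return {
        "copyright": doc.get("asset", {}).get("copyright"),
        "extensions_used": sorted(doc.get("extensionsUsed", [])),
    }

# Hypothetical usage:
# print(read_gltf_licence_hints("model.gltf"))
```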

Next Meeting

  • Thursday, July 6th, 2023.

† The header images included in these summaries are not intended to represent anything discussed at the meetings; they are simply here to avoid a repeated image of a gathering of people every week. They are taken from my list of region visits, with a link to the post for those interested.

2023 week #24: SL CCUG meeting summary

The Last Aokigahara Souls, April 2023 – blog post

The following notes were taken from my audio recording and chat log transcript of the Content Creation User Group (CCUG) meeting held on Thursday, June 15th, 2023 at 13:00 SLT. 

These meetings are:

  • For discussion of work related to content creation in Second Life, including current work, upcoming work, and requests or comments from the community, together with viewer development work.
  • Usually chaired by Vir Linden, and dates and times can be obtained from the SL Public Calendar.
  • Conducted in mixed voice and text chat. Participants can use either when making comments or asking / responding to comments, but note that you will need Voice enabled to hear responses and comments from the Linden reps and others using it. If you have issues with hearing or following the voice discussions, please inform the Lindens at the meeting.

The following is a summary of the key topics discussed in the meeting, and is not intended to be a full transcript of all points raised.

Viewer News

There had been no official viewer updates up to the time of the meeting, leaving the list of official viewers in the pipeline unchanged.

In addition:

  • Maintenance T is likely the next RC viewer in line for promotion to de facto release status; however, it currently has an elevated crash rate compared to the current release viewer, so LL is looking to bring that down before any promotion is made.
  • The Emoji project viewer is liable to see further improvements to the Emoji picker in the UI.
  • All of the back-end work (simulator code and the inventory service) seen as necessary to support Inventory Thumbnails is now thought to be in place, opening the door for a Thumbnails project viewer “Soon™”, which is apparently awaiting QA clearance.
    • Apparently this now provides the option of displaying the contents of an inventory folder either as we’re currently familiar with them (icons and text) or in a “thumbnail view” – I’ll review this once the viewer is available.
    • It’s anticipated that as the feature goes mainstream, creators will include thumbnails with their product offerings, as well as users creating their own thumbnails of items already in their inventory.
  • A further Maintenance RC viewer (Maint U) is in development and could be surfacing “Soon™”.

glTF Materials and Reflection Probes

Project Summary

  • To provide support for PBR materials using the core glTF 2.0 specification Section 3.9 and using mikkTSpace tangents, including the ability to have PBR Materials assets which can be applied to surfaces and also traded / sold.
  • There is a general introduction / overview / guide to authoring PBR Materials available via the Second Life Wiki.
  • Substance Painter is also used as a guiding principle for how PBR materials should look in Second Life.
  • Up to four texture maps are supported for PBR Materials: the base colour (which includes the alpha); normal; metallic / roughness; and emissive, each with independent scaling.
  • Given the additional texture load, work has been put into improving texture handling within the PBR viewer.
  • In the near-term, glTF materials assets are materials scenes that don’t have any nodes / geometry, they only have the materials array, and there is only one material in that array.
  • The overall goal is to provide as much support for the glTF 2.0 specification as possible.
  • To provide support for reflection probes and cubemap reflections.
  • The viewer is available via the Alternate Viewers page.
  • Please also see previous CCUG meeting summaries for further background on this project.

Status

  • There is another viewer RC in the works. This has been delayed whilst a number of bugs were resolved, but it is now with QA and, pending their OK, should be surfacing in the very near future.
  • The water fog density range has been updated in the PBR viewer. It was previously capable of being set from -10 through to 10, but now runs from “almost zero” through to 100, with higher numbers indicating higher fog density (and lower visibility).
  • Sky settings:
    • As per my notes from the last CCUG, all “legacy” skies which do not have a probe ambience set, will have one automatically applied by the viewer.
    • This has led to a checkbox being included in the Graphics Preferences to disable the automatic application. However, it is possible / likely the checkbox will be removed.
    • This will mean the only way to have “legacy” skies render as they did pre-PBR will be to set the probe ambience setting to zero in the sky settings.
    • Setting probe ambience to 0 effectively deactivates HDR rendering and tone mapping etc.
  • The above requirement to set the probe ambience to zero has two further impacts:
    • It will be the only means by which “legacy” materials can be rendered as they did pre-PBR; the viewer “hack” to desaturate and “nudge” the luminance of “legacy” materials to have them render as they did pre-PBR has been removed due to a combination of factors, including a noticeable performance hit.
    • The hack that allowed full bright objects to be exempt from dynamic exposure (so they would always stay the same brightness no matter what the exposure) has also had to be removed from the viewer, again as a result of a noticeable performance hit (an extra texture sample for each pixel). So, full bright objects will, in future versions of the PBR viewer, be subject to the same exposure rules as everything else in the scene.
      • This means that full bright objects will still appear to be lit by 100% of the light and be unaffected by things like local lights, but the overall dynamic exposure of the scene will also brighten / darken them in keeping with the rest of the scene, and tone mapping will be applied.
      • SL photographers who display their images in-world with full bright are going to need to experiment with using sky settings with tone mapping disabled when taking their shots, in order to avoid tone mapping being applied twice to images (once when the snapshot is taken and again when it is displayed in-world) and having this unduly affect the full bright rendering of the image (see the toy example after this list).
  • Screen Space Reflections (SSR) – (non-SL overview):
    • Glossy support led to performance hits, particularly when there were a lot of alpha objects in the scene being rendered.
    • This has now been mitigated via a number of means; e.g. SSR is not applied at all above a certain roughness value, nor is it applied to alpha blended objects (the “old fade-out” method is used – that is, the higher the roughness, the weaker the SSR and the greater the reliance on reflection probes).
    • There have also been some general adjustments to SSR to again try to reduce the performance hits, whilst also allowing more distant objects to be reflected off of surfaces – said to be noticeable on water and flat surfaces.
  • The SSR work provoked a reminder that it is important not to abuse the use of alpha surfaces when SSR is used in rendering.
    • For example:
      • If the object is just a single layer users will be looking through to see what lies beyond, then alpha blending “may be the right choice”.
      • However, alphas should not be used to create layered effects (e.g. setting the faces of a surface to transparent for a “window” and then applying a further alpha to give the window a “frosted glass” look, and having the viewer composite them when rendering); instead, both alphas should be baked down into one material in Photoshop (or similar) and applied.
    • This sparked a discussion on how general users who dabble in content creation, and who may be used to working in certain ways, can be properly educated to understand things like this – which may be known and understood by mainstream content creators, but could easily pass others by, as there is no means within the viewer UI of telling which techniques are “good” or “bad” when banging things together (at least until frame rates disappear through the floor and into the cellar).
    • It was therefore suggested again that documentation on the wiki – “best practices” / a “bible” for PBR Materials creation / use – really should accompany the formal release of PBR Materials, and should be clearly pointed to through blog posts, etc.
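
To illustrate the photographers’ point above about tone mapping being applied twice, here is a toy example using a simple Reinhard operator. The viewer’s actual tone mapping curve differs, but applying any such compressive curve a second time flattens highlights further in the same way:

```python
# Toy illustration of why tone-mapped snapshots displayed full bright in a
# tone-mapped scene end up "double processed". The viewer's real curve is not
# Reinhard, but any compressive tone map applied twice flattens highlights further.

def reinhard(x: float) -> float:
    """Simple Reinhard tone map: maps [0, inf) into [0, 1)."""
    return x / (1.0 + x)

for linear_value in (0.25, 1.0, 4.0):
    once = reinhard(linear_value)
    twice = reinhard(once)          # tone map applied again at display time
    print(f"linear {linear_value:>4}: once {once:.3f}, twice {twice:.3f}")
```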

PBR Terrain

Materials applied to Second Life terrains. Credit: Linden Lab
  • Per past meeting notes, Cosmic Linden is prototyping the application of PBR materials on terrain (see this blog post for more).
  • The focus currently is on adding triplanar mapping for terrain texture repeats, particularly on steep elevation changes (so as to avoid the all-too-familiar “stretching” seen with textures); a rough sketch of the triplanar idea is included after this list.
  • The above does potentially incur a performance hit, as it is noted as being for “sufficiently capable machines”, so Cosmic has also been working on trying to optimise performance elsewhere in the use of materials on terrain, as well as carrying out some bug fixing.
  • Important notes with this work:
    • It is not terrain painting. It is the application of PBR materials – terrain painting is described as “something that’s on the radar” at LL.
    • The work does not include support for displacement maps.
    • The work is currently only viewer-side, with no corresponding server-side support, the idea here being to prototype what might be achieved and testing approaches / results.
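
For reference, the sketch below shows the generic triplanar idea mentioned above, written in Python / NumPy for readability rather than as shader code; it is not Cosmic’s implementation. The texture is projected along each world axis and the three samples are blended by the surface normal, which is what stops a single top-down projection stretching on steep slopes:

```python
import numpy as np

def triplanar_sample(texture_fn, world_pos, normal, sharpness: float = 4.0):
    """Generic triplanar texturing sketch (not SL's actual terrain shader).

    texture_fn(u, v) -> value: samples a tiling texture at 2D coordinates.
    world_pos, normal: 3-component values for the surface point being shaded.
    The texture is projected along each world axis and the three samples are
    blended by the (sharpened) normal, avoiding stretching on steep slopes.
    """
    x, y, z = world_pos
    # Project the texture along the X, Y and Z axes respectively.
    sample_x = texture_fn(y, z)
    sample_y = texture_fn(x, z)
    sample_z = texture_fn(x, y)

    # Blend weights from the normal; sharpening biases toward the dominant axis.
    weights = np.abs(np.asarray(normal, dtype=float)) ** sharpness
    weights /= weights.sum()

    return weights[0] * sample_x + weights[1] * sample_y + weights[2] * sample_z

# Hypothetical usage with a checkerboard stand-in for a terrain texture:
checker = lambda u, v: float((int(u) + int(v)) % 2)
print(triplanar_sample(checker, world_pos=(3.2, 8.9, 1.4), normal=(0.1, 0.2, 0.97)))
```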

In Brief

  • It was re-iterated that long-term graphics support on the MacOS side will be MoltenVK (with Vulkan the likely route for Windows), with Zink also being looked at.
  • New user retention:
    • It has been suggested that as many in the various SL communities – those running Community Gateways, content creators, etc. – all have an interest in the new-user experience and bringing people into SL, a semi-regular “New User Experience” meeting might be established by LL, at which ideas can be more readily discussed, feedback given, and communities and creators more directly engaged with LL’s own efforts (e.g. developing an ecosystem of clothing, etc., to support the (still) upcoming new starter avatars).
    • It was pointed out that one of the best ways to encourage retention is for existing users to be welcoming, polite and supportive of new users (clubs that have scripts auto-ejecting people just because they are only a handful of days old, or on the basis that an avatar is not PIOF, for example, aren’t exactly rolling out the welcome mat).
    • The above spun into a general discussion on content, content moderation, and similar.
  • There was a general discussion on Land Impact (LI) and how it is “too high” – although this is highly subjective, given the diversity of content and its uses in SL (static, animated, Animesh, etc. – how high is “too high”? Is “too high” down more to a lack of understanding of actual rendering + simulator load costs than to an objective measurement? etc.). That said, it was acknowledged that more could be done to help “improve” LI in some areas.
  • The above spun into a discussion on physics costs, and how convex hull physics forms are not the most efficient for SL due to the complexity of the tools used to create them and the ease with which mistakes can be made in creating them, with indications that, as the glTF project moves towards support for mesh uploads, how physics shapes are used / applied is likely to be revised.

Next Meeting

  • Thursday, June 29th, 2023.

† The header images included in these summaries are not intended to represent anything discussed at the meetings; they are simply here to avoid a repeated image of a gathering of people every week. They are taken from my list of region visits, with a link to the post for those interested.

2023 SL Puppetry project week #23 summary

Puppetry demonstration via Linden Lab – see below. Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

The following notes have been taken from chat logs and audio recording of the Thursday, June 8th, 2023 Puppetry Project meetings. Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Meeting And Project Overview

  • The Puppetry User Group exists to provide an opportunity for discussion about the development of, and features for, the upcoming Second Life Puppetry project (see below for details), bugs, and feature ideas.
  • These meetings are conducted (as a rule):
    • On Aditi (the Beta grid) at the Castelet Puppetry Theatre, commencing at 13:00 SLT.
    • Those encountering issues in logging-in to Aditi should contact Second Life support for assistance.
    • Generally on alternate Thursdays to the Content Creation meetings.
    • Comprise a mix of text and voice – attendees can use text only, if preferred, but should have Voice enabled in order to hear comments / responses given in voice.
  • These meetings are open to anyone with a concern / interest in the Puppetry project, and form one of a series of regular / semi-regular User Group meetings conducted by Linden Lab. Dates and times of all current meetings can be found on the Second Life Public Calendar, and descriptions of meetings are defined on the SL wiki.

General Project Description as Originally Conceived

LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However since Puppetry’s announcement what I’ve been hearing from many SL Residents is: what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher fidelity collisions. 
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?

Leviathan Linden

  • Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the  LLSD Event API Plug-in (LEAP) system.
    • Note that facial expressions and finger movements are not currently enabled.
    • Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which has yet to be addressed.
  • The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
  • Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
  • Nothing beyond the project viewer is required to “see” Puppetry animations. However, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
  • There is a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).

Additional Work Not Originally In-Scope

  • Direct avatar / object / avatar-avatar interactions (“picking up” an apple; high-fives, etc.).
  • Animations streaming: allowing one viewer to run animations and have them sent via the simulator to all receiving viewers without any further processing of the animations by those viewers.
  • Enhanced LSL integration for animation control.
  • Adoption of better animation standards – possibly glTF.
  • Given the project is incorporating a lot of additional ideas, it is likely to evolve into a rolling development, with immediate targets for development / implementation decided as they are agreed upon, to be followed by future enhancements. As such, much of what goes into the meetings at present is general discussion and recommendations for consideration, rather than confirmed lines of development.

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public facing Kanban board with public issues.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Resources

Meeting Notes

  • The viewer remains at  version 6.6.12.579958, issued on Thursday, May 11th.
    • This update includes access to Rider Linden’s experimental attachment point tracking & forwarding to the server feature.
    • It also includes various incremental improvements to handling puppetry, such as support for parsing binary LEAP data from the LEAP script.
  • Work has slowed a little due to Linden staff being out-of-office recently (hence why no meeting since May 11th), and personnel on the Puppetry project also working on other simulator projects.
    • This has notably impacted Leviathan Linden’s work on Inverse Kinematics (IK), which has had a knock-on impact slowing Rider Linden’s work on LSL support for driving puppetry.
    • However, progress has resumed on the IK work which, while described as currently “not stable”, is moving forward; problems are still to be solved in situations where a target position is too far away for a joint in the skeleton to reach, or where multiple joints (e.g. 5 or 6) are involved.
    • One issue that is proving difficult to handle is that the default avatar mesh joint weighting is incorrect along the forearm and wrist. What is required is two distinct joints at the forearm to do mesh bending correctly: a hinge at the elbow and also a twist constraint along the forearm bone, toward the wrist, rather than (as is currently the case) treating the wrist as a ball joint. This may be the subject of further internal discussion at LL as Leviathan gets more of the IK work nailed down.
  • WRT IK:
    • Leviathan is looking to solve the targeting issues first, then work back to ensure that there are no collisions between a limb and avatar body (e.g. reaching across the avatar’s body to pick something up, and the avatar’s elbow / part of the arm appears to go through the body).
    • Forward And Backward Reaching Inverse Kinematics (FABRIK) – the fundamental algorithm for suggesting new joint positions in a range of applications, including 3D modelling – is the route of choice for the Lab; however, adapting to FABRIK is taking some trial and error (a bare-bones sketch of the algorithm is included after these notes).
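
For reference, here is a bare-bones sketch of the FABRIK pass structure referred to in the last bullet – purely the textbook algorithm for a single chain with a fixed base, with none of the joint constraints or avatar skeleton specifics the viewer work involves:

```python
import numpy as np

def fabrik(joints, target, tolerance=1e-3, max_iterations=20):
    """Minimal FABRIK (Forward And Backward Reaching IK) sketch for a single
    joint chain with a fixed base. Purely illustrative - not LL's implementation,
    and without the joint / rotation constraints discussed at the meetings.

    joints: list of 3D joint positions from base to end effector.
    target: desired 3D position for the end effector.
    """
    joints = [np.asarray(j, dtype=float) for j in joints]
    target = np.asarray(target, dtype=float)
    lengths = [np.linalg.norm(joints[i + 1] - joints[i]) for i in range(len(joints) - 1)]
    base = joints[0].copy()

    # Unreachable target: stretch the chain straight toward it.
    if np.linalg.norm(target - base) > sum(lengths):
        direction = (target - base) / np.linalg.norm(target - base)
        for i in range(len(lengths)):
            joints[i + 1] = joints[i] + direction * lengths[i]
        return joints

    for _ in range(max_iterations):
        # Backward pass: pin the end effector to the target and work toward the base.
        joints[-1] = target.copy()
        for i in range(len(joints) - 2, -1, -1):
            direction = joints[i] - joints[i + 1]
            direction /= np.linalg.norm(direction)
            joints[i] = joints[i + 1] + direction * lengths[i]
        # Forward pass: re-anchor the base and work back out to the end effector.
        joints[0] = base.copy()
        for i in range(len(joints) - 1):
            direction = joints[i + 1] - joints[i]
            direction /= np.linalg.norm(direction)
            joints[i + 1] = joints[i] + direction * lengths[i]
        if np.linalg.norm(joints[-1] - target) < tolerance:
            break
    return joints

# Hypothetical three-bone "arm" reaching for a point:
print(fabrik([(0, 0, 0), (1, 0, 0), (2, 0, 0), (3, 0, 0)], target=(1.5, 1.5, 0.0)))
```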

Additional Notes

  • Aura Linden is working on the Animation Importer project, which has been split off from the Puppetry project. Currently, animations can be imported from some tools / formats, but others aren’t working yet.
    • It was noted at the last meeting that, for animation import, LL is looking towards using / supporting Assimp (github: https://github.com/assimp/assimp), which supports some 30-40 animation formats, converting them to its own format for ease of import to multiple platforms. Notably, it supports .FBX and glTF, so it fits with the Lab’s goal of utilising glTF for materials, mesh imports, etc. (a hypothetical illustration follows these notes).
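
As a hypothetical illustration of the kind of conversion step this implies (this is not LL’s importer, and the attribute names reflect the pyassimp bindings rather than anything SL-specific), a source file can be loaded through Assimp and checked for animation data along these lines:

```python
# Hypothetical check of a source file's animation data via the pyassimp
# bindings for Assimp (https://github.com/assimp/assimp). Not LL's importer;
# attribute names follow the pyassimp wrapper and may vary between versions.
import pyassimp

def count_animations(path: str) -> int:
    # Assimp converts supported formats (FBX, glTF, Collada, ...) into its own
    # in-memory scene structure; we only look at the animation list here.
    scene = pyassimp.load(path)
    try:
        return len(scene.animations)
    finally:
        pyassimp.release(scene)

# Hypothetical usage:
# print(count_animations("walk_cycle.fbx"))
```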

Date of Next Meeting

  • Thursday, June 22nd, 2023, 13:00 SLT.

2023 week #20: SL CCUG meeting summary: mirrors and PBR terrain

Small Town Green, March 2023 – blog post

The following notes were taken from my audio recording and chat log transcript of the Content Creation User Group (CCUG) meeting held on Thursday, May 18th, 2023 at 13:00 SLT. 

These meetings are for discussion of work related to content creation in Second Life, including current work, upcoming work, and requests or comments from the community, together with viewer development work. They are usually chaired by Vir Linden, and dates and times can be obtained from the SL Public Calendar.

Notes:

  • These meetings are conducted in mixed voice and text chat. Participants can use either when making comments or asking / responding to comments, but note that you will need Voice enabled to hear responses and comments from the Linden reps and others using it. If you have issues with hearing or following the voice discussions, please inform the Lindens at the meeting.
  • The following is a summary of the key topics discussed in the meeting, and is not intended to be a full transcript of all points raised.

glTF Materials and Reflection Probes

Project Summary

  • To provide support for PBR materials using the core glTF 2.0 specification Section 3.9 and using mikkTSpace tangents, including the ability to have PBR Materials assets which can be applied to surfaces and also traded / sold.
  • There is a general introduction / overview / guide to authoring PBR Materials available via the Second Life Wiki.
  • Substance Painter is also used as a guiding principle for how PBR materials should look in Second Life.
  • Up to four texture maps are supported for PBR Materials: the base colour (which includes the alpha); normal; metallic / roughness; and emissive, each with independent scaling.
  • Given the additional texture load, work has been put into improving texture handling within the PBR viewer.
  • In the near-term, glTF materials assets are materials scenes that don’t have any nodes / geometry, they only have the materials array, and there is only one material in that array.
  • The overall goal is to provide as much support for the glTF 2.0 specification as possible.
  • To provide support for reflection probes and cubemap reflections.
  • The viewer is available via the Alternate Viewers page.
  • Please also see previous CCUG meeting summaries for further background on this project.

Status

  • The RC glTF / PBR viewer was updated to version 7.0.0.580085 on Tuesday, May 16th.
  • Work is continuing to try to fix the issues with pre-PBR skies looking “broken” in the PBR viewer. This is seen as the last major fix required to the viewer, the rest of the required work being seen as “maintenance” fixes rather than major breakages.
    • The probable solution for the sky issue is to use the reflection probe ambience as a hint as to how the sky should be rendered. If the ambience setting is zero, various environment sliders such as Brightest (currently referred to as Gamma in the viewer) should respond in much the same way as they do in pre-PBR viewers. Otherwise the sky will be rendered as PBR / HDR.
    • This is acknowledged as being something of a kludge, but is seen as the easiest way to maintain rendering for non-glTF / PBR scenes.
  • Water reflections: the glTF / PBR viewer includes updates to the rendering of Linden Water reflections.
    • These are seen as being “not as good” as water reflections seen in non-PBR viewers, but they run with a lot less in the way of overhead on the viewer.
    • The reduction in reflection quality is the result of no longer doing a full rendering pass on Linden Water, and the decision to do this was made to offset the cost of reflection probes and planar mirrors (see below).

Screen Space Reflections and Mirrors

  • Screen Space Reflections (SSR) – (non-SL overview):
    • Now supports both glossy and “stupid” roughness values, with “good” performance, and supports adaptive sampling rates.
    • SSR will apply to “everything” in a scene.
    • The same approach taken with SSR will also be used for planar mirrors.
  • Planar mirrors:
    • Geenz Linden is working on both occlusion culling (at what distance from an avatar / camera should mirrors render) and general mirror constraints (how many mirrors should be active for an avatar at any given time). There are currently no plans to limit mirrors on the basis of size.
    • The latter will most likely initially be set to one, and if there are multiple mirrors within range of an avatar, only the nearest will have reflections actively rendered; the rest will simply render as glossy surfaces (a small sketch of this selection is given after this list).
    • The distance culling is unlikely to exceed 12 m, and there are cases (e.g. some Linden Homes regions) where it would be best set a good deal lower, to avoid situations where someone has a mirror in their home, but it is impacting their neighbours’ rendering.
    • The rate of update for mirrors might be throttled; a decision has not been taken on this as yet.
    • The limitations are unlikely to have debug overrides (although this might change in part for some as testing progresses) in order to prevent people from inadvertently crippling their frame rates. In this latter regard, Geenz notes:  “The performance sucks, the use cases are limited, and you should plan accordingly.”
    • Rendering takes the form of: reculling the scene from the perspective of the mirror, re-rendering that scene into the G-Buffer, re-shading that scene in the deferred pass, and finally re-rendering semi-transparent objects – all of which is intensive.
  • The intent: LL want to get to a point where SSR and reflection probes are sufficient for most reflection use-cases. Mirrors should only ever be used in exceptional cases where SSR and reflection probes cannot achieve the desired effect (i.e. the shiny metal coffee pot on the table and the “glass” on the picture frame on the wall should both utilise SSR / reflection probes for the reflections on their surfaces and not be set as a mirror).
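
Putting the constraints described above into a small illustrative sketch (the limit and distance are taken from the discussion, the names are placeholders, and none of this is viewer code): only the nearest mirror within the cull distance would be actively rendered, with any others in range falling back to plain glossy surfaces:

```python
# Illustrative sketch of the mirror constraints described above: only the
# nearest mirror within the cull distance is actively rendered; any others in
# range fall back to glossy surfaces. Names and numbers are placeholders.
from dataclasses import dataclass

MIRROR_CULL_DISTANCE = 12.0   # metres; "unlikely to exceed 12 m" per the meeting
MAX_ACTIVE_MIRRORS = 1        # initial limit under discussion

@dataclass
class Mirror:
    name: str
    distance_from_camera: float

def choose_active_mirrors(mirrors):
    """Return (active, glossy_fallback) for the mirrors currently in a scene."""
    in_range = sorted(
        (m for m in mirrors if m.distance_from_camera <= MIRROR_CULL_DISTANCE),
        key=lambda m: m.distance_from_camera,
    )
    active = in_range[:MAX_ACTIVE_MIRRORS]
    glossy_fallback = in_range[MAX_ACTIVE_MIRRORS:]
    return active, glossy_fallback

if __name__ == "__main__":
    mirrors = [Mirror("hall", 3.0), Mirror("bathroom", 9.5), Mirror("neighbour's", 30.0)]
    active, fallback = choose_active_mirrors(mirrors)
    print("active:", [m.name for m in active])
    print("glossy only:", [m.name for m in fallback])
```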

PBR Terrain

  • As per my TPVD summary for Friday, May 12th, there is a project underway to provide PBR support for terrain.
  • This is seen as a means of leveraging PBR Materials to offer some quality improvements to terrain ahead of any longer-term terrain project which might yet be considered / actioned.
  • The idea is to enable the use of Materials asset IDs in place of the usual texture IDs and applying them to the ground.
  • An initial alpha build of a viewer supporting this work is available through the content creation Discord server. However, note that it is only alpha and unsupported outside of the project at this time.
    • Please also note that at the request of Linden Lab, I am unable to publish details on how creators can obtain access to the content creation Discord server. Those who are interested should either a) attend a Content Creation User Group meeting and request access there; or b) contact Vir Linden to request details on how to request access.
  • As this is purely a viewer-side change, it does not require a server-side update, but for testing, the viewer should preferably be used on Aditi (the beta grid), where there are materials available within the PBR regions expressly for testing the capability. There is also a debug setting in the viewer which allows it to be used “anywhere”, but this is described as currently “hacky”.
  • This work also sees an increase in the overall texel density for terrain, raising it to 1024×1024, and the texture repeat has been doubled. The latter may only be a temporary move, with discussions at LL revolving around various ideas such as hex tiling.
An example of PBR materials applied to Second Life Terrain. Via Niri Nebula, original by Animats
  • Important notes with this work:
    • It is not terrain painting. It is the application of PBR materials – terrain painting is described as “something that’s on the radar” at LL.
    • The work does not include support for displacement maps.
    • The work is currently only viewer-side, with no corresponding server-side support, the idea here being to prototype what might be achieved and testing approaches / results.
    • It is viewed as a “mini project”, which can potentially be built upon to include elements such as simulator support (including EM tools, etc.).
    • Given the above point, there are also discussions on how best to handle the default grass texture for land (which is just a basic diffuse map) if the PBR terrain work is to go mainstream. Currently, updating this is not part of the mini-project.

glTF and the Future

  • glTF “phase one”: PBR materials (current project).
  • glTF “phase two”: mesh asset uploads support and scenes (up to, but not including animations).
  • glTF “phase three”: animations, morph targets, etc.

Next Meeting

  • Thursday, June 1st, 2023.

† The header images included in these summaries are not intended to represent anything discussed at the meetings; they are simply here to avoid a repeated image of a gathering of people every week. They are taken from my list of region visits, with a link to the post for those interested.

2023 SL Puppetry project week #19 summary

Puppetry demonstration via Linden Lab – see below. Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

Note: this has been updated to include comments made at the TPV Developer meeting on Friday, May 12th.

The following notes have been taken from chat logs and audio recording of the Thursday, May 11th, 2023 Puppetry Project meetings held at the Castelet Puppetry Theatre on Aditi. These meetings are generally held on alternate weeks to the Content Creation User Group (CCUG), on same day / time (Thursdays at 13:00 SLT).

Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Project Summary

General Project Description as Originally Conceived

LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However since Puppetry’s announcement what I’ve been hearing from many SL Residents is: what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher fidelity collisions. 
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?

Leviathan Linden

  • Previously referred to as “avatar expressiveness”, Puppetry is intended to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the  LLSD Event API Plug-in (LEAP) system.
    • Note that facial expressions and finger movements are not currently enabled.
    • Most movement is in the 2D plane (e.g., hand movements from side-to-side but not forward / back), due to limitations with things like depth of field tracking through a webcam, which has yet to be addressed.
  • The back-end support for the capability is only available on Aditi (the Beta grid) and within the following regions: Bunraku, Marionette, and Castelet.
  • Puppetry requires the use of a dedicated viewer, the Project Puppetry viewer, available through the official Second Life Alternate Viewers page.
  • Nothing beyond the project viewer is required to “see” Puppetry animations. However, using the capability to animate your own avatar and broadcast the results requires additional work – refer to the links below.
  • There is a Puppetry Discord channel – those wishing to join it should contact members of LL’s puppetry team, e.g. Aura Linden, Simon Linden, Rider Linden, Leviathan Linden (not a full list of names at this time – my apologies to those involved whom I have missed).

Additional Work Not Originally In-Scope

  • Direct avatar / object / avatar-avatar interactions (“picking up” an apple; high-fives, etc.).
  • Animations streaming: allowing one viewer to run animations and have them sent via the simulator to all receiving viewers without any further processing of the animations by those viewers.
  • Enhanced LSL integration for animation control.
  • Adoption of better animation standards – possibly glTF.
  • Given the project is incorporating a lot of additional ideas, it is likely to evolve into a rolling development, with immediate targets for development / implementation decided as they are agreed upon, to be followed by future enhancements. As such, much of what goes into the meetings at present is general discussion and recommendations for consideration, rather than confirmed lines of development.

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public facing Kanban board with public issues.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Further Information

Meeting Notes

Viewer Progress

  • An updated version of the viewer – version 6.6.12.579958 – was released on Thursday, May 11th.
    • This update includes access to Rider Linden’s experimental attachment point tracking & forwarding to the server feature.
    • It also includes various incremental improvements to handling puppetry, such as support for parsing binary LEAP data from the LEAP script.
  • Avatar attachment point tracking (per the TPVD meeting discussion on May 12th):
    • This allows the tracking of joints (using attachment points) using a script.
    • Using visible attachment points (i.e. those on the avatar, NOT any of the screen-based HUD attachment points) cuts down on the amount of data having to be handled at both ends.
    • The speed at which the attachment point movement is read back is such that it could not be exploited to create a copy of an animation with any real fidelity.
    • This is a deliberate move to ensure that animation creators are not left feeling uncomfortable about LSL animation tracking.
    • There are combined throttle / sleep time elements to tracking attachment points: the throttle limits the number of attachment points that can be tracked over a certain period of time; the script sleep time is designed to allow an animation to move those attachment points forward sufficiently before a further tracking record is made. Thus, it is next to impossible to track and record a coherent animation frame.
  • It was noted that previously, joint constraints had been hard coded in C++, but their configuration has been moved into a human-readable LLSD file which can be modified without rebuilding the viewer.
  • Two areas of focus going forward are:
    • Improving the Inverse Kinematics (IK) system within the viewer – something Leviathan Linden is already working on. This will include overall improvements to IK constraints as well as to positioning, with the existing source-code constraints replaced by a further config file – “constraints” here being in terms of joint rotation / movement.
    • Providing .FBX animation file import and Mixamo skeleton re-targeting.
  • The IK work is still being thrashed out (and subject to much more discussion at meetings), but is seen as a priority over other elements of work, such as the animations streaming idea Leviathan Linden had been working on. The hope is that by improving IK, it will play into streaming and “live” animations a lot more robustly and smoothly. It is also seen as a foundational piece of work for further opening up puppetry and animation work.

General Notes

  • It was noted that for animation import, LL is looking towards using / supporting Assimp (github: https://github.com/assimp/assimp), which supports some 30-40 animation formats, converting them to its own format for ease of import to multiple platforms. Notably, it supports .FBX and glTF, so it fits with the Lab’s goal of utilising glTF for materials, mesh imports, etc.
  • [TPVD meeting, May 12th] This will not alter the existing internal format for animation. It is just  to allow the import of other formats.
  • It is acknowledged that alongside this, the Lab will require a retargeting system for animations, although what form this will take is still TBD.
  • The core of the meeting was a general discussion of things that might / could be done in the future, and what technologies LL might look towards.

Date of Next Meeting

  • Thursday, May 25th, 2023, 13:00 SLT.

2023 week #18: SL CCUG meeting summary – PBR

Elvion, March 2023 – blog post †

The following notes were taken from my audio recording and chat log transcript of the Content Creation User Group (CCUG) meeting held on Thursday, May 4th, 2023 at 13:00 SLT. 

These meetings are for discussion of work related to content creation in Second Life, including current work, upcoming work, and requests or comments from the community, together with viewer development work. They are usually chaired by Vir Linden, and dates and times can be obtained from the SL Public Calendar.

Notes:

  • These meetings are conducted in mixed voice and text chat. Participants can use either to make comments / ask or respond to comments, but note that you will need Voice enabled to hear responses and comments from the Linden reps and others using it. If you have issues with hearing or following the voice discussions, please inform the Lindens at the meeting.
  • The following is a summary of the key topics discussed in the meeting, and is not intended to be a full transcript of all points raised.

Official Viewer Status

No updates through until the meeting, leaving the official viewer pipelines as:

  • Release viewer: Performance Floater / Auto FPS RC viewer, version 6.6.11.579629, promoted April 25.
  • Release channel cohorts (please see my notes on manually installing RC viewer versions if you wish to install any release candidate(s) yourself).
  • Project viewers:
    • Puppetry project viewer, version 6.6.8.576972, December 8, 2022.

glTF Materials and Reflection Probes

Project Summary

  • To provide support for PBR materials using the core glTF 2.0 specification Section 3.9 and using mikkTSpace tangents, including the ability to have PBR Materials assets which can be applied to surfaces and also traded / sold.
  • There is a general introduction / overview / guide to authoring PBR Materials available via the Second Life Wiki.
  • Substance Painter is also used as a guiding principle for how PBR materials should look in Second Life.
  • Up to four texture maps are supported for PBR Materials: the base colour (which includes the alpha); normal; metallic / roughness; and emissive, each with independent scaling.
  • Given the additional texture load, work has been put into improving texture handling within the PBR viewer.
  • In the near-term, glTF materials assets are materials scenes that don’t have any nodes / geometry, they only have the materials array, and there is only one material in that array.
    • It is currently too early to state how this might change when glTF support is expanded to include entire objects.
  • The overall goal is to provide as much support for the glTF 2.0 specification as possible.
  • To provide support for reflection probes and cubemap reflections.
  • The viewer is available via the Alternate Viewers page.
  • Please also see previous CCUG meeting summaries for further background on this project.

Status

  • The viewer is now at Release Candidate status, per the viewer update list above. HOWEVER, the server-side support for glTF / PBR is still awaiting deployment to the Preflight RC channel on the main grid, so for the time being, the viewer still only works on Aditi (the Beta grid), on the following regions: Materials1, Materials Adult, Rumpus Room and Rumpus Room 2 through 4.
  • The viewer will remain in RC for some time to allow for broader feedback to be gained, particularly once the server support has been deployed to simhosts on Preflight (and, most likely Snack as a follow-on), and so is more amenable for testing by a wider group of users / creators.
  • As always, those who do find significant issues in using the viewer in RC are asked to report them via a BUG report ASAP.
  • Runitai Linden (aka Dave P), has been working on avatar performance with PBR, hoping to up the performance a little further, as well as continuing to refine reflection probes.
  • Brad Linden continues to work on bug fixing, improving network traffic overheads, etc.
  • A new addition to the PBR viewer is a reflection probe visualisation debug tool, allowing the volume of space specific probes are influencing to be seen, so people can better understand where reflections on surfaces are coming from, etc.
  • Application priorities: if a surface has either only PBR Materials applied, or PBR overlaying the “traditional” SL materials, it will be rendered according to the glTF specification. If an object has faces with different materials types (e.g. PBR Materials on some faces – such as the sides of a prim cube – and “traditional” SL materials on the others), the viewer will render the PBR faces via the PBR renderer, and those faces with the older materials in a manner consistent with how they should appear if rendered on a non-PBR viewer (as sketched below).
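
A minimal sketch of that per-face application priority, purely to restate the rule (names are placeholders, not viewer code):

```python
# Minimal sketch of the per-face application priority described above:
# any face with a PBR (glTF) material - even one overlaying legacy Blinn-Phong
# materials - goes through the PBR renderer; faces with only legacy materials
# are rendered to match their pre-PBR appearance. Illustrative only.
def choose_render_path(face: dict) -> str:
    if face.get("pbr_material") is not None:
        return "pbr"
    return "legacy"

prim_faces = [
    {"face": 0, "pbr_material": "brushed_metal", "legacy_material": None},
    {"face": 1, "pbr_material": None, "legacy_material": "old_spec_map"},
]
for f in prim_faces:
    print("face", f["face"], "->", choose_render_path(f))
```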

PBR Resources for Testing

  • There have been some requests for content to test PBR Materials against. Content has been provided (by LL and some of the creators testing PBR Materials already) on Aditi, and some of this could potentially be ported.
  • One suggestion was to make a sandbox available for PBR testing, allowing creators to build / import their own content and test it under different EEP settings (e.g. their own / those in the Library), using the Apply Only to Myself option.
  • Custom EEP settings are one particular area of testing that EEP creators might want to look at (both in terms of the PBR viewer and also with any PBR Materials test content). This is because there have been some changes made to the environment rendering in the PBR viewer which might impact some custom EEP settings, which may require them to be adjusted / updated and / or BUG reports raised against significant issues.
  • For those wishing to gain familiarity with PBR Materials in general, there is the SL Wiki entry for it, and it has been suggested some general test content could be provided through that page.

Future glTF Work

  • Geenz Linden is actively working on real-time mirrors as a future follow-on project from the PBR Materials work, as well as working on Screen Space Reflections.

Next Meeting

  • Thursday, May 18th, 2023.

† The header images included in these summaries are not intended to represent anything discussed at the meetings; they are simply here to avoid a repeated image of a gathering of people every week. They are taken from my list of region visits, with a link to the post for those interested.