2023 week #36: SL CCUG meeting summary: glTF PBR

Viper Heaven, July 2023 – blog post

The following notes were taken from my audio recording and chat log transcript of the Content Creation User Group (CCUG) meeting held on Thursday, September 7th, 2023.

  • The CCUG meeting is for discussion of work related to content creation in Second Life, including current and upcoming LL projects, and encompasses requests or comments from the community, together with viewer development work.
  • As a rule, these meetings are:
    • Held in-world and chaired by Vir Linden.
    • Conducted in a mix of voice and text.
    • Held at 13:00 SLT on their respective days.
    • Subject to the schedule set within the SL Public Calendar, which includes the location for the meetings.
    • Open to all with an interest in content creation.
  • The notes herein are drawn from a mix of my own chat log and audio recording of the meeting, and are not intended to be a full transcript.

glTF Materials and Reflection Probes

Project Summary

  • To provide support for PBR materials using the core glTF 2.0 specification Section 3.9 and using mikkTSpace tangents, including the ability to have PBR Materials assets which can be applied to surfaces and also traded / sold.
  • The overall goal for glTF as a whole is to provide as much support for the glTF 2.0 specification as possible.
  • Up to four texture maps are supported for PBR Materials: the base colour (which includes the alpha); normal; metallic / roughness; and emissive, each with independent scaling.
  • In the near-term, glTF materials assets are materials scenes that don’t have any nodes / geometry: they contain only the materials array, and there is only one material in that array (a minimal sketch of such an asset follows this list).
  • As a part of this work, PBR Materials will see the introduction of reflection probes which can be used to generate reflections (via cubemaps) on in-world surfaces. These will be a mix of automatically-placed and manually-placed probes (with the ability to move either).
  • The viewer is available via the Alternate Viewers page.
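As an illustration of the “materials-only” asset structure noted above, here is a minimal Python sketch of what such a glTF asset might look like when serialised. It is only an approximation built from the core glTF 2.0 material schema: the texture file names and factor values are placeholders, not anything specific to SL’s uploader.

```python
import json

# A minimal, illustrative glTF 2.0 "materials-only" asset: no nodes or meshes,
# just a materials array containing a single PBR material referencing the four
# supported texture maps. File names here are placeholders, not SL specifics.
gltf_material_asset = {
    "asset": {"version": "2.0"},
    "images": [
        {"uri": "base_colour.png"},        # base colour + alpha
        {"uri": "normal.png"},             # tangent-space normal map
        {"uri": "metallic_roughness.png"},
        {"uri": "emissive.png"},
    ],
    "textures": [{"source": i} for i in range(4)],
    "materials": [
        {
            "name": "example_pbr_material",
            "pbrMetallicRoughness": {
                "baseColorTexture": {"index": 0},
                "metallicRoughnessTexture": {"index": 2},
                "metallicFactor": 1.0,
                "roughnessFactor": 1.0,
            },
            "normalTexture": {"index": 1},
            "emissiveTexture": {"index": 3},
            "emissiveFactor": [1.0, 1.0, 1.0],
            "alphaMode": "BLEND",
        }
    ],
}

print(json.dumps(gltf_material_asset, indent=2))
```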

Further Resources

General Status

  • An update for the PBR viewer – version 7.0.0.581684 – was issued following the meeting, on Friday, September 8th.
  • Work is continuing on the communications bloat issue. This will utilise a new message type – GenericStreamingMessage.
  • Bug fixing continues.

Ambient Lighting / Sky Brightness

  • Also noted in previous summaries, there are some issues around ambient lighting where the PBR viewer is concerned. In particular, it tends to render a lot of EEP settings darker than users are used to (in part because the ambient environment lighting in SL has tended to always be over-saturated and bright), and the PBR viewer effectively reverses this to render environments more realistically.
  • This has been an area of work for some time, with none of the results particularly satisfactory, so there is going to be a further round of changes, with the aim of making existing EEP sky settings render more closely to how the author may have intended, rather than remaining overly dark.
  • In addition, the tone mapper in the PBR renderer is going to be adjusted to help reduce undue darkening of older EEP settings.
  • These changes will not impact EEP settings which are created specifically using the PBR settings and capabilities.
  • This work will mean there will likely be a couple more viewer passes as things are tested and adjusted.

Mirrors

  • Mirrors are a part of the glTF / PBR materials project, but something of a separate tranche of work.
  • The idea is to provide the means to have high resolution reflections (i.e. mirrors) within a scene.
  • Initially, only one mirror surface per scene will be active for any viewer.
  • The process will use the PBR reflection probes mechanism, combined with an automated “Hero Probe” mechanism which will generate high resolution (512×512) “reflections” for the mirror.
  • The system will operate on the basis of avatar / camera proximity to a mirror surface triggering the closest reflection probe to become a “Hero Probe” for that avatar / camera. This means that if there are multiple mirrors placed within an environment, only the one closest to a given avatar / camera will be active and display the “reflections” generated by the reflection probe (a conceptual sketch of this selection rule follows this list).
  • Depending on testing and performance, the number of mirrors might be expanded to two – one for mirror surfaces and one for Linden Water to generate high resolution water reflections where appropriate.
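This is not LL’s viewer code; purely as a conceptual Python sketch of the proximity rule described above (the 20 m range and all names are illustrative assumptions), the selection might look something like this:

```python
import math
from dataclasses import dataclass

@dataclass
class ReflectionProbe:
    name: str
    position: tuple          # (x, y, z) in region coordinates
    is_mirror_surface: bool  # surface flagged as a mirror candidate

def select_hero_probe(probes, camera_pos, max_range=20.0):
    """Return the single mirror probe closest to the camera, or None.

    Conceptual only: the real viewer logic and parameters are LL's,
    and the 20 m range here is an arbitrary placeholder.
    """
    def distance(p):
        return math.dist(p.position, camera_pos)

    candidates = [p for p in probes if p.is_mirror_surface and distance(p) <= max_range]
    return min(candidates, key=distance, default=None)

# Example: two mirrors in a scene; only the nearer one becomes the Hero Probe.
probes = [
    ReflectionProbe("bathroom_mirror", (10.0, 12.0, 22.0), True),
    ReflectionProbe("hallway_mirror", (40.0, 35.0, 22.0), True),
    ReflectionProbe("room_probe", (25.0, 25.0, 22.0), False),
]
hero = select_hero_probe(probes, camera_pos=(12.0, 14.0, 22.0))
print(hero.name if hero else "no active mirror")  # -> bathroom_mirror
```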

Status

  • The promised initial build of a PBR viewer supporting mirrors should be made available in week #37. Caveats for this build include:
    • It is not functionally complete and will not render as fully as LL would like.
    • It is not intended for primary use and will likely look like it is breaking things.
    • There are already some known crash issues for Mac OS X which will not be fixed when the viewer is initially made available, but will be fixed in due course.
  • The viewer will be made available via the CCUG Discord Channel, and an LSL script will be provided to allow for easy toggling through mirror options.

In Brief

  • Senra:
    • Requests continue to be made to surface the new web-based avatar creation / customisation tool through the viewer – even if it is just hooking it to the internal browser and providing an easy and obvious means of accessing it. The major reason this is being requested is to help new users who may have become “lost” or confused in trying to customise their avatar via the viewer – which is clearly a very different experience to the web capability – to get back to something they understand and can readily re-use.
    • It was pointed out that Senra has yet to be featured in the Choose Your Avatar option in the viewer.
    • It was also pointed out that, unlike the older “starter avatars”, Senra does not inherently have any form of outfit structure, so if items are worn directly from the Library, anything worn will be copied to the item type folder (shoes to shoes, hair to hair, etc.), rather than being placed within any form of “outfit folder”, making it harder for new users to understand what is happening with their avatar + the items they are using from the Library may end up getting copied multiple times into various folders in their inventory – potentially leading to more confusion / frustration down the line.
    • This led to a lengthy discourse through the greater part of the meeting, further demonstrating the need for a New User Experience User Group (or similar) where those actually responsible for projects like Senra could engage with users, address questions, etc., rather than cherry-picking if / what they want to respond to via the forums.
  • The above spun out into a broader discussion on the development of a Sansar-esque “dressing room” element in the local viewer for outfit / appearance changes (leaving an “untouched” version of the avatar within a region until the change / update is complete), the complications of doing so (attachments require simulator rezzing, for example), and a general discussion on outfit changes, etc., without firm conclusions being drawn.
  • The question was asked about SL supporting vertex animation textures (VAT). This is seen by the Lab as something that might be investigated further down the glTF implementation road, alongside the likes of blend shapes.

Next Meeting

† The header images included in these summaries are not intended to represent anything discussed at the meetings; they are simply here to avoid a repeated image of a gathering of people every week. They are taken from my list of region visits, with a link to the post for those interested.

2023 week #35: SL CCUG meeting summary: glTF PBR

Reality Escape – Books, Coffee & Chairs – Oh My! – blog post

The following notes were taken from my audio recording and chat log transcript of the Content Creation User Group (CCUG) meeting held on Thursday, August 31st, 2023.

  • The CCUG meeting is for discussion of work related to content creation in Second Life, including current and upcoming LL projects, and encompasses requests or comments from the community, together with viewer development work.
  • As a rule, these meetings are:
    • Held in-world and chaired by Vir Linden.
    • Conducted in a mix of voice and text.
    • Held at 13:00 SLT on their respective days.
    • Subject to the schedule set within the SL Public Calendar, which includes the location for the meetings.
    • Open to all with an interest in content creation.
  • The notes herein are drawn from a mix of my own chat log and audio recording of the meeting, and are not intended to be a full transcript.

glTF Materials and Reflection Probes

Project Summary

  • To provide support for PBR materials using the core glTF 2.0 specification Section 3.9 and using mikkTSpace tangents, including the ability to have PBR Materials assets which can be applied to surfaces and also traded / sold.
  • The overall goal for glTF as a whole is to provide as much support for the glTF 2.0 specification as possible.
  • Up to four texture maps are supported for PBR Materials: the base colour (which includes the alpha); normal; metallic / roughness; and emissive, each with independent scaling.
  • In the near-term, glTF materials assets are materials scenes that don’t have any nodes / geometry: they contain only the materials array, and there is only one material in that array.
  • As a part of this work, PBR Materials will see the introduction of reflection probes which can be used to generate reflections (via cubemaps) on in-world surfaces. These will be a mix of automatically-placed and manually-placed probes (with the ability to move either).
  • The viewer is available via the Alternate Viewers page.

Further Resources

General Status

  • Cosmic Linden has been working on some permissions updates to make permissions meaningful when being addressed via LSL. This means that if a materials surface is set to No Modify, a script will not be able to change its tint, for example.
    • Support for this is currently on the glTF / PBR servers on Aditi, where it is being tested (same region names as above, just on Aditi).
    • Changes to reflect these updates will also be made to the viewer so things like the build floater correctly reflect the permissions status.
  • Runitai Linden is continuing to work on the communications bloat issue. This will utilise a new message type – GenericStreamingMessage. This will both make messages passing between the simulator and viewer more compact and less frequent, in order to reduce the load (a conceptual sketch of the batching idea follows this list).
  • To help improve people’s awareness / avoid confusion with the updated Build floater, a new pop-up is being implemented that will be displayed and give information on the change when the floater is first opened and used with a PBR material, complete with a link to the PBR wiki page.
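The GenericStreamingMessage format itself was not detailed at the meeting; purely as a conceptual Python sketch of the “batch and throttle” idea it serves (all names and the flush interval are illustrative assumptions), coalescing rapid scripted material updates might look like this:

```python
import time

class MaterialUpdateBatcher:
    """Conceptual sketch only: coalesce scripted material overrides so that
    rapid per-face changes become fewer, larger sends. This is not LL's
    GenericStreamingMessage implementation."""

    def __init__(self, flush_interval=0.25):
        self.flush_interval = flush_interval  # seconds between sends (placeholder value)
        self.pending = {}                     # (object_id, face) -> latest override
        self.last_flush = time.monotonic()

    def queue(self, object_id, face, override):
        # Later updates to the same face overwrite earlier ones, so rapid
        # script-driven changes collapse into a single entry per face.
        self.pending[(object_id, face)] = override

    def maybe_flush(self, send):
        now = time.monotonic()
        if self.pending and now - self.last_flush >= self.flush_interval:
            send(list(self.pending.items()))  # one message carrying all pending changes
            self.pending.clear()
            self.last_flush = now

# Example: a "disco floor" script retinting 16 faces every frame still results
# in at most one outgoing batch per flush interval.
batcher = MaterialUpdateBatcher()
for frame in range(100):
    batcher.queue("floor-object", frame % 16, {"tint": (frame % 255, 0, 0)})
    batcher.maybe_flush(send=lambda batch: print(f"sending {len(batch)} face overrides"))
    time.sleep(0.01)
```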

Mirrors

  • Mirrors are a part of the glTF / PBR materials project, but something of a separate tranche of work.
  • The idea is to provide the means to have high resolution reflections (i.e. mirrors) within a scene.
  • Initially, only one mirror surface per scene will be active for any viewer.
  • The process will use the PBR reflection probes mechanism, combined with an automated “Hero Probe” mechanism which will generate high resolution (512×512) “reflections” for the mirror.
  • The system will operate on the basis of avatar / camera proximity to a mirror surface triggering the closest reflection probe to become a “Hero Probe” for that avatar / camera. This means that if there are multiple mirrors placed within an environment, only the one closest to a given avatar / camera will be active and display the “reflections” generated by the reflection probe.
  • Depending on testing and performance, the number of mirrors might be expanded to two – one for mirror surfaces and one for Linden Water to generate high resolution water reflections where appropriate.

Summary

  • Geenz Linden hopes to start working on a version of the PBR viewer which supports mirrors very shortly.
  • The final data model for mirrors will not be available on the server end in time for the initial version of a Mirrors viewer, but will be coming later, as it is dependent on dialling-in the parameters required for the mirrors functionality based on testing.
  • Overall, the approach now being taken means that mirrors will not just be limited to planar (flat) surfaces.

Lighting / Ambient Lighting

  • Concerns continue to be raised over colour saturation / ambient colour / light within the PBR viewer within non-PBR regions. People appear to be reporting particular issues with over-saturation and with black surfaces (particularly clothing) looking “flat” or minus any clear definition.
  • It was pointed out that some of the issue may well be down to a combination of: running the PBR viewer within regions that do not have the proper PBR environment adjustments, thus leading – to some degree at least – to tone mapping being overly biased for darker colours + the adjustments made within the PBR viewer to compensate for the lack of reflection probes in non-PBR regions still requiring further tweaking + the PBR viewer generally not rendering the excess ambient light common to existing ambient lighting in non-PBR regions.
  • A lot of this is known to be an issue, and something Runitai Linden has been looking to address, as per my previous CCUG summaries such as this one.

In Brief

  • Rider Linden noted the following updates will be available in the near future via simulator RC releases:
    • “Dog Days” update, due to go to one or more simulator RC channels during week #36 (commencing Monday, September 4th):
      • The unbinding of the Experience KVP database read / write functions from land (users will still require an Experience to access the KVP database).
      • A scripted ability to set CLICK_ACTION_IGNORE, allowing an object to be clicked-through to reach an object behind it – a flag supporting this is included in the current release viewer.
    • The still in development “Fall Colours” update, which will include:
      • llIsFriend – essentially “is the avatar touching this object a friend of the object’s owner?”, allowing a script to act accordingly.
      • llGetInventoryDesc(ription) – a function to return a list of the contents within a rezzed object.
  • User Animats has been experimenting with a new open-source convex hull algorithm for making physics models, which he describes as probably “not suitable” for the SL mesh uploader, but which “might be useful” when working in Blender. There is a forum thread on this line of investigation for those interested.
  • A suggestion was made that, as LL gather stats on the hardware users are employing to access SL, some measure of this data be made public-facing, so that creators might have a better idea of the “typical” hardware environment they need to consider, rather than assuming everyone is either running high-end or low-end systems.
    • This was put in terms of something like a list of the 10 or so most commonly used CPU / GPUs (either individually or in combination); most common RAM amounts used (8GB, 16GB, etc.), with the information made available via a web page or similar.
    • It was pointed out that some of the information might be difficult to put together as it might not be possible to accurately extrapolate or consolidate in a meaningful way.
    • However, Vir Linden thought the idea is worth poking at to see what, if anything, might be done towards achieving it.
  • There was an animated discussion on the permissions system and No Mod objects, including:
    • Why people use No Mod (such as the mistaken belief that it “prevents copybotting” + the misunderstanding users might have that while a scripted item may well have No Mod scripts, the item itself can still be modified, etc.).
    • An idea for a “reset to defaults” override button for Modify objects, so that if a user royally messes up an object, they can click the button and restore the original look / texturing, etc., of the object (potentially very handy for Mod / No Copy objects).
    • The addition of a new attribute “Demo” which can be used to both lock an object into No Mod and add some form of “demo” indicator to it for when the user is wearing / examining it.
    • The problem with any changes to the permissions system is that it a) is already extremely complicated in its implementation; b) would require considerable care; and c) is liable to be a lengthy, far-reaching project. As such, there may not be the appetite within LL to take on such work.

Next Meeting

† The header images included in these summaries are not intended to represent anything discussed at the meetings; they are simply here to avoid a repeated image of a gathering of people every week. They are taken from my list of region visits, with a link to the post for those interested.

2023 week #33: SL CCUG meeting summary: glTF PBR; Senra

Chang’an, May 2023 – click any image for full size – blog post

The following notes were taken from my audio recording and chat log transcript of the Content Creation User Group (CCUG) meeting held on Thursday, August 17th, 2023.

  • The CCUG meeting is for discussion of work related to content creation in Second Life, including current and upcoming LL projects, and encompasses requests or comments from the community, together with viewer development work.
  • As a rule, these meetings are:
    • Held in-world and chaired by Vir Linden.
    • Conducted in a mix of voice and text.
    • Held at 13:00 SLT on their respective days.
    • Subject to the schedule set within the SL Public Calendar, which includes locations for both meetings (also included at the end of these reports).
    • Open to all with an interest in content creation.

Additional note: this meeting suffered several drop-outs (for whatever reason) plus my own Internet connection also went out; as such this is not a complete reflection of the meeting and all topics.

Viewer Status

General Viewer Notes

  • There are a couple of issues with the Inventory Extensions RC viewer which need to be addressed before this progresses to being ready for promotion to release status.
  • The internal discussions on font changes in the Emoji viewer (see my last CCUG / TPV meeting summary) will likely be split into a separate project, allowing the Emoji viewer to progress forward as it is.

glTF Materials and Reflection Probes

Project Summary

  • To provide support for PBR materials using the core glTF 2.0 specification Section 3.9 and using mikkTSpace tangents, including the ability to have PBR Materials assets which can be applied to surfaces and also traded / sold.
  • The overall goal is to provide as much support for the glTF 2.0 specification as possible.
  • Up to four texture maps are supported for PBR Materials: the base colour (which includes the alpha); normal; metallic / roughness; and emissive, each with independent scaling.
  • In the near-term, glTF materials assets are materials scenes that don’t have any nodes / geometry: they contain only the materials array, and there is only one material in that array.
  • As a part of this work, PBR Materials will see the introduction of reflection probes which can be used to generate reflections (via cubemaps) on in-world surfaces. These will be a mix of automatically-placed and manually-placed probes (with the ability to move either).
  • The viewer is available via the Alternate Viewers page.

Further Resources

General Status

  • The communications bloat driven by multiple script-driven glTF materials updates generating multiple connections between the viewer and the simulator (thus impacting performance) continues to be addressed. The outcome of this work is liable to result in protocol changes as and when the work is complete.
  • A number of permission-related fixes have been implemented.

Lighting / Ambient Lighting

  • Discussion continued on the rendering changes the glTF project will bring to Second Life. It has been well-established that the PBR system removes the forward render pipe (aka “non-Advanced Lighting Model (ALM)”) from the viewer’s renderer.
  • Equally, over the last several meetings (and as noted in my CCUG summaries covering PBR) there has been discussion on the fact that PBR utilises HDR + tone mapping for rendering / lighting. This is a significant change to Second Life, particularly when it comes to the amount of rendered ambient light (until now SL has rendered a lot of additional ambient light), resulting in two related issues (a simple tone mapping sketch follows this list):
    • Because it is intended to mimic real-world ambient lighting, environments rendered on the PBR viewer with HDR + tone mapping enabled can look a lot darker than when viewed with a non-PBR viewer, and baked lighting really does not work (and can end up looking very bad) – scripted / direct lighting is required – something for which many store owners and users in general might not be prepared.
    • While disabling the ambient HDR rendering is possible within the PBR viewer (potentially eliminating the above issues), it conversely results in any PBR content within the scene looking “bad” or “broken”, risking content creators trying to find workarounds to the glTF specification in order to make their content “look good” under either lighting condition – something they absolutely should not do.
  • Discussions are still on-going at the Lab on how best to handle this conflict between PBR and non-PBR rendering as PBR is deployed and gradually gains broader use. As a part of this, it has been recognised that one of the most direct means to alert users is via communication.
  • To this end, a form of “best practices” and guidelines for PBR are being developed with the intention to make them available to users in advance of PBR being fully deployed / released. Expect to (hopefully) hear more about this via future CCUG meetings.
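To illustrate why tone mapping interacts with perceived brightness, here is a simple Reinhard-style operator in Python. This is a generic textbook operator used purely for illustration; it is not the specific tone mapper or exposure pipeline the PBR viewer uses.

```python
def reinhard_tonemap(hdr_value, exposure=1.0):
    """Map a linear HDR radiance value into [0, 1) with a simple Reinhard-style
    curve. Illustrative only: the SL PBR viewer's actual tone mapper and
    exposure handling may differ."""
    v = hdr_value * exposure
    return v / (1.0 + v)

# The same sky / ambient radiance reads very differently once tone mapped:
for radiance in (0.1, 0.5, 1.0, 2.0, 8.0):
    print(f"HDR {radiance:>4} -> displayed {reinhard_tonemap(radiance):.2f}")
# Values below 1.0 come out darker than a renderer that simply clamped them,
# and bright values are compressed - one reason legacy EEP skies authored
# without tone mapping in mind can appear darker under the PBR pipeline.
```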

Mirrors

  • Geenz Linden is currently refactoring the Hero probes which will be automatically selected for generating higher resolution reflections based on an avatar’s proximity to a planar mirror surface (initially limited to one – or possibly two – per scene, the second being for Linden Water reflections, but this has yet to be confirmed).
  • This work will see the Hero probes treated as their own class of reflection probes with their own filtering, etc., to avoid conflicts (debugging, etc.) with the “general” reflection map manager. This will also better support the higher resolution of the Hero probes and possibly allow for additional Hero probes to be supported within the viewer in the future.

Building Tools

  • PBR will see a significant change to the Edit / Build floater as the project becomes more widely deployed, and this has started internal discussions at LL about the state of the in-world build tools, and what might be done to improve them for general use. Ideas put forward at the meeting included:
    • More flexible means of cutting holes in prims (e.g. offset from the Z axis of the prim), such as through the introduction of Boolean support.
    • Making text entry within the floater consistent (e.g. clicking on some fields, the existing content is highlighted for over-writing, in others it isn’t).
    • Inclusion of a prim alignment capability (as found in some TPVs) as a default tool + making the alignment more flexible than just to the sides / edges of the bounding boxes of the objects.
    • A broader range of primitive shapes (e.g. simple step units) and improved tools for torturing prims to produce custom shapes.
    • Better exposure of some of the build options on the official viewer (e.g. making the Local Textures option a radio button option in the Texture tab, rather than hidden in a drop-down) to make them more visible.
    • Also making it clearer that an object includes Local Textures, such as via a pop-up, to remind the creator to apply an actual texture to the affected face(s).
    • Better tools / support for making clothing.
    • Better UDIM support for UV maps.
  • These ideas will be fed back into discussions at LL.

Senra Avatars

[Note: this section is abbreviated as I lost my Internet connection for some 14 minutes of the meeting & this was followed by the meeting being disrupted a further 2 times.]

  • Further discussion over the continued confusion / concern over the Senra avatar system (outside of the ongoing disquiet over the licensing agreement), including:
    • Frustration / confusion over the amount of conflicting information being offered by Linden Lab – e.g. the dev kit application form states applicants “must” own a store but Patch Linden has stated in a forum post that owning a store “is not” a requirement.
    • Negativity on the use of an application requirement at all:
      • Some see it as presenting an unnecessary barrier for some (e.g. users who just want to create Senra-related items for their personal, rather than commercial, use).
      • Concerns over privacy / security with the devkit application form being outside of SL (where it is subject to potential abuse) rather than – as with the Mesh Upload Status form – being included within the Secondlife.com dashboard, where it would be both secure and firmly linked to an account.
    • Further questioning as to why AvaStar has been determined to be a core requirement for the devkit, rather than simply relying on Blender.
  • More general frustration was voiced at the idea that the “Senra team” only appear to be willing to engage through the forum threads on the topic, and then only in what is perceived as being a narrow focus of engagement, with no-one appearing willing to attend the CCUG meeting – which, given Linden Lab want creators to produce content for Senra, would seem to be a pretty good place for them to actively address feedback in real time, at least in lieu of any Senra-focused meeting(s).

In Brief

  • While they are not in any way directly connected or related, Cosmic Linden noted that her work on enabling PBR materials as terrain textures in the viewer is being used as a testbed for possible approaches to enabling PBR with avatar Bakes on Mesh – although the latter is not currently an active project.

Next Meeting

† The header images included in these summaries are not intended to represent anything discussed at the meetings; they are simply here to avoid a repeated image of a gathering of people every week. They are taken from my list of region visits, with a link to the post for those interested.

2023 week #31: SL CCUG + TPVD meetings summary

Strandhavet Viking Museum, May 2023 – blog post

The following notes were taken from my audio recording and chat log transcript of the Content Creation User Group (CCUG) meeting held on Thursday, August 3rd, and the Third Party Viewer Developer (TPVD) meeting held on Friday, August 4th, 2023. 

Meetings Overview

  • The CCUG meeting is for discussion of work related to content creation in Second Life, including current work, upcoming work, and requests or comments from the community, together with viewer development work.
  • The TPV Developer meeting provides an opportunity for discussion about the development of, and features for, the Second Life viewer, and for Linden Lab viewer developers and third-party viewer (TPV) / open-source code contributors to discuss general viewer development.
  • As a rule, both meetings are:
    • Held in-world and chaired by Vir Linden.
    • Conducted in a mix of voice and text.
    • Held at 13:00 SLT on their respective days.
    • Subject to the schedule set within the SL Public Calendar, which includes locations for both meetings (also included at the end of these reports).
    • Open to all with an interest in content creation / viewer development.
  • As these meetings occasionally fall “back-to-back” on certain weeks, and often cover some of the same ground, their summaries are sometimes combined into a single report (as is the case here). They are drawn from a mix of my own audio recordings of the meeting + chat log (CCUG), and from the video of the TPVD meeting produced by Pantera Północy (which is embedded at the end of the summaries for reference) + chat log. Note that they are summaries, and not intended to be transcripts of everything said during either meeting.

Viewer News

No changes through the week, leaving the current official viewer in the pipeline as:

Note that the alternate viewer page also lists “Win32+MacOS<10.13 – 6.6.12.579987” as an RC viewer. However, the Win32 + pre-Mac OS 10.13 viewer was promoted to release status on July 5th, and viewer version 6.6.12.579987 points to the Maintenance S viewer, promoted to release status on May 16th.

General Viewer Notes:

  • The Inventory Extensions viewer has a couple of bugs which are preventing it progressing, but these are being worked on. There are also some simulator-side issues (inventory thumbnail images being dropped) which are also being addressed. However, this remains the next potential viewer for promotion to de facto release, alongside the Maintenance U RC viewer.
  • The Maintenance U RC includes an extension to actions available when clicking on in-world objects. CLICK_ACTION_INVISIBLE effectively makes an object “invisible” to mouse clicks, allowing it to be clicked through to whatever might be lying behind it. This functionality will be supported within the next simulator deployment, due in week #32.
  • The Emoji project viewer may see some font changes prior to progressing further (which may additionally require UI work in general) & is still adding further UI additions.

glTF Materials and Reflection Probes

Project Summary

  • To provide support for PBR materials using the core glTF 2.0 specification Section 3.9 and using mikkTSpace tangents, including the ability to have PBR Materials assets which can be applied to surfaces and also traded / sold.
  • The overall goal is to provide as much support for the glTF 2.0 specification as possible.
  • Up to four texture maps are supported for PBR Materials: the base colour (which includes the alpha); normal; metallic / roughness; and emissive, each with independent scaling.
  • In the near-term, glTF materials assets are materials scenes that don’t have any nodes / geometry: they contain only the materials array, and there is only one material in that array.
  • As a part of this work, PBR Materials will see the introduction of reflection probes which can be used to generate reflections (via cubemaps) on in-world surfaces. These will be a mix of automatically-placed and manually-placed probes (with the ability to move either).
  • The viewer is available via the Alternate Viewers page.

Further Resources

Status

  • LL is seeking feedback on how best to handle sky rendering. In short, ambient lighting is handled differently within “non-PBR” viewers and “PBR viewers” (notably, the latter uses HDR + tone mapping where the former does not).
  • As the majority of ambient environments have been designed using the “non-PBR” viewer rendering system, they undergo an auto-adjustment process within the PBR viewer so that they match the glTF specification requirements. Unfortunately, this can leave some skies / ambient lighting looking far too dark – and potentially lead to complaints from users on the PBR viewer (at least until more “PBR compliant” EEP assets become available).
  • To compensate for this, LL included the option to disable the HDR / tone mapping processes in the viewer by setting Probe Ambience to 0 within Graphics preferences. However, doing this makes content specifically designed for PBR environments look muted and much poorer than it should. This brings with it the concern that, in trying to make their content look good in both “PBR” and “non-PBR” environments, creators will start to go “off-piste” (so to speak) from the glTF specification when making new content, thus defeating the entire objective of trying to move SL to match recognised content creation standards.
  • There have been two main schools of thought within LL as to how to best handle both situations, these being:
    • Continue to iterate on the auto-adjustment system so it can handle a broader range of sky settings that are in popular use without them going overly dark within the PBR viewer.
    • Initially make HDR / tone mapping opt-in, rather than opt-out (so probe ambience is set to 0 by default, but can be set above zero by users as required) until such time as all viewers are running with PBR, then switch to making it opt-out (so HDR / tone mapping must be manually disabled).
  • General feedback at the meeting was for LL to continue to try to iterate and improve the automatic adjustment to HDR / tone mapping for skies, so as to avoid the need for content creators to have to start producing “PBR” and “non-PBR” versions of their content.
  • Outside of this, it has also been reported that multiple script-driven glTF materials updates (such as those that might be seen with the changing patterns on a disco floor, for example) actually cause multiple network connections, impacting network bandwidth to the viewer, which is hardly ideal. This is currently being addressed, but until it is fixed on the simulator side, there will be a pause in glTF simulator updates being released.
  • The work on “hero” reflection probes for planar mirrors is continuing to progress.

Senra Discussions – CCUG and TPVD

Via the Content Creation Meeting:

  • A lengthy discussion on the Senra SDK and the requirement for Avastar with Blender – seen as a paywall block for creators who may not have previously entered the clothing market, but who want to in order to support Senra. Unfortunately, no-one directly involved in the Senra body development was at the meeting to handle questions.
    • Avastar is generally required with Blender as the latter uses “non-standard” axes orientation compared to other tools, resulting in issues such as armature rotations being incorrect, plus its Collada export doesn’t (I gather, subject to correction here) support volume bones.
    • However, it was noted that other mesh bodies available within SL provide SDKs where these issues are fixed for Blender without the need to reference Avastar – so the questions were raised as to why LL haven’t done the same (or at least looked at those solutions).
    • The discussion broadened into issues with the avatar blend file itself which have long required fixing, with the promise that all comments on the SDK, Blender, and the avatar Blend file will be passed back to the relevant parties at the Lab.
      Those at the meeting from LL noted their hope that – down the road – the switch from Collada to glTF-compatible formats will help to eliminate many issues related to avatar content creation, and that, if nothing else, they will mark the need to fix the armature rotation issue with that work (“glTF Phase 2”) if it is not addressed beforehand.

Via the TPVD Meeting:

  • It was noted that there currently isn’t a formal venue for discussing Senra outside of the current forum threads or the Discord channel (for those able to access it).
  • The suggestion is currently to have a special purpose meeting – possibly under the CCUG banner – where those who developed Senra could respond to questions / concerns. This suggestion is being passed to Patch Linden who is better placed to arrange a meeting, given the Senra project largely falls within his remit.
  • There is a lot of concern / confusion over the SDK licensing (again, please refer to the forum thread on this for details).
  • It was indicated that the Senra content will soon have inventory thumbnails included, ready for when the Inventory Extension viewer is promoted to release status.
  • Concerns were raised about new users getting confused by wearing Senra items directly from the Library: a) such items do not appear to be highlighted to indicate they have been added to the avatar (this is because the process of “wearing” the item has actually generated a copy within the user’s inventory, which *is* highlighted as added / worn); b) individual items added to an avatar in this manner go to the matching object class type system folder, *not* to a dedicated Senra folder (e.g. mesh clothing is copied to the Objects folder; skins go to the Body folder, etc.).

In Brief

Via the TPVD meeting

  • General discussions on:
    • Scalable fonts (as implemented by Genesis viewer).
    • How TPVs block older versions (for releases, the viewer requests a list of blocked versions from the TPV server in question (say, Firestorm, for the sake of argument), and if it finds itself on the list, it terminates trying to log-in to SL).
    • The move to de-dupe some asset types (textures, notecards, scripts & (possibly) gestures) by giving multiple CDN versions the same UUID number, including clarification on the difference between the original asset, the UUIDs for multiple versions, and inventory IDs (which handle permissions, etc.) – a conceptual sketch of content de-duplication follows this list.
    • An extensive discussion on chat bubbles and toasts in the official viewer.
  • Please refer to the TPVD meeting video below for further details on the above discussions.
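The de-duplication item above was only discussed at a high level; purely as a conceptual Python sketch (not LL’s asset pipeline or its actual identifiers), content-hash based de-duplication, with inventory items keeping their own IDs for permissions, can be illustrated as follows:

```python
import hashlib
import uuid

asset_store = {}   # content hash -> asset id (shared by identical uploads)
inventory = {}     # inventory id -> (asset id, permissions)

def store_asset(content: bytes) -> str:
    """Return a stable asset id for this content; identical bytes de-dupe."""
    digest = hashlib.sha256(content).hexdigest()
    if digest not in asset_store:
        asset_store[digest] = str(uuid.uuid4())
    return asset_store[digest]

def add_to_inventory(content: bytes, permissions: str) -> str:
    """Each inventory item gets its own id (carrying permissions), even
    when the underlying asset is shared."""
    item_id = str(uuid.uuid4())
    inventory[item_id] = (store_asset(content), permissions)
    return item_id

# Two uploads of the same notecard text share one asset id, but remain
# distinct inventory items with potentially different permissions.
a = add_to_inventory(b"Hello, Second Life!", "no-modify")
b = add_to_inventory(b"Hello, Second Life!", "full-perm")
print(inventory[a][0] == inventory[b][0])  # True: same underlying asset
print(a == b)                              # False: separate inventory items
```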

Next Meetings

† The header images included in these summaries are not intended to represent anything discussed at the meetings; they are simply here to avoid a repeated image of a gathering of people every week. They are taken from my list of region visits, with a link to the post for those interested.

2023 week #29: SL CCUG meeting summary: Senra, glTF, etc.

LemonCliff, May 2023 – blog post

The following notes were taken from my audio recording and chat log transcript of the Content Creation User Group (CCUG) meeting held on Thursday, July 20th, 2023. 

  • The CCUG meeting is for discussion of work related to content creation in Second Life, including current work, upcoming work, and requests or comments from the community, together with viewer development work.
  • As a rule, these meetings are:
    • Held in-world and chaired by Vir Linden.
    • Conducted in a mix of voice and text.
    • Held at 13:00 SLT on their respective days.
    • Subject to the schedule set within the SL Public Calendar, which includes locations for both meetings (also included at the end of these reports).
    • Open to all with an interest in content creation.

Viewer Updates

  • The Maintenance U(pbeat) RC viewer, version 6.6.14.581101, was released on July 21st. Key changes in this viewer comprise:
    • Improved parcel audio, as the viewer leverages VLC for audio streams.
  • The Inventory Extensions viewer was promoted to RC status with version 6.6.14.581058, on July 20th.
    • A new option, Show Ban Lines On Collision (toggled via World→Show), which will only show ban lines on a direct collision (foot or vehicle) rather than having them constantly visible when within camera range.
  • The Alternative Viewers page appears to have suffered a hiccup, listing version 6.6.12.579987 as the “Win32+MacOS<10.13” RC viewer. However:
    • The Win32 + pre-Mac OS 10.13 viewer was promoted to release status on July 5th.
    • 6.6.12.579987 was the version number assigned to the Maintenance S RC viewer (primarily translation updates), originally issued on May 11th, and promoted to de facto release status on May 16th.

The release and Project viewers currently in the pipeline remain unchanged:

    • Release viewer: 6.6.13.580918, formerly the Maintenance T RC viewer, promoted on July 14.
    • Project viewers:

Senra NUX Avatars

  • There was a stir in the week when the Senra brand of mesh avatars designed by LL (and primarily intended for new users as a part of the New User eXperience – NUX) were made available through the system Library and then withdrawn.
  • This was apparently not an error on LL’s part, but rather the result of an issue with the avatars being noted, prompting their removal from the Library.
  • The removal did not prevent some users grabbing copies of the avatars + accessories (presumably by copying items from the Library to their inventory), which weren’t removed as a part of the “recall”.
  • The appearance of the bodies + accessories also sparked a fair degree of forum discussion, approximately starting towards the bottom of page 17 of this thread.
  • In reference to the thread, LL encourage those who did manage to retain the Senra bodies and are observing issues / have concerns to continue to record feedback there, as “all eyes” involved in the project are watching that thread.

glTF Materials and Reflection Probes

Project Summary

  • To provide support for PBR materials using the core glTF 2.0 specification Section 3.9 and using mikkTSpace tangents, including the ability to have PBR Materials assets which can be applied to surfaces and also traded / sold.
  • There is a general introduction / overview / guide to authoring PBR Materials available via the Second Life Wiki.
  • For a list of tools and libraries that support glTF, see https://github.khronos.org/glTF-Project-Explorer/
  • Substance Painter is also used as a guiding principle for how PBR materials should look in Second Life.
  • Up to four texture maps are supported for PBR Materials: the base colour (which includes the alpha); normal; metallic / roughness; and emissive, each with independent scaling.
  • Given the additional texture load, work has been put into improving texture handling within the PBR viewer.
  • In the near-term, glTF materials assets are materials scenes that don’t have any nodes / geometry: they contain only the materials array, and there is only one material in that array.
  • As a part of this work, PBR Materials will see the introduction of reflection probes which can be used to generate reflections (via cubemaps) on in-world surfaces. These will be a mix of automatically-placed and manually-placed probes (with the ability to move either).
  • The overall goal is to provide as much support for the glTF 2.0 specification as possible.
  • As a result of the updates, SL’s ambient lighting will change (e.g. indoor spaces will appear darker, regardless of whether or not Shadows are enabled), and so there will be a period of adjustment for users (e.g. opting to install lighting in indoor spaces, choosing between the HDR lighting model of glTF or opting to set a sky ambient level).
  • The viewer is available via the Alternate Viewers page.
  • The simulator code is now more widely available on the Main Grid, including some sandbox environments, but still in RC. Demonstration regions might be found at: Rumpus Room, Rumpus Room 2, Rumpus Room 3, Rumpus Room 4, Rumpus Room 5.
  • Please also see previous CCUG meeting summaries for further background on this project.

Status

  • Many in the team have been out-of-office recently, and so work had slowed for a while.
  • The focus remains on bug fixing within both the viewer and the simulator code.

Double-Sided Materials Concerns

A request was made for LL to remove double-sided materials from the PBR work, due to the following concerns:

  • Inexperienced creators misunderstanding the capability (e.g. a content creator who makes a pendant with 500,000 triangles and applies materials to all of them, overriding backface culling), and:
  • Clothing creators who, rather than adding additional tris along the edges of clothing (e.g. cuffs, lapels, collars, etc.) to give the illusion of an “inner” material, utilise double-sided materials instead, leading to:
  • The potential of both of these leading to noticeable viewer performance impacts (e.g. due to doubling the amount of rasterising the viewer must perform).

In response Runitai Linden noted:

  • Double-sided materials are a part of the glTF specification, and so form a part of the overall requirements for obtaining Khronos 3D Commerce certification, which LL would like to achieve for Second Life; for that reason, they will remain within the PBR project and will not be removed.
  • In terms of performance LL believe:
    • Double-sided materials generally do not get rasterised twice (e.g. if you are looking at the front face of a leaf with double-sided materials, the back face is not rasterised) – although there are some exceptions to this.
    • Double-sided materials are a “fill” hit, not a per triangle hit, so the performance hit decreases exponentially as the camera moves away from the object – so for actual double-sided objects, it is a performance win.
  • To help safeguard against accidental misuse, the option to apply double-sided materials must be explicitly enabled when uploading, even if the materials themselves have been created as double-sided (if the option is not explicitly set, they will be uploaded as single-sided). The glTF-side flag involved is shown in the sketch after this list.
  • The issue does admittedly have edge-cases, and there are issues around any implementation of double-sided materials (e.g. how do you penalise for incorrect use? Increased LI? But then: a) what about worn items (which are immune to LI); and b) how does the viewer differentiate between “correct” and “incorrect” use of double-sided materials, in order to avoid penalising good practices in error?).
  • However, LL are not going to disable / artificially limit the use of double-sided materials due to the potential for misuse, either accidental or deliberate.
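For reference, the switch being discussed on the asset side is the material-level doubleSided boolean from the core glTF 2.0 specification, which defaults to false; a minimal illustrative Python snippet (all values are placeholders) follows:

```python
import json

# Minimal glTF 2.0 material showing the core-spec "doubleSided" flag.
# It defaults to false, matching the upload behaviour described above:
# unless explicitly enabled, the material is treated as single-sided.
leaf_material = {
    "name": "foliage",
    "doubleSided": True,
    "alphaMode": "MASK",
    "alphaCutoff": 0.5,
    "pbrMetallicRoughness": {
        "baseColorFactor": [0.3, 0.6, 0.2, 1.0],
        "metallicFactor": 0.0,
        "roughnessFactor": 0.9,
    },
}
print(json.dumps(leaf_material, indent=2))
```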

PBR Mirrors

  • This is a follow-on project to the PBR Materials, intended to provide a controlled method to enable planar mirrors in SL (i.e. flat surface mirrors which can reflect what is immediately around them, including avatars).
  • As per my previous CCUG update, the approach being taken is to use a “hero probe”.
    • This uses a materials flag added to a surface which allows it to be considered as a mirror face, based on the proximity of a camera to it.
    • When a camera is within the expected range, the flag will instruct the viewer to create a “hero probe”, rendering high resolution (512×512) reflections on the mirror surface until such time as the camera moves away.
    • It is an approach which allows for multiple mirrors within a scene, whilst minimising the performance impact to only one mirror per viewer.
  • The concept is now working in tests, and depending on performance, it is possible the viewer might be allowed to support up to two hero probes at a time: one for any nearby mirror surface, and the other for generating reflections on any nearby Linden Water.
  • It is hoped that a project viewer for public testing of the idea will be available Soon™.

ARC  – Avatar Render Cost

  • Intended to be a means of calculating the overall cost of rendering individual avatars by the viewer, ARC has long been acknowledged as inaccurate.
  • Currently, the project to adjust both ARC calculations and the actual cost of rendering in-world objects to make them more reasonable – Project ARCTan – remains inactive.
  • The problem with metrics like ARC is that they depend on a range of analyses which, when combined, do not necessarily reflect real-world rendering costs very well.
  • However, those curious about the rendering cost can use:
    • World→Improve Graphic Speed→Your Avatar Complexity to see the render impact (in ms, currently for the CPU, but with the PBR viewer, for the GPU) their own attachments can have on their own and other viewers.
    • World→Improve Graphic Speed→Avatars Nearby to see the rendering impact of other avatars within view.
    • Note that both will fluctuate due to the general “noise” of rendering; however, the generated figures are far more accurate in real terms than those for ARC.
    • Details of these capabilities – first deployed in Firestorm, and contributed to Linden Lab for inclusion in all viewers, can be found in this blog, here.
  • Questions were asked over the ability to see these figures displayed over avatars’ heads vs. having to go to a “specialised” menu, with some at the meeting pointing to the overhead display as being preferable, because it is “there”. However, this overlooks the following facts:
    • It could be received as “cluttering” the in-world view and reducing immersiveness.
    • If displayed as hover text, it could be easily disabled, either by a dedicated UI setting or simply by exposing the debug setting to disable avatar-related hover text.
    • Most particularly, any such display (even if added to name tags) would in fact adversely impact performance due to the CPU / GPU cycles taken up by performing the calculations and then displaying them – with Runitai noting it can take “several times longer” in CPU time to calculate and display avatar render cost than it does to render the avatar.
  • The above led to a broader discussion on how to encourage better awareness of avatar impact on viewer performance (ARC shaming not being a positive approach to things), such as general education among users, having some form of “try before you buy” capability (if this were possible to implement) which would offer the ability to see the impact of wearing a specific attachment ahead of wearing it, or some form of inspection capability at upload which might encourage creators to go back and better optimise their avatar attachments.
    • One noted issue here is ensuring both sides of the equation have the tools to make more informed decisions: creators in terms of making their content more performant / efficient, and consumers to enable them to be able to better identify performant / efficient content. The latter is particularly important in its ability to drive market forces through users being able to naturally gravitate towards more efficient content.

Tags for Wearables

This was an idea mooted by the Lab in the meeting – not a project currently being worked upon.

  • A tag system which allows items with a certain tag to automatically replace another of the same tag type with a single click, and without also replacing other items using the same attachment point. For example, an item tagged as “hair” replaces the currently worn hair with a single click, but without also knocking off a hat also worn on the skull (a speculative sketch of this behaviour follows this list).
  • This was expanded upon by the idea of tags being used with demo items – the tag being used to perform tasks such as:
    • Only allowing the demo item to be worn within a certain location (e.g. the “dressing” area of the store).
    • Somehow recording the item being worn prior to using the demo, so that it is automatically restored when the demo item is removed.
  • The problem with the latter idea is that everyone uses demos differently, so assigning a single place at which a demo can be tried is a non-starter (do we really want people trying demos at already busy events? What about items purchased via the MP or affiliate vendors – what location should be assigned to them? How is the creator to differentiate? Multiple versions of the same item for different points-of-sale? What about people who don’t have a home location but use sandboxes, with the demo tagged for use only within the avatar’s home location? Can this realistically even be done?).
  • An alternative suggestion for tags put forward at the meeting was to have them as a part of the upload process, so creators could be reminded / encouraged to specify the desired attachment point via a tag list, so that users are not left with items defaulting to their avatar’s right hand.
  • There are a range of issues over any tag system, including:
    • a) How well the option would be used unless enforced; b) even if enforced, how many content creators would actually define the preferred attach point over just selecting the first one on the list?
    • The idea leans towards WEAR, rather than ADD – so will not necessarily overcome the confusion of new users who wish to ADD an item to their avatar, only to find it knocks something else off of their avatar.
    • How many tags should be in the system? “Hair”, “shirt”, “pants”, “gloves”, “shoes” are all straightforward – but what about shawls or shoulder wraps? Should they be classified as a shirt or a collar, or have their own tag or individual tags? How are rings, earrings, pendants, etc., to be classified / tagged?
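Since this is only an idea being mooted, the following is nothing more than a speculative Python sketch of the basic behaviour described (replacement keyed on a tag rather than on the attachment point); it does not reflect any existing or planned SL capability.

```python
# Speculative sketch only - no such tag system exists in SL today.
# Items carry both an attachment point and a tag; replacement is keyed
# on the tag, not the attachment point.
worn = [
    {"name": "Bob cut",   "attach_point": "skull", "tag": "hair"},
    {"name": "Sun hat",   "attach_point": "skull", "tag": "hat"},
    {"name": "Old boots", "attach_point": "feet",  "tag": "shoes"},
]

def wear_by_tag(worn_items, new_item):
    """Remove any currently worn item sharing the new item's tag, then add it.
    Other items on the same attachment point are left untouched."""
    kept = [item for item in worn_items if item["tag"] != new_item["tag"]]
    return kept + [new_item]

worn = wear_by_tag(worn, {"name": "Long braid", "attach_point": "skull", "tag": "hair"})
for item in worn:
    print(item["name"], "->", item["attach_point"])
# The hat stays attached to the skull; only the hair was replaced.
```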

Next Meeting

† The header images included in these summaries are not intended to represent anything discussed at the meetings; they are simply here to avoid a repeated image of a gathering of people every week. They are taken from my list of region visits, with a link to the post for those interested.

2023 SL Puppetry project week #28 summary

Puppetry demonstration via Linden Lab – see below. Demo video with the LL comment “We have some basic things working with a webcam and Second Life but there’s more to do before it’s as animated as we want.”

The following notes have been taken from chat logs and audio recording of the Thursday, July 13th, 2023 Puppetry Project meeting. Notes in these summaries are not intended to be a full transcript of every meeting, but to highlight project progress / major topics of discussion.

Project Overview

General Project Description as Originally Conceived

LL’s renewed interest in puppetry was primarily instigated by Philip joining LL as official advisor, and so it really was about streaming mocap. That is what Philip was interested in and why we started looking at it again. However since Puppetry’s announcement what I’ve been hearing from many SL Residents is: what they really want from “puppetry” is more physicality of the avatar in-world: picking up objects, holding hands, higher fidelity collisions. 
As a result, that is what I’ve been contemplating: how to improve the control and physicality of the avatar. Can that be the new improved direction of the Puppetry project? How to do it?

– Leviathan Linden

  • The project is rooted in the idea of “avatar expressiveness”, referenced in a February 2022 Lab Gab session with Philip Rosedale and Brad Oberwager and officially introduced as Puppetry in August 2022, to provide a means by which avatars can mimic physical world actions by their owners (e.g. head, hand, arm movements) through tools such as a webcam and using technologies like inverse kinematics (IK) and the LLSD Event API Plug-in (LEAP) system.
  • Since that time the project has expanded in size, attempting to encompass improving SL’s (somewhat primitive) IK system; investigating and developing ideas for direct avatar-to-avatar and avatar-to-object interactions (“picking up” an apple; high-fives, etc.); providing enhanced LSL integration for animation control; broader hardware support; adoption of better animation standards, etc.
  • This has led to a change in approach for the project – see below for more.

Bugs, Feature Requests and Code Submissions

  • For those experimenting with Puppetry, Jiras (bug reports / fixes or feature requests) should be filed with “[Puppetry]” at the start of the Jira title.
  • There is also a public facing Kanban board with public issues.
  • Those wishing to submit code (plug-ins or other) or who wish to offer a specific feature that might be used with Puppetry should:

Resources

Change In Approach

  • The Puppetry User Group meetings have, until now, been held on Aditi (the Beta grid) at the Castelet Puppetry Theatre, commencing at 13:00 SLT, and generally on alternate Thursdays to the Content Creation meetings, and as per the Second Life Public Calendar.
  • As of the July 13th, 2023 these meetings have now been suspended until further notice.
  • This does not mean the project is being abandoned; it was noted during the meeting that as several of those involved in the project attend other User Group meetings – notably, but not exclusively, the Tuesday Simulator User Group (SUG) meeting – discussions on Puppetry can continue within those meetings.
  • Explaining the decision, Simon Linden noted:
There’s definitely a lot of interested tech and possible features with [Puppetry]. [However] the original idea of doing real-time mocap on webcams was like opening Pandora’s box in terms of features and ideas, and also was a lot harder than we expected … in the end I think it’s better to work on some fundamental tech that can be used in a lot of other ways – like IK, streaming, figuring out how animation data can work with scripts, solving some challenges like just doing a decent hand-shake.

– Simon Linden, July 13th, 2023

  • Elements of Puppetry which have thus far been confirmed as continuing as WIP projects comprised (but are not necessarily limited to):
    • The real time animation streaming component of Puppetry (forming something of a hybrid between the LEAP <-> viewer work already undertaken, and Leviathan Linden’s work in streaming animation playback from one viewer, through the simulator and to other viewers without any direct interaction with the animations by the simulator).
    • IK Improvements and updates (see below).
    • Improved animation import support (see below).
  • There are also broader discussions going on in the Lab regarding possible further overhaul of the animation system.
  • Given the above decision re: Puppetry meetings, this will be the last of my dedicated Puppetry summaries for the time being, but I will continue to report on Puppetry / related work as and when it is discussed at other meetings, such as SUG meetings and Content Creation User Group meetings.

Meeting Notes

Animation Import

  • One of the Puppetry expansions, improving / broadening animation import into Second Life, was spun-off into its own project in June.
  • Notably with this work, LL is using Assimp (github: https://github.com/assimp/assimp), which supports some 30-40 animation formats (including the likes of .FBX and glTF), converting them to its own format for ease of import to multiple platforms.
  • The work is now in its own branch of the official viewer (DRTVWR-584, not currently ready for public consumption in any way).
  • This viewer uses the Assimp engine to read animation files, and the viewer extracts data from there for preview and then uploading as an SL animation (a small illustrative sketch follows this list). Animation imports it supports include:
    • BVH files, as per the current animation import within the viewer.
    • FBX format files.
    • Animations saved with the Mixamo skeleton, as supported by other tools.
  • The Mixamo element of the viewer is currently incomplete, but there is a focus on getting it wrapped up so the viewer can enter the project viewer pipeline at some point for public testing. However, when complete, it is hoped that importing an animation with a Mixamo skeleton from the likes of Blender or a tool like Rokoko Studio should work fairly seamlessly.
  • To help with imports, Aura Linden has included an option on import to scale motion, which might be further automated slightly for improved ease-of-use among less experienced content creators.
    • If this process is automated, it will include a capability for manual override, of course, for those who are more experienced with animation creation and import.
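LL’s importer drives Assimp from the C++ viewer; purely as an illustration of the same basic idea, the sketch below uses the pyassimp Python bindings to let Assimp parse an animation file and report what it found. The attribute names (scene.animations, anim.channels, anim.duration) are my assumption of how the bindings expose Assimp’s data, so treat this as a sketch rather than working tooling.

```python
# Illustrative only: let Assimp read one of its many supported formats,
# then inspect the animations it found. Attribute names are assumptions
# about how pyassimp (pip install pyassimp) exposes Assimp's data.
import sys
import pyassimp

def list_animations(path):
    with pyassimp.load(path) as scene:
        if not scene.animations:
            print(f"{path}: no animations found")
            return
        for anim in scene.animations:
            # Durations are in ticks; ticks-per-second can be zero in some
            # files, in which case a sensible default would need to be assumed.
            print(f"{path}: {len(anim.channels)} channels, "
                  f"duration {anim.duration} ticks")

if __name__ == "__main__":
    for filename in sys.argv[1:]:   # e.g. walk.bvh, wave.fbx
        list_animations(filename)
```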

Inverse Kinematics (IK) Updates

  • Leviathan Linden’s IK work has pretty much become a mini-project in its own right.
  • Most recently, he has been focused on implementing Forward And Backward Reaching Inverse Kinematics (FABRIK) – a fundamental algorithm for suggesting new joint positions in a range of applications, including 3D modelling (a minimal sketch follows this list).
  • This work has been in part a matter of trial-and-error, and most recently, Leviathan has been fixing issues of where constraints are enforced in FABRIK which impact the SL avatar, although he still has some more constraints to fix.
  • Fixing these issues has required additional visualisation / debugging tools, which he’s having to code for himself.
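FABRIK itself is well documented; as a rough illustration of the forward and backward reaching passes being referred to (and emphatically not Leviathan’s viewer code), here is a minimal unconstrained chain solver in Python. Joint constraints, the part currently being debugged for the SL avatar, are deliberately omitted.

```python
import math

def fabrik(joints, target, tolerance=0.01, max_iterations=20):
    """Minimal FABRIK solver for an unconstrained joint chain.

    joints: list of (x, y, z) positions from root to end effector.
    Returns new joint positions reaching toward target while preserving
    segment lengths. Joint constraints are omitted in this sketch.
    """
    def dist(a, b):
        return math.dist(a, b)

    def lerp(a, b, t):
        return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

    lengths = [dist(joints[i], joints[i + 1]) for i in range(len(joints) - 1)]
    root = joints[0]

    if dist(root, target) > sum(lengths):
        # Target unreachable: stretch the chain straight toward it.
        for i in range(len(joints) - 1):
            t = lengths[i] / dist(joints[i], target)
            joints[i + 1] = lerp(joints[i], target, t)
        return joints

    for _ in range(max_iterations):
        # Backward pass: pin the end effector to the target, work to the root.
        joints[-1] = target
        for i in range(len(joints) - 2, -1, -1):
            t = lengths[i] / dist(joints[i + 1], joints[i])
            joints[i] = lerp(joints[i + 1], joints[i], t)
        # Forward pass: pin the root back in place, work out to the end.
        joints[0] = root
        for i in range(len(joints) - 1):
            t = lengths[i] / dist(joints[i], joints[i + 1])
            joints[i + 1] = lerp(joints[i], joints[i + 1], t)
        if dist(joints[-1], target) < tolerance:
            break
    return joints

# Example: a three-segment "arm" reaching for a point.
arm = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0), (3.0, 0.0, 0.0)]
solved = fabrik(arm, target=(1.5, 1.5, 0.0))
print([tuple(round(c, 2) for c in j) for j in solved])
```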

Additional Notes

  • A further request was made for updating the bento reference skeletons on the wiki, which are reported as being “broken”, per BUG-10981 “Test content for public Project Bento Testing wiki page” and this Content Creation Discord channel discussion. This will be chased internally at LL to see if action is being taken.