
The following notes were taken from my audio recording and chat log transcript of the Content Creation User Group (CCUG) meeting held on Thursday, July 20th, 2023.
- The CCUG meeting is for discussion of work related to content creation in Second Life, including current work, upcoming work, and requests or comments from the community, together with viewer development work.
- As a rule, these meetings are:
- Held in-world and chaired by Vir Linden.
- Conducted in a mix of voice and text.
- Held at 13:00 SLT on their respective days.
- Subject to the schedule set within the SL Public Calendar, which includes locations for both meetings (also included at the end of these reports).
- Open to all with an interest in content creation.
Viewer Updates
- The Maintenance U(pbeat) RC viewer, version 6.6.14.581101, was released on July 21st. Key changes in this viewer include:
- Improved parcel audio, as the viewer now leverages VLC for audio streams.
- A new option, Show Ban Lines On Collision (toggled via World→Show), which only shows ban lines on a direct collision (foot or vehicle), rather than having them constantly visible when within camera range.
- The Inventory Extensions viewer was promoted to RC status with version 6.6.14.581058 on July 20th.
- The Alternate Viewers page appears to have suffered a hiccup, listing version 6.6.12.579987 as the “Win32+MacOS<10.13” RC viewer. However:
- The Win32 + pre-macOS 10.13 viewer was promoted to release status on July 5th.
- 6.6.12.579987 was the version number assigned to the Maintenance S RC viewer (primarily translation updates), originally issued on May 11th, and promoted to de facto release status on May 16th.
The release and Project viewers currently in the pipeline remain unchanged:
- Release viewer: 6.6.13.580918, formerly the Maintenance T RC viewer, promoted on July 14th.
- Project viewers:
- Emoji project viewer, version 6.6.13.580279, May 30.
- Puppetry project viewer, version 6.6.12.579958, May 11.
Senra NUX Avatars
- There was a stir in the week when the Senra brand of mesh avatars designed by LL (and primarily intended for new users as a part of the New User eXperience – NUX) were made available through the system Library and then withdrawn.
- This was apparently not an error on LL’s part, but rather the result of an issue with the avatars being noted, prompting their removal from the Library.
- The removal did not prevent some users from grabbing copies of the avatars + accessories (presumably by copying items from the Library to their inventory), which weren’t removed as a part of the “recall”.
- The appearance of the bodies + accessories also sparked a fair degree of forum discussion, starting towards the bottom of page 17 of this thread.
- In reference to the thread, LL encourage those who did manage to retain the Senra bodies and are observing issues / have concerns to continue to record feedback there, as “all eyes” involved in the project are watching that thread.
glTF Materials and Reflection Probes
Project Summary
- To provide support for PBR materials using the core glTF 2.0 specification Section 3.9 and using MikkTSpace tangents, including the ability to have PBR Materials assets which can be applied to surfaces and also traded / sold.
- There is a general introduction / overview / guide to authoring PBR Materials available via the Second Life Wiki.
- For a list of tools and libraries that support GLTF, see https://github.khronos.org/glTF-Project-Explorer/
- Substance Painter is also used as a guiding principle for how PBR materials should look in Second Life.
- Up to four texture maps are supported for PBR Materials: the base colour (which includes the alpha); normal; metallic / roughness; and emissive, each with independent scaling.
- Given the additional texture load, work has been put into improving texture handling within the PBR viewer.
- In the near-term, glTF materials assets are glTF scenes that don’t have any nodes / geometry; they contain only the materials array, and there is only one material in that array (a minimal sketch of such an asset follows this list).
- As a part of this work, PBR Materials will see the introduction of reflection probes which can be used to generate reflections (via cubemaps) on in-world surfaces. These will be a mix of automatically-placed and manually-placed probes (with the ability to move either).
- The overall goal is to provide as much support for the glTF 2.0 specification as possible.
- As a result of the updates, SL’s ambient lighting will change (e.g. indoor spaces will appear darker, regardless of whether or not shadows are enabled), and so there will be a period of adjustment for users (e.g. opting to install lighting in indoor spaces, choosing between the HDR lighting model of glTF or opting to set a sky ambient level).
- The viewer is available via the Alternate Viewers page.
- The simulator code is now more widely available on the Main Grid, including some sandbox environments, but still in RC. Demonstration regions might be found at: Rumpus Room, Rumpus Room 2, Rumpus Room 3, Rumpus Room 4, Rumpus Room 5.
- Please also see previous CCUG meeting summaries for further background on this project.
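As a purely illustrative aside (not anything shown at the meeting), a minimal materials-only glTF asset of the kind described above might look as follows, built in Python for readability. The file names and material name are placeholders of my own; the structure follows the core glTF 2.0 specification, using the four texture maps noted in the summary:

```python
import json

# Minimal "materials-only" glTF 2.0 asset: no scenes, nodes, or
# geometry; just a materials array containing a single PBR material.
# File names and the material name are placeholders.
gltf_material_asset = {
    "asset": {"version": "2.0"},
    "images": [
        {"uri": "base_colour.png"},  # base colour (RGB) + alpha (A)
        {"uri": "normal.png"},       # tangent-space normal map
        {"uri": "metal_rough.png"},  # roughness (G) / metallic (B)
        {"uri": "emissive.png"},     # emissive map
    ],
    "textures": [{"source": i} for i in range(4)],
    "materials": [{
        "name": "example_material",
        "pbrMetallicRoughness": {
            "baseColorTexture": {"index": 0},
            "metallicRoughnessTexture": {"index": 2},
        },
        "normalTexture": {"index": 1},
        "emissiveTexture": {"index": 3},
        "emissiveFactor": [1.0, 1.0, 1.0],
    }],
}

with open("example_material.gltf", "w") as f:
    json.dump(gltf_material_asset, f, indent=2)
```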
Status
- Many in the team have been out-of-office recently, and so work has slowed for a while.
- The focus remains on bug fixing within both the viewer and the simulator code.
Double-Sided Materials Concerns
A request was made for LL to remove double-sided materials from the PBR work, due to the following concerns:
- Inexperienced creators misunderstanding the capability (e.g. a content creator who makes a pendant with 500,000 triangles and applies materials to all of them, overriding backface culling), and:
- Clothing creators using double-sided materials instead of adding additional tris along the edges of clothing (e.g. cuffs, lapels, collars, etc.) to give the illusion of an “inner” material, leading to:
- The potential of both of these leading to noticeable viewer performance impacts (e.g. due to doubling the amount of rasterising the viewer must perform).
In response, Runitai Linden noted:
- Double-sided materials are a part of the glTF specification, and so form a part of the overall requirements for obtaining Khronos 3D Commerce certification, which LL would like to achieve for Second Life; for that reason, they will remain within the PBR project and will not be removed.
- In terms of performance LL believe:
- Double-sided materials generally do not get rasterised twice (e.g. if you are looking at the front face of a leaf with double-sided materials, the back face is not rasterised) – although there are some exceptions to this.
- Double-sided materials are a “fill” hit, not a per triangle hit, so the performance hit decreases exponentially as the camera moves away from the object – so for actual double-sided objects, it is a performance win.
- To help safeguard against accidental misuse, the option to apply double-sided materials must be explicitly enabled when uploading, even if the materials themselves have been created as double-sided (if the option is not explicitly set, then they will be uploaded as single-sided) – see the sketch following this list.
- The issue does admittedly have edge-cases, and there are issues around any implementation for double-sided materials (e.g. how do you penalise incorrect use? Increased LI? But then: a) what about worn items (which are immune to LI); and b) how does the viewer differentiate between “correct” and “incorrect” use of double-sided materials, in order to avoid penalising good practices in error?).
- However, LL are not going to disable / artificially limit the use of double-sided materials due to the potential for misuse, either accidental or deliberate.
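For reference, double-sided rendering in glTF 2.0 is a single opt-in boolean on the material, defaulting to single-sided when omitted, which is consistent with the upload behaviour described above. A minimal sketch (the material values are placeholders):

```python
# In glTF 2.0, double-sided rendering is an opt-in boolean on the
# material itself; when omitted it defaults to false (single-sided),
# so backface culling applies. Values here are placeholders.
leaf_material = {
    "name": "leaf",
    "doubleSided": True,  # disables backface culling for this material
    "pbrMetallicRoughness": {
        "baseColorFactor": [0.2, 0.6, 0.2, 1.0],
    },
}
```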
PBR Mirrors
- This is a follow-on project to the PBR Materials work, intended to provide a controlled method of enabling planar mirrors in SL (i.e. flat surface mirrors which can reflect what is immediately around them, including avatars).
- As per my previous CCUG update, the approach being taken is to use a “hero probe”.
- This uses a materials flag added to a surface which allows it to be considered as a mirror face, based on the proximity of a camera to it.
- When a camera is within the expected range, the flag will instruct the viewer to create a “hero probe”, rendering high resolution (512×512) reflections on the mirror surface until such time as the camera moves away.
- It is an approach which allows for multiple mirrors within a scene, whilst limiting the performance impact to only one active mirror per viewer (a rough sketch of the selection logic follows this list).
- The concept is now working in tests, and depending on performance, it is possible the viewer might be allowed to support up to two hero probes at a time: one for any nearby mirror surface, and the other for generating reflections on any nearby Linden Water.
- It is hoped that a project viewer for public testing of the idea will be available Soon™.
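For illustration only, a rough sketch of how the per-frame hero probe selection described above might work is given below. The names, data structures, and the 20 m activation range are my own assumptions; the actual viewer code is C++ and has not been published in this form:

```python
import math
from dataclasses import dataclass

@dataclass
class Face:
    position: tuple   # face centre, (x, y, z) in region coordinates
    is_mirror: bool   # the proposed per-face materials flag

HERO_PROBE_RANGE = 20.0  # assumed activation distance, in metres

def select_hero_probe(camera_pos: tuple, faces: list) -> Face | None:
    """Pick at most one face to receive the high-resolution (512x512)
    'hero probe' reflection this frame: the nearest mirror-flagged
    face within range of the camera, or None if there isn't one."""
    best_face, best_dist = None, HERO_PROBE_RANGE
    for face in faces:
        if not face.is_mirror:  # only flagged surfaces qualify
            continue
        dist = math.dist(camera_pos, face.position)
        if dist < best_dist:
            best_face, best_dist = face, dist
    return best_face
```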
ARC – Avatar Render Cost
- Intended to be a means of calculating the overall cost of rendering individual avatars by the viewer, ARC has long been acknowledged as inaccurate.
- Currently, the project to adjust both ARC calculations and the actual cost of rendering in-world objects to make them more reasonable – Project ARCTan – remains inactive.
- The problem with metrics like ARC is that they depend on a range of analyses which, when combined, do not necessarily reflect real-world rendering costs very well.
- However, those curious about the rendering cost can use:
- World→Improve Graphic Speed→Your Avatar Complexity to see the render impact (in ms, currently for the CPU, but with the PBR viewer, for the GPU) their own attachments can have on their own and other viewers.
- World→Improve Graphic Speed→Avatars Nearby to see the rendering impact of other avatars within view.
- Note that both will fluctuate due to the general “noise” of rendering; however, the generated figures are far more accurate in real terms than those for ARC.
- Details of these capabilities – first deployed in Firestorm and contributed to Linden Lab for inclusion in all viewers – can be found in this blog, here.
- Questions were asked over the ability to see these figures displayed over avatars’ heads vs. having to go to a “specialised” menu, with some at the meeting pointing to the overhead display being preferable because it is “there”. However, this overlooks the facts that:
- It could be received as “cluttering” the in-world view and reducing immersiveness.
- If displayed as hover text, it could be easily disabled, either by a dedicated UI setting or simply by exposing the debug setting to disable avatar-related hover text.
- Most particularly, any such display (even if added to name tags) would in fact adversely impact performance, due to the CPU / GPU cycles taken up by performing the calculations and then displaying them – with Runitai noting it can take “several times longer” in CPU time to calculate and display an avatar’s render cost than it does to render the avatar.
- The above led to a broader discussion on how to encourage better awareness of avatar impact on viewer performance (ARC shaming not being a positive approach to things), such as: general education among users; having some form of “try before you buy” capability (if this were possible to implement), which would offer the ability to see the impact of a specific attachment ahead of wearing it; or some form of inspection capability at upload which might encourage creators to go back and better optimise their avatar attachments.
- One noted issue here is ensuring both sides of the equation have the tools to make more informed decisions: creators in terms of making their content more performant / efficient, and consumers to enable them to be able to better identify performant / efficient content. The latter is particularly important in its ability to drive market forces through users being able to naturally gravitate towards more efficient content.
Tags for Wearables
This was an idea mooted by the Lab in the meeting – not a project currently being worked upon.
- A tag system which allows items with a certain tag to automatically replace another of the same tag type with a single click, without also replacing other items using the same attachment point. For example, an item tagged as “hair” replaces the currently worn hair with a single click, but without also knocking off a hat also worn on the skull (a minimal sketch of this behaviour follows this list).
- This was expanded upon by the idea of tags being used with demo items – the tag being used to perform tasks such as:
- Only allowing the demo item to be worn within a certain location (e.g. the “dressing” area of the store).
- Somehow recording the item being worn prior to using the demo, so that it is automatically restored when the demo item is removed.
- The problem with the latter idea is that everyone uses demos differently, so assigning a single place at which a demo can be tried is a non-starter (do we really want people trying demos at already busy events? What about items purchased via the MP or affiliate vendors – what location should be assigned to them? How is the creator to differentiate? Multiple versions of the same item for different points-of-sale? What about people who don’t have a home location but use sandboxes, if the demo is tagged for use only within the avatar’s home location? Can this realistically even be done?).
- An alternative suggestion for tags put forward at the meeting was to have them as a part of the upload process, so creators could be reminded / encouraged to specify the desired attachment point via a tag list, so that users are not left with items defaulting to their avatar’s right hand.
- There are a range of issues over any tag system, including:
- a) How well the option would be used unless enforced; b) even if enforced, how many content creators would actually define the preferred attach point over just selecting the first one on the list?
- The idea leans towards WEAR, rather than ADD – so will not necessarily overcome the confusion of new users who wish to ADD an item to their avatar, only to find it knocks something else off of their avatar.
- How many tags should be in the system? “Hair”, “shirt”, “pants”, “gloves”, “shoes” are all straightforward – but what about shawls or shoulder wraps? Should they be classified as a shirt or a collar, or have their own tag or individual tags? How are rings, earrings, pendants, etc., to be classified / tagged?
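To make the core replace-by-tag idea concrete, here is a small sketch of the mooted behaviour; everything in it (tag names, data structures) is hypothetical, as this is an idea under discussion, not a project:

```python
def wear_tagged_item(outfit: list, new_item: dict) -> list:
    """Mooted behaviour: wearing an item replaces any currently worn
    item carrying the same tag, while leaving other items untouched,
    even those sharing the same attachment point."""
    return [i for i in outfit if i["tag"] != new_item["tag"]] + [new_item]

# Hypothetical example: replacing hair without knocking off the hat,
# even though both occupy the skull attachment point.
outfit = [
    {"name": "Bob Cut",   "tag": "hair", "attach": "skull"},
    {"name": "Straw Hat", "tag": "hat",  "attach": "skull"},
]
outfit = wear_tagged_item(
    outfit, {"name": "Long Braid", "tag": "hair", "attach": "skull"}
)
# outfit now contains Straw Hat and Long Braid; Bob Cut was replaced.
```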
Next Meeting
- 13:00 SLT, Thursday, August 3rd, 2023, at the Hippotropolis Campsite.
† The header images included in these summaries are not intended to represent anything discussed at the meetings; they are simply here to avoid a repeated image of a gathering of people every week. They are taken from my list of region visits, with a link to the post for those interested.