2019 Content Creation User Group week #41 summary

Cherishville, October 2019 – blog post

The following notes are taken from my audio recording of the Content Creation User Group (CCUG) meeting, held on Thursday, October 10th 2019 at 13:00 SLT. These meetings are chaired by Vir Linden, and agenda notes, meeting SLurl, etc, are available on the Content Creation User Group wiki page.

Graphics Team

There are two new Lindens now on the rendering team – Euclid Linden, who has been with the Lab for around a month at the time of writing, and Ptolemy Linden, who has been a Linden for the last couple of weeks, again at the time of writing. Both will be working on various rendering projects which will include the Love Me Render viewer updates and also projects like the Environment Enhancement Project (EEP) – which is considered a priority in order to move that project towards release.

Euclid Linden goes full-on shark-man, while Ptolemy goes a little more conservative with a starter avatar


SL Viewer

No further updates thus far in the week. The Vinsanto Maintenance RC viewer (version at the time of writing) looks to be in “good shape” for promotion, but currently requires a little more time in its release cohort.

This leaves the official viewer pipelines at the time of the meeting as follows:

  • Current Release version, formerly the Umeshu Maintenance RC viewer, dated September 5 – No Change.
  • Release channel cohorts:
  • Project viewers:
    • Legacy Profiles viewer, version, September 17. Covers the re-integration of Viewer Profiles.
    • Project Muscadine (Animesh follow-on) project viewer, version, September 11.
    • 360 Snapshot project viewer, version, July 16.
  • Linux Spur viewer, version, dated November 17, 2017 and promoted to release status 29 November 2017 – offered pending a Linux version of the Alex Ivy viewer code.
  • Obsolete platform viewer, version, May 8, 2015 – provided for users on Windows XP and OS X versions below 10.7.


Project ARCTan

Project Summary

An attempt to re-evaluate object and avatar rendering costs to make them more reflective of the actual impact of rendering both. The overall aim is to try to correct some inherent negative incentives for creating optimised content (e.g. with regards to generating LOD models with mesh), and to update the calculations to reflect current resource constraints, rather than basing them on outdated assumptions (e.g. older graphics systems, network capabilities, etc).

Current Status

  • Work is progressing on building a predictive model based on the data LL has been gathering on mesh complexity, frame times, etc.
  • This model will be tested across a wider range of client hardware types and different ranges of settings.
  • The data thus far confirms that geometric complexity plays a large part in performance reduction, but also that there are a lot of other variables in play: rigged meshes are very different in behaviour impact to static meshes; some graphics properties can make a “big difference” in frame time, etc.
  • Details on the impact of textures have yet to be folded into the project.
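LL's actual predictive model is not public; purely as an illustration of the kind of linear cost model described above, with invented coefficients, one could sketch it as:

```python
# Hypothetical sketch of a linear render-cost predictor of the kind
# being built under ARCTan. All coefficients are invented for
# illustration; LL's actual model, features and weights are not public.

def predicted_frame_ms(static_tris, rigged_tris, base_ms=2.0,
                       k_static=0.00002, k_rigged=0.00008):
    """Estimate per-frame cost (ms) from triangle counts.

    Rigged geometry is weighted more heavily, reflecting the meeting's
    observation that rigged meshes behave very differently in impact
    from static meshes.
    """
    return base_ms + k_static * static_tris + k_rigged * rigged_tris
```

A real model would be fitted against the gathered frame-time data and would fold in many more variables (graphics properties, settings, hardware class), but the shape of the prediction step is likely similar.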

Project Muscadine

Project Summary

Currently: offering the means to change an Animesh object's size parameters via LSL.

Current Status

Still largely on hold while the focus is on ARCTan.

Other Items in Brief

  • Mesh Uploader: a couple of points were brought up concerning the mesh uploader:
    • At the time mesh was introduced, materials were not supported; therefore the uploader includes code that discards tangent space data (which is used by normal maps). This means normals must be calculated in real time, causing both performance problems and inconsistencies between how normals appear in Second Life and how they appear in the 3D software used to create them. It’s been suggested this issue should be the subject of a Jira.
    • Allowing for the work on ARCTan, some see the uploader as unfairly punishing meshes on the grounds of size and LI.
      • It was pointed out that a very large mesh that can be complex to render gets hit with a high LI and high upload cost, but a very small object – which may still have tens of thousands of triangles – is not penalised to the same degree, even though it might be just as costly to render.
      • The alternative suggested was to base costs on LOD boundaries and changes, rather than on a simple size / LI basis. The idea here is that the cost is more reflective of what is actually seen and rendered by the viewer, which is seen as “levelling” the playing field (if a small object has a really high LOD tri count, it would incur higher costs, in theory making creators more conservative in how they construct their models).
      • It was pointed out that in some respects complexity / LODs are already being gamed (e.g. by having one high LOD model then setting the medium and low LOD levels to use the same low poly version of the model for both and avoid costs for a proper mid-level LOD model), and such an approach as suggested might further encourage similar gaming.
      • Vir’s view is that the issue is not really that tied to the uploader per se, but is more in the realm of overall cost calculations (although LOD models obviously impact upload costs). As such, ARCTan is really the first step in trying to deal with these kinds of issues, and may help alleviate some of the perceived imbalance seen with upload costs.
  • Materials and Bakes on Mesh: a request was again put forward for LL to provide materials support for Bakes on Mesh. This is not an easy capability to supply, because:
    • System layers for clothing do not have a means to support any materials properties.
    • The Bake Service has no mechanism for identifying and handling materials properties to ensure they are correctly composited.
    • Thus, in order to support materials, both the system wearables and the Bake Service would require a large-scale overhaul which, given all that is going on right now (e.g. trying to transition services to being provisioned via AWS services), the Lab is unwilling to take on.
  • A request was made to allow 2K textures to be displayed by Second Life under “controlled conditions”, the idea being that a single 2K texture could eliminate the need for multiple smaller textures. The two main problems here are:
    • There is already a propensity for people to use high-res textures across all surfaces, whether required or not, on the grounds that “higher must be visually better”, so allowing even higher resolution textures to be displayed could exacerbate this.
    • Given there is no real gatekeeping on how textures are used in-world once uploaded, how would any “controlled conditions” on the use of certain textures actually be implemented (both technically and from a user understanding perspective)?
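On the tangent space point above: the data the uploader currently discards can be derived per triangle from positions and UVs. A minimal sketch (not LL's actual uploader or viewer code) of what "calculating normals/tangents in real time" involves:

```python
# Illustrative sketch only: per-triangle tangent computation from
# positions and UVs -- the kind of data the uploader discards, forcing
# the viewer to recompute it at render time.

def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    """Return the (unnormalised) tangent vector for one triangle.

    p0..p2 are 3D positions; uv0..uv2 are 2D texture coordinates.
    """
    # Position edges of the triangle
    e1 = [p1[i] - p0[i] for i in range(3)]
    e2 = [p2[i] - p0[i] for i in range(3)]
    # Corresponding UV-space edges
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    denom = du1 * dv2 - du2 * dv1
    if abs(denom) < 1e-12:            # degenerate UV mapping
        return [0.0, 0.0, 0.0]
    r = 1.0 / denom
    return [r * (dv2 * e1[i] - dv1 * e2[i]) for i in range(3)]
```

Because the result depends on how edges are chosen, averaged and orthogonalised, a viewer recomputing this can easily disagree with the artist's 3D package, which is the inconsistency raised at the meeting.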
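The LOD-based costing suggestion from the uploader discussion could be framed roughly as follows. This is a sketch of the idea floated at the meeting, not a Lab design; the weights are invented:

```python
# Sketch of the suggested LOD-weighted cost: weight each LOD's triangle
# count by an assumed fraction of viewing time it is actually rendered,
# rather than costing purely on size / LI. Weights are invented numbers,
# purely illustrative.

def lod_weighted_cost(tris_by_lod, weights=(0.1, 0.2, 0.3, 0.4)):
    """tris_by_lod: triangle counts for (high, med, low, lowest)."""
    return sum(t * w for t, w in zip(tris_by_lod, weights))
```

Under such a scheme, a tiny object with a very heavy high-LOD model pays for that heaviness, addressing the imbalance raised at the meeting; equally, as was pointed out, any fixed weighting invites gaming of the LOD slots.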
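The 2K texture argument rests on simple arithmetic: one 2048 texture covers the same pixel area as four 1024 textures, so raw memory is a wash while fetch/decode overhead drops. A back-of-the-envelope sketch, assuming uncompressed RGBA8 in GPU memory with a full mip chain (roughly one-third overhead); actual viewer-side memory depends on compression and decode formats:

```python
# Back-of-the-envelope texture memory, assuming uncompressed RGBA8
# (4 bytes per pixel) plus a full mip chain (~1/3 overhead).
# Figures are illustrative, not viewer internals.

def texture_bytes(side, with_mips=True):
    base = side * side * 4            # RGBA, 1 byte per channel
    return base * 4 // 3 if with_mips else base

# One 2K texture vs. four 1K textures: same pixel area, essentially
# the same memory, but one fetch/decode instead of four.
one_2k  = texture_bytes(2048)
four_1k = 4 * texture_bytes(1024)
```

The counter-argument at the meeting was behavioural rather than arithmetical: nothing stops the same 2K texture being slapped on every surface, quadrupling the cost of today's already-overused 1K habit.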

3 thoughts on “2019 Content Creation User Group week #41 summary”

  1. Do you remember the freebie SL16B outfits? They used 1024-textures for everything, even the texture for the small chains on one pair of boots.

    I am struggling to find documentation of the texture display system. There are disorganised fragments all over the place, most over a decade old. This is well before Mesh and LOD. This describes a part of the system, but is vague about some of the details: http://wiki.secondlife.com/wiki/Image_System There’s some SL-specific jargon that doesn’t seem to get used elsewhere.

    There are references elsewhere to the mip maps being generated by the Viewer, which would mean that a 1024-texture would be downloaded and cached, but the 1024 texture might never be used to render an image.

    There is other material about choosing useful texture sizes, but those SL16B outfits lead me to wonder if anyone has ever bothered to read it.

    I suppose it’s to be expected that programmers will talk about technical solutions, but with SL dependent on user-created content Linden Lab really need to up their game on documentation. I rather pity these new Lindens, who will have to figure out how things work, all the while being shouted at by pixel-greedy ignorant users.


  2. While I’d love to have 2k textures, I do think something has to be done to discourage overuse of 1k textures first. I have seen some truly ridiculous examples, including a small decor item that used a grand total of 16 1k maps, including normal and spec maps.

    Any penalties for using excessively large textures should possibly be tied to the size of the object, though — there’s a difference between using a 1k texture for the pavement of a plaza and using it on a pair of earrings.

    While I’m sure education might help, it’ll probably take an actual penalty to convince some that large textures aren’t necessarily worth it.

    While on the subject of penalties, let’s not forget that rezzed objects are penalized but attached ones are not… pet peeve 🙂


    1. “Any penalties for using excessively large textures should possibly be tied to the size of the object, though — there’s a difference between using a 1k texture for the pavement of a plaza and using it on a pair of earrings.”

      How do you engineer the simulator to manage this (viewer-side management would be unreliable, as it could potentially be circumvented)? How is it to respond to incorrect usage? Bump the LI? Automatically force a downsizing of the texture (itself not a bad idea)? How do you set the size constraints (some will be obvious, as with earrings vs. pavements, but others could well be less so)? What checks need to be added? How will this affect viewer-side workflow? What additional layers of understanding need to be communicated to users? That said, I agree that penalties tend to drive lessons home a lot more than expecting people to read best practices (which is not to offer any excuse for a lack of cohesive documentation).

      “While on the subject of penalties, let’s not forget that rezzed objects are penalized but attached ones are not… pet peeve”

      ARCTan should be looking at that, as the Lab is well aware of the issue.
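One concrete way the "texture size vs. object size" idea from this thread could be framed is texel density. This is purely a sketch of the commenters' suggestion, not anything LL has proposed; the target density is an invented number:

```python
# Sketch of a size-aware texture penalty based on texel density.
# The 256 texels/metre target is an invented figure, purely to
# illustrate the earrings-vs-pavement comparison from the thread.

def texels_per_metre(tex_side, surface_metres):
    """Texels along one axis per metre of surface."""
    return tex_side / surface_metres

def over_budget_factor(tex_side, surface_metres, target=256.0):
    """> 1.0 means the texture is oversized for the surface."""
    return texels_per_metre(tex_side, surface_metres) / target

# A 1024 texture on a 4 m pavement: 256 texels/m -> factor 1.0
# The same texture on a 2 cm earring: 51,200 texels/m -> factor 200.0
```

Even this simple framing runs into the questions raised in the reply above: curved and scaled surfaces, multi-face objects, and where in the pipeline (upload, rez, or render) the check would have to live.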

