Updated, February 16th: the suggestion to make suitable mipmaps available to users when uploading textures (see the end of this summary) has been translated into a feature request by Beq Janus to allow users to select multiple image resolutions when uploading a texture – see BUG-226352.
The majority of the following notes are taken from the Content Creation User Group (CCUG) meeting, held on Thursday, February 14th, 2019 at 13:00 SLT. These meetings are chaired by Vir Linden, and agenda notes, meeting SLurl, etc., are usually available on the Content Creation User Group wiki page.
SL Viewer Updates
- The Bugsplat RC viewer updated on Wednesday, February 13th, to version 220.127.116.114348.
Bakes On Mesh
Extending the current avatar baking service to allow wearable textures (skins, tattoos, clothing) to be applied directly to mesh bodies as well as system avatars. This involves viewer and server-side changes, including updating the baking service to support 1024×1024 textures, but does not include normal or specular map support, as these are not part of the existing Bake Service, nor are they recognised as system wearables. Adding materials support may be considered in the future.
- Bakes on Mesh knowledge base article.
- Bakes on Mesh forum thread.
- Bakes on Mesh JIRA filter (courtesy of Whirly Fizzle).
- Anchor Linden has pinned down a couple of further bugs.
- The hope is that BoM will progress to viewer release candidate (RC) status in the near future, although this is down to internal testing at the Lab.
Environment Enhancement Project
A set of environmental enhancements allowing the environment (sky, sun, moon, clouds, water settings) to be set at region or parcel level, with support for up to 7 days per cycle and sky environments set by altitude. It uses a new set of inventory assets (Sky, Water, Day), and includes the ability to use custom Sun, Moon and cloud textures. The assets can be stored in inventory and traded through the Marketplace / exchanged with others, and can additionally be used in experiences.
Due to performance issues, the initial implementation of EEP will not include certain atmospherics such as crepuscular rays (“God rays”).
- Project definition document.
- Project summary (this blog).
- Full EEP Documentation.
- Project Viewer – via Alternate Viewers wiki page.
- EEP Feedback forum thread.
- EEP sneak peeks forum thread.
- EEP Jira filter.
- On the simulator side, EEP is now on the BlueSteel and LeTigre channels (representing roughly 20% of the main grid).
- There is “one last” major issue with the current viewer code to be resolved. This is related to experiences, and Rider Linden believes he has a fix for the problem.
- It is therefore hoped the EEP viewer will move to RC status very soon.
- The next update to the viewer should include fixes for BUG-226249.
- There have been some changes to the data returned by llGetEnvironment, and further changes will be made in the coming week. These changes will be reflected in the wiki documentation.
- There is also a bug that can cause avatar tags to be corrupted in certain EEP environments, as demonstrated by Whirly Fizzle.
- This pass of EEP will not, as previously reported, include God rays. It is hoped these will be added in the future – although when in the future is not yet clear.
Animesh Follow-On
- Vir has some prototype hooks in the code that allow body parts in the inventory of an Animesh to be used to customise the skeleton of the Animesh. This doesn’t (as yet) include any communications from the viewer to the simulator, which is the next thing Vir will be examining.
- This next part of the work may be hampered by the fact there is an “embarrassingly large” number of ways of transmitting object information between the viewer and the simulator. None of them is particularly comprehensive, making it harder to determine how best to add messaging specific to Animesh.
- As a result Vir is also looking at a means of rationalising things to make it easier to add object information communications to the system beyond Animesh.
- There have been requests to provide a means to obtain things like the tri count for Animesh. Vir believes it might be easier to add the means to obtain streaming cost, etc. (as this information is already supplied for in-world objects), and / or to give a more straightforward means of indicating whether something can be linked into an Animesh object or not.
- Tri counts are being requested so creators can check whether or not linking objects together for an Animesh will take them over the tri count limit for Animesh.
- A problem here is that the simulator doesn’t have the full tri count information for any mesh object, only an estimate used to assist in making LI calculations; so exposing the full figure via LSL could require additional back-end work to surface information currently only available to the viewer.
- There is also a risk that as time goes on, the rules regarding what is allowed for Animesh might change – including the tri count. Any such change could therefore invalidate the accuracy of scripts with a hard-coded tri count limit.
- A request has also been made to enable / disable Animesh via LSL. This is currently not on the cards, unless a compelling use-case can be defined.
- A suggestion has been made that using LSL to disable Animesh could help reduce LI when Animesh objects don’t need to be animated. However, because Animesh is rendered differently to static objects, this would likely not work in the way people imagine.
- Vir still hopes to re-work Reset Skeleton as a system command, so that anyone triggering a reset of their avatar skeleton when they see it deformed sends that update to all other viewers in the scene, causing them to perform a reset for that skeleton, rather than users having to manually select and reset the affected avatar.
- Another request is to have a viewer-side option to reset Animesh skeletons. This is viewed as a good idea, as there is already a debug setting (DebugAnimatedObject – set to True) for this, which could be better exposed, although it doesn’t work with Animesh objects attached to avatars.
Texture Uploads and Resampling
Thanks to a somewhat confused blog post at New World Notes, there has been a lot of talk over the last couple of days about a “debug setting that miraculously improves texture quality”. In actual fact, there is no such thing – as Beq Janus explains in her own blog (see: Compression depression – Tales of the Unexpected) and in the forums (see bi-curious? You should be – or why your assumed wisdom may not be correct).
As Beq correctly points out, there is no “magic” debug setting to improve the resolution or quality of all textures, and Second Life cannot display textures with a resolution greater than 1024×1024.
What Beq did confirm is that the referenced debug settings can, as per their descriptions, be used to upload textures much larger than 1024×1024 to Second Life. However, such textures will be resampled to 1024×1024. The particularly interesting thing here is that in resizing such images, the viewer uses bilinear resampling which – contrary to perceived wisdom pointing to bicubic resampling – can actually result in far better quality in the finished 1024×1024 texture.
Beq’s forum post and (particularly) her blog post explain what appears to be happening – and how directly resizing very high-resolution images to 1024×1024 using bilinear resampling prior to upload will also result in better quality textures when viewed in-world.
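For those curious about what bilinear resampling actually does, the following is a minimal pure-Python sketch of the technique: each output pixel is a weighted average of the four nearest source texels. This is illustrative only – the function name and structure are my own, not the viewer’s actual resizing code.

```python
def bilinear_resize(src, new_w, new_h):
    """Resize a 2-D greyscale image (list of rows) with bilinear sampling."""
    old_h, old_w = len(src), len(src[0])
    out = []
    for y in range(new_h):
        # Map the output pixel centre back into source coordinates.
        sy = (y + 0.5) * old_h / new_h - 0.5
        y0 = max(0, min(old_h - 1, int(sy)))
        y1 = min(old_h - 1, y0 + 1)
        fy = min(1.0, max(0.0, sy - y0))
        row = []
        for x in range(new_w):
            sx = (x + 0.5) * old_w / new_w - 0.5
            x0 = max(0, min(old_w - 1, int(sx)))
            x1 = min(old_w - 1, x0 + 1)
            fx = min(1.0, max(0.0, sx - x0))
            # Weighted average of the four neighbouring texels.
            top = src[y0][x0] * (1 - fx) + src[y0][x1] * fx
            bot = src[y1][x0] * (1 - fx) + src[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

# Downsample a 4x4 gradient to 2x2: each result is the average of a 2x2 block.
img = [[x + 4 * y for x in range(4)] for y in range(4)]
small = bilinear_resize(img, 2, 2)
```

When shrinking an image this reduces, as in the example above, to averaging neighbouring texels, which is part of why it behaves well for large downscale ratios.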
Does this mean that the upload texture size limit should be increased? Well, no. The crucial part of this is that the difference is only seen with 1024×1024 textures – which have issues in terms of the amount of memory they eat, and in the fact they are already drastically over-used. As such, increasing the allowed upload resolution prior to resampling might further encourage the use of 1024×1024 textures.
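To put the memory point in perspective, here is a back-of-the-envelope calculation assuming uncompressed 32-bit RGBA; actual viewer memory use depends on compression and caching, so treat the figures as illustrative.

```python
def texture_bytes(size, bytes_per_pixel=4, with_mipmaps=True):
    """Approximate memory for a square texture, optionally with its mip chain."""
    total = 0
    while size >= 1:
        total += size * size * bytes_per_pixel
        if not with_mipmaps:
            break
        size //= 2  # each mip level halves the resolution
    return total

base = texture_bytes(1024, with_mipmaps=False)  # 4 MiB for the base level alone
full = texture_bytes(1024)                      # ~5.33 MiB including mipmaps
print(base / 2**20, full / 2**20)
```

Even a handful of unnecessary 1024×1024 textures in a scene can therefore add tens of megabytes of texture memory compared with 512×512 equivalents, which cost a quarter as much each.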
In the meantime, the forum post has triggered some interesting discussion around bilinear and bicubic resampling, and where each might be preferable to use, and how bilinear could benefit the production of seamless normal maps.
A further suggestion resulting from these discussions (or the resurgence of a suggestion) is to make all the mipmaps for an uploaded texture available, rather than just the level selected at upload (so, for a 1024×1024 texture, the 512×512, 256×256 and 128×128 mipmaps would also be made available to the user), allowing them to experiment and see how the different resolutions work on surfaces. A feature request Jira has been raised for this idea.
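The suggestion rests on the fact that a mipmap chain is derived mechanically from the base texture, each level half the resolution of the one before. A simple sketch of that derivation, using a 2×2 box average on a greyscale image (function names are illustrative, not any actual pipeline code):

```python
def next_mip(level):
    """Halve a square greyscale image (list of rows) by averaging 2x2 blocks."""
    n = len(level) // 2
    return [[(level[2 * y][2 * x] + level[2 * y][2 * x + 1] +
              level[2 * y + 1][2 * x] + level[2 * y + 1][2 * x + 1]) / 4
             for x in range(n)] for y in range(n)]

def mip_chain(base):
    """Return all levels from the base image down to 1x1."""
    chain = [base]
    while len(chain[-1]) > 1:
        chain.append(next_mip(chain[-1]))
    return chain

# An 8x8 base yields levels of 8, 4, 2 and 1 pixels per side; a 1024x1024
# texture would likewise already contain 512, 256, 128... versions of itself.
base = [[float(x) for x in range(8)] for _ in range(8)]
sizes = [len(level) for level in mip_chain(base)]
```

Since these smaller levels already exist once a texture is uploaded, exposing them to the user would be a matter of access rather than of generating new data.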