The following notes are primarily taken from the Content Creation User Group meeting, held on Thursday, February 22nd, 2018 at 13:00 SLT. For the purposes of Animesh testing, the meetings have relocated to the Animesh4 region on Aditi, the beta grid – look for the seating area towards the middle of the region. The meeting is chaired by Vir Linden, and agenda notes, etc, are usually available on the Content Creation User Group wiki page.
- The project render viewer was updated to RC status on Wednesday, February 21st, becoming the Love Me Render RC viewer, version 220.127.116.112751. The two primary improvements in this viewer are:
- An improvement to mesh LOD calculation, so that it now accounts for camera zoom (e.g. via CTRL+0).
- Agents rendered as jelly dolls now have their attachments rendered at the lowest LOD, preventing higher-complexity LOD models from being loaded into memory and so helping to reduce crashes. Note that the debug setting RenderAutoMuteByteLimit must be set to a value greater than the default of 0 for this feature to work.
- The Nalewka Maintenance RC viewer was also updated on Wednesday, February 21st, to version 18.104.22.1682752.
- The 360-degree snapshot viewer updated on Thursday, February 22nd to version 22.214.171.1242774 – see my hands-on overview for more.
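The LOD / zoom improvement noted above can be illustrated with a toy calculation. Everything here – the names, constants, and the FOV scaling – is an illustrative assumption, not the viewer's actual code:

```python
import math

def lod_switch_distance(object_radius, lod_factor=1.25,
                        fov=math.radians(60), default_fov=math.radians(60)):
    # Nominal distance at which an object drops to a lower LOD
    # (constants and formula are illustrative, not viewer code).
    base = object_radius * lod_factor * 4.0
    # Zooming in (CTRL+0) narrows the field of view; scaling by the
    # relative FOV pushes the switch distance out, so a zoomed-in
    # object holds on to its higher-detail model.
    return base * (default_fov / fov)

normal = lod_switch_distance(2.0)                        # default view
zoomed = lod_switch_distance(2.0, fov=math.radians(20))  # zoomed in 3x
```

The point of the fix is the second term: without the FOV scaling, a zoomed-in object would switch to a low-detail model at the same distance as an unzoomed one, looking visibly degraded on screen.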
Bakes On Mesh
Extending the current avatar baking service to allow wearable textures (skins, tattoos, clothing) to be applied directly to mesh bodies as well as system avatars. This involves server-side changes, including updating the baking service to support 1024×1024 textures, and may in time lead to a reduction in the complexity of mesh avatar bodies and heads. The project is in two phases:
- The current work to update the baking service to support 1024×1024 textures on avatar meshes.
- An intended follow-on project to actually support baking textures onto avatar mesh surfaces (and potentially other mesh objects as well). This has yet to be fully defined in terms of implementation, or when it might be slotted into SL development time frames.
This work does not include normal or specular map support, as these are not part of the existing baking service.
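As background, the job of a baking service – flattening wearable layers such as skin, tattoos, and clothing into a single composite texture – can be sketched per-pixel with the standard "over" alpha-compositing operator. This is a simplified illustration, not the Lab's actual baking code:

```python
def over(top, bottom):
    # Standard "over" compositing of one RGBA pixel (0..1 floats) onto another.
    tr, tg, tb, ta = top
    br, bg, bb, ba = bottom
    a = ta + ba * (1.0 - ta)
    if a == 0.0:
        return (0.0, 0.0, 0.0, 0.0)
    blend = lambda t, b: (t * ta + b * ba * (1.0 - ta)) / a
    return (blend(tr, br), blend(tg, bg), blend(tb, bb), a)

def bake_pixel(layers):
    # Flatten wearable layers bottom-up: skin first, clothing last.
    out = (0.0, 0.0, 0.0, 0.0)
    for layer in layers:
        out = over(layer, out)
    return out

skin   = (0.8, 0.6, 0.5, 1.0)  # fully opaque base layer
tattoo = (0.1, 0.1, 0.3, 0.5)  # semi-transparent overlay
baked  = bake_pixel([skin, tattoo])
```

The real service does this across whole texture layers (now at up to 1024×1024); the per-pixel maths is the same idea.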
Anchor Linden is currently working on viewer-side support, and hopes to have a project viewer, together with test regions on Aditi, available for testing “soon”.
There is a growing number of questions / concerns around this work which have yet to be fully answered:
- How will the system avatar be masked? Via a dedicated alpha channel?
- What will creators need to do in order to leverage the capability? Essentially, the process requires specifying which mesh face gets which bake channel – but has this actually been defined?
- Will there be scripted support (seen as vital to allow continued use of normal or spec maps, for example)?
- How might HUD systems that apply materials work in conjunction with bakes on mesh (e.g. to continue to simulate cloth textures, as seen in current clothing appliers)?
- Has there been any attempt to reach out to and encourage the makers of popular mesh bodies / heads to engage in the project, both to understand what is being attempted and how best to ensure it is something they would be willing to leverage?
- Alexa and Vir have both indicated that this is part of the Lab’s plans for the project, once a basic project viewer is available so that people can start investigating what might be required to leverage the capability properly.
- Is there a project specification people can refer to / contribute to?
- Elizabeth Jarvinen (polysail) has offered to write a Feature Request Specification. Those wishing to contribute to it should contact her in-world.
Animesh
- Vir is focused on bug fixes at the moment.
- There is a potential new project viewer waiting in the wings, but this is dependent upon some of the bug fixes, such as LOD stability issues and camming problems.
- There is still no definitive time-frame for the release of Animesh – it’s now largely dependent on the bug fixing work.
This is the code-name for the project to re-evaluate object and avatar rendering costs. It is still in its very preliminary stages and might, in time, lead to a change in how Land Impact is calculated and assigned. No decisions have been taken – the data has yet to be collected and analysed – so this is not something that will be happening soon. Even when it does, it may not cause any appreciable change for the majority of users, simply because the Lab is looking to minimise any impact arising from the work as much as it can.
Just what is being planned, and how any negative impact on LI might be mitigated, were discussed at both the week #7 CCUG meeting and that week’s TPVD meeting, and the following audio, featuring extracts from both meetings, will hopefully further contextualise how the work is being approached and allay fears.
Environment Enhancement Project (EEP)
A set of environmental enhancements, including:
- The ability to define the environment (sky, sun, moon, clouds, water settings) at the parcel level.
- New environment asset types (Sky, Water, Days – the latter comprising multiple Sky and Water settings) that can be stored in inventory and traded through the Marketplace / exchanged with others.
- Experience-based environment functions.
- An extended day cycle (e.g. a 24/7 cycle) and extended environmental parameters.
This work involves simulator and viewer changes, and includes some infrastructure updates.
- Rider Linden is now “well on the way” to having working inventory Windlight assets.
- Once he has finished this element of the work, he hopes to put out a project viewer for use on test regions on Aditi which have the necessary support for managing the new Windlight assets – this could happen in March.
- There will be no direct LSL support for the new assets in the first EEP release.
- llGetSunDirection() will initially be broken as a result of the alterations to the day / night cycle:
- The sun’s position will no longer be based on the default Linden Sun (which has a four-hour “day”), but will be defined by the environment itself, allowing for day cycles closer to physical-world daylight times.
- The function will be updated to work with the new EEP capabilities before the project gets to release status.
- Once llGetSunDirection() has been updated, sundials using it should accurately reflect the position of the Sun over a region, rather than merely being indicative of its general position in the sky. However, sundials that instead count the number of seconds since “midnight” of a 4-hour day cycle in order to simulate the Sun’s position will remain broken.
- Essentially, EEP should allow Windlight settings to work as we see them now, but a) as objects which can be applied from inventory; b) on the basis of agent application through an experience. It should also work closely with the planned atmospherics updates.
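The sundial point above can be illustrated with a toy model of the day cycle. The function names and the simplified east-west arc are assumptions for illustration – in particular, sun_direction() here is a hypothetical stand-in for what llGetSunDirection() might report, not actual SL code:

```python
import math

LEGACY_DAY = 4 * 3600  # the classic Linden 4-hour day, in seconds

def sun_angle_from_clock(seconds_since_midnight, day_length=LEGACY_DAY):
    # Angle of the sun around its arc, as a clock-counting sundial computes it.
    return 2.0 * math.pi * (seconds_since_midnight % day_length) / day_length

def sun_direction(seconds_since_midnight, day_length):
    # Hypothetical stand-in for llGetSunDirection(): a simplified east-west
    # arc whose period matches the environment's configurable day length.
    a = sun_angle_from_clock(seconds_since_midnight, day_length) - math.pi / 2.0
    return (math.cos(a), 0.0, math.sin(a))  # z is the sun's elevation

# A sundial hard-coded to the 4-hour day reads "noon" at t = 7,200 s...
legacy_noon = sun_angle_from_clock(7200)
# ...but under a 24-hour EEP day, 7,200 s is only early morning:
eep_angle = sun_angle_from_clock(7200, day_length=24 * 3600)
```

The contrast is the point: scripts that read the sun's direction vector directly would inherit a configurable day length automatically, while scripts dividing a clock reading by the hard-coded 14,400-second day would not.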
A separate piece of work in progress – albeit related to EEP – is a set of updates to SL’s atmospheric shaders. This work is being carried out by Graham Linden and will, among other things, allow for Godrays, improved visual fogging, and so on. There is currently no time frame on when this work might publicly surface.
The next CCUG meeting will be held on Thursday, March 8th, 2018.