
The following notes are taken from the Content Creation User Group meeting, held on Thursday, May 25th, 2017 at 1:00pm SLT at the Hippotropolis Camp Fire Circle. The meeting is chaired by Vir Linden, and agenda notes, etc, are available on the Content Creation User Group wiki page.
Audio extracts are provided within the text, covering the core points of the meeting. Please note, however, that comments are not necessarily presented in the chronological order in which they were discussed in the meeting, but are ordered by subject matter.
A video recorded at the meeting by Medhue Simoni is embedded at the end of this update, my thanks to him for making it available. However, do note that it cuts out mid-way through the meeting. Timestamps in the text below refer to this recording.
Applying Baked Textures to Mesh Avatars
[1:54] This was announced as a new project – see my separate update for details.
The meeting saw additional questions asked about the baking service, which are summarised below.
Will the Baking Service Support Animated Objects?
- Not initially. Baked textures are relevant only to your Current Outfit Folder (COF), and so affect your appearance alone. Animated objects will not have any notion of a COF (as they do not have an associated inventory structure the way avatars do), so whose textures would an animated object show?
- Also, even if you could assign your own COF-defined appearance to an animated object, it would only be valid until you change your own appearance, which would discard the bake used by the object, probably leaving it blank.
- One solution might be allowing arbitrary textures to be sent to the baking service (see below). Another would be to allow animated objects to have their own notion of a COF contained within the object itself, which the baking service could somehow reference.
- Were this kind of work to be adopted, it would be Vir’s preferred approach. However, it is not currently a part of either the animated objects project or baking textures on meshes.
Baking Arbitrary Textures
Would it be possible to have an LSL function to request the baking of arbitrary textures?
- Not as a part of applying baked textures to mesh, although it might be considered in the future (a purely hypothetical sketch of what such a call might look like follows these notes).
- However, the baking service could offer considerable flexibility of use were it to be extended, simply because of the way it defines the body area (head, upper body, lower body).
- A problem is that, as noted above, baked textures are held only so long as your current avatar appearance defined via your COF is relevant, after which they are discarded. For the system to be useful with arbitrary textures, the resultant composite textures would need more rigorous storage, perhaps as a new asset class or retained in some form of “temporary” texture store – either of which would have to be defined and allowed for.
- Thus, the problem is the amount of work involved in extending the baking service and (potentially) the asset handling required to support it.
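Purely as an illustration of the idea under discussion, and not an actual or planned function, the sketch below imagines what such an LSL call to the baking service might look like. The function name llRequestTextureBake, its parameters, and the placeholder texture keys are all invented for this example; nothing like it currently exists in LSL.

```lsl
// HYPOTHETICAL sketch only - llRequestTextureBake does not exist in LSL.
// It imagines asking the baking service to composite an ordered list of
// arbitrary textures for one of the standard body regions (head, upper, lower),
// with the resulting texture UUID returned via the dataserver event.

key bake_request;

default
{
    touch_start(integer total_number)
    {
        // Invented call: body region name plus the textures to composite
        bake_request = llRequestTextureBake("upper",
            ["texture-uuid-one", "texture-uuid-two"]);
    }

    dataserver(key query_id, string data)
    {
        if (query_id == bake_request)
        {
            // "data" would hold the UUID of the composited texture, which -
            // as noted above - would need somewhere more permanent to live
            // (a new asset class, or some form of "temporary" texture store).
            llOwnerSay("Composite texture: " + data);
        }
    }
}
```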
HTTP Asset Viewer
[4:22] The HTTP Asset viewer was updated to version 5.0.6.326593 on Friday, May 26th. This update primarily brings the viewer to parity with the recently promoted release viewer, and so comprises the revised region / parcel access controls and the updates to Trash emptying behaviour.
Supplemental Animations
[6:53] As well as working on animated meshes, Vir is now also working on the LSL side of supplemental animations, alongside the LSL changes needed for animated objects. The work is designed to overcome issues with animation states keyed by the server-side llSetAnimationOverride() conflicting with one another (see the sketch below).
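By way of illustration, llSetAnimationOverride() is an existing function, and the minimal sketch below shows how two scripted attachments can end up fighting over the same animation state. The animation names are placeholders, and the script assumes the wearer grants PERMISSION_OVERRIDE_ANIMATIONS.

```lsl
// Minimal sketch of the conflict supplemental animations are intended to address.
// Animation names are placeholders; the override only takes effect once the
// script has been granted PERMISSION_OVERRIDE_ANIMATIONS by the wearer.

default
{
    state_entry()
    {
        llRequestPermissions(llGetOwner(), PERMISSION_OVERRIDE_ANIMATIONS);
    }

    run_time_permissions(integer perm)
    {
        if (perm & PERMISSION_OVERRIDE_ANIMATIONS)
        {
            // Script A (e.g. an AO) overrides the walk state...
            llSetAnimationOverride("Walking", "my_custom_walk");

            // ...but if script B in another attachment later calls
            //     llSetAnimationOverride("Walking", "wing_flap_walk");
            // it replaces the AO's walk outright - the two animations cannot
            // layer, which is the gap supplemental animations aim to fill.
        }
    }
}
```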
Animated Objects
Current Project Status
Vir has got basic prototyping working in a “hacked up” single version of the viewer. He’s now working on the shared experience – how an animated object is seen by multiple viewers.
There are still no details on what limits beyond land impact may be applied to animated objects (e.g. the number of animated objects – not avatars – permitted per region type, etc.), as there is not at this point any solid data on potential performance impact to help indicate the kind of limits which might be required.
Number of Allowed Animation Motions
[8:52] Currently, SL supports a total of 64 animation motions playing at one time per agent (hence walks, arm swings, wing flaps, tail swishes, etc., all of which can happen at the same time). It’s not been tested to see how much of an actual load running multiple animations places on a system. The limit might have to be changed as a result of animated objects – or it might not; it’ll come down to testing.
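As a side note on the existing behaviour, a script can already report how many animations are playing on an avatar at a given moment; a minimal sketch, assuming the script is in an attachment worn by the avatar, might look like this:

```lsl
// Minimal sketch: report how many animations are currently playing on the
// wearer, using the existing llGetAnimationList() call.

default
{
    touch_start(integer total_number)
    {
        list anims = llGetAnimationList(llGetOwner());
        llOwnerSay("Animations currently playing: " + (string)llGetListLength(anims));
    }
}
```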
Other Items of Discussion
Avatar Scaling
[12:24-video end] There is a lengthy discussion on avatar scaling.
- Essentially, the size slider works within a certain range; go beyond this, and distortions of body parts (e.g. facial features) can start to occur, as some sliders stop working properly.
- Obviously, it is possible to scale avatars using animations, but doing so doesn’t play nicely with the sliders either.
- This problem is particularly impactful with Tiny and Petite avatars (although it also affects really large avatars). One workaround is to upload a mesh without joint positions for the affected bones, but this causes breakages in the mesh. Thus, having a slider which could handle the avatar’s scale over a broader range might be beneficial. However:
- Changing the definition of the current scale slider to work over a broader range isn’t an option, due to the risk of existing content breakage.
- Adding a new “global scale” slider to the system might be possible. However, while this is relatively simple at the viewer end of things, SL is already close to its limit of 255 sliders, and any additional global slider would require significant changes to the back-end.
- A further problem is that motion is not affected by scale, but is keyed to the current avatar size range. So, additional work would be required on the locomotion system to ensure the distance covered by an avatar’s stride is consistent with its size, adding further complexity to any changes.
- Also, the ability to scale avatars would require animations using rotations only, as any use of translations could result in the locomotion issues noted above (e.g. a really small avatar would appear to zip along at 100s of miles an hour), and rotation-only animations are somewhat limiting.
BUG-20027: Allow joint-offset-relative translations in animations
Created during the Bento project, this feature request was originally closed as something the Lab could not implement. It has now been re-opened as people wanted to add further feedback to it. So, if you have an interest – please go and comment on the JIRA.
Cost of Animating via Bones vs. Using Flexis
The Lab views animating via flexis as being very inefficient, but has no numbers for a direct comparison with the cost of animating bones.
Improving IK Support
General requests have been made for SL to better support Inverse Kinematics (IK), to add greater flexibility of joint / extremity positioning. Vir has said it would be helpful if someone could start a feature request JIRA, open for comments, outlining what might be sought.
Next Meeting
The next CCUG meeting will be Thursday, June 8th, 2017.