SL project updates 16/2: Content Creation User Group

The Content Creation User Group meeting, at the Hippotropolis Camp Fire Circle (stock)

The following notes are taken from the Content Creation User Group meeting, held on Thursday, April 20th, 2017 at 1:00pm SLT at the Hippotropolis Camp Fire Circle. The meeting is chaired by Vir Linden, and agenda notes, etc., are available on the Content Creation User Group wiki page.

The meeting was live streamed / recorded by Medhue Simoni, and that video is embedded at the end of this update; my thanks to him for providing it. Timestamps in the text below will take readers directly to the relevant point in the video (in a separate tab) where topics are discussed.

Supplemental Animations

This is an idea to overcome issues of animation states keyed by the server-side llSetAnimationOverride() conflicting with one another. This problem has been particularly noticeable since the arrival of Bento, and a typical example is that an animation to flap Bento wings, if played to have natural wing movement while walking, results in a conflict with the walk animation, causing the avatar to slide along the ground.

Again, this is only an issue with the server-side animation triggers; older scripted animation overriders, such as ZHAO, which use the scripted llStartAnimation function, are unaffected.
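To illustrate the difference, here is a minimal sketch (the animation name “wing_flap” and the permission handling are illustrative assumptions, not anything shown at the meeting). llSetAnimationOverride() replaces the animation the server plays for a given locomotion state, so assigning a wing-flap animation to the “Walking” state displaces the walk itself and produces the sliding described above, whereas llStartAnimation() simply layers the wing flap on top of whatever is already playing:

```lsl
// Illustrative sketch only: "wing_flap" is an assumed animation name that
// would need to exist in the attachment's inventory.
default
{
    attach(key id)
    {
        if (id != NULL_KEY)
        {
            llRequestPermissions(id,
                PERMISSION_OVERRIDE_ANIMATIONS | PERMISSION_TRIGGER_ANIMATION);
        }
    }

    run_time_permissions(integer perm)
    {
        if (perm & PERMISSION_OVERRIDE_ANIMATIONS)
        {
            // Server-side route: the wing flap REPLACES the system walk for
            // the "Walking" state, so the legs stop animating and the avatar
            // appears to slide along the ground.
            llSetAnimationOverride("Walking", "wing_flap");
        }
        if (perm & PERMISSION_TRIGGER_ANIMATION)
        {
            // Classic scripted-AO route (ZHAO-style): the wing flap is simply
            // layered on top of whatever locomotion animation is playing.
            llStartAnimation("wing_flap");
        }
    }
}
```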

[03:41] This is still under discussion at the Lab, and if anyone has specific views / ideas, they are invited to attend the CCUG meetings and discuss them.

Scripted Avatar Skeleton Reset

[05:45] A question was raised (via chat) on whether it would be possible to reset the avatar skeleton via script (rather than relying on the local Reset Avatar options in the viewer), as bone offsets defined within some meshes are not resetting when the mesh is removed (see BUG-11310).

Vir indicated that having a means to do this via script may well be useful, but is unsure of where it might slot into the work priorities were it to be adopted, particularly as it would require a new server-side message for passing the update notification to all surrounding viewers.

Skeleton Reset on Region Entry

[13:52] The above spawned a question on having the viewer automatically perform a Reset Skeleton function on entering a region, to try to combat problems of avatars appearing deformed.

There are some cases where this might be useful. If an animation is run just once to trigger a deliberate avatar deformation (e.g. from human to quadruped), that animation is not persistent, so on entering another region there is a chance other people will see the avatar as deformed. However, trying to fix such situations through an automated skeleton reset is seen as problematic. For example, an automatic skeleton reset performed by the viewer on entering a region could conflict with the receipt of animation updates intended to deform an avatar in your view so it looks correct, resulting in that avatar being unintentionally deformed (again, as in the case of an avatar which is supposed to be a quadruped appearing as a deformed biped, because the viewer has ignored the necessary appearance information as a result of the automated reset).

Projects Under Consideration

The last several Content Creation User Group meetings have included discussions on two potential projects for the Lab. Neither has as yet been adopted, but both remain under consideration by the Lab, and Vir offered a recap of each.

Applying Baked Textures to Mesh Avatars

[20:57] This is essentially taking the ability of the default avatar baking service to composite and bake system layers (skins, shirts, pants, jackets, tattoos, etc.) onto a default avatar, and extending it for use on mesh avatars, potentially reducing the complexity of the latter (which have to be built up in “onion skin” layers in order to be able to handle things like tattoos, skins, make-up options, etc.), and thus making them more render-efficient.

As mentioned in previous Content Creation updates in this blog, there are some issues with doing this: the baking service would likely need to be tweaked to handle 1024×1024 textures (it currently only supports up to 512×512); it does not handle materials (although this is not seen as a major issue, as it is felt among those at the CCUG meetings that the primary use of the baking service would be for compositing skins + make-up / tattoos into a single texture for mesh application), etc.

Animated Objects / NPCs

[25:17] Support for animated objects in-world could be used for a wide variety of things, from non-player characters (NPCs), which do not require a link to an active viewer / client, through to animating things like tree branches swaying in a breeze, etc.

This work has various levels of complexity associated with it, and were it to be adopted, might actually result in a series of related projects. NPCs in particular would be an extensive project, as it would need to encapsulate a means to define what NPCs are wearing / have attached, how to make NPCs customisable through something like the avatar shape sliders, etc.

As such, were the Lab to decide to pursue animated objects, decisions would need to be made on overall scope for the project, what might be initially tackled, what might be seen as a follow-on project, etc.

How To Encourage the Lab to Adopt Projects

[28:03] Supplemental animations, bakes on meshes and animated objects have been the three major topics of discussion within the Lab in terms of possible major projects. Until they actually get on to the Lab’s roadmap (*if* they get that far), it is not easy for specific time frames, etc., to be discussed at meetings.

However, the best way to actively encourage the Lab to continue looking into such potential projects is to attend meetings like the CCUG and discuss them. Although the Lab has used polls in the past, they aren’t seen as particularly useful, whereas direct discussion through meetings like the CCUG tends to bring a consistent set of interests / ideas / suggestions to the surface.

Windlight Assets / Scriptable Windlight

[42:16] A long time ago – back when Nyx Linden ran the original Content Creation User Group – a suggestion was made for Windlight settings to be made into an asset which, as with other inventory assets, could be passed between users, helping to give users a consistent look to an environment. No work has been carried out on this idea.

Another suggestion is to make Windlight settings more readily scriptable, so that environment settings could change based on an avatar’s location (e.g. a dark windlight applied when inside a tunnel, switching to daylight on emerging).

RLV / RLVa actually allow this, and viewers such as Firestorm allow for parcel level / altitude specific Windlight settings. However, all of these are limited, in that they are entirely viewer-side. If you’re not running a viewer with the specific support for them, you won’t necessarily experience them.
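As a rough illustration of the RLV-based approach, the sketch below shows how a worn, scripted attachment can push a Windlight time-of-day to the wearer’s own viewer via RLV’s @setenv commands. This is a hedged sketch only: it assumes the wearer’s viewer has RLV/RLVa enabled, and the altitude test standing in for “inside the tunnel” is purely an illustrative assumption.

```lsl
// Hedged sketch: requires an RLV/RLVa-capable viewer on the wearer's side.
// The "below 20m means inside the tunnel" test is an illustrative assumption.
default
{
    state_entry()
    {
        // Poll the wearer's position every few seconds.
        llSetTimerEvent(5.0);
    }

    timer()
    {
        vector pos = llGetPos();
        if (pos.z < 20.0)
        {
            // Force a dark, night-time Windlight on the wearer's viewer
            // (RLV's @setenv_daytime takes a value between 0.0 and 1.0).
            llOwnerSay("@setenv_daytime:0.0=force");
        }
        else
        {
            // Hand environment control back to the region's own settings.
            llOwnerSay("@setenv_daytime:-1=force");
        }
    }
}
```

As noted above, this only affects the wearer’s own view, and only if their viewer actually honours the RLV command set.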

Oz Linden has previously commented on the idea at TPV Developer meetings (see here and here), but thus far no project has been adopted.

.BVH Animation Export Issues

[53:37] Cathy Foil outlines an issue in exporting .BVH files from Maya and Blender, which may impact content creators, relating to animation rotations / translations and the number of channels available / used for weighting. There are workarounds available – Cathy has a video demonstrating one for Maya (below).
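For reference, the “channels” in question are the per-joint degrees of freedom declared in the .BVH hierarchy. The fragment below is a minimal, purely illustrative example (not Second Life’s full expected joint set, and the offset values are made up): the root joint carries both translation and rotation channels, while child joints normally carry rotation channels only, and differences in how exporters write these channels and their ordering are the kind of mismatch being discussed.

```
HIERARCHY
ROOT hip
{
    OFFSET 0.00 0.00 0.00
    CHANNELS 6 Xposition Yposition Zposition Xrotation Yrotation Zrotation
    JOINT abdomen
    {
        OFFSET 0.00 3.42 0.00
        CHANNELS 3 Xrotation Yrotation Zrotation
        End Site
        {
            OFFSET 0.00 3.00 0.00
        }
    }
}
MOTION
Frames: 2
Frame Time: 0.033333
0.00 43.53 0.00 0.00 0.00 0.00 0.00 0.00 0.00
0.00 43.53 0.00 0.00 0.00 0.00 5.00 0.00 0.00
```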

The view taken is that if there are methods of fixing any issues within the export process, that is preferable to trying to update Second Life so it has to second-guess what the .BVH file is saying on export.

Note that in the video title frame, Cathy Foil is suffering a head deformation. A speech gesture for a non-human avatar is still active, despite that avatar having been removed, causing her human head to deform every time the speech gesture is triggered by her use of Voice – a coincidental effect, given that a discussion of avatar deformations occurred during the meeting.


One thought on “SL project updates 16/2: Content Creation User Group”

  1. If applying user-baked textures to an avatar mesh attachment were to become a thing, is there any reason why doing the same for any other arbitrary object should be excluded? I’m not sure I can think of a compelling usage case here, though, other than perhaps it might be less computationally expensive in terms of determining if the target object was an actual avatar mesh body part rather than an arbitrary object either attached to an avatar or just rezzed on the ground.

