The following notes are taken from the Content Creation User Group meeting, held on Thursday, August 31st, 2017 at 13:00 SLT. The meeting is chaired by Vir Linden, and agenda notes, etc., are usually available on the Content Creation User Group wiki page.
Medhue Simoni live streamed the meeting to YouTube, and his video is embedded at the end of this article. These notes present the meeting in terms of topics discussed, rather than a chronological breakdown of the meeting, so the provided time stamps may appear to be out of sequence in places. All time stamps are provided as links which will open the video in a separate browser tab, allowing the discussion to be heard in full.
Animated Mesh (Animesh)
The goal of this project is to provide a means of animating rigged mesh objects using the avatar skeleton, in whole or in part, to provide things like independently moveable pets / creatures, and animated scenery features via scripted animation.
- Animated objects can be any rigged / skinned mesh which is correctly flagged as an animated object (so it has a skeleton associated with it), and contains the necessary animations and controlling scripts in its own inventory (Contents tab of the Build floater) required for it to animate itself.
- The capability will most likely include a new flag added to an existing rigged object type in order for the object to be given its own skeleton.
- At this point in time, this is not about adding fully functional, avatar-like non-player characters (NPCs) to Second Life.
- Animated objects will not (initially):
- Have an avatar shape associated with them
- Make use of an avatar-like inventory (although individual parts can contain their own inventory such as animations and scripts)
- Make use of the server-side locomotion graph for walking, etc., and so will not use an AO
- Use the avatar baking service
- The project may be extended in the future.
- It will involve both back-end and viewer-side changes, likely to encompass new LSL commands to trigger and stop animations (held in the object’s contents).
LSL Animation Functions
[1:18-3:04] The LSL functions needed for animating mesh objects are being tested. These comprise llStartObjectAnimation and llStopObjectAnimation, for starting and stopping a named animation held in an object's inventory, and llGetObjectAnimationNames, for listing the animations currently playing on an object.
The documentation for these commands is still in progress, so the information on the wiki pages is subject to update. Also, keep in mind these commands will not work until a public animesh project viewer is available.
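As a rough illustration of how these functions fit together, the sketch below shows a minimal control script of the kind an animated object might carry in its Contents. The animation name "walk" is an assumption, standing in for whatever animation the creator has placed alongside the script; exact behaviour may change as the documentation is finalised.

```lsl
// Hypothetical sketch: a minimal control script placed in an animated
// object's Contents, alongside an animation named "walk" (assumed name).
default
{
    state_entry()
    {
        // Start the named animation from this object's own inventory
        llStartObjectAnimation("walk");
    }

    touch_start(integer total_number)
    {
        // On touch, stop every animation currently playing on this object
        list anims = llGetObjectAnimationNames();
        integer i;
        for (i = 0; i < llGetListLength(anims); ++i)
        {
            llStopObjectAnimation(llList2String(anims, i));
        }
    }
}
```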
[10:12-11:16] These commands could, in theory, be used with a scripted HUD to provide some degree of control over animated objects by their owner. An example HUD may or may not be provided in the test content the Lab may supply (see below).
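One way such a HUD could be wired up, sketched here purely as an assumption (the Lab has not described a specific design), is for the worn HUD to llSay() simple commands on a private chat channel, with a script like the following inside the animated object listening for them. The channel number and the "start" / "stop" command names are invented for the example.

```lsl
// Hypothetical sketch of the object-side half of a HUD control setup.
// A HUD worn by the owner would llSay("start") or llSay("stop") on
// CHANNEL; this script in the animated object reacts accordingly.
integer CHANNEL = -54321; // assumed private channel

default
{
    state_entry()
    {
        llListen(CHANNEL, "", NULL_KEY, "");
    }

    listen(integer channel, string name, key id, string message)
    {
        // Only accept commands from objects owned by this object's owner
        // (i.e. the owner's HUD)
        if (llGetOwnerKey(id) != llGetOwner()) return;

        if (message == "start")
        {
            llStartObjectAnimation("dance"); // "dance" is an assumed animation name
        }
        else if (message == "stop")
        {
            llStopObjectAnimation("dance");
        }
    }
}
```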
[3:23-4:43] Test content for animesh has been a request at past meetings, the idea being to have models available in the Library people can use to familiarise themselves with creating animated mesh objects and using the associated scripting controls. The Moles are now looking at adding some content to the Library in preparation for animesh becoming more widely available.
[41:26-45:12] A more general discussion on promoting animesh (when available), test content, and making people aware of animesh.
Alexa Linden has been converting existing mesh content into animesh objects. She’s been using the mesh werewolf starter avatar originally released in 2014, and which is already available in the Library, for this work, and produced a short video of the result, presented as an animated GIF, below.
Again, note that these are not actual avatars connected to the simulator via individual viewers; they are purely in-world objects being animated by scripts they contain driving a set of animations also contained within them.
[8:00-9:04] More work is also required on the general controls / limits on animated mesh objects: how many are going to be allowed in a scene, the number of animated attachments an avatar can wear, calculating their rendering impact, etc. These will not be final when a public viewer appears, but will be used as a baseline which can then be tweaked one way or another once more intensive testing gets started.
[22:05-24:29] In her initial tests with dancing werewolves, Alexa managed to get over 300 dancing together, each using 6 dance animations and a control script. She didn’t notice much in the way of a frame rate impact whilst also moving her avatar around. However, she did notice some update issues with the Interest List (which controls how things are rendered in the viewer as you move your avatar / camera) when zooming in / out of the scene.
The test was by no means definitive. For example, it used multiple copies of the same basic mesh model and animations, which may have boosted performance compared with what might have been the case with 300 different mesh models running in a scene, each with its own unique animations. She also carried out her tests on a region that doesn’t have a lot of other content on it.
[25:24-26:36] Comparing Alexa’s tests with avatar capacity / performance (e.g. 40 avatars in a region) isn’t easy, as avatars can be a lot more individually complex. There are also various aspects of managing avatars which don’t necessarily apply to animated objects. For example, animesh items should really only have a limited number of updates associated with them, whereas avatars tend to have a lot more going on (interactions, messaging, etc.), all of which increases the volume of traffic the simulator and viewers must handle.
[6:47-7:58] Still no date for when a project viewer will appear, other than the Lab will release it “as soon as we can”.
Right now the biggest piece of work within the viewer is defining how the skeleton gets positioned relative to the object with which it is associated. This currently varies depending on the model being used, and can result in things “jumping” as they start to animate.
This problem also has an impact when trying to use an animated object as an attachment (e.g. when holding an animated pet), with the result that the object can be “floating” around the avatar, rather than obviously attached to it, and does not rotate / move appropriately as the attachment point moves relative to the avatar.
[11:55-12:30] Vir doesn’t believe this is a huge problem to solve; it just requires work on the transform matrix, and shouldn’t add a significant delay to any project viewer appearing.
[21:05-21:45] However, should fixing it prove to be more complicated than anticipated, it may have to be taken into account in terms of lead times, particularly as the ability to wear / carry animated pets is seen as one of the more popular use-cases for animesh.
Finally, internal testing of the viewer by the Lab has resulted in some suggestions being made which may also be incorporated into the viewer prior to a public project version appearing, in order to ensure that what does appear offers a solid foundation on which to build, and gives creators an opportunity to give feedback.
[15:03-15:56] As animated objects will be manipulating avatar skeleton bones whether they are attached to an avatar or operating as in-world objects, it will require more tracking of such movements than is currently the case to ensure they are correctly represented by viewers able to see them.
[16:00-18:12] Animated objects will likely be limited in terms of both their physical size and their poly / tri count. Limits for both have yet to be determined; however, high poly count objects will likely in part be limited by the impact they have on performance. The physical size of animated objects, unless special limits are set, will likely be defined by the same rules as currently exist for the avatar skeleton.
[24:31-25:22] The Interest List / animation issues Alexa encountered (e.g. objects some distance from the camera position moving much faster than they should be, then slowing down to a “normal” speed when zoomed in) are not unique to animated objects. These are issues which the Lab is liable to look at in more detail, but are not seen as something which will necessarily delay progress on animesh, simply because the issue has been around for some time (it can be seen when zoomed out from a group of dancing avatars, for example).