SL project updates week 25/2: Content Creation UG w/audio

The Content Creation User Group meeting, at the Hippotropolis Camp Fire Circle (stock)

The majority of the following notes are taken from the Content Creation User Group meeting, held on Thursday, June 22nd, 2017 at 1:00pm SLT at the Hippotropolis Camp Fire Circle. The meeting is chaired by Vir Linden, and agenda notes, etc., are usually available on the Content Creation User Group wiki page.

Audio extracts are provided within the text, covering the core projects LL has in hand. Please note, however, that comments are not necessarily presented in the chronological order in which they were discussed at the meeting, but are ordered by subject matter.

Server Deployments Week 25 – Recap

As always, please refer to the server deployment thread for the latest updates.

  • On Tuesday, June 20th, the Main (SLS) channel was updated with a new server maintenance package (#17.06.12.327066), containing fixes to help with the caps (capabilities) router (see here for details).
  • On Wednesday, June 21st, the RC channels were updated as follows:
    • BlueSteel and LeTigre received the same server maintenance package (#17.06.19.327206), containing internal fixes.
    • Magnum received a server maintenance package (#17.06.19.327192) intended to fix BUG-100830 (“HTTP_CUSTOM_HEADER no longer works on RC 17.06.13.327111”) and BUG-100831 (“Lelutka Simone bento head spits a script error when attached on 17.06.13.327111 regions (Magnum & Cake)”).
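
For reference, HTTP_CUSTOM_HEADER is an option to LSL's llHTTPRequest() function, which is what BUG-100830 reports as broken on the affected RC regions. The minimal sketch below simply illustrates how the option is normally used; the URL, header name and value are placeholders, not anything taken from the bug report.

    // Minimal illustration of the HTTP_CUSTOM_HEADER option referenced in BUG-100830.
    // The URL and the header name / value are placeholders for illustration only.
    default
    {
        touch_start(integer total_number)
        {
            // Send a GET request carrying a custom header; on the affected RC
            // regions this header was reportedly not being passed through.
            llHTTPRequest("https://example.com/api/status",
                [HTTP_METHOD, "GET",
                 HTTP_CUSTOM_HEADER, "X-Example-Token", "12345"],
                "");
        }

        http_response(key request_id, integer status, list metadata, string body)
        {
            llOwnerSay("HTTP status: " + (string)status);
        }
    }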

Animated Objects

Vir has been trying to get animated objects using the avatar skeleton to scale in a reasonable way, to ensure that linksets correctly reference the same skeleton, and to make sure things are handled correctly when they are attached or detached. He’d also be interested in hearing from makers of the “current generation” of pets on how they work – how do they maintain ground contact, how do they follow along, how is the physics being managed – so that he can look into making animated mesh objects operate in a compatible manner.

So, if you are a pet maker and can help Vir in this, please either attend the Content Creation User Group meetings, or contact him directly.

Attaching Animated Objects to Avatars and Avatars to Animated Objects

One of the popular aspects of pets today is the ability to attach them to an avatar (so you can carry them, have them sitting on your shoulder, etc.), and this is seen as a potentially important aspect of animated mesh. However, attempting to do so does present issues, as it would mean linking two avatar skeletons in some manner, something that is not currently possible. While there are some potential ways this could be done, it could add considerable overhead to the existing project, and also brings potential challenges with it – such as ensuring an attached skeleton is correctly oriented, determining the potential performance hit, etc.

Similarly, BUG-100864 suggests a means of going the other way – linking an avatar to an animated object – such as being able to walk up to a free-roaming horse on a region and being able to mount it and ride it, for example. However, this also raises some of the same concerns.

While not ruling either out, Vir is focused on bringing forward a relatively basic means of animating mesh objects using the avatar skeleton, one which can offer a series of potential uses while conceivably allowing existing mesh creations (such as pets) to be easily converted to use it. As such, he sees it as a foundation project, which can then be expanded to incorporate other capabilities in the future, rather than trying to pack everything into a single project which could run the risk of very long development times or becoming overly complicated in all it is trying to achieve right from the start.

Baked Textures on Mesh

Work is still focused on the baking service infrastructure updates required to support baking textures on mesh avatars. These are quite extensive, involving changes to the underpinning tools, the servers (including updating Linux), and so on.

Rigging To Attachment Points

There has been some confusion of late as to whether rigging to attachment points is allowed or not. From the Lab’s perspective, it has not been allowed for uploads since the introduction of Bento, but should still work for legacy items. However, what appears to be a server-side glitch in the last couple of weeks seems to have exacerbated the confusion.

Vir’s recommended rule-of-thumb is for TPVs to test against the Lab’s official viewer and ensure behaviours match, otherwise confusion could occur down the road once the current glitches have been corrected. To help with the matter, he’s going to refresh his mind on what limitations are enforced server-side, and hopefully bring a list of them to the next meeting to help TPVs ensure they are following the requirements in order to avoid future problems.

Other Items

Mesh Body Dev Kits / Clothing Making / “Standardised” Mesh Avatar

This topic took up the core part of the meeting, and as such, the following is an attempt to précis the core points into a readable summary.

At the moment, all mesh bodies in Second Life are unique to their creator, utilising their own core shapes and skin weightings, which have a considerable amount of IP bound up in them. Because there is no “standardised” mesh model available in Second Life, body creators need to provide developer kits to mesh clothing and attachment makers which include this core information – the skin weights (in Blend, Maya, DAE or OBJ files) needed for rigging clothing, and the shapes – which potentially makes it very easy for someone to create their own avatar bodies.

To try to reduce this risk, mesh body makers tend to require clothing makers to agree to license agreements, and sometimes limit who may or may not be deemed eligible to obtain such a kit. This has caused some friction / frustration in the clothes-making community.

One suggestion put forward to help reduce fears on the part of mesh avatar creators, and allow clothing makers to more readily support avatar body brands, was that avatar makers should perhaps consider offering only the body shape to clothing makers, and then provide a fee-based rigging service. This would remove the need for avatar makers to give out their skin weight files, offer them a revenue stream, and allow clothing makers to more equitably create clothing for the more popular mesh bodies.

While there are no projects on the roadmap aimed at the SL avatar system, two other ideas were put forward which Vir agreed could be worth consideration down the road:

  • One is a suggestion that LL look to emulate the ability in Maya and Blender to copy skin weights from an avatar model to an item of mesh clothing, by running an algorithm to match the weighting from the avatar to the nearest vertices in the clothing. This would allow the clothing to fit almost any mesh body “automatically”, removing the need for clothing makers to specifically weight their clothing to each of the mesh bodies they wish to support.
  • The development of a new “SL mesh avatar” designed to operate alongside the existing system avatar (so no content breakage for those preferring to continue using the current system avatar). If this avatar had a sufficient density of vertices, it offers two potential uses:
    • Mesh body makers could use its weightings with their custom shapes to produce individually unique mesh bodies which all have a “standardised” set of skin weights, reducing the amount of work involved in creating them (or they could continue to use their own custom skin weights if they wished).
    • It could offer clothing makers a single source of skin weights for clothing, simplifying clothing making, which – if combined with the vertices matching algorithm mentioned above – would help ensure the clothing “fits” custom weighted mesh bodies.

The vertices matching algorithm might be the more difficult of the two ideas to implement, were either to be considered. However, the development of a mesh avatar that could exist alongside the system avatar could have a lot of merit, and help “standardise” the more technical aspects of mesh avatars without impacting their individual shape / look.

Further, as mesh objects can support multiple UV sets, it would be possible for such an avatar to use the legacy UV maps used to define the texture spaces on the three parts of the system avatar (thus allowing it to use existing skins, etc.), or it could support more “advanced” UV maps (so skin creators could finally design skins with two arms, rather than having the one arm “mirrored” on the avatar, as is currently the case).

Why isn’t Scaling Bones by Animations Allowed?

Scaling bones using animations has never been supported in SL, although Vir isn’t clear on why (pseudo bone scaling via animations has been possible through attachment point scaling or animating the point positions). However, one of the things that makes designing avatars harder is having multiple ways to manipulate an aspect of a bone, because of the potential for conflicts. An example of this is bone translations, which can be affected by both animations and the shape sliders, and so can cause issues.

However, during the Bento project, the advantages of allowing translations through animations were such that the Lab opted to permit it, even allowing for the potential for issues. As scaling bones through animations could bring a similar level of complexity to avatar design (bones can obviously be scaled via the sliders), this could be the reason scaling bones via animations hasn’t been supported. Currently, this is unlikely to change, if for no other reason than it would require a change to the animation format, which currently has no means to interpret bone scaling.


SL project updates 25/1: server, viewer

SL14B Stage Left; Inara Pey, June 2017, on Flickr – SL14B Community Celebration (blog post)

Server Deployments Week 25

As always, please refer to the server deployment thread for the latest updates.

  • On Tuesday, June 20th, the Main (SLS) channel was updated with a new server maintenance package (#17.06.12.327066), containing fixes to help with the caps (capabilities) router, particularly with reference to trying to teleport to regions which have a heavy avatar load (see here for details). These were essentially the same fixes as deployed to the Main channel on June 6th (server maintenance package #17.05.26.326655), together with additional internal fixes.
  • On Wednesday, June 21st, the RC channels should be updated as follows:
    • BlueSteel and LeTigre should receive the same server maintenance package (#17.06.19.327206), containing internal fixes.
    • Magnum should receive a server maintenance package (#17.06.19.327192) intended to fix BUG-100830 (“HTTP_CUSTOM_HEADER no longer works on RC 17.06.13.327111”) and BUG-100831 (“Lelutka Simone bento head spits a script error when attached on 17.06.13.327111 regions (Magnum & Cake)”).

SL Viewer

The Asset HTTP RC viewer, version 5.0.6.326593 dated May 23rd, was promoted to de facto release status on Tuesday, June 20th.  This viewer includes avatar rendering updates – see my RC overview for more.

The snapshot viewer updated to version 5.1.0.506488 on Monday, June 19th. This version should include all the necessary metadata in 360-degree shots to allow them to play as 360 images on suitable websites. However, in testing, it does not appear to work with Flickr.

Otherwise, the current viewer pipelines line up as follows:

  • Release channel cohorts:
  • Project viewers:
    • Project Alex Ivy 64-bit viewer version 5.1.0.505089 dated May 11th
  • Obsolete platform viewer version 3.7.28.300847 dated May 8th, 2015 – provided for users on Windows XP and OS X versions below 10.7.

SL project updates 24/3: TPV Developer meeting

Le Sixième Sens, Les Reves Perdus; Inara Pey, June 2017, on Flickr – Les Reves Perdus (blog post)

The majority of the notes in this update are taken from the TPV Developer meeting held on Friday, June 16th, 2017. The video of that meeting is embedded at the end of this update, my thanks as always to North for recording and providing it. Timestamps in the text below will open the video in a separate window at the relevant point for those wishing to listen to the discussions. Note that the timestamps may not be in chronological order, reflecting the fact that some topics were discussed more than once during the course of the meeting.

Server Deployments, Week 24 – Recap

As always, please refer to the server deployment thread for the latest information.

There was no deployment to the main (SLS) channel on Tuesday, June 13th. Nor, as the channel was updated in week #23, was there a restart.

On Wednesday, June 14th, two of the server RC channels were updated as follows:

  • LeTigre received a new server maintenance package (#17.06.12.327066), comprising additional internal logging and features, and improvements to region starts.
  • BlueSteel received a new server maintenance package (#17.06.13.327122) containing internal fixes.

[15:20] The Magnum RC was initially updated with a newer version of the new operating system update (#17.06.12.327060), which included a fix for BUG-100737 “Shoutcast receivers unable to relay on RC Magnum” (see part 1 of this report for more on this issue). However, this deployment had to be subsequently rolled back, as the fix for BUG-100737 didn’t work as expected. The update will likely be re-deployed to Magnum in week #25 (commencing Monday, June 19th).

SL Viewer Pipelines

Asset HTTP Viewer

[1:39] The Asset HTTP viewer should be promoted to release status at the start of week #25 (week commencing Monday, June 19th). The promotion has been delayed while the viewer goes through a complete regression test (something the Lab does every so many viewer releases). This viewer sees delivery of all remaining asset types (wearables, gestures, animations, sounds, etc.) over HTTP via the CDN.

[11:39] This viewer should hopefully see faster first-time playback of sounds and animations, as these will be obtained via the CDN, which should be faster than obtaining them through the simulator. It also means obtaining assets should be a lot more reliable when you’re in a busy region, because – again – the assets are not coming via the simulator, but through a CDN node.

The Lab will – several months from now – remove the server-side UDP messaging support for these asset types. This will in turn mean that any viewers not updated to the HTTP support by the time the messaging is removed from the simulator will no longer be able to receive these asset types.

Maintenance RC Viewer

[5:25] The Maintenance RC viewer updated on Thursday, June 15th to version 5.0.6.327125. This includes an update to prevent the viewer crashing if it receives a UDP message from the simulator that it doesn’t recognise, by having the viewer ignore all unrecognised messages.

Voice RC Viewer

[5:04] The Voice RC viewer has been updated, but the update has a high crash rate and so is unlikely to see the light of day.

Alex Ivy 64-bit Project Viewer

[2:23] The next version of the 64-bit project viewer is completing testing. This includes the new Windows SL Launcher and updater, together with a 64-bit version of the Havok sub-libraries. As noted in my last TPV Developer meeting update, the launcher is essentially a 32-bit executable that checks a Windows system to see if it is 32-bit or 64-bit, and then endeavours to download the correct version (32- or 64-bit) of the viewer if an update is available, install it and then launch it. SL Launcher is only required for Windows as the Mac version of the viewer will only be provided in 64-bit once the Alex Ivy viewer reaches release status.

A follow-up build for an RC release has apparently been produced, and this should appear soon after the project viewer update; work has also commenced on updating the wiki instructions for building the viewer to match the 64-bit build process.

[38:26] The wiki instructions are being updated to reflect the requirements of the 64-bit build, so care should be taken when following them for other builds.

360-degree Snapshot Project Viewer

[6:21] The 360-snapshot viewer is now up-to-date and includes code to generate 360-degree equirectangular images and their metadata, which can then be uploaded to suitable websites supporting 360-degree images. The update will appear once it has cleared the Lab’s QA testing.

There is still further work to be done on this viewer – the UI is going to be updated to allow integrated uploads of 360-images to SL Place Pages (and this may be done for Flickr, etc), and SL Place Pages will be updated to accept 360-degree images from the viewer.

TP Throttle

[13:28] The Lab is still looking at throttling the speed at which teleport requests can be re-tried when trying to access a busy region. An initial change is currently on the LeTigre RC, and further changes are liable to be made. As previously noted, these updates shouldn’t impact manual teleports, but may affect teleport HUDs which are scripted to repeatedly re-try teleports in rapid succession until one is successful (requiring the scripts running them to be modified so they don’t exceed the throttle – see the sketch below).

This change is being made because a high incidence of failed teleport requests hitting a busy region places an additional load on the region’s simulator, adversely affecting performance for those already in the region.
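
To illustrate the sort of adjustment such HUD scripts may need, the hedged LSL sketch below spaces out its teleport attempts on a timer rather than re-trying in a tight loop. It assumes an experience-enabled HUD using llTeleportAgentGlobalCoords(); the destination coordinates, retry interval and retry cap are arbitrary placeholder values, not figures the Lab has specified, and checking whether an attempt actually succeeded (e.g. via llGetRegionName()) is omitted for brevity.

    // Hedged sketch: pacing teleport retries instead of re-trying in rapid succession.
    // Assumes the script is compiled to an experience the wearer has accepted.
    vector  DEST_GLOBAL = <256000.0, 256000.0, 22.0>; // global coordinates of the target region (placeholder)
    vector  DEST_LOCAL  = <128.0, 128.0, 22.0>;       // position within the target region (placeholder)
    float   RETRY_DELAY = 15.0;                        // seconds between attempts (placeholder)
    integer MAX_TRIES   = 5;                           // give up after this many attempts
    integer tries;

    default
    {
        touch_start(integer n)
        {
            tries = 0;
            llRequestExperiencePermissions(llGetOwner(), "");
        }

        experience_permissions(key agent)
        {
            // First attempt, then re-try on a timer rather than immediately.
            llTeleportAgentGlobalCoords(agent, DEST_GLOBAL, DEST_LOCAL, ZERO_VECTOR);
            llSetTimerEvent(RETRY_DELAY);
        }

        timer()
        {
            if (++tries >= MAX_TRIES)
            {
                llSetTimerEvent(0.0);
                return;
            }
            llTeleportAgentGlobalCoords(llGetOwner(), DEST_GLOBAL, DEST_LOCAL, ZERO_VECTOR);
        }
    }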

Other Items

Uploading Meshes Rigged to Attachment Points

[17:48] This subject came up at the Content Creation User Group meeting as a part of the discussion on animating weapons to follow hands. There was some confusion on whether mesh objects rigged to attachment points could be uploaded, after it was reported that the LL viewer supported it and Firestorm didn’t (see FIRE-21000 – which now has a fix). While there is a server-side validation error (fix in progress) which might cause upload problems, it is believed that the current behaviour should be that new objects rigged to attachment points are blocked from upload, while existing items rigged to attachment points previously uploaded to SL will still work.

Supplemental Animations and Animation Priorities

[24:17] The question was asked if there was any historic reason for not being able to change the priority of an animation post upload (see SVC-8094). It is thought this might be because the priority is set within the animation asset, which cannot be edited. However, it is hoped the forthcoming server-side supplemental animation updates will help eliminate some of the conflicts created by priority clashes.

Providing a Means to Compile Experience Scripts in the User’s Inventory

[35:21] Some people working collaboratively on experiences are finding it problematic when having to update scripts used by the experience, but which are contained in another user’s objects for that experience, as it requires a lot of swapping and changing, rather than simply editing the script in question (see BUG-8180).

While the Lab understands these difficulties, it was a conscious decision to have experience management work as it does, and while at some point in the future they might revisit things, doing so isn’t on the short-term roadmap.

Resetting Scripts in No-Mod Objects

[36:47] This is a request the Lab is unlikely to implement, because it would violate the expectations of the script authors.

SL project updates week 24/2: Content Creation UG w/audio

The Content Creation User Group meeting, at the Hippotropolis Camp Fire Circle (stock)

The following notes are taken from the Content Creation User Group meeting, held on Thursday, June 15th, 2017 at 1:00pm SLT at the Hippotropolis Camp Fire Circle. The meeting is chaired by Vir Linden, and agenda notes, etc., are usually available on the Content Creation User Group wiki page.

Audio extracts are provided within the text, covering the core points of the meeting. Please note, however, that comments are not necessarily presented in the chronological order in which they were discussed in the meeting, but are ordered by subject matter.

Animated Objects

Vir is continuing to work on this project, which has been given the informal name of “animesh”. As some pointed out at the meeting, this sounds a lot like “any mesh”, although it seems to have some support among attendees, who have been doing their best to propagate the term ahead of the Lab settling on a project name.

Viewer Status

There is no ETA for a project viewer, as the current test viewer still has a habit of crashing other viewers in the same region by sending them unrecognised messages. This needs to be fixed before a viewer supporting animated meshes goes into circulation, even as a project viewer.

Scaling Animated Objects

There has been some discussion around editing animated objects in order to adjust their scale, with the associated skeleton being automatically adjusted to match the desired size of the object. In testing the idea, Vir has found it a lot harder to do than expected due to how things are coded in the viewer. Essentially, there is no overall way to scale the skeleton; every individual bone in the skeleton has to be scaled.

However, there does appear to be one viable means of achieving the scaling up / down of an animated object, and Vir is going to take a look to see if it can be made to work in a semi-predictable way.

Suggestions on how to handle this have included adding a root prim to animated objects, using a script to apply scale, or using the object’s bounding box (the physics bounding box isn’t seen as suitable, as some animated objects may not have physics associated with them). While the latter might be a little more fiddly to use, it is the option Vir seems to prefer, although as he notes, he still needs to do more testing. If the approach doesn’t work, use of LSL commands might be looked at as an alternative (see the illustrative sketch below).
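
As a purely illustrative aside on the LSL route (and not anything the Lab has committed to), a script can already rescale an ordinary multi-prim linkset by resizing each prim and scaling the child prims’ local offsets to match – conceptually similar to the “every bone has to be scaled” situation Vir describes in the viewer. A hedged sketch, ignoring prim size limits and physics costs:

    // Hedged sketch: uniformly rescaling a multi-prim linkset from a script.
    // Each prim must be resized individually, and child prims also need their
    // local offsets scaled so the linkset grows or shrinks around the root.
    rescaleLinkset(float factor)
    {
        integer prims = llGetNumberOfPrims();
        integer link;
        for (link = 1; link <= prims; ++link)
        {
            list params = llGetLinkPrimitiveParams(link, [PRIM_SIZE, PRIM_POS_LOCAL]);
            vector size = llList2Vector(params, 0) * factor;

            if (link == LINK_ROOT)
            {
                // The root prim keeps its position; only its size changes.
                llSetLinkPrimitiveParamsFast(link, [PRIM_SIZE, size]);
            }
            else
            {
                vector local = llList2Vector(params, 1) * factor;
                llSetLinkPrimitiveParamsFast(link, [PRIM_SIZE, size, PRIM_POS_LOCAL, local]);
            }
        }
    }

    default
    {
        touch_start(integer n)
        {
            rescaleLinkset(1.5); // grow the linkset by 50% (illustrative value)
        }
    }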

Baked Textures on Meshes

Anchor Linden is working on the project. At the moment the focus is on baking service infrastructure updates to support the increased baking requirements (including support for 1024×1024 textures, which is seen as the “easy” part). There is no ETA for this work at present, but the rough workflow is:

  • Update the baking service
  • Carry out performance testing – increasing the number of avatar bakes for a large number of avatars is going to increase the cost of the baking process, so the Lab needs to be sure any requirements for additional baking servers are understood
  • Issue an updated viewer which supports rendering the new bakes, and has a compatible “local baking” (used to define your initial look for transfer to the baking service) which is fully consistent with the baking service.

Once these are in place, then work can commence on how to flag mesh faces as being surfaces on which the baked textures are to be applied. This will include  a mechanism  for hiding the existing (default) avatar body without using an alpha layer.

Updating the baking service to support bakes on meshes will not involve adding materials support to the baking service, although that may be considered as a future project. The focus here is purely on extending the baking service to support using the baked textures already available on mesh avatar bodies.

Alpha Masking Mesh Bodies

The question was raised as to whether use of the baking service would allow clothing creators to use alphas as a means to hide body elements and stop them showing through mesh clothing worn by an avatar (as tends to be done with the system avatar and mesh clothing today), rather than, or alongside, the current mechanism whereby a mesh body (Maitreya, Slink, TMP, etc.) is split into numerous parts with multiple faces, which can have individual alphas applied to hide them.

Vir believes the baking service should be able to provide suitable body masking, given it can already do so for the system avatar, where alphas baked into an appearance can be used to hide all or parts of an existing system avatar when seen by others.

Cathy Foil also suggested a means to “turn off” the default body parts on the system avatar (head, upper body, lower body), or the use of a second alpha channel. The first option is useful, but constrained – you can’t turn off hands or feet, for example, as they are defined within the upper / lower body parts. A second alpha channel offers greater flexibility, but adds to the complexity of implementation.

Overall, masking through the baking service – given there have been tests by body creators in the past to see how alphas within bakes work on mesh bodies – is seen as the more direct answer. It will obviously require people to go through a learning curve vis-à-vis understanding how to apply bakes to meshes, any UI changes, etc. The project viewer – once available – is seen as a means of starting on this learning process, as well as a means of determining what has been missed / may additionally be required to make the capability useful.

Mixing Bento Hand Animations and Non-Bento Hand Morphs

BUG-100819, “Default hands spread wide during bento hand animations, making it impossible for Bento and non-Bento owners to play together” came up for discussion at the meeting.

In brief: the default system avatar uses a set of morphs to allow the hand to form a series of basic shapes – a relaxed hand pose, a fist pose, a spread fingers (default) pose, etc. – which can be triggered by an animation utilising an identifier. Bento animations, however, directly manipulate the 30-odd bones in the hands to produce hand and finger poses. As the system avatar cannot use these bones, Bento animations are effectively ignored when run on a system avatar.

However, the underpinning system hand morphs can still be used by the system avatar providing the required morph is identified within the animation itself. When this is done, the animation will play for Bento avatars, or be ignored by system avatars in favour of the defined morph. But if no morph value is specified within the animation, the system avatar hand will adopt the default splayed fingers morph – which appears to be what is happening in the JIRA, possibly combined with an animation priority clash.

Medhue Simoni recently produced a live stream walk-through of mixing Bento animations and default hand morphs, and provided the link to that session at the meeting, which I’ve embedded below.

It has been suggested that the splayed fingers issue could be avoided by changing the system so that if a null value is specified in an animation (as opposed to leaving the field blank), the system avatar will adopt the relaxed hand morph. While Vir has agreed to look into this, adding such a null value will not automatically resolve the problem for animations which do not have any morph value defined – the system avatar will continue to use the splayed fingers morph.

Another suggestion is to have the exporter in the tool used to create the animation (e.g. Avastar) display a reminder that hand animations should have a morph value defined. This would make more sense, as it would be within the application where the animator can easily add a value if they had forgotten to do so.

General Discussions

  • Re-purposing Bento bones for pets – yes, this can be done, providing the re-purposed bones are not being used for anything else (e.g. if a pet attached to your avatar skeleton uses facial bones and you have a Bento head using the same bones, wearing both at the same time will result in conflicts).
  • Animated objects will overcome this by allowing completely independent pets, but it’s not clear at this point if these could be attached to an avatar, as that would mean combining two independent skeletons.
  • A request was made to increase the largest allowed size for prim creation (64m x 64m). This is unlikely to happen.

Bento Bones and Weapons

Bento bones can be used with weapons, again providing they do not clash with other mesh using the same bones. In this, the wing bones would seem to be a good choice, given the groin, tail and rear leg bones can have a wide variety of uses, and may be more prone to clashes.

One problem with weapons is getting them to align with the hands. As Medhue pointed out in the meeting, he has discovered that getting rigged weapons to stay aligned to the hands when the avatar’s shape is changed is next to impossible. Instead, he recommends not rigging the weapon, then using the hand attachment point and animating that instead. This both allows the weapon to be animated and ensures the weapon remains closely matched to the hand no matter how the avatar is resized.
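
By way of a hedged sketch of that approach, the script below would sit in the unrigged weapon: when it is attached (e.g. to the right hand attachment point), it asks the wearer for animation permission and plays a hold / pose animation from its inventory. The animation name “weapon_hold” is a placeholder – the actual hand / attachment point animation is created and uploaded separately, exactly as Medhue describes.

    // Hedged sketch of the "don't rig the weapon" approach: the mesh is worn
    // unrigged on the hand attachment point, and an animation poses the hand
    // (and, if desired, animates the attachment point) while it is worn.
    // "weapon_hold" is a placeholder for an animation in the object's inventory.
    default
    {
        attach(key id)
        {
            if (id != NULL_KEY)
            {
                // Ask the wearer for permission to animate them once attached.
                llRequestPermissions(id, PERMISSION_TRIGGER_ANIMATION);
            }
        }

        run_time_permissions(integer perms)
        {
            if (perms & PERMISSION_TRIGGER_ANIMATION)
            {
                llStartAnimation("weapon_hold");
            }
        }
    }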