Category Archives: SL Project Updates

SL project updates 20/3: TPV Developer meeting

Nitroglobus Roof Gallery: Black and White Women – blog post

The majority of the notes in this update are taken from the TPV Developer meeting held on Friday, May 19th, 2017. The video of that meeting is embedded at the end of this update; my thanks, as always, to North for recording and providing it. Timestamps in the text below will open the video in a separate window at the relevant point for those wishing to listen to the discussions.

Server Deployments Re-cap

  • There was no Main (SLS) channel deployment or restart on Tuesday, May 16th.
  • On Wednesday, May 17th, the three RC channels were updated as follows:

SL Viewer

[1:00] The Voice RC viewer has an elevated crash rate, and the Lab hasn’t yet determined why.

The Maintenance RC viewer updated to version on Thursday, May 18th. This viewer currently has a lower crash rate than the other RC viewers (although it has not been out that long), so might be a candidate for promotion. I have an overview of this viewer for those interested.

64-bit Viewer

[2:23] The last major functional addition for the 64-bit Alex Ivy viewer is currently with the Lab’s QA. If all goes well, a further project viewer update should arrive in week #21 (commencing Monday, 22nd May).

This introduces a new executable to the viewer – SL Launcher – which runs an update check at start-up. If there is a new version of the viewer available, the Launcher manages the download and installation – including ensuring Windows users get the right version for their operating system (32- or 64-bit). If there is no new version to install, or once the viewer installation has completed, the Launcher will launch the viewer as a child process, and will shut down when the viewer exits at the end of a session.

The plan is to move the crash data capture package to the Launcher in the future, which will give full end-to-end monitoring of the viewer in the event of a crash.

360 Snapshot Viewer

[6:07] The work on the 360 snapshot viewer is once again progressing. A new library has been added, which provides the appropriate metadata so that websites supporting 360-degree viewing can correctly recognise such images taken by the viewer on upload, eliminating the need to process them separately via the web service currently supplied by the Lab.

This work is currently being tested, and should find its way into a project viewer update some time in the next two weeks or so, with a release candidate hopefully not too far behind that.

Region Crossing Hand-off / Caps Router Issues

[7:43] Fantasy Faire experienced very high levels of region crossing hand-off problems with avatars trying to move between the various regions. A similar issue has surfaced at the just-opened Home and Garden Expo.

While the issue isn’t new, the Lab found one cause to be the Caps Router running out of connections due to the number of avatars it is attempting to serve. New monitoring has been put in place which will determine how many connections the Caps Router is using, and when it is approaching its limits. The data gathered will be used to help better determine how many connections are needed, allowing the Lab to adjust the number supported.

This work is going to be carried out incrementally, starting with an initial RC deployment in week #21 containing conservative adjustments, in the hope of avoiding the creation of additional bottlenecks by changing things too radically at one time. However, the hope is that, in time, the changes will result in two improvements:

  • It could result in an increase in the number of avatars a region can comfortably support
  • As this is an issue at the server level (not the simulator level), the changes should help reduce the problems experienced by people on lightly-populated regions as a result of those regions being hosted on the same server as one or more regions with a lot of avatars on them.

As a result of understanding the problem, the Lab was aware the issue was impacting the Home and Garden Expo even before it had been reported.

Unsuccessful Teleports Impacting Region Performance

[14:20] During investigations into the region issues at Fantasy Faire, the Lab noted that a simulator running a busy region has to carry out a lot of work to determine whether or not someone can teleport into it, which can degrade overall simulator performance.

To combat this, the Lab is going to change the teleport re-try throttle following a failed TP. As viewer-initiated teleports are already somewhat throttled, the change should not affect them. However, it will likely mean that the very rapid retry TP HUDs (aka “TP hammers”) will break or degrade in their performance unless adjusted.

The hope is that by reducing the load placed on a simulator as it tries to deal with too rapid a succession of TP requests which cannot be granted as the region is full, overall performance will be improved and those already in the region will enjoy a better experience.

This change should be appearing in a server RC update soon.
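For scripted TP HUD makers affected by the throttle change, the general adjustment is to retry failed teleports on a backing-off timer rather than hammering the simulator. A rough, entirely hypothetical sketch of that approach – assuming an experience-enabled HUD using llTeleportAgent(), which is one way such HUDs can be built, and a landmark called “Target LM” in the HUD’s inventory:

```lsl
// Rough sketch only: retry a teleport with an increasing delay,
// rather than issuing very rapid retries ("TP hammering").
// Assumes an experience-enabled attachment; "Target LM" is a
// hypothetical landmark held in the HUD's inventory.
float gDelay = 5.0;   // initial retry delay, in seconds

default
{
    touch_start(integer n)
    {
        gDelay = 5.0;
        llSetTimerEvent(gDelay);
    }

    timer()
    {
        // Attempt the teleport; if the region is full this will fail,
        // and the next attempt comes only after a longer wait.
        llTeleportAgent(llGetOwner(), "Target LM", ZERO_VECTOR, ZERO_VECTOR);
        gDelay *= 2.0;  // back off on each attempt
        if (gDelay > 80.0)
            llSetTimerEvent(0.0);   // give up rather than hammer
        else
            llSetTimerEvent(gDelay);
    }
}
```

The exact throttle values the Lab will use haven’t been published, so the delays above are illustrative only; the point is the shape of the retry loop, not the numbers.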

Additional comments on teleport failures:

  • A queuing system will not be added, as this is deemed to be too difficult to implement and manage.
  • There is no relationship between the size of an avatar’s inventory and the frequency with which that avatar may experience teleport failures. However, the number of items attached to an avatar, the scripts they are running, and so on, can be a factor.
  • The Lab can monitor teleport failures in real-time.

Automatic Additional Logging after Region Crashes

[29:27] It was asked if additional logging could be automatically enabled on a region crash. This is not something that can currently be done, and Oz’s belief is that doing so would place an additional load on the simulator during recovery, and so would not be a good idea.

Avatar and Object Rendering Cost Investigations

[31:00] The Lab is continuing work in reviewing the rendering cost calculations for in-world objects and avatars, work I first reported on in September 2016. However, the numbers aren’t yet at a point where any adjustments can be made to the calculations.

Fun Fact

Oz Linden marked his seventh anniversary at the Lab this week – so a belated happy rezday to him! Some of us can likely remember his 2010 appearance at the SLCC, when Esbee Linden introduced him to the audience in Boston 🙂 .

Oz at one of the viewer / open-source panels at SLCC 2010, with Esbee Linden just visible to the right






SL project updates week 20/2: Content Creation User Group w/audio

The Content Creation User Group meeting, at the Hippotropolis Camp Fire Circle (stock)

The following notes are taken from the Content Creation User Group meeting, held on Thursday, May 18th, 2017 at 1:00pm SLT at the Hippotropolis Camp Fire Circle. The meeting is chaired by Vir Linden, and agenda notes, etc, are available on the Content Creation User Group wiki page.

Audio extracts are provided within the text, covering the core points of the meeting. Please note, however, that comments are not necessarily presented in the chronological order in which they were discussed in the meeting. Instead, I have tried to place a number of related comments by Vir on specific topics into single audio extracts with their associated notes, in the hope of making those topics easier to follow, and without changing the context of the comments themselves. If you would prefer to listen to the discussion and comments in the order the meeting unfolded, I have embedded a video recorded at the meeting by Medhue Simoni. My thanks to him for making it available.

Supplemental Animations

While this is now an adopted project, the focus has been on animated objects, and so there is no significant progress on this work at present.

Applying Baked Textures to Mesh Avatars

No movement on this.

Animated Objects

Vir has spent most of the week since the last meeting working on animated objects, developing prototypes and proof-of-concept code to see how objects might be animated using the avatar skeleton. He describes the results thus far as encouraging, whilst also pointing out that it is still early days for the work, so it is far too early to determine what the final architecture will be.

The viewer already has a notion of an avatar without a human operator, which is notably seen when uploading an avatar mesh or animation. This notional avatar isn’t rendered graphically, but is oriented using transforms so that an object can use it as a source of joint motions. This is not necessarily how things will work with any finished product, but it is enough to demonstrate what might be possible.

Currently, Vir is working with single object rigged meshes, and would be happy to receive similar models, preferably with associated animation, if people have anything they believe would be useful for helping with these tests.

It is hoped that “being animated” will be an additional property which does not require a new mesh upload option, so that any rigged mesh for which you have Edit permissions can be set to use the property, allowing it to be driven by its own animations. Currently:

  • This will likely mean the object will no longer be attachable to an avatar
  • It has yet to be determined if this property will be a new prim type or an additional field added to an existing object, etc
  • It will not require any changes to the current mesh uploader; the property to convert a mesh to an animated object can be set post upload.

A suggestion was made that the animated mesh should use its own skeleton when independently rezzed in-world, but a sub-set of a controlling avatar’s skeleton if it is attached. This would allow things like animated horses to be rezzed in-world and then sat on for riding or pets to be “picked up” and carried,  as is currently the case with some scripted animals already.

The testing carried out thus far hasn’t looked at animated attachments, although Vir appreciates the potential in having them. However, there are concerns over potential additional performance impacts, and the risk of bone conflicts (what happens if your avatar is already using one or more bones for something, and these same bones are used by an animated attachment).

While not ruling the potential out, Vir’s tests so far haven’t encompassed animated attachments to determine what issues might arise. There are also other factors involved in avatar control which need to be looked at with animated objects: hover height, offsets, position, etc., all of which might affect how an animated object might be seen / behave.

Scripting / LSL Commands

The current work has not so far looked at LSL commands or command sets for the new capability. However, the intent remains that scripts for controlling an animated object will be held within the inventory for that object, and able to call animations for the object also contained within the object’s inventory, so things are not straying too far from what can already be done via scripted control of in-world objects.
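Based on that intent, a script driving an animated object might look something like the following, entirely hypothetical, sketch. The function names are invented placeholders – no LSL commands had been defined for this capability at the time of writing – and the only assumptions drawn from the meeting are that the script and its animations live in the object’s own inventory:

```lsl
// HYPOTHETICAL sketch only: the LSL calls for animated objects had not
// been defined at the time of writing. The function names below are
// placeholders, not real LSL functions.
default
{
    state_entry()
    {
        // The animation "amble" is assumed to sit in this object's
        // inventory alongside this script, per the intent described above,
        // so the script has permission to animate its own object.
        llStartObjectAnimation("amble");   // placeholder name
    }

    touch_start(integer n)
    {
        llStopObjectAnimation("amble");    // placeholder name
    }
}
```

Animating other, independently rezzed objects from a single script would, as noted above, be more complicated, requiring some kind of object ID and permissions model.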

Performance Impact

Similarly, it is hard at this point to know what the likely performance hit might be. Bento has shown that adding more bones to the avatar skeleton doesn’t create a notable performance hit, so providing a skeleton for in-world objects shouldn’t cause any greater impact than a basic avatar. However, associating a rigged mesh object with that skeleton, then animating the joints, etc., will have an impact, particularly if a lot of animated objects are used in any given place.

This is something that will be looked at in greater detail once there is a project viewer available for testing alongside any server-side updates, although the Lab doesn’t intend to make it easy for a region to be spammed with multiple versions of an animated object, and this may in part be linked to the Land Impact associated with such objects.

Attachment Points on Animated Objects and Linksets with Animated Objects

While attachment points are also joints within the skeleton being used by an animated object, and so can be animated, they would not actually support having other objects attached to them, as the animated object doesn’t have links to other objects in the way an avatar does.

An animated object could be a linkset of rigged meshes identified as a single object, with all of the rigged meshes referencing the same skeleton. Things might be more difficult if static mesh objects form a part of the object, as it is not clear how the positioning of these would be controlled, and more testing is required along these lines.

Body Shapes and Animation Scaling

Requests were made to allow animated objects to have body shapes (which would allow slider support, etc.), and  / or animation scaling.

Because of the changes that would be involved in both, coupled with the potential for conflicts in the case of animation scaling, Vir does not see either as being part of this work – as previously noted, assigning a body shape to an animated object would impact a number of other back-end systems (such as the baking service), adding significant overheads to the project.

As such, the Lab would rather keep the work focused, building on something that could be rolled-out relatively quickly, and then iterated upon. However, one option that might be considered is having some kind of root node scale, based on the scale of the animated object that would size the skeleton to the scale of the object, rather than vice versa, possibly by altering how the mPelvis bone is managed for such objects.

[56:37-1:02:30] The final part of the meeting delved into the relative efficiency of mesh and sculpts, and matrix maths on CPUs / GPUs, and the complexities of rendering animated objects, together with a reminder that object rendering costs are currently being re-examined.

Other Items

In-World Mesh Editing?

[41:00-55:55] Maxwell Graf raised the idea of having a simple in-world mesh editor / enhancements to the editing tools which would allow creators to adjust individual faces, edges or points in an object, giving mesh creators a reason to spend more time in-world, and potentially allowing non-mesh builders more flexibility in what they can do as well.

The current toolset – mesh uploader and editing tools – would not support such a move, and there are a number of potential gotchas at a technical level which would need to be understood and dealt with. For the Lab to consider such a project, any proposal would have to identify the smallest subset of the capabilities available in dedicated mesh creation / editing tools such as Blender and Maya that would be useful to have in-world, so that the overall scope of the work could be defined in terms of resources required, and the likely return on the effort taken.

Based on the conversation, Max is going to try to put together a feature request / proposal, even if only for the purposes of future discussion.


SL project updates week 20/1: server, viewer

Crystal Garden Estates, Quararibea Cordata Island; Inara Pey, May 2017, on Flickr. Crystal Garden Estates – blog post

Server Deployments

As always, please refer to the server deployment thread for the latest updates and news.

  • There was no Main (SLS) channel deployment or restart on Tuesday, May 16th.
  • On Wednesday, May 17th, the three RC channels should be updated as follows:

Simulator Operating System Update

The build of the simulator code using an updated version of Linux was initially deployed to LeTigre in week #19. However, it was rolled back on Thursday, May 11th and replaced with the server maintenance package originally deployed to the Magnum RC channel that week. The reason for this can be found in BUG-100667, “Krafties HUD does not work on LeTigre regions only”. The BlueSteel deployment should hopefully correct this issue.

SL Viewer

The Alex Ivy 64-bit viewer was updated to version on May 11th. If you’re on a 64-bit version of Windows, make sure you click on the correct download link to avoid receiving the 32-bit version.

On Friday, May 12th, the Maintenance RC viewer, version was released. This viewer includes improvements to Trash purging behaviour designed to assist with avoiding inventory losses and the new UI controls for the new parcel access overrides – both of which have been previously noted in these updates, with the latter being deployed to LeTigre this week.

The new Trash purging warning, giving a count of the items about to be permanently deleted from the trash folder – one of the new behaviours in the Maintenance RC viewer designed to help combat accidental inventory loss through Trash deletions

In addition, the Maintenance viewer has additional fixes and UI improvements, including a contributed feature which allows users to search and replace asset links in their inventory. This should greatly simplify updating links related to a product when it has itself been updated. The default media playback volume has also been reduced, in keeping with recent requests from some Community Gateways.

Outside of these two viewers, there have thus far been no other changes to the viewers in the pipeline, which remain as:

  • Current Release version:, dated April 3rd, promoted April 19th – overview
  • RC viewers:
    • Voice RC viewer, version, re-released on Friday, May 5th
    • Project AssetHttp project viewer, version dated May 4th
  • Project viewers:
    • 360-degree snapshot viewer, version, dated November 23rd, 2016
  • Obsolete platform viewer, version, dated May 8th, 2015 – provided for users on Windows XP and OS X versions below 10.7.

Terrain Issues

The golfing community has noticed an apparent behaviour change affecting either the terrain or scripted golfing systems. The change manifests in a number of ways, for example: indicators which should only be triggered when a ball registers as being in Linden Water triggering when the ball is on land; golf balls apparently penetrating the terrain and being marked as deep under it; balls hitting prim objects and bouncing wildly, etc.

The problem has been noted at multiple golf courses and appears to affect all popular golf systems – those by Fa Nyak or Cowley, for example. The issues have been around for about 5-6 weeks, and reports are that they are getting worse. They don’t appear to be linked to issues of “lag” either in the viewer or at the simulator end (e.g. due to the volume of avatars in a region), as the problems can pop up with just two people playing a round; they are also somewhat inconsistent and difficult to deliberately reproduce. A JIRA has been requested on the problem to help the Lab investigate.

SL project updates 19/2: NEW projects – supplemental animations and animated objects

The following notes are taken from the Content Creation User Group meeting, held on Thursday, May 11th, 2017 at 1:00pm SLT at the Hippotropolis Camp Fire Circle. The meeting is chaired by Vir Linden, and agenda notes, etc, are available on the Content Creation User Group wiki page.

Audio extracts are provided within the text – although please note, these are not necessarily presented in the chronological order in which they were discussed in the meeting. Rather, I have tried to place a number of related comments by Vir on specific topics together – project scope, constraints, etc. – where in the meeting they may have been discussed / reiterated at different times. Medhue Simoni recorded the meeting, and his video is embedded at the end of this report for those wishing to follow the discussion chronologically. My thanks to him for the recording.

The meeting held two major announcements: supplemental animations and animated objects, both of which are being loosely referred to under the umbrella of “animation extensions”.

Supplemental Animations

This is an idea to overcome issues of animation states keyed by the server-side llSetAnimationOverride() conflicting with one another. This problem has been particularly noticeable since the arrival of Bento, and a typical example is that an animation to flap Bento wings, if played to have natural wing movement while walking, results in a conflict with the walk animation, causing the avatar to slide along the ground.

  • Supplemental animations will allow additional animations to run alongside the basic llSetAnimationOverride() locomotion graph, requiring updates to the server-side animation code, rather than any viewer updates.
  • The changes will allow for more than one supplemental animation to run at the same time – so you could have wings flapping while walking and a tail swinging – providing the animations are restricted to using discrete sets of bones and do not clash (e.g. the wing flapping doesn’t call on a bone used in tail wagging or walking). If there is an overlap, the normal animation priorities would then determine which animation is played.
  • While the syntax still has to be worked out, it will likely be a call to add a set of supplemental animations associated with a specific state (e.g. walking) on attaching a relevant object (such as wings), and a call to remove the animation set when the item is subsequently detached.
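As an illustration only – the actual syntax has yet to be worked out, so the function names below are invented placeholders, not real LSL calls – the attach / detach flow described above might look something like:

```lsl
// HYPOTHETICAL sketch: no supplemental animation LSL calls existed at
// the time of writing. llAddSupplementalAnimation() and
// llRemoveSupplementalAnimation() are placeholder names illustrating
// the attach / detach flow described above. A real implementation
// would presumably also need animation-override permissions, as
// llSetAnimationOverride() does today.
default
{
    attach(key id)
    {
        if (id != NULL_KEY)
        {
            // Wings attached: add a wing-flap animation to run
            // alongside the normal "Walking" locomotion state.
            llAddSupplementalAnimation("Walking", "wing_flap_walk");
        }
        else
        {
            // Wings detached: remove the supplemental animation set.
            llRemoveSupplementalAnimation("Walking", "wing_flap_walk");
        }
    }
}
```

Provided the wing animation only uses the wing bones, it would, per the notes above, run alongside the walk animation rather than conflicting with it.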

Animated Rigged Objects

The Lab is starting work on adding the ability to animate rigged objects to Second Life – something which has been the focus of ongoing discussions within the Content Creation User Group for the past several months.

General Overview, Initial Limitations – *NOT* NPCs

  • At this point in time, this is not about adding fully functional, avatar-like non-player characters (NPCs) to Second Life.
  • It is about providing a means to use the Bento skeleton with rigged mesh to provide things like independently moveable pets / creatures, and animated scenery features, via scripted animation.
  • The project may be extended in the future.
  • It will involve both back-end and viewer-side changes, likely to encompass new LSL commands to trigger and stop animations (held in an object’s inventory)
  • It’s currently not clear if this will involve a new type of mesh object, or whether it will need a new flag added to an existing rigged object type in order for the object to be given its own skeleton. But either way, it will be in the same family of rigged mesh objects which is currently available.

  • While these objects may use the avatar skeleton, they are not avatars.
  • They will not:
    • Have any concept of body shape or avatar physics associated with them.
    • Use a Current Outfit Folder for wearables.
    • Utilise any system wearables (body shape, system layers, physics objects, etc.).
    • Be influenced by the shape sliders, or have any gender setting (as this is determined by the shape / shape sliders).
  • They will only be related to avatars in that they have associated animations driving them.

  • Given this is not an attempt to implement fully avatar-like NPCs in Second Life, use of the term “NPC” with them is not encouraged.
  • At the moment, the preferred term is simply “animated objects”.

Performance Concerns

  • There are liable to be two areas of impact for this capability: in-world land impact, directly affecting the simulator, and a rendering impact on the viewer.
  • Right now, the Lab has no idea how great either might be, but they do mean that what can be supported could be limited (hence a reason for not jumping directly to providing “full” NPC capabilities).  However, it will be something that will be monitored as the project moves forward.

General Q&A

This news prompted a range of questions, which Vir sought to address:

  • Would this mean custom avatar skeletons? – No, it would use the existing (Bento) skeleton, attached to an animated rigged object. However, joint positions and offsets will be supported, allowing the skeleton to be modified to meet different uses.

  • Will this allow the use of Animation Overriders on objects? – No; objects would at this stage not have their own locomotion graph as an avatar does, and therefore would have no notion of walking or flying, etc. All animations would have to be scripted.

  • Does this mean limits associated with the current avatar skeleton – such as the limit of placing a bone no further than 5 metres from the avatar’s centre via an animation – will still apply? – Yes, any limits baked into animations will remain; the idea is for existing meshes and existing animations to be able to leverage this capability.
  • Could animated objects be attached to an avatar? – This is not necessarily what is being looked at, although it is not ruled out; rather, the emphasis at the moment is on getting things animated independently of avatars. There is also a concern over the potential additional impact animated attachments may have on an avatar.

  • What happens if a script tries to drive the rigged mesh, rather than the avatar skeleton? – Normally, the scripts driving an avatar are in the attachments to that avatar, so “crossing the beams” is not something the Lab would recommend.
  • Is the Lab using this to help fix Pathfinding? – Not really. Pathfinding has its own set of issues and these are unlikely to be tackled as part of this project.
  • Can the skeleton for an animated object be assigned via script from an inventory object? – This might cause permissions issues.
  • How will a script know which object to animate? – The basic thinking is that the script would be inside the object it is animating (as is currently the case for placing scripts in an object), and so has permissions to animate that object. Using a single script to animate multiple independent objects would be more complicated and require some kind of object ID.
  • Could several rigged objects (rigged the same) be linked and have the same animation played? – Yes; the difference would be the object would be animated with respect to its internal skeleton rather than an actual avatar skeleton.
  • Would it be possible to sit on animated objects? – Possibly; although there might be issues, things might look odd. The Lab hasn’t investigated far enough to determine potential gotchas for this, but the hope is animated objects could work for vehicles.

  • Could animation scaling be used to adjust the size of an animated object? – It might make more sense to add some kind of “global scale” which would allow a skeleton to accommodate itself to the size of its object (rather than the object’s size being defined by the skeleton).

  • Will this allow animated objects to have wearables and attachments? – Not at this stage (although mesh clothing could in theory be a part of the linkset making up an animated object). This is a very focused project at this point: playing animations in-world on rigged objects.

Other Points

  • A suggested name for the animated objects project is “Project Alive” – this might actually be adopted!
  • There are no plans for a blog post announcing the project. However, a mechanism will be provided for people to stay involved and comment on the work, possibly via a forum thread, as was the case with Bento. This might at some point utilise polls to focus down on people’s preferences.
  • The in-world forum for discussing this work will be the Content Creation User Group.
  • Between 44:24 and 51:10 there is a discussion of adding a prim (cube) as the root of the skeleton, allowing it to inherit the physics and abilities associated with a prim, morphing physics, plus using IK (inverse kinematics) with rigged object skeletons, etc. Pros and cons of these ideas are discussed – largely in chat. In short: the Lab is still considering how physics might be handled, although they are unlikely to opt for animated or morphing physics, while IK would also need to be looked at.
  • At present, there are no clear time frames as to how long these projects – supplemental animations and animated objects – will take, or when they will be implemented, simply because they are in their early phases. However, given the supplemental animations are restricted to server-side changes and do not require associated viewer updates, they might arrive sooner than animated objects.

Applying Baked Textures to Mesh Avatars

This remains under consideration, with Vir noting animated rigged objects could add a level of complexity to it, were it to be formally adopted as a project.