2018 Sansar Product Meeting week #31: physics

Scurry Waters: product meeting location

The following notes are taken from the Sansar Product Meeting held on Thursday, August 2nd. These Product Meetings are open to anyone to attend and are a mix of voice (primarily) and text chat. Dates and times are currently floating, so check the Sansar Atlas events section each week.

The primary topic of the meeting was Sansar physics, although inevitably other subjects were also covered.

My apologies for the music in the audio extracts. This is from the experience where the meeting was held, and I didn’t disable the experience audio input.

Express Yourself Release Updates

The July Express Yourself Release (see my overview here) had two short-order updates following its deployment, both issued to fix emerging issues. The first went out on July 19th, and the second on July 30th.

Client-side Physics

The Express Yourself release included an alteration to network behaviour that means physics interactions now occur locally within the client first, giving the user an immediate response. The idea is to provide the kind of immediate feedback that will be essential to dynamic activities such as driving or flying a vehicle, as well as allowing a more immediate response when picking an object up, walking, firing a gun, etc.

However, as the updates still need to pass through the server and then back out to everyone else, this can result in objects appearing to move instantaneously when control is passed to another avatar. More particularly, it was discovered the change could adversely affect any movement governed by scripts, which require additional time for server-side processing. This resulted in some content breakage, which in turn caused the updates – notably that of July 30th – to be issued in order to fix things.
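For those wanting a mental model of what changed, the sketch below (in Python, purely illustrative – none of these names are Sansar API) shows the general client-side prediction pattern being described: apply the input locally at once for instant feedback, then correct against the authoritative state when the server update arrives. The "jump" other people see is simply their clients only ever receiving the delayed, server-confirmed positions.

```python
# Illustrative sketch of client-side prediction with server reconciliation.
# Names and structure are assumptions for explanation only, not Sansar's API.
# Positions are kept as simple floats (one axis) for brevity.

class PredictedObject:
    def __init__(self, position=0.0):
        self.position = position      # what this client renders right now
        self.pending_inputs = []      # locally applied inputs awaiting server confirmation

    def apply_input_locally(self, seq, velocity, dt):
        """Move immediately so the user gets an instant response."""
        self.position += velocity * dt
        self.pending_inputs.append((seq, velocity, dt))

    def on_server_update(self, last_acked_seq, server_position):
        """Adopt the authoritative position, then replay unconfirmed inputs."""
        self.position = server_position
        self.pending_inputs = [p for p in self.pending_inputs if p[0] > last_acked_seq]
        for _, velocity, dt in self.pending_inputs:
            self.position += velocity * dt

obj = PredictedObject()
obj.apply_input_locally(seq=1, velocity=3.0, dt=0.1)          # moves 0.3 m instantly
obj.on_server_update(last_acked_seq=1, server_position=0.28)  # small later correction
```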

The change has also resulted in some behavioural differences with scripted interactions. For example, when firing a scripted gun, the action still requires server-side script processing while the initial movement response is client-side, so it is possible to fire a gun while moving and have the projectile appear to spawn separately from the gun and avatar (e.g. behind or slightly to one side). This is to be looked at if the July 30th update hasn’t fixed it.
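As a rough illustration of why the projectile can appear offset: the avatar keeps moving client-side while the fire request makes its round trip through the server-side script, so the spawn point can lag by roughly the avatar's speed multiplied by that delay. The figures below are assumptions, not measured Sansar values.

```python
# Hypothetical numbers: how far an avatar moves while a scripted "fire" action
# is processed server-side and echoed back to clients.
avatar_speed_mps = 3.0       # assumed running speed, metres per second
script_round_trip_s = 0.15   # assumed client -> server script -> clients delay

apparent_offset_m = avatar_speed_mps * script_round_trip_s
print(f"Projectile appears to spawn ~{apparent_offset_m:.2f} m away from the gun")
# ~0.45 m with these example figures
```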

This work is going to be refined over time to make interactions both responsive and smoother, and is seen as an initial step towards more complex object interactions, such as being able to pick in-world objects up and hold them in the avatar’s hands.

Avatar Location Issue

One side effect of this is that avatars in an experience, when seen by others, can appear to be in a different place to where they actually are. At the meeting, for example, some avatars appeared to be in the local group in their own view (and, I think, to some others), but were still appearing to be at the spawn point for the experience in other people’s views. This seemed to be particularly noticeable with avatars standing still, with movement required to force the server to update everyone’s client on the location of an avatar. A further confusion arising from this issue is that, as voice is based on an avatar’s position relative to your own, if they appear to be much further away, they cannot be heard, even if in their own view they are standing right next to you.
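Since spatialised voice is attenuated by how far away your client believes the speaker to be, a stale position is enough to silence someone who is, in their own view, standing beside you. A toy falloff model (the curve and range are assumptions, not Sansar's actual audio behaviour) makes the point:

```python
import math

def voice_gain(listener_pos, reported_speaker_pos, max_range=30.0):
    """Toy linear falloff: full volume up close, silent beyond max_range.
    The 'reported' position is whatever this client last received, which may
    still be the spawn point rather than where the speaker actually is."""
    distance = math.dist(listener_pos, reported_speaker_pos)
    return max(0.0, 1.0 - distance / max_range)

# Speaker is actually next to the listener, but the client still has the spawn point:
print(voice_gain((0, 0, 0), (1, 0, 0)))    # ~0.97 -> clearly audible
print(voice_gain((0, 0, 0), (80, 0, 0)))   # 0.0  -> silent, as described above
```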

Avatar Locomotion Enhancements

Improvements to avatar locomotion are said to be in development at the Lab. This work includes:

  • The ability to use animation overriders.
  • Additional animation states (e.g. jump).
  • Avatar physics driving – allowing avatars to be affected by physics for things like ballistic movement or falling.

It has been suggested this work should include an ability for the avatar IK to be enabled or disabled alongside creator animations, depending on the animation type being used.

The related idea of client-side scripting requires careful consideration: will creators want their scripts run client-side? Could it be a toggle option, so scripts can be expressly flagged to run on the server only? What would the communications mechanism be between scripts on the client and scripts on the server, to ensure they remain synchronised? Should client scripts be limited to only certain capabilities, with the server still doing the heavy lifting? And so on.
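To make those questions a little more concrete, here is a purely speculative sketch of what a per-script execution flag and capability list might look like; nothing here reflects an actual or planned Sansar design, it simply frames the trade-offs listed above.

```python
# Speculative sketch only: one way of framing "where does this script run?"
# and "what is the client-side half allowed to do?".
from dataclasses import dataclass, field
from enum import Enum

class RunContext(Enum):
    SERVER_ONLY = "server"       # today's model: everything authoritative on the server
    CLIENT_PREDICTED = "client"  # responsive, but must stay synchronised with the server

@dataclass
class ScriptConfig:
    name: str
    context: RunContext = RunContext.SERVER_ONLY
    # capabilities the client-side half may use; anything else remains
    # "heavy lifting" done by the server
    client_capabilities: set = field(default_factory=set)

steering = ScriptConfig(
    name="hover_bike_steering",
    context=RunContext.CLIENT_PREDICTED,
    client_capabilities={"read_input", "move_object"},
)
```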

Vehicles

As noted above, the work on making physics more client-side active is aimed at enabling better vehicles (using the term generically, rather than just meaning road / wheeled vehicles) and their controls in Sansar. This will likely initially take the form of an ability to attach avatars to vehicle objects – and vehicles to avatars, and objects to one another – à la Second Life, allowing them to be “driven” via scripted control. This would allow for very simple vehicle types. From there, the Lab’s thinking is moving in two directions:

  • A scripted approach (client-side?) that would allow greater flexibility in defining vehicles and their capabilities;
  • A “vehicle component” within the platform that could be applied to different vehicle models to enable movement, etc. This would potentially be the easier of the two approaches, but would limit the degree of customisation creators could employ to make it fit specific vehicle types (a rough sketch of what such a component might look like follows this list).
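As a thought experiment on that second option, a “vehicle component” might boil down to a bundle of tunable parameters the platform itself knows how to drive, with customisation limited to whatever those parameters expose. The fields below are invented purely for illustration:

```python
# Invented example of a declarative "vehicle component": the platform would own
# the physics, and creators would only tune parameters like these. Flexibility
# is bounded by whatever fields the component exposes - the trade-off noted above.
from dataclasses import dataclass

@dataclass
class VehicleComponent:
    seats: int = 1
    max_speed_mps: float = 20.0
    acceleration_mps2: float = 5.0
    turn_rate_deg_s: float = 45.0
    hover: bool = False          # e.g. wheeled versus hovering behaviour

go_kart = VehicleComponent(seats=1, max_speed_mps=12.0, turn_rate_deg_s=90.0)
hover_barge = VehicleComponent(seats=4, max_speed_mps=8.0, hover=True)
```

A fully scripted approach would instead leave the motion model to the creator, trading that convenience for flexibility.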

Scene Load Times

There has been much discussion on scene load times from the start with Sansar. While a lot has been done on the Lab’s part to improve things, there are some experiences that still take a long time to load, and some that, depending on the circumstances, may never load. There are really two issues with scene loading:

  • Bandwidth – the biggest.
  • Memory footprint – some experiences can top out with a physical memory footprint of 14.5 GB. For a PC with “just” 16 GB of memory, that represents a struggle. Virtual memory (disk space) can obviously compensate, but can lead to performance degradation (some rough arithmetic follows below).
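To put some rough, assumed numbers on the two points above (nothing below is measured Sansar data, beyond the 14.5 GB footprint quoted at the meeting):

```python
# Back-of-the-envelope arithmetic; the download size and connection speed
# are assumptions for illustration.
scene_download_gb = 2.0    # assumed total asset download for a heavy scene
connection_mbps = 50.0     # assumed home broadband downstream speed

download_minutes = (scene_download_gb * 8 * 1000) / connection_mbps / 60
print(f"~{download_minutes:.1f} minutes just to fetch the assets")  # ~5.3 minutes

scene_memory_gb = 14.5     # footprint quoted at the meeting
system_memory_gb = 16.0
print(f"Headroom left for the OS and client: {system_memory_gb - scene_memory_gb:.1f} GB")
```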

In hard, practical terms, there is little the Lab can directly do to resolve these issues – a person’s bandwidth is whatever their ISP provides, and physical memory is whatever is in the box. However, as noted, there has been a fair amount of work to improve the optimisation of scenes and to improve load times through the way data is handled – notably textures, potentially one of the biggest causes of download problems, and sound files (another big issue). More work is coming, with Lab CEO Ebbe Altberg recently noting a number of options under consideration, by way of the Sansar Discord channel:

  • Progressive texture loading.
  • CDN distribution (for more localised / faster availability of scene objects, materials and textures, rather than having to call them “long distance” through the cloud).
  • Background scene loading.
  • Addition of better LOD capabilities for model loading / rendering (if an object is far away, only load / render the low-detail model; a simple sketch of the idea follows this list).
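The LOD item is the most mechanical of these: the further away an object is, the lower-detail the mesh the client needs to load and render. A minimal, generic sketch of the idea (thresholds and names are assumptions, not Sansar's implementation):

```python
# Generic distance-based LOD selection - the idea, not Sansar's implementation.
LOD_LEVELS = [
    (15.0, "high_detail_mesh"),         # within 15 m: full model
    (60.0, "medium_detail_mesh"),       # 15-60 m: decimated model
    (float("inf"), "low_detail_mesh"),  # beyond that: cheapest proxy
]

def pick_lod(distance_m):
    for max_distance, mesh in LOD_LEVELS:
        if distance_m <= max_distance:
            return mesh

print(pick_lod(8.0))    # high_detail_mesh
print(pick_lod(200.0))  # low_detail_mesh
```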

Further indicators are, I understand, also planned for the Scene Editor, designed to keep experience creators better informed about the load times of objects and elements. Appropriate elements of this information will also be made available in store listings, allowing scene builders to make more informed choices about the items they may be considering buying for inclusion in their experiences. There is also some practical work creators can do to ease things across the board: use smaller textures, decimate their mesh models correctly, re-use sounds and textures, etc.
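On the “use smaller textures” point, it is worth remembering that halving a texture’s resolution cuts its memory (and download) cost to roughly a quarter, which is why it is usually the first optimisation worth making. Illustrative figures only:

```python
# Uncompressed RGBA cost per texture; compressed formats change the absolute
# numbers but not the roughly 4x saving from halving the resolution.
def texture_mb(side_pixels, bytes_per_pixel=4):
    return side_pixels * side_pixels * bytes_per_pixel / (1024 ** 2)

print(f"2048 x 2048: {texture_mb(2048):.0f} MB")  # 16 MB
print(f"1024 x 1024: {texture_mb(1024):.0f} MB")  # 4 MB
```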

In Brief

  • Aggressive render culling: Sansar can employ some aggressive render culling, resulting in objects appearing clipped or vanishing from a scene unexpectedly. This is most obvious with animated objects using bone animations, and is to be looked at.
  • The last few minutes of the meeting were focused on ideas such as a mini-map capability for finding people within an experience; an ability to teleport directly to a friend (“go to”); the ability to offer to teleport someone in an experience to your location; etc.

Carolyn Phoenix at Club LA and Gallery

Club LA and Gallery: Carolyn Phoenix

“There’s a crack in everything. That’s how the light gets in” are the words printed on the invitation to see an exhibition of photographic art by Carolyn Phoenix that recently opened at the Club LA and Gallery, curated by Fuyuko ‘冬子’ Amano (Wintergeist). Whether this is the title of the exhibition or a byline for it, I’m unsure. But I can say that the pieces on offer are hauntingly beautiful in their composition and presentation.

The mezzanine level of the gallery, where the exhibition is being hosted, has been converted into a dark, enclosed space in keeping with the title / byline. On display within it are 20 images by Carolyn, sharing the space with torso mannequins equipped with angel wings that add to the dream-like feel of the environment.

Club LA and Gallery: Carolyn Phoenix

The images themselves are mostly dark in tone and subject – so much so that specific details can be hard to make out beyond the shards, pools or washes of light each image contains. These bursts and flickers and beams of light reflect the title / byline: they have seemingly entered the worlds of these pictures through cracks or holes, or as a result of sunlight breaking through clouds, a lone bulb hanging from a ceiling, or a reflection from somewhere, to reveal things that might otherwise remain unseen.

What these casts of light reveal varies from image to image. Some are mindful of dreams or secret thoughts, often dark in tone – the kind of imaginings we’d rather not shed public light upon, but which nevertheless draw us to them. Others are lighter in nature, simply exulting in the play of light and shadow or the beauty of an artist’s expression of their work; there’s even a hint of playfulness about one.

Club LA and Gallery: Carolyn Phoenix

Some of the images seem to call into focus ideas of identity and of judgement. Teller (seen on the left of the banner image for this review) for example, with its reclined figure looking at a list of eyes from eyeless sockets, tends to suggest the idea of how we present ourselves to the world. The eyes, after all, are the windows of the soul; so how better to project who we might want to appear to be than by selecting our eyes, and only revealing what we want to be seen of ourselves? At the same time there is another potential interpretation: if the eyes are the windows into the soul and thus to who we really are, then how better to remove the potential for the light of understanding to penetrate our inner self than by expunging our eyes altogether, lest we be judged for what lies within.

Judgement is a theme brought into focus by a piece called Verdict (on the left of the image directly above these two paragraphs). But again, the meaning seems to be twofold. On the one hand, the tall figures surrounding the smaller one suggest a fear of judgement; of being looked down upon by others. But closer examination of the smaller subject, catsuited and hooded, perhaps suggests something else: a desire to be judged, to be found wanting and perhaps “punished”. Thus the light haloing the scene perhaps reveals a kink-edged secret the figure at the centre of the image would rather keep hidden from all but a few – or perhaps even takes a guilty pleasure in having it so revealed…

Club LA and Gallery: Carolyn Phoenix

Nuanced throughout, this is a captivating display of photographic art, well worth visiting. And while doing so, why not avail yourself of the exhibitions by tralala Loordes and Sighvatr (worthaboutapig), both of which can be seen or accessed on the ground floor of the gallery.

SLurl Details