Sansar Product Meetings #9: interactivity

The Wednesday February 28th, 2018 Product Meeting

The following notes are taken from the Sansar Product Meeting held on Wednesday, February 28th. The subject for the meeting was Interactivity in Sansar, although more than this was covered. These notes are an attempt to bring together the key points of discussion from the meeting.

These weekly Product Meetings are open to anyone to attend and are a mix of voice (primarily) and text chat. Dates and times are currently floating, so check the Meet-up Announcements and the Sansar Atlas events sections each week. Official notes, when published, can be found here.

Bust a Move Release and Update

  • The Bust a Move (February) release was made after the meeting and includes some of the updates indicated in it – these have been noted below.
  • An update to the Bust a Move release was made on Friday, March 2nd, 2018, specifically aimed at fixing the issue of complex Marvelous Designer clothing items exploding briefly when entering cloth simulation mode.

Roadmap Notes – Interactivity

Aleks summarised the following as the key points for interactivity in Sansar that the Lab has deployed, or is looking to deploy and build on, over the coming months.

  • Keyframed motion type: introduced with the Bust a Move release on March 1st, which allows objects to be moved by object animations and certain script API functions.
    • This means that, while they are capable of movement, they are unaffected by gravity and cannot collide with static or keyframed objects; they can collide with dynamic objects and avatars, but are not moved by them.
    • Keyframed objects are useful for creating things like doors that need to perform a specific motion reliably without adding unnecessary strain to the physics simulation.
    • The Lab acknowledges that “keyframed” in this context is perhaps not the best descriptor for the functionality, and it may be renamed “scripted objects”.
  • A new dynamic objects flag: used to determine whether or not a dynamic object can be picked up.
  • Gravity / Centre of Mass:
    • Initial gravity scaling (0 through 5x “normal”) was introduced with the Bust a Move release. This is straight up-and-down gravity.
    • Does not affect animations / locomotion, etc., at this time (a walk in 0 gee will be the same as in 1 gee, for example).
    • Ability to set centre of mass of objects and adjust the mass of objects via script will be coming.
    • Cylindrical and / or spherical gravity are not being worked on at this time.
    • This opens the way for avatar jumping / flying.
  • Friction and bounce controls are to be made available for both terrain and object surfaces. So ice can be slippery, a lead cannon ball may be set so it doesn’t bounce while a rubber ball does, etc.
  • Scripted object interactions: this will allow creators to define which items support interactivity, and how users can interact with them.
    • Will likely take the form of some kind of indicator which is visible when a mouse pointer (Desktop Mode) or hand (VR mode) is over the object, perhaps with a little tool-tip outlining the available interaction.
  • Manipulation abilities: the Lab is investigating how best to allow creators to define how items are held (e.g. a rifle and a hammer have different means of being held, one to the other) and for users to manipulate the objects they are holding (left hand? right hand? swapping between hands? etc).
  • Resource containers: objects that can be used to contain all of the sounds, animations, textures, or rezzable items which might be used in a scene, and which can then be called via script as needed.
  • New interaction API: to provide abilities such as touching a button to change the colour, intensity, or brightness of a light, pull a lever, etc.
    • Again such capabilities will be introduced iteratively over time.
  • Scripted animation of textures: the ability to animate multiple textures on a surface, and potentially hook into a flipbook of textures.
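The distinction between keyframed and dynamic motion types, and the new 0–5x gravity scaling, can be sketched in a toy physics step. This is purely illustrative Python (not the Sansar script API, which is not shown in these notes); all names and values here are hypothetical.

```python
# Toy sketch of the motion types described above in a simplified physics
# step: only dynamic bodies feel (scaled) gravity; keyframed bodies move
# solely via animation/script; static bodies never move.
from dataclasses import dataclass

GRAVITY = 9.81  # m/s^2, the "1x normal" baseline

@dataclass
class Body:
    motion_type: str        # "static", "keyframed", or "dynamic"
    pickable: bool = False  # stand-in for the new dynamic-object flag
    velocity: float = 0.0   # vertical velocity only, for brevity
    height: float = 10.0

def step(body: Body, dt: float, gravity_scale: float = 1.0) -> None:
    """Advance one frame of the toy simulation."""
    if body.motion_type != "dynamic":
        return  # keyframed/static: unaffected by gravity
    # Clamp to the 0-5x range mentioned in the meeting.
    g = GRAVITY * max(0.0, min(gravity_scale, 5.0))
    body.velocity -= g * dt
    body.height = max(0.0, body.height + body.velocity * dt)

ball = Body(motion_type="dynamic", pickable=True)
door = Body(motion_type="keyframed")
for _ in range(10):
    step(ball, dt=0.1, gravity_scale=2.0)  # falls under doubled gravity
    step(door, dt=0.1)                     # stays put unless animated
```

The point of the split is the one given in the meeting: a keyframed door gets reliable, scripted motion without the cost (or unpredictability) of full physics simulation.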

Persistence Across Sessions

  • The Lab is looking at how to handle data persistence across sessions in Sansar (e.g. progress to date in a quest, or points scored in a game, etc.).
  • It’s not clear if this will be something handled internally within experiences, or by providing experience creators with some form of API, allowing them to create external mechanisms to store such data.
  • Seen as a capability which will be added once more game creation capabilities have been added to Sansar.
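If the Lab goes the "external mechanism via API" route mentioned above, the shape of such a thing might be a simple per-player key-value store. The following is a hypothetical sketch only; none of these names come from Sansar, and the in-memory dictionary stands in for whatever external service a creator might use.

```python
# Hypothetical sketch of per-player state persistence (quest progress,
# score) through a small key-value interface. An in-memory dict stands
# in for an external storage service.
import json

class PlayerStore:
    """In-memory stand-in for an external persistence service."""
    def __init__(self):
        self._data = {}

    def save(self, experience_id: str, player_id: str, state: dict) -> None:
        # Serialise so only plain data (not live objects) is persisted.
        self._data[(experience_id, player_id)] = json.dumps(state)

    def load(self, experience_id: str, player_id: str) -> dict:
        raw = self._data.get((experience_id, player_id))
        return json.loads(raw) if raw else {}

store = PlayerStore()
store.save("castle-quest", "avatar-42", {"quest_stage": 3, "score": 1250})
state = store.load("castle-quest", "avatar-42")  # retrieved next "session"
```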


Emoticons / Emojis

  • Not on the immediate roadmap, and viewed as a “nice to have”, but not a priority at this point.
  • These are being considered with regard to emotes, e.g. type or say “smile” and a smiley pops up over your avatar, or similar.
    • (Quite how this wouldn’t be as “immersion breaking” as having an unobtrusive voice indicator illuminate over an avatar’s head when voice is being used, as some at the Lab claim, totally escapes me.)

Scripted Animation

  • Comprehensive scripted controls (start, stop, select a frame, change speed, etc.) for animations (and eventually leveraging the ability for .FBX files to contain multiple animations?) are being developed.
  • Initial work will be to provide scripted access to one animation on an object, rather than being able to control multiple animations.
  • The ability to swap between different sets of animations is acknowledged, but seen as “further out”.
  • This should allow for a range of NPC-style creations which can respond to avatar proximity.
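The controls listed above (start, stop, select a frame, change speed) could be sketched as a small playback state machine. Again, this is an illustrative Python sketch under assumed names, not the Sansar animation API.

```python
# Hypothetical sketch of scripted animation playback controls:
# start, stop, jump to a frame, and change speed.
class AnimationPlayback:
    def __init__(self, frame_count: int, fps: float = 30.0):
        self.frame_count = frame_count
        self.fps = fps
        self.speed = 1.0      # 1.0 = authored speed
        self.frame = 0.0      # fractional frame position
        self.playing = False

    def play(self):           self.playing = True
    def stop(self):           self.playing = False
    def jump_to(self, frame): self.frame = float(frame % self.frame_count)
    def set_speed(self, s):   self.speed = s

    def tick(self, dt: float) -> int:
        """Advance playback by dt seconds; return the current frame index."""
        if self.playing:
            self.frame = (self.frame + self.fps * self.speed * dt) % self.frame_count
        return int(self.frame)

anim = AnimationPlayback(frame_count=60)
anim.play()
anim.set_speed(2.0)
frame = anim.tick(0.5)  # 0.5 s at 2x speed and 30 fps -> frame 30
```

An NPC script of the kind mentioned above could, for instance, call `play()` on a greeting animation when an avatar comes within range and `stop()` when it leaves; those trigger names are likewise hypothetical.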

Permissions System / Supply Chain

  • Still being worked on by the Lab.
  • Seen as a “sticking point” for many capabilities / options within Sansar.
  • Currently prioritised is a script update system.
    • This won’t extend to objects, which will eventually have their own update mechanism.
    • It is not an automated update process (i.e. a new script is released and all previous versions are automatically updated).
  • Overall, the permissions system / supply chain is split into a number of areas, none of which is set for deployment in the near-term.

Performance Improvements

  • Voice is being revised so that when in an experience, only the voice data for those you can hear speaking is sent to the client, rather than voice data for all users in the experience, regardless of whether or not you can hear them.
  • Avatar LODs are also being looked at to help reduce the amount of avatar data being transferred to the client (do you really need to have every single tri in an avatar’s hair rendered when they are 300 metres away?).
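The avatar LOD idea above is the standard distance-based level-of-detail scheme. As a sketch only (the thresholds here are invented for illustration, not Lab numbers):

```python
# Illustrative distance-based LOD selection: nearer avatars get the
# full-detail mesh, distant ones a progressively coarser one.
def select_lod(distance_m: float) -> int:
    """Return an LOD index: 0 = full detail, higher = coarser mesh."""
    thresholds = [10.0, 50.0, 150.0]  # metres; purely hypothetical
    for lod, limit in enumerate(thresholds):
        if distance_m < limit:
            return lod
    return len(thresholds)  # beyond 150 m: coarsest avatar

select_lod(5.0)    # nearby: full detail, every tri rendered
select_lod(300.0)  # 300 m away: coarsest level, as the note suggests
```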
