The following notes are taken from the 4:00pm PST Sansar Product Meeting held on Friday, December 8th. Product Meetings are usually held every Friday at 9:30am PST and 4:00pm PST, and are open to all. There is currently no set agenda, and the meetings are a mix of voice and text. The official meeting notes are published in the week following each pair of meetings. Venues change each week, and are listed in the Meet-up Announcements and the Sansar Atlas events section.
The afternoon Product Meeting featured a drop-in by Ebbe Altberg, the Lab’s CEO; notes on his comments, with audio extracts, are included in this update.
This is still on-course for deployment – most likely during week #51, commencing Monday, December 18th, 2017. For a summary of some of the items included in the release, please refer to my December 1st Product Meeting notes. The following covers only those items in addition to that breakdown which were noted in the December 8th meetings:
- In addition to clothing, the release will allow the upload and sale of hair attachments, and hair can be removed from an avatar to make it bald. It’s not clear at present if the hair is / can be rigged or not.
- Avatar attachments will no longer be limited to a 1m x 1m x 1m size, but will be limited by the avatar bounding box – precise dimensions will be in the knowledge base for attachments when the release is deployed.
- There will be a snow material type for those wanting to make winter scenes.
- Materials have been made more distinct and more spatial.
- Voice fall-off has been revised so it starts fading from 2 metres away from a person speaking, rather than a metre.
- Some of the UI panels / floaters will be resizable and relocatable within the client; these include:
- The people and chat panels, in run-time mode.
- The inventory, scene object and properties panels, in Edit mode.
- The Event calendar will be available on the Client Atlas, although it will look different to the Web Atlas format.
As previously noted, this release will add the ability for experience creators to edit and change the materials on in-world items (not accessories / clothing) they have obtained via the Sansar Store. However, any such changes will only be applicable while the object in question is within the scene. As soon as it is returned to inventory, any changes made will be lost.
This change comes ahead of any permissions / licensing capability in Sansar, and has caused some upset. However, Cara indicated that when a permissions system is introduced, any items held in inventory will effectively be grandfathered – so it will still not be possible to save changes to their materials back to inventory.
A Store Update had been planned for between the Friends release (October 2017) and the December 2017 release, but this has apparently been pushed back.
Ebbe Altberg dropped into the afternoon Product Meeting, part of his plan to spend more time at Sansar meetings and meet-ups – and took time to answer questions and offered thoughts on the platform.
An Important Note from Ebbe
You have to be careful when you listen to me, because I mix what’s actually going to happen with what we wish will happen all the time. So I can’t promise time frames for some of these things.
So when you listen to me, think about it as general ideas of where we want to go. Whether it happens or not, that’s kind-of a different story.
On Sansar’s Engine
“We made some really tough choices up front,” Ebbe said on the choice of building Sansar’s engine, rather than opting to utilise something like Unreal or Unity. “We would have gotten something much faster to market that would have been usable if we had just gone with an existing engine … But because of the problem we’re solving for, which is user-generated content in massive quantities, going with another engine really becomes problematic over time.”
Essentially, this choice came down to the issues of backwards compatibility within the platform; using a third-party engine in full or in part potentially opens the Lab to content breakage as a result of changes being made to an engine or elements of an engine that are outside their control. This is a lesson they’ve taken to heart with Blocksworld, which is based on Unity, and has had problems over the last five years as a result.
On The Risk of a Large-Scale “Reset” for Sansar
This goal of continued backward compatibility with content available and used within Sansar means the Lab is hoping they’ll never have to do a large-scale “reset” of Sansar which might result in widespread content breakage. However, this can never be guaranteed; there may be times – a significant bug, a major technical issue, the implementation (say) of new software the Lab wants to leverage for Sansar – which might result in content breakage. Should anything like this happen, the hope is there will be advance communication with creators so they understand the issue, together with time allowed for them to swap over to any “new” way of doing things (where applicable) in order to minimise the overall impact.
The Supply Chain / Licensing
The supply chain / licensing (/permissions) system is one of the more complex aspects of Sansar the Lab is still working through.
For those unfamiliar with the idea: in essence, if someone creates an experience intended for re-sale which utilises content made by other creators, and then packages the experience for sale, the supply chain system will ensure those other creators automatically receive a share of each sale of the experience. On a smaller scale, it would mean a building designer could furnish their buildings from a range of furnishing and décor suppliers, rather than having to make everything themselves – and again, the supply chain means those suppliers receive an amount from each sale of the building.
Obviously, this involves considerable added complexity in terms of permissions, licensing, tracking, payment, etc. – so until things are ready, there is no detailed talk on time frames for its introduction – but it is the goal Linden Lab is hoping to achieve.
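To make the idea concrete, the core of such a system amounts to a proportional split of each sale. The sketch below is purely hypothetical – the names, percentages, and split logic are invented for illustration, and nothing here reflects announced Sansar mechanics:

```python
# Hypothetical sketch of supply-chain revenue sharing. All names and
# percentages are invented for illustration; Sansar has announced no
# actual split mechanics or figures.

def split_sale(price, shares):
    """Divide a sale price among creators according to their shares.

    shares maps creator name -> fraction of the sale; fractions must sum to 1.0.
    """
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 1.0"
    return {creator: round(price * fraction, 2)
            for creator, fraction in shares.items()}

# An experience creator keeps 80% of each sale; two (hypothetical)
# furnishing suppliers automatically receive the remainder.
payout = split_sale(100.00, {
    "experience_creator": 0.80,
    "furniture_maker": 0.15,
    "decor_maker": 0.05,
})
print(payout)  # {'experience_creator': 80.0, 'furniture_maker': 15.0, 'decor_maker': 5.0}
```

The hard part the Lab is working through is not this arithmetic, but the permissions, licensing, and tracking needed to know who is in the chain for any given object.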
Avatar Animations for Non-VR Users
When using Sansar in Desktop Mode, the avatars are – in a word – wooden. Changing this is a “high priority” for the Lab. However, how this is to be achieved is still the subject of debate within the Lab. Some want extremely smooth, human-like avatar movement, with blended transitions between animations to give more fluid motion (such as getting up from a chair being a fluid transition from seated to standing, or a turn to the left being a sequenced, fluid body movement). Others prefer a “snappier” transition – as is the case with turning left or right in Desktop Mode at present, or the avatar “jump” from seated to standing seen in Second Life.
Allowing user-created animations and animation systems (e.g. SL-like animation override systems) is currently much further down the road than trying to provide more basic animations within the Sansar locomotion graph.
A Broad Look At The Future
The focus thus far has been building-out the platform, getting the software and infrastructure needed to support it all brought together, with more recent work centred on creator tools and needs and initial avatar development and accessories support – all of which will be continuing. However, Ebbe expects some of the focus in 2018 to start shifting towards more general use of Sansar – including user engagement and user retention, hopefully growing the user base for those who find Sansar usable at this stage of its development.
This does not mean that the focus will entirely shift away from creator tools and capabilities. Rather, it will see more of a blending of things: some work will be focused on the user aspects of the platform – socialising, interactions, etc. – while other work will remain focused on creator tools, and on things like adding more interactive capabilities which can be used within experiences to broaden their appeal.
And his wish for Sansar’s Creator Beta one-year anniversary (July 31st, 2018)? That there are more people using Sansar; that the default experience for someone coming to the platform is that there is life within it: there are people, and there’s vibrancy within experiences, with events and activities to be enjoyed. It is acknowledged that currently, visits to experiences can be lonely, and the Lab will be looking at ways and means to reduce this alongside increasing new user interest / engagement.
Binary Avatar Choice
There has been a lot of discussion on avatar gender types (sparked in part by Ryan’s blog post). In short, offering a broader range of avatars than the current binary option is something the Lab is considering. How this might be achieved isn’t clear. For example, more body shape sliders are likely to be introduced in the future; would these be enough to allow more gender-neutral avatars to be created by users, rather than the Lab providing more neutral avatar forms themselves?
Avatar Customisation: Complexity vs. Simplicity
That avatar shape sliders are to be introduced has raised concerns that they might conflict with any clothing rigged to the avatar models due to be released with the upcoming fashion update. These models will effectively be “fixed” in shape and size, so if clothing is rigged to them, what will happen when – for example – breasts can be resized? The fear is that creators may have to re-generate their clothing items.
The Lab’s hope is that by the time slider modification is introduced, they will have in place an “auto size” capability which will ensure clothing naturally fits a modified avatar shape – a kind of “one size fits all” approach. However, it is acknowledged that even if this is implemented, there might still be some incompatibilities which have to be dealt with.
Essentially, the Lab is trying to learn from Second Life – where avatar customisation and clothing can be terribly complex for both creators and consumers (different clothing objects for different mesh bodies, different “standard” sizes for fitmesh, how alpha masking to hide body parts is handled, etc.), and strike a median between complexity and simplicity, rather than straying too far in one direction compared to the other.
This approach has necessarily started from the “simpler” end of the curve – two avatar types of fixed size respectively and (largely) fixed shape (outside of the face), but the idea is to hopefully build on this over time and add capabilities without slipping into things being too complex and thus off-putting to consumer-style users. In discussing this during the meeting, Ebbe indicated that how the balance might be achieved moving forward, and what it might entail in terms of capabilities vs. limitations on what creators might be able to do, is still the subject of much discussion at the Lab.
This balance between complexity and simplicity is also being pursued on a broader front: for example, providing the infrastructure and back-end systems and services required to run many different types of experience for many different reasons (commercial, fun, gaming, social, education), thus removing the complexities of doing so from creators, while at the same time providing them with the tools to build experiences which are complex enough to engage a target audience, but which avoid overwhelming that same audience with complexities, steep learning curves, etc., when trying to engage in Sansar.
Appearance Changes and “Local” Inventory
Currently in Sansar, changing any aspect of an avatar’s appearance – be it physical changes to the avatar’s face, changing clothing, or adding / removing an attachment – requires effectively dropping out of an experience and into the Avatar App, making the adjustments, and then being dropped back into the experience at the spawn point (not the avatar’s last location).
This means, for example, that even the simple act of walking into a building and taking off your hat and coat breaks up immersion – and even conversation – as you drop out of the experience, enter a separate app, remove the hat, drop back into the experience, then walk back to where you were. For certain types of hoped-for activity in Sansar – such as role-play and gaming – this lack of any “local” or “temporary” inventory, associated directly with the avatar, which allows selected objects to be “carried” around and produced when needed within the scene, is completely immersion-breaking.
This has been raised with the Lab and with Ebbe – and is being taken back to the office for further consideration.
Sansar On Other Platforms
While Sansar is currently Windows-centric (due to Windows being the current platform of choice for high-end consumer VR), it has always been stated that ultimately, the Lab wants to offer Sansar on the Mac and mobile (initially iOS?) platforms – even if the latter is restricted to consumer-only (i.e. non-edit) use. However, the Lab is already keeping themselves honest by ensuring the Sansar client code will compile for iOS and Mac, so they’re ready to make the jump to those platforms when it makes sense to do so.
In Brief
- Object metadata / inspection: currently, objects in Sansar cannot be inspected to determine things like the creator’s name and information. This will be changing in the future to be somewhat more SL-like, with the metadata available for casual inspection.
- This might even reach a point in the future where obtaining metadata on an object (even an item of clothing) could allow the interested party to purchase the item for themselves, without necessarily going to the Store. How this works would also be dependent on how creators license / set the permissions on their items – and possibly whether or not an experience creator wants people to use their experience as something of a store front for goods.
- The ability to buy directly from within experiences could offer significant opportunities for revenue sharing.
- Store Policy / PayPal: the Store policy changes announced in October were in part pushed back to January 2018, in order to give time for alternative payment options (e.g. PayPal) to be put in place by Linden Lab. Work is still progressing on adding PayPal as an option, and as a result, implementation of the updated store policies is still under discussion at the Lab.
- Personal space bubble: currently within Sansar, move too close to an avatar in VR mode and it will vanish due to the “personal space bubble” – thus making it impossible to hug someone in VR. The Lab is looking into possibly revising this.
- This might even be a two-part option: people can get close to those on their friends list (and vice-versa), while strangers continue to vanish.
- Microphone push-to-talk: a big gripe of mine is microphones left open – although in VR mode, it’s not always easy to mute one’s own mic. An easier way of toggling microphones on / off (via a key binding?) is under investigation.
- Experience downloads: requests have been made for assorted performance tools, including a way to monitor how an experience is downloading. This could be as simple as a progress bar, or include information on potential download time, bandwidth usage, etc.
- The Lab is planning to introduce download optimisations, possibly in an early 2018 release, to help smooth things further and help reduce the initial download / load time.
- Experience capacity / instancing: right now, individual instances are capped at around 30 concurrent avatars. There are plans to increase this to perhaps around 100 concurrent avatars per instance. This requires further optimisation work on the Lab’s part, which is being balanced against other work.
- With instancing, the Lab is looking at the means to ensure groups of people can access the same instance of an experience. This could be done via some kind of queuing system / ability to pick an instance of an experience.
- Broadcasting across instances is also being investigated – thus for example, a performer or group could have an audience of 500-600 (or more) within multiple instances, and can interact with that audience, even if the individual instances of the audience cannot see / interact with one another.
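The capacity and instancing points above boil down to a simple fill-and-overflow pattern: avatars fill an instance until it hits the cap, then a new instance is created. The sketch below is a toy illustration only – the 30-avatar cap matches the figure mentioned above, but the assignment logic is an assumption, not Sansar’s actual implementation:

```python
# Toy sketch of capped instancing: each arriving avatar joins the first
# instance with room, and a new instance is created once all existing
# ones are full. The cap of 30 matches the current per-instance limit
# noted above; the logic itself is invented for illustration.

CAP = 30

def assign(avatars, cap=CAP):
    """Return a list of instances, each holding at most `cap` avatars."""
    instances = []
    for avatar in avatars:
        for inst in instances:
            if len(inst) < cap:
                inst.append(avatar)
                break
        else:
            # Every existing instance is full: spin up a new one.
            instances.append([avatar])
    return instances

instances = assign([f"avatar-{i}" for i in range(75)])
print(len(instances))               # 3 instances for 75 avatars
print([len(i) for i in instances])  # [30, 30, 15]
```

A queuing system or instance picker, as mentioned above, would sit on top of this kind of assignment so that a group of friends could deliberately land in the same instance rather than being split by the overflow.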