The following notes are taken from the Web User Group meeting held on Friday, December 19th, 2017. These meetings are generally held on alternate Fridays, and chaired by Alexa and Grumpity Linden at Alexa’s barn. The focus is the Lab’s web properties, which include the Second Life website (including the blogs, Destination Guide, Maps, Search, the Knowledge Base, etc.), Place Pages, Landing Pages (and the join flow for sign-ups), the Marketplace, and so on, as well as the Lab’s own website at lindenlab.com.
Not all of these topics will be discussed at every meeting; rather, the intention within the group is to gain feedback on the web properties, pain points, etc., and as such it is very much led by comments and input from those attending. Along with this are some points of note:
Specific bugs within any web property – be it the Marketplace, forums, Place Pages or anything else – or any specific feature request for a web property, should be raised via the Second Life JIRA.
Alex Linden provides routine updates on the Lab’s SL-facing web properties as and when appropriate, which can be found in the Second Life Web thread.
Note that the SL forums are not covered by the Web User Group, as the management and functionality of the forums fall under the remit of the Support Team.
Lindens in the Web Team
A number of Lindens attend the Web User Group meetings in addition to Grumpity and Alexa (who are part of the Second Life Product team). While they may not be present at every meeting, Linden staff directly involved in supporting the SL web services include:
Spidey Linden: QA Lead for SL Web and Marketplace.
Shrike Linden: a QA tester on the Second Life web team.
Nazz Linden: a web developer who has thus far primarily worked on secondlife.com and the Place Pages.
Natty Linden: a web developer with a focus on the Marketplace.
Sherbert Linden: a web developer working on various SL web properties.
Support Portal Migration
Some people have reported that their support ticket histories are no longer intact. This may be a result of the ongoing migration of data from the old support system to the new system (see here and here for more). If people need to view specific tickets raised prior to the start of 2017, they should raise a new support ticket including details of the ticket in question; the support team should then be able to access the old ticket and provide any information on it.
360 Snapshot Viewer

Currently a project viewer (version 18.104.22.1686743 at the time of writing), this is still in the process of being updated to offer higher-resolution 360-degree images taken in Second Life, and to allow the uploading of 360 images to Place Pages (alongside the other viewer snapshot upload options).
Feature Requests

Feature requests are suggestions forwarded to the Lab on ideas and improvements which might be added / made to Second Life. They are raised via the Second Life JIRA:
Once logged-in to your Dashboard, click on Create Issue (top right of the window).
A pop-up Create Issue form is displayed.
Click on the right of the Issue Type box on the form to display a drop-down, and select New Feature Request.
When filing a feature request, provide as much information as possible, as clearly and concisely as you can: what the feature request is, what it is for, why it should be considered beneficial, what it might help improve, how it might work, and so on, as these things apply.
If you are requesting a UI change to the viewer, and can include images of proposed changes or new floaters / panels the feature would require, be sure to attach them.
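For those more comfortable scripting than using the web form, standard Jira instances also accept issues through their REST API, and in principle the same feature request could be built and submitted that way. The sketch below is purely illustrative: the project key, the exact issue type name, and whether the Second Life JIRA permits API submissions at all are assumptions to verify against the instance itself; the sketch only builds and prints the payload rather than sending it.

```python
# Hypothetical sketch: composing a "New Feature Request" payload in the
# shape the standard Jira "create issue" REST endpoint expects.
# The project key "BUG" and the issue type name are placeholders - check
# the Second Life JIRA itself before relying on either.
import json

def build_feature_request(summary, description, project_key="BUG"):
    """Return a dict matching the standard Jira create-issue JSON format."""
    return {
        "fields": {
            "project": {"key": project_key},
            "issuetype": {"name": "New Feature Request"},
            "summary": summary,
            "description": description,
        }
    }

payload = build_feature_request(
    "Example: add a sort option to Place Pages search results",
    "What it is for, why it would be beneficial, how it might work...",
)
print(json.dumps(payload, indent=2))
```

Submitting it would then be a single authenticated POST to the instance's create-issue endpoint, but as noted, the web form remains the documented route.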
In 2017, 383 feature requests were filed via JIRA. Of these, 167 (roughly 44%) were accepted by Linden Lab for transfer into their internal JIRA system. It’s not clear how many of the accepted items were eventually actioned, but the figures nevertheless show that feature requests are triaged, with some accepted for current or future consideration and possible implementation.
When launched on Valentine’s Day 2012, One Billion Rising (OBR) was the biggest mass action in human history; a call to action based on the staggering statistic that 1 in 3 women on the planet will be beaten or raped during her lifetime. With the world population at 7 billion, this adds up to more than one billion women and girls who are at risk. OBR aims to bring people together, raise greater awareness of the plight of those at risk the world over, and bring about a fundamental change in how vulnerable and defenceless women and girls are treated.
Since its inception, One Billion Rising has grown and its local campaigns have deepened; its focus has also broadened to include economic violence and the violence of poverty, racial violence, gender violence, violence caused by corruption, occupation and aggression, violence caused by environmental disasters, climate change and environmental plunder, violence impacting women in the context of state-sponsored wars, militarisation and the worsening internal and international displacement of millions of people, and violence created by corporate greed, among so many others.
On Wednesday, February 14th, 2018, One Billion Rising continues to sustain the theme of “Solidarity Against the Exploitation of Women”, and activities in the Second Life event will be focused on a four-region stage where 200 people can come together to dance, surrounded by an area of art installations, an arena for poetry and dramatic productions, and informational exhibits. Activities start at 00:00 SLT on the morning of February 14th, and will continue through a full 24 hours across the OBR regions.
To support the event, the organisers are currently seeking volunteer stage managers, security helpers, greeters, and general volunteers to help gather information, etc. Bloggers interested in covering the event both in the run-up and on the day itself are also being sought, as are sponsors to help cover the cost of the regions.
If you are interested in helping with any of these aspects of the event, please follow the links below:
The following notes are taken from the Sansar Product Meetings held at 4:00pm PST on Friday, January 19th, 2018. These Product Meetings are open to anyone to attend, are a mix of voice (primarily) and text chat, and there is currently no set agenda. The official meeting notes are published in the week following each pair of meetings, while venues change each week and are listed in the Meet-up Announcements and the Sansar Atlas events section.
Ebbe Altberg and Paul (aka Pierre), Alex and Nyx Linden from the Sansar product team joined meeting host Jennifer for the event. Audio extracts from the meeting are included below for reference to key points in the discussions. Note that some subjects were discussed at different points in the meeting, and so some of the audio extracts here represent a concatenation of the different points at which a particular topic may have been discussed.
Web Atlas – Concurrency Indicators
The Web Atlas has been updated with a new search category – Popularity – and concurrency indicator. When selected from the sort drop-down menu (see below), the Popularity option orders the listed experiences in a tab by current real-time use, so those with avatars actually visiting them will be listed first.
In addition, those experiences with avatars in them have a concurrency indicator in the top left corner of their thumbnail image. This is a near real-time indicator that the experience is in use at the time it is seen in the Atlas.
The popularity search option and the concurrency indicator will be added to the client Atlas, possibly as an update in week #4. Both present a first step in presenting users with more information on popular experiences and in helping them locate spaces which have a “social” presence in Sansar.
Sansar Store Tags
It is now possible for creators to tag items when creating Sansar Store listings, and the Store Guidelines have been updated to reflect this.
The January 2018 release, referred to internally at the Lab as “Release 17” (the Fashion release having been Release 16), is primarily code / performance focused. In particular this update includes:
Performance improvements – for example, the amount of data sent to the client for avatar and dynamic object animations has been reduced by some 60%, which will hopefully make things more fluid for users in busy experiences.
An experience loading progress bar has been coded, although the scene loading page has yet to be revised to show it, and it is hoped this will be in Release 17, or deployed shortly thereafter.
User Sign-up / On-Boarding Process
The Sansar product team believe the current sign-up / on-boarding process for Sansar (see here for the basics) is too complex. It is hoped that a more streamlined sign-up process will form the nucleus of the February 2018 release, and that these updates, together with the Atlas popularity ratings / indicators, will make it easier for incoming users to sign-up and start finding experiences where they can meet and interact with other Sansar users.
Under discussion at the Lab is whether or not to create a dedicated “on-boarding” experience towards which incoming new users could be directed following sign-up, rather than just leaving them to find their way around the Atlas. This would not be part of the February release, and could be more of an exercise in testing which route – via sign-up and then Atlas, or sign-up and a “learning / tutorial” experience – is preferred by incoming users / helps improve retention levels among new users.
One issue with providing any “centralised” on-boarding experience is how it will sit with user-created experiences. Part of the idea with Sansar is not to have a central / main “gateway” into the platform (as is the case with Second Life), but to allow experience creators to develop their own gateways directly to their own experiences (e.g. through a dedicated web presence, a corporate website, or via Facebook or Twitter, etc.). So, how do any on-boarding experiences supplied by the Lab fit with these routes of access?
Should a user signing-up to Sansar through a specific experience gateway be “diverted” to a Lab-created learning experience and then dropped into the experience they were signing-up to join? If so, how exactly should that work? Should they simply be dropped into the experience they were expecting, and be left to work it out for themselves / complete any tutorial options provided by the experience creator?
There’s also the question of how deep any on-boarding experience needs to go – can things be made easier to understand through the client itself, by keeping the UI straightforward, offering on-screen indicators for controller button options when required, and so on?
Mentors / Greeters
A suggestion was made to have a “learning welcome” space where volunteer “Sansar ambassadors” (akin to Second Life mentors) can spend time helping new arrivals gain familiarity with using the Sansar client – the Atlas, settings, walking, running, chatting in text, IMing, etc.
In response, Ebbe noted that – contrary to anecdotal views in Second Life – having mentors (either at their own welcome environments or those at the various Community Gateways in operation around Second Life) does not actually lead to any greater level of retention among new users than the self-teach environments that have been presented to incoming users over the years. However, there is a willingness to experiment with methods, with the use of AI-driven NPCs or the provision of some kind of “learning HUD” also being mentioned as possible options to help new users.
Pierre reiterated his comments from the previous Product Meeting, that additional tools to help creators / users mount and promote their own experiences will be appearing in the very near future. This is again seen as a component in helping to drive user interest in Sansar.
Avatar and Fashion
Currently, the Sansar avatar is not – outside of the head / face – customisable other than with clothing and accessories. As recorded in my 2018 week #2 notes, there are plans to enhance the degree of customisation available within the avatar, starting with the head, and then with work on the body. This led to concerns on how additional avatar customisation capabilities might impact clothing design. Animator and creator Medhue Simoni in particular laid out his concerns in a video on the matter, which apparently became the subject of discussion at the first Sansar Fashion Product Meeting (which I was unable to attend), with it being indicated that the Lab’s Fashion Team had watched the video, taken note of the concerns raised etc.
As the avatar is enhanced, there may well be a need for clothing designers to go back and re-rig clothing created outside of Marvelous Designer (MD), although it should be possible to re-simulate MD clothing over a changed avatar body shape once this capability has been enabled with Sansar.
In the short-term for Fashion, there will be an update to fix the UV issues people are experiencing with MD, wherein the export to Sansar is using a different UV space to the export to other formats. However, the avatar customisation capabilities will be added gradually over a longer period of time.
To help compensate for the avatar updates, requests have been made for a deformer mechanism to be added to Sansar to allow rigged mesh clothing to more easily adjust to the avatar shape (and changes to it – think Fitmesh in Second Life as a broad idea), while potentially avoiding the need for clothing to be supplied in a range of sizes. This may not be so easy to introduce.
However – and whether this will prove possible to implement is still unknown – the Lab is trying to determine if, where different clothing sizes are required, the platform itself can auto-generate the default sizes rather than the designer having to upload them all (e.g. if a designer uploaded an item in a “standard medium” size, the “standard large” and “standard small” sizes would be auto-generated from it).
One of the things the Lab wants to do is keep the fashion creation flow relatively straightforward, and avoid placing too many requirements on designers. They are therefore keen to avoid things like morph-based solutions and blend shapes (thus sparing designers from having to implement a whole series of body morphs into their designs, or having to run through some conversion process to handle blend shapes, etc.).