Visual Outfits Browser and VLC Media project viewers

The Lab has recently released two new project viewers: the VLC Media Plugin viewer and the Visual Outfits Browser viewer.

As they are both project viewers, they are not in the viewer release channel, and must be manually downloaded and installed via the Lab’s Alternate Viewers wiki page. Also, as they are project viewers, they are subject to change (including change based on feedback), and may be buggy.

The following notes are intended to provide a brief overview of both. Should you decide to download and test either, please do file a JIRA against any reproducible issues / bugs, giving as much information as possible, including the info from Help > About Second Life and any log files you feel may be relevant.

Visual Outfits Browser

The Visual Outfits Browser (VOB) viewer,  version 4.0.6.316123, appeared on Monday, June 6th. Simply put, it allows you to use the Appearance floater to capture / upload / select images of your outfits and save them against the outfits in a new Outfit Gallery tab within the floater.

Creating outfit thumbnails
The new Outfit Gallery tab in the Visual Outfit Browser allows you to create photos of any outfits saved to My Outfits and use them as thumbnails. You can then scan your outfits in the Appearance floater to decide what to wear, and use the context menu to wear the one you want.

The new Outfits Gallery tab (right-click your avatar > select My Appearance > Outfits Gallery) should display all of your created outfits as a series of folder icons, each one displaying the name of the outfit beneath it. You can replace these icons with an image of the outfit in one of three ways:

  • You can wear the outfit, then right-click on its associated folder icon and select Take a Snapshot (shown above left). This will open the snapshot floater with save to inventory selected by default, allowing you to photograph yourself wearing the outfit and upload the image to SL, where it automatically replaces the folder icon for the outfit
  • You can use Upload Photo to upload an image of the outfit you previously saved to your hard drive, and have it replace the folder icon
  • You can use Select Photo to select an image previously saved to your inventory, and use that to replace the folder icon for the outfit.

When using the capability there are a number of points to keep in mind:

  • Both the Take a Snapshot and the Upload Photo options will incur the L$10 upload fee, with the images themselves saved in your Textures folder
  • In all three cases, a link to the original image is placed in the outfit folder
  • This approach only works for outfits you’ve created using the Appearance floater / the Outfits tab. It doesn’t work for any other folders where you might have outfits – such as the Clothing folder.

Feedback

How useful people find this is open to debate; I actually don’t use the Outfits capability in the viewer, as I find it clumsy and inefficient for my needs. However, pointing people towards the Appearance floater in order to preview outfits, when most of us tend to work from within our inventories, seems somewhat counter-intuitive.

As such, it’s hard to fathom why the Lab didn’t elect to include something akin to Catznip’s texture preview capability within the VOB functionality. This allows a user to open their Inventory and simply hover their mouse over a texture / image to generate a preview of it (as seen on the right).

Offering a similar capability within the VOB viewer would, I’d suggest, offer a far more elegant and flexible means of using the new capability than is currently the case*. Users would have the choice of previewing outfits either via the Outfits Gallery tab in the Appearance floater or from within Inventory.

There are also a number of wardrobe systems available through the Marketplace. While these may require RLV functionality and come at a price, they may still be seen as offering a more flexible approach to managing and previewing outfits. As such, it will be interesting  to see how the VOB capabilities are received by those with very large outfit wardrobes.

VLC Media Plugin Viewer

As Apple recently announced they are no longer supporting QuickTime for Windows and will not be offering security updates for it going forward, the Lab is looking to remove all reliance on the QuickTime media plugin, which is used to play back media types like MP3, MPEG-4 and MOV, from its viewer, and replace it with LibVLC (https://wiki.videolan.org/LibVLC/).

This project viewer – version 4.0.6.316087 at the time of writing – replaces QuickTime with LibVLC support for the Windows version of the viewer only. The OS X viewer is currently unchanged, as Apple are continuing to support QuickTime on that OS. However, the Lab note that they will eventually also move the OS X version of the viewer to use LibVLC as their 64-bit versions of the viewer start to appear, since the QuickTime APIs are Carbon-based and not available in 64-bit.
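For the curious, the following is a minimal sketch of what driving LibVLC looks like from application code, using its public C API. It is purely illustrative and not the Lab’s plugin code; the media URL and the ten-second playback window are placeholder assumptions.

    #include <vlc/vlc.h>   // public libVLC C API (link with -lvlc)
    #include <chrono>
    #include <thread>

    int main()
    {
        // Create a libVLC instance with no extra command-line options.
        libvlc_instance_t* vlc = libvlc_new(0, nullptr);
        if (!vlc)
            return 1;

        // Point libVLC at a media source; the URL here is just a placeholder.
        libvlc_media_t* media =
            libvlc_media_new_location(vlc, "http://example.com/sample.mp4");

        // Create a player for the media and start playback.
        libvlc_media_player_t* player = libvlc_media_player_new_from_media(media);
        libvlc_media_release(media);   // the player holds its own reference
        libvlc_media_player_play(player);

        // Let it play for a short while, then stop and clean up.
        std::this_thread::sleep_for(std::chrono::seconds(10));
        libvlc_media_player_stop(player);
        libvlc_media_player_release(player);
        libvlc_release(vlc);
        return 0;
    }

The attraction of LibVLC is that one cross-platform library can handle MP3, MPEG-4, MOV and a wide range of other formats, removing the QuickTime dependency entirely.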

*I’ve been informed, and hadn’t appreciated, that this approach can be graphics memory intensive – see FIRE-933.

Grandfathered buy-down contributing to Lindex fluctuations?

The Lindex has been in a state of flux of late, something that has been the subject of discussion and speculation on a number of fronts. Reader Ample Clarity first pointed things out to me earlier last week via IM (I’ve been rather focused on other things of late, so haven’t been watching the broader news as much as I should), and I’ve been dipping in-and-out of conversations and reports on things since then.

The fluctuations started towards the end of 2015, and were perhaps first discussed on the pages of SL Universe. The discussion resumed in April, when further swings were noted,  causing additional concern among those looking to cash-out L$ balances, while sparking some of the more widespread discussion.

Lindex fluctuations (with thanks to Eku Zhong for the screen capture)

Various theories (and not a few conspiracies) have been put forward to explain what has been happening – although precisely what the cause is remains pretty much anyone’s guess. But purely in terms of the more recent fluctuations, New World Notes (NWN) is promoting a theory which might just be plausible: that one (or more) large land estates have been liquidating L$ stocks in order to realise additional US dollar funds to take advantage of the Lab’s grandfathered buy-down offer.

The theory actually comes from Plurker T-Kesserex, who is quoted by NWN as saying:

I think it’s people cashing out to get capital for the $600 dollar sim price reduction … If you own 10 sims you need $6000, so that’s not easy without some cashing out.

At the start of the buy-down offer, Tyche Shepherd, of Grid Survey fame, estimated that around 85% of Homestead regions were already grandfathered, but only around 11% of full-priced regions of all types, leaving enormous potential in the market. During the first month of the offer, that 11% figure increased to almost 21%, with the number of grandfathered full-priced regions rising from around 1,039 to 1,960, demonstrating a thirst for conversion. Thus, the idea that one or more large estates might be liquidating L$ stocks to cover the cost of further conversions isn’t an unreasonable speculation.
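As a quick sanity check on those numbers: if around 1,039 grandfathered regions represented roughly 11% of all full-priced regions, the grid held somewhere in the order of 9,400-9,500 such regions in total, and 1,960 of that total does indeed work out at close to 21%.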

But even if it is a fair assessment of the situation, it doesn’t offer any hint as to what  – market forces or otherwise – has been pushing at the Lindex since late 2015. Nor does it offer any comfort to those concerned about cashing out at a reasonable – or at least stable – rate. All that can be said for certain is that, if you have the need for L$ in your account, buying them hasn’t been this attractive in a good while.

Avatar Complexity and Graphics Presets in Second Life

Avatar Complexity is a means to help people who may suffer from performance issues in crowded areas

On Wednesday, May 18th, Linden Lab promoted the long-awaited Quick Graphics viewer to de facto release status. This viewer includes two important new features:

  • The updated Avatar Complexity settings
  • The ability to create, save and load different groups of graphics settings quickly and easily.

Avatar Complexity

Avatars can often be the single biggest impact on the viewer in terms of rendering, particularly in crowded places. Avatar Complexity therefore adds a new slider to the viewer which can be used to set a level above which avatars requiring a lot of processing will appear as a solid colour (the casual term for them being “Jelly Dolls”), greatly reducing the load placed on a system compared to having to render them in detail, and so improving performance.

The idea is that you can adjust the setting according to circumstance, so that when in a crowded area with lots of avatars, you can dial down the Avatar Complexity setting, found in Preferences > Graphics (and in the Advanced Settings floater), with the result that more of the avatars around you are rendered as solid colours, reducing the load on your graphics card and system, thus improving performance. Then, in quieter areas, the setting can be dialled back up, allowing more avatars to fully render in your view.

Note: this only applies to other avatars in your world view: your own avatar will always fully render in your view.

The Avatar Maximum Complexity slider sets a threshold on avatar rendering by your viewer. Any avatars in your view exceeding this value will be rendered as a “Jelly Doll”, sans attachments
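For those who like to see the idea written down, here is a simplified sketch of the decision the slider implies. It is my own illustration in C++, not the viewer’s actual rendering code, and the avatar names, complexity figures and threshold value are made up for the example.

    #include <cstdint>
    #include <iostream>
    #include <string>
    #include <vector>

    // Hypothetical per-avatar data, for illustration only.
    struct Avatar
    {
        std::string name;
        uint32_t    complexity;   // reported rendering cost of the avatar
        bool        isSelf;       // your own avatar always renders in full
    };

    enum class RenderMode { Full, JellyDoll };

    // Decide how an avatar should be drawn for a given Avatar Maximum
    // Complexity setting (0 is treated as "No Limit" in this sketch).
    RenderMode chooseRenderMode(const Avatar& av, uint32_t maxComplexity)
    {
        if (av.isSelf || maxComplexity == 0)
            return RenderMode::Full;
        return (av.complexity > maxComplexity) ? RenderMode::JellyDoll
                                               : RenderMode::Full;
    }

    int main()
    {
        const uint32_t maxComplexity = 80000;   // example slider value
        const std::vector<Avatar> nearby = {
            { "Me",       250000, true  },
            { "FriendA",   60000, false },
            { "Stranger", 150000, false },
        };

        for (const auto& av : nearby)
        {
            const bool jelly =
                chooseRenderMode(av, maxComplexity) == RenderMode::JellyDoll;
            std::cout << av.name
                      << (jelly ? " -> jelly doll\n" : " -> full render\n");
        }
        return 0;
    }

(In the viewer itself, the per-avatar “Render Fully” and “Do Not Render” options described below override this threshold check.)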

If you have a good system with a high-end graphics card, you can set the value on the slider quite high and thus ensure all avatars render fully for you wherever you are.

Note: You can set Avatar Maximum Complexity to “No Limit”. However, this is not recommended: some irritants in Second Life still use worn graphics crashers to overload GPUs and crash the viewer. If you set Avatar Maximum Complexity to “No Limit”, such tools will still be effective should you ever encounter an irritant using one; it is better to set a reasonably high value, leaving your viewer with a cut-off point which should defeat their efforts to crash you.

There are a few other points to note with Avatar Complexity:

  • You can opt to always render or to not render avatars around you, regardless of your Avatar Maximum Complexity setting, by right-clicking on them and selecting your desired action from the context menu
  • To help you understand how complex your own avatar is, each time you change the appearance of your avatar, a small notice with your new complexity value will appear in the upper right of your display for a few seconds
  • The complexity value of your avatar is transmitted to each simulator as you travel around Second Life. In return, you’ll get a brief notice in the upper right of your screen telling you approximately how many of those around you are (or are not) rendering you because of your complexity
  • If you have a friend or friends you wish to see fully rendered no matter how low you dial Avatar Maximum Complexity (while out at a club, for example, where it may be beneficial to set a lower complexity threshold), you can right-click on those individuals and select “Render Fully” from the context menu
  • Similarly, and if you prefer, you can selectively render avatars in your view as grey imposters by right-clicking on them and selecting “Do Not Render” from the context menu.

Note: Both “Render Fully” and “Do Not Render” will only apply during your current log-in session; the options are not persistent between re-logs.

To help people understand Avatar Complexity, the Lab has produced the following:

  • A blog post to accompany the promotion of the Quick Graphics viewer to release status
  • An Avatar Complexity Knowledge Base article
  • A video tutorial, which I’ve embedded below.

https://www.youtube.com/watch?v=PxWrqd0o3dc


Of outages and feedback

I normally keep a close eye on outgoing communications from the Lab, but this week I’ve had other things distracting me, and so haven’t been keeping an eye on the official blog for posts and updates. My thanks, therefore, to reader BazdeSantis for pointing me to April Linden’s Tools and Technology update, The Story Behind Last Week’s Unexpected Downtime.

April has very much become the voice of the Lab’s Operations team, and has provided us with some excellent insights into why things have sometimes gone wrong – a valuable exercise, as it increases both our understanding of the complexities inherent in Second Life and our appreciation of what is likely to be going on behind the scenes when things do go drastically sideways.

April’s post refers to the issues experienced on Friday May 6th, when a primary node of a central database failed, with April noting:

The database node that crashed holds some of the most core data to Second Life, and a whole lot of things stop working when it’s inaccessible, as a lot of Residents saw.

When the primary node in this database is off-line we turn off a bunch of services, so that we can bring the grid back up in a controlled manner by turning them back on one at a time.

There’s an interesting point to note here. This is the same – or a very similar – issue to that which occurred in January 2016, which again goes to show that, given the constant usage it sees, Second Life is a volatile service – and that the Operations team are capable of turning major issues around in a remarkably short time: around 90 minutes in January, and less than an hour this last time.

Both events were coupled with unexpected parallel issues: in January, the database issue was followed by problems with one of the Lab’s service providers, which did take a while to sort out. This time it was the Grid Status service. As I’ve recently reported, the Grid Status web pages have recently moved to a new provider. A couple of changes resulting from this have been to the RSS feed and to the integration of the Grid Status reporting pages with the rest of the Lab’s Lithium-based blog / forum service. However, as April also notes:

It can be really hard to tune a system for something like a status blog, because the traffic will go from its normal amount to many, many times that very suddenly. We see we now have some additional tuning we need to do with the status blog now that it’s in its new home.

She also points out that people on Twitter can track the situation with Second Life by following the Grid Status Twitter account.

April’s posts are always welcome and well worth reading, and this one is no exception. We obviously don’t like it when things go wrong, but it’s impossible for SL to be all plain sailing. So, as I’ve said before (and above), hearing just what goes on behind the scenes to fix things when they do go wrong helps remind and inform us just how hard the Lab actually does work to keep the complexities of a 13-year-old platform running for us to enjoy.

 

Lab Chat #3 in 10-ish minutes

Lab Chat #3: Troy, Oz and Ebbe

Friday, May 6th saw the third in the Lab Chat series take place in-world, featuring guests Oz Linden, the Director of Second Life Engineering; Troy Linden, a Senior Producer of Second Life; and, of course, Linden Lab CEO Ebbe Altberg, in his alter-ego of Ebbe Linden.

You can find the full transcript, with audio extracts, as previously published in these pages by following this link.

However, I’ve been asked by a number of people if I could summarise things, rather than them having to read the entire transcript or just having a list of up-front links. I’ve therefore produced this summary, complete with links to the full answers within the transcript. If this approach proves popular with readers, I’ll adopt it as the lead-in to future transcripts.

Work in progress: Aki Shichiroji demonstrates a wearable wyvern utilising Bento bones for animation.

Project Bento

  • How will creators make poses and animations for the new bones (wings, fingers, facial expressions, etc)? Creators will be able to use existing plug-ins (MayaStar, Avastar) to create animation content for Project Bento as is currently the case. Full answer.
  • Will there be any in-world tools for Bento pose and animation creation? At this point, Second Life doesn’t have any in-world animation creation tools, and Bento does not attempt to add them. Instead it leverages existing out-world tools. Full answer.
  • Will Bento bones have the ability to be animated (or posed) separately? Yes. Second Life already supports isolating animations to certain parts of the avatar, and Bento is no different. Full answer.
  • Will any of the work on the Bento facial bones be incorporated into the default/system avatar for expressions, etc? The default system avatar has not at this point been re-rigged to use the new Bento bones. However, custom mesh heads, when rigged to the bones, will be able to make use of them. Full answer.
  • Will there be, or are there any plans to introduce, animated mesh into Second Life (e.g. animated pets, etc.)? No comment on whether or not animated meshes will be supported in the future. However, Bento bones can be used to provide a level of animation of creatures or objects attached to an avatar (e.g. bats flying around your head). Full answer.
  • Will any attempts be made to have the new bones be scriptable for use in user-created animation rigs like Anypose? There are no plans to add scripting capabilities that are specific to Bento at this time. Full answer.
  • Can some Bento UG meetings be held at an “Asia friendly” time? It will be looked into. Full answer.

Second Life

The new Experience Keys based Social Islands – see below
  • Can we have tools inside inventory to help manage it?  The Lab is focused on improving inventory operation robustness, and will have a new viewer offering this soon. Better inventory management interfaces and tools are a terrific idea, and something TPVs could even contribute. Full answer.
  • Will we see similar edutainment-type experiences as the new social islands, but aimed at more advanced users? Yes, very probably in time. Full answer.
  • Why doesn’t Second Life have gift cards which can be purchased in stores like other games? Probably more interesting to think of ways to sort-of refer a friend, maybe, with an associated gift card to get them into the world. But something to examine. Full answer.
  • Any plans to provide more robust photography tools similar to Firestorm’s Phototools? Will existing tools be updated? Lab prefers not to comment on things until close to release; photography floater updates an excellent opportunity for TPV / open-source contributions. Full answer.
  • Can sound files be increased in length beyond the 10 second limit? Yes, and animation file sizes can be increased. By how much isn’t clear, and the work will be dependent on moving the assets to CDN delivery first. Full answer.
  • Will we be able to texture more than 8 faces when editing mesh in-world?  The change made in Sept 2015 refers to allowing more than 8 textureable faces as a part of the upload process, not to in-world editing. No further changes planned at present. Full answer.
  • Will any similar incentive to the private island buy-down offer be presented to Mainland owners? Not at present. Time is required to analyse the wider impact of the buy-down offer and determine its overall benefit (or otherwise). So nothing planned for Mainland at the moment or immediate future. Full answer.
  • Will anything be done to address vehicle region crossing issues, particularly with large vehicles, which have become worse over the past year? Lab not aware of any changes that should have made things worse, but will look into matters. However, large vehicles have always been problematic on region crossings, so no promises. Full answer.
  • Will RLV functionality be added to the official viewer? Longer-term, Lab will add more capabilities to Experience Keys which will be similar to, but not compatible with, RLV. Full answer.
  • Will Experience Keys be opened to Basic members to create Experiences? Experience Keys will remain Premium-only due to potential griefing abuse. Premium helps ensure accountability. Full answer.
  • Will Experience Search (and other search) be improved? The  current focus is the Marketplace search beta, using Elasticsearch. This will likely become the default MP search engine soon. The Lab may then use Elasticsearch on other search capabilities. Full answer.
  • Will the Marketplace Listing Enhancement issues & JIRAs be addressed? The Lab believes they have a fix for a major cause, which is in the process of being implemented and may clear up most issues. Full answer.
  • Can the number of Estate Managers be increased? Will be looked at. Full answer.
  • What’s the best way to report group spammers? Single or multiple reports? Via the Abuse Report system. Quality of report, not quantity, is important; many reports aren’t actionable as they are incomplete. Full answer.
  • Does LL give employees time to use SL? Yes & all staff are encouraged to spend time in SL when first starting. Oz Linden also looks to recruit from SL users where possible. Full answer.
  • Any thoughts on Vulkan graphics support for SL? For SL, no. Sansar, yes.
  • Can we have an update on Linden Realms and the grid hunt games available through the portal parks? New Linden content is coming, but no details given.


Reminder: Lab Chat #3, May 6th with Ebbe, Oz, Troy and Bento

Lab Chat is the name of the public Q&A series aimed at providing Second Life users with the opportunity to have their questions put to Lab management and personnel.

The first two sessions in the series took place in November 2015 and January 2016 respectively, with guest Ebbe Altberg, CEO of Linden Lab. Each event covered both Second Life and Project Sansar and saw Ebbe respond to questions selected from those put forward to a forum thread ahead of each event.

The third in the series will take place on Friday, May 6th, starting at 10:30 SLT at the Linden Endowment for the Arts Theatre. The guests for this session will be:

Ebbe Linden (Ebbe Altberg, the Lab’s CEO), who requires no introduction here. He’ll obviously be answering any questions on Project Sansar which are raised during the show.

Oz Linden is the Director of Second Life Engineering at Linden Lab, and is perhaps most noted for his involvement with viewer development, including contributions from the open-source community and TPVs. He oversees almost all aspects of the technical development of Second Life, both viewer and server, and works closely with his engineers and developers to ensure Second Life continues to be enhanced.

Troy Linden is a Senior Producer of Second Life at the Lab. He has been involved in bringing numerous high-profile projects within SL to fruition, and is currently engaged in Project Bento, the project to greatly extend the Second Life avatar skeleton, which Oz’s team is currently working on together with members of the SL content creation community.

Because both Oz and Troy will be present at the show, the majority of the questions this time around will be focused on Second Life and Project Bento. This is a great opportunity to find out what is being planned for Second Life, what Project Bento is all about, and what it might mean for you.

Among many other things, Bento offers the potential for animated facial expressions and animated fingers (shown in this video by Abramelin Wolfe) on mesh avatar models

The show will be recorded in audio, which will be made available some time after the show has wrapped. I hope to attend and produce a full transcript, and those wishing to catch up on the first two Lab Chat sessions through this blog can do so by following the links below:

For those who prefer, videos of the first two sessions can be found on YouTube:

LEA Theatre SLurls