Reflections on a prim: a potential way to create mirrors in SL

Update: just after pushing this out (slightly prematurely – thank you, Mona, for pointing out the error), Gwenners poked me on Twitter and reminded me of the 2006 experiments with reflections, and supplied some links to shots from those heady days.

The ability to have honest-to-goodness mirror surfaces in Second Life which could reflect the world – and avatars – around them has often been asked for over the years, but has tended to be avoided by the Lab, as it has been seen as potentially resource-intensive and not the easiest thing to achieve. As a result, people have in the past played around with various means of trying to create in-world mirrors.

Zonja Capalini posted an article on using Linden water as an avatar mirror as far back as 2009

Zonja Capalini, for example, was perhaps one of the first to blog about using Linden water as a mirror (or at least the first I came across, thanks to Chestnut Rau and Whiskey Monday), and she certainly came up with some interesting results, as shown on the right, which I tried out for myself back in 2012.

However, achieving results in this way is time-consuming and not always practical; you either have to purpose-build a set, or try shoving a jack under a region and hope you can persuade it to tip over on its side…

But there is hope on the horizon that perhaps we may yet see mirrors in SL (and OpenSim).

While it is still very early days, Zi Ree of the Firestorm team has been poking at things to see what might be achieved, and has had some interesting results using some additional viewer code and a suitable texture.

This has allowed Zi to define a basic way of generating real-time reflections, including those of avatars, on the surface of a prim. The work is still at an early stage, and Zi points out that she’s not a rendering pipeline expert, so there may be under-the-hood issues which have yet to come to light. However, she has produced a number of videos demonstrating the work to date (see the sample below), and has raised a JIRA (STORM-2055) which documents the work so far; self-compilers can use the patch provided in the JIRA if they want to try things for themselves.
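Zi hasn’t published a technical write-up beyond the JIRA, but the classic way of achieving planar reflections – and, I’d guess, roughly what the patch is doing – is to mirror the camera in the plane of the prim face and re-render the scene to a texture. Here’s a minimal sketch of that general technique in C++ with GLM; to be clear, this is not Zi’s actual code, and the two pipeline hooks are hypothetical stand-ins for the viewer’s own render machinery:

    #include <glm/glm.hpp>

    // Hypothetical hooks standing in for the viewer's render pipeline –
    // these are not real viewer functions.
    void renderSceneToTexture(const glm::mat4& mirroredView, const glm::vec4& clipPlane);
    void drawMirrorFace(); // draws the prim face using the captured texture

    // Reflection matrix for the plane dot(n, p) + d = 0 (n a unit normal):
    // directions reflect via I - 2*n*n^T, positions also shift by -2*d*n.
    glm::mat4 reflectionMatrix(const glm::vec3& n, float d)
    {
        return glm::mat4(
            1 - 2*n.x*n.x,    -2*n.y*n.x,    -2*n.z*n.x, 0.0f,
               -2*n.x*n.y, 1 - 2*n.y*n.y,    -2*n.z*n.y, 0.0f,
               -2*n.x*n.z,    -2*n.y*n.z, 1 - 2*n.z*n.z, 0.0f,
               -2*d*n.x,      -2*d*n.y,      -2*d*n.z,   1.0f);
    }

    void renderMirror(const glm::mat4& view, const glm::vec3& n, float d)
    {
        // 1. Mirror the camera about the prim's surface plane.
        glm::mat4 mirroredView = view * reflectionMatrix(n, d);

        // 2. Re-render the world – avatars included – from the mirrored
        //    viewpoint into an off-screen texture, clipping at the plane
        //    so geometry behind the mirror doesn't bleed into the result.
        //    (The reflection flips handedness, so front-face winding has
        //    to be reversed for this pass.)
        renderSceneToTexture(mirroredView, glm::vec4(n, d));

        // 3. Draw the mirror face, sampling the captured texture in screen
        //    space so the reflection lines up with the main view.
        drawMirrorFace();
    }

The thing to take away is step 2: each mirror adds a full extra render of the scene every frame, which goes a long way towards explaining why the Lab has historically regarded in-world mirrors as resource-intensive.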

Currently, the code only works when the viewer is running with non-deferred rendering (i.e. with the Advanced Lighting Model turned off). This does tend to make the in-world view look a little flat, particularly if you’re used to seeing lighting and shadows.

However, having tried a version of the SL viewer with the code applied to it, I can say that it is very easy to create a mirror – all you need is a prim and a texture, a few tweaks to some debug settings, and possibly a relog. The results are quite impressive, as I hope the picture below demonstrates (click to enlarge, if required).

I see you looking at me …

Performance-wise, my PC and GPU didn’t seem to take too much of a hit – no doubt helped by the fact that the mirror effect only works in non-deferred mode at present. Quite how things would fare were this tried with ALM active, and shadows and lighting enabled, could be a very different story.

As the effect is purely viewer-side, it does run up against the Lab’s “shared experience” policy; not only do you need a viewer with the code to create mirror surfaces, you need a viewer with the code to see the results. People using viewers without the code will just see a transparent prim face (or, if the mirror texture is applied to the entire prim, nothing at all, as the prim will be 100% transparent).

This means that in order for mirrors of this nature to become the norm in Second Life, the approach is going to have to be adopted by the Lab. Ideally, it would also need to work with the Advanced Lighting Model active. Zi additionally notes that some server-side updates would be required in order for a simulator to be able to save things like the reflectiveness of a given mirror surface.

It’s all done with mirrors, y’know … (click to enlarge, if required)

Whether this work could herald the arrival of fully reflective surfaces in-world remains to be seen. It’s not clear how much interest in the idea has been shown by the Lab, but hopefully with the JIRA filed, they’ll take a look at things. There’s little doubt that if such a capability could be made to happen, and without a massive performance or system hit, then it could prove popular with users and add further depth to the platform.

SL project updates week 32/1: server, viewer

The Snow Lion, Oceanside dAlliez; Inara Pey, May 2014, on Flickr (click for full size – blog post here)

Server Deployments – Week 32

As always, please refer to the server deployment thread for the latest updates and status on deployments.

  • There was no Main (SLS) channel deployment on Tuesday, August 5th.
  • On Wednesday, August 6th, all three RC channels should receive the same maintenance update, which addresses some miscellaneous bugs, fixes the JSON issue “Valid JSON numbers like 0e0 no longer valid after 14.06.26.291532” (BUG-6657) – see the illustration below – and includes changes from the current Main channel release.
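
By way of illustration of what BUG-6657 is about: the JSON grammar allows a number to carry an exponent even when the mantissa is zero, so “0e0” is simply the number 0, and a conforming parser must accept it. A quick example – using the third-party nlohmann/json library purely as a stand-in parser; it has nothing to do with the SL codebase:

    #include <nlohmann/json.hpp>
    #include <iostream>

    int main()
    {
        // "0e0" is a valid JSON number (zero with a zero exponent),
        // so a conforming parser must accept it and yield 0.
        auto j = nlohmann::json::parse("0e0");
        std::cout << j.get<double>() << "\n"; // prints 0
    }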


Release Viewer

The de facto release viewer was updated on Monday, August 4th to version 3.7.13.292225, formerly the Group Ban RC viewer. For an overview of the group ban functionality, please refer to my article here (note this is based on the project viewer, although the functionality has not changed significantly).

All other viewers in the release and project channels remain as per my Current Viewer Releases page.

Oculus Rift Viewer and Oculus DK2

Versions of the Oculus Rift DK2 kit are beginning to appear, and those who have received the updated headset are noting that the current Oculus Rift project viewer (version 3.7.12.292141 – see the Alternate Viewers wiki page) is not compatible with the newer equipment, particularly with regards to the updated tracking system and the new display modes (see: RIFT-130).

Given the extent of the changes between the DK1 and DK2 headsets, incompatibilities shouldn’t be that surprising, nor are the problems limited to SL; other applications built for the DK1 are also apparently having their share of niggles with the newer hardware. At the time of writing, it wasn’t clear whether or not the Lab had received its own Oculus DK2, so it might still be a while before an updated version of the viewer with DK2 support appears.

User Group Meetings, Week 33

There will be no Open-source Developer meeting or Simulator User Group meeting on Monday, August 11th or Tuesday, August 12th respectively. This is because the Second Life technical team will be involved in a strategy / team-building get-together in the physical world, and so will be offline and unable to attend in-world meetings as a result of travelling, etc.

Other Items

Offline E-mail Issues

There are reports that offline IMs to e-mail are being blocked by certain ISPs, most notably United Internet in Germany, which operates a number of web-based e-mail services, including GMX, 1&1, web.de, mail.com and A1. The problems started in early July (see this forum thread, BUG-6591 and BUG-6717), and were initially seen as an issue within SL.

However, SL user MartinRJ Fayray, himself based in Germany, contacted 1&1 and received confirmation that certain IPs from SL had been blocked on account of the amount of “spam” they were generating.

As the blocking appears to be simulator IP-based, the situation has given rise to some confusion: some offline e-mails do get through to users and others don’t (depending on where in SL they were initiated), leading people to view the problem as a Lab-based issue.

Quite what happens next is currently up in the air; unblocking the IP ranges is something United Internet needs to do, although the company appears to be waiting for the Lab to contact them.

This is not the first time issues with IM-to-e-mail have been encountered. In 2013, many Gmail users found that their offline e-mails were no longer being received, although that was due to a change in Gmail’s filtering policies, which could be rectified by a settings change at the user’s end.

Getting more animated at High Fidelity

One of the things people have critiqued High Fidelity about is the look of their avatars. Yes, they can use 3D cameras to capture a user’s facial expressions and translate them into facial movements on an avatar but, well, the avatars just look a little odd.

Or at least, that’s an oft-heard or read comment. I’m not entirely in disagreement; SL avatars may not be technically up-to-snuff in many ways, but they can look good, and over the years they have spoiled us somewhat.

However, High Fidelity is still only in an alpha phase, and things are bound to improve over time with the look and feel of its environments and avatars. As a demonstration of their attempts to improve things, the HiFi team have recently released a couple of videos and a blog post from their animator, Ozan Serim, formerly of Pixar.

In the post – which marks his first time writing for the blog – Ozan explains how he’s trying to bring more advanced animation to the platform’s avatars to, as he puts it, “make live avatars look really amazing – as close to what we see in animated films today.” This isn’t as easy as it sounds, as he goes on to note:

This is a big challenge – we have to do everything in a fraction of a second without the benefits of an animator (like me!) being able to ‘post-process’ the results of what is motion captured.  So I’ve been working on the ‘rigging’: how a live 3D camera and a motion capture package like Faceshift is able to ‘puppeteer’ an avatar.  With less accurate data, we have to be clever about things like how we move the mouth to more simplistically capture the phonemes that make up speech.
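
To make the phoneme point a little more concrete, here’s a toy sketch of my own (emphatically not HiFi’s code) of the kind of many-to-few mapping Ozan is describing: the dozens of phonemes in speech collapse onto a handful of mouth shapes (“visemes”), because with noisy capture data there is no point distinguishing phonemes that look alike on the mouth.

    #include <map>
    #include <string>

    // A small set of mouth shapes ("visemes") that a rig can drive
    // reliably in real time; the names here are illustrative only.
    enum class Viseme { Rest, AA, EE, OO, MBP, FV };

    // Many-to-few: several phonemes share one viseme.
    Viseme visemeFor(const std::string& phoneme)
    {
        static const std::map<std::string, Viseme> table = {
            {"AA", Viseme::AA}, {"AE", Viseme::AA}, {"AH", Viseme::AA},
            {"IY", Viseme::EE}, {"EH", Viseme::EE},
            {"UW", Viseme::OO}, {"OW", Viseme::OO},
            {"M",  Viseme::MBP}, {"B", Viseme::MBP}, {"P", Viseme::MBP},
            {"F",  Viseme::FV},  {"V", Viseme::FV},
        };
        auto it = table.find(phoneme);
        return it == table.end() ? Viseme::Rest : it->second;
    }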

To demonstrate the results, Ozan includes a video of Emily Donald, one of the other HiFi staff members, singing.

As well as this video, which uses the “default” style of HiFi avatar, Ozan and members of the HiFi team have been working on improving the overall look of their avatars, and some early results of their efforts can be seen in another music video released at the start of August, which is linked to in the blog post.

This is again an experiment in rigging facial expressions to more fully match those of a human being, with special attention being paid to the “A”s and “M”s as the avatar (Ozan) lip-synchs to Freddie Mercury singing Queen’s Bohemian Rhapsody. This is another video where it’s worth watching the avatar’s mouth movements – and the eye and eyebrow movements, which reflect a strong level of emotion.

Again, there’s a fair way to go here, but these early results are fascinating, and not just for the technical aspects of what is being done: capturing, processing and rigging subtle facial expressions in real-time. As a commenter on the Bohemian Rhapsody video notes, it is “cool but creepy” – a reflection of the fact that HiFi have taken a further step into the Uncanny Valley. It’s going to be interesting to see how well they fare in crossing it.


With thanks to Indigo Martel for the pointer.