The ability to take 360-degree panoramic shots is to be integrated into the viewer, with access via the snapshot floater (Image location: It All Starts With A Smile – blog post – static image produced with the Illiastra Panoramic Camera HUD) – click the image to see it in 360-degree format
Just as I was working on an article about the Illiastra Panoramic Camera and producing static / interactive 360-degree images of Second Life, I attended the Third Party Viewer Development meeting on Friday, October 7th. During that meeting, Troy Linden announced that the Lab are working on incorporating the capability to generate 360-degree snapshots directly into the viewer.
The new capability is to be called 360 Snapshot, and will be integrated into the snapshot floater (alongside additional snapshot improvements contributed by TPV developer NiranV Dean – although these sit outside of the 360-degree feature).
In essence, the snapshot floater will act as a 360-degree camera rig, allowing you to position your avatar almost anywhere in-world and capture a full 360-degree image, stitched together by back-end processing at the Lab. The image will then be shareable via the SL Share feature, and should be available for download to your local drive.
The work is far enough advanced that a test viewer (not a project viewer) will be appearing sometime quite soon, with the Lab keen to get the capability out into the hands of users to try. However, the important thing to note is that it will be a test version – it will not be a final, polished solution right out of the gate. The idea is to give users an indication of things like picture quality, the approach taken, etc., and allow the Lab to examine exactly how much additional functionality they need to consider / include in the capability.
Initially, the stitching element will be absent; users will have to take care of that themselves after saving the image set to their local drive. There are also some potentially significant issues the Lab want to look at in detail through the use of the test viewer.
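The post doesn't say what form the saved image set will take or what tool users might stitch it with; purely as an illustration, assuming the viewer were to save the six square faces of a cube map, a minimal NumPy sketch of projecting them into a single equirectangular panorama (the format used by most 360-degree image viewers) might look like this – the face names and orientation convention here are hypothetical, not something confirmed by the Lab:

```python
import numpy as np

def cube_to_equirect(faces, height):
    """Stitch six square cube-map faces into an equirectangular panorama.

    faces: dict with keys 'front', 'back', 'left', 'right', 'up', 'down',
           each an (N, N, 3) uint8 array. Assumed layout: 'front' looks
           along +z, 'right' along +x, 'up' along +y.
    Returns an (height, 2*height, 3) uint8 image.
    """
    width = 2 * height
    n = faces['front'].shape[0]

    # Longitude (theta) and latitude (phi) for every output pixel.
    theta = (np.arange(width) + 0.5) / width * 2 * np.pi - np.pi   # -pi..pi
    phi = np.pi / 2 - (np.arange(height) + 0.5) / height * np.pi   # pi/2..-pi/2
    theta, phi = np.meshgrid(theta, phi)

    # Unit view direction for each pixel.
    x = np.cos(phi) * np.sin(theta)
    y = np.sin(phi)
    z = np.cos(phi) * np.cos(theta)
    ax, ay, az = np.abs(x), np.abs(y), np.abs(z)

    out = np.zeros((height, width, 3), dtype=np.uint8)

    def sample(face, mask, u, v):
        # u, v in [-1, 1] -> nearest-neighbour pixel lookup on the face.
        uu = np.clip(((u + 1) / 2 * (n - 1)).astype(int), 0, n - 1)
        vv = np.clip(((v + 1) / 2 * (n - 1)).astype(int), 0, n - 1)
        out[mask] = faces[face][vv, uu]

    # Pick the face whose axis dominates the view direction, then
    # project onto that face plane.
    m = (az >= ax) & (az >= ay) & (z > 0)
    sample('front', m, x[m] / az[m], -y[m] / az[m])
    m = (az >= ax) & (az >= ay) & (z <= 0)
    sample('back', m, -x[m] / az[m], -y[m] / az[m])
    m = (ax > az) & (ax >= ay) & (x > 0)
    sample('right', m, -z[m] / ax[m], -y[m] / ax[m])
    m = (ax > az) & (ax >= ay) & (x <= 0)
    sample('left', m, z[m] / ax[m], -y[m] / ax[m])
    m = (ay > az) & (ay > ax) & (y > 0)
    sample('up', m, x[m] / ay[m], z[m] / ay[m])
    m = (ay > az) & (ay > ax) & (y <= 0)
    sample('down', m, x[m] / ay[m], -z[m] / ay[m])
    return out
```

In practice most people would reach for an existing stitching tool rather than rolling their own, but the sketch shows the kind of work the Lab's back-end processing would eventually take off users' hands.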
In particular, there is the question of how the capability will interact with the simulator Interest List: will items behind your avatar, and so outside its field of view, update correctly in order to be properly imaged by the system? If not, the Lab will need to look into how things might be adjusted. The idea here is that by carrying out such tests publicly, the Lab can work with interested users and photographers to identify potential limitations and problem areas in the approach, and so hopefully address them.
In commenting on the project, Oz acknowledged that there are HUD systems available which have been inspirational, and much of the driver behind this capability is the desire to give users a simple “point and shoot” interface.
There is no indication as yet of limitations which might be placed on the system, such as image resolution, etc. This is again why the capability will be appearing in a test viewer when it emerges, rather than a project viewer. Nor is the Lab committing to any kind of time scale for this work, other than that the test viewer is liable to appear reasonably soon; or to how long the project will take to reach release status once a test viewer does appear. The focus is on a step-by-step development of the capability.
Note: the audio clips here are extracts of salient points from the discussion on the 360 Snapshot capability. To hear the full discussion of the capability, please refer to the video of the Third Party Viewer Development meeting, starting at the 08:49 point.