Update, April 1st: Vir Linden’s comments on this viewer, offered at the Content Creation User Group meeting, are appended to the end of this article in an audio file.
Some of my recent SL project updates have mentioned that the Lab is working to move the remaining asset fetching away from UDP through the simulator and over to HTTP (avatar baking information, mesh and texture data have already been delivered to users via HTTP for the last several years).
This work involves changes to both the simulator and the viewer, and both have been undergoing testing on Aditi, the beta grid, for the last few weeks.
However, on Thursday, March 30th, the Lab effectively marked the start of testing on Agni, the main grid, with the release of the AssetHttp project viewer, version 5.0.4.324828.
With this viewer, the remaining asset classes used in Second Life – landmarks, wearables (system layer clothing and body parts), sounds and animations – are now delivered to users the same way as textures, mesh and avatar baking information: via HTTP over a Content Delivery Network (CDN), rather than through the simulator. This should generally make loading of such content both faster and more reliable.
Hang On! What’s this CDN Thing?
If you’ve followed the HTTP / CDN project, you can skip this part 🙂 .
To keep things extremely brief and simple: a Content Delivery Network is a globally distributed network of servers which can be used to store SL asset information. This means that when you need an asset – say a sound or animation – rather than the request having to go via UDP to the simulator, then to LL’s asset service, back to the simulator and finally back to you (again via UDP), the asset is fetched over HTTP from whichever CDN node is closest to you. This should make things faster and smoother, particularly if you are a non-US based user.
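For the more technically minded, here is a minimal sketch of the idea in Python. The endpoint URL, query format and function name are purely illustrative assumptions on my part – not the viewer’s actual code or the Lab’s real asset capability – but it shows the shape of the change: a single HTTP GET to the nearest CDN node in place of the old UDP round trip through the simulator.

```python
# Illustrative sketch only: the endpoint and URL format below are hypothetical,
# not the actual Second Life viewer code or asset service.
import urllib.request

ASSET_CAP_URL = "https://asset-cdn.example.com/viewer-asset"  # hypothetical CDN-backed endpoint


def fetch_asset(asset_id: str, asset_type: str) -> bytes:
    """Fetch a single asset (e.g. a sound or animation) with one HTTP GET.

    The CDN answers from whichever of its nodes is closest to you, so there is
    no UDP round trip through the simulator and the Lab's asset service.
    """
    url = f"{ASSET_CAP_URL}?{asset_type}_id={asset_id}"
    with urllib.request.urlopen(url) as response:
        return response.read()


# Example (placeholder UUID):
# sound_data = fetch_asset("3c59f7fe-0000-0000-0000-000000000000", "sound")
```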

There are some caveats around this – one being, for example, that if you request asset information not already stored on your local CDN node, it still has to be fetched from the Lab’s services before being delivered to you, after which your viewer can cache it.
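To make the viewer-side caching part of that concrete, here is a small follow-on sketch, again with hypothetical names and a made-up cache layout rather than the viewer’s real cache format: check the local cache first, and only go out to the CDN on a miss.

```python
# Hypothetical viewer-side cache sketch; the directory layout and file naming
# are invented for illustration, reusing fetch_asset() from the sketch above.
from pathlib import Path

CACHE_DIR = Path("asset_cache")  # stand-in for the viewer's local asset cache


def get_asset(asset_id: str, asset_type: str) -> bytes:
    """Return an asset from the local cache, fetching it over HTTP on a miss."""
    CACHE_DIR.mkdir(exist_ok=True)
    cached = CACHE_DIR / f"{asset_id}.{asset_type}"
    if cached.exists():
        return cached.read_bytes()             # cache hit: no network request at all
    data = fetch_asset(asset_id, asset_type)   # cache miss: one HTTP GET to the CDN
    cached.write_bytes(data)
    return data
```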
As noted above, the Lab started using CDN providers when they introduced the avatar baking service (called server-side baking) in 2013, and extended the use to the delivery of mesh and texture assets as part of a massive overhaul of Second Life’s communications and asset handling protocols spearheaded by Monty Linden (see my HTTP updates). Moving the remaining asset types to HTTP / CDN delivery effectively completes that work.
OK, So, What’s Next?
Right now, this is only a project viewer, and the Lab are looking to have people try it out and test the fetching and loading of landmarks, wearables (system layer clothing and body parts), sounds and animations, so they can examine performance, locate potential issues, and so on.
However, the code will progress through project status to release candidate and ultimately to release status over the next few weeks / months (depending on whether any significant issues show up). Once this happens, TPVs will be given a period of time to integrate the code as well, after which all support for UDP asset fetching will be removed from both the viewer code and the simulators.
A rough time frame for this latter work is around late summer 2017. When it happens, anyone using a viewer that does not have the updated HTTP asset handling code will be unable to obtain new or updated asset data from the Second Life service.