Monty Linden discusses CDN and HTTP

Monty Linden talking CDN and HTTP

In show #46 of The Drax Files Radio Hour, which I’ve reviewed here, Draxtor pays a visit to the Lab’s head office in Battery Street, San Francisco. While there, he interviews a number of Linden staffers – including Monty Linden.

Monty is the man behind the Herculean efforts to expand and improve the Lab’s use of HTTP in support of delivering SL to users, work which most recently resulted in the arrival of the HTTP Pipeline viewer (the code for which is currently being updated).

He’s also been bringing us much of the news about the content delivery network (CDN) project, through his blog posts; as such, he’s perhaps the perfect person to provide further insight into the ins and outs of the Lab’s use of both the CDN and HTTP in non-technical terms.

While most of us have a broad understanding of the CDN (which is now in use across the entire grid), Monty provides such great insights and explanations that I thought it worthwhile to pull his conversation with Drax out of the podcast and devote a blog post to it.


Monty Linden talks CDN and HTTP with Draxtor Despres on the Drax Files Radio Hour

Monty starts out by providing a nice, non-technical summary of the CDN (which, as I’ve previously noted, is a third-party service operated by Highwinds). In paraphrase, the aim is to get essential data about the content in any region as close as possible to SL users by replicating it in as many different locations around the world as possible; then, by assorted network trickery, to ensure that data is delivered to users’ viewers from the location closest to them, rather than having to come all the way from the Lab’s servers. All of which should result in much better SL performance.

“Performance” in this case isn’t just a matter of how fast data can be downloaded to the viewer when it is needed. As Monty explains, in the past, simulation data, asset management data, and a lot of other essential information all ran through the simulator host servers. That adds up to a lot of information the simulator host had to deliver to every user connected to a region.

The CDN means that a lot of that data is now pivoted away from the simulator host, as it is supplied by the CDN’s servers instead. This frees up capacity on the simulator host for handling other tasks (an example being region crossings), leading to additional performance improvements across the grid.

Highwinds, a CDN provider Linden Lab initially selected for this project, has 25 data centres around the world and a dedicated network from and through which essential asset data on avatar bakes, textures and meshes (at present) can be delivered to SL users

An important point to grasp with the CDN is that it is used for what the Lab refers to as “hot” data: that is, the data required to render the world around you and other users. “Cold” data, such as the contents of your inventory, isn’t handled by the CDN; there’s no need, given it is inside your inventory and not visible to you or anyone else (although objects you rez and leave visible on your parcel or region for anyone to see will have “hot” data (e.g. texture data) associated with them, which will gradually be replicated to the CDN as people see them).

The way the system works is that when you log in or teleport to a region, the viewer makes an initial request for information on the region from the simulator itself. This is referred to as the scene description information, which allows the viewer to know what’s in the region and start basic rendering.

This information also allows the viewer to request the actual detailed data on the textures and meshes in the region, and it is this data which is now obtained directly from the CDN. If the information isn’t already stored on the CDN server, that server requests it from the Lab’s asset servers, and it then becomes “hot” data stored by the CDN. Thus, what is actually stored on the CDN servers is defined entirely by users as they travel around the grid.
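For the technically curious, the flow described above is essentially the classic “cache-aside” pattern. Here’s a very rough sketch of it in Python; this is not the Lab’s actual protocol – the endpoints, paths and field names are invented purely for illustration:

```python
# Illustrative sketch only: URLs, paths and field names are hypothetical,
# not Linden Lab's actual endpoints or protocol.
import requests

SIMULATOR_URL = "https://simulator.example.com"   # hypothetical region host
CDN_EDGE_URL  = "https://cdn-edge.example.com"    # hypothetical nearest CDN edge

def enter_region(region_id):
    # 1. Ask the simulator for the scene description: what objects are in
    #    the region and which texture/mesh assets they reference.
    scene = requests.get(f"{SIMULATOR_URL}/regions/{region_id}/scene").json()

    # 2. Fetch the "hot" asset data (textures, meshes) from the CDN edge,
    #    not from the simulator host, freeing the simulator for other work.
    assets = {}
    for asset_id in scene["asset_ids"]:
        # On a cache miss, the CDN edge itself fetches the asset from the
        # Lab's origin asset servers, stores it, then serves it; the viewer
        # only ever talks to the edge closest to the user.
        response = requests.get(f"{CDN_EDGE_URL}/assets/{asset_id}")
        assets[asset_id] = response.content
    return scene, assets
```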

The CDN is used to deliver “hot” texture and mesh data – the data relating to in-world objects – to the viewer on request

The HTTP work itself is entirely separate from the CDN work (the latter was introduced by the Lab’s systems engineering group, while Monty, as noted in my HTTP updates, has been working on HTTP for almost two-and-a-half years now). However, they are complementary; the HTTP work was initially aimed at making communications between the viewer and the simulator hosts a lot more reliable, and at pivoting some of the data delivery between simulator and viewer away from the more rate-limited UDP protocol.
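As a very loose illustration of one idea behind that HTTP work – reusing a single connection for many small asset requests rather than paying connection set-up costs each time – here’s a Python sketch. It is not the viewer’s code, and the URLs and asset IDs are invented:

```python
# Loose illustration of why connection reuse matters for many small fetches.
# Hypothetical URLs and asset IDs; not the SL viewer's actual implementation.
import requests

asset_ids = ["texture-001", "mesh-042", "texture-007"]  # invented IDs

# Naive approach: a fresh connection (and handshake) per request.
for asset_id in asset_ids:
    requests.get(f"https://cdn-edge.example.com/assets/{asset_id}")

# Connection reuse: one session keeps the underlying connection alive, so
# successive requests skip repeated TCP/TLS setup - broadly the kind of win
# that HTTP keep-alive and pipelining work is after.
with requests.Session() as session:
    for asset_id in asset_ids:
        session.get(f"https://cdn-edge.example.com/assets/{asset_id}")
```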

As Monty admits in the second half of the interview, there have been some teething problems, particularly when using the CDN alongside his own HTTP updates in the viewer. This is being worked on, and some recent updates to the viewer code have just made it into a release candidate viewer. In discussing these, Monty is confident they will yield positive benefits, noting that in tests with users in the UK, the results were so good that, “were I to take those users and put them in our data centre in Phoenix and let them plug into the rack where their simulator host was running, the numbers would not be better.”

So fingers crossed on this as the code sees wider use!

In terms of future improvements / updates, as Monty notes, the CDN is a major milestone, something many in the Lab have wanted to implement for a long while, so the aim for the moment is making sure that everyone is getting the fullest possible benefit from it. In the future, as Oz Linden has indicated in various User Group meetings, it is likely that further asset-related data will be moved across to the CDN where it makes sense for the Lab to do so.

This is a great conversation, and if use of the CDN has been confusing you at all, I thoroughly recommend it; Monty does a superb job of explaining things in clear, non-technical terms.

The Drax Files Radio Hour: land of the Lindens

Time constraints have meant I’ve not been able to mull over the last few Radio Hour podcasts, which is a shame as there have been some gems. If you’ve not already done so, do try to catch show #41 for a brilliant interview with Justin Esparza, the man behind one of the great legends of SL – Salazar Jack. Then there’s show #44 with Jaimy Hancroft, one of the great talents behind Dwarfins and the creator of the magnificent Hope’s Horizon at the 2014 Fantasy Faire.

However, in the latest podcast, show #46, Drax ventures out on his own to visit the Lindens on their home turf, dropping in on the Battery Street offices for an informative visit that offers a lot to listen to and absorb.

The Lab’s Battery Street staff (image: Ebbe Altberg, via Twitter)

The first big interview, kicking off at the 18:08 mark in the show, is with Monty Linden, who provides a clear-cut explanation of the Content Delivery Network (CDN) and also talks about his HTTP project work. Such is the level of information in this conversation that, rather than condensing it into a couple of paragraphs here, I’ve included it in a separate article, as it really does help frame both the CDN work and the HTTP work in non-technical terms.

That said, Drax also leads Monty into a discussion about net neutrality starting at the 24:50 mark in the interview (and continuing through to the 30:13 mark), which is also worth listening to in detail (and which I’ve deliberately excluded from the article on Monty’s CDN / HTTP discussion).

Down in the basement – looking down on the Lab’s engineering team at Battery Street (image via The Drax Files Radio Hour)

Elsewhere in the show, Drax gets to try out the DK2 with Second Life (36:27), with Ebbe revealing that a popular destination when demonstrating the Oculus and SL to journalists is Mont Saint Michel which, for those who have not visited it, is a glorious Second Life reproduction of the “real thing“. Ebbe also mentions one of the problems that precludes SL from being an “ideal” companion for the Oculus – the render engine isn’t able to consistently manage the 90 frames per second already utilised by the Oculus Crescent Bay prototype in order to eliminate issues of image judder when the wearer turns their head.

In discussing the Oculus Rift, Ebbe indicates that the Lab is working to make the abstraction layer for input devices as open as possible on their next generation platform, so that new devices can be added as easily as possible. He also reveals the new platform already works with game pad devices and the Leap Motion.

The discussion of the Oculus and Leap Motion is particularly interesting as it opens the door on the myriad challenges encountered in user interface design. For example, with gesture devices, not only do you need to define the gestures required to move an avatar and interact with in-world objects, etc., you need to consider what’s required in order for the user to interact with the UI itself – to press buttons, make menu selections, and so on. These complexities of user interface design get even deeper when you consider that not only do they have to work across multiple client platforms, they have to work across multiple combinations of client platform, input and other devices (screens, headsets, etc.).
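To give a feel for the kind of abstraction layer Ebbe describes, here is a purely illustrative Python sketch – the class names and action map are my own invention, not anything from the Lab’s platform. The idea is simply that device-specific drivers translate raw input into a common set of actions, so new devices can be added without touching the rest of the application:

```python
# Purely illustrative sketch of an input-abstraction layer; all names here
# are invented for the example, not taken from Linden Lab's platform.
from abc import ABC, abstractmethod

class InputDevice(ABC):
    @abstractmethod
    def poll(self) -> dict:
        """Return a normalised action map, e.g. {"move": (x, y), "select": bool}."""

class GamepadDevice(InputDevice):
    def poll(self) -> dict:
        # Read thumbstick / button state from the gamepad driver (stubbed here).
        return {"move": (0.0, 0.0), "select": False}

class LeapMotionDevice(InputDevice):
    def poll(self) -> dict:
        # Translate hand-tracking gestures into the same action map (stubbed here).
        return {"move": (0.0, 0.0), "select": False}

def update_avatar(device: InputDevice) -> None:
    actions = device.poll()
    # The application only ever sees normalised actions, never device-specific
    # details - adding a new controller or headset means writing one new
    # InputDevice subclass rather than changing the application itself.
    if actions["select"]:
        pass  # interact with the UI or an in-world object
```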

Mont Saint Michel (Inara Pey, June 2013, on Flickr) – a location the Lab uses to demonstrate the Oculus Rift and Second Life to journalists

Mention here is also made of High Fidelity. While the two are entirely separate companies, there is an intimation from Ebbe that High Fidelity may be one of the “technology partners” the Lab is talking to with regards to facial recognition capabilities in the next gen platform. Given that the Lab did provide some seed money towards High Fidelity’s first round of funding, this would make some sense.

As Drax tours the Lab’s office with Ebbe (35:13), some interesting snippets of what is going on are provided – such as the work that’s already going on with the “next generation Marketplace”. This is further touched-upon in a conversation (43:59) with Brooke Linden from the SL Commerce Team. She not only discusses aspects of the Marketplace such as trying to address performance issues, improve search and so on, she also confirms that the Commerce Team is working closely with those working on the next generation platform to ensure that lessons learned in operating the SL Marketplace are carried forward in support of that project.

A potentially interesting snippet about the SL Marketplace from the conversation is that it handles a larger volume of sales than most on-line e-commerce sites. As Brooke points out, given that it does deal with micro-transactions, it is somewhat easier for the Marketplace to generate volume sales; however, this still makes it a challenge when trying to manage things.

Left-to-right: Shaman and Kona Linden from the QA team and (looking like he’s fresh from the set of Star Trek sans insignia!) Caleb Linden. Shaman (one of the friendliest and welcoming members of the Linden team I’ve met in-world) and Kona discuss with Drax the idea of making Lab’s internal merchandise, such as the Rubik’s cube Shaman is holding, available to users, as well as matters of community (both within the Lab and in SL). Caleb co-leads the Server Beta User Group meeting on Thursdays (image via The Drax Files Radio Hour)

One interview that didn’t make it to the podcast features Jeff “Bagman Linden” Peterson, the Lab’s VP of engineering, who is heading-up the next generation platform work (Don “Danger Linden” Labs having the lead on Second Life). Apparently, a little too much was revealed about the new platform considering the growing commercial interest in virtual world spaces and, as the Lab is keeping a tight lid on the new platform for the time being, the interview has been shelved for (hopefully) a later date.

All told, a really interesting podcast, one that shouldn’t be missed.

What’s in the Box? Be a part of Santalarity and find out!

Santalarity, the BURN2 seasonal event, will this year kick off at 10:00 SLT on Saturday, December 13th, and run for 24 hours.

Santalarity is BURN2’s antithesis of the typical rushed holiday season focused on profits and buying, on unrealistic expectations and joy found in aisle 3.

So reads the event’s press release, which continues:

Instead we ask, “What’s in the box?” What do we want to give? We invite you to come to the playa, to observe artists’ interpretations of these questions; to skate, dance, drum and be present when we Burn the Box. Before the Box is burned, messages left by visitors to the playa will be read, which will express what they would like to give. To give without expectation of return. To give for the pure joy of giving.

The event itself will start with an initial drumming by the Lamplighters. The Box will burn twice during the event, at 12:00 noon SLT and again at 20:00 SLT.

Artists and builders wishing to participate in Santalarity are invited to build to the theme of What’s in the Box. Land plots are free, and applications should be made via the What’s in the Box application form.

DJs and live performers who would like to participate in the event are invited to submit their details through the Santalarity Performer sign-up form. Questions about performance scheduling should be addressed to the BURN2 Performance Lead, Larree Quixote in-world.

About BURN2

BURN2 is an extension of the Burning Man festival and community into the world of Second Life. It is an officially sanctioned Burning Man regional event – the only virtual world event among more than 100 real-world regional groups – and the only regional event allowed to burn the Man.

The BURN2 Team operates events year-round, culminating in an annual major festival of community, art and fire in the fall – a virtual echo of Burning Man itself.


Of Martian walkabouts, pictures from a comet, and getting ready to fly

In my last report on the Mars Science Laboratory, I mentioned that Curiosity had been on a geology “walkabout” up the slopes of the “Pahrump Hills” at the base of “Mount Sharp” (more correctly, Aeolis Mons). The zigzagging route up through the area took the rover from “Confidence Hills”, the location of the last drilling operation, up to a point dubbed “Whale Rock”, the drive being used to gather information on potential points of interest for further detailed examination.

The exposed rocks in this transitional layering between the floor of Gale Crater, in which Curiosity arrived back in August 2012, and the higher slopes of “Mount Sharp” are expected to hold evidence about dramatic changes in the environmental evolution of Mars. Thus the “walkabout” – a common practice in field geology on Earth – was seen as the best means of carrying out a reasonable analysis of the area, so that the rover can be most efficiently targeted at specific locations of interest.

Curiosity’s walkabout route from “Confidence Hills” to “Whale Rock” in October; the rover is now working its way back to various points of interest for further studies

“We’ve seen a diversity of textures in this outcrop,” Curiosity’s deputy scientist Ashwin Vasavada (JPL) said of the drive. “Some parts finely layered and fine-grained, others more blocky with erosion-resistant ledges. Overlaid on that structure are compositional variations. Some of those variations were detected with our spectrometer. Others show themselves as apparent differences in cementation or as mineral veins. There’s a lot to study here.”

During the drive, Curiosity travelled some 110 metres, gaining about 9 metres in elevation, using the Mastcam and the ChemCam (Chemistry and Camera) laser spectrometer system to inspect and test potential points of interest for more detailed examination at a later date. Since completing that drive, the rover has been working its way back through “Pahrump Hills”, this time examining specific targets using the robot-arm-mounted Mars Hand Lens Imager (MAHLI) camera and spectrometer. Once this work has been completed, specific targets for in-depth analysis, including drilling for samples, will form the core activity of a third pass through the area.

So far, two specific areas have been identified for detailed examination. The first, dubbed “Pelona”, is a fine-grained, finely layered rock close to the “Confidence Hills” drilling location. The second is a small erosion-resistant ridge dubbed “Pink Cliffs”, which the rover drove around on its way up the incline.

“Pink Cliffs” is roughly a metre (3ft) in length and appears to resist wind erosion more than the flatter plates around it. As such, it offers precisely the kind of mixed rock characteristics mission scientists want to investigate in order to better understand “Mount Sharp’s” composition. This image is a mosaic of 3 pictures captured on October 7th, 2014 (PDT; Sol 771 for the rover) by Curiosity’s Mastcam. It has been white balanced to show the scene under normal Earth daylight lighting – click for full size.

Another target of investigation has been a series of sand and dust dunes lying right on the edge of “Pahrump Hills”. In August 2014, Curiosity attempted to use these dunes as a means to more quickly access the “Pahrump Hills” area, but the effort had to be abandoned when it proved far harder for the rover to maintain traction than had been anticipated, particularly given the rover had successfully negotiated sandy dunes and ridges earlier in the mission. As a result, scientists are keen to understand more about the composition of the dunes.

On November 7th, Curiosity was ordered to venture onto the dunes very briefly in order to break the surface of one of the rippled dunes and expose the underlying layers of sand, in an effort to better understand why the rover found the sand such hard going the first time around, and what might be within these wind-formed dunes that makes them so bothersome to drive over. Data gathered from the drive is still being analysed.

Spanning roughly 1.2 metres from left to right, a wheel track breaks the surface of a sand ripple on the edge of “Pahrump Hills”. The MSL science team hopes the exposed material within the ripple will help them understand why Curiosity found these dunes hard going when trying to cross them in August 2014.

The work in the “Pahrump Hills” area has given rise to concerns over one of the two lasers in the ChemCam instrument. As well as the main laser, known for “zapping” targets on the surface of Mars in order to reveal their chemical and mineral composition, the system includes a second, continuous-wave laser used to focus the ChemCam’s telescope, ensuring the plasma flash of vaporised rock is properly imaged when the main laser fires. Data received on Earth when using the ChemCam to examine rocks on the first pass through “Pahrump Hills” suggests this smaller laser is weakening and may no longer be able to perform adequately.

If this is the case, the laser team plan to switch to using an auto-focus capability with the telescope so it will automatically focus itself on a few “targeting” shots from the main laser ahead of any data-gathering burst of fire, allowing for proper telescope calibration.
