Comparative Viewer frame rates

Last week, Pserendipity Daniels left a comment on comparing Viewer performances which got me thinking. As I said in my reply, coming up with an objective means of comparing the performance of various Viewers is a little difficult, as so much is client-side dependent (hardware), while some is also down to your network connection.

However, I decided to take those Viewers I’ve actively used over the last 12 months (as opposed to reviewed and put to one side), and see what I could come up with by way of a very basic and simple means of comparing Viewer performance that might address Pep’s question without me getting bogged down in anything complex (which would probably go right over my head anyway…).

So, the two tables below represent my findings based on Viewer frame rates – which I appreciate aren’t the only measure of a Viewer’s performance (but they are the one most looked at). There are further notes below the tables on how I set up and ran my “tests”.

Jan 6th: Tables updated to reflect the fact that Niran’s Viewer has been using the 3.2.6 code base since release 1.01. Also, Nalates Urriah has carried out further analysis on these figures.

370m altitude

Average ping rate for sim: 167ms (averaged across all eight Viewers)

22m altitude

Average ping rate for sim: 174ms (averaged across all eight Viewers)


  • “High” = graphics set to the SL “High” setting (Ultra in the case of Phoenix), shaders ON, all Deferred Rendering options for lighting & shadows and ambient occlusion (or equivalents) OFF
  • Deferred = deferred render ON, but ambient occlusion / shadows OFF
  • Ambient = deferred render ON, ambient occlusion ON, shadows OFF
  • Shadows = deferred render ON, ambient occlusion OFF, shadows ON
  • Ambient + Shadows = deferred render ON, ambient occlusion / shadows both ON
  • Numbers in brackets refer to the official Viewer release I believe each TPV is based upon.

Test Environment

To try and give as level a playing field as possible for the tests, I attempted to create a “test environment”, namely:

  • Tests were run after a completely clean reinstall of the listed Viewers (original installation and all associated files / folders uninstalled / deleted)
  • All Viewers were configured alongside my nVidia Control Panel in accordance with this tutorial from the Shopping Cart Disco blog (with thanks to Innula Zenovka for pointing it out)
  • All other major graphics and network settings within the Viewers were set to the same criteria (e.g. Draw Distance set to 300m; network bandwidth set to 1500kbps, etc.)
  • Where possible (and with the exception of Firestorm and Phoenix) the UI was set up the same: same buttons, same locations, no other floaters / panels were open, and any group chat sessions active on logging-in were terminated
  • The same avatar with the same attachments was used with each test (with a Draw Weight of 112,986), with the same camera defaults
  • I used the same regions for all Viewers tested, each with 4 other avatars present during the tests. One region was a skymall shopping area, the other a residential sim at ground level (which actually had the same 4 avatars present in it for all tests!)
  • The same test was used for each case: Teleport to an arrival point; allow rez time, then walk a set route for around 3 minutes, monitoring fps rates
  • Recorded frame rates are based on a roughly-calculated average, rounded up or down to the nearest whole number, as appropriate.
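The “roughly-calculated average” mentioned in the last step can be sketched as follows. This is a hypothetical illustration in Python of sampling an FPS counter during the walk and rounding the mean to the nearest whole number; the function and readings are illustrative, not part of any viewer’s actual code:

```python
def average_fps(samples):
    """Average a list of FPS readings, rounded to the nearest whole number."""
    if not samples:
        raise ValueError("need at least one FPS sample")
    mean = sum(samples) / len(samples)
    # "rounded up or down to the nearest whole number, as appropriate"
    return int(mean + 0.5)

# Hypothetical readings taken roughly every 10 seconds over a 3-minute walk
readings = [31.2, 28.7, 33.5, 30.1, 29.8, 32.4]
print(average_fps(readings))  # 31
```

In practice the viewer’s own statistics floater only shows an instantaneous FPS figure, so the “samples” here stand in for periodic glances at that counter.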

Hardware and network connection

The hardware used for the tests comprised my usual PC and network connection:

  • Windows 7 32-bit with SP1; Intel Q6600 CPU @ 2.4GHz; 3GB RAM; ASUS motherboard (no idea of the model); nVidia Ge9800GT with 1GB on-board RAM (driver: 15-10-2011); Viewers running on a 320GB SATA drive @ 7200rpm
  • Netgear DGN2200 (wireless between PC and router)
  • Internet connection averaging a ping of 43ms to the preferred test server, with a download speed of 9.55Mbps and 1.02Mbps upload (speedtest verified).


  • I don’t pretend that either the methodology or the results are particularly scientific, and underline that they are at best indicative – and even that’s strongly caveated
  • Frame rates varied somewhat from those recorded in my reviews (obtained using a basic alt avatar & on a variety of sims)
  • On my home sim, when alone, SL 3.2.6, Exodus and Milkshake all exceed 60fps in “High” mode at altitudes above 300m; on the ground all achieve rates in the high 40s
  • Niran’s Viewer achieved higher rates in Beta than with release 1.03, which Niran notes as being a “test” release. Unfortunately, the 1.02 release will not run on my PC at all, so I’ve been unable to test it
  • SL 3.2.6, Exodus, Milkshake and Niran’s all demonstrate considerably faster sculptie rendering than the other Viewers on my PC (sculpties rarely initially rez as a sphere or disk, but simply “pop out” fully formed a few seconds after other prims).

Obviously, there are other factors that weigh in on Viewer choice, and it is actually possible to have a worthwhile in-world experience with what might be regarded as low frame rates (I’ve been running Firestorm with shadows enabled since before Christmas, with an average frame rate probably around 12fps (allowing for averages between locations), for example). In the case of Niran’s Viewer and Exodus, the graphics enhancements may well provide more of an incentive for use than straightforward frame rates. Certainly, the quality of rendering on Niran’s Viewer is significantly better when optimised than the majority of other Viewers (although it really hits my GPU hard!).

So, in conclusion, you’re free to interpret these results as you see fit; how much value they represent is questionable. As always, individual experiences may vary wildly from my own (particularly those of you fortunate enough to run a higher-specification CPU / GPU combination). However, as a finger-in-the-air reference point for my own reviews, the tables may have value, and I may maintain them…

Again, to be clear: I’m not claiming the test is designed to be either empirical or scientific – please do not take it as such.

37 thoughts on “Comparative Viewer frame rates”

    1. TY :).

      I feel something more meaningful could have been done, but my head hurts as it is! 🙂


  1. Very useful – thank you! I’ve been looking at the frame rates of various viewers as you mention them on your blog, but it’s great to have them all at once in a controlled experiment….Now if I could only break my addiction to Phoenix. 🙂


    1. We will all need to move on, soon – and if it is any comfort at all: I found that it takes just 3 days to get used to the V2-based viewers, much faster than I expected, actually.
      And I can safely state that if I can do it, others can too (lol)….
      I lived in Phoenix for a long long time, but there is nothing that would make me go back there, now. I never liked V2, but I tried V3 and I have to admit: I love it! Now that Kirstens is gone, I use V3 and FS all the time.


      1. Chantal, I’ve been trying to find out where LL states (or indeed, implies) that they’ll stop V1 UI viewers connecting to the grid. I assume that’s what you mean by “We will all need to move on, soon”. I’d really like to find something concrete, ’cause then I’ll *have* to adapt to a more spangly viewer 😀


  2. Proves my point about the official SL viewer; framerates are best. I keep hopping to FS for the derender option, and the improved drawdistance – but a high FPS is one thing a good machinima needs. Thanks for this!


    1. Viewer 2.x was actually very hit-and-miss, rates wise, TBH. The mid-2.x versions were (on my PC) pretty spectacular, then around 2.6 (IIRC) the rates tumbled to well below those of TPVs with tweaks.

      For Machinima, I really recommend you take a look at Niran’s Viewer and Exodus – both NiranV Dean (Niran’s) and Geenz Spad (Exodus) have put a *lot* of work into the graphics side of things, although currently Exodus does apparently have an alpha issue the team are working on.

      V3.2.6 is pretty spectacular on my system, and as such both Milkshake and Exodus are tempting me away from my Firestorm addiction…!


  3. I wish I knew what you were doing right, or I am doing wrong. I get MUCH lower frame rates than you do with every viewer I have tried, and I have a much more powerful machine and a faster internet connection! I am talking like 15-22 fps in a high skybox, 7-10 fps in a sim at ground level with no other avatars, 5-7 fps in a crowded sim, 3-5 fps in a crowded dance club. Draw distance is around 200m, but shadows and DOF are not enabled.
    Intel i920 processor, 12 GB RAM, 7200 rpm hard drives, Win Vista 64, Nvidia 560Ti, 35 Mbps FIOS connection.


    1. I have no idea what is going on for you – other than I have noticed (for reasons that escape me entirely) that the Ge9800 seems to perform rather better with SL than some of the more recent nVidia cards. Mine was completely unaffected by the OpenGL issue, for example. This hasn’t always been the case – I had some astoundingly awful frame rates about a year ago, and Viewer 2.x was never particularly exceptional for me.

      Performance can also be impacted by setting the Viewer’s bandwidth too high; ideally this shouldn’t be greater than 1500kbps, and I found that going higher did affect performance.

      Also, could your ISP be engaging in traffic shaping, which is impacting you?

      As it stands, I’m comfortably running both Firestorm and Exodus with deferred and shadows ON, and have yet to have fps hit single digits in either (I’ve just been running Exodus with gamma correction and tone mapping active as well).


      1. Since you include Phoenix, it would perhaps be instructive — and possibly salutary for some Phoenix die-hards — to see the figures for Singularity and Cool VL, too.

        But, that apart, thank you so much for doing all that hard work.


        1. I don’t use V1 at all, and I made a point of de-installing all V1 (and V2) Viewers on my system back in December, when the number of installed Viewers & clients I had for SL hit 18. Phoenix only survived by dint of the fact that the installer was still in my download folder after I checked out the mesh rendering version.


  4. Wow, excellent work! Of course a lot of people will complain that their actual results are different, because, well, FPS depends on a lot of things — but definitely on the graphics card support, above everything. Nevertheless, a badly/incompletely supported graphics card (in the sense that it won’t be using its maximum potential performance) should give similar results across all TPVs, since they’re all based on LL’s own code, really… so your tests are definitely a good starting point.

    I’d be curious if you did the same tests with some viewers based on the 1.X code (SL’s own 1.23 from 2009 and, say, Imprudence). This might help to dispel the myth that they’re faster than the 3.X codebase (hah!).

    And I was certainly surprised — almost shocked! — to see that LL’s own viewer actually beats all other TPVs. Wow! WTG, LL! They seem not to be sleeping on their chairs after all. I’ve yet to test 3.2.6 — I’ve got 3.2.4 installed and one “beta” which is labeled 3.2.5 — but this article of yours definitely persuaded me to do a few tests with the latest development version from the ‘Lab. It’s quite noticeable that even 3.2.4 outputs a much higher FPS rate than I’m used to. But this is one of the eternal cyclical “problems” which LL refuses to admit the existence of: sometimes, some things stop working for a few releases, or work worse than usual, and then start to work again, without any explanation whatsoever. I’ve given up trying to figure out inventory sorting by name, for example; on one machine I use, it only works on the odd LL viewer release. TPVs never have any problem, and neither do other machines. Who knows what’s wrong…


    1. I was tempted to try all the Viewers I’ve reviewed over the last year, but I think that would be opening a whole can of worms – and the last thing I want is for the tests to be taken as being empirical in any form, as they are clearly subjective. It was also too much like hard work!

      As to the official Viewer – I was floored, tbh. 3.2.2 & 3.2.4 had only produced “average” results for me (although again, I don’t use the official Viewer, so any data I had was through quick-fire looks, rather than in-depth use). When I tested Milkshake, I was stunned by the results I was getting during my more extensive time playing with it for review purposes – and given Cinder hadn’t indicated any graphics jiggling had been carried out, I downloaded 3.2.6 out of curiosity to try out.

      Where things working / not working / working is concerned – I’ve always assumed that’s down to so many factors and (as Oz puts it) band-aids being used to fix things that there’s no predicting quite how a nip, tweak or tuck is going to impact something elsewhere.

      Still, glad you found it reasonably-useful :).


        1. 3.2.6 is in the Dev viewer fork.

          I’ve been using it for a while, and it appears to resolve some texture load stalls. (But not all of them)



          1. Actually, I’d noticed this myself and was correcting things when Pussycat posted. In fairness and honesty, I *was* actually using 3.2.5, *not* 3.2.6 for the tests as originally stated. My fault for having too many Viewers / installers on the PC.


        2. Going to have to check the version of my Beta copy tonight. Last time I tried beta, I was getting 5-7fps, while in the same location getting 20-50 fps on official. Hopefully my beta is slightly out of date, if your results above were from beta and not official. I had just assumed the delay I saw in beta was debugging code running.


  5. I just can’t understand how you are sure that you have AA and AF working?
    ’Cause I know that none of the mesh viewers I tried allow the overriding of my Nvidia 580 GTX!
    None, not even Exodus, which I installed yesterday and enjoyed a lot in fact!
    The only viewers where you can really notice that AA and AF are working via hardware are the non-mesh ones (Phoenix, Firestorm pre-mesh, Singularity pre-mesh, Imprudence)
    And it is easy to see: if you disable AA and AF in game, and override them via hardware, just look at the LL rocker female outfit in the library, wear it and watch the metal details.
    The higher the AA and AF, the higher the details and the less flickering you will see.
    So my guess is just one: you are not using any AA and AF on any mesh viewers, if you are doing as on that blog!


  6. Thank you Inara! Your reward will be in heaven.

    Pep (wishes you hadn’t fixed the results so that the official viewer came out top, dammit!)


  7. Thanks for this, real numbers at last!

    /me grumbles that much higher performance hardware produces FPS results 10% of what you get. I’ll have to dig deeper into why all of my machines do so poorly compared to these numbers.


    1. It’s odd on the hardware front; I was originally running on a Ge8800, and SL was largely a mess for me if I tried turning anything special on, even before V2 came along. When the card went gaga on me, I had hoped to get a beefier replacement, but in the end could only actually get a Ge9800GT fitted into the available space – and when first done, performance wasn’t significantly better. Over time (and with some backsliding on occasions) frame rates have been steadily improving for me across most viewers that are being developed. I’ve not done anything clever with the Viewer, and only implemented the hardware settings noted through the link to Shopping Cart Disco recently, and I’m not sure they’ve made much of a difference.

      Who knows… this time next week, maybe I’ll be back with things running at 22-25fps!


  8. You have to state a margin of error in tests like these; 3 – 5 fps is probably about right. The SL ‘High’ setting is not the same everywhere, and once you pop the shadows on it really is the luck of the draw. Plenty of TPVs like to fiddle with the defaults.

    Would be very interesting to see how the choice of OS changes the results. Personally, I have an Nvidia 8700 mobile in my laptop; Catznip is a clear 10 – 20 fps faster on Linux over Vista, it really screams.


  9. This is good work, Inara. Thank you for the charts. I am interested in any differences there may be in viewer performance with Second Life and Open Simulator as well.

    If you are interested, we could design further experimentation.


    1. Douglas,

      Thanks for the feedback – I rather think that as Trinity says, any comparisons between OpenSim and SL Viewer-wise will be flawed from the outset; the differences between the two are enormous. As it is, what I’ve done here is at best a thumbnail sketch that is obviously very subjective (few others are liable to have the same hardware set-up, it’s unlikely the regions used in the test will be in the same overall state if the tests were repeated, etc.). As it stands this was more an effort driven out of curiosity than to set any form of standard – and was published simply because, for *my* environment, I did notice a trending within the figures that appeared to point to some consistency in the results (such as the overall 50% drop in performance I experienced on *all* Viewers “tested” when going from non-deferred to deferred).

      I’m not sure what else can be done to improve things. Some elsewhere have pointed out the flaws in my approach (which again, I don’t claim to be empirical), but on the other hand have outlined an approach that could give unrealistically high FPS rates: while such rates may well be achievable, they in no way reflect the daily experience people will actually have using a Viewer in-world (rather than subjecting it to a static test).


    1. As stated in the article, I restricted the “test” to those Viewers I routinely use + Phoenix (which I’m starting to regret, as I really stopped using that Viewer back in February of 2011).

      The only way to add further viewers to the results list would be to re-run the tests across *all* those listed above *and* any additional Viewers requested, in order to minimise as many unpredictable variables as possible from the results (i.e. different number of avatars on the destination sims, different loads on the grid as a whole, etc., etc.). Frankly, at this point in time, that’s not something I’m willing to commit to doing.


    1. Thank you.

      I may well return to the tests at some point in the future, depending upon how radically things change or if it appears that Viewers are starting to show wild differences from my personal baseline as TPVs or LL tinker and improve things like the rendering engines – or if I finally manage to upgrade my system to something more up-to-date :).

      Either way, the tests will remain entirely subjective and caveated with YMMV, and will, I’m afraid, be restricted to those Viewers I personally prefer to use or at least routinely use by way of reviews, etc.


Comments are closed.