Comparative Viewer frame rates

Last week, Pserendipity Daniels left a comment on comparing Viewer performance which got me thinking. As I said in my reply, coming up with an objective means of comparing the performance of various Viewers is a little difficult, as so much is client-side dependent (hardware), while some also comes down to your network connection.

However, I decided to take those Viewers I’ve actively used over the last 12 months (as opposed to reviewed and put to one side), and see what I could come up with by way of a very basic and simple means of comparing Viewer performance that might address Pep’s question without me getting bogged down in anything complex (which would probably go right over my head anyway…).

So, the two tables below represent my findings based on Viewer frame rates – which I appreciate aren’t the only measure of a Viewer’s performance (but they are the one most looked at). There are further notes below the tables on how I set up and ran my “tests”.

Jan 6th: Tables updated to reflect the fact that Niran’s Viewer has been using the 3.2.6 code base since release 1.01. Also, Nalates Urriah has carried out further analysis on these figures.

370m altitude

Average ping rate for sim: 167ms (averaged across all eight Viewers)

22m altitude

Average ping rate for sim: 174ms (averaged across all eight Viewers)


  • “High” = graphics set to the SL “High” setting (Ultra in the case of Phoenix), shaders ON, all Deferred Rendering options for lighting & shadows and ambient occlusion (or equivalents) OFF
  • Deferred = deferred render ON, but ambient occlusion / shadows OFF
  • Ambient = deferred render ON, ambient occlusion ON, shadows OFF
  • Shadows = deferred render ON, ambient occlusion OFF, shadows ON
  • Ambient + Shadows = deferred render ON, ambient occlusion / shadows both ON
  • Numbers in brackets refer to the official Viewer release I believe each TPV is based upon.

Test Environment

To try and give as level a playing field as possible for the tests, I attempted to create a “test environment”, namely:

  • Tests were run after a completely clean reinstall of the listed Viewers (original installation and all associated files / folders uninstalled / deleted)
  • All Viewers were configured alongside my nVidia Control Panel in accordance with this tutorial from the Shopping Cart Disco blog (with thanks to Innula Zenovka for pointing it out)
  • All other major graphics and network settings within the Viewers were set to the same criteria (e.g. Draw Distance set to 300m; network bandwidth set to 1500kbps, etc.)
  • Where possible (and with the exception of Firestorm and Phoenix), the UI was set up the same: same buttons, same locations, no other floaters / panels open, and any group chat sessions active on logging in were terminated
  • The same avatar with the same attachments was used with each test (with a Draw Weight of 112,986), with the same camera defaults
  • I used the same regions for all Viewers tested, each with 4 other avatars present during the tests. One region was a skymall shopping area, the other a residential sim at ground level (which actually had the same 4 avatars present in it for all tests!)
  • The same test was used for each case: Teleport to an arrival point; allow rez time, then walk a set route for around 3 minutes, monitoring fps rates
  • Recorded frame rates are based on a roughly-calculated average, rounded up or down to the nearest whole number, as appropriate.
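As a rough illustration of the averaging step above (the sample values and variable names here are my own, invented for the example – in practice the readings came from watching the Viewer’s statistics display during the walk), the recorded figure for each test run amounts to something like:

```python
# Hypothetical fps readings noted during a ~3-minute walk along the test route.
fps_samples = [31.2, 28.7, 33.5, 29.9, 30.4]

# Roughly-calculated average of the observed readings.
average_fps = sum(fps_samples) / len(fps_samples)

# Rounded up or down to the nearest whole number, as in the tables.
recorded_fps = round(average_fps)

print(recorded_fps)  # 31 for these sample readings
```

This is only a sketch of the arithmetic described in the bullet above, not a tool that was actually used during the tests.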

Hardware and network connection

The hardware used for the tests comprised my usual PC and network connection:

  • Windows 7 32-bit with SP1; Intel Q6600 CPU @ 2.4GHz; 3GB RAM; ASUS motherboard (no idea of the model); nVidia GeForce 9800 GT with 1GB on-board RAM (driver: 15-10-2011); Viewers running on a 320GB SATA drive @ 7200rpm
  • Netgear DGN2200 (wireless between PC and router)
  • Internet connection averaging a ping of 43ms to the preferred test server, with a download speed of 9.55Mbps and 1.02Mbps upload (speedtest verified).


  • I don’t pretend that either the methodology or the results are particularly scientific, and underline that they are at best indicative – and even that’s strongly caveated
  • Frame rates varied somewhat from those recorded in my reviews (obtained using a basic alt avatar & on a variety of sims)
  • On my home sim, when alone, SL 3.2.6, Exodus and Milkshake all exceed 60fps in “High” mode at altitudes above 300m; on the ground all achieve rates in the high 40s
  • Niran’s Viewer has achieved higher rates in Beta than with release 1.03, which Niran notes as being a “test” release. Unfortunately, the 1.02 release will not run on my PC at all, so I’ve been unable to test it
  • SL 3.2.6, Exodus, Milkshake and Niran’s all demonstrate considerably faster sculptie rendering than the other Viewers on my PC (sculpties rarely initially rez as a sphere or disk, but simply “pop-out” fully formed a few seconds after other prims).

Obviously, there are other factors that weigh in on Viewer choice, and it is actually possible to have a worthwhile in-world experience with what might be regarded as low frame rates (I’ve been running Firestorm with shadows enabled since before Christmas, with an average frame rate probably around 12fps (allowing for averages between locations), for example). In the case of Niran’s Viewer and Exodus, the graphics enhancements may well provide more of an incentive for use than straightforward frame rates. Certainly, the quality of rendering on Niran’s Viewer is significantly better when optimised than the majority of other Viewers (although it really hits my GPU hard!).

So, in conclusion, you’re free to interpret these results as you see fit; how much value they represent is questionable. As always, individual experiences may vary wildly from my own (particularly those of you fortunate enough to run a higher-specification CPU / GPU combination). However, as a finger-in-the-air reference point for my own reviews, the tables may have value, and I may maintain them…

Again, to be clear: I’m not claiming the test is designed to be either empirical or scientific – please do not take it as such.