
As mentioned at the TPV/Developer meeting of the 24th August, Oz Linden has been taking steps to try to improve how issues are addressed by the company’s support teams when providing support to users who have a TPV as their viewer of choice.
That TPVs are collectively more popular than the official SL viewer is not that surprising. However, a lot of people still turn to Linden Lab for help when they encounter issues. As a result, LL have come in for criticism over how they handle users who report that they are using TPVs, and it is this that has prompted Oz to try to improve how such matters are handled.
Identifying the Problem
The first part of dealing with any problem is correctly diagnosing whether it is in fact viewer-related or server-related. This isn’t as easy as it sounds, because there are many parts of SL where a problem could reside either within the viewer or on the server side (inventory issues being a good example) – hence why LL often get the call when things go wrong.
Because of this complexity, and in order to help improve the initial viewer issue / server issue diagnosis, Oz is working with LL’s support teams to put together a better set of heuristics for use in support staff training and guidance on identifying where a particular problem may reside. To help with this work, he has asked that TPVs supply lists of issues they have encountered which they know are not viewer issues, together with how to recognise them. These lists can then be added to the information supplied to LL support staff, both to speed the initial diagnosis of a problem and to reduce the chances of a problem being mis-diagnosed from the outset.
It’s a Viewer Problem – But Can it be Reproduced on the LL Viewer?
When it comes to trying to resolve what appears to be a viewer issue, LL support staff will ask a) whether the user is using the official LL viewer; and b) whether they have tried to reproduce the issue using the official LL viewer. These questions are often taken to mean LL’s support staff “do not want to help” with the problem if it appears to be TPV-related.
However, this is not the case; the questions are a perfectly valid part of trying to diagnose a problem because:
- If the problem can be reproduced using the official viewer, there is a chance support staff may be able to provide SL viewer-based assistance to resolve the issue
- If the problem cannot be reproduced on the official viewer, then it at least helps point to the problem potentially being related to the TPV itself.
Obviously, if the problem does appear to be viewer-related but only manifests in a TPV, LL’s support personnel are unlikely to be able to give detailed help (simply because it is unfair to expect them to be intimately versed in resolving issues in every one of the TPVs used to access SL). As such, they are going to pass the matter back to the user, and when this happens, it can lead to frustration and a feeling that LL “aren’t interested” in solving the problem.
To avoid this in the future, Oz is working with TPVs to ensure LL’s support staff are better placed to provide onward guidance rather than leaving users feeling they “don’t want to help”. To this end, each TPV listed in the TPV Directory is being asked to:
- Add the details of any in-world support group(s) they operate to their Directory listing if they haven’t already done so
- Use a new field in the Directory to give details of any additional locations where help on a specific TPV might be obtained (e.g. a website, a support forum, etc.)
Thus, should an issue appear to be related to a specific viewer which LL staff cannot help resolve, they will at least be able to point the user concerned towards one or more places where they can receive more focused assistance.
Asking People to Complete the Survey
During the discussion, Oz reiterated that every support issue dealt with by LL staff should trigger a follow-up e-mail to the user concerned. While this e-mail may not arrive until up to four days after the event itself, it does include a customer satisfaction survey. This is important for two reasons:
- All survey responses are reviewed by a Linden Lab staffer; they are not farmed out to a third-party survey company, ignored, or handled by an automated process
- They are seen as a primary mechanism for determining how well support is identifying and dealing with issues to the satisfaction of LL’s users.
As such, Oz emphasised the importance of feedback being given, particularly where there is strong evidence to show that support have failed to provide the correct assistance. While completing the survey may not help in resolving the issue itself, it may help pinpoint errors within the support process, particularly if a number of surveys are received highlighting the same fault.
The current process by which support issues – particularly TPV problems reported to LL – are handled doesn’t always run smoothly, and there are times when issues do get mis-directed. However, Oz’s response to concerns raised during recent TPV developer meetings demonstrates that steps are being taken to address them. It has been suggested that LL post a blog entry on the initiatives explained here (particularly on the need for TPV users to understand why LL do ask about reproducing issues using the official viewer). In lieu of that happening, I hope this piece will serve as an informative substitute.
