Restrained Love 2.9: scripted camera controls

On June 16th, Marine Kelley updated her Restrained Love viewer to version 2.9. It introduces a new series of camera control options, offering a range of potential opportunities for those wishing to create puzzles, mazes, immersive quests, etc., as well as being applicable to the general use of RLV!

Marine provides the details on the updates, but here in brief is a summary of the key additions, together with an image I’ve borrowed from her blog:

  • @camdistmin and @camdistmax force the camera to stay within a range of distances from the avatar (0 = Mouselook; any value above 0 actively prevents Mouselook being engaged)
  • @camdrawmin and @camdrawmax simulate fog / blindfolds by obscuring the world around the avatar (not around the camera, as with the windlight settings)
  • @camdrawalphamin and @camdrawalphamax indicate the closest and farthest opacities of fog defined by @camdrawmin and @camdrawmax
  • @camdrawcolor sets the color of the fog defined by the above (black is the default)
  • @camunlock prevents the camera from being panned, orbited, etc. away from the avatar – so can prevent someone from peering through walls, etc.
  • @camavdist specifies the maximum distance beyond which avatars look like shadows (think seeing people in a mist or heavy fog)
  • @camtextures renders the world grey, other than avatars and Linden water. Marine notes that bump mapping and shininess remain untouched, as even someone blindfolded or in heavy fog can still feel their way around
  • @shownametags hides the radar and name tags, and prevents doing things to an avatar through the context menu – useful for games involving trying to find someone without them being betrayed by their name tag.
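As with other RLV restrictions, these commands are issued to the wearer's viewer from a scripted attachment via llOwnerSay, using the @behaviour:option=param syntax (=n to apply a restriction, =y to lift it). The following is a minimal sketch only – the distance and colour values are purely illustrative, and it assumes the wearer's viewer has RLV enabled:

```lsl
// Illustrative sketch: applying some of the RLV 2.9 camera
// restrictions from a worn attachment. Values are examples only.
default
{
    attach(key id)
    {
        if (id != NULL_KEY)
        {
            // Keep the camera within 1m-5m of the avatar
            llOwnerSay("@camdistmin:1=n");
            llOwnerSay("@camdistmax:5=n");

            // Simulate fog: starts 2m from the avatar, fully opaque at 10m
            llOwnerSay("@camdrawmin:2=n");
            llOwnerSay("@camdrawmax:10=n");

            // Grey fog instead of the default black
            llOwnerSay("@camdrawcolor:0.5;0.5;0.5=n");

            // Lock the camera to the avatar
            llOwnerSay("@camunlock=n");
        }
        else
        {
            // On detach, lift all restrictions issued by this object
            llOwnerSay("@clear");
        }
    }
}
```

On a viewer without RLV, the @-prefixed messages are simply chat the wearer never sees, so a script like this degrades harmlessly.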

There are three additional camera presets (left, right, top) to allow some additional camera options when @camunlock is active, and a new debug setting, RestrainedLoveCamDistNbGradients, to go with the camera options.

RLV 2.9 adds some interesting scripted controls for the camera which could have a range of uses, such as locking the camera to the avatar and controlling how far the user can see (image: Marine Kelley)

Again, please refer to the RLV 2.9 release notes for full details of these, and the other updates with this release.

The new camera options, as noted, could have a range of potential uses, and demonstrate (once again) that RLV isn’t just about “teh bondages” – it’s an extremely flexible extension to the viewer (note that these options are only applicable to her RLV viewer at this time). Those wishing to find out more, and who may not have taken a look at RLV previously, can find more information both on Marine’s blog and on the RLV API wiki page.

11 thoughts on “Restrained Love 2.9: scripted camera controls”

  1. That’s nice and all but… *mumbles something about TPV Policy and releasing features the Lab might be working on before they’re done* I feel a disturbance in the Force.


    1. Wish I could edit my comment… I am referring to this JIRA where they suggested scripting the camera… Wouldn’t Marine quite sweep the rug from under the Lab’s feet (or whatever the expression is)? I like what has been done here, it’s appreciated since it’s faster than the Lab’s usual timeframe, but… That just doesn’t feel right, and with really bad timing.


      1. Heh… our comments passed one another in the ether…

        Again, not sure that this would conflict directly with anything the Lab might do vis feature request BUG-6325 & llSetCameraParams(). That’s about controlling camera movement with greater freedom to allow for broader gaming, etc., use; this is about a local series of constraints on camera movement and “vision” restrictions.


          1. No worries. I have a habit of making blog posts late at night (like now!) and then having a “DOH!” moment afterwards :).


    2. Not sure why. Marine has been working on this for a while, she is the curator of the RLV API, and some of the work was discussed in the open-source mailing list.


      1. Oh, I didn’t know about the mailing list. I think I forgot to subscribe to that one. Yeah, I know who Marine is, though… My bad. Still, I think that the time between that JIRA and this API update is rather fishy. Might be the other way around.


  2. This looks very interesting and I’m hoping we can explore it in immersive theatre applications – thanks for sharing the info.


    1. I had a feeling you and Harvey might be interested in the camera scripting. Potential there for atmospherics :).

