Now open at the R&D Art Gallery complex is Five Years of Fractals, celebrating five years of Gem Preiz’s remarkable fractal art in Second Life. Split over two floors of the exhibition space, the displayed art is divided between retrospectives of Gem’s past installations at the Linden Endowment for the Arts, and his exhibitions and installations displayed elsewhere in Second Life.
Normally displayed in a very large format, Gem’s work is always a masterpiece of fractal design and storytelling on a grand scale. As such, what is seen within Five Years is but the tip of the iceberg – a soupçon, if you will – that should remind those familiar with Gem’s work of the power and majesty of his art, and hopefully serve to whet the appetite of those new to his work such that they will want to see more.
I certainly fall into the former of these two groups. I’ve long been an admirer of Gem’s art and his virtuosity in both setting a mood and telling a story, for almost as long as he has been exhibiting in Second Life, and a number of my personal favourites among his installations are presented here, both directly and indirectly. The ground level section of the exhibition space presents a retrospective of, for want of a better category name, Gem’s “non-LEA” work. Some of this is presented through individual images, the rest through animated frames which page through scenes from those exhibitions. On this level we can again experience Polychronies, Rhapsody in Blue Fractals, Myths, Temples, Metropolis – complete with silhouettes of the figures which formed a part of it painted on the walls behind the images – and more.
As well as the art itself, there are books of his work visitors can peruse, together with links to videos of some of these exhibitions – which I unhesitatingly recommend watching. They bring together not only the art as it could be seen in situ whilst on display, but also marry the images to the music Gem has offered with each installation. Thus, through the videos as well as this exhibition, we can re-immerse ourselves in his art or gain greater familiarity with it, and understand the inter-weaving of images and music.
Reached via teleport discs, the lower level of the exhibition space focuses on Gem’s LEA exhibitions, as noted. Among the pieces displayed, we can once again experience the visions of his Cathedral Dreamer, journey through his trilogy of stories, Vestiges and Wrecks, which formed his Heritage pairing, and No Frontiers, the unofficial sequel to Heritage, while images from the likes of The Anthropic Principle and No Frontiers cover sections of the walls behind some of the images. As with the upper level of the gallery, objects offer links to videos of some of the installations, while spaced around the gallery area are props and elements from others – such as the air car and the shuttle which Gem has used in his installations, allowing visitors to fly through them.
Fractal art is not uncommon in Second Life, but there is something unique in Gem’s work. Perhaps it is the way in which it reflects his interests – cosmology, nature, geology – and blends them with his background education in science and mathematics to present stunning visions of nature and of future (or even ancient) scenes which are evocative, and both beautifully geometric and wonderfully fluid. Perhaps it is because, in composing his pieces, he presents not just individual pieces of art, but entire stories we can explore and witness.
Whatever the reason, I very much welcome this opportunity to revisit – at least in part – many of his extraordinary past installations, and in doing so, to look forward to his next.
The following notes are taken from the Content Creation User Group meeting, held on Thursday, September 14th, 2017 at 13:00 SLT at the Hippotropolis Camp Fire Circle. The meeting is chaired by Vir Linden, and agenda notes, etc., are usually available on the Content Creation User Group wiki page.
Medhue Simoni live streamed the meeting to YouTube, and his video is embedded at the end of this article. However, due to power issues at his end, the first few minutes are missing from the recording. Time stamps to that recording are included below, and clicking on any of them will launch the video in a separate browser tab at the assigned point. However, as these notes present the meeting in terms of topics discussed, rather than a chronological breakdown of the meeting, time stamps may appear to be out of sequence in relation to the recording.
Animesh (Animated Mesh)
“I like the name ‘animated objects’ because I think it’s unambiguous, but it takes a long time to type!” – Vir Linden joking about the name “Animesh”.
The goal of this project is to provide a means of animating rigged mesh objects using the avatar skeleton, in whole or in part, to provide things like independently moveable pets / creatures, and animated scenery features via scripted animation.
Animated objects can be any rigged / skinned mesh which is correctly flagged as an animated object (so it has a skeleton associated with it), and contains the necessary animations and controlling scripts in its own inventory (Contents tab of the Build floater) required for it to animate itself.
The capability will most likely include a new flag added to an existing rigged object type in order for the object to be given its own skeleton.
At this point in time, this is not about adding fully functional, avatar-like non-player characters (NPCs) to Second Life.
Animated objects will not (initially):
Have an avatar shape associated with them
Make use of an avatar-like inventory (although individual parts can contain their own inventory such as animations and scripts)
Make use of the server-side locomotion graph for walking, etc., and so will not use an AO
Use the avatar baking service
The project may be extended in the future.
It will involve both back-end and viewer-side changes, likely to encompass new LSL commands to trigger and stop animations (held in the object’s contents).
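As a rough illustration of how such scripted control might work – with the caveat that the function names below are purely hypothetical, as the actual LSL API had not been published at the time of this meeting – a simple controlling script held in an Animesh object’s contents might look something like this:

```lsl
// HYPOTHETICAL sketch only - these function names are placeholders,
// not confirmed LSL calls. The idea is that the script and the
// animation ("dance" here) both sit in the object's own inventory.
default
{
    state_entry()
    {
        // Begin playing an animation from the object's contents
        // on the object's own skeleton.
        llStartObjectAnimation("dance");
    }

    touch_start(integer total_number)
    {
        // Stop the animation when the object is touched.
        llStopObjectAnimation("dance");
    }
}
```

Again, this is only a sketch of the intended model – animations and scripts living in the object’s inventory, with LSL starting and stopping them – rather than a statement of the final API.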
[pre-video start and 46:06-46:40] Work on the viewer is focused on cleaning up how the viewer handles animation when dealing with Animesh objects, as there are some elements which simply aren’t used. The transform matrix has been adjusted, so that Animesh attachments now look as if they are attached to the avatar, rather than potentially floating somewhere reasonably close to it (so an Animesh object attached to a hand will now appear attached to the hand and move with the hand, for example). Further work is required, but things are now behaving a lot better; there’s still no ETA on the appearance of the project viewer, however.
Some basic constraints on attaching Animesh objects to an avatar or in-world, and on the overall allowable complexity of Animesh objects have yet to be defined and incorporated into the viewer. These are necessary to prevent Animesh objects negatively impacting performance to an undue degree. The initial constraints set within the project viewer will be subject to refinement in testing.
[32:21-33:06] In terms of avatar attachments, there is already a server-side limit on how many attachments can currently be worn by an avatar (38), so the Lab could look at the type of attachments being used, and limit Animesh in terms of an allowed number within this global attachment limit (e.g. 2 out of the 38 global limit for attachments may be Animesh).
Alexa provided a couple of GIFs demonstrating Animesh. The first showed her army of dancing bears – around 100 – all happily dancing on a region without causing an appreciable load.
[13:39-16:54] However, whether populated regions (in terms of avatars and objects) could handle such a load is open to question. Also, these bears are all the same optimised mesh, and are probably not placing the kind of load on the simulator and viewer as would be the case with multiple and different kinds of mesh with varying levels of complexity. To help determine reasonable limits, the Lab will be establishing some test regions once the project viewer is available, to allow for more comprehensive testing with assorted Animesh models, which will be used to further refine the Animesh constraints noted above.
[18:11-18:40] As a part of her own testing, Alexa also intends to use the mesh starter avatars produced by the Lab, mixing them together in a scene using different animations, etc., to see how things behave.
Animesh and Pathfinding
[10:35-11:14] A couple of previous meetings have raised the idea of Animesh working with Pathfinding (the means by which scripted objects – people, animals, monsters, etc. – can be set to move in a region / parcel and react to avatars, etc.). Dan Linden is looking into how Animesh and Pathfinding might work together, and he and Alexa shared a GIF image of some of the work, with some of Alexa’s dancing bears skating around their own pathfinding routes, which provided a quick demonstration that the two can likely be used together.
Alexa has also been experimenting with Animesh and ice-skating, taking the view that creatures and animals ice-skating could enhance winter scenes (skating penguins and the like are actually fairly common in “wintered” regions).
Animesh Attachments: Fluidity and Clothing
How smoothly attached Animesh objects work is liable to be dependent upon the animations themselves, and whether or not they move the object’s pelvis bone. As with all things, some experimentation and fine tuning is likely to be required by creators in order to optimise the motions of their Animesh attachments.
Some people have been looking at Animesh as a means to get clothing to move more naturally with an avatar. However, as Vir pointed out in the meeting, utilising additional skeletons in clothes may not be the most efficient way to achieve this, when it should be possible – with a little care – to use some of the existing bones in the avatar skeleton to achieve the same results (e.g. skinning the cloth of a gown or skirt or cape to the wing bones, etc.).
This would allow the clothing to move far more seamlessly and in sync with body movements than could be achieved with Animesh attachments, which would not have any direct means of syncing with an avatar’s movement.
[1:39-3:03] Vir has been working on aligning the root joint of a skeleton associated with an Animesh object to the position of the object in-world. Sometimes this gives good results, sometimes it doesn’t, resulting in objects jumping around when an animation is played, or rising into the air or sinking into the ground, etc., as the server thinks they are somewhere other than the visual position shown in the viewer. Because of the mixed results, he’s considering alternative approaches, such as using the object’s bounding box, and will be exploring options before pushing out any project viewer. One of the balances he’s trying to achieve is presenting a nice visual result without over-complicating what has to be done in order to achieve that result.
Changing the Orientation of the Skeleton via LSL / LSL Commands
[34:12-34:45] Will there be a scripted function to change the default orientation of a skeleton to match an Animesh object? Conceivably; however, Vir is hoping to develop a reasonable default behaviour when attaching a skeleton which will allow simple editing of the object to achieve the desired result, if required. Should this be shown not to work sufficiently well, additional LSL support will likely be looked at.