Today, the telephone – in the form of pocket-sized smartphones – is an essential part of most people’s everyday life. Looking at one, it is hard to imagine how far the technology behind this means of long-distance communication has come since its birth in the 1870s.
Of course, we all know something of the history of the telephone, with names like Alexander Graham Bell and Elisha Gray (if not poor Antonio Meucci) being familiar to many of us, if only as a result of our school days. But what is its real story? How did the early telephones work? What have been the various eras of the ‘phone?
Denzel Coy brilliantly and charmingly answers these questions in Second Life through his Telephone Museum. Within it, visitors can explore the telephone’s entire history, from its beginnings with the unfortunate Meucci, the unlucky Gray and the fortunate Bell, through the first box systems, to candlesticks and on to the rotary era – all the way up to the modern cellphone.
This is a fabulous environment for anyone interested in history or technology as well as the telephone. On display are around 50 exquisitely crafted telephones from the last 140 years, made by a number of Second Life creators – Raya Jonson, Jin Zhu, Zaida Gearbox, Neotoy Story and Plato Novo, to name but a handful, as well as Denzel himself. Alongside them are information boards complete with audio playback capabilities, allowing visitors to read or hear the information they contain, together with reproductions of adverts for telephones from the different eras, and more.
The displays are laid out around two levels, with the lower progressing from information on Meucci, Gray and Bell, through to the arrival of rotary dial telephones in the 1920s. These displays are all arranged around a model of the very first telephone device from 1876. From here, visitors can progress to the mezzanine level, and the history of the telephone from the 1950s through to the present day, with a brief detour into the world of the military field telephone.
As well as the audio capabilities, the museum includes a number of interactive elements – including the display case of the aforementioned 1876 device being alarmed against theft! There is also a gacha station, where visitors can obtain a number of items, including some rare models of the ‘phones on display and the Telephone Museum Ultimate Guide, and a trivia competition on the main floor, where people can test the knowledge they’ve gained during their visit.
This is a superb exhibit to visit, perfectly presented in an environment designed by Denzel. Informative and educational, it is also entertaining and offers another look at just how exquisite mesh models can be in Second Life.
The following notes are taken from the Content Creation User Group meeting, held on Thursday February 23rd, 2017 at 1:00pm SLT at the Hippotropolis Campfire Circle. The meeting is chaired by Vir Linden, and agenda notes, etc, are available on the Content Creation User Group wiki page.
HTTP asset fetching
As previously noted, the Lab is working on moving landmarks, gestures, animations, sounds and wearables (system layer clothing) from UDP delivery via the simulator to HTTP delivery via the CDN(s). This work is now progressing to the stage where initial testing is likely to start soon. It’s not clear if this will be internal testing within the Lab, or whether it will involve wider testing (on Aditi) as well. As things progress, expect the viewer-side changes to appear in a project viewer and then progress through the normal route of testing / update to RC and onwards towards release.
Potential Project: Animated Objects
As noted in my last Content Creation UG meeting notes, the Lab is taking a speculative look at using the current avatar skeleton to animate in-world objects, providing a means for users to more easily create animated objects (e.g. non-player characters (NPCs), plants and trees responding to a breeze, mesh animals which do not rely on performance-hitting alpha swapping, etc.) – see feature request BUG-11368 for some of the ideas put forward which helped prompt the Lab’s interest.
It is important to note that this is still a speculative look at the potential; there is no confirmed project coming off the back of it. The Lab is currently seeking feedback on how people might use the capability, were it to be implemented. No in-depth consideration has been given to how such a capability would be supported on the back end, or what changes would be required to the viewer.
One of the many issues that would need to be worked through is just the simple matter of how an object might be animated to achieve something like walking, running or flying. These require the simulator to make certain assumptions when handling an avatar which are not a part of object handling. There’s also the question of how the skeleton would be applied to an object.
Having animated objects does give rise to concerns over potential resource / performance impacts. For example, someone with a dozen animated pets running around them as animated objects could potentially have the same resource / performance overheads as thirteen actual avatars in a region.
One possible offset to this (although obviously, the two aren’t equivalent) is that mesh animals / objects which currently use a lot of alpha flipping to achieve different “states” of “animation” (such as the squirrel which can jump from the ground, swing on a nut holder and jump back down again, or the peek-a-boo baby bears, etc., all of which are popular in gardens and public regions) could be made a lot more efficient were they to be animated, as the performance-hitting alpha swapping could be abandoned.
It was suggested that rather than having the full skeleton available for animated objects, it might be possible to use a sub-set of bones, or even the pre-Bento skeleton. Agreeing that this might be done, Vir pointed out that using the full skeleton would perhaps offer the most flexible approach, and also allow the re-use of existing content, particularly given that things like custom skeletons (also mooted) would be too big a project to undertake.
Applying Baked Textures to Mesh Avatars
Interest is increasing in this potential project, which would allow baked textures – skins and wearable clothing layers – to be applied directly to mesh avatars via the baking service. This has also yet to be officially adopted by the Lab as a project, but there is considerable interest internally in the idea.
As I’ve previously reported, there is considerable interest in this idea, as it could greatly reduce the complexity of mesh avatar bodies by removing the need for them to be “onion skinned” with multiple layers. However, as I noted in that report, a sticking point is that currently, the baking service is limited to a maximum texture resolution of 512×512, whereas mesh bodies and parts (heads, feet, hands) can use 1024×1024.
There is concern that if the baking service isn’t updated to also support 1024×1024 textures, it would not be used, as skins and wearables using it would appear to be of lower resolution quality than can be achieved when using applier systems on mesh bodies. Vir expressed doubt as to whether the detail within 1024×1024 textures is really being seen unless people are zoomed right in on other avatars, which for most of the time we’re going about our activities in SL, isn’t the case.
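To put the resolution gap in concrete terms, here is a rough back-of-the-envelope illustration (not Second Life code): a 1024×1024 texture carries four times the pixels of a 512×512 one, and so roughly four times the uncompressed memory, which is the kind of overhead the baking service would have to absorb per wearable layer.

```python
# Rough illustration only: uncompressed in-memory footprint of square
# RGBA textures at the two resolutions under discussion.

def texture_bytes(side: int, bytes_per_pixel: int = 4) -> int:
    """Uncompressed size of a square RGBA texture, side x side pixels."""
    return side * side * bytes_per_pixel

small = texture_bytes(512)    # 1,048,576 bytes (1 MiB)
large = texture_bytes(1024)   # 4,194,304 bytes (4 MiB)

print(f"512x512:   {small / 2**20:.0f} MiB uncompressed")
print(f"1024x1024: {large / 2**20:.0f} MiB uncompressed")
print(f"ratio:     {large // small}x")
```

Actual network and storage costs would be lower (SL textures are JPEG2000-compressed), but the fourfold pixel count is the fixed part of the equation.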
This led to a lengthy mixed text / voice discussion on texture resolution and extending the baking service to support mesh avatars (were it to go ahead), which essentially came down to two elements:
The technical aspects of whether or not we actually get to see the greater detail in 1024×1024 textures most of the time we’re in-world, and of re-working the baking service to support 1024×1024 across all wearable layers, from skin up through to jacket.
The sociological aspect of whether or not people would actually use the baking service route with mesh avatars, if the texture resolution were left at 512×512, because of the perceived loss of detail involved.
Various compromises were put forward to try to work around the additional impact of updating the baking service to support 1024×1024 textures. One of these was that body creators might provide two versions of their products if they wish: one utilising appliers and 1024×1024 textures as is the case now, and the other supporting the baking service and system layers at 512×512, then leave it to users to decide what they want to use / buy. Another was a suggestion that baking service support could be initially rolled out at 512×512 and then updated to 1024×1024 support if there was a demand.
None of the alternative suggestions were ideal (in the two above, for example, creators are left having to support two product ranges, which could discourage them; while the idea of leaving the baking service at 512×512 falls into the sociological aspect of non-use mentioned previously). Currently, Vir appears to be leaning more towards updating the baking service to 1024×1024, were the project to be adopted, but the overheads in doing so still need to be investigated and understood.
.ANIM Exporter for Maya
Cathy Foil indicated that Aura Linden has almost finished work on the .ANIM exporter she’s been developing for Maya. The hope is that the work will be completed in the next week or so. Cathy also indicated that, in keeping with Medhue Simoni’s advice from a few weeks ago (see .BVH Animations and Animation Playback), she was able to overcome some of the issues being experienced with fine-tuning .BVH animation playback, although there are still problems.
The .ANIM exporter will be available for anyone using Maya, and is not something dependent upon Mayastar.
Avastar 2.0 in RC
The upcoming fully Bento compliant version of Avastar is now available as a release candidate.
Tapple Gao has been looking at IK (Inverse Kinematics) constraints within Second Life. These aren’t widely used within existing animations – although up to about eight constraints can be defined – largely because the documentation doesn’t appear to be too clear. Tapple hopes to improve this through investigation and then updating the SL wiki.
The next Content Creation meeting will be in two weeks, on Thursday, March 9th, at 13:00 SLT.