Category Archives: Other Worlds

Support Rock Your Rack in Second Life

Image via Rock Your Rack

Rock Your Rack, the annual charity event raising funds for the US National Breast Cancer Foundation, organised and presented by Models Giving Back, will this year take place between Saturday, October 1st and Sunday, October 16th inclusive. The organisers are currently seeking designers, entertainers and bloggers wishing to take part in and support the event.

Rock Your Rack is a combined fashion and entertainment event, offering the best of both to visitors, with designer booths, fashion shows, live performances and DJ sessions. In addition, there will be a range of supporting activities, including:

  • A silent auction featuring one-of-a-kind items from Rock Your Rack designers who will give 100% of the sales to the cause
  • A Rock Your Rack Hunt featuring collectables offered by participating designers at L$10 each – with all proceeds going directly into the fund-raising
  • An art show and auction organised by Windlight Magazine on behalf of Models Giving Back.

Designer applications for the event have been open since mid-June, and will remain so until Sunday, July 31st or until the remaining slots are taken. If you are a fashion designer and would be interested in participating in Rock Your Rack, please visit the Rock Your Rack Designers information page.

If you are a DJ or live performer and would like to donate your time to the event, registrations are now open, and will run until Tuesday, August 30th (unless all slots are filled before then).

  • DJ slots are available on the Saturdays and Sundays of the event only, for either dance parties or fashion shows. Those interested should check the Rock Your Rack Entertainments page, and then complete the DJ registration form
  • Live entertainment slots are available for the Saturdays and Sundays of the event, as well as for weekday events. Those interested should check the Rock Your Rack Entertainments page, and then complete the entertainer registration form

Note that all DJs and entertainers applying must have their own music stream, as none can be provided.

Bloggers interested in covering the event have from now until Sunday, July 31st to apply. Do note that, as the event involves both fashion and entertainment and requires both to be covered, Rock Your Rack has a detailed set of blogging requirements. Those interested in applying to be an event blogger are asked to read the Rock Your Rack blogger’s information page prior to submitting an application.

Artists who would like to participate in the Rock Your Rack Art Show and Auction should refer to the entry requirements on the Art Show application form.

For more information on Rock Your Rack, including a complete timeline covering the run-up to the event, as well as information on Models Giving Back, please visit the Rock Your Rack website.

High Fidelity moves to “open beta”

Tuesday, April 27th saw High Fidelity move to an “open beta” phase, with a simple Twitter announcement. Having spent just over a year in “open alpha” (see my update here), the company and the platform have been making steady progress over the course of the last 12 months, offering increasing levels of sophistication and capability – some of which might actually surprise those who have not set foot inside High Fidelity but are nevertheless willing to offer comments concerning it.

I’ve actually neglected updating on HiFi for a while, my last report having been at the start of March. However, even since then, things have moved on at quite a pace. The website has been completely overhauled and given a modern “tile” look (something which actually seems a little derivative rather than leading edge – even Buckingham Palace has adopted the approach).

High Fidelity open beta requirements

The company has also hired Caitlyn Meeks, former Editor-in-Chief of the Unity Asset Store, as their Director of Content, and she has been keeping people apprised of progress on the platform at a huge pace, with numerous blog posts, including technical overviews of new capabilities as well as coverage of the more social aspects of the platform – including dispelling the myth that High Fidelity is rooted in “cartoony” avatars by highlighting its fairly free-form approach to avatars and to content.

High Fidelity may not be as sophisticated in terms of overall looks and content – or user numbers – as something like Second Life or OpenSim, but it is grabbing a lot of media attention (and investment) thanks to its very strong focus on the emerging range of VR hardware systems, and the beta announcement is timed to coincide with the anticipated increasing availability of the first-generation HMDs from Oculus VR and HTC. Indeed, while the platform can be used without HMDs and their associated controllers, High Fidelity describe it as being “better with” such hardware.

High Fidelity avatars

I still tend to be of the opinion that, over time, VR won’t be as disruptive in our lives as the likes of Mixed / Augmented Reality as these gradually mature; as such, I remain sceptical that platforms such as High Fidelity and Project Sansar will become as mainstream as their creators believe, rather than simply vying for as much space as they can claim in similar (if larger) niches to that occupied by Second Life.

And even if VR does grow in a manner similar to that predicted by analysts, it still doesn’t necessarily mean that everyone will be leaping into immersive VR environments to conduct major aspects of their social interactions. As such, it will be interesting to see what kind of traction High Fidelity gains over the course of the next 12 months, now that it might be considered to be moving more towards maturity – allowing for things like media coverage, etc., of course.

Which is not to say the capabilities aren’t getting increasingly impressive, as the video below notes – just look at the way Caitlyn’s avatar interacts with the “camera” of our viewpoint!

 

 


Amazon Lumberyard

Image source: Amazon

Lumberyard is the name of Amazon’s new game engine, released on Tuesday, February 9th. Based on Crytek’s CryEngine, which Amazon licensed in 2015, Lumberyard will apparently be developed in its own direction, independently of CryEngine, and is being provided as a free-to-download tool (with optional asset packs) which can be used to develop games for PCs and consoles on a “no seat fees, subscription fees, or requirements to share revenue” basis.

Instead, Amazon will monetise Lumberyard through the use of AWS cloud computing. If you use the game engine for your own game and opt to run it on your own server, then that’s it: no fees. But if you want to distribute through a third-party provider, you can only use Amazon’s services: either GameLift, a managed service for deploying, operating, and scaling server-based on-line games using AWS, at a cost of $1.50 per 1,000 daily active users; or, if you prefer, AWS directly, at normal AWS service rates.
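To put the GameLift figure in context, the flat per-user rate makes cost estimation trivial. A quick illustrative sketch – the function name, the 30-day month, and the example user count are my own; actual AWS billing granularity, and any charges for the underlying AWS resources, may differ:

```javascript
// Illustrative only: GameLift's stated fee of $1.50 per 1,000 daily
// active users (DAU). Function name and assumptions are mine; normal
// AWS usage charges for the underlying resources apply on top.
function gameLiftDailyFee(dailyActiveUsers) {
  const ratePerThousandDAU = 1.5; // USD, as quoted at launch
  return (dailyActiveUsers / 1000) * ratePerThousandDAU;
}

// Example: a game averaging 50,000 daily active users
const dailyFee = gameLiftDailyFee(50000); // 75 USD per day
const monthlyFee = dailyFee * 30;         // 2,250 USD over a 30-day month
console.log(dailyFee, monthlyFee);
```

In other words, the managed-service fee scales linearly with audience size, rather than with revenue, which is why the pricing drew attention in the gaming press.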

Lumberyard includes a customisable drag-and-drop UI (image: Amazon)

As well as AWS integration, the development of new low-latency networking code to support it, and native C++ access to its services, Lumberyard has deep, built-in support for Twitch (purchased by Amazon in 2014 for $970 million), including “Twitch Plays”-style chat commands and a function called JoinIn, which allows viewers to leap directly into on-line games alongside Twitch broadcasters as they stream. The aim here, according to Mike Frazzini, vice president of Amazon Games, when talking to Gamasutra, is “creating experiences that embrace the notion of a player, broadcaster, and viewer all joining together.”

Described as a triple-A games development engine, Lumberyard has already seen many of the CryEngine systems upgraded or replaced, including the implementation of an entirely new asset pipeline and processor and low-latency networking code – hence why Lumberyard will diverge from CryEngine’s core development. And Amazon is promising more to come, including a new component system and particle editor, and CloudCanvas, which will allow developers to set up server-based in-game events in AWS using visual scripting.

“Alien Abode”, a game scene rendered in Lumberyard (image: Amazon)

All of which adds up to a very powerful games development environment – although Amazon are clear that, right now, it is only in beta. This means that things are liable to undergo tweaking, etc., and that some capabilities – such as Oculus Rift support – haven’t been enabled for the current version of the engine. However, VR support is there, with Amazon noting:

We have been actively working on VR within Lumberyard for some time now, and it looks great. We are currently upgrading our Oculus VR support to Rift SDK 1.0, which was released by Oculus in late December. We wanted to finish upgrading to Rift SDK 1.0 before releasing the first public version of VR support within Lumberyard, which will be included in a future release soon.

Further, Amazon has already signed official tools deals with Microsoft and Sony, which means game developers licensed to develop games for the Xbox One and PlayStation 4 can immediately start using Lumberyard to develop games for those platforms.

There are – for some – a few initial downsides to Lumberyard where independent game developers are concerned. At launch, the engine only supports models created in Maya and 3ds Max, although this may change – Blender support is promised for the future, for example. There is also no support for Mac or Linux, although Amazon have indicated that these will come, along with iOS and Android support.

Use of the engine includes the right to redistribute it and pieces of the development environment within games, and allows game developers to release companion products for a game built using Lumberyard which allow end users to modify and create derivative works of that game.

The CryEngine SDK is one of the Asset Packs available for download for use with Lumberyard (image: Amazon)

As noted above, the company has already started supplying asset packs developers can include in their games. Three packs are available at launch, including the CryEngine GameSDK, which contains everything required for a first-person shooter game – complex animated characters, vehicles and game AI – and which includes a sample level.

Amazon clearly have major plans for Lumberyard, and some in the gaming media are already wondering what it might do to the current development environment, which is largely dominated by the likes of Unity, Unreal Engine, and even CryEngine itself, all of which require either a licence fee or a royalty fee.

Is Lumberyard competition for the Lab’s Project Sansar? The engine certainly has the ability to create immersive environments, and Lumberyard will support VR HMDs as it moves forward, as noted.

However, everything about Lumberyard points to it being pitched as a professional games development environment, with a dedicated distribution service through Amazon’s cloud services available for use with it. Hence, again, why Twitch is deeply integrated into Lumberyard – Amazon appear to be far more interested in building an entire gaming ecosystem. Amazon’s marketing is also geared towards gaming, as their promotional video (below) shows.

Which is not to say that it couldn’t be attractive to markets outside of gaming. As such, it will be interesting to see over time just who does take an interest in it – and how Amazon might support them.

With thanks to John for the pointer to Amazon.


High Fidelity: “VR commerce”, 200 avatars and scanning faces

High Fidelity have put out a couple of blog posts on their more recent work, both of which make for interesting reading.

In Update from our Interns, they provide a report by Edgar Pironti and Alessandro Signa, two of High Fidelity’s interns who have been working on a number of projects within the company, including developing a single software controller for mapping inputs from a range of hand controllers, with the initial work involving the Razer Hydra, the HTC Vive’s controllers and the Space Navigator. They also discuss working on recording and playback functionality, which is also expanded upon in the second blog post which caught my eye, the January newsletter, issued by Chris Collins.

This work has involved developing the ability to pre-record the data stream of an avatar – audio, facial expressions and body movement – in a format which can later be played back on a server under the control of JavaScript. As Chris notes, this makes it very easy to quickly populate a VR experience with compelling pre-recorded avatar content, allowing life-like characters to be added to a place, or to be used within a machinima film.
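Conceptually, such a recording is just a timestamped sequence of avatar state frames that can be replayed on demand. The sketch below is a generic illustration of that idea in JavaScript – the names and frame structure are my own invention, not High Fidelity’s actual recording format or API:

```javascript
// Generic illustration of avatar recording/playback: capture an avatar's
// state as timestamped frames (e.g. audio chunks, facial blendshapes,
// joint rotations), then replay them later. Not High Fidelity's real API.
function makeRecorder() {
  const frames = [];
  return {
    capture(timeMs, frame) {
      // frame: a snapshot of avatar state at this instant
      frames.push({ timeMs, frame });
    },
    play(applyFrame) {
      // Replay in capture order; a real player would pace frames
      // against a clock rather than applying them immediately.
      for (const { timeMs, frame } of frames) applyFrame(timeMs, frame);
      return frames.length; // number of frames replayed
    }
  };
}

// Usage: record two snapshots, then "play" them back into a sink.
const rec = makeRecorder();
rec.capture(0,  { joints: { head: [0, 0, 0] } });
rec.capture(33, { joints: { head: [0, 5, 0] } });
const applied = [];
rec.play((t, f) => applied.push(t)); // applied is now [0, 33]
console.log(applied);
```

The appeal of this approach for populating a scene is that playback needs no live client at all: a server script can drive any number of such “canned” avatars from stored frame data.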

As their third reported project, Edgar and Alessandro discuss how they’ve been looking into creating a “VR commerce” environment. This combines elements of physical world shopping – sharing it with friends, actually grabbing and trying items, having discussions with sales staff, etc. – with the convenience of e-shopping, such as quickly changing colours and seeing customer reviews and feedback. As well as explaining how they went about the task, Edgar and Alessandro have put together a video demonstration:

In the High Fidelity newsletter, Chris Collins covers a number of topics, including work on optimising avatar concurrency on a single High Fidelity server. While this work most likely used Edgar’s and Alessandro’s approach to pre-recording avatar data streams, mentioned above, the initial results are impressive: 200 avatars on a server, of which the 40 nearest the observer’s viewpoint are rendered at 75Hz when using the Oculus Rift, with a very high level of detail, including full facial animations.

200 avatars on a High Fidelity server as the company starts work on optimising avatar concurrency (image: High Fidelity)

One of the things High Fidelity has often been critiqued for by SL users is the cartoon-like avatars which were first shown as the company gradually cracked open its doors. These are still in use, but a lot of work has also been put into making the avatars more life-like, should users so wish. However, there is a trade-off here, which has been discussed in the past: the so-called uncanny valley effect – when facial features look and move almost, but not exactly, like those of natural beings, they can cause a response of revulsion among some observers.

This has tended to make those investigating things like avatar usage cautious about pushing too close to any kind of genuine realism in their avatars, and Philip Rosedale has discussed High Fidelity’s own gentle pushing at the edge of the valley. Now it seems the company is stepping towards the edge of the valley once more, using 3D scans of people to create avatar faces, as Chris notes in the newsletter:

We scanned our whole team using two different 3D scanners (one for hair and one for the face), and then used some simple techniques to make the results not be (too) uncanny … Although there is still a lot of work to do, we achieved these results with less than 2 hours of work beyond the initial scans, and the effect of having our team meetings using our ‘real’ avatars is quite compelling.

High Fidelity’s staff group photo using scans of their own hair and faces (image: High Fidelity)

Not everyone is liable to want to use an avatar bearing a representation of their own face, and the idea of using such a technique does raise issues around identity, privacy, etc., which should be discussed; but High Fidelity’s work in this area is intriguing. That said, and looking at the staff group photo, I would perhaps agree there is still more work needed; I’m not so much concerned about the technique pushing towards the edge of the uncanny valley as I am about the avatars in the photo looking just that little bit odd.

Also in the update are a discussion on audio reverb, a look at how the use of particles can be transformed when you’re able to see your “hands” manipulating them, and a look at a purpose-built game in High Fidelity. These are all discussed in the video accompanying the January newsletter, which I’ll close with.