Category Archives: Other Worlds

OSCC 2016: call for proposals and volunteers

The 2016 OpenSimulator Community Conference (OSCC) will take place on Saturday 10th and Sunday 11th December 2016.

OSCC is an annual conference that focuses on the developer and user community creating the OpenSimulator software. Organised as a joint production by Core Developers of OpenSimulator and AvaCon, Inc., and sponsored by the University of California, Irvine, Institute for Virtual Environments and Computer Games, the conference features presentations, panels, keynote sessions, and social events across diverse sectors of the OpenSimulator user base.

Call for Proposals

The Conference for 2016 will feature a series of dynamic short presentations and panels that spotlight the best of the OpenSimulator platform and community, and a Call for Proposals has been issued to individuals or groups who are shaping the Metaverse.

The focus for the 2016 event is visions for the future and the evolution of the platform, with 20-minute sessions available for speakers, while community-sponsored tours, content give-aways and Hypergrid explorations take attendees to far-away places. The organisers encourage presentations that span current innovations and activities, performance artistry, educational simulations, or innovative business cases, particularly those with a publication or track record of real-world use.

Those wishing to participate directly in the conference as speakers can do so via the following tracks:

  • Creative
  • Education
  • Technical
  • Experiential
  • Other

All proposals should be submitted using the conference Proposal Submission form, and any questions should be directed to the conference organisers.

The key dates for proposals are:

  • October 9th, 2016, 11:59 PST: deadline for proposals
  • October 23rd, 2016: acceptance e-mails dispatched by the conference organisers to accepted speakers
  • October 30th, 2016: accepted speakers must register for the conference in order to be included in the conference schedule and programme
  • November 19th, 2016: presenter Orientation & Training sessions to prepare speakers for the conference and to set up Presenter Booths. Any custom content, props, and audio-visuals must be submitted by this date to be included in the conference programme
  • December 10th – 11th, 2016: 2016 OpenSimulator Community Conference
  • December 11th, 2016: OSCC Hypergrid tour and other community events

Community Social Events

A key part of OSCC is the social events held in the run-up to, and around the dates of, the conference itself. Those interested in running / hosting a social event should complete the Community Events Sign-up page. There will also be limited space available on the OSCC conference grid for those who would like to host an OSCC meet-up or an after-conference event on Sunday, December 11th; please contact the conference organisers with any questions.


Volunteers

The conference needs volunteers to help in a range of activities:

  • Greeters / audience assistants
  • Moderators
  • Builders
  • Scripters
  • Social Media / Communications
  • Streaming and Technical Support

Those interested in volunteering can do so via the Volunteer Sign-up form. Depending upon their interests, volunteers can select more than one role if they wish.

About the Conference

The OpenSimulator Community Conference is an annual conference that focuses on the developer and user community creating the OpenSimulator software. The conference is a joint production by Core Developers of OpenSimulator and AvaCon, Inc., a 501(c)(3) non-profit organization dedicated to promoting the growth, enhancement, and development of the metaverse, virtual worlds, augmented reality, and 3D immersive and virtual spaces. The conference features presentations, panels, keynote sessions, and social events across diverse sectors of the OpenSimulator user base.

High Fidelity moves to “open beta”

Tuesday, April 27th saw High Fidelity move to an “open beta” phase, with a simple Twitter announcement. Having spent just over a year in “open alpha” (see my update here), the company and the platform have been making steady progress over the course of the last 12 months, offering increasing levels of sophistication and capability – some of which might actually surprise those who have never set foot inside High Fidelity but are nevertheless willing to offer comments concerning it.

I’ve actually neglected updating on HiFi for a while, my last report having been at the start of March. However, even since then, things have moved on at quite a pace. The website has been completely overhauled and given a modern “tile” look (something which actually seems to be a little derivative rather than leading edge – even Buckingham Palace has adopted the approach).

High Fidelity open beta requirements

The company has also hired Caitlyn Meeks, former Editor in Chief of the Unity Asset Store, as their Director of Content, and she has been keeping people apprised of progress on the platform at a rapid pace, with numerous blog posts including technical overviews of new capabilities, as well as coverage of more social aspects of the platform – including dispelling the myth that High Fidelity is rooted in “cartoony” avatars; the platform in fact takes a fairly free-form approach to avatars and to content.

High Fidelity may not be as sophisticated in terms of overall looks and content – or user numbers – as something like Second Life or OpenSim, but it is grabbing a lot of media attention (and investment) thanks to its very strong focus on the emerging range of VR hardware systems, and the beta announcement is timed to coincide with the anticipated increasing availability of the first generation of HMDs from Oculus VR and HTC. Indeed, while the platform can be used without HMDs and their associated controllers, High Fidelity describe it as being “better with” such hardware.

High Fidelity avatars

I still tend to be of the opinion that, over time, VR won’t be as disruptive in our lives as the likes of Mixed / Augmented Reality as these gradually mature; as such, I remain sceptical that platforms such as High Fidelity and Project Sansar will become as mainstream as their creators believe, rather than simply vying for as much space as they can claim in similar (if larger) niches to that occupied by Second Life.

And even if VR does grow in a manner similar to that predicted by analysts, it still doesn’t necessarily mean that everyone will be leaping into immersive VR environments to conduct major aspects of their social interactions. As such, it will be interesting to see what kind of traction High Fidelity gains over the course of the next 12 months, now that it might be considered to be moving towards maturity – allowing for things like media coverage, etc., of course.

Which is not to say the capabilities aren’t getting increasingly impressive, as the video below shows – just look at the way Caitlyn’s avatar interacts with the “camera” of our viewpoint!




Amazon Lumberyard

Image source: Amazon

Lumberyard is the name of Amazon’s new game engine, released on Tuesday, February 9th. Based on Crytek’s CryEngine, which Amazon licensed in 2015, Lumberyard will apparently be developed in its own direction, independently of CryEngine, and is being provided as a free-to-download tool (with optional asset packs) which can be used to develop games for PCs and consoles on a “no seat fees, subscription fees, or requirements to share revenue” basis.

Instead, Amazon will monetise Lumberyard through the use of AWS cloud computing. If you use the game engine for your own game and opt to run it on your own server, then that’s it: no fees. But if you want to distribute through a third-party provider, you can only use Amazon’s services: either GameLift, a managed service for deploying, operating, and scaling server-based on-line games using AWS, at a cost of $1.50 per 1,000 daily active users; or, if you prefer, AWS directly, at normal AWS service rates.
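To put that pricing model in concrete terms, here is a minimal sketch of the GameLift fee arithmetic as described above; the $1.50 per 1,000 daily active users rate is the published figure, while the function name and the 30-day month are my own illustrative assumptions.

```python
# GameLift's published rate at launch: $1.50 per 1,000 daily active users.
GAMELIFT_RATE_PER_DAU = 1.50 / 1000

def monthly_gamelift_cost(daily_active_users: int, days: int = 30) -> float:
    """Estimate the GameLift fee (in dollars) for a month of steady traffic.

    Illustrative helper only -- not part of any Amazon SDK.
    """
    return daily_active_users * GAMELIFT_RATE_PER_DAU * days

# e.g. a game averaging 50,000 DAU over a 30-day month:
print(f"${monthly_gamelift_cost(50_000):,.2f}")  # $2,250.00
```

Self-hosting on your own server, by contrast, would incur none of this; the fee only applies when Amazon operates the game servers for you.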

Lumberyard includes a customisable drag-and-drop UI (image: Amazon)

As well as AWS integration, new low-latency networking code to support it, and native C++ access to its services, Lumberyard has deep, built-in support for Twitch (purchased by Amazon in 2014 for $970 million), including “Twitch play”-style chat commands and a function called JoinIn, which allows viewers to leap directly into on-line games alongside Twitch broadcasters as they stream. The aim here, according to Mike Frazzini, vice president of Amazon Games, when talking to Gamasutra, is “creating experiences that embrace the notion of a player, broadcaster, and viewer all joining together.”

Described as a triple-A games development engine, Lumberyard has already seen many of the CryEngine systems upgraded or replaced, including the implementation of an entirely new asset pipeline and processor and low-latency networking code – hence why Lumberyard will diverge from CryEngine’s core development. And Amazon is promising more to come, including a new component system, a particle editor, and CloudCanvas, which will allow developers to set up server-based in-game events in AWS using visual scripting.

"Alien Abode" a game scene rendered in Lumberjack (:image: Amazon)

“Alien Abode” a game scene rendered in Lumberyard (:image: Amazon)

All of which adds up to a very powerful games development environment – although Amazon are clear that, right now, it is only in beta. This means that things are liable to undergo tweaking, etc., and that some capabilities – such as Oculus Rift support – haven’t been enabled for the current version of the engine. However, VR support is coming, with Amazon noting:

We have been actively working on VR within Lumberyard for some time now, and it looks great. We are currently upgrading our Oculus VR support to Rift SDK 1.0, which was released by Oculus in late December. We wanted to finish upgrading to Rift SDK 1.0 before releasing the first public version of VR support within Lumberyard, which will be included in a future release soon.

Further, Amazon has already signed official tools deals with Microsoft and Sony, which means game developers licensed to develop games for the Xbox One and PlayStation 4 can immediately start using Lumberyard to develop games for those platforms.

There are – for some – a few initial downsides to Lumberyard where independent game developers are concerned. At launch, the engine only supports models created in Maya and 3ds Max, although this may change – Blender support is promised for the future, for example. There is also no support for Mac or Linux, although Amazon have indicated that these will come, along with iOS and Android support.

Use of the engine includes the right to redistribute it and pieces of the development environment within games, and allows game developers to release companion products for a game built with Lumberyard which allow end users to modify and create derivative works of that game.

The CryEngine SDK is one of the Asset Packs available for download for use with Lumberyard (image: Amazon)

As noted above, the company has already started supplying asset packs developers can include in their games. Three packs are available at launch, including the CryEngine GameSDK, which contains everything required for a first-person shooter game – complex animated characters, vehicles and game AI – together with a sample level.

Amazon clearly have major plans for Lumberyard, and some in the gaming media are already wondering what it might do to the current development environment, which is largely dominated by the likes of Unity, Unreal Engine, and even CryEngine itself, all of which require either a licence fee or a royalty fee.

Is Lumberyard competition for the Lab’s Project Sansar? The engine certainly has the ability to create immersive environments, and Lumberyard will support VR HMDs as it moves forward, as noted.

However, everything about Lumberyard points to it being pitched as a professional games development environment with a dedicated distribution service through Amazon’s cloud services available for use with it. Hence, again, why Twitch is deeply integrated into Lumberyard – Amazon appear to be a lot more interested in building an entire gaming ecosystem. Amazon’s marketing is also geared towards gaming, as their promotional video (below) shows.

Which is not to say that it couldn’t be attractive to markets outside of gaming. As such, it will be interesting to see over time just who does take an interest in it – and how Amazon might support them.

With thanks to John for the pointer to Amazon.


High Fidelity: “VR commerce”, 200 avatars and scanning faces

High Fidelity have put out a couple of blog posts on their more recent work, both of which make for interesting reading.

In Update from our Interns, they provide a report by Edgar Pironti and Alessandro Signa, two of High Fidelity’s interns, who have been working on a number of projects within the company, including developing a single software controller for mapping inputs from a range of hand controllers, with the initial work involving the Razer Hydra, the HTC Vive’s controllers and the Space Navigator. They also discuss working on recording and playback functionality, which is expanded upon in the second blog post which caught my eye: the January newsletter, issued by Chris Collins.

This work has involved developing the ability to pre-record the data stream of an avatar – audio, facial expressions and body movement – in a format which can later be played back on a server under the control of JavaScript. As Chris notes, this makes it very easy to quickly populate a VR experience with compelling pre-recorded avatar content, allowing life-like characters to be added to a place, or for use within a machinima film.
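Conceptually, such a recording is just a timestamped stream of avatar frames replayed in real time. The sketch below is my own illustration of that idea, not High Fidelity’s actual API or format: every class and field name here is hypothetical, and in High Fidelity the `apply` callback’s role would be played by a server-side JavaScript.

```python
import time
from dataclasses import dataclass, field

@dataclass
class AvatarFrame:
    """One hypothetical sample of an avatar's data stream."""
    t: float            # seconds from the start of the recording
    joints: dict        # joint name -> rotation/position data
    blendshapes: dict   # facial expression coefficients
    audio: bytes = b""  # audio chunk captured with this frame

@dataclass
class AvatarRecording:
    """A pre-recorded avatar stream that can be replayed on demand."""
    frames: list = field(default_factory=list)

    def record(self, frame: AvatarFrame) -> None:
        self.frames.append(frame)

    def play(self, apply) -> None:
        """Replay frames in real time, handing each to an `apply` callback."""
        start = time.monotonic()
        for frame in self.frames:
            # Sleep until this frame's timestamp comes due, then apply it.
            delay = frame.t - (time.monotonic() - start)
            if delay > 0:
                time.sleep(delay)
            apply(frame)
```

The appeal of this structure is that one recording can drive any number of server-side “life-like characters” simultaneously, which is presumably what makes it useful for populating a scene or shooting machinima.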

For their third reported project, Edgar and Alessandro discuss how they’ve been looking into creating a “VR commerce” environment. This combines elements of physical-world shopping – sharing it with friends, actually grabbing and trying items, having discussions with sales staff, etc. – with the convenience of e-shopping, such as quickly changing colours, seeing customer reviews and feedback, and so on. As well as explaining how they went about the task, Edgar and Alessandro have put together a video demonstration:

In the High Fidelity newsletter, Chris Collins covers a number of topics, including work on optimising avatar concurrency on a single High Fidelity server. While this work most likely used Edgar’s and Alessandro’s approach to pre-recording avatar data streams, mentioned above, the initial results are impressive: 200 avatars on a server, of which the 40 nearest the observer’s viewpoint are rendered at 75Hz when using the Oculus Rift, and with a very high level of detail, including full facial animations.

200 avatars on a High Fidelity server as the company starts work on optimising avatar concurrency (image: High Fidelity)
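The “40 nearest the observer” approach is a classic distance-based render budget: with a crowd too large to render at full fidelity, sort avatars by distance from the viewpoint and give only the closest N the full 75Hz treatment. The sketch below is my own illustration of that selection step, not High Fidelity code; the data shapes and function name are assumptions for the example.

```python
import math

def nearest_avatars(observer, avatars, n=40):
    """Return the n avatars closest to the observer's viewpoint.

    `observer` is an (x, y, z) tuple; each avatar is a dict with a
    "position" tuple. Illustrative only -- not High Fidelity's engine code.
    """
    return sorted(avatars, key=lambda a: math.dist(observer, a["position"]))[:n]

# e.g. 200 avatars strung out along a line; keep the 40 nearest the origin.
crowd = [{"id": i, "position": (float(i), 0.0, 0.0)} for i in range(200)]
front = nearest_avatars((0.0, 0.0, 0.0), crowd)
print(len(front), front[0]["id"])  # 40 0
```

In practice an engine would recompute this selection as the observer moves, and degrade the remaining 160 avatars gracefully (lower frame rate, simpler meshes) rather than culling them outright.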

One of the things that High Fidelity has often been critiqued for by SL users is the cartoon-like avatars which were first shown as the company gradually cracked open its doors. These are still in use, but a lot of work has also been put into making the avatars more life-like, should users so wish. However, there is a trade-off here, which has been discussed in the past: the so-called uncanny valley effect – when facial features look and move almost, but not exactly, like those of natural beings, they can provoke a response of revulsion among some observers.

This has tended to make those investigating things like avatar usage cautious about pushing too close to any kind of genuine realism in their avatars, and Philip Rosedale has discussed High Fidelity’s own gentle pushing at the edge of the valley. Now it seems the company is stepping towards the edge of the valley once more, using 3D scans of people to create avatar faces, as Chris notes in the newsletter:

We scanned our whole team using two different 3D scanners (one for hair and one for the face), and then used some simple techniques to make the results not be (too) uncanny … Although there is still a lot of work to do, we achieved these results with less than 2 hours of work beyond the initial scans, and the effect of having our team meetings using our ‘real’ avatars is quite compelling.

High Fidelity’s staff group photo using scans of their own hair and faces (image: High Fidelity)

Not everyone is liable to want to use an avatar bearing a representation of their own face, and the idea of using such a technique does raise issues around identity, privacy, etc., which should be discussed; but High Fidelity’s work in this area is intriguing. That said, and looking at the staff group photo, I would perhaps agree there is still more work needed; I’m not so much concerned about the technique pushing towards the edge of the uncanny valley as I am about the avatars in the photo looking just that little bit odd.

Also in the update are a discussion on audio reverb, a look at how the use of particles can be transformed when you’re able to see your “hands” manipulating them, and a look at a purpose-built game in High Fidelity. These are all shown in the video accompanying the January newsletter, which I’ll close with.