High Fidelity moves to “open beta”

Tuesday, April 27th saw High Fidelity move to an “open beta” phase, with a simple Twitter announcement. Having spent just over a year in “open alpha” (see my update here), the company and the platform have been making steady progress over the course of the last 12 months, offering increasing levels of sophistication and capability – some of which might actually surprise those who have not set foot inside High Fidelity but are nevertheless willing to offer comments concerning it.

I’ve actually neglected updating on HiFi for a while, my last report having been at the start of March. However, even since then, things have moved on at quite a pace. The website has been completely overhauled and given a modern “tile” look (something which actually seems a little derivative rather than leading edge – even Buckingham Palace has adopted the approach).

High Fidelity open beta requirements

The company has also hired Caitlyn Meeks, former Editor in Chief of the Unity Asset Store, as their Director of Content, and she has been keeping people apprised of progress on the platform at a huge pace, with numerous blog posts, including technical overviews of new capabilities as well as coverage of more social aspects of the platform – including dispelling the myth that High Fidelity is rooted in “cartoony” avatars, when it in fact has a fairly free-form approach to avatars and to content.

High Fidelity may not be as sophisticated in terms of overall looks and content – or user numbers – as something like Second Life or OpenSim, but it is grabbing a lot of media attention (and investment) thanks to its very strong focus on the emerging range of VR hardware systems, and the beta announcement is timed to coincide with the anticipated increasing availability of the first-generation HMDs from Oculus VR and HTC. Indeed, while the platform can be used without HMDs and their associated controllers, High Fidelity describe it as being “better with” such hardware.

High Fidelity avatars

I still tend to be of the opinion that, over time, VR won’t be as disruptive in our lives as the likes of Mixed / Augmented Reality as these gradually mature; as such, I remain sceptical that platforms such as High Fidelity and Project Sansar will become as mainstream as their creators believe, rather than simply vying for as much space as they can claim in similar (if larger) niches to that occupied by Second Life.

And even if VR does grow in a manner similar to that predicted by analysts, it still doesn’t necessarily mean that everyone will be leaping into immersive VR environments to conduct major aspects of their social interactions. As such, it will be interesting to see what kind of traction High Fidelity gains over the course of the next 12 months, now that it might be considered to be moving more towards maturity – allowing for things like media coverage, etc., of course.

Which is not to say the capabilities aren’t getting increasingly impressive, as the video below notes – and just look at the way Caitlyn’s avatar interacts with the “camera” of our viewpoint!

 

 


Amazon Lumberyard

Image source: Amazon

Lumberyard is the name of Amazon’s new game engine, released on Tuesday, February 9th. Based on Crytek’s CryEngine, which Amazon licensed in 2015, Lumberyard will apparently be developed in its own direction, independently of CryEngine and is being provided as a free-to-download tool (with optional asset packs) which can be used to develop games for PCs and consoles on a “no seat fees, subscription fees, or requirements to share revenue” basis.

Instead, Amazon will monetise Lumberyard through the use of AWS cloud computing. If you use the game engine for your own game and opt to run it on your own server, then that’s it: no fees. But if you want to distribute through a third-party provider, you can only use Amazon’s services: either GameLift, a managed service for deploying, operating, and scaling server-based on-line games using AWS, at a cost of $1.50 per 1,000 daily active users; or, if you prefer, AWS directly, at normal AWS service rates.
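To put that GameLift figure in perspective, the per-user charge scales linearly with daily active users. Here is a minimal back-of-the-envelope sketch of the arithmetic, assuming a steady DAU count; the function name is my own, and real bills would also include standard AWS compute and bandwidth charges, which are not modelled here:

```python
# Rough GameLift cost estimate based solely on the headline rate
# quoted above: $1.50 per 1,000 daily active users.  Standard AWS
# compute/bandwidth charges are deliberately ignored.

GAMELIFT_RATE_PER_1000_DAU = 1.50

def gamelift_monthly_estimate(daily_active_users, days=30):
    """Estimated monthly GameLift fee for a steady DAU count."""
    daily_fee = (daily_active_users / 1000) * GAMELIFT_RATE_PER_1000_DAU
    return daily_fee * days

# A game averaging 50,000 daily active users:
print(gamelift_monthly_estimate(50_000))  # 2250.0 (USD per 30 days)
```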

Lumberyard includes a customisable drag-and-drop UI (image: Amazon)

As well as AWS integration, the development of new low-latency networking code to support it, and native C++ access to its services, Lumberyard has deep, built-in support for Twitch (purchased by Amazon in 2014 for $970 million), including “Twitch play”-style chat commands and a function called JoinIn, which allows viewers to leap directly into on-line games alongside Twitch broadcasters as they stream. The aim here, according to Mike Frazzini, vice president of Amazon Games, when talking to Gamasutra, is “creating experiences that embrace the notion of a player, broadcaster, and viewer all joining together.”

Described as a triple-A games development engine, Lumberyard has already seen many of the CryEngine systems upgraded or replaced, including the implementation of an entirely new asset pipeline and processor and low-latency networking code – hence Lumberyard will diverge from CryEngine’s core development. And Amazon is promising more to come, including a new component system, a particle editor, and CloudCanvas, which will allow developers to set up server-based in-game events in AWS using visual scripting.

“Alien Abode”, a game scene rendered in Lumberyard (image: Amazon)

All of which adds up to a very powerful games development environment – although Amazon are clear that right now, it is only in beta. This means that things are liable to undergo tweaking, etc., and that some capabilities – such as Oculus Rift support – haven’t been enabled for the current version of the engine. However, VR support is there, with Amazon noting:

We have been actively working on VR within Lumberyard for some time now, and it looks great. We are currently upgrading our Oculus VR support to Rift SDK 1.0, which was released by Oculus in late December. We wanted to finish upgrading to Rift SDK 1.0 before releasing the first public version of VR support within Lumberyard, which will be included in a future release soon.

Further, Amazon has already signed official tools deals with Microsoft and Sony, which means game developers licensed to develop games for the Xbox One and PlayStation 4 can immediately start using Lumberyard to develop games for those platforms.

There are – for some – a few initial downsides to Lumberyard where independent game developers are concerned. At launch, the engine only supports models created in Maya and 3ds Max, although this may change – Blender support is promised for the future, for example. There is also no support for Mac or Linux, although Amazon have indicated that these will come, along with iOS and Android support.

Use of the engine includes the right to redistribute it and pieces of the development environment within games, and allows game developers to distribute companion products developed for a game using Lumberyard which allow end users to modify and create derivative works of that game.

The CryEngine SDK is one of the Asset Packs available for download for use with Lumberyard (image: Amazon)

As noted above, the company has already started supplying asset packs developers can include in their games. Three packs are available at launch, including the CryEngine GameSDK, which contains everything required for a first-person shooter game – complex animated characters, vehicles and game AI – together with a sample level.

Amazon clearly have major plans for Lumberyard, and some in the gaming media are already wondering what it might do to the current development environment, which is largely dominated by the likes of Unity, Unreal Engine, or even CryEngine itself, all of which require either a licence fee or a royalty fee.

Is Lumberyard competition for the Lab’s Project Sansar? The engine certainly has the ability to create immersive environments, and Lumberyard will support VR HMDs as it moves forward, as noted.

However, everything about Lumberyard points to it being pitched as a professional games development environment with a dedicated distribution service through Amazon’s cloud services available for use with it. Hence, again, why Twitch is deeply integrated into Lumberyard – Amazon appear to be a lot more interested in building an entire gaming ecosystem. Amazon’s marketing is also geared towards gaming, as their promotional video (below) shows.

Which is not to say that it couldn’t be attractive to markets outside of gaming. As such, it will be interesting to see over time just who does take an interest in it – and how Amazon might support them.

With thanks to John for the pointer to Amazon.

Sources

High Fidelity: “VR commerce”, 200 avatars and scanning faces

High Fidelity have put out a couple of interesting blog posts on their more recent work, both of which make for good reading.

In Update from our Interns, they provide a report by Edgar Pironti and Alessandro Signa, two of High Fidelity’s interns, who have been working on a number of projects within the company, including developing a single software controller for mapping inputs from a range of hand controllers, with the initial work involving the Razer Hydra, HTC Vive’s controllers and the Space Navigator. They also discuss working on recording and playback functionality, which is expanded upon in the second blog post which caught my eye: the January newsletter, issued by Chris Collins.

This work has involved developing the ability to pre-record the data stream of an avatar – audio, facial expressions and body movement – in a format which can later be played back on a server under the control of JavaScript. As Chris notes, this makes it very easy to quickly populate a VR experience with compelling pre-recorded avatar content, allowing life-like characters to be added to a place, or for use within a machinima film.
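The record-and-replay idea is straightforward in outline: capture timestamped frames of the avatar’s data stream, then feed them back at the original cadence. The sketch below illustrates only that general pattern, in Python; the frame format and class are invented for illustration, and High Fidelity’s actual recording format and JavaScript playback interface differ.

```python
import time

class StreamRecorder:
    """Toy illustration of avatar stream record/replay: store
    timestamped frames (pose, audio, facial data), then play them
    back in order with the captured timing preserved."""

    def __init__(self):
        self.frames = []    # list of (elapsed_seconds, payload)
        self._start = None

    def record(self, payload):
        if self._start is None:
            self._start = time.monotonic()
        self.frames.append((time.monotonic() - self._start, payload))

    def replay(self, sink, speed=1.0):
        """Push recorded frames to `sink`, sleeping between frames to
        reproduce the captured timing (scaled by `speed`)."""
        previous = 0.0
        for elapsed, payload in self.frames:
            time.sleep((elapsed - previous) / speed)
            previous = elapsed
            sink(payload)

recorder = StreamRecorder()
recorder.record({"joint": "head", "yaw": 0.10})
recorder.record({"joint": "head", "yaw": 0.25})
recorder.replay(print, speed=1000.0)  # fast-forward playback
```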

As their third reported project, Edgar and Alessandro discuss how they’ve been looking into creating a “VR commerce” environment. This combines elements of physical world shopping – sharing it with friends, actually grabbing and trying items, having discussions with sales staff, etc. – with the convenience of e-shopping, such as quickly changing colours, seeing customer reviews and feedback, and so on. As well as explaining how they went about the task, Edgar and Alessandro have put together a video demonstration:

In the High Fidelity newsletter, Chris Collins covers a number of topics, including work on optimising avatar concurrency on a single High Fidelity server. While this work most likely used Edgar’s and Alessandro’s approach to pre-recording avatar data streams, mentioned above, the initial results are impressive: 200 avatars on a server, of which the 40 nearest the observer’s viewpoint are rendered at 75Hz when using the Oculus Rift, and with a very high level of detail, including full facial animations.
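Rendering only the 40 avatars nearest the viewpoint at full rate and detail amounts to a distance-based priority cut-off. A minimal Python sketch of that selection step follows; the data layout is hypothetical, and High Fidelity’s actual renderer is far more involved:

```python
import math

def nearest_avatars(observer, avatars, k=40):
    """Return the k avatars closest to the observer's viewpoint;
    these get full-rate, full-detail rendering, while the remainder
    can be updated more cheaply or culled."""
    return sorted(avatars,
                  key=lambda a: math.dist(observer, a["position"]))[:k]

# 200 avatars spread along a line, observer at the origin:
crowd = [{"name": f"avatar-{i}", "position": (float(i), 0.0, 0.0)}
         for i in range(200)]
closest = nearest_avatars((0.0, 0.0, 0.0), crowd)
print(len(closest), closest[0]["name"])  # 40 avatar-0
```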

200 avatars on a High Fidelity server as the company starts work on optimising avatar concurrency (image: High Fidelity)

One of the things High Fidelity has often been critiqued for by SL users is the cartoon-like avatars which were first shown as the company gradually cracked open its doors. These are still in use, but there has also been a lot of work put into making the avatars more life-like, should users so wish. However, there is a trade-off here, which has been discussed in the past: the so-called uncanny valley effect – when facial features look and move almost, but not exactly, like those of natural beings, they can cause a response of revulsion among some observers.

This has tended to make those investigating things like avatar usage cautious about pushing too close to achieving any kind of genuine realism in their avatars, and Philip Rosedale has discussed High Fidelity’s own gentle pushing at the edge of the valley. Now it seems the company is stepping towards the edge of the valley once more, using 3D scans of people to create avatar faces, as Chris notes in the newsletter:

We scanned our whole team using two different 3D scanners (one for hair and one for the face), and then used some simple techniques to make the results not be (too) uncanny … Although there is still a lot of work to do, we achieved these results with less than 2 hours of work beyond the initial scans, and the effect of having our team meetings using our ‘real’ avatars is quite compelling.

High Fidelity's staff group photo using scans of their own hair and faces (image: High Fidelity)

Not everyone is liable to want to use an avatar bearing a representation of their own face, and the idea of using such a technique does raise issues around identity, privacy, etc., which should be discussed; but High Fidelity’s work in this area is intriguing. That said, looking at the staff group photo, I would perhaps agree there is still more work needed; I’m not so much concerned about the technique pushing towards the edge of the uncanny valley as I am about the avatars in the photo looking just that little bit odd.

Also in the update is a discussion on audio reverb, a look at how the use of particles can be transformed when you’re able to see your “hands” manipulating them, and a look at a purpose-built game in High Fidelity. These are all discussed in the video accompanying the January newsletter, which I’ll close with.

High Fidelity: Stephen Wolfram and more on tracking

On Tuesday, December 29th, High Fidelity announced that Stephen Wolfram has become their latest advisor.

British-born Stephen Wolfram is best known for his work in theoretical physics, mathematics and computer science. He began research in applied quantum field theory and particle physics, publishing scientific papers when just 15 years old. By the age of 23, he was studying at the School of Natural Sciences of the Institute for Advanced Study in Princeton, New Jersey, USA, where he conducted research into cellular automata using computer simulations.

Stephen Wolfram via Quantified Self

When introducing Wolfram through the High Fidelity blog, Philip Rosedale notes this work had a profound impact on him, as did – later in life – Wolfram’s 2002 book, A New Kind of Science.

More recently, Wolfram has been responsible for WolframAlpha, an answer engine launched in 2009, which is one of the systems used by both Microsoft’s Bing “decision engine” and Apple’s Siri. In 2014, he launched the Wolfram Language as a new general multi-paradigm programming language.

In becoming an advisor to High Fidelity, Dr. Wolfram joins Peter Diamandis, the entrepreneur perhaps best known for the X-Prize Foundation; Dr. Adam Gazzaley, founder of the Gazzaley cognitive neuroscience research lab at the University of California; Tony Parisi, co-creator of the VRML and X3D ISO standards for networked 3D graphics, and a 3D technology innovator; Professor Ken Perlin of the NYU Media Research Lab; and Professor Jeremy Bailenson, director of Stanford University’s Virtual Human Interaction Lab.

In their October update, published at the start of November, High Fidelity followed up on the work they’ve been putting into various elements of tracking movement, including the use of BinaryVR for tracking facial movement and expressions. In particular, that company’s software allows users to create their own virtual 3D face from 2D facial photos, tracking their facial animations in real-time and transferring them onto the CG representation of their face.

Ryan balances one object atop another while Chris uses the gun he picked up with a hand movement, then cocked it, to try to shoot the yellow object down

Integration of the BinaryVR software allows High Fidelity to track users’ mouth movements through their HMDs, allowing their avatars to mimic these movements in-world, as Chris Collins demonstrates in the update video. The company has also been extending its work on full-body tracking, as seen in my October coverage, and this is also demonstrated in the video, alongside more in-world object manipulation by avatars, with Chris and Ryan building a tower of blocks and Chris then picking up a gun and shooting it down.

The hand manipulation isn’t precise at this point in time, as can be seen in the video, but that isn’t the point; it’s the fact that in-world objects can be so freely manipulated that is impressive. That said, it would be interesting to see how this translates to building: how do you accurately size a basic voxel (a sort-of primitive, for those of us in SL) to a precise length and width, for example, without recourse to the keyboard or potentially complicated overlays?

Maybe the answer to this last point is “stay tuned!”.

2015 OpenSimulator Conference registrations open

On Thursday, October 29th, I received an e-mail announcing that registrations for the 2015 OpenSimulator Community Conference are open.

Attendance is free, but for those wishing to donate towards supporting this and future conferences, there are a number of options for doing so, ranging from $10.00 USD through to $200.00 USD, all of which offer various benefits to purchasers. For the full range of ticket options and their respective benefits, and to book your place at the conference, please visit the conference ticket page.

The 2015 conference will be a one-day affair, taking place on Saturday, December 5th. Nevertheless, it will present a full programme of dynamic short presentations and panels spotlighting the best of the OpenSimulator platform and community, all taking place virtually on the conference grid.

The OpenSimulator Community Conference 2014 (image: the OpenSimulator Community Conference)

In addition, the organisers are inviting the OpenSimulator community to host community and social events in the days leading up to the conference, immediately following its close on Saturday, December 5th at 17:00 PST, and again on Sunday, December 6th.

Those interested in hosting a social event should register their interest via the Community Event Sign-up page.

If you wish to give a presentation or talk at the conference, please register your interest via the Call for Proposals page, but note that all proposals must be received no later than 11:59 PST on Saturday, October 31st.

Volunteers for the event can also sign up via the Call for Volunteers page.

The 2013 conference arena

About the Conference

The OpenSimulator Community Conference is an annual conference that focuses on the developer and user community creating the OpenSimulator software. The conference is a joint production by Core Developers of OpenSimulator and AvaCon, Inc., a 501(c)(3) non-profit organization dedicated to promoting the growth, enhancement, and development of the metaverse, virtual worlds, augmented reality, and 3D immersive and virtual spaces.  The conference features a day of presentations, panels, keynote sessions, and social events across diverse sectors of the OpenSimulator user base.

High Fidelity: September update and things to come

The September newsletter from High Fidelity appeared at the end of that month, with Chris Collins highlighting some of the work that has been going on of late, providing an update on particle effects, procedural textures and – most interestingly – avatar kinematics and in-world object manipulation using an avatar’s hands, via suitable controllers.

Procedural textures allow complex, algorithm-based textures to be created using tools such as ShaderToy and used directly within High Fidelity. Brad Davis has created a video tutorial on procedural entities, which Chris references in the newsletter; the write-up also follows a short video released on the High Fidelity YouTube channel which briefly demonstrates procedural textures in HiFi.
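The idea behind a procedural texture is simply that each texel’s colour is computed from a formula rather than read from an image file. The toy example below evaluates an animated ripple pattern per-pixel in Python; the function is my own illustration, whereas a ShaderToy shader would express the same per-pixel maths in GLSL for use in-world:

```python
import math

def procedural_pixel(u, v, t=0.0):
    """Colour at texture coordinate (u, v) in [0, 1], computed from a
    formula -- here a ripple radiating from the centre, animated by t."""
    d = math.hypot(u - 0.5, v - 0.5)
    brightness = 0.5 + 0.5 * math.sin(40.0 * d - 4.0 * t)
    return (brightness, 0.6 * brightness, 1.0 - brightness)  # RGB, 0..1

# Sample the formula onto a small 4x4 grid of texels at t = 0:
size = 4
texture = [[procedural_pixel((x + 0.5) / size, (y + 0.5) / size)
            for x in range(size)] for y in range(size)]
print(len(texture), len(texture[0]))  # 4 4
```

Because the colour is a pure function of position and time, the texture needs no storage at any resolution and can be animated for free, which is what makes the approach attractive in a virtual world.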

However, it is the object manipulation that’s likely to get the most attention, together with avatar kinematics and attempts to imply a force when moving an object.

In terms of avatar kinematics, Chris notes:

In 2016, when the consumer versions of the HMD’s are released, you are also going to be using a hand controller. It is therefore important that we can make your avatar body simulate correct movement with the hand data that we receive back from the controllers.

The results are shown in the newsletter in the form of some animated GIFs. In the first, Chris’ avatar is shown responding to a Hydra controller for hand movements and echoing his jaw movements. The second demonstrates object manipulation, with Chris’ avatar using its hand to pick up a block from an in-world game, echoing Chris’ motions using a hand-held controller.

Manipulating in-world objects in High Fidelity via an avatar's hands and a set of controllers (image: High Fidelity)

The animation in picking up the block may not be entirely accurate at this point in time – the block seems to travel through the avatar’s thumb as the wrist is rotated – but that isn’t what matters. The level of manipulation is impressive, and it’ll be interesting to see if this might be matched with things like feedback through a haptic-style device, so that users can really get a sense of manipulating objects.

The object manipulation element, together with attempts to imply a force when moving objects in-world, makes up a core part of the video accompanying the newsletter (embedded below). Again, this really is worth watching, as the results are both impressive and illustrative of some of the problems High Fidelity are trying to solve in order to give virtual spaces greater fidelity.

Coupling object manipulation with implied force opens up a range of opportunities for things like in-world games, physical activities, puzzles, and so on. There’s also potential for learning and teaching as well, so it’ll be interesting to see how this aspect of the work develops.

The newsletter also promises that we’ll be seeing some further VR demo videos from High Fidelity in October, so keep an eye out for those as well.