High Fidelity updates users with a quarterly report

High Fidelity have issued a progress report for the second quarter of 2015, which has been circulated to users via e-mail and made available as a blog post.

In the report, they highlight recent achievements and work, including:

  • The fact that they’ve been hiring-in new talent (and are still looking for more). It should be noted that the talent isn’t restricted to employees, either. At the end of May, Professor Jeremy Bailenson of the Virtual Human Interaction Lab at Stanford University and Professor Ken Perlin both joined High Fidelity’s growing list of high-powered advisors
  • The instructions and video on setting-up the stack manager to run your own High Fidelity server have been updated, with the promise that next up will be the ability to optionally share your server resources with other nearby users who need extra capacity
  • The ability to track and capture head movements and facial expressions with a regular webcam, as an alternative to needing a 3D camera
  • The arrival of the High Fidelity Marketplace, where you can drag and drop content into your server, and also upload content you want to share with others. This is currently a sharing environment rather than a commerce environment, but the promise is that the commerce aspect will be coming soon
  • Commencing work on implementing distributed physics, building on top of the open source Bullet physics engine, with the aim of having low latency for interactions while maintaining the same state among participants – such as when people in different locations are playing Jenga or billiards together
  • The ability to import web content into High Fidelity – static web pages, videos, interactive web pages, etc., complete with a demonstration video and the promise of figuring out the best ways to allow the different types of shared browsing that people are going to need
  • My personal favourite: zone entities, skyboxes and dynamic lighting using spherical harmonics, with optional sync to real-world day/night cycles
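
As a rough illustration of that last item, here is a minimal sketch of the kind of properties object a High Fidelity script might build for a zone entity with a skybox and key light. To be clear, the property names and the eventual entity-creation call (something like Entities.addEntity) are assumptions based on the platform’s JavaScript scripting model, not details confirmed by the report:

```javascript
// Hypothetical sketch: build the properties object a High Fidelity script
// might pass to its entity API to create a zone with a skybox and a key
// light. All property names here are illustrative assumptions.
function makeZoneProperties(skyboxUrl, sunDirection) {
  return {
    type: "Zone",
    dimensions: { x: 500, y: 500, z: 500 },  // region the zone's settings cover
    skybox: { url: skyboxUrl },              // sky texture for the zone
    keyLight: {
      direction: sunDirection,               // could be synced to a day/night cycle
      intensity: 1.0
    }
  };
}

// In-world, a call along the lines of Entities.addEntity(zone) would
// presumably follow (again, an assumption).
const zone = makeZoneProperties("https://example.com/sky.jpg", { x: 0, y: -1, z: 0 });
console.log(zone.type, zone.skybox.url);
```

Animating `keyLight.direction` over time is, presumably, how the optional sync to a real-world day/night cycle would be driven.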

Also in the Next Steps aspects of High Fidelity’s development is the intriguing promise of avatars with soft bodies, which are capable of interacting physically, or as Philip Rosedale puts it in the blog post, “imagine sword-fighting, for example”, while being driven by hand controllers such as those coming with the HTC / Valve Vive or the Oculus Rift. This links back to the work going on with the physics engine, which has, as Mr. Rosedale explains in the blog post, an added level of complexity within High Fidelity due to the distributed nature of the platform, and the need to maintain consistency between players as to what is happening, where things are, who is controlling what, and so on.

For those wishing to keep abreast of the key points of what is going on with High Fidelity, but who do not necessarily have the time to jump into every blog post that comes out, these updates are a useful means of tracking core events within the platform.

High Fidelity moves to “Open Alpha”

In what is not an April Fools’ joke – rumours of an announcement having been doing the rounds for the last few days – High Fidelity announced on April 1st, 2015 that they are throwing wide the gates on an “Open Alpha” phase for their nascent virtual worlds platform.

The announcement came in the form of a blog post from Philip Rosedale, which reads in part:

This is a very early release, and High Fidelity is still very much a work in progress.  The look and visual quality is far from complete, and big things like avatar movement animation and physics are still not in place.  There are lots of bugs to fix, and content formats will continue to change.  But enough systems are now functional to make us feel that High Fidelity is useful for some types of work, experimentation, and exploration. Having run a small and controlled early alpha to iron out the really show-stopping bugs, we’re now eager to engage a larger group and recruit open source contributions from other developers working on building the metaverse.

The post is full of a lot of useful information for those who have been waiting to slip into Hi Fi and find out what it might be about – such as how to obtain the Interface (client) to access worlds within Hi Fi, and how to download the Stack Manager, should you wish to create your own world.  Both the Stack Manager and Interface currently require one of Windows (7 with SP 1 or later), Mac OS X or Linux, although the blog post notes High Fidelity is working on a GearVR / Android version as well.

In mentioning both the Interface and the Stack Manager, it’s worth noting that there are also a number of tutorial videos available which may also be of use, including one covering downloading and installing the Stack Manager and another on running the Interface for the first time (although this doesn’t include downloading and installing it). I’ve added the URLs for all of the tutorials at the end of this article.

Another aspect of the platform that’s mentioned is that of the Marketplace, which was also recently featured in a High Fidelity video. However, before you get excited about buying / selling goods on Hi Fi, keep in mind the platform doesn’t as yet have any form of currency / token / micro-transaction support. Thus, the marketplace is purely for freely sharing creations with other Hi Fi users – although the company again notes that getting a payment system sorted out is also on their list of priorities.

The most important thing to remember, should you opt to try High Fidelity out for yourself and haven’t kept up with the news, is that it is very much an alpha. This means that it is not going to look like Second Life in any way, shape or form; the alpha is about getting a feel for things and participating in High Fidelity’s development. As such, change is to be expected, as Philip Rosedale warns in the blog post:

You can expect continuous and substantial changes as we complete new features; we will likely break content as we continue to design and experiment.   The transition from ‘alpha’ to ‘beta’, which we expect will happen over a year or so, will signal greater stability in the content formats.  But as an open source project with contributions from many developers and with a broad set of features working, we think the time is right to open things up completely for early use.

Obviously Hi Fi also doesn’t run the same way as SL or OpenSim, so there will be a lot of nuances you’ll need to get used to. It’s also currently very small – although the High Fidelity home page may help you get started with finding places to visit (see Up and Running on the home page).

There is also a lot of good stuff in the platform which may be fun for some people to play with – the physics system works, 3D audio is operational, there is support for some bleeding-edge VR technology (for those who have the necessary toys!), and so on. The blog post includes some animated GIFs of some of the physics capabilities in action. Models can also be imported (.FBX format), and JavaScript is the scripting medium.
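
To give a flavour of how those pieces – FBX import, physics and JavaScript scripting – might fit together, here is a small sketch of a properties object for a physics-enabled model entity. The property names (modelURL, gravity, collisionsWillMove) and the implied entity-creation call are illustrative assumptions, not a confirmed API:

```javascript
// Hypothetical sketch: a properties object for importing an .FBX model as
// a physics-enabled entity from a High Fidelity script. Property names and
// the eventual creation call are illustrative assumptions.
function makeModelProperties(fbxUrl, position) {
  return {
    type: "Model",
    modelURL: fbxUrl,                  // .FBX content hosted on a web service
    position: position,               // world coordinates {x, y, z}
    gravity: { x: 0, y: -9.8, z: 0 }, // let the physics system pull it down
    collisionsWillMove: true          // respond to collisions with other entities
  };
}

const crate = makeModelProperties("https://example.com/crate.fbx", { x: 0, y: 5, z: 0 });
console.log(crate.type, crate.modelURL);
```

The hosting requirement matters: as with marketplace content, the model file needs to be reachable over the web, hence the URL rather than a local file path.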

If you are interested in giving High Fidelity a try, please do make sure you read the blog post in full, as it will help to give you a better feel for what you can expect. You can also catch a series of videos from the High Fidelity team on their YouTube channel.

Related and Useful Links

Microsoft co-founder backs High Fidelity with US $11 million

In what is its largest round of funding to date, High Fidelity, Philip Rosedale’s virtual worlds start-up, received an additional US $11 million in a round led by Vulcan Capital, the investment company founded and run by Microsoft co-founder Paul Allen.

This marks at least the second time Vulcan Capital has made an investment in VR technology recently. In October 2014, they were a part of a US $542 million round of investment in Magic Leap, the company developing a new augmented reality system, with an eye potentially on VR applications and virtual worlds as well.

The news of the funding was broken by TechCrunch on Wednesday, February 25th, after papers confirming the funding round were filed with the US Securities and Exchange Commission. Philip Rosedale later confirmed the news to TechCrunch, after the technology journal had placed an enquiry with High Fidelity on the matter.

Rosedale then went on to make a more public announcement on the High Fidelity blog, which included a fun and informative video on matters featuring himself, HiFi co-founder Ryan Karpf and the ever-popular Emily Donald. As this is unlisted on YouTube, I’m respecting High Fidelity’s wishes and linking to it, rather than sharing it via embedding, although I am sneaking in a still from it.

The US $11 million funding confirmation from High Fidelity includes a short video from Ryan Karpf, Emily Donald and Philip Rosedale, outlining the investment news, what it means for the company and mentioning job opportunities at HiFi (image source: High Fidelity)

The blog post leads with the statement:

We are happy to announce today that we have raised an additional $11M in funding, in a new round led by Vulcan Capital and with participation from other new and existing investors.  This is certainly great news for us, but also great news for the overall VR ecosystem as we continue to see more and more validation from the investing community that VR presents enormous opportunities.  With this investment, we will be able to substantially grow our team as we continue to develop and release our open source shared virtual reality software.

The amount raised is more than was achieved during the first and second funding rounds for the company combined. These occurred in April 2013 (US $2.4 million) and March 2014 (US $2.5 million), and were largely led by True Ventures and Google Ventures. In the video, Ryan Karpf gives some idea of what this latest round immediately means for High Fidelity:

The next step for us is going to be moving to a more open alpha stage to let you guys come in and create a lot of cool content that you see around you. We’ve had a lot of fun making it, and we look forward to seeing what you guys can create as well.

The video also covers the fact that High Fidelity is still hiring, with Emily Donald pointing people to the company’s job page, and her e-mail address.

Definitely worth smiling about: Philip Rosedale’s High Fidelity gains a further US $11 million in funding

Following the March 2014 US $2.5 million round of funding, which came on the heels of Facebook acquiring Oculus VR, I idly speculated whether or not it might have put High Fidelity – until then regarded as being something of a “stealth start-up” in the eyes of the technology media – more firmly on people’s radar. If it wasn’t the case then, a further US $11 million now should certainly do so. And that’s not just good for High Fidelity.

As Rosedale notes in the quote given earlier in this article, this latest investment in High Fidelity does much to further validate the VR / VW ecosystem as a whole as an investment opportunity. That’s got to be good news for any company working on a new VW platform / environment, and which may want to explore wider options and opportunities for possible funding in the future, even if it’s not necessarily a start-up like High Fidelity.

Related Links

High Fidelity tutorial videos

Early December saw High Fidelity slip out a series of introductory videos on their YouTube channel, under a playlist entitled Intro to High Fidelity: The Basics.

At the time of writing this piece, six videos are included in the playlist, which together run to a total of some 16 minutes, although individually they range from just over the minute mark to just shy of five minutes in length. All six are narrated / produced by Chris Collins from HiFi, and the topic areas covered are:

  • First time log-in
  • Edit entities
  • Stack Manager
  • Oculus Rift set-up
  • Hydra support
  • Leap Motion controller.

First time log-in: takes the user from the point at which they have downloaded and installed the High Fidelity client software – referred to as “the interface” – and logged-in for the first time, arriving in the High Fidelity sandbox area.

From here, Chris takes users quickly through changing the appearance of the default robot avatar by using one of the available pre-sets (“Ron”, shown in the image above), and noting that people can also upload their own avatars. He also covers ensuring the audio is correctly set (microphone pick-up, etc.), and basic navigation between domains / locations within the High Fidelity topography. Interface customisation through the use of JavaScript elements is also touched upon (the entire interface is written in JavaScript and includes some additional elements, making it highly customisable).

Editing entities: the second video provides a very high-level overview of creating and editing content (entities) in High Fidelity, starting with making sure the toolbox is correctly displayed (if necessary). Importing pre-built elements supplied with the interface is covered; the ability for collaborative building within a domain is mentioned, as is using FBX animations; and editing object properties is looked over.

A quick overview is also given on uploading custom content (in .FBX format), noting that it needs to be available from a web service (such as Dropbox, or your own web server if you happen to run one).

The Stack Manager focuses on building your own server to host a dedicated domain where you can build and share content, invite friends to come and join you and interact with them, etc. Servers can be run on your own local machine, or on any other machine to which you have suitable access (e.g. a web server).

The video runs through everything from downloading and installing the Stack Manager through to importing initial content. An overview of various settings (security, audio) and tools (logs, nodes), is also provided.

The final three videos provide quick start guides to using the Oculus Rift, Sixense Hydra and Leap Motion (attached to the Oculus Rift headset). All assume that you already have the hardware set-up and ready to go with your computer, and so each simply steps you through the basics to get yourself going (making sure the correct scripts are running, etc.).

Using the Sixense Hydra with the High Fidelity interface

As noted, these are introductory videos, so don’t expect them to go into great detail in terms of what you can do, troubleshooting, or anything like that. However, as quick start guides, they are clear, concise, and do exactly what it says on the label.

Related Links

Videos courtesy of High Fidelity Inc.

Rock-paper-scissors at HiFi, with thanks to SL’s Strachan Ofarrel!

Dan Hope over at High Fidelity has provided a light-hearted blog post on using the Leap Motion gesture device with the High Fidelity Alpha.

The blog post includes a video showing Chris Collins and Ozan Serim in-world in High Fidelity playing a game of rock-paper-scissors. The intention is to provide something of an update on integrating the Leap Motion with High Fidelity.

Both Chris and Ozan’s avatars have intentionally-oversized hands, which although they look silly / awkward, help emphasise the dexterity available in the High Fidelity avatar. Not only can avatars mimic users’ gestures, they can mimic individual finger movements as well (something Dan has shown previously in still images).

Dan also points out the work to integrate Leap Motion hasn’t been done internally, but has  been a contribution from CtrlAltDavid – better known in Second Life as Strachan Ofarrel (aka Dave Rowe), the man behind the CtrlAltStudio viewer. As such, Dan points to it being an example of the High Fidelity Worklist being put to good use – although I say it’s more a demonstration of  Dave’s work in getting new technology into virtual environments :).

A lot of people have been fiddling with the Leap Motion – including fixing it to the front of an Oculus Rift headset (as noted in the HiFi blog post) in order to make better use of it in immersive environments. Having it fixed to an Oculus makes it easier for the Leap Motion to capture gestures – all you need to do is hold your hands up in your approximate field-of-view, rather than having to worry about where the Leap is on your desk.

Mounting the Leap Motion to the front of Oculus Rift headsets is seen as one way to more accurately translate hand movements and gestures into a virtual environment. Perhaps so – but a lot of people remain unconvinced about using gesture devices as we have them today

Away from the ubiquitous Oculus Rift, Simon Linden did some initial experiments with the Leap Motion in Second Life in early 2013, and Drax also tried it out with some basic gesture integration using GameWAVE software; however, the lack of accuracy of the earlier Leap Motion devices didn’t easily lend them to use with the platform, which is why more recent attempts at integration didn’t really get off the ground, although Leap Motion have been working to improve things.

That said, not everyone is convinced as to the suitability of such gesture devices when compared to more tactile input systems such as haptic gloves, which have the benefit of providing levels of feedback on things (so when you pick a cube up in-world, you can “feel” it between your fingers, for example). Leap certainly appears to suffer from some lack of accuracy  – but it is apparently getting better.

Given a choice, I’d probably go the haptic glove + gesture route, just because it does seem more practical and assured when it comes to direct interactions. Nevertheless, it’s interesting to see how experiments like this are progressing, particularly given the Lab’s own attempts to make the abstraction layer for input devices as open as possible on their next generation platform, in order to embrace devices such as the Leap Motion.

Related Links

Philip Rosedale and virtual worlds: “we still don’t get it yet”

As noted by Ciaran Laval, Philip Rosedale appeared at the Gigaom Roadmap event held in San Francisco on November 18th and 19th. He was taking part in a (roughly) 30-minute discussion with Gigaom’s staff writer, Signe Brewster, entitled Designing Virtual Worlds, in which he explores the potential of virtual worlds  when coupled with virtual reality, both in terms of High Fidelity and in general. In doing so, he touches on a number of topics and areas – including Second Life – providing some interesting insights into the technologies we see emerging today, aspects of on-line life that have been mentioned previously in reference to High Fidelity, such as the matter of identity, and what might influence or shape where VR is going.

This is very much a crystal ball type conversation such as the Engadget Expand NY panel discussion Linden Lab’s CEO Ebbe Altberg participated in at the start of November, inasmuch as it is something of an exploration of potential. However, given this is a more focused one-to-one conversation than the Engadget discussion, there is much more meat to be found in the roughly 31-minute long video.

Philip Rosedale in conversation with Gigaom’s Signe Brewster

Unsurprisingly, the initial part of the conversation focuses very much on the Oculus Rift, with Rosedale (also unsurprisingly, as they’re all potentially right) agreeing with the likes of the Engadget panel, Tony Parisi, Brendan Iribe, Mark Zuckerberg et al, that the Oculus Rift / games relationship is just the tip of the iceberg, and that there is so much more to be had that lies well beyond games. Indeed, he goes so far as to define the Oculus / games experience as “ephemeral” compared to what might be coming in the future. Given the very nature of games, this is not an unreasonable summation, although his prediction that there will only be “one or two” big game titles for the Rift might upset a few people.

A more interesting part of the discussion revolves around the issue of identity, which encompasses more than one might expect, dealing both with the matter of how we use our own identity as a means of social interaction – through introducing ourselves, defining ourselves, and so on – and with how others actually relate to us, particularly in non-verbal ways (thus overlapping the conversation with non-verbal communication).

Identity is something Rosedale has given opinion on in the past, notably through his essay on Identity in the Metaverse from March 2014 – recommended reading for anyone with an interest in the subject. The points raised are much more tightly encapsulated here in terms of how we use our name as a means of greeting, although the idea of trust as an emerging currency in virtual environments is touched upon: just as in the physical world, we need to have the means to apply checks and balances to how much we reveal about ourselves to others on meeting them.

Can the facial expressions we use, exaggerated or otherwise, when talking with others be as much a part of our identity as our looks?

The overlap between identity and communication is graphically demonstrated in Rosedale’s relating of an experiment carried out at High Fidelity. This saw several members of the HiFi team talking on a subject, with a 3D camera being used to capture their facial expressions and gestures, recording them against the same “default” HiFi avatar. When a recording of the avatar was selected at random and played back to HiFi staff sans any audio, they were still very quickly able to identify who the avatar represented, purely by a subconscious recognition of the way facial expressions and any visible gestures were used.

This is actually a very important aspect when it comes to the idea of trust as virtual “currency”, as well as demonstrating how much more we may rely on non-verbal communication cues than we might otherwise realise. If we are able to identify people we know – as friends, as work colleagues, business associates, etc. – through such non-verbal behavioural prompts and cues, then establishing trust with others within a virtual medium which allows such non-verbal prompts to be accurately transmitted, can only more rapidly establish that exchange of trust, allowing for much more rapid progression into other areas of interaction  and exchange.

Interaction and exchange also feature more broadly in the conversation. There is, for example, the difference between the forms of interaction which take place within a video game and those we’re likely to encounter in a virtual space. Those used in games tend to be limited to what is required in the game itself – such as shooting a gun or running.

If 3D spaces can be made to operate as naturally as we function in the real world – such as when handing someone something, as Mr. Rosedale is miming – might they become a more natural extension of our lives?

Obviously, interactions and exchanges in the physical world go well beyond this, and finding a means by which natural actions, such as the simple act of shaking hands or passing a document or file to another person can be either replaced by a recognisable virtual response, or replicated through a more natural approach than opening windows, selecting files, etc., is, Rosedale believes, potentially going to be key to a wider acceptance of VR and simulated environments in everyday life.

There’s a certain amount of truth in this, hence the high degree of R&D going on with input devices, from gesture-based tools such as the Leap Motion through to haptic gloves and other devices. But at the same time, the keyboard / trackpad / mouse aren’t going to go away overnight. They are still an essential part of our interactions with the laptops in front of us for carrying out a range of tasks which also aren’t going to vanish with the arrival and growth of VR. So any new tool may well have to be as easy and convenient to use as opening up a laptop and then starting to type.

Drawing a comparison – interesting on a number of levels – between the rise of the CD-ROM and the impact of the Internet’s arrival, Rosedale suggests that really, we have no idea where virtual worlds might lead us simply because, as he points out, even now “we don’t get it yet”. The reality is that the potential for virtual spaces is so vast, it is easy to focus on X and Y and predict what’s going to happen, only to have Z arrive around the same time and completely alter perceptions and opportunities.

There are some things within the conversation that go unchallenged. For example, talking about wandering into a coffee shop, opening your laptop and then conducting business in a virtual space is expressed as a natural given. But really, even with the projected convenience of use, is this something people will readily accept? Will they want to be sitting at a table, waving hands around, staring intently into camera and sharing their business with the rest of the coffee shop in a manner that potentially goes beyond wibbling loudly and obnoxiously over a mobile phone? Will people want to do business against the clatter and noise and distractions of an entire coffee shop coming over their speakers / headphones from “the other end”? Will we want to be seated next to someone on the train who is given to waving arms and hands, presenting a corner-of-eye distraction that goes beyond that encountered were they to simply open a laptop and type quietly? Or will we all simply shrug and do our best to ignore it, as we do with the mobile ‘phone wibblers of today?

That said, there is much that is covered in the discussion, from what’s been learnt from the development of Second Life through to the influence of science-fiction on the entire VR/VW medium, with further focus on identity through the way people invest themselves in their avatars in between, until we arrive at the uncanny valley and a potential means of crossing it: facial hair! As such, the video is a more than worthwhile listen, and I challenge anyone not to give Mr. Rosedale a sly smile of admiration as he slips in a final mention of HiFi in such a way as to get the inquisitive twitching their whiskers and pulling up the HiFi site in their browser to find out more.