The September newsletter from High Fidelity appeared at the end of the month, with Chris Collins highlighting some of the work that has been going on of late, providing updates on particle effects, procedural textures and – most interestingly – avatar kinematics and in-world object manipulation using an avatar’s hands and suitable controllers.
Procedural textures allow complex, algorithm-based textures to be created using tools such as ShaderToy and used directly within High Fidelity. Brad Davis has created a video tutorial on procedural entities, which Chris references in the newsletter. The write-up also follows a short video released on the High Fidelity YouTube channel which briefly demonstrates procedural textures in HiFi.
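For readers unfamiliar with the idea: a procedural texture is simply a small program that computes a colour from a point’s coordinates, rather than sampling a stored image. The real thing runs on the GPU as shader code (as in ShaderToy), but the principle can be sketched in a few lines of plain JavaScript – the names here are purely illustrative, not part of the HiFi API:

```javascript
// Illustrative only: a procedural texture is an algorithm mapping
// coordinates to colour. Here, a classic checkerboard – the colour of a
// point (u, v) is decided entirely by arithmetic, with no source image.
function checker(u, v, scale) {
  const cell = Math.floor(u * scale) + Math.floor(v * scale);
  return cell % 2 === 0 ? [1, 1, 1] : [0, 0, 0]; // white or black RGB
}
```

In a real shader the same kind of function runs once per pixel on the GPU; scaling, animating and layering such functions is what tools like ShaderToy make easy to experiment with.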
However, it is the object manipulation that’s likely to get the most attention, together with avatar kinematics and attempts to imply a force when moving an object.
In terms of avatar kinematics, Chris notes:
In 2016, when the consumer versions of the HMD’s are released, you are also going to be using a hand controller. It is therefore important that we can make your avatar body simulate correct movement with the hand data that we receive back from the controllers.
The results are shown in the newsletter in the form of some animated GIFs. In the first, Chris’ avatar is shown responding to a Hydra controller for hand movements and echoing his jaw movements. The second demonstrates object manipulation, with Chris’ avatar using its hand to pick up a block from an in-world game, echoing Chris’ motions using a hand-held controller.
Manipulating in-world objects in High Fidelity via an avatar’s hands and a set of controllers (image: High Fidelity)
The animation in picking up the block may not be entirely accurate at this point in time – the block seems to travel through the avatar’s thumb as the wrist is rotated – but that isn’t what matters. The level of manipulation is impressive, and it’ll be interesting to see if this might be matched with things like feedback through a haptic-style device, so that users can really get a sense of manipulating objects.
The object manipulation element, together with attempts to imply a force when moving objects in-world, makes up a core part of the video accompanying the newsletter (embedded below). Again, this really is worth watching, as the results are both impressive and illustrate some of the problems High Fidelity is trying to solve in order to give virtual spaces greater fidelity.
Coupling object manipulation with implied force opens up a range of opportunities for things like in-world games, physical activities, puzzles, and so on. There’s also potential for learning and teaching as well, so it’ll be interesting to see how this aspect of the work develops.
The newsletter also promises that we’ll be seeing some further VR demo videos from High Fidelity in October, so keep an eye out for those as well.
I’m rather into space and astronomy – that much should be obvious from my Space Sunday reports, and coverage of missions like the Curiosity rover, astronomical events like the transit of Venus, and so on.
So when High Fidelity posted news on the 2015 summer intern project, and the words “solar system” featured in it, my attention was grabbed. The post opens:
Hello! I’m Bridget, and I’ve been interning at High Fidelity this summer, working to build some JavaScript content in HF. As a math and computer science major, I had the opportunity to hone my programming skill set, learning from Hifi’s superb team of software engineers and design-minded innovators.
So here’s the culmination of my work this summer: a virtual orbital physics simulation that provides an immersive, interactive look at our solar system.
Bridget’s solar system model correctly simulates the movement of planetary bodies around a stellar object, utilising both Newton’s and Kepler’s laws, thus producing a dynamic teaching model for orbital mechanics and gravity – with a potential application for teaching aspects of physical cosmology
The goal of Bridget’s project is to demonstrate what can be built using JavaScript (and some C++), with a particular emphasis on building educational content in High Fidelity, and by using the solar system, she has come up with a highly innovative approach to teaching orbital mechanics – and more besides.
Essentially, she has created a model of the solar system which uses “real” gravitational physics to simulate the motion of the planets around the Sun. The planets themselves occupy orbits scaled relative to Earth, and fixed reference values are used for the orbital period, large and small body masses, and gravity. Then a little Newtonian physics is thrown into the mix, together with a sprinkling of Kepler’s laws of planetary motion. Thus, the scripting ensures that the planets maintain stable orbits, while updates correctly mimic each planet’s orbital trajectory around the Sun.
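As a rough illustration of what such a script involves (this is a generic sketch with hypothetical names, not Bridget’s actual code): Newton’s law of gravitation supplies the acceleration on each planet, a simple time-stepping integrator updates velocity and then position, and starting a planet at the circular-orbit speed v = √(GM/r) keeps its orbit stable.

```javascript
// Hypothetical sketch of an orbital update loop, in normalised units
// where the gravitational parameter of the central body is G*M = 1.
const GM = 1.0;

// Acceleration on a planet at position p, due to the central body at
// the origin: magnitude GM/r^2, directed along -p.
function gravity(p) {
  const r = Math.hypot(p.x, p.y);
  const a = -GM / (r * r * r);
  return { x: a * p.x, y: a * p.y };
}

// One symplectic Euler step: update velocity first, then position –
// this keeps the orbit stable over many steps, unlike plain Euler.
function step(planet, dt) {
  const acc = gravity(planet.pos);
  planet.vel.x += acc.x * dt;
  planet.vel.y += acc.y * dt;
  planet.pos.x += planet.vel.x * dt;
  planet.pos.y += planet.vel.y * dt;
}

// Start a planet at radius r with the circular-orbit speed sqrt(GM/r),
// obtained by equating gravitational and centripetal acceleration.
function makePlanet(r) {
  return { pos: { x: r, y: 0 }, vel: { x: 0, y: Math.sqrt(GM / r) } };
}

const planet = makePlanet(2.0);
const dt = 0.001;
for (let i = 0; i < 10000; i++) step(planet, dt);
// After many steps the orbital radius should remain close to 2.0.
const radius = Math.hypot(planet.pos.x, planet.pos.y);
```

The real script would, of course, step every planet, scale the reference values against Earth’s orbit, and update entity positions in-world each frame.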
This generates a model that is interesting enough in itself, if somewhat simplified in nature, as Bridget notes, whilst also pointing to its potential for further use:
While the simulation exploits a somewhat simplified model, namely neglecting the elliptical nature of the planets’ orbits, it can easily be modified to account for additional factors such as the n-body problem.
In other words, there is the potential here to both refine the model in terms of orbital mechanics and planetary motion as a part of the teaching / learning process, and perhaps even dip a toe into physical cosmology.
The simulation includes a UI which allows users to perform a number of tasks, including playing a little game and zooming in on the planets.
Bridget also notes:
Another fun aspect of the project was implementing UI to create possibilities for exploration and experimentation within the simulation. A panel with icons lets you:
Pause the simulation and show labels above each planet revealing its name and current speed
Zoom in on each planet
Play a “Satellite Game” (think Lunar Lander, but with a satellite around the earth), where you attempt to fling a satellite into stable orbit
Adjust gravity and/or the “reference” period, and see what happens!
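For the curious, the physics behind “flinging a satellite into stable orbit” reduces to a single quantity: the specific orbital energy, ε = v²/2 − GM/r. If ε is negative, the satellite is bound; at or above the escape speed √(2GM/r), it flies off. A small sketch (hypothetical names, not the actual game script):

```javascript
// Hypothetical sketch: is a flung satellite on a bound (stable) orbit?
// Specific orbital energy: e = v^2/2 - GM/r. Bound orbits have e < 0;
// at the escape speed sqrt(2*GM/r), the energy reaches zero.
function specificOrbitalEnergy(GM, r, v) {
  return 0.5 * v * v - GM / r;
}

function isBoundOrbit(GM, r, v) {
  return specificOrbitalEnergy(GM, r, v) < 0;
}

// In normalised units (GM = 1) at radius 1: the circular-orbit speed
// of 1 gives a bound orbit, while the escape speed sqrt(2) does not.
```

So the game’s challenge is essentially to give the satellite enough speed to avoid falling back, but not so much that its energy goes non-negative and it escapes.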
Bridget’s work marks the second time a summer intern has reported on working at High Fidelity. In 2014, Chris Collins chatted to the (then) 17-year-old Paloma Palmer, a high school student also honing her coding skills, who focused on coding voxels to respond directly, in real-time, to volume inputs over a microphone. You can see her discussion with Chris on the HiFi YouTube channel.
Staying with education, and following on from my coverage of High Fidelity’s STEM VR challenge, Ryan Karpf announced the first of the grant recipients on Friday, August 14th.
The VR challenge invited educators, be they individuals or groups, to take up the STEM VR Challenge by submitting proposals for educational content in High Fidelity which meets the criteria set out on the Challenge website, namely that the content is:
HMD (e.g. Oculus Rift) featured
High school age appropriate
STEM focused
Social (can be experienced by >3 people together).
On offer were up to three grants of US $5,000 each for recipients to further develop their ideas.
In his announcement Ryan indicated that two recipients for grants had been selected from submissions: the TCaRs VR Challenge and Planet Drop VR.
Both use game mechanics. TCaRs (Teaching Coding – a Racing simulation) enables users to interact with and customise their racing cars using JavaScript, while Planet Drop places players in an alien planet environment which they must explore through “cooperative asymmetrical gaming”. Each player receives highly specialised information, based on their chosen STEM field and provided via a game HUD, and the aim is for them to work together, sharing the information they receive as quickly and effectively as possible to allow the team to solve challenges and advance through a story arc of increasingly impressive accomplishments.
Conceptual illustration of the “Mech Pods” the players in Planet Drop will use to explore their alien environment
Congratulations to Bridget on her summer intern project (the script is available for those wishing to use it), and to the STEM VR challenge recipients.
There has been another recent spate of articles on Linden Lab, Project Sansar, Second Life and the potential for avatar-based virtual spaces with the upcoming advent of VR. Even Moviepilot, whom I took to task in 2014, has been busy looking at what’s going on, while Gamasutra rushed out what is essentially a nutshell version of Eric Johnson’s excellent Re/code article examining the question of the metaverse, which I looked at here.
However, the pick of the latest crop has to be Alice Truong’s article published in Quartz: Could the Oculus Rift help give Second Life a second life? While the title might sound Second Life-centric and suggestive of a piece looking at how it will fare under the Rift (“not very well”), it is anything but.
What is actually presented is a well-rounded piece on the future of avatar-based virtual spaces, one which uses Second Life both as the yardstick for such spaces and as a launchpad for discussing their future. Within it, Second Life is examined from a number of angles and Sansar is explored, together with a nodding look towards High Fidelity.
Alice Truong: thoughts on virtual spaces and avatars in Quartz (image credit: Quartz.com)
As with most of the pieces which have appeared over the last month or so, little real news on Sansar (or SL’s development, for that matter) is given out. This is hardly surprising, as the Lab does like to hold its cards close to its chest – the relative newness of Sansar (and thus the difficulty in highlighting specific tablets-of-stone facts) notwithstanding.
What makes this article a joy is that it provides a solid framing for the subject of the Lab and virtual worlds, reaching back to 1999 and the original efforts with The Rig. This is nicely packaged and offers a solid foundation from which Ms. Truong expertly weaves her piece. Some of the path she takes will be familiar, particularly where SL and Sansar are concerned. We get to hear about SL’s growth, revenue, the US $60 million collectively cashed-out of the platform by many of its users, etc.
We also get fair mention of the decline in the number of active users on the platform, but again, this is properly framed. At its peak, SL had around 1.1 million active users; eight-ish years later, that number stands at around 900,000. A decline, yes, but as Ebbe Altberg points out, hardly any kind of “mass exodus”; and certainly nowhere near the dire haemorrhaging of users we tend to hear proclaimed to be happening every time the Lab makes what is perceived as an irksome decision.
For Sansar, similarly familiar ground is covered – the revenue model (and the comparison with SL’s model and its weakness), the promise of VR, the opportunity to grow a platform for “tens, if not hundreds” of millions of users, the aspect of much broader “discoverability” / ease of access for Sansar in order to help generate more appeal, and so on.
Mention is made of the Lab planning to “commercially release” Sansar by the end of 2016. Given what has been said by the Lab to date concerning time frames for future work, and allowing for Ebbe’s comments of perhaps having something worthy of a “version 1.0” label by the close of 2016, I’m taking the comment to be more of a misunderstanding on Ms. Truong’s part than any revelation as to Sansar’s roadmap.
Hunter Walk (l), the Lab’s former “Director of Everything Non-Engineering” as well as a founder of the company, and now a VC in his own right, and Bernard Drax, aka Draxtor Despres (r) offer thoughts on Sansar
Another enjoyable element of this article is that Ms. Truong casts her net wide for input, capturing both Hunter Walk and Draxtor Despres. Their comments both offer a means by which ideas can be further explored in the piece, and provide a measure of counterpoint to the assumed mass appeal spaces like Sansar and High Fidelity will have.
Hunter Walk, for example, underlines the most critical problem in growing users Second Life has faced throughout its lifetime – that of accessibility and ease of use. As he states, “ultimately, the work you had to put in was, for most people, more than the fun you got out.” Not only does this underline the essential truth about SL’s longest-running issue (it’s as true today for many as it was in 2003/4), it lays the foundation for an exploration of some of Sansar’s fundamental differences to SL later in the article.
Hunter also passes comment on the idea of these spaces finding many millions of users, pointing out that “tens of millions” was always an unrealised dream at the Lab for Second Life; perhaps a cautionary warning about focusing on user numbers. He also seems to offer something of a warning on investment returns in such ventures, again referencing Second Life, although if intended as a warning, it is more relevant to High Fidelity (which has received around US $16.5 million in investment to date).
Draxtor similarly questions whether user numbers should necessarily be the focus / rationale for building these kinds of virtual spaces. Like him, I’m far from convinced Sansar will have the kind of broad-ranging reach to draw in “hundreds of millions” (or, if I’m honest, even more than the low tens of millions). I’ve explained some of the reasons why I think this in my review of Eric Johnson’s piece, linked to towards the top of this article, so I won’t repeat them here.
Could the promise of “mixed reality” technologies which combine VR, AR and physical world activities yet serve to keep avatar-based virtual spaces a niche endeavour? (image: Magic Leap, via the New York Times)
If I’m honest, my only regret is that while Ms Truong’s tone is (rightly) sceptical in places, there is no outright challenge to the idea that people will embrace avatar-based interactions on a massive scale just because VR is on our doorstep.
Right now, there is a lot going on in the world of technology: VR, AR, the potential to fuse the two; faster communications capabilities, much better mobile connectivity, and so on. All of these could serve to dramatically marginalise any need to persistently engage in avatar-based interactions outside of very defined areas. As such, the inescapable whiff of “if we build it, they will use it” (to utterly mangle an already oft-misquoted line from a certain film) which seems to pervade the talk of High Fidelity and Sansar does perhaps deserve a degree of challenge.
Perhaps I should drop a line to Peter Gray suggesting an interview on those lines…
The obligatory Sansar promo image 🙂 (please can we have some new ones?) – Linden Lab
Eric Johnson has a thought-provoking article over on Re/code. In Welcome to the Metaverse, he ponders the lot of avatar-based virtual spaces, past and future, and how a number of companies – the Lab included – are betting that the “new era” of VR is going to be the means by which such spaces become mainstream.
It’s an interesting piece, offering plenty of food for thought, starting with an opening statement by the Lab’s CEO, Ebbe Altberg, on defining human life:
What humans do is create spaces. Some spaces are mobile, like a bus. San Francisco is a space that was created by its users. Whether you go into a pub, a bar, a classroom, a bowling alley, an office, a library … We create spaces and we have people come together in those spaces, and then we communicate and socialize within those spaces.
This is actually the first thing about the article that leaves me with a familiar feeling of being at odds with the prevailing view of all things metaverse, albeit for a slightly different reason. With due respect to Mr. Altberg, people didn’t come together as a result of building spaces; they built spaces as a result of coming together. However, as an opening gambit for a study of this thing we call the “metaverse”, it’ll do.
Eric Johnson, Associate editor, Gaming at Re/code (via LinkedIn)
From here, Mr. Johnson gives us the pocket introduction to “the metaverse” via the obligatory (and rightful) nod to Neal Stephenson, while simultaneously dispensing quickly with a look at the “past promise” of virtual spaces that didn’t, in the end, measure up to expectations.
This leads the way to a clever little nod to the book which has become this decade’s “Snow Crash” – in the form of Ernest Cline’s Ready Player One (which is actually a very good read) – as a means to introduce the three main companies he sees as currently vying for space in “the metaverse”: the Lab, High Fidelity and AltspaceVR.
Chances are that Sansar and High Fidelity are already well-known to people reading these pages, while AltspaceVR may have passed somewhat unnoticed. As the article points out, they’ve been developing avatar-based VR for the last couple of years, focusing on shared spaces (watching a film with a friend who is halfway across the world, for example) and scheduled events, including gaming weekends, etc.
AltspaceVR also has some ideas for business applications with their environments, which they are planning to offer on a pay-to-use basis. And while their avatars may have been viewed with disdain by some, there are a couple of points to bear in mind where the company is concerned.
The first is that as a result of watching some of AltspaceVR’s virtual interactions, Mark Zuckerberg caught the social VR bug, and Facebook went after Oculus VR, with the subsequent $2 billion acquisition (which was actually quite a modest punt when compared to the $19 billion the company had earlier spent on a proven technology in WhatsApp).
The second is that the company, which has been around for about as long as Philip Rosedale’s High Fidelity, has raised almost a comparable amount in funding – around $15.7 million to date (SEC filings indicate High Fidelity has raised around $16.5 million) – and both are working to solve many of the same technical issues: head and motion tracking, eye tracking, etc.
Beyond this, others interested in making a pitch into the metaverse space, as Mr. Johnson mentions, are IMVU, which has around 15% of its 130+ staff now working on trying to integrate VR into its existing spaces (à la the Lab’s early efforts with SL and the Rift), and a small New York-based start-up focusing on VR social games, with around $300,000 in seed money. Called Surreal, the four-person company is billing itself as “the first fully immersive virtual world”, focused entirely on using VR HMDs (Oculus, Gear VR and Cardboard).
Johnson attempts to split his examination of the metaverse into two views: the short-term and the long-term. In doing so, he inevitably points to the elephant in the room: Facebook. In this, he quotes Palmer Luckey, who gives a fair warning as to whether or not “the metaverse” is around the corner, and which stands as a cautionary warning, in more ways than one:
I think at this point the term ‘metaverse’ is a bit undefined. For any one company to say, ‘We are building the metaverse’ is pretty hyperbolic. Building all the pieces is going to be hard, and the way you imagine things in sci-fi doesn’t always translate over to the way things will be in the real world.
Palmer Luckey: prescient words on “the metaverse”?
He has a very valid point; and with today’s rapidly evolving pace of technology, it’s one worth keeping in mind; the technical issues people see today as only being surmountable through the use of avatars may not actually be technical issues a few years hence.
Interestingly, Johnson places this in the “short-term” view – although both Oculus VR and Facebook have always talked in terms of “the metaverse” still being around a decade away. For the longer term, Johnson looks in particular at High Fidelity’s work and at Second Life’s revenue generation success (and, despite the naysayers out there, SL is a commercial success, both for the Lab and its users, the latter of whom benefited from collective revenues of $60 million from the platform in 2014), before taking another look at AltspaceVR.
There is a lot to be digested in the piece, and it makes for a good read. However, for me, Palmer Luckey’s warning that things don’t always translate to the real world as imagined stands out, given how much of the approach being taken with avatar-based virtual spaces smacks of the “if you build it, they will use it” school of thought.
I don’t doubt for a minute that such spaces will have a lot of applications among various vertical markets. It is no coincidence that the likes of Philip Rosedale and Ebbe Altberg talk much the same language concerning them: education, training, healthcare, business; there is potential for avatar-based VR spaces in all of them. But I’m still not convinced that, longer-term, such spaces are going to claim a much larger market among casual consumers than is currently the case, for a couple of reasons.
The first is that the vast majority of people really haven’t seen the need to “climb in” to an avatar for their social interactions – and getting a shiny new headset (which Johnson quotes some rather interesting demographics about) isn’t actually going to change that. The second is connected to the headsets themselves.
High Fidelity and Linden Lab see the education sector as a major focus for their efforts – and neither is wrong. But are avatar-based virtual spaces really going to go consumer mass market?
Simply put, it would seem likely that this brave new world of VR could end up delivering so many fantastic experiences and opportunities to the casual user, that the majority still won’t see the need to invest time and effort in creating a virtual alter-ego of the kind we desire (and we, as SL / OpenSim users, are a niche), because so much else is being delivered to them pre-packaged and ready-to-go. Thus, as Palmer Luckey indicates, the chances are “the metaverse” could well arrive in our lives in a manner very different to that being envisaged by High Fidelity and Linden Lab, thus leaving their approach still very much niche-oriented.
Not that there is anything wrong with that either. As both Rosedale and the Lab can demonstrate, it’s done them rather nicely over the years. And it is fair to say that “niche” this time around is liable to be somewhat larger, simply because of the vertical market opportunities they’re looking at.
Even so, and as mentioned, there is an optimistic “we build it / they will come” aspect to the whole idea of avatar-based virtual spaces, and it would be nice to see an article probing the pros and cons a little more. Perhaps that might be something for a follow-up from Mr. Johnson? In the meantime, Welcome to the Metaverse is a thought-provoking read, and for reasons I’ve not even scratched at here (such as the question of on-line abuse); as such, it’s not one to miss.
One of the big use-cases is going to be kids maybe doing an extra, like instead of doing their homework in the normal way in the evening, they go on-line where they join a study group where they join a teacher..
So opens segment #75 of The Drax Files Radio Hour, with some thoughts from Philip Rosedale, co-founder of Second Life, and more particularly now the CEO of start-up virtual worlds company, High Fidelity.
At just over 89 minutes in length, this is a special show, exploring High Fidelity from the inside, so to speak, complete with conversations with Mr. Rosedale, Ryan Karpf (HiFi’s co-founder and ex-Linden), Chris Collins and Ozan Serim, while David Rowe (perhaps more familiarly known to SL users as Strachan Ofarrel, creator of the Oculus Rift-compatible CtrlAltStudio viewer), who has been working with the HiFi team, becomes a guest host for the segment.
Since its founding, High Fidelity has made remarkable strides in developing its next generation, open-source virtual world environment, both technically and financially. Since April 2013, the company has undergone three rounds of funding, attracting around US $16 million, most of which has come from True Ventures, Google Ventures and, most recently, Paul Allen’s Vulcan Capital (which also participated in the October 2014 US $542 million investment round for Magic Leap). In addition, HiFi has attracted a number of high-profile advisers, including VR veteran Tony Parisi and, most recently, professors Ken Perlin and Jeremy Bailenson.
As well as Philip Rosedale, Drax talks with Chris Collins (l), Ryan Karpf and Ozan Serim from High Fidelity
The interviews themselves are quite wide-ranging. With Dave Rowe (known in HiFi as CtrlAltDavid), the open-source nature of the platform is explored, from the ability to download and run your own HiFi server (aka “Stack Manager“) and client (aka “Interface“), through to the concept of the worklist, which allows contributors to bid for work on offer and get paid based on results. In Dave’s case, this has led to him working on various aspects of the platform, from integrating Leap Motion capabilities to improving eye tracking within HiFi’s avatars, so they track the movements of other avatars, just as our own eyes track other people’s facial and other movements as they interact with us.
In terms of general looks, the avatars – which have in the past been critiqued for being “cartoony” (despite it still being very early days for HiFi) – are still very much under development. In particular, Ozan Serim has been working to raise – no pun intended – the overall fidelity of the avatars in terms of looks and capabilities. He’s well-placed to do so, being an ex-Pixar animator.
One of the problems here is that the more real in appearance and capabilities they get, the closer the avatars come to the Uncanny Valley, which has led HiFi and Ozan to look at a number of avatar styles, from those which are very human in appearance through to those that are more “cartoonish” in looks.
A 2014 video showing Ozan’s work in improving the rigging around a more “realistic” HiFi avatar to more accurately reflect mouth forms and facial movement when singing. High Fidelity now uses Faceshift for real-time facial expression capture, rigging and animation, using either 3D or standard webcams
In discussing the Uncanny Valley, and particularly people’s reactions to avatars that are somewhat less-than-real (and we can include SL avatars in this, given their inability to naturally reflect facial expressions), Ozan raises the interesting question of whether people who critique the look of such avatars actually want a “realistic”-looking avatar, or whether it is more a case of wanting an avatar look that appeals to their aesthetics and which they can identify with.
This is an interesting train of thought, as it is certainly true that – limitations of the avatar skeleton aside – most of us in Second Life are probably more driven to develop our avatars to a point where they have a personal aesthetic appeal, rather than wanting them to be specifically “more realistic”.
Currently, HiFi is leaning towards a somewhat stylised avatar of the kind seen in Team Fortress 2, which is allowing them to develop a natural-looking avatar that doesn’t come too close to the Uncanny Valley. They use Adobe’s Mixamo as their avatar creation tool, which Ozan views as a capable workflow package, but one which may have some creative limitations. However, as an open-source environment, HiFi does offer the potential for someone to script “in-world” character modelling tools, or at least to offer upload capabilities for avatar models generated in tools such as Blender. Avatars can also, if wanted, be uploaded as a complete package with all required / defined animations, such as walks, etc., included.
Chris Collins has very much become the voice of High Fidelity on YouTube, producing a wide range of videos demonstrating features of the platform, together with short tutorial pieces. The video above is one of his, demonstrating how to code interactive 3D content, using the Planky game as an example
While Ozan and his team work on avatar animations and rigging using real-time capture, Ryan Karpf reveals that by default, an avatar’s facial expressions are driven by the audio more than by direct capture: the mouth movement, for example, comprises three positions based on the audio, while a rise in voice or tone can result in the avatar’s eyebrows rising and falling. Ryan also touches on the Uncanny Valley issue of people’s increasing discomfiture the closer avatars come to looking “photo-realistic”.
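That audio-driven approach is easy to picture in code. The sketch below is purely illustrative – hypothetical names and thresholds, not HiFi’s implementation – quantising loudness into three mouth shapes and mapping a rise in pitch above a speaker’s baseline to a proportional eyebrow lift:

```javascript
// Purely illustrative sketch of audio-driven expression, with made-up
// thresholds. Loudness (0..1) picks one of three mouth shapes.
function mouthShape(amplitude) {
  if (amplitude < 0.05) return "closed";
  if (amplitude < 0.4) return "half-open";
  return "open";
}

// Pitch above the speaker's baseline lifts the brows proportionally,
// clamped to the range [0, 1]; pitch at or below baseline leaves them flat.
function eyebrowLift(pitchHz, baselineHz) {
  const lift = (pitchHz - baselineHz) / baselineHz;
  return Math.max(0, Math.min(1, lift));
}
```

The appeal of this approach is that it needs nothing beyond the voice stream every client already receives – no camera or face-tracking hardware – which is presumably why it serves as the default.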
In talking to Chris Collins, a Linden Lab alumnus who headed the former SL Enterprise division and now wears a number of hats at HiFi, Drax discusses how HiFi deals with the ever-changing face of the emerging VR hardware market, where headsets, input, tracking, and so on, are in something of a state of flux. Chris points out that while open-source, HiFi does have a set of strict coding standards and licensing, and offers external libraries to help support third-party SDK integration.
One of the powerful elements of High Fidelity is the ability for you to have full agency over your environment, if you so wish: using the Stack Manager, you can create your own server / world / space, and control who might access it. The scripting tools similarly allow users to download and tweak elements – such as walking animations, a basic avatar appearance, etc. – quickly and easily.
Few people involved in VR and augmented reality doubt that these emerging technologies will have a profound effect on education and teaching. As has been seen in both Second Life and OpenSimulator, even without immersive VR, virtual environments offer a huge opportunity for education.
Now High Fidelity is joining in, and is doing so in a novel but enticing way: by offering up to three US$5,000 grants to teams or individuals who want to build educational content within High Fidelity.
The news of the opportunity, which the HiFi team is calling the “STEM VR Challenge” (STEM being the education acronym for Science, Technology, Engineering and Mathematics), came via a blog post on the High Fidelity website from Ryan Karpf. In it, Ryan says:
High Fidelity recently had the pleasure of showing off our open source virtual reality platform to educators and technical integrators at the ISTE conference in Philadelphia.
To demonstrate one way educators can use our platform, High Fidelity worked with DynamoidApps to develop an interactive model of an animal cell that can be explored on one’s own or with an entire class. The vast alien looking environment goes beyond just showing the parts of the cell, also showing some of the processes taking place. Travelling around with your classmates and teacher allows for real time question and answers and sharing of ideas.
If you want to visit this animal cell, login and go to cellscience/start, and fly towards any cell you see to begin your journey. Hitch a ride on a motor protein and jump off at one of the huge mitochondria along the way!
The interactive model of an animal cell created by High Fidelity, working with DynamoidApps (image courtesy of High Fidelity)
The model itself, in keeping with High Fidelity’s open-source approach to their platform, is being offered free to anyone who wishes to modify it, with the company hoping it will become the first of a catalogue of educational units created within High Fidelity.
To further kick-start things, High Fidelity is inviting educators, be they individuals or groups, to take up the STEM VR Challenge by submitting proposals for educational content in High Fidelity which meets the criteria set out on the Challenge website, namely that the content is:
HMD (e.g. Oculus Rift) featured
High school age appropriate
STEM focused
Social (can be experienced by >3 people together)
Proposals meeting these criteria and abiding by the rules are eligible to enter the Challenge, and should be submitted via e-mail to eduvrgrant-at-highfidelity.com. On offer are up to three grants of US$5,000 apiece to help further develop the selected ideas. In addition, awardees will have direct access to High Fidelity’s technical support, and will have their content hosted by High Fidelity. To find out more, follow the links to the High Fidelity blog and the STEM VR website.