Sinewave.space: a further contender for virtual spaces

The above video has been gaining attention since first appearing on YouTube at the end of August. It’s advance promotion for a new virtual worlds platform called Sinewave.space, built using the Unity 3D engine, and which may be opening its doors to initial users in December 2015.

The company behind Sinewave.space is Sine Wave Entertainment, a name which may be familiar to many Second Life users, given it is also the company behind the highly successful Sine Wave animations brand in-world.

Spearheading the work is Sine Wave’s CEO, Adam Frisby, a man who has considerable experience with virtual world platforms, having been one of the founders of the OpenSimulator project. In Second Life he is probably better known as Adam Zaius, the man behind such ventures as Azure Islands and the DeepThink virtual worlds development agency, which operated in both Second Life and OpenSim.

Adam Frisby is perhaps more recognisable to many in SL as Adam Zaius

Nor are Sine Wave Entertainment new to the virtual worlds market. They’ve built and operated a number of virtual world spaces themselves, and they’ve produced virtual world spaces on behalf of clients, with all of their products created using the Unity 3D engine.

Perhaps the largest of their own environments is Wet.fm, a music-focused virtual environment claiming some 400,000 “live audience members”, 120 artists and some 600 music events held to date.

Chief among the client-oriented spaces the company have developed is Flybar, a “multiplayer social game and on-line cinema for [the] globally distributed Spanish language soap opera Cuéntame cómo pasó“, which claims 1.2 million unique visitors since 2012, together with the GoJiyo virtual world / platform. The latter was originally developed for India’s Godrej Industries and boasts 1.7 million registered users. It also appears to have what might be called associated games or spin-offs, such as Jiyopets.

Sine Wave are responsible for the India-based virtual world GoJiyo, developed for Godrej Industries, which boasted 1.7 million registered users

Reading the available information about sinewave.space, it’s interesting to note the similarities in approach between it and Project Sansar. For example, both platforms are intended to be white label environments in which creators can build their own branded spaces, and then promote / market them directly to their potential audience, complete with sign-up portal, etc., without that audience necessarily being aware that the space they are entering is part of a platform providing many such spaces / experiences.

Further, both companies indicate the spaces within each platform could potentially be of unlimited size (Sine Wave indicate bandwidth, and Linden Lab the physics simulator, as being the only practical limitations to “land size”); both platforms will offer a mix of “in-built” tools as well as support for a broad range of 3rd party tools for content creation – although Sine Wave would appear to be significantly further down the road in this. Sine Wave and Linden Lab also appear to be steering a similar course in terms of offering central user account management, virtual goods marketing, etc., which can be used across multiple environments running on their platforms.

Sine Wave are offering a content creation tool chain which includes an advanced animation / gesture system and, as illustrated above, a “humanoid resizer” tool, designed to allow mesh clothing sized for “popular avatar skeletons” to be automatically resized to fit the primary Sinewave.space avatar skeleton

Which should not be taken to mean I think the two are in any way connected – I don’t. Rather, I find it interesting that two companies, each with their own approach to building and running immersive 3D spaces, have arrived at a similar conceptual approach as to how to build a platform aimed at being flexible enough in design and implementation to appeal to a wide cross-section of potential use-cases, without necessarily tying creators / clients / partners – or indeed, users – to a single branded environment.

Obviously, there are differences as well. For example, Sine Wave have indicated that among the worlds running on Sinewave.space will be a number of their own spaces – such as the aforementioned wet.fm, which is due for a re-launch under the sinewave.space banner in the near future – with the Sine Wave portfolio listing a number (all?) of such spaces which might be candidates for inclusion.

Sine Wave also produce Convvirt, a business-oriented space built on Unity 3D. Whether it is to form a part of the overall Sinewave.space “federation” of virtual worlds is unclear, but it is listed under the Sine Wave portfolio bearing the Sinewave.space brand, so one assumes so

Sine Wave also have the advantage of building on an engine – Unity 3D – with which they have many years of experience of both operating and building virtual spaces, rather than starting entirely from scratch. Lessons learned from past efforts can be put directly to use. They are also well-versed in the tools and capabilities contained within the engine, without having to go through an internal learning curve as a part of the development process, and they have experience in combining the tools within the engine with their own tools – motion capture, animation, etc. – to present creators with an integrated tool chain.

As it is, and as noted earlier, Sine Wave are seeking content creators – region designers, clothing designers, animators and gesture designers, vehicle builders, and more – and in doing so, they’re offering those signing up a 70/30 (in the creator’s favour) revenue split on all content sold within the platform’s worlds once they are opened to users. Those interested should follow the above link to find out more.

It’ll be interesting to see how sinewave.space develops over the coming months, both independently and with Project Sansar as a possible frame of reference (and even vice-versa), and I hope to be able to provide updates on progress through these pages.

Note: this article was largely drafted prior to show #84 of the Drax Files Radio Hour podcast, in which Drax talks to Adam Frisby about Sinewave.space. You can hear the conversation starting at the 34:30 mark, with an introduction by Drax.

High Fidelity: into the solar system and STEM grant recipients

I’m rather into space and astronomy – that much should be obvious from my Space Sunday reports, and coverage of missions like the Curiosity rover, astronomical events like the transit of Venus and so on.

So when High Fidelity posted news on the 2015 summer intern project, and the words “solar system” featured in it, my attention was grabbed. The post opens:

Hello! I’m Bridget, and I’ve been interning at High Fidelity this summer, working to build some JavaScript content in HF. As a math and computer science major, I had the opportunity to hone my programming skill set, learning from Hifi’s superb team of software engineers and design-minded innovators.

So here’s the culmination of my work this summer: a virtual orbital physics simulation that provides an immersive, interactive look at our solar system.

Bridget’s solar system model correctly simulates the movement of planetary bodies around a stellar object, utilising both Newton’s and Kepler’s laws, thus producing a dynamic teaching model for orbital mechanics and gravity – with a potential application for teaching aspects of physical cosmology

The goal of Bridget’s project is to demonstrate what can be built using JavaScript (and some C++), with a particular emphasis on building educational content in High Fidelity, and by using the solar system, she has come up with a highly innovative approach to teaching orbital mechanics – and more besides.

Essentially, she has created a model of the solar system which uses “real” gravitational physics to simulate the motion of the planets around the Sun. The planets themselves occupy orbits scaled relative to Earth, and fixed reference values are used for the orbital period, large and small body masses, and gravity. Then, a little Newtonian physics is thrown into the mix, together with a sprinkling of Kepler’s Laws of planetary motion. Thus, the scripting ensures that the planets maintain a stable orbit, while updates correctly mimic each planet’s orbital trajectory around the Sun.

This generates a model that is interesting enough in itself, if somewhat simplified in nature, as Bridget notes, whilst also pointing to its potential for further use:

While the simulation exploits a somewhat simplified model, namely neglecting the elliptical nature of the planets’ orbits, it can easily be modified to account for additional factors such as the n-body problem.

In other words, there is the potential here to both refine the model in terms of orbital mechanics and planetary motion as a part of the teaching / learning process, and perhaps even dip a toe into physical cosmology.
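Bridget’s actual script isn’t reproduced in the post, but the basic recipe described above (seed each planet with the circular-orbit speed implied by Newton and Kepler, then nudge velocity and position on each update) can be sketched in JavaScript, HiFi’s scripting language. This is purely an illustrative sketch, with function names of my own invention:

```javascript
// Illustrative sketch only, not Bridget's script: a minimal Newtonian
// orbit integrator of the kind described in the post.
var G = 6.674e-11;        // gravitational constant (SI units)
var SUN_MASS = 1.989e30;  // kg

function makePlanet(distance) {
    // For a circular orbit, Newton/Kepler give speed v = sqrt(G * M / r).
    var speed = Math.sqrt(G * SUN_MASS / distance);
    return { position: { x: distance, y: 0 }, velocity: { x: 0, y: speed } };
}

function step(planet, dt) {
    // Acceleration a = -G * M / r^2, directed towards the Sun at the origin.
    var p = planet.position;
    var r = Math.sqrt(p.x * p.x + p.y * p.y);
    var a = -G * SUN_MASS / (r * r);
    planet.velocity.x += a * (p.x / r) * dt;
    planet.velocity.y += a * (p.y / r) * dt;
    p.x += planet.velocity.x * dt;
    p.y += planet.velocity.y * dt;
}

var earth = makePlanet(1.496e11);   // roughly 1 AU, in metres
for (var i = 0; i < 1000; i++) {
    step(earth, 3600);              // advance in hour-long steps
}
```

Because the velocity is updated before the position (semi-implicit Euler), the orbit stays stable over many updates, which is the behaviour the post describes; a naive update order would instead cause the planets to slowly spiral outwards.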

The simulation includes a UI which allows users to perform a number of tasks, including playing a little game and being able to zoom into the planets.

Bridget also notes:

Another fun aspect of the project was implementing UI to create possibilities for exploration and experimentation within the simulation. A panel with icons lets you:

  • Pause the simulation and show labels above each planet revealing its name and current speed
  • Zoom in on each planet
  • Play a “Satellite Game” (think Lunar Lander, but with a satellite around the earth), where you attempt to fling a satellite into stable orbit
  • Adjust gravity and/or the “reference” period, and see what happens!

Bridget’s work marks the second time a summer intern has reported on working at High Fidelity during the summer hiatus. In 2014, Chris Collins chatted to the (then) 17-year-old Paloma Palmer, a high school student also honing her coding skills. She focused on coding voxels to respond directly to volume inputs over a microphone in real-time. You can see her discussion with Chris on the HiFi YouTube channel.

Staying with education, and following on from my coverage of High Fidelity’s STEM VR challenge, Ryan Karpf announced the first of the grant recipients on Friday, August 14th.

The VR challenge invited educators, be they individuals or groups, to submit proposals for educational content in High Fidelity which meets the criteria set out on the Challenge website, namely that the content is:

  • HMD (e.g. Oculus Rift) featured
  • High school age appropriate
  • STEM focused
  • Social (can be experienced by >3 people together).

On offer were up to three grants of US $5,000 each for recipients to further develop their ideas.

In his announcement Ryan indicated that two grant recipients had been selected from the submissions: the TCaRs VR Challenge and Planet Drop VR.

Both use game mechanics: TCaRs (Teaching Coding – a Racing simulation) enables users to interact with and customise their racing cars using JavaScript, while Planet Drop places players into an alien planet environment which they must explore through “cooperative asymmetrical gaming”. Each player receives highly specialised information, based on their chosen STEM field, via a game HUD, and the aim is for them to work together, sharing the information they receive as quickly and effectively as possible to allow the team to solve challenges and advance through a story arc of increasingly impressive accomplishments.

Conceptual illustration of the “Mech Pods” the players in Planet Drop will use to explore their alien environment

Congratulations to Bridget on her summer intern project (the script is available for those wishing to use it), and to the STEM VR challenge recipients.

OpenSimulator: Justin Clark-Casey steps back

Maria Korolov on Hypergrid Business covers the news that Justin Clark-Casey is significantly scaling-back his involvement in OpenSimulator development.

Justin Clark-Casey

For those deeply entrenched in Second Life, his name may well pass unnoticed. However, since 2007, Justin has been deeply involved in OpenSimulator, both as a core developer and as a founding member and first president of the Overte Foundation, a non-profit organisation that manages contribution agreements for the OpenSimulator project.

Just how big a role he has played can in part be seen through the 11,631 code commits he has personally made to the project over eight years – that averages out to just under four commits every single day.

Justin announced his decision to step back from what has been a central role within the OpenSimulator project in a blog post, where he emphasised that he’s doing so in part because he’s shifting career, although he makes it clear he is not leaving OpenSimulator entirely; it just won’t be a primary focus in his life in the foreseeable future:

OpenSimulator (and the Metaverse in general) has been an amazing journey but, as they say, we have grown apart. For whatever reason the area doesn’t fascinate me as it did. For better or for worse, that’s crucial for me to feel happy in my work.

I’m not disappearing completely but very likely for the immediate future my involvement will be at a low ebb (mainly answering mailing list questions and the occasional bug fix). My new field is quite a bit different (data warehousing for genetics and synthetic biology) but I will always have a soft spot for virtual worlds and the idea of the Metaverse.

Justin Clark-Casey’s code commits to OpenSimulator amount to 11,631 over eight years, work that has involved him in laying many of the foundations for the project and in re-factoring much of the code-base in 2011/12 (source: Black Duck Open Hub open source project tracker, via Hypergrid Business)

As well as his own code contributions, Clark-Casey has been noted for carrying out a significant portion of the work required to integrate patches submitted by others, and has also taken on many of the organisational duties and activities which have perhaps been seen as somewhat onerous by other developers.

His popularity and importance to the OpenSimulator community can be measured by the outpouring of personal thanks and testimonials which followed his own blog post and featured in Maria’s Hypergrid Business article.

According to Maria, Justin’s announcement has led to some concerns as to the future of the project. While there has never been a single de facto leader for the platform and its very diverse and global community, Clark-Casey has very much been the public face of the platform, hence some of the concerns raised.

However, as others central to the platform’s development have been quick to point out, this is not the first time a key figure has opted to step back from the platform. As it is, the team of core developers has changed over the years and remains strong. Similarly, OpenSimulator itself enjoys broad-based support and engagement from individuals, groups, education, academia and business. As such, there is little need to doubt its foreseeable future.

“Open source development has a high churn of people, for many reasons, and many times people who have been there for a long time simply decide to leave and do something else,” Crista Lopes, creator of the Hypergrid, is quoted as saying in Hypergrid Business. “The good thing about open source projects is that, if people find them useful or interesting, the projects survive any one particular developer’s absence. That will happen with OpenSim too.”

I only had cause to talk to Justin twice over the years, and was certainly not in any way acquainted with him. However, as a very occasional OpenSimulator visitor (notably via Kitely, OSGrid and InWorldz), I offer my own thanks to him for all of his contributions to the OpenSim community, and best wishes as he enters a new stage in his career.


The Drax Files Radio Hour: giving it the HiFi!

One of the big use-cases is going to be kids maybe doing an extra, like instead of doing their homework in the normal way in the evening, they go on-line where they join a study group where they join a teacher…

So opens segment #75 of the Drax Files Radio Hour, with some thoughts from Philip Rosedale, co-founder of Second Life, and more particularly now the CEO of start-up virtual worlds company, High Fidelity.

At just over 89 minutes in length, this is a special show, exploring High Fidelity from the inside, so to speak, complete with conversations with Mr. Rosedale, Ryan Karpf (HiFi’s co-founder and ex-Linden), Chris Collins and Ozan Serim, while David Rowe (perhaps more familiarly known to SL users as Strachan Ofarrel, creator of the Oculus Rift compatible CtrlAltStudio viewer), who has been working with the HiFi team, becomes a guest host for the segment.

Since its founding, High Fidelity has made remarkable strides in developing its next generation, open-source virtual world environment, both technically and financially. Since April 2013, the company has undergone three rounds of funding, attracting around US $16 million, most of which has come from True Ventures, Google Ventures and, most recently, Paul Allen’s Vulcan Capital (which also participated in the October 2014 US $542 million investment round for Magic Leap). In addition, HiFi has attracted a number of high-profile advisers, including VR veteran Tony Parisi and, most recently, professors Ken Perlin and Jeremy Bailenson.

As well as Philip Rosedale, Drax talks with Chris Collins (l), Ryan Karpf and Ozan Serim from High Fidelity

The interviews themselves are quite wide-ranging. With Dave Rowe (known in HiFi as CtrlAltDavid), the open-source nature of the platform is explored, from the ability to download and run your own HiFi server (aka “Stack Manager“) and client (aka “Interface“), through to the concept of the worklist, which allows contributors to bid for work on offer and get paid based on results. In Dave’s case, this has led to him working on various aspects of the platform, from integrating Leap Motion capabilities to improving eye tracking within HiFi’s avatars, so they track the movements of other avatars, just as our own eyes track other people’s facial and other movements as they interact with us.

In terms of general looks, the avatars – which have in the past been critiqued for being “cartoony” (despite it still being very early days for HiFi) – are still very much under development. In particular, Ozan Serim has been working to raise – and no pun intended here – the overall fidelity of the avatars in terms of looks and capabilities. He’s well-placed to do so, being an ex-Pixar animator.

One of the problems here is that the more real in appearance and capabilities they get, the closer the avatars come to the Uncanny Valley, which has led HiFi and Ozan to look at a number of avatar styles, from those which are very human in appearance through to those that are more “cartoonish” in looks.

A 2014 video showing Ozan’s work in improving the rigging around a more “realistic” HiFi avatar to more actually reflect mouth forms and facial movement when singing. High Fidelity now use Faceshift for real-time facial expression capture, rigging and animation, using either 3D or standard webcams

In discussing the Uncanny Valley, and particularly people’s reactions to avatars that are somewhat less-than-real (and we can include SL avatars in this, given their inability to naturally reflect facial expressions), Ozan raises the interesting question of whether people who critique the look of such avatars actually want to have a “realistic” looking avatar, or whether it is more a case of people wanting an avatar look that is appealing to their aesthetics which they can then identify with.

This is an interesting train of thought, as it is certainly true that – limitations of the avatar skeleton aside – most of us in Second Life are probably more driven to develop our avatars to a point where they have a personal aesthetic appeal, rather than wanting them to be specifically “more realistic”.

Currently, HiFi is leaning towards a somewhat stylised avatar as seen in Team Fortress 2, which is allowing them to develop a natural-looking avatar that doesn’t come too close to the Uncanny Valley. They use Adobe Mixamo as their avatar creation tool, which Ozan views as a capable workflow package, but which may have some creative limitations. However, as an open-source environment, HiFi does offer the potential for someone to script “in-world” character modelling tools, or at least to offer upload capabilities for avatar models generated in tools such as Blender. Avatars can also, if wanted, be uploaded as a complete package with all required / defined animations, such as walks, etc., included.

Chris Collins has very much become the voice of High Fidelity on YouTube, producing a wide range of videos demonstrating features of the platform, together with short tutorial pieces. The video above is one of his, demonstrating how to code interactive 3D content, using the Planky game as an example

While Ozan and his team work on avatar animations and rigging using real-time capture, Ryan Karpf reveals that, by default, an avatar’s facial expressions are driven by the audio more than by direct capture: the mouth movement, for example, comprises three positions based on the audio, while a rise in voice or tone can result in the avatar’s eyebrows rising and falling. Ryan also touches on the Uncanny Valley issue of people’s increasing discomfiture the closer avatars come to looking “photo-realistic”.
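As a rough illustration (a hypothetical sketch of my own, not High Fidelity’s actual code), driving expressions from audio in the way Ryan describes might amount to bucketing loudness into one of three mouth shapes, and lifting the brows when the voice rises above its running level:

```javascript
// Hypothetical sketch of audio-driven facial expression, as described:
// three discrete mouth positions plus an eyebrow lift driven by loudness.
function mouthPosition(loudness) {
    // Map normalised loudness (0..1) to one of three mouth shapes.
    if (loudness < 0.2) return "closed";
    if (loudness < 0.6) return "half-open";
    return "open";
}

function eyebrowLift(loudness, runningAverage) {
    // Raise the brows in proportion to how far the voice rises
    // above its recent average level, clamped to the 0..1 range.
    return Math.max(0, Math.min(1, loudness - runningAverage));
}
```

The thresholds and the 0..1 loudness scale here are invented for the example; the point is simply that a handful of audio features can drive plausible facial motion without any camera at all.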

In talking to Chris Collins, an ex-Linden who headed the former SL Enterprise division and who now wears a number of hats at HiFi, Drax discusses how HiFi deals with the ever-changing face of the emerging VR hardware market, where headsets, input, tracking, and so on, are in something of a state of flux. Chris points out that while open-source, HiFi does have a set of strict coding standards and licensing, and offers external libraries to help support third-party SDK integration.

One of the powerful elements of High Fidelity is the ability to have full agency over your environment, if you so wish; using the Stack Manager, you can create your own server / world / space, and control who might access it. The scripting tools similarly allow users to download and tweak elements – such as walking animations, a basic avatar appearance, etc. – quickly and easily.

Continue reading “The Drax Files Radio Hour: giving it the HiFi!”

High Fidelity launches US$15,000 STEM VR Challenge

Few people involved in VR and augmented reality doubt that these emerging technologies will have a profound effect on education and teaching. As has been seen in both Second Life and OpenSimulator, even without immersive VR, virtual environments offer a huge opportunity for education.

Now High Fidelity is joining in, and is doing so in a novel but enticing way: by offering up to three US$5,000 grants to teams or individuals who want to build educational content within High Fidelity.

The news of the opportunity, which the HiFi team is calling the “STEM VR Challenge” (STEM being the acronym for Science, Technology, Engineering and Mathematics in education), came via a blog post on the High Fidelity website from Ryan Karpf. In it, Ryan says:

High Fidelity recently had the pleasure of showing off our open source virtual reality platform to educators and technical integrators at the ISTE conference in Philadelphia.

To demonstrate one way educators can use our platform, High Fidelity worked with DynamoidApps to develop an interactive model of an animal cell that can be explored on one’s own or with an entire class. The vast alien looking environment goes beyond just showing the parts of the cell, also showing some of the processes taking place. Travelling around with your classmates and teacher allows for real time question and answers and sharing of ideas.

If you want to visit this animal cell, login and go to cellscience/start, and fly towards any cell you see to begin your journey. Hitch a ride on a motor protein and jump off at one of the huge mitochondria along the way!

The interactive model of an animal cell created by High Fidelity, working with DynamoidApps (image courtesy of High Fidelity)

The model itself, in keeping with High Fidelity’s open-source approach to their platform, is being offered free to anyone who wishes to modify it, with the company hoping it will become the first of a catalogue of educational units created within High Fidelity.

To further kick-start things, High Fidelity are inviting educators, be they individuals or groups, to take up the STEM VR Challenge and submit proposals for educational content in High Fidelity which meets the criteria set out on the Challenge website, namely that the content is:

  • HMD (e.g. Oculus Rift) featured
  • High school age appropriate
  • STEM focused
  • Social (can be experienced by >3 people together)

Proposals meeting these criteria and abiding by the rules are eligible to enter the Challenge, and should be submitted via e-mail to eduvrgrant-at-highfidelity.com. On offer are up to three grants of US$5,000 apiece to help further develop the selected ideas. In addition, awardees will have direct access to High Fidelity’s technical support, and have their content hosted by High Fidelity. To find out more, follow the links to the High Fidelity blog and the STEM VR website.

With thanks to Indigo Mertel for the pointer.

High Fidelity update users with a quarterly report

High Fidelity have issued a progress report for the second quarter of 2015, which has been circulated to users via e-mail and made available as a blog post.

In the report, they highlight recent achievements / work, including:

  • The fact that they’ve been hiring in new talent (and are still looking for more). It should be noted that the talent is not restricted to employees, either. At the end of May, Professor Jeremy Bailenson of the Virtual Human Interaction Lab at Stanford University and Professor Ken Perlin both joined High Fidelity’s growing list of high-powered advisors
  • The instructions and video on setting up the Stack Manager to run your own High Fidelity server have been updated, with the promise that next up will be the ability to optionally share your server resources with other nearby users who need extra capacity
  • The ability to track and capture head movements and facial expressions with a regular webcam, as an alternative to needing a 3D camera
  • The arrival of the High Fidelity Marketplace, where you can drag and drop content into your server, and also to upload content you want to share with others. This is currently a sharing environment rather than a commerce environment, but the promise is that the commerce aspect will be coming soon
  • Commencing work on implementing distributed physics, building on top of the open source Bullet physics engine, with the aim of having low latency for interactions while maintaining the same state among participants – such as when people in different locations are playing Jenga or billiards together
  • The ability to import web content into High Fidelity – static web pages, videos, interactive web pages, etc., complete with a demonstration video and the promise of figuring out the best ways to allow the different types of shared browsing that people are going to need
  • My personal favourite: zone entities, skyboxes and dynamic lighting with spherical harmonic lighting and optional sync to real-world day/night cycles

Also in the Next Steps aspects of High Fidelity’s development is the intriguing promise of avatars with soft bodies, which are capable of interacting physically, or as Philip Rosedale puts it in the blog post, “imagine sword-fighting, for example”, while being driven by hand controllers such as those coming with the HTC / Valve Vive or for the Oculus Rift. This also links back to the work going on with the physics engine as well, which has, as Mr. Rosedale explains in the blog post, an added level of complexity within High Fidelity due to the distributed nature of the platform, and the need to maintain consistency between players as to what is happening, where things are, who is controlling what, and so on.

For those wishing to keep abreast of the key points of what is going on with High Fidelity, but who do not necessarily have the time to jump into every blog post that comes out, these updates are a useful means of tracking core events within the platform.