Commerce in High Fidelity

It’s been a while since I last looked at High Fidelity; however, there have been a number of developments on Philip Rosedale’s VR platform over the last several months, specifically related to the last HF subject I blogged about: currency and IP protection.

In August 2017, Rosedale wrote two blog posts on the company’s currency and IP protection roadmap, setting out plans to use a blockchain-based crypto-currency – the High Fidelity Coin (HFC). Since then, they’ve issued a series of blog posts tracking their ideas and developments towards building a blockchain-centric currency / IP management capability.

For those unfamiliar with the concept, the attraction of blockchain systems is both their “openness” and their security. Simply put, a blockchain can be thought of as a completely decentralised database duplicated across the Internet, with the information held on it both immediately shared and reconciled across all instances of the database after any transaction, anywhere, at any time. It is almost entirely self-managed, with nodes on the network of databases acting as “administrators” of the entire system.

All of this makes a blockchain environment transparent and exceptionally difficult to hack; it has no single point of data which can be corrupted, nor is it reliant on a single point of management for its continued existence. Thus, blockchain networks are considered both highly robust and very secure.
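By way of illustration only – and not a description of High Fidelity’s actual implementation – here’s a minimal TypeScript sketch of the core idea: blocks of transactions chained together by hashes, so that quietly altering an old entry breaks every link after it. All of the names and structures here are my own assumptions for the sketch.

```typescript
import { createHash } from "crypto";

// A single entry in the ledger: a group of transactions plus a link
// (the hash) back to the block that came before it.
interface Block {
  index: number;
  timestamp: number;
  transactions: string[];   // e.g. "alice pays bob 10 HFC"
  previousHash: string;
  hash: string;
}

function hashBlock(index: number, timestamp: number, transactions: string[], previousHash: string): string {
  return createHash("sha256")
    .update(`${index}|${timestamp}|${transactions.join(";")}|${previousHash}`)
    .digest("hex");
}

function addBlock(chain: Block[], transactions: string[]): Block {
  const prev = chain[chain.length - 1];
  const index = prev ? prev.index + 1 : 0;
  const previousHash = prev ? prev.hash : "0".repeat(64);
  const timestamp = Date.now();
  const block: Block = {
    index,
    timestamp,
    transactions,
    previousHash,
    hash: hashBlock(index, timestamp, transactions, previousHash),
  };
  chain.push(block);
  return block;
}

// Because each block's hash covers the previous block's hash, quietly
// editing an old transaction invalidates every block that follows it.
function isChainValid(chain: Block[]): boolean {
  return chain.every((block, i) => {
    const recomputed = hashBlock(block.index, block.timestamp, block.transactions, block.previousHash);
    const linksToPrev = i === 0 || block.previousHash === chain[i - 1].hash;
    return recomputed === block.hash && linksToPrev;
  });
}
```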

Following these initial posts, High Fidelity issued two further blog posts charting their steps towards building a blockchain-based commerce system using the HFC as its crypto-currency, and capable of providing a means of authenticating valid digital goods (assets) and establishing a chain of ownership for them within High Fidelity domains. These were:

  • A First Look at High Fidelity Commerce in Action, published in October 2017, demonstrating how their proposed approach, as a decentralised, independent service, would integrate into the shopping experience for people within High Fidelity domains.

  • An examination of their approach to handling worn assets (clothing / accessories), published at the start of November 2017. This included how worn assets would be technically managed (including allowing in-world / in-store demonstration / trial versions), and how the blockchain mechanism would not only handle the purchase of goods in HFCs, but also provide certification of validly purchased goods which can be reviewed by any other user when examining the purchased item itself.

Then, in December 2017, the company launched a closed beta of Avatar Island, a shopping domain offering more than 300 items of avatar clothing and accessories from designers around the world for High Fidelity users to try in-world and, if they wish, purchase. It is the first environment within High Fidelity to start weaving all of the threads from those earlier blog posts together into a whole.

Avatar Island is impressive on a number of levels, including the real-time, interactive ability to try on different items, to resize accessories to fit, to share the shopping experience with a friend, and so on. The items offered for sale within it are the first digital goods (assets) certified by HF’s Digital Asset Registry (DAR), a decentralised, publicly auditable blockchain ledger.

The DAR serves a number of functions: it uniquely identifies every digital asset on the system; it enables such goods to be purchased with the High Fidelity Coin; and it serves as a record of transactions made by High Fidelity users. At its heart is the use of Proof of Provenance (PoP), which documents an asset’s chain of ownership, its characteristics, and its entire history, from certification onward. It’s a record which cannot be altered, deleted or denied, establishing a chain of ownership — through sale and resale — that sits entirely in the hands of the asset’s owner.

Furthermore, PoP can be used to authenticate digital assets in any High Fidelity domain – and even allow domain owners to regulate the objects allowed into their virtual spaces (e.g. only allowing items in keeping with the theme of a space, or only items from approved vendors). Thus, DAR / PoP is potentially a powerful way of managing asset ownership, identification, purchase and use across High Fidelity’s distributed environment.
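Purely to make the idea of a registry record concrete – the field names below are my own guesses for illustration, not High Fidelity’s published schema – a PoP-style asset record and a domain-side filter on it might be sketched like this:

```typescript
// Hypothetical shape of a Digital Asset Registry entry: the asset's
// identity, its certification, and its chain of ownership to date.
interface ProvenanceEvent {
  owner: string;        // who held the asset after this event
  event: "certified" | "sold" | "transferred";
  priceHFC?: number;
  timestamp: number;
}

interface AssetRecord {
  assetId: string;        // unique identifier issued at certification
  creator: string;
  certificateId: string;
  vendor: string;
  tags: string[];         // e.g. ["clothing", "sci-fi"]
  history: ProvenanceEvent[];
}

// The current owner is simply the holder named by the most recent event.
function currentOwner(record: AssetRecord): string {
  return record.history[record.history.length - 1].owner;
}

// A domain owner could gate incoming objects on the registry record,
// e.g. only admitting certified items from approved vendors, or items
// matching the domain's theme.
function admitToDomain(record: AssetRecord, approvedVendors: Set<string>, requiredTag?: string): boolean {
  const certified = record.certificateId.length > 0;
  const vendorOk = approvedVendors.has(record.vendor);
  const themeOk = requiredTag === undefined || record.tags.includes(requiredTag);
  return certified && vendorOk && themeOk;
}
```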

Since the start of February 2018, High Fidelity has offered a means for users to pay one another in HFCs. Credit: High Fidelity

At the start of February 2018, the company announced they were launching the ability for users to pay one another directly in HFCs (so tips can be given to performers, etc.). To kick-start this, early adopters of High Fidelity (i.e. those who had signed up and been involved in High Fidelity over the course of the last couple of years) have been awarded a range of HFC grants, made available through a server called the “BankofHighFidelity”.

Together, the HFC, DAR, PoP and the “BankofHighFidelity” provide a solid foundation for commerce within HF domains. Currently, there is no means to cash out HFCs for fiat money, but given that High Fidelity is well aware of the powerful attraction of being able to do so (and allowing for regulatory adherence), it’s hard to imagine this would not be a part of the company’s plans.

As it is, the company has stated it plans to operate “BankofHighFidelity” as an exchange where HFCs can be exchanged for other crypto-currencies (I assume the likes of Ethereum and Gloebit) – a quite ambitious move in itself.

High Fidelity, approaching the second anniversary of its open beta, has laid down an impressive commerce roadmap for their environment, with some strong technical capabilities for “in-world” transactions and shopping. It’s not entirely clear how this approach might work with a supply chain approach to commerce, something Linden Lab is attempting to build into Sansar, or even if High Fidelity is thinking along those lines.

Given these developments within High Fidelity and the fact that Linden Lab are hoping to have their own approach to commerce in Sansar more firmly established later in 2018 – and allowing for the key differences between the two environments – it’ll be interesting to compare and contrast how each tackles commerce, digital rights, asset provenance, etc., down the road.


High Fidelity reveal currency and IP protection roadmaps

In a pair of blog posts, Philip Rosedale of High Fidelity revealed the company’s plans to use blockchain technology as both a virtual worlds currency and for content protection.

The blockchain is described as “an incorruptible digital ledger of economic transactions that can be programmed to record not just financial transactions but virtually everything of value” (Don Tapscott, Blockchain Revolution: How the Technology Behind Bitcoin Is Changing Money, Business, and the World). It allows transactions to be simultaneously anonymous and secure by maintaining a tamper-proof public ledger of value. While it is most recognised for its role in driving Bitcoin, the technology is seen by more than 40 of the world’s top financial institutions as a potential means to provide speedier and more secure currency transactions. However, the technology has far wider potential applications.

To understand the basics of the blockchain, think of a database duplicated across the Internet, allowing any part of it to be updated by anyone at any time, with the updates being immediately available across all the duplicates of the database. Information held on a blockchain exists as a shared — and continually reconciled — database existing across multiple nodes. The decentralised blockchain network automatically checks with itself every ten minutes, reconciling every transaction, with each group of transactions checked referred to as a “block”. Within the network, nodes all operate as “administrators” of the entire network, and are encouraged to join it through what is (mistakenly) referred to as “mining” – competing to “win” currency exchanges, sometimes for financial reward to the nodes’ operators (High Fidelity indicate that node operators will not gain directly from “mining” activities, but will instead be paid in HFCs for the computing resources used by the network).

Centralised, distributed and de-centralised networks – blockchains utilise decentralised networks

The key point in all of this is that the blockchain is openly transparent – the data is embedded in the network as a whole, not in any single point, and is by definition “public”. The lack of any centralisation also means it cannot be easily hacked – doing so would require huge amounts of computing power; nor is there a single point of data which can be corrupted, or a single point of management on which the system relies for its continued existence – as High Fidelity point out, this means that the service can continue even if High Fidelity does not. Thus, blockchain networks are considered both highly robust and very secure.
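To put the “no single point which can be corrupted” argument in more concrete terms, here’s a deliberately over-simplified sketch: every node holds its own complete copy of the ledger, and a copy which disagrees with the majority is simply out-voted. Real blockchain networks use rather more involved consensus and proof-of-work mechanisms, so treat this purely as an illustration of the principle.

```typescript
// Each node holds its own complete copy of the ledger. A tampered copy
// on one node simply disagrees with the rest and is discarded in favour
// of the version the majority of nodes hold.
type LedgerCopy = string[];   // simplified: an ordered list of block hashes

function consensusLedger(nodeCopies: LedgerCopy[]): LedgerCopy {
  const votes = new Map<string, { copy: LedgerCopy; count: number }>();
  for (const copy of nodeCopies) {
    const key = copy.join("|");
    const entry = votes.get(key) ?? { copy, count: 0 };
    entry.count += 1;
    votes.set(key, entry);
  }
  // Adopt the copy held by the largest number of nodes.
  return [...votes.values()].reduce((a, b) => (b.count > a.count ? b : a)).copy;
}

// Example: one node's copy has been altered, but nine others agree,
// so the altered copy is simply out-voted.
const honest = ["h0", "h1", "h2"];
const tampered = ["h0", "hX", "h2"];
const copies = [...Array(9).fill(honest), tampered];
console.log(consensusLedger(copies));   // -> ["h0", "h1", "h2"]
```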

An estimated 700 Bitcoin-like crypto-currencies are already in operation, although the potential use of blockchains goes far, far beyond this (identity management, data management, record-keeping, stock broking, and so on).

High Fidelity plans, over the coming months, to deploy their own blockchain network which will support a new crypto-currency, the HFC (presumably “High Fidelity Currency”), and which will ultimately operate independently of High Fidelity’s control. In addition, the system will provide a mechanism to protect intellectual property by embedding object certification affirming item ownership into the blockchain, meaning the work of creators of original digital content can be certified and protected. As High Fidelity explain:

Digital certificates issued by the High Fidelity marketplace (and likely other marketplaces choosing to use HFC) will serve a similar function as patents or trademarks — creators will register their works to get the initial certificates, and these certificates will be given out only for work that is not infringing on other or earlier works…. Once granted, these durable certificates cannot be revoked and can then be attached to purchases on the blockchain to prove the origin of goods. The absence of an accompanying digital certificate and blockchain entry will make digital forgery more obvious and impactful than in the real world — for example, server operators may choose not to host content without certificates and end-users may choose not to ‘see’ content according to it’s certificate status.

This approach could provide an extremely durable and trusted means of sharing digital content, one more robust than other approaches to digital rights management, for the same reasons that a blockchain offers security, transparency and robustness when operating a crypto-currency.
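To illustrate the kind of server-side filtering the quote above describes – content lacking a valid certificate simply not being hosted or shown – a very rough sketch could look like the following. The certificate fields and function names are mine, purely for illustration; they are not High Fidelity’s actual schema.

```typescript
import { createHash } from "crypto";

// Hypothetical certificate record: the hash of the original work bound
// to its creator at the time of certification.
interface ContentCertificate {
  certificateId: string;
  creator: string;
  contentHash: string;   // fingerprint of the certified work
  issuedAt: number;
}

function fingerprint(content: Buffer): string {
  return createHash("sha256").update(content).digest("hex");
}

// A server (or a user's client) deciding whether to show an item:
// no certificate, or a certificate whose hash doesn't match the item,
// and the content can simply be declined.
function shouldHost(content: Buffer, certificate?: ContentCertificate): boolean {
  if (!certificate) return false;                            // uncertified: don't host
  return certificate.contentHash === fingerprint(content);   // altered/forged: don't host
}
```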

That the HFC blockchain is designed to operate independently of High Fidelity means that it can become self-sustaining, providing a currency environment that can be traded with other crypto-currencies and which can be exchanged for fiat currency through multiple exchanges.

The two blog posts – Roadmap: Currency and Content Protection and Roadmap: Protecting Intellectual Property in Virtual Worlds – are very much companion pieces to be read in the order given. The first provides an overview of the HFC blockchain system and currency management, including how High Fidelity hope to establish a stable exchange rate mechanism without running into the issues of speculative dabbling in the system, inflated ICOs, etc., and the use of digital wallets and personal security. It also outlines the certification mechanism for content protection, which the second article takes a deeper dive into, explaining how the relative strengths of a blockchain approach, as very quickly sketched out above, could be used in protecting creators’ IP and controlling how their products / creations are used.

The decentralised approach to currency and digital rights management is something that has been pointed to numerous times during High Fidelity’s development, but this is the first time the plans have been more fully fleshed out and defined in writing. It’s an ambitious approach, one likely to stir debate and discussion – particularly given the current situation regarding Decentraland / Ethereum and the risk of speculation around ICOs (again, something High Fidelity hope to avoid).

It’s also one which again points to High Fidelity’s founders looking far more towards an “open metaverse” approach to virtual environments and goods than others might be considering.

High Fidelity moves to “open beta”

Tuesday, April 27th saw High Fidelity move to an “open beta” phase, with a simple Twitter announcement. Having spent just over a year in “open alpha” (see my update here), the company and the platform have been making steady progress over the course of the last 12 months, offering increasing levels of sophistication and capabilities – some of which might actually surprise those who have not actually set foot inside High Fidelity but are nevertheless willing to offer comments concerning it.

I’ve actually neglected updating on HiFi for a while, my last report having been at the start of March. However, even since then, things have moved on at quite a pace. The website has been completely overhauled and given a modern “tile” look (something which actually seems a little derivative rather than leading edge – even Buckingham Palace has adopted the approach).

High Fidelity open beta requirements

The company has also hired Caitlyn Meeks, former Editor in Chief of the Unity Asset Store, as their Director of Content, and she has been keeping people apprised of progress on the platform at a huge pace, with numerous blog posts, including technical overviews of new capabilities as well as coverage of the more social aspects of the platform – including dispelling the myth that High Fidelity is rooted in “cartoony” avatars, when it in fact has a fairly free-form approach to avatars and to content.

High Fidelity may not be as sophisticated in terms of overall looks and content – or user numbers – as something like Second Life or OpenSim, but it is grabbing a lot of media attention (and investment) thanks to its very strong focus on the emerging range of VR hardware systems, and the beta announcement is timed to coincide with the anticipated increasing availability of the first generation HMDs from Oculus VR and HTC. Indeed, while the platform can be used without HMDs and their associated controllers, High Fidelity describe it as being “better with” such hardware.

High Fidelity avatars

I still tend to be of the opinion that, over time, VR won’t be as disruptive in our lives as the likes of Mixed / Augmented Reality as these gradually mature; as such I remain sceptical that platforms such as High Fidelity and Project Sansar will become as mainstream as their creators believe, rather than simply vying for as much space as they can claim in similar (if larger) niches to that occupied by Second Life.

And even if VR does grow in a manner similar to that predicted by analysts, it still doesn’t necessarily mean that everyone will be leaping into immersive VR environments to conduct major aspects of their social interactions. As such, it will be interesting to see what kind of traction High Fidelity gains over the course of the next 12 months, now that it might be considered to be moving more towards maturity – allowing for things like media coverage, etc., of course.

Which is not to say the capabilities aren’t getting increasingly impressive, as the video below shows – just look at the way Caitlyn’s avatar interacts with the “camera” of our viewpoint!

 

 


High Fidelity: “VR commerce”, 200 avatars and scanning faces

High Fidelity have put out a couple of blog posts on their more recent work, both of which make for interesting reading.

In Update from our Interns, they provide a report by Edgar Pironti and Alessandro Signa, two of High Fidelity’s interns who have been working on a number of projects within the company, including developing a single software controller for mapping inputs from a range of hand controllers, with the initial work involving the Razer Hydra, HTC Vive’s controllers and the Space Navigator. They also discuss working on recording and playback functionality, which is also expanded upon in the second blog post which caught my eye, the January newsletter, issued by Chris Collins.

This work has involved developing the ability to pre-record the data stream of an avatar – audio, facial expressions and body movement – in a format which can later be played back on a server under the control of JavaScript. As Chris notes, this makes it very easy to quickly populate a VR experience with compelling pre-recorded avatar content, allowing life-like characters to be added to a place, or to be used within a machinima film.
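As a purely hypothetical sketch of what a recorded avatar data stream might look like, and how playback could step through it, consider the following. The frame format and function names are my own invention for illustration; they are not High Fidelity’s actual recording format or scripting API.

```typescript
// A hypothetical, simplified frame format for a recorded avatar stream:
// per-frame joint rotations and facial blendshape weights, plus a
// reference to an audio clip.
interface AvatarFrame {
  timeMs: number;
  jointRotations: number[][];           // [joint][x, y, z, w] quaternions
  blendshapes: Record<string, number>;  // e.g. { JawOpen: 0.4 }
}

interface AvatarRecording {
  audioUrl: string;
  frames: AvatarFrame[];
}

// Playback: step through the recorded frames in time order, handing each
// one to whatever applies it to the in-world avatar (applyFrame here is
// a stand-in for the platform's own scripting hooks).
async function playRecording(rec: AvatarRecording, applyFrame: (f: AvatarFrame) => void): Promise<void> {
  const start = Date.now();
  for (const frame of rec.frames) {
    const wait = frame.timeMs - (Date.now() - start);
    if (wait > 0) await new Promise((resolve) => setTimeout(resolve, wait));
    applyFrame(frame);
  }
}
```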

As their third reported project, Edgar and Alessandro discuss how they’ve been looking into creating a “VR commerce” environment. This combines elements of physical world shopping – sharing it with friends, actually grabbing and trying items, having discussions with sales staff, etc. – with the convenience of e-shopping, such as quickly changing colours, seeing customer reviews and feedback, and so on. As well as explaining how they went about the task, Edgar and Alessandro have put together a video demonstration:

In the High Fidelity newsletter, Chris Collins covers a number of topics, including work on optimising avatar concurrency on a single High Fidelity server. While this work most likely used Edgar’s and Alessandro’s approach to pre-recording avatar data streams mentioned above, the initial results are impressive: 200 avatars on a server, of which the 40 nearest the observer’s viewpoint are rendered at 75Hz when using the Oculus Rift, and with a very high level of detail, with full facial animations.
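A rough sketch of the kind of distance-based selection those numbers imply – render the N avatars nearest the observer at full detail and handle the rest more cheaply – might look like this (again, illustrative only, not High Fidelity’s renderer code):

```typescript
// Of all avatars on the server, fully render only the N nearest the
// observer, and fall back to a cheaper representation for the rest.
interface Avatar {
  id: string;
  position: [number, number, number];
}

function nearestAvatars(all: Avatar[], observer: [number, number, number], n: number): Avatar[] {
  const dist2 = (p: [number, number, number]) =>
    (p[0] - observer[0]) ** 2 + (p[1] - observer[1]) ** 2 + (p[2] - observer[2]) ** 2;
  return [...all].sort((a, b) => dist2(a.position) - dist2(b.position)).slice(0, n);
}

// e.g. nearestAvatars(allAvatars, myPosition, 40) -> the 40 avatars to
// render at full detail (75Hz, full facial animation); the remainder
// can be drawn more cheaply or skipped.
```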

200 avatars on a High Fidelity server as the company starts work on optimising avatar concurrency (image: High Fidelity)

One of the things High Fidelity has often been critiqued for by SL users is the cartoon-like avatars which were first shown as the company gradually cracked open its doors. These are still in use, but a lot of work has also been put into making the avatars more life-like, should users so wish. However, there is a trade-off here, which has been discussed in the past: the so-called uncanny valley effect – when facial features look and move almost, but not exactly, like those of natural beings, they can provoke a response of revulsion among some observers.

This has tended to make those investigating things like avatar usage cautious about pushing too close to any kind of genuine realism in their avatars, and Philip Rosedale has discussed High Fidelity’s own gentle pushing at the edge of the valley. Now it seems the company is stepping towards the edge of the valley once more, using 3D scans of people to create avatar faces, as Chris notes in the newsletter:

We scanned our whole team using two different 3D scanners (one for hair and one for the face), and then used some simple techniques to make the results not be (too) uncanny … Although there is still a lot of work to do, we achieved these results with less than 2 hours of work beyond the initial scans, and the effect of having our team meetings using our ‘real’ avatars is quite compelling.

High Fidelity’s staff group photo using scans of their own hair and faces (image: High Fidelity)

Not everyone is liable to want to use an avatar bearing a representation of their own face, and the idea of using such a technique does raise issues around identity, privacy, etc., which should be discussed, but High Fidelity’s work in this area is intriguing. That said, and looking at the staff group photo, I would perhaps agree there is still more work needed; I’m not so much concerned about the technique pushing towards the edge of the uncanny valley as I am about the avatars in the photo looking just that little bit odd.

Also in the update is a discussion on audio reverb, a look at how the use of particles can be transformed when you’re able to see your “hands” manipulating them, and a look at a purpose-built game in High Fidelity. These are all covered in the video accompanying the January newsletter, which I’ll close with.

High Fidelity: Stephen Wolfram and more on tracking

On Tuesday, December 29th, High Fidelity announced that Stephen Wolfram has become their latest advisor.

British-born Stephen Wolfram is best known for his work in theoretical physics, mathematics and computer science. He began research in applied quantum field theory and particle physics, publishing scientific papers when just 15 years old. By the age of 23, he was studying at the School of Natural Sciences of the Institute for Advanced Study in Princeton, New Jersey, USA, where he conducted research into cellular automata using computer simulations.

Stephen Wolfram via Quantified Self

When introducing Wolfram through the High Fidelity blog, Philip Rosedale notes this work had a profound impact on him, as did – later in life – Wolfram’s 2002 book, A New Kind of Science.

More recently, Wolfram has been responsible for WolframAlpha, an answer engine launched in 2009, which is one of the systems used by both Microsoft’s Bing “decision engine” and Apple’s Siri. In 2014, he launched the Wolfram Language as a new general multi-paradigm programming language.

In becoming an advisor to High Fidelity, Dr. Wolfram joins Peter Diamandis, the entrepreneur perhaps best known for the X-Prize Foundation; Dr. Adam Gazzaley, founder of the Gazzaley cognitive neuroscience research lab at the University of California; Tony Parisi, the co-creator of the VRML and X3D ISO standards for networked 3D graphics, and a 3D technology innovator; Professor Ken Perlin of the NYU Media Research Lab; and Professor Jeremy Bailenson, the director of Stanford University’s Virtual Human Interaction Lab.

In their October update, published at the start of November, High Fidelity followed up on the work they’ve been putting into various elements of tracking movement, including the use of BinaryVR for tracking facial movement and expressions. In particular, the company’s software allows users to create their own virtual 3D face from 2D facial photos, allowing them to track their facial animations in real-time and transfer them onto the CG representation of their face.

Ryan balances one object atop another while Chris uses the gun he picked-up with a hand movement, then cocked it, to try to shoot the yellow object down

Integration of the BinaryVR software allows High Fidelity to track users’ mouth movements through their HMDs, allowing their avatars to mimic these movements in-world, as Chris Collins demonstrates in the update video. The company has also been extending the work in full body tracking, as seen in my October coverage of their work, and this is also demonstrated in the video alongside more in-world object manipulation by avatars, with Chris and Ryan building a tower of blocks and Chris then picking up a gun and shooting it down.

The hand manipulation isn’t precise at this point in time, as can be seen in the video, but this isn’t the point; it’s the fact that in-world objects can be so freely manipulated that is impressive. That said, it would be interesting to see how this translates to building: how do you accurately size a basic voxel (a sort-of primitive for those of us in SL) shape to a precise length and width, for example, without recourse to the keyboard or potentially complicated overlays?

Maybe the answer to this last point is “stay tuned!”.

High Fidelity: September update and things to come

The September newsletter from High Fidelity appeared at the end of that month, with Chris Collins highlighting some of the work that has been going on of late, providing an update on particle effects, procedural textures and – most interestingly – avatar kinematics and in-world object manipulation using an avatar’s hands and suitable controllers.

Procedural textures allow complex, algorithm-based textures to be created using tools such as ShaderToy and used directly within High Fidelity. Brad Davis has created a video tutorial on procedural entities which Chris references in the newsletter; the write-up also follows a short video released on the High Fidelity YouTube channel which briefly demonstrates procedural textures in HiFi.
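For anyone unfamiliar with the idea, “algorithm-based” simply means each pixel’s colour is computed rather than read from an image file. High Fidelity’s procedural textures are fragment shaders authored in tools like ShaderToy; the toy function below just illustrates the concept, written here in TypeScript rather than GLSL:

```typescript
// A toy illustration of a procedural texture: each pixel's colour is
// computed from its texture coordinates (and optionally time, for
// animated textures), rather than looked up in an image.
function proceduralPixel(u: number, v: number, timeSec: number): [number, number, number] {
  // Animated concentric rings centred on (0.5, 0.5).
  const dx = u - 0.5;
  const dy = v - 0.5;
  const rings = Math.sin(Math.sqrt(dx * dx + dy * dy) * 40.0 - timeSec * 2.0) * 0.5 + 0.5;
  return [rings, rings * 0.6, 1.0 - rings];   // RGB in the range 0..1
}
```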

However, it is the object manipulation that’s likely to get the most attention, together with avatar kinematics and attempts to imply a force when moving an object.

In terms of avatar kinematics, Chris notes:

In 2016, when the consumer versions of the HMD’s are released, you are also going to be using a hand controller. It is therefore important that we can make your avatar body simulate correct movement with the hand data that we receive back from the controllers.

The results are shown in the newsletter in the form of some animated GIFs. In the first, Chris’ avatar is shown responding to a Hydra controller for hand movements and echoing his jaw movements. The second demonstrates object manipulation, with Chris’ avatar using its hand to pick up a block from an in-world game, echoing Chris’ motions using a hand-held controller.

Manipulating in-world objects in High Fidelity via an avatar’s hands and a set of controllers (image: High Fidelity)

The animation in picking up the block may not be entirely accurate at this point in time – the block seems to travel through the avatar’s thumb as the wrist is rotated – but that isn’t what matters. The level of manipulation is impressive, and it’ll be interesting to see if this might be matched with things like feedback through a haptic-style device, so that users can really get a sense of manipulating objects.

The object manipulation element, together with attempts to imply a force when moving objects in-world, makes up a core part of the video accompanying the newsletter (embedded below). Again, this really is worth watching, as the results are both impressive and illustrate some of the problems High Fidelity are trying to solve in order to give virtual spaces greater fidelity.

Coupling object manipulation with implied force opens up a range of opportunities for things like in-world games, physical activities, puzzles, and so on. There’s also potential for learning and teaching, so it’ll be interesting to see how this aspect of the work develops.

The newsletter also promises that we’ll be seeing some further VR demo videos from High Fidelity in October, so keep an eye out for those as well.