Reminder: Lab Gab with Brad Oberwager & Philip Rosedale Jan 31st 2022

via Linden Lab

On Monday, January 31st, there will be a special pre-recorded edition of Lab Gab featuring Linden Lab’s Executive Chairman Brad Oberwager (Oberwolf Linden) and the Lab’s co-founder, Philip Rosedale.

As noted in an official press release and within this blog (and others), High Fidelity Incorporated, the company co-founded by Rosedale in 2013 following his departure from Linden Lab, has invested in Linden Lab, bringing with it an influx of money, patents, and new and returning skills.

Following the press release, the Lab also issued an official Second Life blog post on the matter, in which they invited Second Life users to submit questions that might be asked of Brad and Philip as a part of the session, in which they will also likely discuss the future of Linden Lab and Second Life. They may also talk about the other recent news that the Lab’s subsidiary, Tilia, has partnered with Unity to provide their solutions to Unity developers who wish to include virtual economy elements in their product offerings (see: Tilia Partners with Unity to Power Virtual Economies for Game and Metaverse Developers and Linden Lab announces Tilia partners with Unity “to power virtual economies”).

I hope to have a summary of the session available some time after it has streamed, but in the meantime, the salient details are summarised below.

Viewing Details

  • Time and Date: 09:00 SLT (17:00 UK / 18:00 CET) on Monday, January 31st, 2022
  • Watch on YouTube via this link (when the programme starts) or click the embedded viewer below.


Philip Rosedale talks Second Life and the metaverse

On Tuesday, January 25th, Philip Rosedale held a Twitter Spaces event (also relayed in-world in Second Life and to other virtual spaces) to discuss “the metaverse” and Second Life, and to answer questions from SL users / interested parties. He was joined from time to time by various guests – notably Avi Bar-Zeev¹ – who added their own thoughts.

The conversation was wide-ranging, extending over some 100 minutes. What follows here is an attempt at a summary of the key areas of discussion in terms of comments from both Philip Rosedale and Avi Bar-Zeev. Given the natural flow of the event, with subjects being raised and returned to rather than being discussed sequentially, I have attempted to summarise the comments into bullet points under topic headings. Where I have felt it worthwhile, I have included audio extracts of the actual comments made as well.

General Notes for This Summary

  • The full audio for the event is currently available via Twitter Spaces, where it will remain available for 30 days from the date of recording – Tuesday, January 25th, 2022.
  • For the most part, the bullet points refer to comments made by Philip Rosedale. Those by Avi Bar-Zeev are intentionally under a sub-heading for easier identification.
  • Where provided, audio extracts below have been edited to remove pauses, repetition, non-relevant asides, etc., in an attempt to assist with understanding the flow of comments. Where this has been done, I have taken care to try to ensure none of the original context / meaning of the comments has been lost or in any way altered.

On His Role at Linden Lab

  • Reiterated that he is not back at the Lab in any form of a managerial role or full-time “at this time” [which I found a potentially interesting qualifier, if intended that way].
  • He is “delightedly” providing advice and attending meetings with Lab staff.

On Second Life’s History

  • Recalled SL’s modest beginnings as LindenWorld, and interactions with the first residents – such as Stellar Sunshine.

  • Noted that a challenge Second Life has had throughout its history is that in allowing user-generated content (UGC) that interacts with the native controls / capabilities (such as the physics engine), it becomes increasingly hard to make substantive changes to the behaviours of those capabilities, lest they result in content becoming “broken”.
  • Also noted that SL was very much ahead of its time with things like its particle and water systems (the latter of which allowed for splashes, etc., when objects crossed the water plane), some of which had to be removed because they were simply too computationally complex for home computers to process in a timely manner, or for the available network bandwidth and server communications to transmit to other users in a timely fashion – with some of these problems still existing to this day.
  • Indicated that these issues are not confined just to SL: they are lessons that need to be understood in building any user-driven virtual spaces.

On Moderating Virtual Spaces

  • Sees moderation of virtual spaces / virtual worlds as something that still needs to be fully addressed.
  • Believes the approaches to moderation taken by social media platforms and across the Internet as a whole are insufficient for immersive spaces utilising avatars – simply put, a single standard of rules applied from above by a single company will not work.
  • In particular, sees a top-down approach to moderation as troublesome for a number of reasons, including:
    • Those utilising Meta’s suggested approach of recording interactions – so that, in the event of a dispute / reported abuse, the last 10-15 minutes can be attached to an abuse report – could also use the gathered data to help drive any advert / content-based revenue generation model they might use.
    • Top-down approaches risk utilising a “one size fits all” approach to disputes in order to minimise the costs involved in managing moderation activities, thus removing the opportunity for subtlety of approach or for taking into consideration the uniqueness of any given situation / group, potentially alienating groups or activities.
  • Instead, believes that there should be a more fluid approach to moderation more in keeping with the physical world, and adjusted by circumstance / situation, and that:
    • Companies need to look at how spaces within their platforms are used and what is deemed acceptable behaviour by the people operating / using them.
    • Enable the communities / groups using spaces to be able to self-moderate through the provision of the means for them to do so (e.g. provide their own guidelines backed by the ability for them to remove troublemakers).
    • Recognise the fact that the majority of people will adjust their behaviour to suit the environment they are within, and self-moderate according to the expectations of that environment.
  • Toward the end of the session, notes that there is a risk associated with some aspects of decentralisation of moderation / control. Within Second Life, for example, decentralisation of land ownership brought with it issues of anti-social behaviour and griefing – ad farms, intentionally being abusive towards neighbours through the use of large billboards, sounds, etc., whilst making the land too expensive for it to be reasonably purchased.

From Avi Bar-Zeev

  • Also notes that there is an inherent danger in how a company could use the recording / surveillance approach to moderation to profile users and to assist their ad / targeting revenue model.

  • However, he thinks the larger issue is that, given that reviewing the recordings associated with abuse reports – which may be coming in by the thousand in a large-scale system – is going to be human-intensive, the use of AI systems to manage this process and minimise costs is likely inevitable. But:
    • How do we know the AI isn’t, by its very nature, pre-disposed to “find bad behaviour”, and to do so without consideration of a wider context (per Philip Rosedale’s warning)?
    • How can we be sure AI programming is sufficient for a system to correctly recognise some behaviour types as being abusive?
    • Is dealing with incidents in retrospect, and with limited supporting data (e.g. just 10 minutes of audio), actually the best method of handling incidents?
  • As such, also believes it is better to design systems wherein people are innately aware that they are dealing with other people across the screen from them, and so self-moderate their behaviour (as most of us naturally do most of the time when engaging with others), and that there are ramifications if we then choose to be directly abusive towards others. In short, virtual spaces should “re-humanise” our interactions with others.

On Preferred Business Models for Virtual Spaces

  • The common practice for social platforms – YouTube, Facebook, etc. – is the behavioural surveillance model noted above: collecting data on user interests and activities, etc., and using that to push content / adverts / etc., to users whilst also gathering an overall profile on them.
  • Sees the development of AI / intuitive algorithms in this space as particularly dangerous, as they grow increasingly capable of recording moods / states of mind / health conditions (particularly where facial / body tracking is utilised).
  • Much prefers the model offered by the likes of Second Life, where the emphasis is not on advertising revenues, content delivery for brands, etc., but rather entirely fee-based.
  • Notes that as it is, through its model Second Life generates more revenue dollars per user per year than YouTube, and probably more than Facebook. As such, and with roughly one million active users, SL has proven the fee-based revenue model works and is fully scalable.

From Avi Bar-Zeev

  • Notes there has been criticism of some platforms that deal in virtual “land” that is really just vapourware.
  • Wanted to underscore the point that SL and platforms like it do not fall into this category because, while the land is virtual, it is nevertheless underpinned by actual servers, infrastructure and support services that incur costs which are being met by the fees charged.

On Accessibility for Virtual Spaces

  • Points out that when people in Second Life talk about “accessibility”, it is invariably from the perspective of learning to do things within the platform – getting to grips with the viewer, walking, talking, building, etc., and the “steep learning curve”. However, would argue that the issue starts much earlier than that.
  • The real issue with accessibility is not what to do / how to do it, but in getting people comfortable with the idea of using avatars and virtual spaces.
  • Has personal experience of this through building both Second Life and High Fidelity², and notes that, by and large, anyone asked to sit down and try a virtual world / space for social interaction will likely express interest in the experience, but discomfort at the idea of making it a part of their daily interactions in the manner promoted by the likes of Meta, etc.
  • Ergo, the first step in accessibility is moving things to a point where people are comfortable with the idea of using avatars and a virtual presence. Only when this has been addressed, and people are comfortable with the idea, can the wider issues of moderation, world-building, economics, etc., be tackled.
  • Believes the way to do this is to make avatars more visually expressive – which is itself a tough proposition [see, for one thing, the issue of the Uncanny Valley] – and towards the end of the video expresses how this could be done by using webcams on laptops and mobile devices to capture facial expressions and have the back-end software then translate these onto avatar faces [an approach LL have indicated they plan to develop in 2022].

  • Does see spatial audio of the kind High Fidelity has been developing as a factor in enabling greater depth of interaction, particularly within groups of people, but really sees the ability to mimic facial expressions, gestures, etc., to provide that underpinning level of non-verbal communications as a core part of making avatar-based interactions more acceptable to a larger audience.
  • In terms of avatars, expressiveness, etc., does point out that avatars should not necessarily be equated to “digital twins” – your avatar does not have to be a digital representation of your physical self, as is the case in Second Life, and his opinion is that this should preferably remain true in future virtual worlds / spaces.
  • However [and assuming adoption of virtual spaces into the work medium], sees possible issues over “class distinction” between those able to interact “in real life” in person or through mediums like Zoom, etc., and those interacting through the purely digital, which may have to be addressed.

On the Linden Dollar and Crypto-Currencies

  • Offers a background on the Linden Dollar and why it uniquely works as a virtual currency, presenting something of a mix between crypto and regulated fiat money.
  • Highlights some of the issues with current crypto and why it is presently not a good medium for virtual economies.

On Mobile – Second Life and in General

  • Second Life (and Facebook) arrived before the first of the large-screen, graphics-capable mobile devices arrived on the market in the form of the iPhone. As such, SL was solely geared towards desktop systems, as there was no reason to even consider the idea of compact, powerful mobile devices.
  • Admires the way Minecraft has made in-world building so intuitive on mobile, and believes that is something virtual worlds need to achieve.
  • Personally believes it is essential for virtual worlds to offer convenient access from multiple devices, noting that perhaps the biggest world-wide platform in this regard is probably Android.
  • Thus the question is one of what features can be included with a mobile solution, and which features should be included when compared to the more immersive “hands-on” capabilities.
  • Allowing for his status as an advisor, he can say that Linden Lab is actively working on mobile. [I try to provide updates on this when there is news, using the SL Mobile tag in this blog.]
  • Suggests that LL could possibly engage in some form of “smaller” acquisition³ or building on an open-source tool.

General Comments

  • In discussing the 20th anniversary of the rezzing of the first prim in LindenWorld – see: Happy 20th rezday to Second Life’s humble Prim! – noted that a good part of the magic of early virtual worlds was that of in-world, real-time building, including doing so collaboratively, which helped build a sense of social engagement and sharing that more recent platforms (or SL through mesh) have either never had or have perhaps lost.
  • In talking about the primitive system, drew a comparison with the current hype around NFTs, noting that (with the introduction of the permissions system) every prim in SL is unique in terms of its creator, date and time of creation, UUID and what subsequent owners might do with it (modify it, copy it, pass it on / sell it to someone else), all of which are indelibly recorded in its metadata.
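To illustrate the comparison, the kind of per-prim provenance record described above can be sketched as a simple data structure. This is purely illustrative – the field names, permission flags and sample names below are my own invention, not Linden Lab’s actual schema:

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class PrimRecord:
    """Illustrative sketch of per-prim provenance metadata (not LL's schema)."""
    creator: str
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    prim_id: uuid.UUID = field(default_factory=uuid.uuid4)
    owner: str = ""
    can_modify: bool = True
    can_copy: bool = True
    can_transfer: bool = True

    def transfer_to(self, new_owner: str) -> "PrimRecord":
        # Ownership can change hands, but creator, creation time and UUID
        # remain fixed -- the "indelible" part of the record.
        if not self.can_transfer:
            raise PermissionError("no-transfer prim")
        self.owner = new_owner
        return self


# Hypothetical names, for illustration only.
prim = PrimRecord(creator="ExampleCreator", owner="ExampleCreator")
prim.transfer_to("ExampleBuyer")
assert prim.creator == "ExampleCreator"  # creator survives the transfer
```

The point of the sketch is simply that owner and permissions may change over a prim’s life, while creator, creation time and UUID stay fixed – the properties the comparison with NFT metadata rests on.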
  • Noted that if “the metaverse” is to be as influential on life and work, etc., as the World Wide Web, then it not only needs people, but content. In particular, notes that at its peak growth the WWW was adding 300,000 new pages of content a day (2012). Clearly, in terms of virtual spaces, an exponential growth rate is liable to prove too much for a single corporate entity to manage.
  • Re-iterates the view that in terms of VR headsets, it is not the weight, the nausea or (in the case of Second Life) potential issues around frame rates, etc., that are key to increasing general adoption by consumers. Rather, it is in making the use of such headsets more inherently “safe” and less anti-social in terms of using them in physical rooms / locations where others are present.

Footnotes

  1. Avi Bar-Zeev has been a pioneer, architect and advisor in Spatial Computing (AR/VR/MR/XR) for nearly 30 years, behind the scenes in the world’s largest tech companies and at large. In early 2010, he helped found and invent the HoloLens at Microsoft, developing the first prototypes, demos, patents, plans and UX concepts, sufficient to convince his leadership. At Bing, he built first prototypes for developer-facing aspects of AR, sometimes called the “AR cloud.” At Amazon, he helped create PrimeAir as well as Echo Frames. He most recently helped Apple advance its own undisclosed projects. In 1999-2001, he co-founded Keyhole, the company behind Google Earth, and helped define Second Life’s core technology (and created the code that gives us prims). Back in the 1990s, he worked on novel VR experiences for Disney, including “Aladdin’s Magic Carpet” VR Ride, the “Virtual Jungle Cruise” and “Cyberspace Mountain.”
  2. For those who may not be familiar with it, High Fidelity Inc. was originally set up to create a VR headset-centric, decentralised virtual spaces / virtual world platform. However, the company pivoted away from this in 2019/2020 with the realisation that consumer VR systems are as yet not a comfortable proposition for the majority of people.
  3. This should probably not be conflated with any idea of buying Lumiya (which has been a constantly-stated view by some users). So far as I’m aware, there is no line of contact between Linden Lab and Lumiya’s developer.


Happy 20th rezday to Second Life’s humble Prim!

20 years of the prim by SarahKB7 Koskinen

It All Starts with a Cube

Those six words used to be one of the tag-lines associated with Second Life. Six words that – long before mesh or even sculpties entered our consciousness – summed up the unique magic of Second Life: the ability to create almost anything you might imagine, just by taking simple geometric shapes and playing with them – shaping, sizing, bringing them together, etc., – to produce something either individually or collectively, right there within a virtual space.

Of course, things like scripts and tools were required to get things to do things or to make the shapes that were needed, but at its heart, SL’s creativity lay within the humble primitive shapes offered to users through the viewer.

I mention this because January 25th, 2022 is officially the 20th anniversary of the first prim ever being rezzed within Second Life (or rather, its precursor: LindenWorld) – something marked by SarahKB7 Koskinen, who has produced a celebratory sculpture (seen at the top of this piece) which can – for the 25th of January, 2022, at least – be seen at the Ivory Tower of Primitives sandbox.

Touching the sculpture will present you with a notecard about the prim cube it contains, explaining that whilst it is a reproduction, like the very first primitive rezzed in 2002 it has no listed creator. Why? Because the rezzing of the first primitive predated the database that would be used to record information such as object creator names!

Avi Bar-Zeev

But exactly how did SL’s primitive originate?

Well, their creator is one Avi Bar-Zeev.

For those unfamiliar with the name, Avi has been a pioneer, architect and advisor in Spatial Computing (AR/VR/MR/XR) for nearly 30 years. He’s worked for some of the biggest corporations, including Amazon, Apple and Microsoft (where he pioneered the HoloLens), whilst in the 1990s he worked for the Disney Corporation on what he refers to as “novel VR experiences”, including Aladdin’s Magic Carpet, the Virtual Jungle Cruise and Cyberspace Mountain.

Speaking on January 25th, Avi described the arrival of primitives thus:

About 10 years into that [his early work in the eXtended Reality space] I met Philip and we worked together on some things in Second Life. And early on, [Philip] had said, “let’s figure out this prim thing; let’s figure out how to build the world”. And I just so happened to have studied computational geometry in college, and so I said, “I know how to do that!” and wrote a couple of hundred lines of code to make all the primitives in the world, with various knobs and capabilities to stick them together. So that was my claim to fame back then!

– Avi Bar-Zeev talking with Philip Rosedale during a Twitter Spaces event, January 25th, 2022

Whether or not Avi had any idea back when he wrote those “couple of hundred” lines of code that they would still be in use 20 years later, I’ve no idea. But it cannot be denied that his code was, throughout the early years of Second Life, one of the mainstay reasons people kept up their engagement with the platform: the joy of taking simple shapes and learning how to cut and shape them, then bringing them together and going on to texture and (perhaps) script them to make something you can point to and say, “I did that!”.

Even today within the world of mesh, prim building offers opportunities for in-world collaboration, for fun and / or indulgence that simply cannot be matched by the more solitary world of mesh design, and primitives continue to hold a certain magic for anyone who learns to work with them.

So, happy rezday, primitives, and thank you to Avi Bar-Zeev for enriching our world!

Philip Rosedale: musing on Second Life and the metaverse

Philip Rosedale (2006) via Esther Dyson on Flickr

Note: the articles linked to in this article will display a log-in form on opening. Simply click the X to close this and view the article.

Whilst coming a week late to the party, Protocol, the on-line tech publication, has presented a brief but punchy interview with Philip Rosedale on his return to Linden Lab, a piece that makes for worthwhile reading.

I admit that a small part of my attraction to Second Life’s founder doesn’t believe in VR, by Janko Roettgers and Nick Statt, lay in the fact that a couple of Rosedale’s comments on the state of VR as it is today pretty much echo what I was saying a good few years ago (that the current generation of VR headsets are inherently anti-social in the way they cut the user off from those immediately around them). However, that’s not the reason for me to point to the article; there is far more of relevance within it.

What makes this article a particularly pleasant read is the direct approach taken by its authors, with key points neatly broken down into sub-sets of bullet points. These start with a refreshing – and, I would state, fair – summation of the state of consumer-facing VR, before moving on to some of the challenges faced by “the metaverse” in trying to reach a significant global audience, and what’s on the horizon for Second Life in the future.

Janko Roettgers

This third sub-set of items has already been covered to some degree and includes the topics we’ve already heard about / surmised:

  • The use of tracking technology for avatar expressiveness.
  • A renewed move towards mobile support for Second Life (again, related to the “decentralised environment patents” transferred to LL?).
  • Improved communications capabilities.

No specifics are offered, admittedly – but what is recognised – allowing for the fact that Rosedale is only (currently?) a part-time advisor to the Lab – is that Second Life is long in the tooth, with a heavy reliance on legacy technology / approaches, and that at some point building a new platform alongside of, and eventually replacing, Second Life as we know it may well become a necessity.

And before anyone says, “but they did that with Sansar, and look at what happened!”, it is worth pointing out that Sansar was never developed as some kind of “SL 2.0”; it was made clear from the outset that the Lab was looking to address two different environments: Second Life and what was believed to be the coming wave of VR users, with agendas / needs that were very different to those of the majority of Second Life users. As such, if LL did embark on an actual “SL 2.0”, it would likely be far more in respect of retaining and growing the current user base, rather than seeking other horizons as was the case with Sansar, whilst also allowing the platform to pivot more readily to newer technologies.

I actually find this point-of-view – which again, is a personal perspective from Rosedale, and not at this point anything we know to be part of the Lab’s plans for the foreseeable future – to be refreshing. Linden Lab has perhaps been too afraid of the spectre of “content breakage”, and Second Life users a little too attached to inventory that they (probably) haven’t used in years; it’s about time someone voiced the reality that, in order to move forward, there may well come a time when a break from at least some of the past is required.

For me, a particular point of interest within the article is what Rosedale states about the challenges facing “the metaverse”, and specifically the need to get to a point where avatar-centric communications can be “as effective as a simple Zoom call”, together with the need for Second Life to provide “a better communication experience to take on Zoom calls.”

Nick Statt

I find this a point of interest because it both underlines the coming of “avatar expressiveness” in SL, and what the Lab hope to achieve with it, and also a continuing disconnect that is still evident in thinking around what “the metaverse” “must” do.

Within SL (and for the metaverse as a whole), there is no doubting that there are a range of use cases that can only benefit from avatar expressiveness. Picture, for example, a teacher within a virtual classroom being able to recognise a student who is experiencing difficulty or confusion during a lesson just by witnessing their facial expressions, and thus provide assistance.

However, the idea that “the metaverse” can gain traction among users just by emulating tools already at our disposal – Zoom, Skype, Duo, Viber, etc. – is potentially misguided. Such tools are already too ingrained into our psyche of ease-of-access and use to be easily replaced by carrying out the same task in virtual spaces. If “the metaverse” is to gain a mass appeal that isn’t centred on one particular environment / limited demographic – again, note Rosedale’s comments about Fortnite, Roblox and VR Chat – then it has to have a broad-based and compelling set of attractions, rather than risking being seen as “just an alternative” to what can already be done using this, that or the other app or programme that is already at our disposal.

But with this I’ve said more than enough – or at least more than enough about the article from which it is drawn – so I’ll close here and leave Roettgers and Statt’s piece for you to read directly. And in doing so, I’d also recommend taking a look at what amounts to a follow-up piece by the same authors. With In the metaverse, everyone can sound like Morgan Freeman, Roettgers and Statt talk to Philip Rosedale about spatial audio and the company he currently runs: High Fidelity; it’s another informative read.

Lab blogs on Second Life script performance improvements

As I’ve noted in various pieces in this blog, whilst the physical transition of Second Life services from dedicated hardware operated directly by the Lab in a co-location facility to running those services within an Amazon Web Services (AWS) environment was completed at the end of December 2020, work on the project continued through 2021, refining how the various services run within the AWS environment and leveraging the better capabilities Amazon provides – hardware configurations, monitoring tools, etc. – to improve the performance of SL’s services.

Towards the end of that year in particular, the simulator engineering team was focused on what has been referred to as the “tools update” which, among other things, should bring improvements in the area of scripts, potentially allowing more scripts within a simulator to run per cycle, and even return some time to the simulator for other processing. It’s work that I’ve referenced in my own Simulator User Group (SUG) summaries and which has, more particularly, been moving through the simulator update process over the past few weeks to the point where it is now grid-wide.

Given this, on Thursday, January 20th, the Lab officially blogged on the update (as Monty Linden stated would be the case during the Tuesday, January 18th SUG meeting), the core element of which reads:

The release also includes a modernization of our compiler and supporting runtime.  Newer tools allows for better code generation and awareness of modern CPU designs.
While the news is mostly good, a word of caution that with more scripts running, other areas of the simulation environment may be driven harder.  Scripts that were already approaching throttles or other limits may find a throttle engaged; this also applies to remote services accessed via llHTTPRequest. We do see the possibility of revisiting these throttling limits as a result of these improvements. They could see higher request rates as scripts perform more work.  
We hope that you enjoy the additional script performance for your regions. Anecdotes from region owners on the RC channels before release were generally positive. We are keeping an eye on the data with expectations that these improvements are here to stay.  We hope that as the regions improve performance you will find ways to create and explore in ways that you could only dream of before.

Note the emphasis on the middle paragraph has been added by myself.

The blog post also outlines further updates to SSL support within the simulator hosts (simhosts), including all SSLv3, TLS 1.0, TLS 1.1, and related ciphers being deprecated for the llHTTPRequest, llRequestURL, and llRequestSecureURL functions – although these changes do not affect log-in services, so users should not see any of the issues witnessed with the recent TLS changes to the login services.
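In practical terms, deprecating SSLv3 / TLS 1.0 / TLS 1.1 means that any external endpoint those script functions call must negotiate TLS 1.2 or later. As a rough sketch (using Python’s standard library, purely by way of illustration – this is not anything Linden Lab publishes), the operator of an external service could confirm their own client configuration enforces the same floor:

```python
import ssl

# Build a client context that refuses anything older than TLS 1.2,
# mirroring the simhost-side deprecation described above.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# SSLv3 / TLS 1.0 / TLS 1.1 now fall outside the permitted range.
assert context.minimum_version > ssl.TLSVersion.TLSv1_1
```

A server configured the same way would simply refuse the handshake from any client still limited to the deprecated protocol versions.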

Please read the full official blog post for complete details and context.

Watch Mon Métaverse, reflections on Second Life, Meta and more

Courtesy of Tutsy Navarathna
Following the creation of Meta accompanied by the grandiose announcements by the media singing the praises of future metaverses, we can rightly ask ourselves, which metaverses and what future are we talking about? … My friend, Yann Minh, a fellow explorer of cyberspace shares with us his thoughts and fears.

– From the introduction of Mon Métaverse by Tutsy Navarathna

With these words, Tutsy Navarathna leads us into his latest video, one that is among his most thought-provoking (which is saying something, given the depth of content and ideas always embraced by his work).

Yann Minh

Published on his YouTube channel on January 16th, Mon Métaverse (“My Metaverse”) offers thoughts and reflections on the futures of “the metaverse” from both Tutsy and cyberspace explorer Yann Minh, who has been active within, and considering, virtual spaces for over 20 years.

Running to just a touch over 5 minutes, the video is a fascinating dissection of the current hype around “the metaverse”. Within it we are invited to consider what we have had up until now, and the choices we may face in the future. Do we hold on to what we have thus far had: a digital life of almost limitless horizons and infinite diversity, in which freedom of expression and creativity are embraced; or are we going to allow ourselves to be herded into sanitised, corporate-defined spaces where expression and creativity run second to the surrender of personal data to feed the corporate revenue machine, and activities are governed by fake corporate morals?

Where I thought twenty years ago that we were heading towards a more flexible, versatile and mature future, in fact the opposite is happening. We are clearly heading towards an infantilising, paternalistic future similar to the time when religious morals massively imposed their absurd rules on individuals.

Yann Minh, Mon Métaverse

This is a subject that can be debated at a length that will easily exceed the 5 minutes of the video. However, the beauty of Mon Métaverse is that Yann encapsulates these concerns eloquently and concisely, challenging us to think about our digital future without belabouring the message. In doing so, he positions things perfectly for Tutsy to present a – frankly – marvellous and honest look at the richness we have within Second Life, perfectly illustrating what “the metaverse” should really be about: the creativity of individuals, built without the data-hungry maw of algorithm and data collation sitting beneath it.

Beyond this, and on a personal level, I couldn’t help but see a possible broader context within the video; a more subtle questioning / challenge. It comes both in Yann’s comments around Facebook / Meta as the tip of an iceberg and in the follow-on statement regarding religious censorship. We already know Facebook is responsible for the spread of disinformation – a practice it is unwilling to stop, and which has assisted the open growth of authoritarian politics that are, to no small extent, founded on a fake, moralistic and divisive organised religion. As it turns out, this was in fact something that both Yann and Tutsy had been considering in developing the concept of the video, as Tutsy informed me.

We are faced with a system that’s increasingly dominated by normalising algorithms in the service of a radical, conservative, authoritarian right unchallenged by most of the media. Within digital spaces, Meta is just the tip of the iceberg which, as Yann Minh puts it, “leads us to a paternalistic, infantilizing future”; it seems high time we express our opposition to the way our freedoms and democracy are being so challenged, lest, as Yann notes, we see the absurd rules of the religious conservatives imposed on all of us within virtual spaces as well.

– Tutsy Navarathna

Thus, Mon Métaverse folds into itself a broader narrative that is not entirely out-of-place, and which adds further depth to its message for those who like to ponder such matters.

But, leaving messages and narratives aside, Mon Métaverse stands as a superb promotional piece for Second Life, both within the broader context of “the metaverse” and as a means of offering insight into the platform’s power to attract, engage and retain users. This makes it more than worth the time taken to watch it, and I encourage you to do so, either by viewing it below or clicking on the link within the video panel and watching it directly on Tutsy’s YouTube channel.