Designing Worlds: Ebbe Altberg video and transcript

On Monday October 6th, Designing Worlds, hosted by Saffia Widdershins and Elrik Merlin, broadcast a special celebratory programme marking the show’s 250th edition, counting both Designing Worlds and its earlier incarnation, Meta Makeover. To mark the event, the show featured a very special guest: Linden Lab’s CEO, Ebbe Altberg.

The following is a transcript of the interview, produced for those who would prefer to read what was said, either independently of, or alongside, the video recording, which is embedded below. As with all such transcripts in this blog, please note that while every effort has been made to encompass the core discussion, not all asides, jokes, interruptions, etc., have been included in the text presented here, in order to assist readability and maintain the flow of conversation. Any sizeable gaps in comments from a speaker which resulted from asides, repetition, or a speaker starting to make a comment and then re-phrasing what they were saying, are indicated by the use of “…”

The transcript picks up at the 02:25 minute mark, after the initial opening comments.

Ebbe Altberg, appearing on the 250th edition of Designing Worlds via his alter-ego, Ebbe Linden

0:02:25 – Ebbe Altberg (EA): Hi, thank you. Thank you for having me on this very special occasion of yours, and ours. 250, that’s amazing! It’s incredible, incredible; I’m very honoured to be here.

Saffia Widdershins (SW): Well, we’re very honoured to have you here … Now, you’ve been in the job for around nine months now.

EA: Yes, since February, I think. Yeah.

0:02:59 – SW: Is it what you were expecting, or how has it proved different?

EA: It’s fairly close to what I expected, because I’ve had a long history of knowing Second Life, even from the beginning. So Second Life, the product, was not a mystery to me. Obviously, as you dig in and look under the hood, you see some things that you wouldn’t have expected; and some of the other products in Linden Lab’s portfolio were maybe a little bit surprising to me, but we’re getting that cleaned up. But with regards to Second Life, it is not too much of a mystery, as I’ve been following it so closely since way back in the beginning … So it felt very natural and quite easy for me to come on-board and figure out where to take things.

0:03:59 – Elrik Merlin (EM): And keeping this next question as general as possible: is there anything that’s been a pleasant or unpleasant surprise?

EA: Not that many unpleasant surprises; well, it was a little unpleasant how far we had managed to disconnect ourselves from the community and our customers and residents. That was a bit shocking to me, because I had missed that part of the history. I remember the beginning of the history, where there was a very close, collaborative relationship between the Lindens and the residents. So that was a bit shocking to me, that … some effort had to be put in to try to restore some of those relations, and some of the processes that had been introduced here we had to reverse. You know, the fact that Lindens couldn’t be in-world [using their Linden account] and stuff like that. So that was a little strange to me and unfortunate.

Positives? There are many. There are so many talented people here, so that’s been a lot of fun, getting to know people here. Some people have been here for a very long time; some absolutely incredible people have been here for over ten years working for Linden, so just getting to recognise what incredible talent we have here has been a positive … it was a little bit low-energy when I first came here, which was a little bit unfortunate, but I think we’ve come quite a bit further, and so the energy today in the office and amongst the people working here has gone up quite a bit, so I’m very pleased with that.

0:06:03 – SW: That’s brilliant. Have there been any stand-out “wow!” moments when you’ve come in-world and seen something and gone “wow!”?

EA: The “wows” for me may be less visual – I think we could do better with that in the future – but just the communities, and the types of creations, and how people collaborate to make these things happen. The variety of subject matter and the variety of things that Second Life helps people to accomplish, whether it is games or education or art – it’s just incredible, the variety. And also the interactions with people are wild moments, where I can just drop in somewhere and just start chatting with people, and that’s always a lot of fun and creates “wow!” experiences for me.

So the fact that this is all user-generated, in some ways that just wows me every day. It’s incredible that we can enable all these things to happen. But I’m certainly hoping we can get to a point where it’s more of a visual “wow!” in the future.

0:07:32 – SW: I’ve been at the Home and Garden Expo this week … and there are certainly some things there that are stunning examples of what creators are working on at the moment.

EA: Yeah. It’s taking everybody a while. A lot of new technologies have been introduced, and we’re still trying to make adjustments and fixes and improvements in some of those things. But as more and more creators figure out how to take advantage of these things – whether it’s mesh or experience keys and all kinds of stuff that are just creating a new wave of different types of content and experiences – it’s fun to watch happen. It’s a lot of fun to be able to enable and empower people that way.

0:08:33 – SW: We wanted to talk a little bit about the new user experience.

EM: Ah yes, and talking to different people working with new users, both English and Japanese speakers, interestingly enough, both have talked about problems with the new mesh avatars … One of the first things that people enjoy when they first come to Second Life is [to] customise their appearance, but the mesh avatars don’t really allow this, or they don’t allow it easily. Is there something that can be done about that?

One of the things the Lab is trying to solve is the “dead face” – the fixed facial expression – on current mesh avatars, coincidentally demonstrated in the video by Ebbe’s (non-mesh) avatar

EA: I don’t have a specific list of good things there; the team is working on making improvements to the avatars, from little things that we might see as bugs, to trying to solve the “dead face” – getting the eyes and mouths [to] start moving. But some of the clothing issues are probably also issues with the complexity of understanding what things I can shop for that are going to be compatible with what types of avatars, and all that. It is hard to tell how much of it is complications with the transition … or the fact that you have two different ways of doing things happening simultaneously; we’re sort-of in this transitional period where you can obviously still go back to using any of the previous avatars; those are all still there. But we wanted to push ahead with what we figure is where the future is going to take us, and there are probably some growing pains in doing that; but over time, this is where it is going to go.

So we just have to try to understand the bugs and the complexities and react to it as fast as we can. But I don’t, off the top of my head, have a list of known issues that we’re fixing with regards to the complexities around avatars, other than the stuff with getting the face to wake up. I can look into that for a follow-up later on, but right now I don’t have anything right off the top of my head.

0:10:50 – SW: Speaking as someone who directs dramas like The Blackened Mirror: we’ve long said that we would give anything for the ability to raise a single eyebrow …

EA: Yeah … ultimately over time, as [real world] cameras improve, if you’re willing to be in front of a camera, there are things you can obviously do to really transmit your real-world facial expressions onto your avatar, and we’re going to look at that further out. That’s not something we’re actively working on right now; but there are certainly other companies, including HiFi, that are looking at that, and we know companies that already have proprietary technology behind it that we could license to do some of those things.

But there are very few of those types of cameras around, so even if you did add that kind of functionality, very few people would be able to take advantage of it, so it’s a little bit early to jump on that. We need more 3D cameras in the world. Otherwise, there are some other techniques – it wouldn’t necessarily be facial expression – but there’s a company working on technology to be able to have your mouth … make the right movements based on the audio. That’s an interesting technology, but they haven’t figured out how to make it real-time yet.

What they’ve found is that regardless of language, if you make a sound, your mouth makes a very specific movement and a very specific shape, and they’ve constructed all of the internals of the mouth and know exactly what your tongue and your cheekbones are doing in order to make that sound. Right now it’s not in real-time, but they’re working to get there. So then we could get the mouths to actually react to the sounds that you are making through the microphone.

So over time, more and more of this will come, but today it would be difficult to do something that would auto-magically make it work for everybody.
