Linden Lab’s corporate pipe dream

Those of us who have been around in Second Life for a while are aware of Linden Lab’s ambition to attract big corporations to Second Life. When the Second Life “bubble” had reached its maximum diameter (and then met the sharp end of a needle, with predictable results), Philip Rosedale, then CEO of LL, made claims that managed to raise the eyebrows of even the most enthusiastic users. In May 2007, he told the Guardian (emphases mine):

What we are saying is SL is the next worldwide web and so every computer has to do 3D perfectly and we are not there yet. We are probably one PC development cycle – so 18 months – away from where every machine with Vista or Mac OS X should be able to run SL. I think we started a little bit early with SL but the sheer enthusiasm that people have had about co-creating the world has sustained it. What is amazing is that we are not even there yet. This is only the beginning of the 3D web and SL.

This reminds me a bit of the days when a more computer-savvy cousin of mine was telling me about how we’d have multiple desktops on a cube that we’d rotate on the screen and move applications and clipboard contents from one side of the cube to the other. Now, the desktops-on-a-rotating-cube is but a gimmick implemented in feature-rich desktop environments like KDE, Gnome 2 (sorry, Gnome 3 is crap), MATE and Enlightenment; impressive, but not many people use it, because it’s not particularly useful.

“SL is the next worldwide web” – quite a bold claim and I’m sure many in the industry must have chuckled a bit when they read this interview. OK, I’m throwing my “good girl” mask away and I’ll be blunt about it: this claim was pure bullshit and it makes one wonder if Mr. Rosedale is in touch with reality. Why did the worldwide web succeed?

  1. It can be used on anything: from a low-spec cellular phone to a supercomputer.
  2. It’s open (source): everyone has access to it, to its inner workings and so everyone can set up servers, websites, the works.
  3. It’s not controlled by a single company that monopolises it, so users don’t feel enslaved or trapped in “walled gardens”; moreover, the “anarchy” that has characterised the worldwide web for so long means it simply cannot be taken down or become extinct just because an ISP ended up pushing up daisies.

How does Second Life compare to this?

  1. It has always been demanding, hardware-wise. Much like high-end computer games, it practically dictates that users have a reasonably decent machine (nothing wrong with that, mind you) to run it with at least the basic visual “bells and whistles” that’ll make it look halfway decent – and if you want it to look really good, then you need a seriously good machine.
  2. Its server-side code is proprietary and closed.
  3. It is controlled in a monopolistic way by Linden Lab – and with the varying degrees of compatibility w.r.t. mesh and sculpts implementation among the various virtual world platforms, the abandonment of the “Hypergrid” and the lack of proper content export tools (for which we can also blame – in part – the in-world content creators and their copybot-related paranoia), it’s the archetypal walled garden, in which users are essentially trapped and can’t easily take their creations and inventories to other platforms.
  4. If LL bites the dust, Second Life is finito (and, along with it, all the sims that its users have created with so much effort).

So no, Second Life is not the next worldwide web. Perhaps a virtual reality realm sometime in the future (in 10, 20 or 30 years’ time) will absorb the web – but still, it’ll need to have the qualities that the worldwide web has and Second Life doesn’t.

It is these qualities that have given the worldwide web the massive user base it has. Now tell me: where are big corporations more likely to spend money in order to have their presences? On a virtual world with approximately one million real users (and a gazillion alts, bots and sockpuppets) that is in slow decline or on the worldwide web, with the hundreds of millions of users (who can be potential clients and personnel for the corporations)? I think the answer is a no-brainer and I wonder how LL’s top execs kept deluding themselves into having such unrealistic expectations and pursuing a pipe dream, wasting time and money on it instead of enhancing what they had.

There are also a few more reasons why SL cannot be the platform of choice for big corporations:

  1. Second Life is – at least in theory – all about privacy, whereas big corporations want to have access even to the most trivial information about us, so that they can profile us and (a) fill our inboxes and browsing experience with spam, (b) sell our data to “national security” organisations or other entities, depending on who makes the best offer. To be honest, I’d really like to see Mr. Humble, who was so vocal about privacy and avatar identity in his talk with Draxtor Despres, tell us where LL stands in the wake of the PRISM scandal.
  2. When SL was marketed as the platform for corporate presences, it simply was not ready. Sculpts were still at a primitive stage, there was no mesh support and, well, prim-based stuff simply doesn’t cut the mustard.
  3. Its very nature (an internet-based virtual world environment) means that there are serious technical limitations w.r.t. the quality of the graphical representation of a company’s goods (cars, for instance). Why would a car company spend money on a corporate presence in Second Life and on making low-quality graphical representations of its cars that will be seen by perhaps 5% of SL’s limited userbase instead of making a few nice videos and uploading them to its Youtube and Vimeo channels and embedding them on its website?

So, if LL’s management wants to show that it’s in touch with reality, they need to forget this pipe dream – at least for now. Otherwise, they’ll repeat the same mistakes they’ve made numerous times in the past and, to quote Horace, “bis repetita non placent.”




10 thoughts on “Linden Lab’s corporate pipe dream”

  1. I think the dream is still realistic… just a few decades ahead of its time.
    Eventually the internet will be transformed into a virtual reality kind of form and of course SL wants to be part of that but if it will…
    The oculus rift has brought virtual reality closer to the masses and if LL plays its cards right, it could play a big part in the next generation of internet/virtual worlds.
    I think we’re close to a crossroads at the moment, pick the right road and SL will have a whole new life (but still not what they dreamed about 10 years ago), pick the wrong road and it will eventually become obsolete.
    Mind you, as long as they keep making money they will probably keep SL alive even if it just has a few hundred users sharing a single server 😉

    1. The issue with this dream is that three consecutive managements of LL (Rosedale, Kingdon and Rosedale again) took (oftentimes against their better judgement and against good, solid advice they were given by the likes of Randy Farmer) a series of wrong turns, which (a) caused investors to bail out, (b) caused SL to lack technical orientation and focus, (c) caused SL to lose users along the way, (d) caused SL’s image to deteriorate to what it is right now.

      Are virtual worlds in general the future of the web? If they follow the principles that made the worldwide web succeed (one of them being open data and open source server-side code), maybe – if and only if the entities (companies and non-profits) behind them play their cards right. But if they continue insisting on the walled garden business model, they’re going to have the typical life cycle of any product, which ends with, well, death (i.e. the product becomes irrelevant and obsolete and is abandoned and discontinued).

      But it must be understood that no proprietary, centralised, monopolistically-controlled system like Second Life can be the future of the web. Those who think so are way out of touch with reality, have no understanding of the internet at all and should just get out of this business now, before they do more damage than they already have.

      As for Oculus Rift, I still think there’s too much hype around it, which is to be expected. They raised 16 million USD in the seed stage, so they’re doing whatever they can to promote it and get ROI (Return On Investment) before the investors come and take the lion’s share. Let’s face it, the Oculus Rift crew are under the gun to make as much money as possible in as short a time as possible. But I’m still not entirely convinced about OR’s usefulness. The way I see it, both the Oculus Rift crew and its current supporters will one day have to do the reality check we did with SL.

      But anyway, a lot of this stuff I just told you belongs in another post that I’ll write at a later date.

    2. And also, there’s something else that I really don’t like about the obsession with bringing big corporations into SL. SL is (at least on paper) all about privacy, respect of avatar identity etc. These things simply don’t fit in with big corporations’ view of users: to them, our data is merchandise they can sell at will to the highest bidder, whether that bidder is a big advertising company or the NSA.

  2. Aaah Mona, I love your articles, perhaps even more so when I don’t agree 100% with your opinion, even though I always feel that your argumentation is very solid!

    So bear with my grumbling for a bit, hopefully you have even better arguments to stifle my grumbling and reduce me to silence 🙂 hehe…

    First, your argument that “the web could be used on anything”. This is certainly the case today, and since we’re comparing SL today with the Web today, you’re clearly right. However, I’m not sure if you ever witnessed the launch of the world’s first graphical web client, Mosaic. At that time, we were all excited about its abilities — transferring images over HTTP? W00T! — but sorely disappointed with its performance: images (and, later, complex styling which led — even later! — to CSS…) took a long time to download, and, unless you were lucky enough to work at a university with a high-speed Internet connection, the Web was not for you. At least not the graphical web, although the text-based web was a possibility — aye, I do remember very well when my whole country was connected to the Internet with two 64 Kbps connections. Hah! At home, our 9600 baud modems were soon being replaced with the newer and faster 14400 baud, but we had no illusions: the graphical Web was not for common users. We were stuck with text.

    But less than two years later, Windows 95 came out, and everything changed. Suddenly, everybody’s computers were “bumped up” to be able to render and display graphical pages, and the era of the text-only Web disappeared (with perhaps a short “revival” with WAP for a while, soon to disappear).

    My point here is that the Web wasn’t “born” as it is today — a massive medium of rich information interchange, which runs from watches (and allegedly coffee machines and fridges!) to supercomputers, and, these days, NASA even had plans to extend the Internet to Mars (which sadly they didn’t do, but still tested the concept successfully on much nearer spacecraft).

    The difference, IMHO, is the rate at which the Web technology (and Internet communications) has evolved, compared to Second Life. In six years, I had at home as much bandwidth as my whole country; ten years after that, I have, again, as much bandwidth at home as my whole country had a decade ago. This continues to increase, even though not at the same rate. So, while in 1993, a web page with more than 10 KBytes would be seen as “catastrophically slow”, nowadays we daily download pages with a few Megabytes without blinking — and which any browser can render in a few milliseconds. By contrast, SL is as slow today as it was a decade ago — even though, of course, it renders things much more beautifully!

    Another difference is the split in focus. Philip worked under the delusion that Second Life would become a mainstream product; when it was launched, it looked like it would eventually work out: early adopters were graphic designers, gamers, and hard-core programmers, all of whom had expensive, powerful hardware which could render SL smoothly. Even today, anyone able to afford the latest and greatest hardware, even if it’s not top-of-the-line, will enjoy top rendering performance. The problem, of course, is that not all hardware runs SL; and that anything a few years old (like all my computers at home; all are obsolete, from the perspective of SL…) will take eons to do anything with a handful of frames per second. Sadly, most of the mainstream users are in that category.

    As the focus shifted away from mainstream usage, and more and more into the realm of niche markets, LL slowly gave up on mainstream users. Designers, artists, programmers, gamers, hackers… all of them have powerful computers, all of them expect top performance and the highest quality from virtual worlds, and, yes, LL is catering to their needs.

    What is the Web industry doing in the meantime? Simplifying their interfaces. Google, of course, set the standard of lean, clean designs, but, these days, pretty much every high-end, multi-million-user environment uses simplistic designs — which load super-fast and can be easily viewed even on the slowest and least performant hardware. Such as watches or first-generation smartphones. My impossible-to-upgrade ancient iPhone is still able to view Google or Facebook, but (until I made some changes) I couldn’t access my own blog — it simply rendered too slowly.

    So here is the point where I agree with you. If Linden Lab still wanted to compete for the mainstream, then they would not have any other choice but to simplify the rendering engine, at the cost of making everything look ugly (or “simplistic”) but still usable. A Web-based — even better, an HTML5-based — solution would obviously make perfect sense for that. The real reason why LL isn’t bothering is that they find that exploiting the niche market they already have is more worth their efforts than expanding into the unknown regions of the mainstream, where there certainly are users, but… are they paying users? That’s the big question!

    Until not so long ago, Facebook didn’t make a cent. They rode on hype — and investors’ capital, which they burned cheerfully, in order to be able to send out press releases with higher and higher numbers of users. Not so long ago (2009, I believe, but I might be mistaken), when Facebook finally broke even, they made less money than Linden Lab. Then they got an agreement with Microsoft, started selling ads, made so much money that they dumped Microsoft and continued to sell ads on their own, and, well, the rest is history: they make so much money on ads that they can afford to target the mainstream and support a huge userbase which doesn’t pay them a cent. Google, of course, makes US$42 billion or so in ads, so they can afford to “give away” Gmail, Google+, Hangouts, YouTube, and who knows what else they do for free. Similarly, Microsoft makes so much money from licensing software that they can also afford to “give away” Hotmail, SkyDrive, Online Office and similar free products — to the mainstream.

    In order to emulate them, Linden Lab would need a different stream of revenue. The model based mostly on tier, and marginally on LindeX/SL Marketplace fees, does not grow well beyond a niche. It’s impossible for LL to support, say, a hundred million regular users, if only some 100,000 or so are willing to pay tier — and among those, 90% just pay a handful of dollars per month. This clearly won’t work. So, in that regard, LL is not really creating “the 3D Facebook of the future”. Rather, they have shrunk to fill a very small niche which just happens to have the right people, with the right hardware, and the right attitude towards paid online services. There are not many of those, and thus no wonder LL’s niche market is small. Small, yes, but still worth half a billion US$ in transactions, and making LL a hundred million US$ annually, which is not to be sneezed at — they’re by no means a “tiny company”. Just a medium-sized company successfully exploiting a niche market.

    Now, I agree, in principle, with the idea that the more closed a technology is, the more likely it is to disappear “sooner or later” — usually, when the technology’s owner goes bankrupt and closes shop. Of course, the computer industry has had very long-lived companies which have successfully kept their proprietary and closed formats alive for decades — Microsoft comes immediately to mind (but obviously they’re not the only ones!).

    However, we have been seeing a huge move away from open protocols recently — a move subtle enough to go mostly unnoticed. Let me give you an example: a decade ago, we all assumed that one would be able to download documents from any server connected to the Internet using FTP, SFTP, or even HTTP. These days, however, document upload and download happens mostly through proprietary protocols embedded in Flash applications (or similar technologies).

    Wide-area forums have always existed… first on BBSes, but later on USENET, which was open, even though it had governance (to decide when to open new groups or not). Anyone could use NNTP to communicate with a USENET server, create their own server, get their own feeds, and so forth… what happened? Well, mostly, spam — but the point is, nobody seriously launches a USENET server any more. We all use Web-based forums — closed, locked down to certain applications, and unable to communicate with each other. The only way you’re aware of them is because (thankfully) Google still indexes them. But the ease provided by USENET — a global forum, spread world-wide, with all sorts of discussions on all kinds of topics — is gone: bye-bye, Internet-wide forum discussions.

    We also assumed that email communications would use SMTP/POP/IMAP4; but every week, more and more messages are being transferred using Facebook’s messaging format, which is fully proprietary and incompatible with anything else — but you’re expected to have a Facebook account (or a LinkedIn account, or a [insert popular social networking site here] account…), so “that’s ok”. It’s not: every day, someone else is developing a new and incompatible protocol to communicate and transfer files, and you’re expected to catch up — and hope that the company doesn’t disappear. Look at what happened to text chat: first we had IRC, then Jabber/XMPP, and then suddenly everybody (except Google) dropped all open protocols for instant messaging — and even Linden Lab said, “we tested XMPP and found it wouldn’t work for us”. Right! As if they had a better solution! (We all know the troubles with LL’s implementation of group chat) But the trouble is that everybody is suddenly reinventing the wheel. The only recent attempt at getting things to communicate with each other was Microsoft’s integration of the MSN Messenger protocol with Skype and Facebook — finally, three proprietary technologies could chat to each other (and even use video chat!). But they still remain proprietary: it’s the Skype client application that supports all three protocols and seamlessly bridges across them.
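    To make concrete what an open protocol buys us: an RFC 5322 email message is just structured plain text, so any program in any language can build and parse one straight from the public specification — no vendor SDK, no account with any company. A minimal Python sketch (the addresses are, of course, hypothetical):

```python
# An RFC 5322 email message is plain text: any program can build it and
# any other program can parse it, using only the public specification.
from email.message import EmailMessage
from email import message_from_string

msg = EmailMessage()
msg["From"] = "alice@example.org"   # hypothetical sender
msg["To"] = "bob@example.net"       # hypothetical recipient
msg["Subject"] = "Open protocols"
msg.set_content("Anyone can implement this format straight from the RFC.")

wire = msg.as_string()               # the on-the-wire text an SMTP server would relay
parsed = message_from_string(wire)   # a completely unrelated program can read it back
print(parsed["Subject"])             # prints "Open protocols"
```

    That round trip — serialise with one implementation, parse with another — is exactly what proprietary messaging formats take away from us.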

    This killing of open protocols goes further and further. We had a fantastic open streaming protocol, RTSP (the one supported by LL inside its viewer… because, well, it’s an open Internet standard). Recently, the last company supporting RTSP with a free and open server, Apple (which maintained their Darwin Streaming Server), announced that they gave up on it. Why? Well, because Adobe managed to convince the whole world that they had a “better” protocol — opening the door for YouTube, Vimeo, and gazillions of other streaming providers to dump RTSP and develop their own proprietary protocols. Also, until recently, we had nice open protocols to share network-mounted disks. It all started with NFS, eons ago. Microsoft proposed SMB instead, and Apple suggested AFP — it was the beginning of the end. When HTTP became popular, we got WebDAV, which worked rather nicely, and it seemed to pave the way for open protocols to be able to share networked disks again. Then, well, companies like Dropbox — followed by Google Drive, SkyDrive, and WhoKnowsWhatsPopularTodayDrive — all of which couldn’t care less about open protocols and implemented their own, making sure they deployed viewers compatible with “their” protocol on pretty much every kind of hardware out there.

    So if you look at the history of open protocols… they’re slowly dying out, one by one. The last three bastions are HTTP, email and DNS — and email, as said, is starting to be seriously under attack by Facebook. If Facebook “wins” the war, in the future, you will be unable to use email except among a tiny community of nerds and geeks. Look at how people in SL/OpenSim use store-and-forward message technology: they drop notecards into other users’ inventory. Email was “left out” of virtual worlds — as it’s being “left out” of social networking. We might just be left with HTTP and DNS… unless someone in a garage is sitting on a pile of venture capital money and planning how to get rid of the Web and DNS. It’s possible. Nobody could have predicted, in 1993, how quickly USENET, FTP, IRC, and later RTSP and XMPP, would die out…
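    And part of why HTTP is such a resilient bastion is its sheer simplicity: an HTTP/1.1 request is nothing but a few lines of ASCII text ending in CRLF, which any program with access to a TCP socket can produce. A sketch of the raw bytes (hypothetical host, no actual network call made):

```python
# An HTTP/1.1 GET request is plain ASCII lines separated by CRLF;
# no proprietary client library is needed to produce it.
host = "example.com"  # hypothetical host
request = (
    f"GET / HTTP/1.1\r\n"
    f"Host: {host}\r\n"
    "Connection: close\r\n"
    "\r\n"
).encode("ascii")

# Written to a plain TCP socket on port 80, these bytes are a complete request.
print(request.decode("ascii").splitlines()[0])  # prints "GET / HTTP/1.1"
```

    Anything that simple and that universally implemented is very hard to dislodge — which is precisely why, as noted below, everyone now tunnels their proprietary protocols over it.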

    The attempt to normalise and standardise virtual world communication protocols died out in 2010, with the VWRAP group, which showed so much promise in 2007. In 2010, however, nobody wanted to implement their recommendations. Not even the OpenSim gang, which had always kept up with VWRAP’s latest working specifications, while there was still hope, one day, to get full communication with Second Life. 2010, however, was when M Linden thought it was pointless to keep “an open grid” and just shut down most of the more “open” ideas that LL still had floating around. Maybe he just thought, “if Facebook can remain a closed and locked island, so can we”. And maybe, in retrospect, he wasn’t that wrong — he was just following the lead everybody else was taking. Google might be one of the last bastions of open protocols, but YouTube does not support RTSP any more (it used to, btw, for a short while, to provide streaming video to early-generation smartphones which weren’t licensed to run Flash), and Google Drive uses Google’s own “Google Data Protocol” (which, well, at least runs over HTTP…).

    In a sense, I think that at this stage the trend is to run every proprietary protocol over HTTP (mostly because it successfully goes through any firewall!). In that case, LL’s move to HTTP and phasing-out of proprietary UDP-based communications is not that silly: they’re just following a trend.

    Whew. So. Did I lose my point somewhere? I think so! My point is that free and open protocols are out of fashion, and, because of that, it’s highly unlikely that “anyone” will be able to suddenly reverse the trend and launch an “open metaverse protocol” and expect to get away with it. In fact, what we’re seeing is quite the reverse: anyone launching a new virtual world seems to be especially keen on reinventing the wheel and developing their own protocol, instead of sticking to (any of many) standards for virtual world communications. It’s just at the research and academic levels that (some) people still bother with open protocols. In a sense, the Internet’s biggest success — its reliance upon free and open protocols, released to the public domain — is being cut at the root. What every new virtual world wannabe does is to create everything from scratch, as closed as possible, expecting that magically everybody will move over to “their” protocol and specifications. Which obviously doesn’t happen — not even Google was able to launch their own virtual world (Lively) with that approach. And just look at what the OpenSim commercial grid operators are doing: while every OpenSim-based grid can freely communicate with every other OpenSim-based grid in the universe, what all commercial operators immediately do is to make sure that their grid is isolated from Day One! Seems to be a completely stupid approach — but all justify themselves with the argument that they’re not willing to allow “freeriders” to use their bandwidth and resources. Which is pretty much the same argument LL uses for keeping their grid closed to the outside world as well.

    Your last point, however, rings true: corporations and governments don’t want privacy, and, as such, anything that promotes privacy — like Second Life — runs contrary to the current trend in social networking, which is just useful for two things: 1) dating (for the end-users); 2) profiling data and selling ads (for the companies running the service). Pretty much everything else is marginal and irrelevant. So, yes, a metaverse which focuses on privacy, security, content protection, and users making money (as opposed to companies sucking users dry) is completely out of fashion. I do agree.

    However, I also think that we’re quickly running to a point where a catastrophe has to occur. Somewhere the Big Drama is going to break loose — somewhere, a company running one of the leading social networking tools will have a security issue, a leak, some huge privacy problem… one so big, so big, that nobody will be able to hide it. I think that the recent debate around PRISM is the tip of the iceberg, but one that is already raising a huge amount of drama, which could be the tipping point. And when that happens, we’ll see a swing back to normalcy — where privacy, suddenly a rare commodity, will be in much demand again. But this will not happen very quickly. Facebook took ten years to sell the idea that “privacy is bad” and managed to successfully convince a billion people of it. Convincing that billion that “privacy is an inalienable right” — reverting Facebook’s strategy of the past decade — will not be an easy task, even if there is a single event that shatters the carefully built house of cards on the “no-privacy” issue. It would have to be something like having a drunken Zuckerberg logging into the Dalai Lama’s account and sharing with everybody in the world pictures of the Pope having sex with Obama. And even so, some people would defend Zuckerberg’s right to “go public”. The no-privacy lobby is incredibly strong and has brainwashed a substantial part of humanity.

    Oh, speaking of which, I noticed you mentioned the old myth that Second Life is full of ‘bots. You know, this is a rather interesting meme. So interesting, in fact, that it has been the target of quite a lot of active academic research: there is a whole body of knowledge in bot detection technology, and lots of researchers have applied the same techniques to figure out how many ‘bots there are in SL. Remember, since LL changed the ToS to make it mandatory for ‘bots to be registered as such and to limit how ‘bots can be employed in SL, they have announced that “‘bots are less than 10% of all connected users” — an announcement met with a lot of skepticism by the residents. Scientists, however, probed this claim further and used advanced ‘bot detection techniques, and the actual figure is much lower — 5 to 7% at most! The results are routinely published in peer-reviewed academic journals. As recently as 2009 — when we still had a more lenient ToS and allegedly far more ‘bots than today — a paper was submitted, co-authored by a ‘bot development company, that could not find more than 1.8% of ‘bots among the SL population, and concluded that the market for ‘bot developers was hugely unexplored, because there were so few ‘bots around (compared to other services, like Yahoo or Twitter…)! So, well, claims that there are “a gazillion bots around” are simply not sustained by researched evidence. Truly, the vast majority of us are really human. That doesn’t mean that we don’t still see clusters of ‘bots on many popular venues, but they are, on average — an average spread over the whole landmass and resident population — truly negligible.

    Unless, of course, the AIs behind those ‘bots are so clever that they manage to figure out who is a researcher (more likely: a researcher ‘bot!) trying to count them, and are able to elude them and hide… 🙂

      1. Aw I was sort of restless and sleepless after a long day… don’t bother to address them all. And, in truth, I agree with most of your points. However, I guess it shows that somehow I felt “cheated” by the idea that open protocols would change the world — I’ve been promoting open protocols as early as, oh, I don’t know, 1993 perhaps? So in the past two decades this has been my mantra — “open protocols, open protocols, open protocols”. And what was the result? Everybody is abandoning them, and, worse than that, they have been extremely successful in doing so!

        This led me to vent some frustration, and thus my long comment. It felt like a purge, really…

  3. I still believe in Rosedale’s dream. And I still think that Second Life can be the start of the metaverse. It wouldn’t be the whole thing — like you said, corporations need more control, access to their visitors, etc… — and you get that on your own grid. But Second Life could be part of that bigger ecosystem. The way, say, that Facebook is. So companies that have their own websites ALSO have a Facebook presence (usually much smaller) for marketing purposes.

    I don’t think the metaverse will replace the Web. I think it will add a new layer to it. The Web will continue to be used to communicate information. And the metaverse will be used to communicate EXPERIENCES. Like learning simulations, company meetings, parties, virtual tours — anything that people do, as opposed to read about or watch in a video.

    It will take new user interfaces (I have big, big hopes for the Oculus Rift!) and better viewers (I’m hoping that WebGL viewers like Pixieviewer become usable this year). It will take Second Life finally taking control of its marketing, and investing heavily in it (I’m thinking of a big, AOL-style push). And it will take growth in the broader metaverse (such as the OpenSim-based worlds) and Second Life connecting to them via hypergrid. (The technology is already there, for when it makes business sense.)

    It is discouraging that the metaverse still seems no closer than it was five years ago. But I think the huge explosion of interest in the Oculus Rift and the Virtuix Omni kickstarters shows that people are really hungry for it. I’m hoping that this indicates that we’re on the verge of major breakthroughs.
