The InterGrid and the Second Life Foundation

When the Metaverse Roadmap was released last year, people were excited. For the first time, several different technologies were planned out for the next 10-20 years, and their convergence — desired or undesired — was laid out and discussed openly: surveys were made, presentations were given, and a lot of documentation was produced. The Metaverse Roadmap is not a “prophet’s tool”. It gave directions and guidelines; it tried to “define” what people’s expectations of a “metaverse” should look like, and how to proceed, slowly, towards implementing it. Although the Roadmap could be — and was — criticised (it appealed to people’s participation in surveys, and it extracted information from existing technologies, but it didn’t plan to implement anything), it was better than the alternative: having no information at all on what a “metaverse” should look like.

During Virtual World 2008, what suddenly happened was that the Metaverse went through an “identity crisis”, as Hiro Pendragon so aptly named it. Put another way: apparently, the industry is not aligned with what the “Metaverse” is supposed to be. It has forked and gone down different roads.

One or many Metaverses?

Neal Stephenson’s original idea of a Metaverse seemed to indicate a “single” environment, with differing uses. Although the Metaverse Roadmap tends to talk about “integrating separate technologies”, the ultimate goal would be a single, unified virtual environment, even if with multiple purposes — from socialising to gaming, from research to teaching, from simulation to augmented reality. What was expected was a certain “convergence” of technologies towards that ultimate goal, not unlike what happened with the “online networks” that ultimately fused into what we broadly classify as the Internet today.

But just a year after the Roadmap was published, it’s quite clear that the industry doesn’t agree with that vision. Instead of a “unified metaverse”, they’re using the name ambiguously, as a fancier synonym for “virtual world”. The industry players wish to address a different target — games for teens — and are reluctant to accept a “unified” environment. Each company wishes to have its own virtual universe and aggressively compete with the others.

Now, we can safely assume that all those companies have had market analysts and specialists telling them exactly where to invest — clearly defined market targets where a profit is to be made. The Metaverse Roadmap — like so many similar initiatives — is much more of a “vision thing”: here is what we wish to do in the future, and here is how we can achieve it. The two approaches work at odds with each other, and the result, of course, is that they won’t agree on the outcome.

There is, of course, an exception. And this is where our focus should be.

Let’s keep the question “unanswered” for now. Some claimed there would be a “unified metaverse”; the industry thinks otherwise.

Controlled Content

If you have followed Giff Constable’s (COO of the Electric Sheep Company) very interesting and thought-provoking articles, you will sometimes find him complaining about the Second Life® platform. He has his reasons: the Sheep continue to be the largest company doing development in Second Life, but they seem to have given up on exclusive commitment to SL after the MTV fiasco.

Briefly put, MTV required much stronger control over their own environment than Linden Lab® was willing to provide. That’s a case where both sides are right and neither is wrong. Second Life’s technology is too open — but for certain kinds of projects, companies demand a closed environment.

Well, of course we know that you can have “hidden” and “secret” private islands. But it’s just not the same. You’ll still be at the mercy of LL’s upgrades and grid instability. Users will still need to go through LL to fix their problems — recovering passwords, inventory, or whatever else they need. The simulators will still be hosted on LL’s own grid servers. At least that was the case back in early 2007 — these days, companies like Dell run their islands on Dell hardware; and IBM, as we know, is experimenting with putting their own corporate sim servers behind firewalls.

Of course, the alternative is not OpenSim — it simply is not “industry-grade” yet. So corporations turn to other options. MTV picked one — a much simpler client to install, very low-resolution graphics (so everything runs faster), and absolute control over content and people.

Here we see the big issue for developers. As Richard Bartle and others argue at Terra Nova, developers are not happy about people claiming “rights” over their content. This attitude is common everywhere else (except in Second Life): although many companies and their developers ultimately allow users to upload some content, they view the notion of “users claiming ownership of content” as absurd. The older and more established the developer, the more likely they are to frown upon the idea.

Contrast this with the whole attitude of Second Life residents. We live in a virtual world with almost zero content provided by Linden Lab. The client is open source — and there are better alternatives to LL’s own client. Although LL’s simulator software is not yet open source, OpenSim provides a viable alternative today, even if not an industry-grade one. Linden Lab and residents join forces on projects like libsecondlife and the Architecture Working Group to try to make the whole technology as open as possible. Granted, this is not a peaceful process — everybody wishes that Linden Lab were faster in deploying user-contributed patches, ideas, suggestions, and projects for the “future of Second Life” — but at least it exists.

Blizzard developers are not going to sit down with World of Warcraft users and discuss how to redesign the game so that users can upload their maps and their own content to Blizzard’s “gaming platform”. That concept is totally absurd. People pay to get access to high-quality content developed by the company that owns the platform.

This was the lesson learned at VW’08 — nobody wants their users to have full control over the platform and its content. Basically, if you wish to have control over your intellectual property — create your own platform. Developers, as Prokofy Neva always reminds us, adore control. (In my country, this attitude is called the “bouncer syndrome”: club bouncers enjoy the illusion of “power” by denying access to whomever they wish — although they don’t own the club themselves.) And although one might disagree with Prokofy on that point, the truth is that the academic experts at Terra Nova tend to behave with alarming levels of paranoia, in an age when one platform creator — Linden Lab — has shown that there is nothing to “fear” from user-generated content.

In any case, the message is clear: the industry is totally abandoning user-generated content and moving into the realm of controlled environments. They want to make them “kid-safe” and explore the teenage market. Adult content — especially user-generated adult content — is not part of their “plans for the metaverse”.

“Better alone than in bad company”?

That’s actually a popular saying in Portugal, too. We now seem to have two major pushing forces in the industry, and a third, “hidden” one. The “hidden” force is, of course, the set of virtual worlds currently under research. Allegedly, all the big players, from IBM to Sun to Xerox, including Adobe and Microsoft, and probably more, have their own platforms, their own virtual worlds — in some cases, they have been working on them for years. They have not been sitting back and waiting, but actively pursuing the technology. So why don’t we see these technologies becoming products?

I’m sure that every research facility has its reasons — either the technology is not mature enough; or they haven’t been able to turn it into a “product”; or, more likely, their own market research doesn’t show a target audience for it. So they’re still experimenting. And, more curiously, they’re all (or almost all) experimenting with SL as well. IBM, of course, is at the forefront; Sun has 1,500 employees regularly logging in to SL and doing their research there; even Microsoft has not given up on SL and has recently done another product launch there. So they may all be researching and developing their own technologies, but they’re not abandoning SL yet. Sure, they complain about everything we do: too complex a UI, an insane “orientation island” experience, instability, too many bugs, lack of crucial features, and so on. But they’re using it — even if they research other technologies, too.

The major forces in the industry, however, have all suddenly jumped into the “closed content” environment. We might have had some hints from the likes of Sony (whose VW product, Sony Home, is delayed again — and after enlarging and critically examining their “snapshots”, I now believe they’re just Photoshopped images made to “look” like a VW…) — clearly the first to state that their product would only have user-generated content “in a future phase” (translation: never). But all the newcomers are now echoing Sony and targeting the teenage market with closed content, closed platforms, and isolated environments. They’ll go the way of Kaneva, IMVU, MOOVE, and so many others — sure, independent developers might get licenses to develop content for these new “TeenWorlds”, but they will be screened, they’ll have to pay for licenses, and they’ll need a high degree of competence, talent, and development experience to be part of these projects. Developing for Multiverse, for instance, is not for the faint of heart (as Ted Castronova quickly found out); the others might be easier or harder, but they’ll be aimed at professional development studios.

The “common user” — the legions of users of MySpace, Facebook, YouTube, and all the Web 2.0 user-generated 2D content environments — is out of this Metaverse.

Well, there is the third force in the industry — Second Life-compatible environments. These are the only ones betting on Neal Stephenson’s vision of the “Unified Metaverse”, but with a twist.

The InterGrid

Let’s forget Stephenson’s Metaverse for a while — a utopian technology which isn’t really going to happen, if we’re reading the signs correctly. Instead, we should scroll our history window back to the late 1980s, when dozens (hundreds!) of “online systems” competed ferociously among themselves to become the “dominant force” of the online universe. In the 1990s, however, it started to become clear — thanks to pushing from the academic and research environments, and aided by companies like IBM, Sun, Xerox, Microsoft, Adobe, and Apple (I hope this rings a bell…) — that there was a new player in town: the Internet. Not really “a” technology, but a set of protocols that allowed networks to exchange information among themselves. The Internet, after all, did not mandate that all networks use the same (internal) protocols — it just provided a way for interconnected networks to become a reality.

In effect, what this meant was that companies could still keep their legacy systems inside their firewalls, but communicate with others using a set of common protocols. By then, the Internet was hardly a “newcomer”; it had two decades of development behind it, and tens of thousands of “private networks” interconnected in some way. When it suddenly became clear that the only way to exchange information between these networks was through a common protocol, a revolution was born. A revolution that made things like CompuServe disappear almost overnight, and brought MSN and AOL onto the Internet incredibly quickly. All of a sudden, in less than ten years, old legacy communication systems were dropped, one by one, and an “Internet Protocol” infrastructure was deployed to replace them. Sun’s 25-year-old motto “The Network is the Computer” became reality very quickly, though perhaps not in the way Sun intended (to give them credit, a lot of the crucial protocols behind things we take for granted — like mounting remote drives — were invented or first developed by Sun).

The interesting bit about the Internet is how quickly it swallowed up everything. Except perhaps for mobile phones, even most landline communications these days run on top of the Internet’s protocols — the reverse of what happened in the mid-1990s. “Everything-over-IP” was a vision in the final years of the 20th century, but became commonplace in the first decade of the 21st. It was so quick, so fast, and so overwhelming that we hardly noticed it happening.

In effect, while in the mid-1990s people still used proprietary networks — and used “Internet gateways” to communicate with the rest of the world — this has completely disappeared today. Almost no one offers non-IP solutions any more. And if Steve Jobs handles it correctly, he’ll finish off proprietary mobile telecommunications too (no wonder iPhones work better with Wi-Fi — which runs IP over wireless radio — than with GSM…).

So where does this leave the Metaverse?

I believe that the notion of the “unified Metaverse” will not happen “soon”, but that something somewhat different will. First and foremost, we’ve got a slight time-synchronisation problem. LL was eight years old when, in 2007, the mainstream suddenly understood that “virtual worlds are the way to go”. Now 2008 is “the year of the virtual world startup”, in which everybody has suddenly understood that there is a new “wheel concept” and is hurrying to develop their own version of the “wheel”. That’s all very fine, but all very late-1980s-ish. In a year or two, what all these developers will find out is that they have managed to double or triple the market for virtual worlds (estimated at 50-60 million users these days), or probably even more — but they’ll have a huge problem on their hands:

Their virtual worlds will never interoperate. And their users will soon find that the novelty wears off after a few years — they’re avid content consumers, and will demand more and more. There are not that many Blizzards out there, able to launch massive amounts of new content to keep their users happy. The small start-ups have no chance of surviving that — they’ll be dealing with stability issues and fluidity of “gameplay” with hundreds of thousands of simultaneous users, and hitting the roadblocks that Linden Lab hit in late 2005/early 2006 (most MMORPGs deal with large numbers of simultaneous users by simply “sharding” their virtual worlds).

In the meantime, Linden Lab, the decade-old, sluggish, patient veterans of the Old Metaverse, will be paving the road towards something completely different. Not a “Second Life Über Alles” kind of environment — not even LL is that insane — but an SL-compatible environment, where all sorts of independent grids, from IBM to college campuses, from LL-licensed subgrids to OpenSim (and forked derivatives) grids, are interconnected. Each will be dominated by its respective control freak, watching those “boundaries” (the points where avatars with their user-generated content cross grids) very closely and carefully. But they will still interconnect. People will use the general-purpose SL client, or something like RealXtend, to connect to any of the SL-compatible grids; the choice will be on the users’ side. Certainly a lot of these will feature “closed content” and limit the types of content that can be brought across grids (the corporate grids, and probably the teen-oriented grids) — and others will probably allow all content to be copied, with no permission systems in place. The majority, however, will be very similar to what LL provides today with SL: an adequate (even if cumbersome) permission system; a global microcurrency unit; almost absolute freedom of user-generated content; a more-or-less permissive ToS, adapted to specific countries.

But, overall, in spite of using different technologies, hosted on different types of servers, using different concepts for the “central servers” (the asset servers and similar ones), and very likely with improved or just different user interfaces, they will share something in common:

They will be all compatible with each other.

Now this is the crucial difference between the “Metaverse as a unified environment” (i.e. ultimately everything runs under a single software platform) and what I call the InterGrid — interconnected grids, each handled separately with different software and technology, each with its own quirks, improvements, management, and look & feel, but all allowing users to enter the InterGrid at any point (of their choice) and cross grids (within limits) at will. This is made possible by the release of the Second Life Protocol to the public, and by the immense documentation work being done by the Architecture Working Group, which is trying to define the standards that allow grids to interoperate while still giving each grid the ability to retain control over “its” environment.
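To make the idea a little more concrete, here is a minimal sketch of the InterGrid concept: independently run grids sharing one crossing handshake, while each enforces its own boundary policy. Everything here — the `Grid` and `InterGrid` classes, the `admit` and `teleport` methods — is my own invention, purely for illustration; it bears no relation to the actual Second Life Protocol or to anything the Architecture Working Group has specified.

```python
# Hypothetical sketch only -- NOT the real Second Life Protocol.
# All names here are invented to illustrate the InterGrid idea.

class Grid:
    """One independently operated grid; each applies its own boundary policy."""
    def __init__(self, name, allows_foreign_content=True):
        self.name = name
        self.allows_foreign_content = allows_foreign_content

    def admit(self, avatar):
        # A closed grid (corporate or teen-oriented) may strip
        # user-generated content at the boundary instead of refusing entry.
        if not self.allows_foreign_content:
            avatar = {**avatar, "inventory": []}
        return {**avatar, "grid": self.name}


class InterGrid:
    """The registry of grids that speak the common protocol."""
    def __init__(self):
        self.grids = {}

    def register(self, grid):
        self.grids[grid.name] = grid

    def teleport(self, avatar, destination):
        # The shared handshake: any registered grid accepts the crossing,
        # but enforces its own rules on what comes through.
        return self.grids[destination].admit(avatar)


intergrid = InterGrid()
intergrid.register(Grid("open-grid"))
intergrid.register(Grid("corporate-grid", allows_foreign_content=False))

avatar = {"name": "Gwyn", "inventory": ["red dress"], "grid": "open-grid"}
crossed = intergrid.teleport(avatar, "corporate-grid")
# The avatar's identity crosses the boundary, but the closed grid
# emptied the inventory -- control stays with each grid.
```

The point of the sketch is precisely that the common part is tiny — the handshake — while everything interesting (content rules, permissions, currency) stays under each grid’s own control.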

I believe that this will come gradually, one step at a time, and without most people noticing what’s going on “under the hood”. We have IBM’s announcement that they’ve been working on getting sims behind firewalls; we have some OpenSim grid managers working on interconnecting their grids with LL’s own; and we know that LL has made it a priority. So, probably by 2010, we’ll have them all working together — still experimentally, still at a very rudimentary “alpha” level. But they’ll work. While, by 2010, all the start-ups that proudly announced they would forge ahead with “their” vision of the Metaverse will collectively go oops.

The Second Life Foundation

So what will Linden Lab’s role actually be in 2010? With Mark Kingdon at the helm, we can expect a different management style — one that will focus on “marketing a vision”. Mark will have a lot of problems dealing with robustness, scalability, and revamping millions of lines of code to make LL’s application solid and reliable — because all the 2008 start-ups will have all that (until, of course, they start attracting millions of users…). But once that huge development effort is deployed, Mark will need to find a place for Linden Lab as a dominant market player.

I think that what we’ll see is a better definition of what the “Second Life Grid” is going to be. Right now, Linden Lab’s website just talks about the grid-as-we-know-it, but in language addressed to a business audience. That’s not enough. They forget to explain that, for LL, there is no difference between “businesses”, users (consumers of content), content creators, and “metaverse engineers” (the ones developing the InterGrid) — all are placed inside the same virtual environment without interference from Linden Lab. That’s all very nice, but very likely not the business model that LL wishes to keep presenting.

What makes much more sense is to neatly split the company into two major areas. One would fully embrace the foundations of the InterGrid — and what would be more appropriate for that than creating a Second Life Foundation?

This is just going the route of so many open source projects — from Apache to Mozilla — where, at some point, the company (or the university, or the group of geeks) who developed the technology places all the source code, names, logos, and associated intellectual property rights into a non-profit foundation, and becomes its largest endorser — making sure they get a place on the board.

Obviously, LL would be the biggest and most important sponsor of the Second Life Foundation — but IBM and Sun would most certainly wish to have a place there too, and I’m also pretty sure that AOL/Warner Brothers, Xerox, Microsoft, NBC, and several other technology/content producers would like to be part of it as well. And almost all the universities currently doing research in SL.

The major development work on the Second Life protocol would be done by the Second Life Foundation. It would be in charge of the specifications, and would “absorb” the Architecture Working Group — both the Linden employees and the hordes of volunteers. It would also organise the libsecondlife group and “absorb” the OpenSim group — and LL would probably hand over development of the “SL Viewer” to the Foundation as well.

In essence, the Second Life Foundation would hold the code, the protocol, and the rights over what “Second Life” means (yes, solving the nasty trademark issues once and for all). Linden Lab would remain in control as the biggest sponsor, but they wouldn’t be the only active voice defining the roadmaps and timelines for all future SL development.

Under this model, it’s also obvious that things like the “mainland” would have to be completely rethought. It’s clear that LL has no resources to “run the mainland”, imposing rules and order and acting fairly as judge and overseer. They’re simply stretched too thin for that.

Very likely, they would enter the corporate market — basically, providing system administration services for paying customers. They have the expertise and the know-how to run a grid with tens of thousands of simulators that can hold almost a hundred thousand simultaneous users. And this would be their sales pitch: if you want reliability, hire your sims from us. If you want to have fun and experiment with the SL toolkit, get a cheap sim from one of the OpenSim grids. Oh, and you can bring your avatar to our corporate grid too.

This would effectively disrupt the whole in-world land market — and it means I’ll get a lot of angry comments from the landbarons on this article — unless, of course, the landbarons learn the lesson. In 1990, people developed content for CompuServe and AOL. When the Internet was rolled out, they faced competition from thousands upon thousands of small companies that were setting up their web servers for the first time, at a fraction of the cost of a license to produce content for the “online services”. Well, the smart ones brought their know-how and expertise to running websites, too. Anshe Chung is a much better community manager than LL — she can run her own grid, with her own software and her own system administrators, and aggressively compete with much lower prices (if she no longer needs to buy sims from Linden Lab).

So we’ll start to see much heavier competition, across prices and services. Poor universities looking for cheap, unreliable sims will probably just run their own servers inside their campuses — but their students will still be able to go shopping on sims hosted by LL. Community managers (“landbarons”) will have the choice of competing either on price or on services — and depending on their choice, either rent cheap sims from an OpenSim grid, run their own grids (if they can get hold of enough system administrators), or compete on services by renting sims from LL. Companies like IBM (and perhaps Sun) will run their own grids for their own customers, and compete with LL not on price but on services — LL has a head start, of course, but IBM has 60 or more years of experience running large and complex computational resources, and will certainly catch up quickly. (In effect, instead of believing that LL might go public, I think it’s much more realistic to assume that a fair share of LL might be bought by IBM.) In either case, what we now call the “OpenSim Grids” will soon become “SL-compatible hosting providers”.

What does this mean for LL’s revenue model? They’ll still be rolling out industry-grade sims; they’ll be able to charge interconnection fees; and they’ll leverage their position in the Second Life Foundation to let Philip put his “stamp of approval” on the future development of the SL protocol. Philip will, in effect, play the role of Tim Berners-Lee at the W3C — a reference for future work, but not the only one having his say. In fact, away from his role as LL’s CEO, Philip is now free to plan the Second Life Foundation. Mitch Kapor will certainly know how to help him with this transition — and we have seen how deeply Kapor has recently been involved in LL’s operations, to the point that he was once labelled “acting CEO” of LL. (In fact, Philip and Kapor will probably be pushing the innovation aspects of Second Life-related technologies — while Linden Lab runs the servers, and the Second Life Foundation takes good care of the InterGrid communication protocols.)

Diverging paths, common futures?

In conclusion, I really thought that 2007 was the “turning point” for Second Life — although, to be honest, in a completely different way. What actually happened was that 2007 brought the concept of “virtual worlds” into the mainstream — but not, as I had expected, “Second Life as the Metaverse”. In 2008, we’re watching the first “schism” among virtual world evangelists: the ones betting on “closed content”, and Second Life as the only one (but by far the one with the largest user base) going ahead with user-generated content and grid interconnection. We’ll now see a plethora of smaller virtual worlds, all targeted at specific audiences, but only one focusing on the 30-50 year olds who run companies. We’ll see all sorts of companies, from small start-ups with too much funding to huge entertainment monsters like Sony (and, who knows, Internet giants like Google and Yahoo), launching “their” vision of a controlled-content metaverse — and we’ll see Linden Lab and their “satellites” growing the InterGrid. If history repeats itself, by 2010 we’ll have a huge industry conglomerate around the Second Life Foundation deploying a seamless (but independently owned) InterGrid with 20-30 million users, while perhaps another 20-30 million will be spread across hundreds if not thousands of small virtual worlds, suddenly demanding to be interconnected too. By 2015, however, the battle will be for the “winning strategy”. Technology has this strange, market-driven tendency to become standardised — a lesson the industry learned a century ago, which is why we can drive our cars on any type of gasoline produced anywhere in the world. It’s also the lesson that made Macs interoperate with Windows-based or Linux-based PCs, in spite of Steve Jobs’ efforts to remain as incompatible as possible for as long as he could.

Once the “many small metaverses” model starts to hit roadblocks — they won’t converge, but diverge, as new creative applications demand that their models change further and remain incompatible — there will be just one “unifying force”: an industry-sponsored foundation, backed by giants like IBM, Sun, Microsoft, and who knows how many others, that will be the only one addressing the fundamental issue of interconnecting grids. It seems clear now that nobody else will be doing this.

The question remains whether Linden Lab is, indeed, willing to take this bold step. I think they’ll delay it as long as possible, but ultimately their choices will be much more limited. In another two years, OpenSim will be at the level of technological advancement that Second Life had reached in 2006 — good enough for corporations to use routinely. It will also feature grid interconnection from the very start. Linden Lab has the power to decide now whether they will be part of the InterGrid or not. All the signals they’re giving — the release of the client code as open source; the support for the libsecondlife and OpenSim initiatives; the Architecture Working Group — seem to indicate that this is what they want to pursue. Still, the biggest step is to “go W3C” on the SL Protocol. It remains to be seen whether Mark Kingdon is a Philip-class or Mitch-class visionary and understands that this is the only choice they’ve got — even if not yet for 2010.

But I would certainly use the opportunity of having Philip as a “wildcard” at Linden Lab right now — unburdened by the role of CEO — to begin developing the Second Life Foundation immediately and funnel the required funding into it, while the media is still asleep and confused by the “multiple metaverses” that were the conclusion of VW’08.

About Gwyneth Llewelyn

I'm just a virtual girl in a virtual world...


  • Well, I know you read this article of mine at the time, which responded to your effort to calculate LL’s revenue, but you aren’t crediting me with this idea now ROFL. Oh well, here are the relevant paragraphs from my August 2007 blog:

    “My conviction is that Linden Lab, something like Mozilla rather than Google, is in the Better World business, not business business. People like Pierre Omidyar or Philip Rosedale, who are already as rich as God, don’t need to make that much money and don’t care if they are looking at a $5 or $6 million a month cash burn machine with a huge propensity for liability to litigation. It’s not that they’re going to put in their own money — that’s not likely. But they know that they have sure-fire recipes for taking people’s desire to socialize, buy, sell, have sex, and be entertained and convert it into things like E-bay or RealAudio or for that matter Lotus 1-2-3 that can make lots of money for them, be sold, and free them up to do what might be called philanthropic pursuits, but is really something much more widespread. If it goes sour and the Gambling Commission of the State of California shuts them down, they can re-open merely as an education or business meeting grid, having shaken loose all the load-testers who helped them make the software but then…*got in the way* with the sex and clubs.

    They want to convert hearts and minds. Do you convert them with Mozilla, which pretty much everybody likes and downloads without too much criticism, or with Google, which makes money, everybody is forced to use, but increasingly resents? Of course they’d rather be in the business of developing not only software, but a way of life — the Tao of Linden, the Love Machine, the JIRA, the edu-sims, the creepy sycophantic volunteers system — these are as important to them to sell as mind-memes as sims are for people’s cyber nests; indeed, their genius is merely figuring out how to get the latter to keep funding the development of the former, which is all they need to do — break even, or make even a little less, which can be made up with foundation and government grants.

    When the Lindens IPO and/or open source, they’re likely to have already prepared for themselves a 501-c-3 type of not-for-profit entity that will enable them to take the sales proceeds and go on doing all the thinking and coding and influencing they like, and promoting various showcase projects in the third world. They can set themselves up to be the ICANN of the Metaverse and work on standards and cross-platform protocols and all kinds of stuff. They may go on providing services, and be the sort of Associated Press of worlds (the AP isn’t a profit-making corporation, but a not-for-profit service that all newspapers subscribe to — it’s a cooperative that makes it possible for local newspapers not to maintain foreign or capital news bureaus, but use the news feed from this organization that does — AP isn’t an independent media corporation itself making a profit from its fees for the use of its stories and pictures.) Becoming the AP-like news/views/ideology cooperative of the Metaverse providing hook-up or sim prototyping services is an extremely attractive role for Linden Lab.

    You never see Lindens use the word “Metaverse”. They never talk about the “Metaverse”. It’s not in their vocabulary, if you study their words carefully. This may be due to them planning to *becoming* the Metaverse and making sure it then becomes called Lindenworld or Lindenverse, or they may have some other term of art they’d like to use. By the time something called “the Metaverse” is a household word with every home having a $37/month subscription to it like cable, the Lindens will be in every warp and woof, having influenced every aspect of it — though of course in fierce competition with Sony, Blizzard, etc.”

    BTW, the “term of art” they use and are trademarking is “Second Life Grid”.

    Also BTW my predictions for 2007 included that the Lindens would make a 501-c-3 non-profit organization:

    Prokofy Neva

  • jcm

    Hi, Gwyneth.

    I’m not sure how to reach you so here goes. I’m a Ph.D. student at UO. My research area involves TV fans and virtual worlds. I’m conducting a survey that asks about the CSI: New York and Second Life experiment.

    I was hoping to get your permission to post the survey on your blog. I know many of your readers watched the show and took part in the experiment.

    Would you please let me know if you will allow such a survey on your blog?

    I appreciate your time.

  • Pingback: » Corporate Stupidity Or What? - Living in the Metaverse

  • Prokofy, I stand corrected 🙂 I didn’t remember to re-read your article on that; in fact, I was thinking more of your more recent “Cultural Revolution” article.

    However, the “meme” of LL starting/joining a Foundation has been around the SLogosphere for quite some time. The oldest article I wrote about it was in November 2005, and although it talks about “crowdsourcing marketing”, it mentions Mitch Kapor as someone who could point LL towards a “foundation” as part of LL’s efforts.

    More recently, a week before that “math article” came out on the Herald, I took some time to write about the Open Source Grid (Open Source Second Life — The Geeks Strike Back) and their creators’ wish to establish a foundation for it. The difference perhaps was that it would be a non-LL-founded foundation (sorry about the pleonasm), but I’m sure that if Gareth Ellison goes ahead and turns his not-for-profit into a foundation, this would very likely get LL’s sponsorship as well.

    jcm, I’d be pleased to run your survey if you wish, but please ask permission from either the CSI:NY team or from the Electric Sheep Company first. My email address is [email protected].

  • As ever, an intriguing article. There is, of course, a great deal of uncertainty around the future of virtual worlds: the use cases for virtual worlds are not presently as clear as they are for the sharing of text, images and videos, and the prophecy here relies on the generic use cases (as opposed to the individual, sectioned use cases of specific games, as are presently trendy) being very important. It is hard to predict whether they will reach this level of importance: not everything prophesied to be vastly popular has become so.

    What is clear, however, is that the multiplicity of micro-worlds is likely to have very limited long-term appeal: I rather doubt that more than a handful of those being started now will be around in five years’ time.

  • What’s the GOAL of connecting to a grid which is renowned for its instability, limitations and poor structure in general? What would be the value of this ‘grid interoperability’?

    I find the notion of placing LL at the center of this VW development a really far stretch of the imagination. The idea that SL would create any of the standards that could be developed, such as:
    * Asset Server Protocols to artificially maintain value of 3D content
    * Multiple client/browser/server support
    * 3D standards to ‘import and export’ your 3D content cross world such as a high quality VRML
    * Data portability to create cross-world avatars (or one AgAv – see the Prometeus scenario)
    based on its prim-based (ug)content (which is of no value within the projection you created; see Blue Mars) and its half a million active users, is completely unreal. I think it’s far more likely SL will have to adjust to these standards or become obsolete itself; designing and maintaining them is well beyond its capabilities, realistically speaking.

    Sure, IBM is in/working with Second Life, but as they say time and time again, they are everywhere — which is exactly where you should be if you want to create interoperability, because there already IS an increasingly interoperable datastream connecting these worlds, called “The Internet”, and it is evolving entirely without the need for a Second Life, Foundation or no foundation.

    This kind of object-oriented thinking, where you can take your Second Life avatar from place to place on a single grid, only appeals to a very, very small niche — while goal-oriented design (what do we actually WANT from VWs?) will show you interoperability is far from a “must”: its uses are not only fairly limited, they are largely captured in 2D data flexibility and the future ability of VWs to tap into this data.

  • Patience, young padawan, patience! Your timelines are awfully optimistic: Second Life has a hell of a long way to go before it’s the centre of its own universe, never mind someone else’s.

    People need to respect the simple concept of “if it’s not broke, don’t fix it”. The internet works, and it works well. VHS and the floppy disk also worked well, and even now, in their death throes, they’re still putting up a fight.

    The protocols exist to allow these disparate worlds to operate on existing technology; there’s no rush to change that, and it took a long, long time for the internet to arrive in a form that found mass consumer confidence. Along the way there were twists and turns, and claims that Tim Berners-Lee wasn’t exactly impressed with the development of the GIF; that’s quite key, because at this very early stage it’s highly unlikely that what we now envisage will be the reality.

    As for land barons, they’re not going anywhere. People seem to be under the misguided impression that cheap web hosting means server hosting is cheap, but compare the bandwidth and resources of an average website with what is required to run a virtual world, and reality bites. OpenSim is cheaper than Second Life, but it has no economy, they’re only just getting around to transferring assets, and there are a lot fewer users. Eventually bills have to be paid, and the land barons will be presenting their bills as they do now. Who those land barons are is the issue; I’m sure companies like GoDaddy are paying attention.

    However I do get the impression that Linden Lab would be happier with Anshe Chung managing the land business than they themselves.

    Second Life will be long remembered no matter what the future holds, because it will always be remembered as a pioneering project; but as you point out, pioneers are oft swallowed up in the tidal wave, and Linden Lab need to position themselves sensibly. Whether that’s via a foundation I’m not at all sure; I’m not convinced they’re big enough, but Sun, Novell or someone like that might be able to fill the spaces.

  • Pingback: InterGrid en de Second Life Foundation « Virtualisatie

  • jiha

    good write-up…

    but sadly only a complete repeat of the “metaverse” of 1995-7

    all it shows is that history has become a dirty word in the techbanker-driven bubbles of the late 20th and now early 21st century.

    and that virtual world technologies, when combined with the horrible myth of Googles, will not be what the gullible want.

    but what you deserve.

  • Pingback: Bart in SL

  • Pingback: UgoTrade » Blog Archive » Interview with Mitch Kapor

  • ina

    This seems a nice, concise overview of a potential future, but (if I’m not skimming too fast and missing a major point…) I don’t see the point of having an intergrid where people can’t directly TP without having to “log off.” It seems they would turn into a totally different avatar, due to asset server differences, even if the logoff screen were faked as a TP screen. I mean, sure, using the same software for multiple VWs is a convenience, but then again it isn’t an intergrid, imho.

  • Ina, I’m pretty sure I didn’t intend to imply that people would need to “log off” to enter a different (but interconnected) grid. Take a look at Ugotrade’s article, which gives an example of how a distributed grid — with separate asset servers — can work (now, not in the future!) without the requirement of logging out of one grid and entering the next.

    As a matter of fact, the realXtend project is something like what a mix of Gravatar + OpenID is for the 2D Web: use any entry point to log in, and bring your avatar and your inventory with you, no matter which grid you’re connecting to.
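    To make that idea concrete, here is a purely hypothetical sketch of my own — none of these class or method names come from realXtend’s actual protocol. The point is only the shape of the architecture: the grid a user connects to holds no account data at all; it authenticates against the user’s home identity server and fetches the avatar and inventory from there, so the same identity works at every entry point.

```python
from dataclasses import dataclass, field


@dataclass
class HomeServer:
    """The user's identity and asset authority (the OpenID/Gravatar analogue)."""
    users: dict = field(default_factory=dict)  # name -> (password, avatar, inventory)

    def register(self, name, password, avatar, inventory):
        self.users[name] = (password, avatar, inventory)

    def authenticate(self, name, password):
        record = self.users.get(name)
        return record is not None and record[0] == password

    def fetch_profile(self, name):
        # The home server, not the grid, is the source of truth for the avatar.
        _, avatar, inventory = self.users[name]
        return avatar, list(inventory)


@dataclass
class Grid:
    """Any world the user walks into; it stores no accounts of its own."""
    name: str

    def login(self, home, user, password):
        # Delegate authentication to the user's home server...
        if not home.authenticate(user, password):
            raise PermissionError("home server rejected credentials")
        # ...then pull the avatar and inventory from the same place.
        avatar, inventory = home.fetch_profile(user)
        return {"grid": self.name, "user": user,
                "avatar": avatar, "inventory": inventory}


# The same credentials work on every grid, and the avatar travels along:
home = HomeServer()
home.register("gwyn", "s3cret", avatar="red-haired elf",
              inventory=["hat", "notebook"])

session_a = Grid("GridA").login(home, "gwyn", "s3cret")
session_b = Grid("GridB").login(home, "gwyn", "s3cret")
assert session_a["avatar"] == session_b["avatar"]  # same look on both grids
```

    That last assertion is exactly what Ina was worried about: because both grids resolve the avatar from the same home server, there is no “different avatar per grid” problem, and no logging out in between.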