The InterGrid and the Second Life Foundation

When the Metaverse Roadmap was released last year, people were excited. For the first time in history, several different technologies were planned out for the next 10–20 years, and their convergence — desired or undesired — was laid out and discussed openly: surveys were made, presentations were given, and a lot of documentation was produced. The Metaverse Roadmap is not a “prophet’s tool”. It gave directions and guidelines; it tried to “define” what people’s expectations of a “metaverse” should look like, and how to slowly proceed towards implementing it. Although the Roadmap could be, and was, criticised — for instance, it relied on people’s participation in surveys and extracted information from existing technologies, but it didn’t plan to implement anything — it was better than the alternative: having no idea at all of what a “metaverse” should look like.

During Virtual Worlds 2008, the Metaverse suddenly went through an “identity crisis”, as Hiro Pendragon so aptly named it. Put another way: apparently, the industry is not aligned on what the “Metaverse” is supposed to be. It has forked and gone down different roads.

One or many Metaverses?

Neal Stephenson’s original idea of a Metaverse seemed to indicate a “single” environment, with differing uses. Although the Metaverse Roadmap tends to talk about “integrating separate technologies”, the ultimate goal would be a single, unified virtual environment, even if with multiple purposes — from socialising to gaming, from research to teaching, from simulation to augmented reality. What was expected was a certain “convergence” of technologies towards that ultimate goal, not unlike what happened with the “online networks” that ultimately fused into what we broadly classify as the Internet today.

But just a year after the Roadmap was published, it’s quite clear that the industry doesn’t agree with that vision. Instead of a “unified metaverse”, they’re using the name loosely as a fancier synonym for “virtual world”. The industry players wish to address a different target — games for teens — and are reluctant to accept a “unified” environment. Each company wishes to have its own virtual universe and compete aggressively with the others.

Now, we can safely assume that all those companies have had market analysts and specialists telling them exactly where to invest — clearly defining market targets where a profit is to be made. The Metaverse Roadmap — like so many other similar initiatives — is much more a “vision thing”: here is what we wish to do in the future, here is how we can achieve it. The two approaches can work at odds with each other, and the result, of course, is that they won’t reach the same conclusions.

There is, of course, an exception. And this is where our focus should be.

Let’s keep the question “unanswered” for now. Some claimed there would be a “unified metaverse”; the industry thinks otherwise.

Controlled Content

If you have followed the very interesting and thought-provoking articles by Giff Constable (COO of the Electric Sheep Company), you will sometimes find him complaining about the Second Life® platform. He has his own reasons for complaining; the Sheep continue to be the largest company doing development in Second Life, but they seem to have given up on an exclusive commitment to SL after the MTV fiasco.

Briefly put, MTV required much stronger control over their own environment than Linden Lab® was willing to provide them. That’s a case where both sides are right and neither is wrong. Second Life’s technology is too open — but for certain kinds of projects, companies demand a closed environment.

Well, of course we know that you can have “hidden” and “secret” private islands. But it’s just not the same. You’ll still be at the mercy of LL’s upgrades and grid instability. Users will still need to go through LL to fix their problems — recovering passwords, inventory, or whatever they need. The simulators will still be hosted on LL’s own grid servers. At least that was the case back in early 2007 — these days, companies like Dell run their islands on Dell hardware; and IBM, as we know, is experimenting with putting their own corporate sim servers behind firewalls.

Of course, the alternative is not OpenSim — it simply is not “industry-grade” yet. So corporations turn elsewhere. MTV picked There.com — a much simpler client to install, very low-resolution graphics (so everything is faster), and absolute control over content and people.

Here we come to the big issue for developers. As Richard Bartle and others argue at Terra Nova, developers are not happy about people claiming “rights” over their content. You’ll see that this is common everywhere else (except for Second Life): although many companies and their developers ultimately allow users to upload some content, they view the notion of “users claiming ownership of content” as absurd. The older and more established the developer is, the more likely they are to frown upon this idea.

Contrast this with the whole attitude of Second Life residents. We live in a virtual world with almost zero content provided by Linden Lab. The client is open source — and there are better alternatives to LL’s own client. Although LL’s simulator software is not yet open source, OpenSim provides a viable alternative today, even if not an industry-grade one. Linden Lab and residents join forces on projects like libsecondlife and the Architecture Working Group to try to make the whole technology as open as possible. Granted, this is not a peaceful process — everybody wishes that Linden Lab were faster in deploying user-contributed patches, ideas, suggestions, and projects for the “future of Second Life” — but at least the cooperation exists.

Blizzard developers are not going to sit down with World of Warcraft users and discuss how to redesign the game so that users can upload their maps and their own content to Blizzard’s “gaming platform”. That concept is totally absurd. People pay to get access to high-quality content developed by the company that owns the platform.

This was the lesson learned at VW’08 — nobody wants their users to have full control over the platform and its content. Basically, if you wish to have control over your intellectual property — create your own platform. Developers, as Prokofy Neva always reminds us, adore control. (In my country, this attitude is called the “bouncer syndrome”: club bouncers enjoy the illusion of “power” by denying access to the club to whomever they wish — although they don’t own the club themselves.) And although one might disagree with Prokofy when he reminds us of that, the truth is that the academic experts on Terra Nova tend to display alarming levels of paranoia, even in an age where one platform creator — Linden Lab — has shown that there is nothing to “fear” from user-generated content.

In any case, the message is clear: the industry is totally abandoning user-generated content and moving into the realm of controlled environments. They want to make them “kid-safe” and explore the teenager market. Adult content — especially user-generated adult content — is not part of their “plans for the metaverse”.

“Better alone than in bad company”?

That’s actually a popular saying in Portugal, too. We now seem to have two major driving forces in the industry, and a third “hidden” one. The “hidden” force is, of course, the virtual worlds currently under research. Allegedly, all the big players, from IBM to Sun to Xerox, including Adobe and Microsoft, and probably even more, have their own platforms, their own virtual worlds — in some cases, they have been working on them for years. They have not been sitting with their arms crossed, waiting; they have been actively pursuing the technology. So why don’t we see these technologies becoming products?

I’m sure that every research facility has a reason for that — either the technology is not mature enough; or they haven’t been able to create a “product” out of it; or, more likely, their own market research doesn’t show a target audience for the technology. So they’re still experimenting. And, more curiously, they’re all (or almost all) experimenting with SL as well. IBM, of course, is at the forefront; Sun has 1,500 employees regularly logging in to SL to do their research there; even Microsoft has not given up on SL and has recently done another product launch in-world. So they may all be researching and developing their own technologies, but they’re not abandoning SL yet. Sure, they complain about everything we do: an overly complex UI, an insane “orientation island” experience, instability, too many bugs, lack of crucial features, and so on. But they’re using it — even if they research other technologies, too.

The major forces in the industry, however, have all suddenly jumped into the “closed content” environment. We might have had some hints from the likes of Sony (whose VW product, PlayStation Home, has been delayed again, and after taking a critical look at their enlarged “snapshots”, I now believe they’re just Photoshopped images made to “look” like a VW…) — clearly the first to state that their product would only have user-generated content “in a future phase” (translation: never). But all the newcomers are now echoing Sony and targeting the teenager market with closed content, closed platforms, and isolated environments. They’ll go the way of Kaneva, IMVU, MOOVE, There.com, and so many others — sure, independent developers might get licenses to develop content for these new “TeenWorlds”, but they will be screened, they’ll have to pay for licenses, and they’ll need a high degree of competence, talent, and development experience to be part of these projects. Developing for Multiverse, for instance, is not for the faint of heart (as Ted Castronova quickly found out); the others might be easier or harder, but they’ll be aimed at professional development studios.

The “common users” — the legions of users of MySpace, Facebook, YouTube, and all the Web 2.0 user-generated 2D content environments — are left out of this Metaverse.

Well, there is the third force in the industry — Second Life-compatible environments. These are the only ones betting on Neal Stephenson’s vision of the “Unified Metaverse”, but with a twist.

The InterGrid

Let’s forget Stephenson’s Metaverse for a while — a utopian technology which isn’t really going to happen, if we’re reading the signs correctly. Instead, we should scroll our history window back to the late 1980s, when dozens (hundreds!) of “online systems” ferociously competed among themselves to become the “dominant force” in the online universe. In the 1990s, however, it started to become clear — thanks to the push from the academic and research environments, and aided by companies like IBM, Sun, Xerox, Microsoft, Adobe, Apple (I hope this does ring a bell…) — that there was a new player in town: the Internet, not really “a” technology, but a set of protocols that allowed networks to exchange information among themselves. The Internet, after all, did not mandate that all networks use the same (internal) protocols — it just provided a way for interconnected networks to become a reality.

In effect, what this meant was that companies could still keep their legacy systems inside their firewalls, but communicate with others using a set of common protocols. By then, the Internet was hardly a “newcomer”; it had two decades of development behind it, and tens of thousands of “private networks” interconnected in some way. When it suddenly became clear that the only way to exchange information between these networks was to use a common protocol, a revolution was born. A revolution that made things like CompuServe disappear almost overnight, and brought MSN and AOL onto the Internet incredibly quickly. All of a sudden, in less than ten years, old legacy communication systems were dropped, one by one, and an “Internet Protocol” infrastructure was deployed to replace them. Sun’s 25-year-old motto “The Network is the Computer” became reality very quickly, and perhaps not in the way Sun intended (to give them credit, a lot of the crucial protocols behind things we take for granted — like mounting remote drives — were invented or developed by Sun first).

The interesting bit about the Internet is how quickly it swallowed up everything. Except perhaps for mobile phones, even most landline communications these days work on top of the Internet’s protocols — which is the reverse of what happened in the mid-1990s. “Everything-over-IP” was a vision in the final years of the 20th century, but it became commonplace in the first decade of the 21st. It was so quick, so fast, and so overwhelming that we hardly noticed when it happened.

In effect, while in the mid-1990s people still used proprietary networks — and used “Internet gateways” to communicate with the rest of the world — this has completely disappeared today. Almost no one is offering non-IP solutions any more. And if Steve Jobs handles it correctly, he’ll finish off proprietary mobile telecommunications too (no wonder that iPhones work better with Wi-Fi — which uses IP over wireless radio — than with GSM…).

So where does this leave the Metaverse?

I believe that the notion of the “unified Metaverse” will not happen “soon”, but something somewhat different will happen instead. First and foremost, we’ve got a slight time synchronisation problem. LL was eight years old when, in 2007, the mainstream suddenly understood that “virtual worlds are the way to go”. However, 2008 is “the year of the virtual world start-up”, in which everybody has suddenly understood that there is a new “wheel concept” and is hurrying to develop their own version of the “wheel”. That’s all very fine, but all very late-1980s-ish. In one or two years, what all these developers will find out is that they have managed to double or triple the market for virtual worlds (estimated at 50–60 million users these days), or probably even more, but they’ll have a huge problem on their hands:

Their virtual worlds will never interoperate. And their users will soon find out that the novelty wears off after a few years — they’re avid content consumers, and will demand more and more. There are not that many Blizzards out there, able to launch massive amounts of new content to keep their users happy. The small start-ups have little chance of surviving — they’ll be dealing with stability issues and the fluidity of “gameplay” with hundreds of thousands of simultaneous users, and hitting the roadblocks that Linden Lab hit in late 2005/early 2006 (most MMORPGs deal with the number of simultaneous users by simply “sharding” their virtual worlds).
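
To make that parenthetical concrete, here is a minimal, purely illustrative Python sketch of the difference between a “sharded” world — which copes with load by cloning itself into isolated copies — and a single contiguous grid like SL, which spreads one shared world across regions. None of these class names or capacity numbers come from any real platform; they are assumptions made up for illustration.

```python
# Illustrative sketch only: "sharding" vs. one shared grid. All names hypothetical.

class ShardedWorld:
    """Classic MMORPG approach: many isolated copies ("shards") of the same world."""
    def __init__(self, shard_capacity=2000):
        self.shard_capacity = shard_capacity
        self.shards = [[]]  # each shard holds its own, mutually invisible, player list

    def log_in(self, player):
        # Fill the first shard with room; open a brand-new copy when all are full.
        for index, shard in enumerate(self.shards):
            if len(shard) < self.shard_capacity:
                shard.append(player)
                return index
        self.shards.append([player])
        return len(self.shards) - 1  # players on different shards never meet


class ContiguousGrid:
    """Second Life-style approach: one shared world, split into regions ("sims")."""
    def __init__(self):
        self.regions = {}  # (x, y) grid coordinates -> players currently there

    def log_in(self, player, coords):
        # Everybody shares the same world; load is spread by *where* people go,
        # not by cloning the world -- which is why a crowded region hurts.
        self.regions.setdefault(coords, []).append(player)


if __name__ == "__main__":
    wow_like = ShardedWorld(shard_capacity=2)
    print([wow_like.log_in(p) for p in ("a", "b", "c")])  # third player lands on shard 1

    sl_like = ContiguousGrid()
    for p in ("a", "b", "c"):
        sl_like.log_in(p, (1000, 1000))  # all three share one region of one world
```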

In the meantime, Linden Lab, the decade-old, sluggish, patient veterans of the Old Metaverse, will be paving the road towards something completely different. Not a “Second Life Über Alles” kind of environment — not even LL is that insane — but an SL-compatible environment, where all sorts of independent grids, from IBM to college campuses, from LL-licensed subgrids to OpenSim (and forked derivatives) grids, are interconnected. Each will be dominated by its respective control freaks, watching those “boundaries” (the points where avatars with their user-generated content cross grids) very closely and carefully. But they will still interconnect. People will use the general-purpose SL client or something like RealXtend from OpenLifeGrid.com to connect to any of the SL-compatible grids; the choice will be on the users’ side. Certainly a lot of those will feature “closed content” and limit the types of content that can be brought across grids (like the corporate grids, and probably the teen-oriented grids) — and others will probably allow all content to be copied and have no permission systems in place. The majority, however, will be very similar to what LL provides today with SL: an adequate (even if cumbersome) permission system; a global microcurrency unit; almost absolute freedom of user-generated content; a more-or-less permissive ToS, adapted to specific countries.

But, overall, in spite of using different technologies, being hosted on different types of servers, using different concepts for the “central servers” (the asset servers and similar ones), and very likely having improved or just different user interfaces, they will have something in common:

They will all be compatible with each other.

Now this is the crucial difference between the “Metaverse as a unified environment” (i.e. ultimately everything works under a single software platform) and what I call the InterGrid — interconnected grids, each handled separately with different software and technology, and with its own quirks, improvements, management, and look & feel, but all allowing users to enter the InterGrid at any point (of their choice) and cross grids (within limits) at will. This can be made possible thanks to the release of the Second Life Protocol to the public, and the immense documentation work being done by the Architecture Working Group, which is trying to define the standards that allow grids to interoperate, while still giving each grid the ability to retain control over “its” environment.
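
To give a feeling for what “crossing grids within limits” might mean in practice, here is a deliberately simplified Python sketch of an InterGrid-style handoff. Nothing here comes from the actual Second Life Protocol or from the Architecture Working Group’s documents — the grid names, fields, and checks are all hypothetical. The point it illustrates is simply that the destination grid, not the origin, decides which of the avatar’s content it will accept at the boundary.

```python
# Hypothetical illustration of an inter-grid handoff; not the real SL Protocol.

from dataclasses import dataclass, field

@dataclass
class Item:
    name: str
    allow_export: bool  # creator-set permission: may this item leave its home grid?

@dataclass
class Grid:
    name: str
    accepts_foreign_content: bool  # a "closed content" grid would set this to False
    residents: dict = field(default_factory=dict)

    def admit(self, avatar_name, inventory):
        # Each grid remains the "control freak" of its own boundary:
        # it admits the avatar, but filters the content that crosses with it.
        allowed = []
        if self.accepts_foreign_content:
            allowed = [item for item in inventory if item.allow_export]
        self.residents[avatar_name] = allowed
        return allowed

def teleport(avatar_name, inventory, origin: Grid, destination: Grid):
    """Move an avatar between two interconnected grids, within each grid's limits."""
    origin.residents.pop(avatar_name, None)
    return destination.admit(avatar_name, inventory)

if __name__ == "__main__":
    home = Grid("ResidentialGrid", accepts_foreign_content=True)
    corp = Grid("CorporateGrid", accepts_foreign_content=True)
    inv = [Item("freebie shirt", allow_export=True),
           Item("no-transfer gadget", allow_export=False)]
    home.residents["Resident One"] = inv
    kept = teleport("Resident One", inv, home, corp)
    print([item.name for item in kept])  # only the exportable item crosses the boundary
```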

I believe that this will come gradually, one step at a time, and without most people noticing what’s going on “under the hood”. We have had IBM’s announcement that they’ve been working on getting sims behind firewalls; we have some OpenSim grid managers working on interconnecting their grids with LL’s own, and we know that LL has made it a priority. So, probably by 2010, we’ll get them all working together, still experimentally, still at a very rudimentary “alpha” level. But they’ll work. Meanwhile, by 2010, all the start-ups that proudly announced that they’ll continue to forge ahead with “their” vision of the Metaverse will collectively go “oops”.

The Second Life Foundation

So what will Linden Lab’s role actually be in 2010? With Mark Kingdon at the helm, we can expect a different management style — one that will focus on “marketing a vision”. Mark will have a lot of problems dealing with robustness, scalability, and revamping millions of lines of code to make LL’s application solid and reliable — because all the 2008 start-ups will have all of that (until, of course, they start attracting millions of users…). But once that huge development effort is deployed, Mark will need to find a place for Linden Lab to exist as a dominant market player.

I think that what we’ll see is a better definition of what the “Second Life Grid” is going to be. Right now, Linden Lab’s website on the subject just talks about the grid-as-we-know-it, but in language addressing a business audience. It’s not enough. They forget to explain that, for LL, there is no difference between “businesses”, users (consumers of content), content creators, and “metaverse engineers” (the ones developing the InterGrid) — all are placed inside the same virtual environment without interference from Linden Lab. That’s all very nice, but very likely not the business model that LL wishes to continue to present.

What makes much more sense is to neatly split the company into two major areas. One would fully embrace the foundations of the InterGrid — and what would be more appropriate for that than creating a Second Life Foundation?

This would just follow the route of so many open source projects — from Apache to Mozilla — in which, at some point, the company (or the university, or the group of geeks) who developed a technology places all source code, names, logos, and associated intellectual property rights into a non-profit foundation, and becomes its largest endorser — making sure they get a seat on the board.

Obviously, LL would be the biggest and most important sponsor of the Second Life Foundation — but IBM and Sun would most certainly wish to have a place there too, and I’m also pretty sure that AOL/Warner Brothers, Xerox, Microsoft, NBC, and several other technology/content producers would like to be part of it as well. And so would almost all the universities currently doing research in SL.

The major development work on the Second Life protocol would be done by the Second Life Foundation. It would be in charge of the specifications, and “absorb” the Architecture Working Group — both the Linden employees and the hordes of volunteers. It would also organise the libsecondlife group and “absorb” the OpenSim group — and Linden Lab would probably hand over the development of the “SL Viewer” to the Foundation as well.

In essence, the Second Life Foundation would have the code, the protocol, and the rights over what “Second Life” means (yes, solving once and for all the nasty trademark issues). Linden Lab would remain in control as the biggest sponsor, but they wouldn’t be the only active voice defining the roadmaps and the timelines for all future SL development.

Under this model, it’s also obvious that things like the “mainland” would have to be completely rethought. It’s clear that LL has no resources to “run the mainland”, imposing rules and order, and acting fairly as judge and overseer. They’re simply stretched too thin for that.

Very likely, they would enter the corporate market — basically, providing system administration services for paying customers. They have the expertise and the know-how to run a grid with tens of thousands of simulators that can handle almost a hundred thousand simultaneous users. And this would be their sales pitch: if you want reliability, hire your sims from us. If you want to have fun and experiment with the SL toolkit, get a cheap sim from one of the OpenSim grids. Oh, and you can bring your avatar to our corporate grid too.

This would effectively disrupt the whole in-world land market — and it means I’ll get a lot of angry comments from the landbarons on this article — unless, of course, the landbarons learn the lesson. In 1990, people developed content for CompuServe and AOL. When the Internet was rolled out, they faced competition from thousands upon thousands of small companies that were setting up their web servers for the first time, at a fraction of the cost of a license to produce content for the “online services”. Well, the smart ones brought their know-how and expertise to running websites, too. Anshe Chung is a much better community manager than LL — she can run her own grid, with her own software and her own system administrators, and compete aggressively with much lower prices (if she doesn’t need to buy sims from Linden Lab any more).

So we’ll start to see much heavier competition, across prices and services. Poor universities looking for cheap (if unreliable) sims will probably just run their own servers inside their campuses — but their students will still be able to go shopping on sims hosted by LL. Community managers (“landbarons”) will have the choice of going either for price or for services — and, depending on their choice, either rent cheap sims from an OpenSim grid, run their own grids (if they can get hold of enough system administrators), or compete on services by renting sims from LL. Companies like IBM (and perhaps Sun) will run their own grids for their own customers, and compete with LL not on price, but on services — LL has a head start, of course, but IBM has 60 or more years of experience running large and complex computational resources, and will certainly catch up quickly. (In effect, instead of believing that LL might go public, I think it’s much more realistic to assume that a fair share of LL might be bought by IBM.) In either case, what we call the “OpenSim Grids” today will become “SL-compatible hosting providers” in the near future.

What does this mean for LL’s revenue model? They’ll still be rolling out industry-grade sims; they’ll be able to charge interconnection fees; they’ll leverage their position in the Second Life Foundation to let Philip put his “stamp of approval” on the future development of the SL protocol. Philip will, in effect, play the role Tim Berners-Lee plays in the W3C — a reference for future work, but not the only one with a say. In fact, away from his role as LL’s CEO, Philip is now free to plan the Second Life Foundation. Mitch Kapor will certainly know how to help him with this transition — and we have seen how deeply Kapor has recently been involved in LL’s operations, to the point that he was once labelled “acting CEO” of LL. (In fact, Philip and Kapor will probably be pushing for the innovation aspects of Second Life-related technologies — while Linden Lab runs the servers, and the Second Life Foundation takes good care of the InterGrid communication protocols.)

Diverging paths, common futures?

In conclusion, I really thought that 2007 was the “turning point” for Second Life, although, to be honest, in a completely different way. What actually happened was that 2007 brought the concept of “virtual worlds” into the mainstream — but not, as I had expected, “Second Life as the Metaverse”. In 2008, we’re watching the first “schism” among virtual world evangelists: on one side, the ones betting on “closed content”; on the other, Second Life — the only one, but by far the one with the largest user base — going ahead with user-generated content and grid interconnection. We’ll now see a plethora of smaller virtual worlds, all targeted at specific audiences, but only one focusing on the 30–50-year-olds who run companies. We’ll see all sorts of companies, from small start-ups with too much funding, to huge entertainment monsters like Sony (and, who knows, Internet giants like Google and Yahoo), launching “their” vision of a controlled-content metaverse — and we’ll see Linden Lab and their “satellites” growing the InterGrid.

If history repeats itself, by 2010 we’ll have a huge industry conglomerate around the Second Life Foundation deploying a seamless (but independently owned) InterGrid with 20–30 million users, while perhaps another 20–30 million will be spread across hundreds if not thousands of small virtual worlds, suddenly demanding to be interconnected too. By 2015, however, the battle will be for the “winning strategy”. Technology has this strange, market-driven tendency to become standardised — a lesson the industry learned a century ago, which is why we can drive our cars using any type of gasoline produced anywhere in the world. It’s also a lesson that made Macs interoperate with Windows-based or Linux-based PCs, in spite of Steve Jobs’ efforts to remain as incompatible as possible for as long as he could.

Once the “many small metaverses” model starts to hit roadblocks — they won’t converge, but diverge, as new creative applications demand that their models change further and remain incompatible — there will be just one “unifying force”: an industry foundation, sponsored by giants like IBM, Sun, Microsoft, and who knows how many others, that will be the only one addressing the fundamental issue of interconnecting grids. It seems clear now that nobody else will be doing this.

The question remains whether Linden Lab is, indeed, willing to take this bold step. I think that they’ll delay it as long as possible, but ultimately their choices will be very limited. In another two years, OpenSim will be at the level of technological advancement that Second Life had reached in 2006 — good enough for corporations to use it routinely. It will also feature grid interconnection from the very start. Linden Lab has the power to decide now whether they will be part of the InterGrid or not. All the signals they’re giving — the release of the viewer code as open source; the support for the libsecondlife and OpenSim initiatives; the Architecture Working Group — seem to indicate that this is what they want to pursue. Still, the biggest step is to “go W3C” on the SL Protocol. It remains to be seen whether Mark Kingdon is a Philip-class or Mitch-class visionary and understands that this is the only choice they’ve got — even if not yet for 2010.

But I would certainly use the opportunity of having Philip as a “wildcard” at Linden Lab right now — unburdened by the role of CEO — to begin developing the Second Life Foundation immediately and funnel the required funding into it, while the media is still asleep and confused by the “multiple multiverses” that were the conclusion of VW’08.
