“Lag Hell”. This is how Second Life has always been referred to by oh-so-many people throughout its ten years of existence. We often see people complaining about how laggy a region gets the moment a few avatars are on it – or, in some cases, how laggy a region is, no matter if it’s empty or packed full. The blame is invariably put on Linden Lab. Lag is always someone else’s problem. Oh really?
First of all, I’ll be the first to admit that Linden Lab have made many bad design decisions from the very beginning, and many of them are still haunting them to this day. But is it really so simple? As a matter of fact, it’s not. Yes, there’s much wrong with Second Life on the technical front. It has numerous limitations, but most of them could actually be worked around, provided content creators (from complete beginners to the “professionals” out there) actually bothered to build and script like, well, professionals. Someone with far greater experience in the 3D graphics industry than I could ever hope to have has actually written about it. I’m talking about Penny Patton, who wrote an excellent post about this – but I’m going to revisit this issue anyway, in my own way, simplifying things to make them easier to understand for people like me who aren’t 3D graphics experts.
Now, what do I mean by saying that many content creators don’t build and script like professionals? Two things: textures and scripts. As a matter of fact, both are covered in the Second Life Wiki’s “Good Building Practices”, but not many people seem to bother reading it…
Let’s talk textures first
On SL’s marketplace website, you’ll find many texture creators selling high-resolution textures (1024×1024, the maximum SL supports). I don’t blame them – after all, they sell them with full permissions, so content creators (amateur and professional alike) can save them to their hard drives and manipulate them as they see fit. That manipulation includes scaling to a smaller size, depending on the size of the object (or of the face of the object) they’ll be applied to.
Yet, we see far too many content creators go overboard and say “ooh, I’ll fill my sim and my builds with super-duper-high resolution textures to ensure they’ll never lose definition when someone zooms in.” While it is commendable that someone wants to make sure their objects’ textures won’t pixelate when someone zooms in, using 1024×1024 textures everywhere is unnecessary – and even stupid and lazy.
Penny Patton explains (and she’s not the only one to do so, as a matter of fact) that large textures consume a lot of bandwidth to download, demand a lot of processing power so that they can be retrieved from the asset server (i.e. the server on which our inventories are stored) and delivered to us – and, to top it all off, they bog our graphics cards down, as the cards need to store them in memory and render them.
If you’ve been to a region where only a few scripts are running and there are very few avatars (or even none), and you still get pathetically low frame rates, now you know the reason: the region has too many large textures on the things that are in it. The chart below shows just how large a file each texture size produces.
To cut a long story short, for the most popular texture sizes in Second Life, the files for 32-bit colour depth (this is required when we want transparency) are as follows:

1024×1024: 4MB
512×512: 1MB
256×256: 256KB
128×128: 64KB
While for textures that don’t contain an alpha channel (24-bit colour depth), things are as follows:

1024×1024: 3MB
512×512: 768KB
256×256: 192KB
128×128: 48KB
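If you’d rather work these figures out yourself, the arithmetic is simply width × height × bytes per pixel (4 bytes for 32-bit RGBA, 3 bytes for 24-bit RGB). Here’s a minimal Python sketch – the list of “popular” sizes is my own choice, not an official one:

```python
def texture_bytes(width, height, bits_per_pixel):
    """Uncompressed size in bytes of a width x height texture."""
    return width * height * bits_per_pixel // 8

# Common square texture sizes in SL (my own selection for illustration).
for side in (1024, 512, 256, 128):
    rgba = texture_bytes(side, side, 32)  # with alpha channel
    rgb = texture_bytes(side, side, 24)   # without alpha channel
    print(f"{side}x{side}: {rgba // 1024:>5} KB with alpha, "
          f"{rgb // 1024:>5} KB without")
```

Note how every halving of the side length quarters the memory footprint – which is exactly why dropping from 1024×1024 to 256×256 is such a big win.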
Penny Patton also explains that the viewer doesn’t allocate more than 512MB of the graphics card’s dedicated RAM to textures (and most people still have graphics cards with 512MB of RAM anyway). When I raised this topic with Oz Linden, he didn’t know, so I guess I’ll take Penny’s word for it, for the time being at least.
As I wrote earlier, many content creators (amateur and professional alike) think that they need to use 1024x1024s everywhere, but this is not the case. You see, the size of the texture you’ll need depends on (a) the size of the object faces it will be applied to, and (b) how closely a user is likely to zoom into the specific object. For demonstration purposes, please have a look at the attachment I’m wearing in the following picture:
Out of misplaced politeness and a desire to avoid stupid drama, I won’t name the content creator that makes this cyberpunk version of the CString you see in the picture above. How big do you think this object is, anyway? At its widest, it can’t be more than 6 or 7cm. And as for its length, it really doesn’t matter, as that can easily be taken care of with texture repeats. Yet it had three 1024×1024 24-bit colour depth textures and one 1024×128 texture (this one had an alpha channel, so its colour depth was 32-bit). So, its textures alone were a download of 9MB plus 512KB, i.e. 9.5MB. And that’s without the sculpt map – by the way, have you noticed that sculpties are always the last objects to rez on a scene? Now, I’m sure you can guess how laggy this object was, without even taking its scripts into account: a listener for the HUD, colour changers, a resize script, a texture animation script for the light strip down the middle (even though it’s not necessary…).
So, what did I do? Well, the black gasket-like sculpts lost their textures, plain and simple. They were not necessary and were barely visible in the first place. The other two textures? I went into edit mode, opened the texture selection floater and looked at the textures themselves. Obviously, I didn’t have them in my inventory and I couldn’t download them, so I took a screen grab, isolated the 165×165 (or so) square showing the texture thumbnail, cropped it, and turned the big one into a 128×128 texture and the one with the alpha (the one on the light strip) into a 128×16. Total download burden for SL? 48KB and 8KB respectively, i.e. 56KB. Compare these two totals with the original 9.5MB. Also note that, because the object is small, these small textures cause no loss of resolution or definition at all when you zoom into it.
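Just to double-check the arithmetic of this before-and-after comparison, here’s a quick sketch using the texture dimensions described above:

```python
def texture_bytes(width, height, bits_per_pixel):
    """Uncompressed size in bytes of a width x height texture."""
    return width * height * bits_per_pixel // 8

# Before: three 1024x1024 24-bit textures plus one 1024x128 32-bit texture.
before = 3 * texture_bytes(1024, 1024, 24) + texture_bytes(1024, 128, 32)

# After: one 128x128 24-bit texture plus one 128x16 32-bit texture.
after = texture_bytes(128, 128, 24) + texture_bytes(128, 16, 32)

print(before // 1024, "KB before")  # 9728 KB, i.e. 9.5 MB
print(after // 1024, "KB after")    # 56 KB
print("roughly", before // after, "times smaller")
```

That’s a reduction of well over a hundredfold, with no visible difference on an object this small.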
It goes without saying that, once I sized this object the way I wanted, I also removed all the scripts from it. If you click on the photo above, you’ll see the full-size image, which was upscaled from my viewer’s 1366×714 window to 1920×1004 and cropped accordingly. Note that, although I do take snapshots larger than my screen size, I’m still rather conservative, and I’ll explain why. “But,” I hear the snapshot size queens moan, “it’ll lose definition when we take snapshots at 6144×3840 or larger.” Well… let me give you a little piece of my mind. ALL textures in SL are raster images, which means that the more you blow them up, the more they’ll lose detail and definition, and even the smartest interpolation algorithms won’t make up for it. And let me remind you that resolutions higher than 24MP are really only meant for printing at 300dpi on super-high-quality photographic paper of A3 size or larger. When exactly was the last time you printed your SL snapshots on such paper? As a matter of fact, the pros (not the wannabes and the poseurs) even drop the printing resolution to 200dpi or lower when they need to print on super-large media, depending on how close they expect viewers to get to the image…
So, to wrap it up: Don’t use lots of 1024×1024 textures on your builds. Use the smallest textures you can get away with and think while you build – oh, and read the damned Wiki.
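The “smallest texture you can get away with” rule can even be turned into a back-of-the-envelope helper. The 512 texels-per-metre density below is my own rough assumption for close-up viewing, not an official SL guideline – adjust it to taste:

```python
def pick_texture_size(face_metres, texels_per_metre=512, max_side=1024):
    """Smallest power-of-two texture side that gives a face of the given
    physical size the requested texel density, capped at SL's 1024 maximum.
    The default density is an assumption, not an official figure."""
    needed = face_metres * texels_per_metre
    side = 32
    while side < needed and side < max_side:
        side *= 2
    return side

# The 7 cm wide attachment from the example above needs nowhere near 1024:
print(pick_texture_size(0.07))  # -> 64

# A 4-metre wall face, on the other hand, hits the 1024 cap:
print(pick_texture_size(4.0))   # -> 1024
```

The point isn’t the exact numbers – it’s that the right texture size follows from the size of the face, not from the biggest option in the upload dialogue.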