Hardware of the casual gamer

(if this sounds like a rehash of a blog post on blogs.unity3d.com, well, it is…)

Everyone knows Valve’s hardware survey. But what if your target players are not the traditional “big budget AAA game” type? For example, at the moment most Unity Web Player games are aimed at a much more casual market, so the hardware there might be very different. And indeed, it turns out it is quite different.

Without further ado, here’s the data we have: Unity Web Player hardware statistics.

It’s about two million data points gathered since we started earlier this year.

Some subjective points of interest (I’ll be using current data for 2008 Q3 here):

  • Operating systems: Mac OS X is 2.5%, the rest is Windows. 64-bit Windows hasn’t really picked up yet (0.7%). Windows 2000 is dying fast (0.7%). OS X Leopard has already overtaken OS X Tiger.

  • CPUs: poor Transmeta :) Dual-core CPUs are becoming the norm (46%).

  • Graphics cards: quite sad, in fact… the top 15 cards are slow or horribly slow. Capability-wise they are quite good, with about 70% having shader model 2.0 or higher. Shader model 1.x cards are dead. “Can has DX10” is 2.7%.

  • Casual machines don’t have lots of RAM. Nor lots of VRAM.

  • Most popular NVIDIA driver? 56.73. Looks like this is the driver that comes integrated in XP SP2… now, who says regular people ever update their drivers? Likewise, vga.dll (i.e. the standard VGA driver) is on 1.6% of machines; an additional 1.5% don’t report any driver at all (not sure how that happens…).

So yeah. Casual machines: capabilities quite okay, performance low, low, low. That’s life.


OpenGL 3: a big step in no direction at all?

Well, the post title pretty much summarizes my take on it, doesn’t it? I guess I could just stop typing now… but I won’t!

So after some promises, delays and a period of deadly silence, OpenGL 3.0 was released.

The response to it was “interesting”, to say the least. Part of that response is due to seriously mishandled communication on Khronos’ part. Part is because GL 3.0 is not what it was promised to be. Let’s just ignore the communication issue; it does not affect OpenGL itself in a direct way (it does affect the developer community though).

By the way, I borrowed part of the post title from a blog post linked from opengl.org. In general, I do not agree with that blog post, but it’s a valid point of view. Unlike some other blog posts linked from opengl.org that are just pure garbage…

I am not sure what the goals of OpenGL are at this point. OpenGL’s current position, as far as games are concerned, seems to be roughly this:

Be the graphics API on various platforms where no alternatives are available.

Why? Because Windows has D3D, which is far more stable, comes with useful tools, is updated more often and actually works for a variety of users (I’ll get to this point in a second). Mobile platforms have OpenGL ES, which is decent. All consoles have their own APIs (some of them similar to D3D, none of them similar to GL). So that leaves OpenGL as the choice on OS X, Linux and such. Not because it’s better. Because it’s the only choice.

“Oh, but look, id uses OpenGL! Two other games use OpenGL as well!” Well, good for them. But they are in a different league than “the rest of us”. For some games, driver writers will do whatever it takes to get those games running correctly & fast. Surprise surprise, id games fall into this category. For the rest of us - no such luxury. Hey, try talking to your friendly IHV; the most likely answer is “yeah, but we are really busy with some high profile games right now, ping us back in two months”. After two months, repeat.

So the rest of this post comes from someone who is not working on the high-profile games that IHVs specially tune their drivers for.

If OpenGL’s goal is to stay in this current position, then GL 3.0 is okay. It adds some new features, brings some extensions into the core, hey, it even says “it’s quite likely that maybe perhaps someday some of the old cruft in the API will be removed, if we feel like it”. No problem with that.

However, OpenGL is advertised as something different, as if it wants to:

Be the graphics API on various platforms.

Which is quite different from its current position. I’m not sure if that’s the goal of OpenGL. Myself, I don’t care about a mythical cross-platform API that would actually work on all those different platforms. An API is a tool to do stuff; if different platforms have different APIs - no problem with that.

However, if OpenGL wants to achieve this advertised goal, it has to do several things. First and foremost:

Actually work

Stable drivers and runtime. In its current state, GL is too complex to implement good-quality drivers/runtimes for. The complexity can be reduced in several ways:

  • Clean up the API. This is what GL 3.0 was supposed to do. The actual 3.0 did none of that; instead it just postponed the cleanup “until we feel like it”.

  • Share some of the hard work. Why does everyone and their dog have to write a GLSL preprocessor, lexer, parser and basic optimizer themselves? Define a precompiled shader format, write the frontend once, make it open. This would also be genuinely useful for reducing load times (a sketch of the status quo follows below).
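
For contrast, here is a minimal sketch of what every application does today, using standard GL 2.0 entry points (assuming the function pointers are already loaded, e.g. via GLEW). The only thing you can hand the driver is raw GLSL source text, so each vendor’s driver has to run a complete compiler at runtime:

    #include <stdio.h>
    #include <GL/glew.h>

    /* The GL 2.x status quo: ship GLSL source, let each driver preprocess,
       lex, parse and optimize it at runtime. There is no precompiled
       intermediate format you could hand over instead. */
    GLuint compile_shader(GLenum type, const char* source)
    {
        GLuint shader = glCreateShader(type);
        glShaderSource(shader, 1, &source, NULL);
        glCompileShader(shader); /* the driver's whole compiler runs here */

        GLint ok = 0;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (!ok) {
            char log[1024];
            glGetShaderInfoLog(shader, sizeof(log), NULL, log);
            fprintf(stderr, "shader compile failed: %s\n", log);
            glDeleteShader(shader);
            return 0;
        }
        return shader;
    }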

GL 3.0 could have done both of the above; instead it did neither. It could have cleaned up the API and provided one platform-independent GL 1.x/2.x library that calls into the actual 3.0 runtime. All the fixed function, immediate mode, display lists, whatever, would live in one nice library. Even existing apps could continue to function transparently this way (with the benefit of simpler = more stable drivers).
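
As a purely hypothetical sketch of what such a library could look like (the emu_* names are made up, the buffering is deliberately naive, and a real implementation would use buffer objects rather than client-side arrays), immediate mode reduces to accumulating vertices and issuing one draw call:

    #include <string.h>
    #include <GL/glew.h>

    /* Accumulate glBegin/glEnd-style calls into a plain vertex array,
       then submit everything to the runtime in a single draw call. */
    typedef struct { float pos[3]; float color[4]; } Vertex;

    static Vertex g_verts[65536];
    static int    g_count;
    static GLenum g_prim;
    static float  g_color[4] = { 1, 1, 1, 1 };

    void emu_glBegin(GLenum prim) { g_prim = prim; g_count = 0; }

    void emu_glColor4fv(const float* c) { memcpy(g_color, c, sizeof(g_color)); }

    void emu_glVertex3f(float x, float y, float z)
    {
        Vertex* v = &g_verts[g_count++];
        v->pos[0] = x; v->pos[1] = y; v->pos[2] = z;
        memcpy(v->color, g_color, sizeof(v->color));
    }

    void emu_glEnd(void)
    {
        glEnableClientState(GL_VERTEX_ARRAY);
        glEnableClientState(GL_COLOR_ARRAY);
        glVertexPointer(3, GL_FLOAT, sizeof(Vertex), g_verts[0].pos);
        glColorPointer(4, GL_FLOAT, sizeof(Vertex), g_verts[0].color);
        glDrawArrays(g_prim, 0, g_count);
    }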

Support the platforms/hardware/features the user needs

This of course depends on the user in question. For someone like us, it means we still have to support 10-year-old hardware.

D3D9 does a fine job there (provided you have drivers installed and the DX9 runtime installed - which comes included with XP SP2 and upwards). OpenGL 2.1 and earlier would do a fine job as well, provided it would “actually work” (see above).

If GL 3.0 were what was originally promised - an almost-new API targeting shader model 2.0+ hardware - it would be sort of fine. In our case, that would mean writing and supporting two renderers - “old GL” and “new GL” - where the old one would be used on old hardware, or on old platforms where “new GL” is not available. If the new runtime were much leaner, much more stable and generally nicer, this would not be a big problem.

With the actual GL 3.0, in theory one does not have to write two renderers. The minimum hardware level for GL 3.0 is shader model 4+, though. So to support both old and new hardware/platforms, quite a lot of duplication has to happen - especially if you intend to go towards the proposed “future GL path”, i.e. start dropping deprecated functionality from the codebase. At that point you’ll probably be writing two separate renderers already. So we’re back to where the original GL 3.0 would have been, just without any extra niceness/stability/leanness right now.
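
In code, that duplication tends to end up looking something like this hypothetical sketch (every name here is made up): two backends behind one interface, picked once at startup based on what the machine supports:

    /* Two renderer backends behind a common interface; which one runs
       is decided once, at startup, by the machine's capabilities. */
    typedef struct Mesh Mesh; /* opaque, engine-specific */

    typedef struct {
        const char* name;
        void (*draw_mesh)(const Mesh* m);
    } Renderer;

    static void draw_mesh_old_gl(const Mesh* m) { (void)m; /* GL 1.x/2.x path */ }
    static void draw_mesh_gl3(const Mesh* m)    { (void)m; /* GL 3.0 path */ }

    static const Renderer g_old_gl = { "old GL", draw_mesh_old_gl };
    static const Renderer g_gl3    = { "GL 3.0", draw_mesh_gl3 };

    const Renderer* pick_renderer(int has_gl3_context)
    {
        /* GL 3.0 needs shader model 4+ hardware, so everything older
           has to take the old-GL path anyway. */
        return has_gl3_context ? &g_gl3 : &g_old_gl;
    }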

Oh, and look at the vendor announcements from the 2008 OpenGL BOF. NVIDIA: we have almost full drivers now. AMD: we’re committed to having drivers. Intel: look for GL 3.0 on future platforms. In other words, it looks like Intel’s current cards won’t ever get GL 3.0 drivers. And in our target market, Intel has the majority of cards.

That sounds very much like a “just ignore the whole GL 3.0 thing” plan to me.

Be nice

This point is of far lesser importance than the “actually work” and “support what is needed” ones. Having good tools (PIX, …), documentation, code examples etc. is nice. But not much more than nice; being the nicest API in the world does not help much if it does not actually work or does not support what you need. Even in this area, the actual GL 3.0 is not nice - it’s full of redundancies and crap that goes 15 years back in history.

Summing it up

To me, GL 3.0 looks like a blunder. Instead of fixing the core problems, they just postponed the fixes. Well, keep up the good work!


Uh-oh, this can't be good

Can this lead to anything good when I’m starting to write lines like this myself?

# builds a SQL filter: agr.<field> = $temp.<field> for each field in $fields, joined with 'and'
my $filter = join ' and ', map { "agr.$_ = $temp.$_" } split(/, /, $fields);

I know, it’s not that scary - the only regexp involved is a trivial split pattern - but still…


It must be a bug in OS/compiler/...

Ever looked at code that is absolutely correct, yet runs incorrectly? Sometimes it looks like a genuine compiler bug. “I swear, mister! The compiler corrupts my code!”

Look again. And again. Eventually you’ll find where your code is broken.

(Of course, sometimes the compiler really is broken… GLSL, anyone?)

Pimp my code, part 15: The Greatest Bug of All says the above in a much nicer way:

Maybe the problem was there was some huge bug in Apple’s Mach, where if you open too many files in a short period of time, the filesystem tried to, like, cache the results, and the cache blew up, and as a result the filesystem incorrectly just would fail to open any more files, instead of flushing the cache.

I’ve also been around long enough to know that whenever I know the operating system must be bugged, since my code is correct, I should take a damn close look at my code. The old adage (not mine) is that 99% of the time operating system bugs are actually bugs in your program, and the other 1% of the time they are still bugs in your program, so look harder, dammit.

A post well worth reading about the process of investigating tricky bugs - and a sincere one as well. It’s so good that I’ll just quote it again:

It’s a bug we should have caught. We should have spent the time to get the images in the 10,000 item file. I messed up.

Software is written by humans. Humans get tired. Humans become discouraged. They aren’t perfect beings. As developers, we want to pretend this isn’t so, that our software springs from our head whole and immaculate like the goddess Athena. Customers don’t want to hear us admit that we fail.

The measure of a man cannot be whether he ever makes mistakes, because he will make mistakes. It’s what he does in response to his mistakes. The same is true of companies.

We have to apologize, we have to fix the problem, and we have to learn from our mistakes.

So very true.