Monday, April 4, 2011

The Great Thing About Standards...

...is that there are so many of them, of course.

Following on from my previous post, I've been reviewing the entity alpha code, and what a hellish mess it's become. Does it use a U_TRANS bit? Does it use a U_ALPHA bit? Does it read from the entities lump? Does it come from an entity field? Is it part of the protocol? Is it an extension to protocol 15? Is it in the 0..1 range? Is it in the 0..255 range? Is it sent as a byte? Is it sent as a float? At least, thankfully, there isn't a server message or a QC builtin for it too. (Note to self - mustn't give people ideas!)
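
To make that concrete, here's a minimal sketch of what the read side ends up looking like when it has to swallow all of those conventions at once. Every name in it (U_TRANS, U_ALPHA, the bit positions) is made up for illustration, not quoted from any actual engine:

    #include <stdio.h>

    /* hypothetical update bits; real engines disagree on both the names
       and the positions, which is rather the point */
    #define U_TRANS   (1 << 15)   /* alpha as a 0..1 float in one lineage */
    #define U_ALPHA   (1 << 16)   /* alpha as a 0..255 byte in another */

    /* normalize whatever arrived to a single 0..255 byte, where 255 is
       fully opaque and "nothing sent" also means opaque */
    static unsigned char CL_NormalizeAlpha (int bits, float f, int b)
    {
        if (bits & U_ALPHA)
            return (unsigned char) b;   /* already 0..255 */

        if (bits & U_TRANS)
        {
            /* clamp defensively; not every sender does */
            if (f < 0) f = 0;
            if (f > 1) f = 1;
            return (unsigned char) (f * 255.0f + 0.5f);
        }

        return 255;   /* no alpha in this protocol: opaque */
    }

    int main (void)
    {
        printf ("%d\n", CL_NormalizeAlpha (U_TRANS, 0.5f, 0));   /* 128 */
        printf ("%d\n", CL_NormalizeAlpha (U_ALPHA, 0, 128));    /* 128 */
        return 0;
    }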

IMO this is one serious weakness of the open source world (not of open source itself, but I'll come to that). It allows for - nay, encourages - the proliferation of standards like this, as everyone has their own idea of the best way of doing things, and everyone implements the same thing in a different way. Sometimes a half-thought-through implementation gets out, sometimes an incomplete or bugged (or just plain crap) one gets out, and they all become part of the standard and all need to be supported.

(Aside: if it were up to me, alpha would use U_ALPHA, be in the 0..255 range, be sent as a byte, need explicit protocol support, and come either from the entities lump or from a field.)
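
The send side under that scheme would be something like the following - again a sketch with placeholder names; PROTOCOL_ALPHA isn't a real protocol number:

    #include <stdio.h>

    #define PROTOCOL_ALPHA 100        /* placeholder: first version with alpha */
    #define U_ALPHA        (1 << 16)  /* one bit, one meaning */

    /* set the bit and return 1 if a byte should be written, but only
       when the negotiated protocol supports alpha and the value is
       non-default; one byte, 0..255, no floats, ever */
    static int SV_AlphaBits (int protocol, unsigned char alpha, int *bits)
    {
        if (protocol >= PROTOCOL_ALPHA && alpha != 255)
        {
            *bits |= U_ALPHA;
            return 1;   /* caller writes exactly one byte: alpha */
        }

        return 0;       /* older protocol or default alpha: send nothing */
    }

    int main (void)
    {
        int bits = 0;
        printf ("%d\n", SV_AlphaBits (100, 128, &bits));   /* 1: send it */
        printf ("%d\n", SV_AlphaBits (15, 128, &bits));    /* 0: protocol 15 */
        return 0;
    }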

Now, this isn't an inevitable consequence of being open source; it's perfectly possible to have an open source project that doesn't display these tendencies. Look at Firefox as a great example. So if being open source doesn't inevitably lead to this kind of mess, then what does?

Thinking it over, it's quite clear - design by committee is the real culprit. It's just that in the open source world a lot of standards tend to evolve by committee (or at least by different people trying out different approaches to the same thing in rough and loose cooperation - a de facto committee, pretty much). Without strong leadership behind an evolving standard, without someone to call the shots and have the final say on the way things are, the result is a mess.

So while open source doesn't spontaneously generate a mess of conflicting standards in and of itself, what it does do is create an environment where people can more easily take something that was once at least reasonably clear and graft a mess on top of it. That's not to say that open source is bad because of this; I said it was just a weakness, remember. Open source with a firm guiding principle behind it, and an arbiter who holds final veto, doesn't create this mess.

This has parallels in something else I've experienced, namely D3D vs OpenGL. Now, OpenGL isn't open source but an open standard (it existed long before the term "open source" did - at least in its familiar, modern context), but the same principle applies. In ye olde dayes it was held as a significant thing that (dark clouds and thunder) vith ze D3D only ze eeeevil Microsoft called ze shots, but (rays of sunshine and birdsong) with OpenGL the ARB guided its evolution in the golden light of greater communal wisdom. Anyone who's ever had to wrestle with the mess of extensions that OpenGL has become, where each vested interest group is trying to pull it in a different direction (with the poor developers stuck in the middle desperately crying STOP! but nobody listens), should agree that things don't always work that way in the real world.

So do I use GL_ARB_shader_objects or GL_EXT_shader_objects? Or do I assume a specific GL_VERSION where it's part of the core? What about GL_NV_fragment_program? Does this have a software emulation fallback or not? Is this deprecated in 3.3, and what if I'm targeting 2.1? Do my shaders use 16-bit, 24-bit or 32-bit floats by default? Can I use glVertexAttribPointer here or glVertexPointer there, and what if one needs client memory but the other needs GPU memory? What if the driver I'm on tells me it's 3.2 but doesn't support glPointParameteri?
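
Here's roughly the kind of capability-juggling that forces on you. The extension and entry point names are real GL, but the decision logic is illustrative only, and it assumes a current pre-3.x context (core profiles dropped glGetString (GL_EXTENSIONS) in favour of glGetStringi):

    #include <stdio.h>
    #include <string.h>
    #include <GL/gl.h>

    /* naive substring check - good enough for a sketch, though a robust
       version should match whole extension names only */
    static int GL_HasExtension (const char *name)
    {
        const char *exts = (const char *) glGetString (GL_EXTENSIONS);
        return exts && strstr (exts, name) != NULL;
    }

    void GL_ChooseShaderPath (void)
    {
        int major = 0, minor = 0;

        sscanf ((const char *) glGetString (GL_VERSION), "%d.%d", &major, &minor);

        if (major >= 2)
            printf ("GLSL in core: glCreateShader and friends\n");
        else if (GL_HasExtension ("GL_ARB_shader_objects"))
            printf ("ARB path: glCreateShaderObjectARB and friends\n");
        else if (GL_HasExtension ("GL_NV_fragment_program"))
            printf ("assembly-style NV programs\n");
        else
            printf ("fixed function only\n");

        /* and even then, trust nothing: a driver reporting 3.2 can still
           be missing entry points, so check each one at runtime anyway */
    }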

Sound familiar? Yeah, it's just like entity alpha.

Anyway, I didn't mean this to turn into a dig at OpenGL, and I'm rambling now, so it's time to stop.

1 comment:

gb said...

There is another factor. It is the structure of open source / volunteer development groups vs. the structure of a for-profit company.

In the latter, people accept a hierarchy because they're paid off, pretty much.

In the former, with something like X.org or Debian, when someone tries to introduce a hierarchy, everybody else (who may all be veteran coders etc) gets the short end of the stick, without compensation.

The Linux kernel has a leader in Linus Torvalds himself, who is the kernel's sole original creator. Still, Linus doesn't own the copyright in every patch ever created, which is how it can become bloated and accumulate cruft.

In a company, everyone signs away their copyright and gets financially compensated, which makes it very easy to establish very strong leadership (at a high cost).

It's a social thing.