Posted: Wed Oct 26, 2005 10:38 am
by George Gilbert
Fixed (with much help from Sophia in tracking down the source) for V0.35
Posted: Wed Oct 26, 2005 11:58 am
by beowuuf
Out of interest what was the problem?
Posted: Wed Oct 26, 2005 1:43 pm
by George Gilbert
OK; this requires a bit of knowledge about how bitmaps are stored in memory, and bit-wise masking, but here goes nothing:
The problem was to do with the colour values being returned from the DirectX draw surface for the bitmaps. In memory the bitmaps are stored as the sequential byte values R1,G1,B1,R2,G2,B2,R3,G3,B3 etc. To get the colour value of, say, the second pixel, DirectX (on most systems) returns the DWORD 0x00B2G2R2 (the bytes are in reverse order in the DWORD because of endian-ness). What I think was happening on some systems was that it was returning 0xR3B2G2R2 (i.e. as a speed optimisation, it just returned the 4 raw bytes of data rather than masking out the most significant byte, which contains R3).
As a result, when comparing the returned value with the transparent pink colour in the shading algorithm (because obviously you don't want to shade the pink, just the picture!) it would sometimes get it wrong, think the pink wasn't really pink, and shade it anyway - this led to the stripes.
The upshot of all of this is that the old pseudo-code looked something like:
col = GetPixelColourFromBitmap();
if (col != 0xFF00FF)
{
shade it
}
whereas the new (fully working) pseudo-code looks something like:
col = GetPixelColourFromBitmap();
if ((col & 0xFFFFFF) != 0xFF00FF)
{
shade it
}
Hope all that makes sense!
Posted: Wed Oct 26, 2005 2:07 pm
by beowuuf
Lol, yes it does, kinda, thank you!
Posted: Wed Oct 26, 2005 8:51 pm
by Sophia
I think I got it
What confuses me about that explanation is why a 32-bit screen mode still showed the bug, because it seems like the DWORD returned would be something like 0xA2B2G2R2, and masking wouldn't matter... unless the alpha channel was just ignored or something.
Oh well, at least it works now!

Posted: Wed Nov 02, 2005 1:03 am
by George Gilbert
The mode that counts is the one used internally by your graphics card - specifically not the one your graphics card tells you it's using (which might be emulated).
In practice, most new graphics cards use a 32 bit mode internally (and emulate 24 and 32 bit) which means that this bug won't show on most systems regardless of the setting you use.
Some cards though (for example yours / beowuuf's) must use a 24 bit mode internally (but emulate 32 bit) and so will show the bug, again regardless of the setting you use.
As you say - at least it works now!

Posted: Wed Nov 02, 2005 5:40 am
by Sophia
Oh! It makes sense now.
