GAME: Weird black lines in scaled images
Moderator: George Gilbert

- George Gilbert
- Dungeon Master
- Posts: 3022
- Joined: Mon Sep 25, 2000 11:04 am
- Location: London, England
- Contact:
OK; this requires a bit of knowledge about how bitmaps are stored in memory, and about bit-wise masking, but here goes nothing:
The problem was to do with the colour values being returned from the DirectX draw surface for the bitmaps. In memory the bitmaps are stored as sequential R1,G1,B1,R2,G2,B2,R3,G3,B3 etc byte values. To get the colour value of, say, the second pixel, DirectX (on most systems) returns the DWORD 0x00B2G2R2 (the bytes are in reverse order in the DWORD because of endianness). What I think was happening on some systems was that it was returning 0xR3B2G2R2 (i.e. as a speed optimisation, it just returned the 4 raw bytes of data rather than masking off the most significant byte, which contains R3 - the red byte of the *next* pixel).
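To make that concrete, here's a minimal C sketch (illustrative only - not the actual engine code) of how a raw little-endian 4-byte read of a 24-bit pixel drags in the next pixel's red byte:

#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void)
{
    /* Two 24-bit pixels stored as sequential R,G,B bytes:
       pixel 1 = transparent pink (0xFF00FF), pixel 2 = pure red */
    uint8_t bitmap[] = { 0xFF, 0x00, 0xFF,    /* R1,G1,B1 */
                         0xFF, 0x00, 0x00 };  /* R2,G2,B2 */

    /* Raw, unmasked 4-byte read of pixel 1: on a little-endian
       machine the DWORD comes back as 0xR2B1G1R1, so the next
       pixel's red byte (R2) lands in the most significant byte */
    uint32_t raw;
    memcpy(&raw, &bitmap[0], sizeof raw);

    printf("raw read: 0x%08X\n", raw);             /* 0xFFFF00FF - "not pink"! */
    printf("masked:   0x%06X\n", raw & 0xFFFFFF);  /* 0xFF00FF - pink again */
    return 0;
}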
As a result, when comparing the returned value with the transparent pink colour in the shading algorithm (because obviously you don't want to shade the pink, just the picture!) it would sometimes get it wrong, decide the pink wasn't really pink, and shade it - this led to the stripes.
The upshot of all of this is that the old pseudo-code looked something like:
col = GetPixelColourFromBitmap();
if (col != 0xFF00FF)
{
    // shade it
}
whereas the new (fully working) pseudo-code looks something like:
col = GetPixelColourFromBitmap();
if ((col & 0xFFFFFF) != 0xFF00FF)
{
    // shade it
}
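And here's a compilable version of the same idea (GetPixelColourFromBitmap here is a hypothetical stand-in for the DirectX surface read, just to show the mask doing its job):

#include <stdint.h>

#define TRANSPARENT_PINK 0xFF00FFu

/* Hypothetical stand-in for the surface read: returns the raw,
   unmasked 4-byte value, as on the misbehaving cards (assumes at
   least one byte of scanline padding follows the last pixel) */
uint32_t GetPixelColourFromBitmap(const uint8_t *bitmap, int pixel)
{
    const uint8_t *p = bitmap + pixel * 3;
    return (uint32_t)p[0]         |
           ((uint32_t)p[1] << 8)  |
           ((uint32_t)p[2] << 16) |
           ((uint32_t)p[3] << 24);   /* stray byte from the next pixel */
}

void ShadeBitmap(uint8_t *bitmap, int pixel_count)
{
    for (int i = 0; i < pixel_count; i++)
    {
        uint32_t col = GetPixelColourFromBitmap(bitmap, i);
        /* Throw away everything above the 24 colour bits before
           comparing, so the stray byte can't make pink look non-pink */
        if ((col & 0xFFFFFF) != TRANSPARENT_PINK)
        {
            /* shade it */
        }
    }
}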
Hope all that makes sense!
- Sophia
- Concise and Honest
- Posts: 4306
- Joined: Thu Sep 12, 2002 9:50 pm
- Location: Nowhere in particular
- Contact:
I think I got it
What confuses me about that explanation is why a 32-bit screen mode still showed the bug, because it seems like the DWORD returned would be something like 0xA2B2G2R2, and masking wouldn't matter... unless the alpha channel was just ignored or something.
Oh well, at least it works now!

- George Gilbert
- Dungeon Master
- Posts: 3022
- Joined: Mon Sep 25, 2000 11:04 am
- Location: London, England
- Contact:
The mode that counts is the one used internally by your graphics card - specifically not the one your graphics card tells you it's using (which might be emulated).
In practice, most new graphics cards use a 32 bit mode internally (and emulate 24 bit), which means this bug won't show on most systems regardless of the setting you use.
Some cards though (for example yours / beowuuf's) must use a 24 bit mode internally (but emulate 32 bit) and so will show the bug, again regardless of the setting you use.
As you say - at least it works now!