GAME: Weird black lines in scaled images

Messages are moved here (should anyone ever want to see them again) once they are no longer applicable to the current version (e.g. suggestions that have been implemented or bugs that have been fixed).

Moderator: George Gilbert

George Gilbert
Dungeon Master
Posts: 3022
Joined: Mon Sep 25, 2000 11:04 am
Location: London, England
Contact:

Post by George Gilbert »

Fixed (with much help from Sophia in tracking down the source) for V0.35
beowuuf
Archmastiff
Posts: 20686
Joined: Sat Sep 16, 2000 2:00 pm
Location: Basingstoke, UK

Post by beowuuf »

Out of interest what was the problem?
George Gilbert
Dungeon Master
Posts: 3022
Joined: Mon Sep 25, 2000 11:04 am
Location: London, England
Contact:

Post by George Gilbert »

OK; this requires a bit of knowledge about how bitmaps are stored in memory, and bit-wise masking, but here goes nothing:

The problem was to do with the colour values being returned from the DirectX draw surface for the bitmaps. In memory, the bitmaps are stored as the sequential byte values R1,G1,B1,R2,G2,B2,R3,G3,B3, etc. To get the colour value of, say, the second pixel, DirectX (on most systems) returns the DWORD 0x00B2G2R2 (the bytes are in reverse order in the DWORD because of endian-ness). What I think was happening on some systems was that it was returning 0xR3B2G2R2 instead (i.e. as a speed optimisation, it just returned the 4 raw bytes of data rather than masking out the most significant byte, which contains R3).

As a result, when comparing the returned value with the transparent pink colour in the shading algorithm (because obviously you don't want to shade the pink, just the picture!), it would sometimes get it wrong, think the pink wasn't really pink, and shade it - this led to the stripes.

The upshot of all of this is that the old pseudo-code looked something like:

col = GetPixelColourFromBitmap();
if (col != 0xFF00FF)
{
    shade it
}

whereas the new (fully working) pseudo-code looks something like:

col = GetPixelColourFromBitmap();
if ((col & 0xFFFFFF) != 0xFF00FF)
{
    shade it
}

Hope all that makes sense!
beowuuf
Archmastiff
Posts: 20686
Joined: Sat Sep 16, 2000 2:00 pm
Location: Basingstoke, UK

Post by beowuuf »

Lol, yes it does, kinda, thank you!
Sophia
Concise and Honest
Posts: 4306
Joined: Thu Sep 12, 2002 9:50 pm
Location: Nowhere in particular
Contact:

Post by Sophia »

I think I got it :D

What confuses me about that explanation is why a 32-bit screenmode still showed the bug, because it seems like the DWORD returned would be something like 0xA2B2G2R2, and masking wouldn't matter...unless the alpha channel was just ignored or something.

Oh well, at least it works now! :)
George Gilbert
Dungeon Master
Posts: 3022
Joined: Mon Sep 25, 2000 11:04 am
Location: London, England
Contact:

Post by George Gilbert »

The mode that counts is the one used internally by your graphics card - specifically not the one your graphics card tells you it's using (which might be emulated).

In practice, most new graphics cards use a 32 bit mode internally (and emulate 24 bit), which means that this bug won't show on most systems regardless of the setting you use.

Some cards, though (for example yours / beowuuf's), must use a 24 bit mode internally (but emulate 32 bit) and so will show the bug, again regardless of the setting you use.

As you say - at least it works now! :wink:
Sophia
Concise and Honest
Posts: 4306
Joined: Thu Sep 12, 2002 9:50 pm
Location: Nowhere in particular
Contact:

Post by Sophia »

Oh! It makes sense now. :D