This has been my mini-project since last night.
So far I've deduced the functions of two 'unknowns':
Code:
struct MATHEADER {
    int Version;
    int Type;
    int NumOfTextures;
    int NumOfTextures1;
    int NonPalettized;   // Enable 16-bit, 24-bit, 32-bit, ...?
    int BitDepth;
    int BlueBitDepth;
    int GreenBitDepth;
    int RedBitDepth;
    int RedShiftLeft;
    int GreenShiftLeft;
    int BlueShiftLeft;
    int RedShiftRight;
    int GreenShiftRight;
    int BlueShiftRight;
    int Unk1;            // Alpha depth?
    int Unk2;            // Alpha shift left?
    int Unk3;            // Alpha shift right?
};
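Just to illustrate how I think the shift fields are meant to be used (the RGB565 example values and the shift directions are my guesses, not anything I've confirmed in the exe):
Code:
#include <stdint.h>

/* Guesswork: isolate one color component of a 16-bit texel using the
   header's bit-depth and shift values, then widen it to 8 bits. I'm
   assuming the "ShiftLeft" value locates the component in the texel and
   the "ShiftRight" value is the amount needed to expand it. */
static uint8_t mat_component(uint16_t pixel, int bitDepth, int shiftLeft, int shiftRight)
{
    uint16_t mask = (uint16_t)((1u << bitDepth) - 1u);
    return (uint8_t)(((pixel >> shiftLeft) & mask) << shiftRight);
}

/* Example with typical RGB565 values:
   RedBitDepth=5,   RedShiftLeft=11,  RedShiftRight=3
   GreenBitDepth=6, GreenShiftLeft=5, GreenShiftRight=2
   BlueBitDepth=5,  BlueShiftLeft=0,  BlueShiftRight=3 */
void mat_decode_rgb565(uint16_t pixel, uint8_t *r, uint8_t *g, uint8_t *b)
{
    *r = mat_component(pixel, 5, 11, 3);
    *g = mat_component(pixel, 6,  5, 2);
    *b = mat_component(pixel, 5,  0, 3);
}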
The two I've been tinkering with are 'NonPalettized' and 'BitDepth'.
NonPalettized seems to be a boolean flag that switches between non-palettized and palettized textures. I'm pretty sure the resulting surface uses the same properties as the display surface, which means that if a 32-bit display-depth exe hack ever appears, 16-bit textures won't work anymore.
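If that's right, the load path would branch on the flag something like this (the Surface type and helper names are made up purely to show the branch, not JK's real internals):
Code:
/* Hypothetical sketch of how a loader might act on NonPalettized. */
typedef struct { int width, height, bpp; } Surface;

static Surface make_surface(int width, int height, int bpp)
{
    Surface s = { width, height, bpp };
    return s;
}

Surface load_mat_surface(int nonPalettized, int width, int height, int displayBpp)
{
    if (nonPalettized)
        /* RGB texels: the surface seems to take the display surface's depth */
        return make_surface(width, height, displayBpp);

    /* palettized: 8-bit indices resolved through the colormap */
    return make_surface(width, height, 8);
}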
BitDepth serves a different function. As best I can tell, it's used solely in buffer allocation. I was able to get JK to load a 256x512x16 MAT file by setting BitDepth to 32, but the texture seems to get scaled down to fit a 256x256 surface.
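In other words, my working theory is that the only thing BitDepth drives is a size calculation along these lines (just a sketch of the idea, not the actual code):
Code:
#include <stddef.h>

/* Guess: bytes reserved for one texture's pixel buffer. With 256x512 and
   BitDepth set to 32 this reserves 512 KB, double what the 16-bit pixel
   data actually needs. */
size_t mat_buffer_size(int width, int height, int bitDepth)
{
    return (size_t)width * (size_t)height * (size_t)(bitDepth / 8);
}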
I'm still determined to solve all of the unknowns in spite of these enraging setbacks.