GU_TEXTURE_8BIT not working?

Discuss the development of new homebrew software, tools and libraries.

Moderators: cheriff, TyRaNiD

av
Posts: 7
Joined: Sun Jan 01, 2006 11:15 am

GU_TEXTURE_8BIT not working?

Post by av »

Hi,

I can't get 8-bit texture coordinates to work at all. When I call:

Code:

sceGuDrawArray (pspPrimType, (GU_TEXTURE_8BIT | GU_COLOR_8888 | GU_VERTEX_16BIT | GU_TRANSFORM_3D), ...)
the results look as if the hardware is using the wrong vertex stride (vertices scattered everywhere at runtime). Previously I was using 16-bit texture coordinates, which work perfectly, with this vertex structure:

Code:

typedef struct {
    int16 u, v;          // GU_TEXTURE_16BIT
    uint32 color;        // GU_COLOR_8888
    int16 x, y, z, pad;  // GU_VERTEX_16BIT, plus a pad to reach 16 bytes
} Vertex;
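
As a sanity check, this layout should come out to exactly 16 bytes per vertex. A one-line compile-time assert can confirm it (a sketch; the negative-array-size trick stands in for C11 _Static_assert on older toolchains):

Code:

/* Fails to compile if the vertex is not exactly 16 bytes. */
typedef char vertex_size_check[(sizeof(Vertex) == 16) ? 1 : -1];
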
I'm trying to get my vertex structure to be 32 bits smaller:

Code:

typedef struct {
    int8 u, v;      // Requires GU_TEXTURE_8BIT
    uint32 color;
    int16 x, y, z;  // No more padding!
} Vertex8;
But the hardware doesn't seem to like this. I'm hoping it's a problem with the SDK and not a hardware limitation. Does anyone have any idea why this format wouldn't work?
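
For what it's worth, the layout I'm hoping for is 12 bytes per vertex, which the compiler will only produce if the struct is packed (a sketch using GCC syntax and the same int8/uint32/int16 shorthand as above; VertexPacked is just an illustrative name, and whether the GE tolerates a 32-bit color at an unaligned offset is exactly what I don't know):

Code:

// Hypothetical packed variant: forces sizeof == 12 by removing the
// two bytes of alignment padding the compiler would insert before 'color'.
typedef struct __attribute__((packed)) {
    int8 u, v;     // offsets 0-1
    uint32 color;  // offset 2 -- no longer 4-byte aligned
    int16 x, y, z; // offsets 6, 8, 10
} VertexPacked;    // sizeof == 12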

Thanks!
chp
Posts: 313
Joined: Wed Jun 23, 2004 7:16 am

Post by chp »

I can't verify everything right now, but if you declare two 8-bit values followed by a 32-bit value, you'll probably end up with 16 bits of compiler padding between 'v' and 'color', so the struct in memory isn't the layout you think you're feeding the GE. What happens if you drop color from a 32-bit value to 16-bit?
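
A quick way to see it (a sketch, with plain C types standing in for the int8/uint32/int16 shorthand above):

Code:

#include <stddef.h>
#include <stdio.h>

struct v8c32 {            /* 8-bit tex coords + 32-bit color */
    signed char u, v;
    unsigned int color;   /* compiler aligns this to 4 -> 2 pad bytes */
    short x, y, z;
};

struct v8c16 {            /* 8-bit tex coords + 16-bit color */
    signed char u, v;
    unsigned short color; /* naturally aligned at offset 2 -> no padding */
    short x, y, z;
};

int main(void) {
    printf("32-bit color: color at %u, sizeof %u\n",
           (unsigned)offsetof(struct v8c32, color),
           (unsigned)sizeof(struct v8c32));   /* expect 4 and 16 */
    printf("16-bit color: color at %u, sizeof %u\n",
           (unsigned)offsetof(struct v8c16, color),
           (unsigned)sizeof(struct v8c16));   /* expect 2 and 10 */
    return 0;
}

With a 16-bit color every member sits at its natural alignment, so the compiler inserts nothing and you get a tight 10-byte vertex.
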
GE Dominator
av
Posts: 7
Joined: Sun Jan 01, 2006 11:15 am

Post by av »

chp wrote: What happens if you drop color from a 32-bit value to 16-bit?
Same thing. I tried changing the color to 4444 (and adding a 16-bit pad to the end of the position), with the same result. I also tried setting every value to 0 for each vertex except the xyz position... same result: vertices everywhere.

That's why I think it's an internal stride calculation issue. When I set everything to 0 except for the xyz position and the vertices still end up everywhere, that screams bad stride to me.
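
One way to take the compiler's struct layout out of the equation entirely would be to pack the vertex buffer by hand (a sketch, assuming a 16-byte stride with the 32-bit color at offset 4; write_vertex is a hypothetical helper):

Code:

#include <string.h>

/* Hypothetical helper: writes one vertex at explicit byte offsets so
   compiler struct padding can't be a factor. Assumes the GE expects
   the 32-bit color 4-byte aligned, i.e. a 16-byte stride. */
static void write_vertex(unsigned char *dst, signed char u, signed char v,
                         unsigned int color, short x, short y, short z)
{
    dst[0] = (unsigned char)u;
    dst[1] = (unsigned char)v;
    /* bytes 2-3: padding */
    memcpy(dst + 4,  &color, sizeof(color));
    memcpy(dst + 8,  &x, sizeof(x));
    memcpy(dst + 10, &y, sizeof(y));
    memcpy(dst + 12, &z, sizeof(z));
    /* bytes 14-15: trailing pad to keep the 16-byte stride */
}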

Thanks.