The results look like the vertex stride in hardware is wrong (vertices everywhere at runtime). I was using 16-bit texture coords (those work perfectly), and switching to 8-bit coords resulted in my vertex structure being:
int8 u, v; //Requires GU_TEXTURE_8BIT
uint32 color;
int16 x, y, z; //No more padding!
But the hardware doesn't seem to like this. I'm hoping it's a problem with the SDK and not a hardware limitation. Does anyone have any idea why this format wouldn't work?
I can't verify everything right now, but if you declare two 8-bit values followed by a 32-bit value, you'll probably end up with 16 bits of padding between 'v' and 'color'. What happens if you drop color from a 32-bit value to 16-bit?
chp wrote: What happens if you drop color from a 32-bit value to 16-bit?
Same thing. I tried changing the color to 4444 (and adding a 16-bit pad at the end of the position), but got the same result. I also tried setting every value to 0 for every vertex except the xyz position... same result: vertices everywhere.
That's why I think it's an internal stride-calculation issue: when everything except the xyz position is zeroed and the vertices still end up everywhere, that screams a bad stride to me.