Depth buffer not working correctly
Posted 27 April 2005 - 02:53 PM
It may be useful to show you some code for better understanding:
// attaching a Z buffer to the target surface
// initialising states
lpD3D9Device->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);
lpD3D9Device->SetRenderState(D3DRS_ZFUNC, D3DCMP_GREATEREQUAL);
// Calculating view and projection matrices
GetBoundingSphere(&vCenter,&fRadius); // Center and radius of the object to show.
D3DXMatrixPerspectiveFovRH(&matProj,D3DX_PI/4,1.0f ,0.0f, vCenter.z-fRadius*5);
// Clearing render surface
lpD3D9Device->Clear( 0, NULL, D3DCLEAR_TARGET|D3DCLEAR_ZBUFFER,
D3DCOLOR_XRGB(0,0,0), 1.0f, 0 );
The light source is directional (pointing towards negative Z).
The issue is that hidden pixels are not eliminated, so the object appears transparent.
It works correctly if I use a W buffer (D3DRS_ZENABLE, D3DZB_USEW) on some graphics cards (TNT2, RADEON 7000), but I can't rely on W buffering because other cards don't support it (bad results on GeForce MX 4000 and FX 5200).
Posted 27 April 2005 - 09:23 PM
Posted 29 April 2005 - 06:39 AM
Posted 29 April 2005 - 02:19 PM
Posted 30 April 2005 - 07:31 AM
But how do you explain that the same code works with a W buffer?
Posted 30 April 2005 - 07:48 AM
The real code detects the best format (it enumerates the available formats and then selects the deepest one).
Posted 30 April 2005 - 07:52 AM