RenderStates Enumeration (Microsoft.DirectX.Direct3D)

Defines device render states.

Definition

Visual Basic Public Enum RenderStates
C# public enum RenderStates
C++ public enum class RenderStates
JScript public enum RenderStates

Members

Member Value Description
Wrap15 205 See Wrap0
Wrap14 204 See Wrap0
Wrap13 203 See Wrap0
Wrap12 202 See Wrap0
Wrap11 201 See Wrap0
Wrap10 200 See Wrap0
Wrap9 199 See Wrap0
Wrap8 198 See Wrap0
PointScaleA 158 Controls the distance-based size attenuation for point primitives. The default value is 1.0f.

This render state is active only when PointScaleEnable is set to true. The range for this value is greater than or equal to 0.0f.
PointScaleEnable 157 Controls computation of size for point primitives. When set to true, the point size is interpreted as a camera-space value and is scaled by the distance-based attenuation function and the frustum-to-viewport y-axis scaling to compute the final screen-space point size. When set to false, the point size is interpreted directly in screen-space units.

The default value is false.
PointSpriteEnable 156 Controls how point sprites are rendered. When set to true, texture coordinates of point sprites are set so that full textures are mapped on each point. When set to false, the vertex texture coordinates are used for the entire point.

The default value is false.
PointSizeMin 155 Specifies the minimum size to which point sprites can be set. The default value is 1.0f. The range for this value is greater than or equal to 0.0f.

Point sprites are clamped to the specified size during rendering. Setting the size to less than 1.0 results in points dropping out, if the point does not cover a pixel center and antialiasing is disabled.

Points also will drop out if the size is less than 1.0, if they are being rendered with reduced intensity, and if antialiasing is enabled.
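A typical point sprite configuration sets several of these states together. The following is a minimal C# sketch, assuming an initialized Microsoft.DirectX.Direct3D.Device named device and SetRenderState overloads that accept bool and float values; the coefficient values are illustrative only.

// Render point primitives as textured sprites with distance-based size attenuation.
device.SetRenderState(RenderStates.PointSpriteEnable, true);   // map the full texture onto each point
device.SetRenderState(RenderStates.PointScaleEnable, true);    // interpret point size in camera space
device.SetRenderState(RenderStates.PointSize, 1.0f);
device.SetRenderState(RenderStates.PointSizeMin, 1.0f);        // keep points from dropping out
device.SetRenderState(RenderStates.PointScaleA, 1.0f);         // attenuation coefficients A, B, C
device.SetRenderState(RenderStates.PointScaleB, 0.0f);
device.SetRenderState(RenderStates.PointScaleC, 0.0f);
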
EmissiveMaterialSource 148 Emissive color source for lighting calculations. Valid values are members of the ColorSource enumeration. The default value is ColorSource.Material.
AmbientMaterialSource 147 Ambient color source for lighting calculations. Valid values are members of the ColorSource enumeration. The default value is ColorSource.Material.
SpecularMaterialSource 146 Specular color source for lighting calculations. Valid values are members of the ColorSource enumeration. The default value is ColorSource.Color2.
DiffuseMaterialSource 145 Diffuse color source for lighting calculations. Valid values are members of the ColorSource enumerated type. The default value is ColorSource.Color1. The value for this render state is used only if ColorVertex is set to true.
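For example, to take the diffuse lighting color from the first per-vertex color while leaving the other components on the material, an application might use the following C# sketch (assuming an initialized Device named device and an int overload of SetRenderState for enumerated values).

// DiffuseMaterialSource is honored only while per-vertex color is enabled.
device.SetRenderState(RenderStates.ColorVertex, true);
device.SetRenderState(RenderStates.DiffuseMaterialSource, (int)ColorSource.Color1);
device.SetRenderState(RenderStates.AmbientMaterialSource, (int)ColorSource.Material);
device.SetRenderState(RenderStates.SpecularMaterialSource, (int)ColorSource.Material);
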
NormalizeNormals 143 Enables or disables automatic normalization of vertex normals. Set to true to enable normalization of vertex normals, or false to disable it. The default value is false.

Enabling this feature causes the system to normalize the vertex normals for vertices after transforming them to camera space, an operation that can slow down the system.
LocalViewer 142 Specifies whether to use camera-relative specular highlights or orthogonal specular highlights. Set to true to enable camera-relative specular highlights, or false to use orthogonal specular highlights. The default value is true. Applications that use orthogonal projection should specify false.
ColorVertex 141 Enables or disables per-vertex color. Set to true to enable per-vertex color, or false to disable it. The default value is true.

Enabling per-vertex color allows the system to include the color defined for individual vertices in its lighting calculations. For more information, see the following topics.

  • AmbientMaterialSource
  • DiffuseMaterialSource
  • EmissiveMaterialSource
  • SpecularMaterialSource
FogVertexMode 140 The fog formula to use for vertex fog. Valid values are from the FogMode enumeration. The default fog mode is FogMode.None. See Fog.
Ambient 139 Ambient light color. This value is a Color object that specifies the ambient color value. The default value is 0.
Lighting 137 Enables or disables Microsoft Direct3D lighting. Set to true to enable Direct3D lighting, or false to disable it. The default value is true.

Only vertices that include a vertex normal are properly lit; vertices that do not contain a vertex normal employ a dot product of 0 in all lighting calculations.
Clipping 136 Enables or disables primitive clipping by Direct3D. Set to true to enable primitive clipping, or false to disable it. The default value is true.
Wrap7 135 See Wrap0
Wrap6 134 See Wrap0
Wrap5 133 See Wrap0
Wrap4 132 See Wrap0
Wrap3 131 See Wrap0
Wrap2 130 See Wrap0
Wrap1 129 See Wrap0
Wrap0 128 Texture-wrapping behavior for multiple sets of texture coordinates.

Valid values for this render state can be any combination of WrapCoordinates.Zero (or Wrap.U), WrapCoordinates.One (or Wrap.V), WrapCoordinates.Two (or Wrap.W), and WrapCoordinates.Three. These cause the system to wrap in the direction of the first, second, third, and fourth dimensions, sometimes referred to as the s, t, r, and q directions, for a given texture.

The default value for this render state is 0 (wrapping disabled in all directions).
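For example, wrapping the first texture coordinate set in its first two dimensions could look like the following C# sketch, assuming an initialized Device named device and an int overload of SetRenderState.

// Wrap texture coordinate set 0 in the u (first) and v (second) directions.
device.SetRenderState(RenderStates.Wrap0, (int)(WrapCoordinates.Zero | WrapCoordinates.One));
// Restore the default (no wrapping in any direction).
device.SetRenderState(RenderStates.Wrap0, 0);
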
TextureFactor 60 Specifies the color used for multiple-texture blending with the TextureArgument.TFactor texture blending argument or the TextureOperation.BlendFactorAlpha texture blending operation. The default value is 0.
StencilWriteMask 59 Specifies the write mask applied to values written into the stencil buffer. The default mask is 0xFFFFFFFF.
StencilMask 58 Specifies the stencil mask to apply to the reference value and each stencil buffer entry to determine the significant bits for the stencil test. The default mask is 0xFFFFFFFF.
ReferenceStencil 57 Specifies a reference value to use for the stencil test. The default value is 0.
StencilFunction 56 Specifies the comparison function to use for the stencil test. Valid values are members of the Compare enumeration. The default value is Compare.Always.

The comparison function is used to compare the reference value to a stencil buffer entry. This comparison applies only to the bits in the reference value and stencil buffer entry that are set in the stencil mask (by StencilMask).

If the comparison is true, the stencil test passes.
StencilPass 55 Specifies the stencil operation to perform if both the stencil test and the depth test (z-test) pass. Valid values are members of the StencilOperation enumeration. The default value is StencilOperation.Keep.
StencilZBufferFail 54 Specifies the stencil operation to perform if the stencil test passes and the depth test (z-test) fails. Valid values are members of the StencilOperation enumeration. The default value is StencilOperation.Keep.
StencilFail 53 Specifies the stencil operation to perform if the stencil test fails. Valid values are from the StencilOperation enumeration. The default value is StencilOperation.Keep.
StencilEnable 52 Enables or disables stenciling. Set to true to enable stenciling, or false to disable it. The default value is false.
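The stencil states above are usually configured as a group. A minimal C# sketch, assuming an initialized Device named device, might restrict rendering to pixels whose stencil value already equals a reference value.

// Pass the stencil test only where the buffer holds the reference value; never modify the buffer.
device.SetRenderState(RenderStates.StencilEnable, true);
device.SetRenderState(RenderStates.StencilFunction, (int)Compare.Equal);
device.SetRenderState(RenderStates.ReferenceStencil, 1);
device.SetRenderState(RenderStates.StencilMask, unchecked((int)0xFFFFFFFF));
device.SetRenderState(RenderStates.StencilPass, (int)StencilOperation.Keep);
device.SetRenderState(RenderStates.StencilZBufferFail, (int)StencilOperation.Keep);
device.SetRenderState(RenderStates.StencilFail, (int)StencilOperation.Keep);
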
RangeFogEnable 48 Enables or disables range-based vertex fog. Set to true to enable range-based vertex fog, or false to use depth-based fog. The default value is false.

In range-based fog, the distance of an object from the viewer is used to compute fog effects, not the depth of the object (that is, the z-coordinate) in the scene. All fog methods work as usual, except that they use range instead of depth in the computations. See Fog.

Range is the correct factor to use for fog computations, but depth is commonly used instead because it is generally already available, and range is resource-intensive to compute. Using depth to calculate fog has the undesirable effect of making the fogginess of peripheral objects change as the viewer's eye moves; in this case, the depth changes and the range remains constant. Because no hardware currently supports per-pixel range-based fog, range correction is offered only for Vertex Fog.
FogDensity 38 Fog density for pixel or vertex fog used in exponential fog modes. Valid fog density values range from 0.0 through 1.0. The default value is 1.0. See Fog.
FogEnd 37 The depth at which pixel or vertex fog effects end for linear fog mode. The default value is 1.0f.

Depth is specified in world space for vertex fog, and in either device space [0.0, 1.0] or world space for pixel fog. For pixel fog, these values are in device space when the system uses z for fog calculations, or in world space when the system uses eye-relative fog (w-fog). See Fog.
FogStart 36 The depth at which pixel or vertex fog effects begin for linear fog mode. The default value is 0.0f.

Depth is specified in world space for vertex fog, and in either device space [0.0, 1.0] or world space for pixel fog. For pixel fog, these values are in device space when the system uses z for fog calculations, or in world space when the system uses eye-relative fog (w-fog). See Fog.
FogTableMode 35 The fog formula to use for pixel fog. Valid values are from the FogMode enumeration. The default fog mode is FogMode.None. See Fog.
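A common linear pixel-fog setup combines the fog states above. The following C# sketch assumes an initialized Device named device and that the int overload of SetRenderState accepts a packed ARGB color, as in the unmanaged API.

// Linear pixel (table) fog, fading from depth 0.5 to 0.8 in gray.
device.SetRenderState(RenderStates.FogEnable, true);
device.SetRenderState(RenderStates.FogColor, System.Drawing.Color.Gray.ToArgb());
device.SetRenderState(RenderStates.FogTableMode, (int)FogMode.Linear);
device.SetRenderState(RenderStates.FogStart, 0.5f);
device.SetRenderState(RenderStates.FogEnd, 0.8f);
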
SpecularEnable 29 Enables or disables specular highlights. Set to true to enable specular highlights, or false to disable them. The default value is false.

Specular highlights are calculated as though every vertex in the object being lit is at the object's origin. This gives the expected results as long as the object is modeled around the origin and the distance from the light to the object is relatively large. In other cases, the results are undefined.

When this state is set to true, the specular color is added to the base color after the texture cascade but before alpha blending.
FogEnable 28 Enables or disables fog blending. Set to true to enable fog blending, or false to disable it. The default value is false. See Fog.
AlphaBlendEnable 27 Set to true to enable alpha-blended transparency, or false to disable it.

The default value is false. The type of alpha blending is determined by the SourceBlend and DestinationBlend render states.
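For conventional alpha-blended transparency, SourceBlend and DestinationBlend are usually set together with this state. A minimal C# sketch, assuming an initialized Device named device:

// Standard transparency: result = src * srcAlpha + dest * (1 - srcAlpha).
device.SetRenderState(RenderStates.AlphaBlendEnable, true);
device.SetRenderState(RenderStates.SourceBlend, (int)Blend.SourceAlpha);
device.SetRenderState(RenderStates.DestinationBlend, (int)Blend.InvSourceAlpha);
device.SetRenderState(RenderStates.BlendOperation, (int)BlendOperation.Add);
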
DitherEnable 26 Enables or disables dither. Set to true to enable dithering, or false to disable it. The default value is false.
AlphaFunction 25 A member of the Compare enumeration that represents the alpha comparison function.

The default value is Compare.Always.

The AlphaFunction member enables an application to accept or reject a pixel based on its alpha value.
ReferenceAlpha 24 Specifies a reference alpha value against which pixels are tested when alpha testing is enabled. The default value is 0. Values can range from 0x00000000 to 0x000000FF.
ZBufferFunction 23 Specifies the comparison function for the z-buffer test. Valid values are members of the Compare enumeration. The default value is Compare.LessEqual.

The depth value of the pixel is compared to the depth-buffer value. If the depth value of the pixel passes the comparison function, the pixel is written.
CullMode 22 Specifies how back-facing triangles are culled, if at all. Set to a member of the Cull enumeration that specifies the culling mode. The default value is Cull.CounterClockwise.
DestinationBlend 20 Contains a member of the Blend enumeration that represents the destination blend. The default value is Blend.Zero.
SourceBlend 19 Contains a member of the Blend enumeration that represents the source blend. The default value is Blend.One.
LastPixel 16 Enables or disables drawing of the last pixel in a line. Set to true to enable drawing of the last pixel in a line, or false to prevent it. The default value is true.
AlphaTestEnable 15 Enables a per-pixel alpha test. Set to true to enable per-pixel alpha testing, or false to disable it.

If the test passes, the pixel is processed by the frame buffer. Otherwise, all frame-buffer processing is skipped for the pixel.

The test is done by comparing the incoming alpha value with the reference alpha value, using the comparison function provided by AlphaFunction. The reference alpha value is determined by the value set for ReferenceAlpha.

The default value is false.
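For example, an application might reject nearly transparent pixels of a cut-out texture with the following C# sketch (assuming an initialized Device named device).

// Keep only pixels whose alpha is greater than 0x7F.
device.SetRenderState(RenderStates.AlphaTestEnable, true);
device.SetRenderState(RenderStates.AlphaFunction, (int)Compare.Greater);
device.SetRenderState(RenderStates.ReferenceAlpha, 0x7F);
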
ZBufferWriteEnable 14 Set to true to enable writing to the depth buffer, or false to disable it.

The default value is true.

This member enables an application to prevent the system from updating the depth buffer with new depth values.

If false, depth comparisons are still made according to the render state ZBufferFunction, assuming that depth buffering is taking place, but depth values are not written to the buffer.
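A typical use is a transparent rendering pass that tests against the depth buffer without updating it, as in this minimal C# sketch (assuming an initialized Device named device).

// Depth-test transparent geometry but leave the depth buffer unchanged.
device.SetRenderState(RenderStates.ZEnable, true);
device.SetRenderState(RenderStates.ZBufferFunction, (int)Compare.LessEqual);
device.SetRenderState(RenderStates.ZBufferWriteEnable, false);
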
BlendOperationAlpha 209 Selects the arithmetic operation applied to the separate alpha blend when the SeparateAlphaBlendEnable render state is set to true.

Valid values are defined by the BlendOperation enumeration. The default value is BlendOperation.Add.

If the MiscCaps.SupportsBlendOperation device capability is not supported, BlendOperation.Add is performed.
DestinationBlendAlpha 208 A member of the Blend enumeration that represents the destination blend for the separate alpha blend.

This value is ignored unless SeparateAlphaBlendEnable is set to true.

The default value is Blend.Zero.
SourceBlendAlpha 207 A member of the Blend enumeration that represents the source blend for the separate alpha blend.

This value is ignored unless SeparateAlphaBlendEnable is set to true.

The default value is Blend.One.
SeparateAlphaBlendEnable 206 Enables or disables separate blending for the alpha channel. Set to true to enable separate alpha blending, or false to disable it. The default value is false.

When set to false, the render target blending factors and operations applied to alpha are forced to be the same as those defined for color. This mode is effectively hardwired to false on implementations that do not set MiscCaps.SupportsSeparateAlphaBlend.

The type of separate alpha blending is determined by SourceBlendAlpha and DestinationBlendAlpha.
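When the device reports MiscCaps.SupportsSeparateAlphaBlend, color and alpha can be blended independently. A minimal C# sketch, assuming an initialized Device named device:

// Blend color with standard transparency while leaving destination alpha untouched.
device.SetRenderState(RenderStates.SeparateAlphaBlendEnable, true);
device.SetRenderState(RenderStates.SourceBlend, (int)Blend.SourceAlpha);
device.SetRenderState(RenderStates.DestinationBlend, (int)Blend.InvSourceAlpha);
device.SetRenderState(RenderStates.SourceBlendAlpha, (int)Blend.Zero);
device.SetRenderState(RenderStates.DestinationBlendAlpha, (int)Blend.One);
device.SetRenderState(RenderStates.BlendOperationAlpha, (int)BlendOperation.Add);
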
DepthBias 195 Sets or retrieves the depth bias for polygons.

The DepthBias value is an integer in the range 0 through 16 that causes polygons that are physically coplanar to appear separate. Polygons with a high z-bias value appear in front of polygons with a low value, without requiring sorting for drawing order. For example, polygons with a value of 1 appear in front of polygons with a value of 0.
SrgbWriteEnable 194 Enables render-target writes to be gamma corrected to sRGB. Set to true to enable sRGB writes, or false to disable them. The default value is false.
BlendFactor 193 A Color object used as a constant blend factor during alpha blending.

The BlendFactor member is available if BlendCaps.SupportsBlendFactor is set in Caps.SourceBlendCaps or Caps.DestinationBlendCaps for the device.
ColorWriteEnable3 192 Additional ColorWriteEnable values for a device. See ColorWriteEnable.

ColorWriteEnable3 is available if MiscCaps.SupportsIndependentWriteMasks is set in Caps.PrimitiveMiscCaps for the device.
ColorWriteEnable2 191 Additional ColorWriteEnable values for a device. See ColorWriteEnable.

ColorWriteEnable2 is available if MiscCaps.SupportsIndependentWriteMasks is set in Caps.PrimitiveMiscCaps for the device.
ColorWriteEnable1 190 Additional ColorWriteEnable values for a device. See ColorWriteEnable.

ColorWriteEnable1 is available if MiscCaps.SupportsIndependentWriteMasks is set in Caps.PrimitiveMiscCaps for the device.
CounterClockwiseStencilFunction 189 The comparison function used by the counterclockwise (CCW) stencil test. Valid values are from the Compare enumeration.

The test passes if ((ref & mask) stencil_compare_function (stencil & mask)) == true.
CounterClockwiseStencilPass 188 The StencilOperation to perform if both counterclockwise (CCW) stencil and z-tests pass. Valid values are from the StencilOperation enumeration. The default value is StencilOperation.Keep.
CounterClockwiseStencilZBufferFail 187 The StencilOperation to perform if the counterclockwise (CCW) stencil test passes and the z-test fails. Valid values are from the StencilOperation enumeration. The default value is StencilOperation.Keep.
CounterClockwiseStencilFail 186 The StencilOperation to perform if the counterclockwise (CCW) stencil test fails. Valid values are from the StencilOperation enumeration. The default value is StencilOperation.Keep.
TwoSidedStencilMode 185 Enables or disables two-sided stenciling. Set to true to enable two-sided stenciling, or false to disable it.

The application should set CullMode to Cull.None to enable two-sided stencil mode. If the triangle winding order is clockwise, the Stencil* operations are used. If the winding order is counterclockwise, the CounterClockwiseStencil* operations are used. See StencilOperation.
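A common application is rendering stencil shadow volumes in a single pass. The following C# sketch is illustrative only and assumes an initialized Device named device.

// With culling disabled, clockwise faces use the Stencil* operations and
// counterclockwise faces use the CounterClockwiseStencil* operations.
device.SetRenderState(RenderStates.CullMode, (int)Cull.None);
device.SetRenderState(RenderStates.TwoSidedStencilMode, true);
device.SetRenderState(RenderStates.StencilZBufferFail, (int)StencilOperation.Increment);
device.SetRenderState(RenderStates.CounterClockwiseStencilZBufferFail, (int)StencilOperation.Decrement);
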
EnableAdaptiveTessellation 184 Enables or disables adaptive tessellation. Set to true to enable adaptive tessellation, or false to disable it. The default value is false.
AdaptiveTessellateW 183 Amount to adaptively tessellate in the w direction. The default value is 0.0f.
AdaptiveTessellateZ 182 Amount to adaptively tessellate in the z direction. The default value is 1.0f.
AdaptiveTessellateY 181 Amount to adaptively tessellate in the y direction. The default value is 0.0f.
AdaptiveTessellateX 180 Amount to adaptively tessellate in the x direction. The default value is 0.0f.
MaxTessellationLevel 179 The maximum tessellation level. The default value is 1.0f.
MinTessellationLevel 178 The minimum tessellation level. The default value is 1.0f.
AntialiasedLineEnable 176 Set to true to enable antialiasing of lines, or false to disable it. The default value is false.

The AntialiasedLineEnable member applies to triangles drawn in wireframe mode as well as line-drawing primitive types.

When rendering to a multisample render target, this render state is ignored, and all lines are rendered aliased. For antialiased line rendering in multisample render targets, use a Line object, which generates textured polygons.
SlopeScaleDepthBias 175 Used to determine how much bias can be applied to coplanar primitives to reduce z-fighting. The default value is 0.

bias = (max * SlopeScaleDepthBias) + DepthBias

where max is the maximum depth slope of the triangle being rendered. See Depth Buffers.
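For example, decal polygons that are coplanar with the surface beneath them can be biased toward the viewer. A minimal C# sketch, assuming an initialized Device named device and SetRenderState overloads for int and float values; the values shown are illustrative only.

// Bias decal polygons so they render in front of the coplanar base geometry.
device.SetRenderState(RenderStates.DepthBias, 1);               // bias value as described above
device.SetRenderState(RenderStates.SlopeScaleDepthBias, 1.0f);  // scales with the polygon's depth slope
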
ScissorTestEnable 174 Enables or disables scissor testing. Set to true to enable scissor testing, or false to disable it. The default value is false.
NormalDegree 173 The degree of interpolation (linear, cubic, quadratic, or quintic) using the N-patch normal. Valid values are from the DegreeType enumeration. The default value is DegreeType.Linear.
PositionDegree 172 The N-patch position interpolation degree. Valid values are from the DegreeType enumeration that specifies the degree. The default value is DegreeType.Cubic.

The DegreeType.Linear value also can be used.
BlendOperation 171 Selects the arithmetic operation to apply when the alpha blend render state, AlphaBlendEnable, is set to true.

Valid values are defined by the BlendOperation enumeration. The default value is BlendOperation.Add.

If the MiscCaps.SupportsBlendOperation device capability is not supported, BlendOperation.Add is performed.
TweenFactor 170 Specifies a floating-point value that controls the tween factor. The default value is 0.0f.
ColorWriteEnable 168 Enables a per-channel write for the render target color buffer. Valid values for this render state can be any combination of ColorWriteEnable enumeration members.
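For example, to update only the red and green channels of the render target, an application could combine the per-channel flags, as in this C# sketch (assuming an initialized Device named device, an int overload of SetRenderState, and Red and Green flags on the ColorWriteEnable enumeration).

// Mask off writes to the blue and alpha channels.
device.SetRenderState(RenderStates.ColorWriteEnable, (int)(ColorWriteEnable.Red | ColorWriteEnable.Green));
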
IndexedVertexBlendEnable 167 Enables or disables indexed vertex blending. Set to true to enable indexed vertex blending, or false to disable it. The default value is false.

When indexed vertex blending is disabled and vertex blending is enabled through VertexBlend, it is equivalent to having matrix indices 0, 1, 2, and 3 in every vertex. If indexed vertex blending is enabled (set to true), the user must pass matrix indices with every vertex.
PointSizeMax 166 Specifies the maximum size to which point sprites can be set. The default value is 64.0f. The value must be less than or equal to Caps.MaxPointSize and greater than or equal to PointSizeMin.
DebugMonitorToken 165 Enables or disables the debug monitor token. Set to true to enable the debug monitor, or false to disable it. Set this state only when debugging the monitor; it is useful in debug builds only.

The default value for the debug monitor is true.
PatchEdgeStyle 163 The tessellation mode for patch edges. Valid values are from the PatchEdge enumeration. The default value is PatchEdge.Discrete.

Using the PatchEdge.Continuous tessellation mode helps reduce rendering artifacts.
MultisampleMask 162 Enables use of a multisample buffer as an accumulation buffer. The default value is 0xFFFFFFFF.

This render state allows multipass rendering of geometry, in which each pass updates a subset of samples. The state has no effect when rendering to a single sample buffer.

Each bit in the MultisampleMask, starting at the least significant bit (LSB), controls modification of one of the samples in a multisample render target. Thus, for an 8-sample render target, the low byte contains the eight write enables for each of the eight samples. If there are n multisamples and k enabled samples, the resulting intensity of the rendered image should be k/n. Each RGB component of every pixel is factored by k/n. See MultiSampleType.
MultisampleAntiAlias 161 Determines how individual samples are computed when using a multisample render target buffer.

When set to true, the multiple samples are computed so that full-scene antialiasing is performed by sampling at different sample positions for each multiple sample.

When set to false, the multiple samples are all written with the same sample value, sampled at the pixel center, which allows non-antialiased rendering to a multisample buffer. The default value is true. This render state has no effect when rendering to a single sample buffer.
PointScaleC 160 Controls the distance-based size attenuation for point primitives. The default value is 1.0f.

This render state is active only when PointScaleEnable is set to true. The range for this value is greater than or equal to 0.0f.
PointScaleB 159 Controls the distance-based size attenuation for point primitives. The default value is 1.0f.

This render state is active only when PointScaleEnable is set to true. The range for this value is greater than or equal to 0.0f.
ClipPlaneEnable 152 Enables or disables user-defined clipping planes. Valid values are any DWORD in which the status of each bit (set or not set) toggles the activation state of a corresponding user-defined clipping plane. The default value is 0.

The least significant bit (bit 0) controls the first clipping plane at index 0, and subsequent bits control the activation of clipping planes at higher indexes. If a bit is set, the system applies the appropriate clipping plane during scene rendering.
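For example, activating the user-defined clipping planes at indexes 0 and 2 corresponds to setting bits 0 and 2 of the mask, as in this C# sketch (assuming an initialized Device named device and an int overload of SetRenderState).

// Enable clipping planes 0 and 2; all other planes remain disabled.
device.SetRenderState(RenderStates.ClipPlaneEnable, (1 << 0) | (1 << 2));
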
VertexBlend 151 Specifies the number of matrices to use to perform geometry blending. Valid values are members of the VertexBlend enumeration. The default value is VertexBlend.Disable.
FogColor 34 The fog color, specified as a Color object.
ShadeMode 9 Specifies the shade mode to use for rendering. Valid values are from the ShadeMode enumeration. The default value is ShadeMode.Gouraud.
ZEnable 7 Enables or disables depth buffering. Set to true to enable depth buffering, or false to disable it. The default value is true.
PointSize 154 Specifies the size to use for point size computation in cases in which point size is not specified for each vertex. The default value is 1.0f.

This value is not used when the vertex contains a point size. The value is defined in screen space units if PointScaleEnable is set to false; otherwise, it is defined in world space units. The range for the value is greater than or equal to 0.0f.
FillMode 8 A value from the FillMode enumeration that represents the fill mode to apply. The default value is FillMode.Solid.

Enumeration Information

Namespace Microsoft.DirectX.Direct3D
Assembly Microsoft.DirectX.Direct3D (microsoft.directx.direct3d.dll)
Strong Name Microsoft.DirectX.Direct3D, Version=1.0.900.0, Culture=neutral, PublicKeyToken=d3231b57b74a1492
