intel / DRI2: When available, use DRI2GetBuffersWithFormat

This interface gives the driver two important features.  First, it can
allocate the (fake) front-buffer only when needed.  Second, it can
tell the buffer allocator the format of the buffers being allocated.  This
enables support for a back-buffer and depth-buffer with different bits
per pixel.
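
As a rough illustration (not part of this commit), a loader-side request
using the new protocol pairs each attachment token with a bits-per-pixel
value.  The sketch below assumes the attachment names from the DRI2
protocol headers and the DRI2GetBuffersWithFormat client call; the
specific formats and the front_buffer_rendering flag are made up for the
example:

    /* Sketch only: request a 32bpp back buffer and a 16bpp depth buffer,
     * and ask for the fake front buffer only when front-buffer rendering
     * has actually been observed.  Attachment/format pairs are packed as
     * consecutive entries in the attachments array.
     */
    unsigned int attachments[8];
    int i = 0, width, height, count;

    attachments[i++] = DRI2BufferBackLeft;
    attachments[i++] = 32;                     /* e.g. ARGB8888 color */

    if (front_buffer_rendering) {              /* hypothetical flag */
       attachments[i++] = DRI2BufferFakeFrontLeft;
       attachments[i++] = 32;
    }

    attachments[i++] = DRI2BufferDepth;
    attachments[i++] = 16;                     /* e.g. Z16 depth */

    DRI2Buffer *buffers =
       DRI2GetBuffersWithFormat(dpy, xdrawable, &width, &height,
                                attachments, i / 2, &count);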

Signed-off-by: Ian Romanick <ian.d.romanick@intel.com>
Reviewed-by: Kristian Høgsberg <krh@redhat.com>
Author:    Ian Romanick <ian.d.romanick@intel.com>
Date:      2009-04-20 20:56:45 -07:00
Commit:    f2272b5b2f (parent dbf87f2312)
2 changed files with 98 additions and 15 deletions


@@ -323,8 +323,18 @@ intelDrawBuffer(GLcontext * ctx, GLenum mode)
{
   if ((ctx->DrawBuffer != NULL) && (ctx->DrawBuffer->Name == 0)) {
      struct intel_context *const intel = intel_context(ctx);
      const GLboolean was_front_buffer_rendering =
         intel->is_front_buffer_rendering;

      intel->is_front_buffer_rendering = (mode == GL_FRONT_LEFT);

      /* If we weren't front-buffer rendering before but we are now, make sure
       * that the front-buffer has actually been allocated.
       */
      if (!was_front_buffer_rendering && intel->is_front_buffer_rendering) {
         intel_update_renderbuffers(intel->driContext,
                                    intel->driContext->driDrawablePriv);
      }
   }

   intel_draw_buffer(ctx, ctx->DrawBuffer);
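
The hunk above only covers the trigger for front-buffer rendering; the
actual buffer request happens when the renderbuffers are updated, and it
must fall back to the old getBuffers entry point when the loader does not
implement the new one.  A minimal sketch of that pattern, assuming the
__DRIdri2LoaderExtension layout from dri_interface.h (screen, drawable,
num_pairs, bare_attachments and loader_private are placeholder names):

    const __DRIdri2LoaderExtension *loader = screen->dri2.loader;
    __DRIbuffer *buffers;

    if (loader != NULL
        && loader->base.version > 2
        && loader->getBuffersWithFormat != NULL) {
       /* attachments holds (token, bits-per-pixel) pairs; num_pairs is
        * the number of such pairs. */
       buffers = loader->getBuffersWithFormat(drawable, &width, &height,
                                              attachments, num_pairs,
                                              &out_count, loader_private);
    } else {
       /* Loader predates the extension: rebuild the list as bare tokens
        * (no formats) and fall back to the original entry point. */
       buffers = loader->getBuffers(drawable, &width, &height,
                                    bare_attachments, num_tokens,
                                    &out_count, loader_private);
    }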