Don't use implementation-defined format with CPU consumers
author     Jesse Hall <jessehall@google.com>
           Tue, 5 Nov 2013 00:43:03 +0000 (16:43 -0800)
committer  Jesse Hall <jessehall@google.com>
           Tue, 5 Nov 2013 00:43:03 +0000 (16:43 -0800)
commit     497ba0e08503806571b52ebe27cc7eee4c0e71a7
tree       0edeb7b6cce3fa669fb45be3ef3a1dd6febde936
parent     40da5283ebc6b5cf1e3820740dc274c47cc55f6d
Don't use implementation-defined format with CPU consumers

If the virtual display surface is being consumed by the CPU, its
buffers can't be allocated with HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED,
since there is no way for the CPU consumer to find out what format
gralloc chose. So for CPU-consumer surfaces, just use the BufferQueue's
default format, which can be set by the consumer.
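
As a concrete illustration (not the commit's actual diff), here is a
minimal sketch of that format selection, assuming an ANativeWindow-style
sink that can be interrogated with query(); chooseOutputFormat and the
surrounding plumbing are hypothetical:

    #include <system/graphics.h>
    #include <system/window.h>
    #include <hardware/gralloc.h>

    // If the sink's consumer usage includes CPU read/write bits, a CPU
    // consumer will read these buffers, so avoid
    // HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED and fall back to the
    // format the consumer set as the BufferQueue default.
    static int chooseOutputFormat(ANativeWindow* sink) {
        int sinkUsage = 0;
        sink->query(sink, NATIVE_WINDOW_CONSUMER_USAGE_BITS, &sinkUsage);
        if (sinkUsage & (GRALLOC_USAGE_SW_READ_MASK |
                         GRALLOC_USAGE_SW_WRITE_MASK)) {
            int sinkFormat = 0;
            sink->query(sink, NATIVE_WINDOW_FORMAT, &sinkFormat);
            return sinkFormat;  // a format the CPU consumer already knows
        }
        // No CPU access: let gralloc pick whatever suits the hardware.
        return HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED;
    }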

A better but more invasive change would be to let the consumer require
a certain format (or set of formats?), and disallow the producer from
requesting a different format.
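
To sketch what that alternative might look like: a consumer-declared
format constraint checked on the producer path. FormatConstraint,
resolve(), and FORMAT_ANY are invented names for this sketch, not part
of the real BufferQueue API:

    #include <errno.h>

    enum { FORMAT_ANY = 0 };

    struct FormatConstraint {
        int requiredFormat = FORMAT_ANY;  // declared once by the consumer

        // Producer path (e.g. inside dequeueBuffer): reconcile the
        // producer's requested format with the consumer's requirement.
        int resolve(int requestedFormat, int* outFormat) const {
            if (requiredFormat == FORMAT_ANY) {
                *outFormat = requestedFormat;  // no constraint: old behavior
                return 0;
            }
            if (requestedFormat != FORMAT_ANY &&
                    requestedFormat != requiredFormat) {
                return -EINVAL;  // producer asked for a disallowed format
            }
            *outFormat = requiredFormat;
            return 0;
        }
    };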

Bug: 11479817
Change-Id: I5b20ee6ac1146550e8799b806e14661d279670c0
services/surfaceflinger/DisplayHardware/VirtualDisplaySurface.cpp
services/surfaceflinger/DisplayHardware/VirtualDisplaySurface.h