Message ID | 20220805035904.59799-2-philipl@overt.org |
---|---|
State | New |
Series | lavc/vaapi: More 4:4:4 changes |
Context | Check | Description |
---|---|---|
yinshiyou/make_loongarch64 | success | Make finished |
yinshiyou/make_fate_loongarch64 | success | Make fate finished |
andriy/make_x86 | success | Make finished |
andriy/make_fate_x86 | success | Make fate finished |
On Thu, 2022-08-04 at 20:59 -0700, Philip Langdale wrote:
> vaapi_decode_find_best_format currently does not set the
> VA_SURFACE_ATTRIB_SETTABLE flag on the pixel format attribute that it
> returns.
>
> Without this flag, the attribute will be ignored by vaCreateSurfaces,
> meaning that the driver's default logic for picking a pixel format
> will kick in.
>
> So far, this hasn't produced visible problems, but when trying to
> decode 4:4:4 content, at least on Intel, the driver will pick the
> 444P planar format, even though the decoder can only return the AYUV
> packed format.
>
> The hwcontext_vaapi code that sets surface attributes when picking
> formats does not have this bug.
>
> Applications may use their own logic for finding the best format, and
> so may not hit this bug. eg: mpv is unaffected.
>
> Signed-off-by: Philip Langdale <philipl@overt.org>
> ---
>  libavcodec/vaapi_decode.c | 1 +
>  1 file changed, 1 insertion(+)
>
> diff --git a/libavcodec/vaapi_decode.c b/libavcodec/vaapi_decode.c
> index db48efc3ed..38813eb8e4 100644
> --- a/libavcodec/vaapi_decode.c
> +++ b/libavcodec/vaapi_decode.c
> @@ -358,6 +358,7 @@ static int vaapi_decode_find_best_format(AVCodecContext *avctx,
>
>      ctx->pixel_format_attribute = (VASurfaceAttrib) {
>          .type          = VASurfaceAttribPixelFormat,
> +        .flags         = VA_SURFACE_ATTRIB_SETTABLE,

Better to fill .value.type with VAGenericValueTypeInteger together:
https://github.com/intel/media-driver/blob/4c95e8ef1e98cac661412d02f108e4e1c94d3556/media_driver/linux/common/ddi/media_libva.cpp#L2780

Thanks
Fei

>          .value.value.i = best_fourcc,
>      };
On Fri, 5 Aug 2022 05:16:19 +0000, "Wang, Fei W"
<fei.w.wang-at-intel.com@ffmpeg.org> wrote:

> Better to fill .value.type with VAGenericValueTypeInteger together:
> https://github.com/intel/media-driver/blob/4c95e8ef1e98cac661412d02f108e4e1c94d3556/media_driver/linux/common/ddi/media_libva.cpp#L2780

Thanks! I forgot about that.

--phil
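Combining the patch below with Fei's suggestion, the fully-populated attribute would look roughly like this. This is a sketch of the expected follow-up, not a quote from a v2 patch; `ctx` and `best_fourcc` are the names used in vaapi_decode_find_best_format:

    /* Pixel format attribute with both the SETTABLE flag and an explicit
     * generic value type; per the media-driver link above, the driver
     * inspects value.type when validating the attribute. */
    ctx->pixel_format_attribute = (VASurfaceAttrib) {
        .type          = VASurfaceAttribPixelFormat,
        .flags         = VA_SURFACE_ATTRIB_SETTABLE,
        .value.type    = VAGenericValueTypeInteger,
        .value.value.i = best_fourcc,
    };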
diff --git a/libavcodec/vaapi_decode.c b/libavcodec/vaapi_decode.c
index db48efc3ed..38813eb8e4 100644
--- a/libavcodec/vaapi_decode.c
+++ b/libavcodec/vaapi_decode.c
@@ -358,6 +358,7 @@ static int vaapi_decode_find_best_format(AVCodecContext *avctx,
 
     ctx->pixel_format_attribute = (VASurfaceAttrib) {
         .type          = VASurfaceAttribPixelFormat,
+        .flags         = VA_SURFACE_ATTRIB_SETTABLE,
         .value.value.i = best_fourcc,
     };
vaapi_decode_find_best_format currently does not set the
VA_SURFACE_ATTRIB_SETTABLE flag on the pixel format attribute that it
returns.

Without this flag, the attribute will be ignored by vaCreateSurfaces,
meaning that the driver's default logic for picking a pixel format will
kick in.

So far, this hasn't produced visible problems, but when trying to
decode 4:4:4 content, at least on Intel, the driver will pick the
444P planar format, even though the decoder can only return the AYUV
packed format.

The hwcontext_vaapi code that sets surface attributes when picking
formats does not have this bug.

Applications may use their own logic for finding the best format, and
so may not hit this bug, e.g. mpv is unaffected.

Signed-off-by: Philip Langdale <philipl@overt.org>
---
 libavcodec/vaapi_decode.c | 1 +
 1 file changed, 1 insertion(+)
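For context, this is roughly how such an attribute reaches the driver. A minimal sketch of direct libva usage, not code from the patch; `display`, `width`, and `height` are assumed to be a valid VADisplay and surface dimensions:

    #include <va/va.h>

    /* Request packed 4:4:4 (AYUV) surfaces explicitly. Without
     * VA_SURFACE_ATTRIB_SETTABLE in .flags, vaCreateSurfaces still
     * succeeds, but the driver may ignore the attribute and allocate
     * its default format -- the 444P-vs-AYUV mismatch described above. */
    VASurfaceAttrib attrib = {
        .type          = VASurfaceAttribPixelFormat,
        .flags         = VA_SURFACE_ATTRIB_SETTABLE,
        .value.type    = VAGenericValueTypeInteger,
        .value.value.i = VA_FOURCC_AYUV,
    };

    VASurfaceID surface;
    VAStatus vas = vaCreateSurfaces(display, VA_RT_FORMAT_YUV444,
                                    width, height, &surface, 1,
                                    &attrib, 1);
    if (vas != VA_STATUS_SUCCESS) {
        /* handle allocation failure */
    }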