Message ID: 20230301000126.49666-1-stefasab@gmail.com
State:      New
Series:     [FFmpeg-devel] lavfi/buffersrc: issue more specific error in case of invalid parameters

CI checks:
  yinshiyou/make_loongarch64: fail (Make failed)
  andriy/make_x86:            fail (Make failed)
On date Wednesday 2023-03-01 01:01:26 +0100, Stefano Sabatini wrote:
> ---
>  libavfilter/buffersrc.c | 13 ++++++++++---
>  1 file changed, 10 insertions(+), 3 deletions(-)
>
> diff --git a/libavfilter/buffersrc.c b/libavfilter/buffersrc.c
> index ba17450b93..ea9556d691 100644
> --- a/libavfilter/buffersrc.c
> +++ b/libavfilter/buffersrc.c
> @@ -273,9 +273,16 @@ static av_cold int init_video(AVFilterContext *ctx)
>  {
>      BufferSourceContext *c = ctx->priv;
>
> -    if (c->pix_fmt == AV_PIX_FMT_NONE || !c->w || !c->h ||
> -        av_q2d(c->time_base) <= 0) {
> -        av_log(ctx, AV_LOG_ERROR, "Invalid parameters provided.\n");
> +    if (c->pix_fmt == AV_PIX_FMT_NONE) {
> +        av_log(ctx, AV_LOG_ERROR, "Unspecified pixel format\n");
> +        return AVERROR(EINVAL);
> +    }
> +    if (!c->w || !c->h) {
> +        av_log(ctx, AV_LOG_ERROR, "Invalid null size %dx%d\n", c->w, c->h);
> +        return AVERROR(EINVAL);
> +    }
> +    if (av_q2d(c->time_base) <= 0) {
> +        av_log(ctx, AV_LOG_ERROR, "Invalid time base %d/%d\n", c->time_base.num, c->time_base.den);
>          return AVERROR(EINVAL);
>      }

BTW, I noticed this as part of debugging transcode.c (which looks broken at the moment): the timebase is read as 0/1 from the decoder context, while it would be a valid value when read from the AVStream (but this information is not copied by avcodec_parameters_to_context). In decode_filter_video.c this is indeed "fixed" by copying the timebase directly from the AVStream.

Is this expected? Shouldn't the timebase be copied to the decoder context?
Quoting Stefano Sabatini (2023-03-01 01:05:29)
> On date Wednesday 2023-03-01 01:01:26 +0100, Stefano Sabatini wrote:
> > ---
> >  libavfilter/buffersrc.c | 13 ++++++++++---
> >  1 file changed, 10 insertions(+), 3 deletions(-)
[...]
> BTW, I noticed this as part of debugging transcode.c (which looks
> broken at the moment): the timebase is read as 0/1 from the decoder
> context, while it would be a valid value when read from the AVStream
> (but this information is not copied by avcodec_parameters_to_context).
> In decode_filter_video.c this is indeed "fixed" by copying the
> timebase directly from the AVStream.
>
> Is this expected? Shouldn't the timebase be copied to the decoder
> context?

Historically, AVCodecContext.time_base for decoding was NOT (as one might expect) the timebase of input packets, set by the user. It was instead the inverse of the framerate stored in codec-level headers, which was called the "codec timebase" by some documents.

Since that was massively confusing for pretty much everyone, I added AVCodecContext.framerate for exporting the framerate from the decoder, and deprecated the use of AVCodecContext.time_base for decoding entirely. After the recent major bump, time_base should not be used at all in any way when decoding.

The timebase of input packets should instead be stored in AVCodecContext.pkt_timebase. I suppose after some time has passed we might want to merge its functionality into time_base.
Quoting Stefano Sabatini (2023-03-01 01:01:26)
> ---
>  libavfilter/buffersrc.c | 13 ++++++++++---
>  1 file changed, 10 insertions(+), 3 deletions(-)
>
> diff --git a/libavfilter/buffersrc.c b/libavfilter/buffersrc.c
> index ba17450b93..ea9556d691 100644
> --- a/libavfilter/buffersrc.c
> +++ b/libavfilter/buffersrc.c
> @@ -273,9 +273,16 @@ static av_cold int init_video(AVFilterContext *ctx)
>  {
>      BufferSourceContext *c = ctx->priv;
>
> -    if (c->pix_fmt == AV_PIX_FMT_NONE || !c->w || !c->h ||
> -        av_q2d(c->time_base) <= 0) {
> -        av_log(ctx, AV_LOG_ERROR, "Invalid parameters provided.\n");
> +    if (c->pix_fmt == AV_PIX_FMT_NONE) {
> +        av_log(ctx, AV_LOG_ERROR, "Unspecified pixel format\n");
> +        return AVERROR(EINVAL);
> +    }
> +    if (!c->w || !c->h) {
> +        av_log(ctx, AV_LOG_ERROR, "Invalid null size %dx%d\n", c->w, c->h);
                                             ^^^^

I don't know what a "null size" is; just drop the word.

Otherwise looks good.
On date Wednesday 2023-03-01 15:33:51 +0100, Anton Khirnov wrote:
> Quoting Stefano Sabatini (2023-03-01 01:05:29)
[...]
> > Is this expected? Shouldn't the timebase be copied to the decoder
> > context?
>
> Historically, AVCodecContext.time_base for decoding was NOT (as one
> might expect) the timebase of input packets, set by the user. It was
> instead the inverse of the framerate stored in codec-level headers,
> which was called the "codec timebase" by some documents.
>
> Since that was massively confusing for pretty much everyone, I added
> AVCodecContext.framerate for exporting the framerate from the decoder,
> and deprecated the use of AVCodecContext.time_base for decoding
> entirely. After the recent major bump, time_base should not be used at
> all in any way when decoding.
>
> The timebase of input packets should instead be stored in
> AVCodecContext.pkt_timebase. I suppose after some time has passed we
> might want to merge its functionality into time_base.

Makes sense; for the time being I understand the correct solution is to use pkt_timebase. Thank you.
diff --git a/libavfilter/buffersrc.c b/libavfilter/buffersrc.c
index ba17450b93..ea9556d691 100644
--- a/libavfilter/buffersrc.c
+++ b/libavfilter/buffersrc.c
@@ -273,9 +273,16 @@ static av_cold int init_video(AVFilterContext *ctx)
 {
     BufferSourceContext *c = ctx->priv;
 
-    if (c->pix_fmt == AV_PIX_FMT_NONE || !c->w || !c->h ||
-        av_q2d(c->time_base) <= 0) {
-        av_log(ctx, AV_LOG_ERROR, "Invalid parameters provided.\n");
+    if (c->pix_fmt == AV_PIX_FMT_NONE) {
+        av_log(ctx, AV_LOG_ERROR, "Unspecified pixel format\n");
+        return AVERROR(EINVAL);
+    }
+    if (!c->w || !c->h) {
+        av_log(ctx, AV_LOG_ERROR, "Invalid null size %dx%d\n", c->w, c->h);
+        return AVERROR(EINVAL);
+    }
+    if (av_q2d(c->time_base) <= 0) {
+        av_log(ctx, AV_LOG_ERROR, "Invalid time base %d/%d\n", c->time_base.num, c->time_base.den);
         return AVERROR(EINVAL);
     }