Message ID: 20200922034740.1478-1-pkoshevoy@gmail.com
State: Accepted
Commit: 5bbf58ab876279ca1a5a2f30563f271c99b93e62
Series: [FFmpeg-devel] avfilter/setparams: add FF_FILTER_FLAG_HWFRAME_AWARE

Context | Check | Description
---|---|---
andriy/default | pending |
andriy/make | success | Make finished
andriy/make_fate | success | Make fate finished
On Mon, Sep 21, 2020 at 09:47:40PM -0600, Pavel Koshevoy wrote:
> Allow setparams to be used with hw backed frames and
> avoid an assertion failure in avfilter_config_links.
> ---
>  libavfilter/vf_setparams.c | 3 +++
>  1 file changed, 3 insertions(+)
>

LGTM

> diff --git a/libavfilter/vf_setparams.c b/libavfilter/vf_setparams.c
> index 689097fac0..72a69e3fc2 100644
> --- a/libavfilter/vf_setparams.c
> +++ b/libavfilter/vf_setparams.c
> @@ -169,6 +169,7 @@ AVFilter ff_vf_setparams = {
>      .priv_class      = &setparams_class,
>      .inputs          = inputs,
>      .outputs         = outputs,
> +    .flags_internal  = FF_FILTER_FLAG_HWFRAME_AWARE,
>  };
>
>  #if CONFIG_SETRANGE_FILTER
> @@ -208,6 +209,7 @@ AVFilter ff_vf_setrange = {
>      .priv_class      = &setrange_class,
>      .inputs          = inputs,
>      .outputs         = outputs,
> +    .flags_internal  = FF_FILTER_FLAG_HWFRAME_AWARE,
>  };
>  #endif /* CONFIG_SETRANGE_FILTER */
>
> @@ -242,5 +244,6 @@ AVFilter ff_vf_setfield = {
>      .priv_class      = &setfield_class,
>      .inputs          = inputs,
>      .outputs         = outputs,
> +    .flags_internal  = FF_FILTER_FLAG_HWFRAME_AWARE,
>  };
>  #endif /* CONFIG_SETFIELD_FILTER */
> --
> 2.26.2
>
> _______________________________________________
> ffmpeg-devel mailing list
> ffmpeg-devel@ffmpeg.org
> https://ffmpeg.org/mailman/listinfo/ffmpeg-devel
>
> To unsubscribe, visit link above, or email
> ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".
On Wed, Sep 23, 2020 at 1:32 PM Paul B Mahol <onemda@gmail.com> wrote:
> On Mon, Sep 21, 2020 at 09:47:40PM -0600, Pavel Koshevoy wrote:
> > Allow setparams to be used with hw backed frames and
> > avoid an assertion failure in avfilter_config_links.
> > ---
> > libavfilter/vf_setparams.c | 3 +++
> > 1 file changed, 3 insertions(+)
>
> LGTM

applied, pushed.

Thank you,
    Pavel.
On 28/09/2020 02:17, Pavel Koshevoy wrote:
> On Wed, Sep 23, 2020 at 1:32 PM Paul B Mahol <onemda@gmail.com> wrote:
>> On Mon, Sep 21, 2020 at 09:47:40PM -0600, Pavel Koshevoy wrote:
>>> Allow setparams to be used with hw backed frames and
>>> avoid an assertion failure in avfilter_config_links.
>>> ---
>>> libavfilter/vf_setparams.c | 3 +++
>>> 1 file changed, 3 insertions(+)
>>
>> LGTM
>
> applied, pushed.

I reverted this: setparams is not hwframe aware, so adding the flag does not make sense. It broke all attempts to use the filter with hardware frames, because the default context passthrough is required.

E.g.

$ ./ffmpeg_g -y -i in.mp4 -init_hw_device vaapi:/dev/dri/renderD128 -an -vf hwupload,setparams=range=full,hwdownload out.mp4
...
[hwdownload @ 0x55916d90a580] The input must have a hardware frame reference.
[Parsed_hwdownload_2 @ 0x55916dcab140] Failed to configure input pad on Parsed_hwdownload_2
Error reinitializing filters!
Failed to inject frame into filter network: Invalid argument
Error while processing the decoded data for stream #0:0
Conversion failed!

Perhaps it would help if you explained what your original problem was, because it definitely wasn't this.

- Mark
On Tue, Sep 29, 2020 at 10:09 AM Mark Thompson <sw@jkqxz.net> wrote:
> On 28/09/2020 02:17, Pavel Koshevoy wrote:
>
> <snip>
>
> I reverted this: setparams is not hwframe aware so adding the flag does
> not make sense. It broke all attempts to use it with hardware frames,
> because the default context passthrough is required.
>
> Perhaps it would help if you explained what your original problem was,
> because it definitely wasn't this.
>
> - Mark

It's pretty much this use case, except I'm not using the ffmpeg cli but the avfilter api to configure the filter chain, and I'm working with AV_PIX_FMT_CUDA frames.
Perhaps I am mis-using the api, but my patch was sufficient for my needs:

```
bool
VideoFilterChain::setup_filter_links(int num_threads)
{
    graph_ = avfilter_graph_alloc();
    graph_->nb_threads = num_threads;
    int err = avfilter_graph_parse2(graph_, filters_.c_str(), &in_, &out_);
    UL_FAIL_IF_AVERROR(err);

    if (hw_frames_.ref_)
    {
        AVHWFramesContext * hw_frames_ctx = hw_frames_.get<AVHWFramesContext>();
        AVBufferRef * device_ref = hw_frames_ctx->device_ref;

        for (int i = 0; i < graph_->nb_filters; i++)
        {
            AVFilterContext * filter_ctx = graph_->filters[i];
            UL_ASSERT(!filter_ctx->hw_device_ctx);
            filter_ctx->hw_device_ctx = av_buffer_ref(device_ref);

            bool found_hwdownload =
                strcmp(filter_ctx->filter->name, "hwdownload") == 0;
            if (found_hwdownload)
            {
                break;
            }

            for (int j = 0; j < filter_ctx->nb_outputs; j++)
            {
                AVFilterLink * link = filter_ctx->outputs[j];
                UL_ASSERT(!link->hw_frames_ctx);
                link->hw_frames_ctx = av_buffer_ref(hw_frames_.ref_);
            }
        }
    }

    err = avfilter_graph_config(graph_, NULL);
    UL_FAIL_IF_AVERROR(err);

    src_ = lookup_src(graph_->nb_filters ? graph_->filters[0] : NULL, "buffer");
    sink_ = lookup_sink(src_, "buffersink");
    UL_FAIL_UNLESS(src_ && sink_);

    return true;
}
```

the filters_ data member could contain the setparams filter, like this:

```
if (src_pix_fmt != dst_pix_fmt)
{
    oss << ",format=pix_fmts=" << dst_pix_fmt_txt;
}

if (!(same_color_specs || csp_xform_.zscale_transform()))
{
    // a bold faced lie ... it's either that
    // or do the actual colorspace conversion:
    oss << ",setparams"
        << "=range=" << av_color_range_name(dst_specs_.color_range)
        << ":color_primaries=" << av_color_primaries_name(dst_specs_.color_primaries)
        << ":color_trc=" << to_colorspace_trc_name(dst_specs_.color_trc)
        << ":colorspace=" << av_color_space_name(dst_specs_.colorspace);
}

oss << ",buffersink";

filters_ = oss.str().c_str();
```
On 29/09/2020 18:14, Pavel Koshevoy wrote:
> On Tue, Sep 29, 2020 at 10:09 AM Mark Thompson <sw@jkqxz.net> wrote:
>
> <snip>
>
> It's pretty much this use case, except I'm not using ffmpeg cli but the
> avfilter api to configure the filter chain, and I'm working with
> AV_PIX_FMT_CUDA frames.
> Perhaps I am mis-using the api, but my patch was sufficient for my needs:
>
> <snip>
>
>             for (int j = 0; j < filter_ctx->nb_outputs; j++)
>             {
>                 AVFilterLink * link = filter_ctx->outputs[j];
>                 UL_ASSERT(!link->hw_frames_ctx);
>                 link->hw_frames_ctx = av_buffer_ref(hw_frames_.ref_);

http://git.videolan.org/?p=ffmpeg.git;a=blob;f=libavfilter/avfilter.h;h=99297ae798aa325ac37836a3a90d9a3f8e1e7a95;hb=HEAD#l497

Don't write to internal fields.

I'm not sure exactly what you're trying to do by writing the internal context fields in this section, but I suspect that if you just remove it entirely then the expected context propagation will happen and it will work.

<snip>

- Mark
On Tue, Sep 29, 2020 at 12:14 PM Mark Thompson <sw@jkqxz.net> wrote:
> On 29/09/2020 18:14, Pavel Koshevoy wrote:
>
> <snip>
>
> Don't write to internal fields.
>
> I'm not sure exactly what you're trying to do by writing the internal
> context fields in this section, but I suspect that if you just remove it
> entirely then the expected context propagation will happen and it will work.

Okay, if I fix my API mis-use and context propagation happens automagically (idk where it would get the context in the 1st place, yet)... won't that still leave me with a setparams that is not FF_FILTER_FLAG_HWFRAME_AWARE, and wouldn't avfilter_config_links still fail?

Thank you,
    Pavel.
On 29/09/2020 19:49, Pavel Koshevoy wrote:
> On Tue, Sep 29, 2020 at 12:14 PM Mark Thompson <sw@jkqxz.net> wrote:
>
> <snip>
>
> Okay, if I fix my API mis-use and context propagation happens
> automagically (idk where it would get the context in the 1st place,
> yet)...

If you are putting hardware frames into a filter graph then it comes from buffersrc. If you are using hwupload inside the filter graph then it gets made there from the device you provide as AVFilterContext.hw_device_ctx.

> won't that still leave me with setparams that is not
> FF_FILTER_FLAG_HWFRAME_AWARE and avfilter_config_links would still fail?

Why would it fail?

- Mark
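[Editorial note: the device-propagation path Mark describes can be sketched roughly as below. This is an illustrative sketch, not code from the thread; the function name and the `graph`/`device_ref` variables are hypothetical, and it assumes the device is attached after avfilter_graph_parse2() but before avfilter_graph_config(), which is where hwupload creates its hardware frames context.]

```c
/* Sketch (assumption, not from the thread): hand each filter in a parsed
 * graph a hardware device reference so that filters such as hwupload can
 * create their own AVHWFramesContext during configuration.
 * `graph` is an AVFilterGraph* returned by avfilter_graph_parse2();
 * `device_ref` is an AVBufferRef* to an AVHWDeviceContext. */
#include <libavfilter/avfilter.h>
#include <libavutil/buffer.h>
#include <libavutil/error.h>

static int attach_device_and_configure(AVFilterGraph *graph,
                                       AVBufferRef *device_ref)
{
    for (unsigned i = 0; i < graph->nb_filters; i++) {
        AVFilterContext *fc = graph->filters[i];
        if (!fc->hw_device_ctx) {
            /* hw_device_ctx is a public AVFilterContext field; links'
             * hw_frames_ctx, by contrast, is internal and must not be
             * written by the caller. */
            fc->hw_device_ctx = av_buffer_ref(device_ref);
            if (!fc->hw_device_ctx)
                return AVERROR(ENOMEM);
        }
    }
    /* frames contexts then propagate along the links during config */
    return avfilter_graph_config(graph, NULL);
}
```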
On Thu, Oct 1, 2020 at 4:24 PM Mark Thompson <sw@jkqxz.net> wrote:
> On 29/09/2020 19:49, Pavel Koshevoy wrote:
>
> <snip>
>
> If you are putting hardware frames into a filter graph then it comes from
> buffersrc. If you are using hwupload inside the filter graph then it gets
> made there from the device you provide as AVFilterContext.hw_device_ctx.
>
> > won't that still leave me with setparams that is not
> > FF_FILTER_FLAG_HWFRAME_AWARE and avfilter_config_links would still fail?
>
> Why would it fail?

I guess I no longer know how to use the avfilter api. I've been doing this:

```
graph_ = avfilter_graph_alloc();

if (nb_threads_ > 0)
{
    graph_->nb_threads = nb_threads_;
}

int err = avfilter_graph_parse2(graph_, filters_.c_str(), &in_, &out_);
```

where filters_ is something like this:

"buffer=width=1920:height=800:pix_fmt=cuda:frame_rate=60/1:time_base=1/90000:sar=1/1,hwdownload,format=pix_fmts=yuv420p,scale=w=128:h=47,setsar=sar=0.888889,buffersink"

I don't see where to pass in the hwframes context. AVFilterGraph makes no mention of it, and avfilter_graph_parse2 has no parameter for it.
On Sun, Mar 14, 2021 at 1:59 PM Pavel Koshevoy <pkoshevoy@gmail.com> wrote:
>
> I guess I no longer know how to use the avfilter api.
>
> <snip>
>
> I don't see where to pass in the hwframes context. AVFilterGraph makes no
> mention of it, and avfilter_graph_parse2 has no parameter for it.

nvm, after looking through fftools/ffmpeg_hw.c and ffmpeg_filter.c I think I see it now -- I need to call av_buffersrc_parameters_set and pass hw_frames_ctx through AVBufferSrcParameters.

Pavel.
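[Editorial note: the av_buffersrc_parameters_set() approach Pavel arrives at can be sketched roughly as below. This is an illustrative sketch, not code from the thread; the helper name and the `buffersrc_ctx`/`hw_frames_ref` variables are hypothetical.]

```c
/* Sketch (assumption, not from the thread): pass an existing hardware
 * frames context to the buffer source filter instance via
 * AVBufferSrcParameters, before avfilter_graph_config().
 * `buffersrc_ctx` is the AVFilterContext of the "buffer" filter;
 * `hw_frames_ref` is an AVBufferRef* to an AVHWFramesContext. */
#include <libavfilter/avfilter.h>
#include <libavfilter/buffersrc.h>
#include <libavutil/error.h>
#include <libavutil/mem.h>

static int set_buffersrc_hw_frames(AVFilterContext *buffersrc_ctx,
                                   AVBufferRef *hw_frames_ref)
{
    AVBufferSrcParameters *par = av_buffersrc_parameters_alloc();
    if (!par)
        return AVERROR(ENOMEM);

    /* only the fields set here override what the filter args string gave;
     * av_buffersrc_parameters_set() takes its own reference */
    par->hw_frames_ctx = hw_frames_ref;

    int ret = av_buffersrc_parameters_set(buffersrc_ctx, par);
    av_free(par);
    return ret;
}
```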
diff --git a/libavfilter/vf_setparams.c b/libavfilter/vf_setparams.c
index 689097fac0..72a69e3fc2 100644
--- a/libavfilter/vf_setparams.c
+++ b/libavfilter/vf_setparams.c
@@ -169,6 +169,7 @@ AVFilter ff_vf_setparams = {
     .priv_class      = &setparams_class,
     .inputs          = inputs,
     .outputs         = outputs,
+    .flags_internal  = FF_FILTER_FLAG_HWFRAME_AWARE,
 };
 
 #if CONFIG_SETRANGE_FILTER
@@ -208,6 +209,7 @@ AVFilter ff_vf_setrange = {
     .priv_class      = &setrange_class,
     .inputs          = inputs,
     .outputs         = outputs,
+    .flags_internal  = FF_FILTER_FLAG_HWFRAME_AWARE,
 };
 #endif /* CONFIG_SETRANGE_FILTER */
 
@@ -242,5 +244,6 @@ AVFilter ff_vf_setfield = {
     .priv_class      = &setfield_class,
     .inputs          = inputs,
     .outputs         = outputs,
+    .flags_internal  = FF_FILTER_FLAG_HWFRAME_AWARE,
 };
 #endif /* CONFIG_SETFIELD_FILTER */