
[FFmpeg-devel] lavfi/minterpolate: Allow bigger resolutions if SIZE_MAX is big

Message ID CAB0OVGpn+QnpHKwfJ9n_huWep3o_iUzq7JuqHJKgcfgsek_DDQ@mail.gmail.com
State Superseded
Series [FFmpeg-devel] lavfi/minterpolate: Allow bigger resolutions if SIZE_MAX is big

Checks

Context: andriy/ffmpeg-patchwork | Check: success | Description: Make fate finished

Commit Message

Carl Eugen Hoyos March 28, 2020, 12:54 p.m. UTC
Hi!

The attached patch allows working around ticket #7140, tested on a system
with a lot of memory.

Please review, Carl Eugen

Comments

Carl Eugen Hoyos March 30, 2020, 10:46 p.m. UTC | #1
On Sat, Mar 28, 2020 at 13:54, Carl Eugen Hoyos
<ceffmpeg@gmail.com> wrote:

> The attached patch allows working around ticket #7140, tested on a system
> with a lot of memory.

Ping.

Carl Eugen
Anton Khirnov March 31, 2020, 9:18 a.m. UTC | #2
Quoting Carl Eugen Hoyos (2020-03-28 13:54:22)
> Hi!
> 
> The attached patch allows working around ticket #7140, tested on a system
> with a lot of memory.

This looks very ad-hoc. The right thing to do is fix arbitrary limits in
av_malloc_array(). That INT_MAX there looks wrong.
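
[For reference: the check being discussed looked roughly like this in libavutil/mem.c at the time. This is a paraphrased sketch of the idea, not a verbatim copy of the FFmpeg source.]

#include <limits.h>
#include <stddef.h>

void *av_malloc(size_t size);   /* provided by libavutil */

/* av_malloc_array() rejects any request whose total size reaches
 * INT_MAX, regardless of how much memory the system actually has,
 * and returns NULL as if the allocation had failed. */
void *av_malloc_array(size_t nmemb, size_t size)
{
    if (!size || nmemb >= INT_MAX / size)
        return NULL;
    return av_malloc(nmemb * size);
}
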
Carl Eugen Hoyos March 31, 2020, 9:56 a.m. UTC | #3
On Tue, Mar 31, 2020 at 11:18, Anton Khirnov <anton@khirnov.net> wrote:
>
> Quoting Carl Eugen Hoyos (2020-03-28 13:54:22)
> > Hi!
> >
> > The attached patch allows working around ticket #7140, tested on a system
> > with a lot of memory.
>
> This looks very ad-hoc.

Is there another part of FFmpeg that rightfully allocates that much memory?

> The right thing to do is fix arbitrary limits in
> av_malloc_array().

> That INT_MAX there looks wrong.

I wonder if it is a good idea that demuxers and decoders cannot allocate
random amounts of memory...

Carl Eugen
Anton Khirnov March 31, 2020, 10:24 a.m. UTC | #4
Quoting Carl Eugen Hoyos (2020-03-31 11:56:44)
> On Tue, Mar 31, 2020 at 11:18, Anton Khirnov <anton@khirnov.net> wrote:
> >
> > Quoting Carl Eugen Hoyos (2020-03-28 13:54:22)
> > > Hi!
> > >
> > > The attached patch allows working around ticket #7140, tested on a system
> > > with a lot of memory.
> >
> > This looks very ad-hoc.
> 
> Is there another part of FFmpeg that rightfully allocates that much memory?

Something decoding really big images? 2GB is not really that much these
days.

> 
> > The right thing to do is fix arbitrary limits in
> > av_malloc_array().
> 
> > That INT_MAX there looks wrong.
> 
> I wonder if it is a good idea that demuxers and decoders cannot allocate
> random amounts of memory...

I see no valid reason why this specific function should have this
specific limit, while other memory allocation functions have different
limits. Beyond that, I don't think we should have any arbitrary limits
on what allocations are reasonable and what are not. There are many
other, more appropriate, mechanisms for limiting memory usage.
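
[For reference: one mechanism that already exists in libavutil is the per-allocation cap enforced by av_malloc(), adjustable at runtime with av_max_alloc() and exposed by the ffmpeg command-line tool as -max_alloc. A minimal sketch follows; the helper name and the cap value are only illustrative, and at the time of this thread the INT_MAX check inside av_malloc_array() was separate from this cap.]

#include <stdint.h>
#include "libavutil/mem.h"

/* Raise the cap that av_malloc() and friends enforce on a single
 * allocation (the default is INT_MAX). SIZE_MAX effectively removes
 * the cap; a concrete byte count can be passed instead. */
static void allow_large_allocations(void)
{
    av_max_alloc(SIZE_MAX);
}

[On the command line the equivalent knob is, e.g., "ffmpeg -max_alloc 4294967296 ...", where the value is a byte count.]
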
Carl Eugen Hoyos March 31, 2020, 12:11 p.m. UTC | #5
> On Mar 31, 2020, at 12:24, Anton Khirnov <anton@khirnov.net> wrote:
> 
> Quoting Carl Eugen Hoyos (2020-03-31 11:56:44)
>>> On Tue, Mar 31, 2020 at 11:18, Anton Khirnov <anton@khirnov.net> wrote:
>>> 
>>> Quoting Carl Eugen Hoyos (2020-03-28 13:54:22)
>>>> Hi!
>>>> 
>>>> The attached patch allows working around ticket #7140, tested on a system
>>>> with a lot of memory.
>>> 
>>> This looks very ad-hoc.
>> 
>> Is there another part of FFmpeg that rightfully allocates that much memory?
> 
> Something decoding really big images?

Are you sure?
A quick calculation indicated a maximum allocation of 1500MB to me; what did you have in mind?

> 2GB is not really that much these
> days.

Please describe an example.

>> 
>>> The right thing to do is fix arbitrary limits in
>>> av_malloc_array().
>> 
>>> That INT_MAX there looks wrong.
>> 
>> I wonder if it is a good idea that demuxers and decoders cannot allocate
>> random amounts of memory...
> 
> I see no valid reason why this specific function should have this
> specific limit, while other memory allocation functions have different
> limits.

I know of only one other function in FFmpeg that does this, but I believe it already respects max_alloc.

> Beyond that, I don't think we should have any arbitrary limits
> on what allocations are reasonable and what are not. There are many
> other, more appropriate, mechanisms for limiting memory usage.

I tend to disagree, especially as this does not affect any normal usage.

Carl Eugen

Patch

From c9f21979215660d78dfa3e308acf148094522a02 Mon Sep 17 00:00:00 2001
From: Carl Eugen Hoyos <ceffmpeg@gmail.com>
Date: Sat, 28 Mar 2020 13:46:05 +0100
Subject: [PATCH] lavfi/minterpolate: Allow larger allocations if SIZE_MAX is
 big.

Works around ticket #7140.
---
 libavfilter/vf_minterpolate.c | 8 +++++---
 1 file changed, 5 insertions(+), 3 deletions(-)

diff --git a/libavfilter/vf_minterpolate.c b/libavfilter/vf_minterpolate.c
index b0bb238ade..452166a2e3 100644
--- a/libavfilter/vf_minterpolate.c
+++ b/libavfilter/vf_minterpolate.c
@@ -361,9 +361,11 @@  static int config_input(AVFilterLink *inlink)
     }
 
     if (mi_ctx->mi_mode == MI_MODE_MCI) {
-        mi_ctx->pixel_mvs = av_mallocz_array(width * height, sizeof(PixelMVS));
-        mi_ctx->pixel_weights = av_mallocz_array(width * height, sizeof(PixelWeights));
-        mi_ctx->pixel_refs = av_mallocz_array(width * height, sizeof(PixelRefs));
+        if ((uint64_t)width * height < SIZE_MAX / 2 / FFMAX3(sizeof(PixelMVS), sizeof(PixelWeights), sizeof(PixelRefs))) {
+            mi_ctx->pixel_mvs = av_mallocz(width * height * sizeof(PixelMVS));
+            mi_ctx->pixel_weights = av_mallocz(width * height * sizeof(PixelWeights));
+            mi_ctx->pixel_refs = av_mallocz(width * height * sizeof(PixelRefs));
+        }
         if (!mi_ctx->pixel_mvs || !mi_ctx->pixel_weights || !mi_ctx->pixel_refs) {
             ret = AVERROR(ENOMEM);
             goto fail;
-- 
2.24.1
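
[To make the limits discussed in the thread concrete, here is a small standalone sketch contrasting the INT_MAX check in av_mallocz_array() with the SIZE_MAX-based guard added by the patch. The helper names, the 512-byte element size, and the UHD resolution are illustrative only, not the real sizeof(PixelMVS).]

#include <limits.h>
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Mirrors the av_mallocz_array() limit: the total size must stay
 * below INT_MAX, so the request passes only if nmemb < INT_MAX / size. */
static int passes_old_limit(uint64_t w, uint64_t h, size_t elem_size)
{
    return w * h < (uint64_t)INT_MAX / elem_size;
}

/* Mirrors the patch's guard: the pixel count must fit comfortably
 * into size_t for the largest of the three per-pixel structs. */
static int passes_new_guard(uint64_t w, uint64_t h, size_t elem_size)
{
    return w * h < (uint64_t)SIZE_MAX / 2 / elem_size;
}

int main(void)
{
    const size_t   elem_size = 512;      /* hypothetical per-pixel struct size */
    const uint64_t w = 3840, h = 2160;   /* UHD, as an example resolution */

    printf("old INT_MAX limit:  %s\n", passes_old_limit(w, h, elem_size) ? "ok" : "rejected");
    printf("new SIZE_MAX guard: %s\n", passes_new_guard(w, h, elem_size) ? "ok" : "rejected");
    return 0;
}

[On a 64-bit system the example resolution is rejected by the old check but accepted by the new guard, which is the behaviour change the patch aims for.]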