| Message ID | 20220611140820.105906-1-leo.izen@gmail.com |
|---|---|
| State | New |
| Series | [FFmpeg-devel] avcodec/get_bits: declare VLC table args as const |
| Context | Check | Description |
|---|---|---|
| andriy/make_x86 | success | Make finished |
| andriy/make_fate_x86 | success | Make fate finished |
| andriy/make_armv7_RPi4 | success | Make finished |
| andriy/make_fate_armv7_RPi4 | success | Make fate finished |
Leo Izen:
> Declaring the VLC table as const allows a caller to call get_vlc2()
> with a pre-generated static const table without generating warnings
> for -Wdiscarded-qualifiers.
> ---
>  libavcodec/get_bits.h | 4 ++--
>  1 file changed, 2 insertions(+), 2 deletions(-)
>
> diff --git a/libavcodec/get_bits.h b/libavcodec/get_bits.h
> index d4e9276da1..49202b0211 100644
> --- a/libavcodec/get_bits.h
> +++ b/libavcodec/get_bits.h
> @@ -775,7 +775,7 @@ static inline const uint8_t *align_get_bits(GetBitContext *s)
>
>  /* Return the LUT element for the given bitstream configuration. */
>  static inline int set_idx(GetBitContext *s, int code, int *n, int *nb_bits,
> -                          VLC_TYPE (*table)[2])
> +                          const VLC_TYPE (*table)[2])
>  {
>      unsigned idx;
>
> @@ -795,7 +795,7 @@ static inline int set_idx(GetBitContext *s, int code, int *n, int *nb_bits,
>   *                  = (max_vlc_length + bits - 1) / bits
>   * @returns the code parsed or -1 if no vlc matches
>   */
> -static av_always_inline int get_vlc2(GetBitContext *s, VLC_TYPE (*table)[2],
> +static av_always_inline int get_vlc2(GetBitContext *s, const VLC_TYPE (*table)[2],
>                                       int bits, int max_depth)
>  {
>  #if CACHED_BITSTREAM_READER

There is unfortunately an issue here: C11 6.7.3/9 contains the following:
"If the specification of an array type includes any type qualifiers, the
element type is so-qualified, not the array type."

Therefore adding const above means that the functions accept a
pointer-to-array-of-two-const-VLC_TYPE, but ordinary callers call this with
a pointer-to-array-of-two-VLC_TYPE; the automatic conversion from
pointer-to-T to pointer-to-const-T does not help you here, because it would
only give you a pointer-to-const-array-of-two-VLC_TYPE, but not a
pointer-to-an-array-of-two-const-VLC_TYPE; at least that is the prevailing
interpretation of the above part of the spec (I don't get why one does not
just use 6.7.3/9 once more to conclude that
pointer-to-const-array-of-two-VLC_TYPE is actually equivalent to
pointer-to-array-of-const-VLC_TYPE).
Older versions of GCC warned by default for such conversions (when using an
ISO standard -- it is legal in GNU C standards); current versions still do
so when compiling with -pedantic. Clang does not warn about this at all,
not even with -pedantic.

I see three ways to fix this:
a) Add a get_vlc2c that accepts const. It will have the implementation of
   the current get_vlc2; get_vlc2 meanwhile would be turned into a wrapper
   for get_vlc2c, i.e. it would solely be used to cast to the expected
   pointer type.
b) Add the necessary casts in the only user that wants to use a const table.
c) Stop using VLC_TYPE[2] altogether; use a struct { VLC_TYPE symbol, bits; }
   (it feels like this struct should actually be called VLC_TYPE). Then
   adding const works fine as usual; it would IMO be more readable, too,
   because it would be automatically documented which of the entries is
   what. This is therefore my preferred option. Would you mind if I
   implemented this or do you want to do it?

- Andreas
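A minimal standalone sketch of the conversion problem Andreas describes
(hypothetical names, not FFmpeg code; `elem` stands in for `VLC_TYPE`):

```c
/* Sketch only: shows why a parameter of type "pointer to array of two
 * const elements" does not accept an ordinary non-const table without a
 * diagnostic under a strict reading of C11 6.7.3/9. */
#include <stdint.h>

typedef int16_t elem;                       /* stand-in for VLC_TYPE */

/* Parameter as in the patch: pointer to array[2] of const elem. */
static int first_entry(const elem (*table)[2], unsigned idx)
{
    return table[idx][0];
}

int main(void)
{
    static const elem ro_table[8][2] = {{0}};  /* pre-generated const table */
    static elem rw_table[8][2];                /* ordinary runtime table    */

    first_entry(ro_table, 0);  /* exact type match: fine                     */
    first_entry(rw_table, 0);  /* elem (*)[2] -> const elem (*)[2]: the array
                                  type itself is unqualified, so this is not
                                  the usual "add qualifiers to the pointee"
                                  conversion; GCC with -pedantic (and older
                                  GCC by default) diagnoses it, while Clang
                                  accepts it silently.                       */
    return 0;
}
```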
On 6/12/22 16:01, Andreas Rheinhardt wrote:
> Leo Izen:
>> Declaring the VLC table as const allows a caller to call get_vlc2()
>> with a pre-generated static const table without generating warnings
>> for -Wdiscarded-qualifiers.
>> ---
>>  libavcodec/get_bits.h | 4 ++--
>>  1 file changed, 2 insertions(+), 2 deletions(-)
>>
>> diff --git a/libavcodec/get_bits.h b/libavcodec/get_bits.h
>> index d4e9276da1..49202b0211 100644
>> --- a/libavcodec/get_bits.h
>> +++ b/libavcodec/get_bits.h
>> @@ -775,7 +775,7 @@ static inline const uint8_t *align_get_bits(GetBitContext *s)
>>
>>  /* Return the LUT element for the given bitstream configuration. */
>>  static inline int set_idx(GetBitContext *s, int code, int *n, int *nb_bits,
>> -                          VLC_TYPE (*table)[2])
>> +                          const VLC_TYPE (*table)[2])
>>  {
>>      unsigned idx;
>>
>> @@ -795,7 +795,7 @@ static inline int set_idx(GetBitContext *s, int code, int *n, int *nb_bits,
>>   *                  = (max_vlc_length + bits - 1) / bits
>>   * @returns the code parsed or -1 if no vlc matches
>>   */
>> -static av_always_inline int get_vlc2(GetBitContext *s, VLC_TYPE (*table)[2],
>> +static av_always_inline int get_vlc2(GetBitContext *s, const VLC_TYPE (*table)[2],
>>                                       int bits, int max_depth)
>>  {
>>  #if CACHED_BITSTREAM_READER
>
> There is unfortunately an issue here: C11 6.7.3/9 contains the following:
> "If the specification of an array type includes any type qualifiers, the
> element type is so-qualified, not the array type."
>
> Therefore adding const above means that the functions accept a
> pointer-to-array-of-two-const-VLC_TYPE, but ordinary callers call this with
> a pointer-to-array-of-two-VLC_TYPE; the automatic conversion from
> pointer-to-T to pointer-to-const-T does not help you here, because it would
> only give you a pointer-to-const-array-of-two-VLC_TYPE, but not a
> pointer-to-an-array-of-two-const-VLC_TYPE; at least that is the prevailing
> interpretation of the above part of the spec (I don't get why one does not
> just use 6.7.3/9 once more to conclude that
> pointer-to-const-array-of-two-VLC_TYPE is actually equivalent to
> pointer-to-array-of-const-VLC_TYPE).
> Older versions of GCC warned by default for such conversions (when using an
> ISO standard -- it is legal in GNU C standards); current versions still do
> so when compiling with -pedantic. Clang does not warn about this at all,
> not even with -pedantic.
>
> I see three ways to fix this:
> a) Add a get_vlc2c that accepts const. It will have the implementation of
>    the current get_vlc2; get_vlc2 meanwhile would be turned into a wrapper
>    for get_vlc2c, i.e. it would solely be used to cast to the expected
>    pointer type.
> b) Add the necessary casts in the only user that wants to use a const table.
> c) Stop using VLC_TYPE[2] altogether; use a struct { VLC_TYPE symbol, bits; }
>    (it feels like this struct should actually be called VLC_TYPE). Then
>    adding const works fine as usual; it would IMO be more readable, too,
>    because it would be automatically documented which of the entries is
>    what. This is therefore my preferred option. Would you mind if I
>    implemented this or do you want to do it?
>
> - Andreas

I would not mind if you implemented this. Thank you, I appreciate it.

- Leo Izen (thebombzen)
diff --git a/libavcodec/get_bits.h b/libavcodec/get_bits.h
index d4e9276da1..49202b0211 100644
--- a/libavcodec/get_bits.h
+++ b/libavcodec/get_bits.h
@@ -775,7 +775,7 @@ static inline const uint8_t *align_get_bits(GetBitContext *s)
 
 /* Return the LUT element for the given bitstream configuration. */
 static inline int set_idx(GetBitContext *s, int code, int *n, int *nb_bits,
-                          VLC_TYPE (*table)[2])
+                          const VLC_TYPE (*table)[2])
 {
     unsigned idx;
 
@@ -795,7 +795,7 @@ static inline int set_idx(GetBitContext *s, int code, int *n, int *nb_bits,
  *                  = (max_vlc_length + bits - 1) / bits
  * @returns the code parsed or -1 if no vlc matches
  */
-static av_always_inline int get_vlc2(GetBitContext *s, VLC_TYPE (*table)[2],
+static av_always_inline int get_vlc2(GetBitContext *s, const VLC_TYPE (*table)[2],
                                      int bits, int max_depth)
 {
 #if CACHED_BITSTREAM_READER
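For illustration, a rough sketch of what option (c) from the discussion could
look like; the struct and function names below are placeholders, not taken
from the patch or from FFmpeg:

```c
#include <stdint.h>

/* Hypothetical replacement for VLC_TYPE[2]: one named field per entry. */
typedef struct VLCEntry {
    int16_t sym;   /* decoded symbol          */
    int16_t len;   /* code length in bits     */
} VLCEntry;

/* "const VLCEntry *" is an ordinary qualified pointer type, so both const
 * and non-const tables convert to it implicitly, with no diagnostic. */
static int peek_symbol(const VLCEntry *table, unsigned idx)
{
    return table[idx].sym;
}

int main(void)
{
    static const VLCEntry ro_table[4] = {{1, 2}, {2, 3}, {3, 3}, {4, 3}};
    static VLCEntry rw_table[4];

    peek_symbol(ro_table, 0);   /* const table: fine     */
    peek_symbol(rw_table, 0);   /* non-const table: fine */
    return 0;
}
```

Naming the two fields also documents which half of each entry is the symbol
and which is the code length, which is the readability benefit Andreas
mentions for his preferred option.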