author    Michael Hennerich <michael.hennerich@analog.com>  2011-02-24 16:34:53 +0100
committer Greg Kroah-Hartman <gregkh@suse.de>  2011-02-28 14:40:23 -0800
commit    7ccd4506fa49600a3c59cf64608b2c9e669b6c97 (patch)
tree      5918b92ad54a8021a8adb830d1f391db90e1860d /drivers/staging/lirc
parent    2bf99c70cee1d9347145ec0d96ba39764e2193bc (diff)
IIO: Documentation: iio_utils: Prevent buffer overflow
The first part of build_channel_array() identifies the number of enabled channels. Further down, this count is used to allocate ci_array. The next section parses the scan_elements directory again and fills ci_array regardless of whether each channel is enabled. So if fewer than the available channels are enabled, the ci_array memory is overflowed. This fix makes sure that we allocate enough memory.

The whole approach still looks a bit cumbersome to me, though. Why not allocate memory for MAX_CHANNELS, let's say 64 (I have never seen a part with more than that many channels), and skip the first part entirely?

Signed-off-by: Michael Hennerich <michael.hennerich@analog.com>
Acked-by: Jonathan Cameron <jic23@cam.ac.uk>
Signed-off-by: Greg Kroah-Hartman <gregkh@suse.de>
Diffstat (limited to 'drivers/staging/lirc')
0 files changed, 0 insertions, 0 deletions