[jitsi-dev] Patch that mutes the video stream when holding a call


#1

Hi,

Here's a patch that mutes the video signal on hold. There are actually two
patch files, since the code affects both the jitsi and libjitsi projects.

Best regards,
Vladimir

mutes_video_data_source_on_hold.patch (4.88 KB)

mutes_video_media_stream_on_hold.patch (1.49 KB)


#2

Hello, Vladimir.

Thank you for the patches!

They need a bit more work before they become suitable for the nightly
channel, though. My major concern is the memory corruption that the
following code may cause.

+ long currentPixelPosition;
+ byte[] blackPixel = {0, 0, 0};
+ int bytesPerRGB24Pixel = 3;

···

+
+ AVFrame avFrame = (AVFrame) data;
+ AVFrameFormat avFrameFormat = (AVFrameFormat) buffer.getFormat();
+ Dimension dim = avFrameFormat.getSize();
+
+ AVFrameFormat rgbAvFrameFormat =
+ new AVFrameFormat(
+ avFrameFormat.getSize(),
+ avFrameFormat.getFrameRate(),
+ FFmpeg.PIX_FMT_RGB24);
+
+ int size = dim.width * dim.height * bytesPerRGB24Pixel;
+ /* pad the buffer */
+ size += FFmpeg.FF_INPUT_BUFFER_PADDING_SIZE;
+
+ /* allocate native array */
+ ByteBuffer blankVideoFrameBytes = new ByteBuffer(size);
+ blankVideoFrameBytes.setLength(size);
+
+ currentPixelPosition = blankVideoFrameBytes.getPtr();
+ for (int i = 0; i < dim.width * dim.height; i++) {

I don't think we want to calculate dim.width * dim.height more than once.
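The invariant can be hoisted into a local before the loop; a minimal sketch of the idea, with illustrative names rather than the actual patch code:

```java
// Hoist the loop-invariant pixel count out of the loop condition.
// PixelLoop and fillBlackRgb24 are illustrative names, not patch code.
public class PixelLoop {
    public static int fillBlackRgb24(byte[] frame, int width, int height) {
        final int pixelCount = width * height; // computed once, not per iteration
        final int bytesPerRgb24Pixel = 3;
        for (int i = 0; i < pixelCount; i++) {
            int offset = i * bytesPerRgb24Pixel;
            frame[offset] = 0;     // R
            frame[offset + 1] = 0; // G
            frame[offset + 2] = 0; // B
        }
        return pixelCount;
    }
}
```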

+ FFmpeg.memcpy(
+ currentPixelPosition, blackPixel,
+ 0, bytesPerRGB24Pixel);
+
+ currentPixelPosition += bytesPerRGB24Pixel;
+ }
+
+ AVFrame rgbFrame = new AVFrame();

Since rgbFrame is local to the method, you had better invoke
rgbFrame.free() somewhere. While free() will automatically be called
when the AVFrame instance is finalized, that behavior is just a
safeguard.
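A try/finally makes the release deterministic; the sketch below uses a hypothetical NativelyBackedFrame stand-in, not the real libjitsi AVFrame:

```java
// Sketch of the deterministic-release pattern suggested above.
// NativelyBackedFrame is a hypothetical stand-in for libjitsi's AVFrame.
public class NativelyBackedFrame {
    private boolean freed;

    public void free() {
        freed = true; // the real class would release native memory here
    }

    public boolean isFreed() {
        return freed;
    }

    public static boolean useAndFree() {
        NativelyBackedFrame frame = new NativelyBackedFrame();
        try {
            // ... use frame ...
        } finally {
            frame.free(); // deterministic release; finalize() stays a safeguard
        }
        return frame.isFreed();
    }
}
```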

+ rgbFrame.avpicture_fill(blankVideoFrameBytes, rgbAvFrameFormat);
+
+ int imageWidth = (int) dim.getWidth();
+ int imageHeight = (int) dim.getHeight();

We don't really need these two casts from double to int, we've already
used dim.width and dim.height earlier in the code.

+ long sws_context = 0;
+ byte[] dst = new byte[avFrame.getData().getCapacity()];
+ sws_context = FFmpeg.sws_getCachedContext(
+ sws_context, imageWidth, imageHeight,
+ FFmpeg.PIX_FMT_RGB24, imageWidth, imageHeight,

At this point I'd expect that we'd be oblivious to the pixfmt of
rgbFrame, so I'd suggest using rgbAvFrameFormat.getPixFmt() rather than
FFmpeg.PIX_FMT_RGB24.

+ avFrameFormat.getPixFmt(), FFmpeg.SWS_BICUBIC);
+
+ FFmpeg.sws_scale(
+ sws_context, rgbFrame.getPtr(), 0, imageHeight, dst,
+ avFrameFormat.getPixFmt(), imageWidth, imageHeight);
+
+ FFmpeg.memcpy(
+ blankVideoFrameBytes.getPtr(), dst, 0,
+ avFrame.getData().getLength());

The capacity of blankVideoFrameBytes is roughly width * height *
(bytes per 24-bit RGB pixel). If avFrame represents a 32-bit RGB frame
(e.g. libjitsi captures video in PIX_FMT_ARGB on OS X), then
avFrame.getData().getLength() will be at least width * height * (bytes
per 32-bit RGB pixel). Hence, there will be a buffer overflow.
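The arithmetic behind the overflow, sketched with illustrative helper names (640x480 is just an example resolution, and a Math.min clamp is one hedge, not the patch's actual fix):

```java
// Illustrates the size mismatch described above: a 24-bit RGB blank
// buffer is smaller than a 32-bit ARGB frame of the same dimensions,
// so copying getLength() bytes into it overflows the destination.
public class FrameSizes {
    public static int rgb24Bytes(int width, int height) {
        return width * height * 3; // 3 bytes per RGB24 pixel
    }

    public static int argbBytes(int width, int height) {
        return width * height * 4; // 4 bytes per ARGB pixel
    }

    /** A safe copy length must never exceed the destination capacity. */
    public static int safeCopyLength(int dstCapacity, int srcLength) {
        return Math.min(dstCapacity, srcLength);
    }
}
```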

+
+ avFrame.avpicture_fill(blankVideoFrameBytes, avFrameFormat);

The ByteBuffer blankVideoFrameBytes was first avpicture_fill-ed into
rgbFrame and then into avFrame. When rgbFrame is finalized, it will
free the native memory represented by blankVideoFrameBytes. If avFrame
attempts to utilize it afterwards, there will be a use after free.
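The shared-ownership hazard can be modeled with a hypothetical freed-flag wrapper (illustrative only, not a libjitsi class):

```java
// Models two frames avpicture_fill-ed with the same buffer: whichever
// owner is freed first invalidates the buffer for the other.
// SharedNativeBuffer is hypothetical, not a libjitsi class.
public class SharedNativeBuffer {
    private boolean freed;

    public void free() {
        freed = true; // in the real code this releases native memory
    }

    public boolean isUsable() {
        return !freed;
    }

    /** Models rgbFrame being finalized while avFrame still holds the buffer. */
    public static boolean useAfterOwnerFreed() {
        SharedNativeBuffer buffer = new SharedNativeBuffer();
        buffer.free();            // rgbFrame's finalizer frees the buffer...
        return buffer.isUsable(); // ...so avFrame's later access is invalid
    }
}
```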

Best regards,
Lyubomir


#3

Hello,

Here are the revised patches.

Best regards,
Vladimir

mutes_video_data_source_on_hold.patch (19 KB)

mutes_video_media_stream_on_hold.patch (1.49 KB)

···

On Tue, Oct 29, 2013 at 11:19 PM, Lyubomir Marinov <lyubomir.marinov@jitsi.org> wrote:


_______________________________________________
dev mailing list
dev@jitsi.org
Unsubscribe instructions and other list options:
http://lists.jitsi.org/mailman/listinfo/dev


#4

Hi,

Here's a patch that watermarks muted video frames. The patch
watermarks_muted_video_frames.patch needs to be applied after the
revised patches in my previous mail (i.e.
mutes_video_data_source_on_hold.patch and
mutes_video_media_stream_on_hold.patch). There is a good chance that we
will switch to another approach to watermarking video frames, so this
patch might turn out to be unnecessary.

Regards,
Vladimir

adds_onhold_watermark_image_resource.patch (14.4 KB)

watermarks_muted_video_frames.patch (13 KB)

···

On Fri, Nov 1, 2013 at 9:12 AM, Vladimir Marinov <vmarinov@bluejimp.com> wrote:
