Hello, Vladimir.
Thank you for the patches!
They need a bit more work before they become suitable for the nightly
channel, though. My major concerns are the memory corruptions that the
following code may cause.
+ long currentPixelPosition;
+ byte[] blackPixel = {0, 0, 0};
+ int bytesPerRGB24Pixel = 3;
+
+ AVFrame avFrame = (AVFrame) data;
+ AVFrameFormat avFrameFormat = (AVFrameFormat) buffer.getFormat();
+ Dimension dim = avFrameFormat.getSize();
+
+ AVFrameFormat rgbAvFrameFormat =
+ new AVFrameFormat(
+ avFrameFormat.getSize(),
+ avFrameFormat.getFrameRate(),
+ FFmpeg.PIX_FMT_RGB24);
+
+ int size = dim.width * dim.height * bytesPerRGB24Pixel;
+ /* pad the buffer */
+ size += FFmpeg.FF_INPUT_BUFFER_PADDING_SIZE;
+
+ /* allocate native array */
+ ByteBuffer blankVideoFrameBytes = new ByteBuffer(size);
+ blankVideoFrameBytes.setLength(size);
+
+ currentPixelPosition = blankVideoFrameBytes.getPtr();
+ for (int i = 0; i < dim.width * dim.height; i++) {
I don't think we want to calculate dim.width * dim.height more than once.
+ FFmpeg.memcpy(
+ currentPixelPosition, blackPixel,
+ 0, bytesPerRGB24Pixel);
+
+ currentPixelPosition += bytesPerRGB24Pixel;
+ }
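To make the point concrete, here is a minimal, self-contained sketch (plain Java, no libjitsi types) that computes the pixel count exactly once and fills the frame in a single pass instead of one memcpy per pixel:

```java
import java.util.Arrays;

public class BlankRgb24Frame {
    static final int BYTES_PER_RGB24_PIXEL = 3;

    /**
     * Builds a black RGB24 frame. The pixel count is computed once and the
     * whole buffer is filled in one pass rather than one copy per pixel.
     */
    static byte[] create(int width, int height) {
        int pixelCount = width * height; // computed exactly once
        byte[] frame = new byte[pixelCount * BYTES_PER_RGB24_PIXEL];
        // Java arrays are already zero-initialized, so black (0, 0, 0)
        // needs no per-pixel copy; Arrays.fill is shown for non-zero colors.
        Arrays.fill(frame, (byte) 0);
        return frame;
    }
}
```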
+
+ AVFrame rgbFrame = new AVFrame();
Since rgbFrame is local to the method, you should invoke
rgbFrame.free() explicitly somewhere. While free() will automatically
be called when the AVFrame instance is finalized, that behavior is
just a safeguard.
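A sketch of the deterministic-release pattern I have in mind, using a hypothetical NativeFrame stand-in for AVFrame (the names are assumptions, not libjitsi API):

```java
public class FreePattern {
    /** Hypothetical stand-in for a natively backed frame such as AVFrame. */
    static class NativeFrame {
        boolean freed;

        void free() {
            freed = true; // a real implementation would release native memory here
        }
    }

    /** Finalization is only a safety net; free explicitly in finally. */
    static NativeFrame useFrame() {
        NativeFrame frame = new NativeFrame();
        try {
            // ... use the frame ...
        } finally {
            frame.free(); // deterministic release, not left to the finalizer
        }
        return frame;
    }
}
```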
+ rgbFrame.avpicture_fill(blankVideoFrameBytes, rgbAvFrameFormat);
+
+ int imageWidth = (int) dim.getWidth();
+ int imageHeight = (int) dim.getHeight();
We don't really need these two casts from double to int, we've already
used dim.width and dim.height earlier in the code.
+ long sws_context = 0;
+ byte[] dst = new byte[avFrame.getData().getCapacity()];
+ sws_context = FFmpeg.sws_getCachedContext(
+ sws_context, imageWidth, imageHeight,
+ FFmpeg.PIX_FMT_RGB24, imageWidth, imageHeight,
At this point I'd expect that we'd be oblivious to the pixfmt of
rgbFrame, so I'd suggest using rgbAvFrameFormat.getPixFmt() rather
than FFmpeg.PIX_FMT_RGB24.
+ avFrameFormat.getPixFmt(), FFmpeg.SWS_BICUBIC);
+
+ FFmpeg.sws_scale(
+ sws_context, rgbFrame.getPtr(), 0, imageHeight, dst,
+ avFrameFormat.getPixFmt(), imageWidth, imageHeight);
+
+ FFmpeg.memcpy(
+ blankVideoFrameBytes.getPtr(), dst, 0,
+ avFrame.getData().getLength());
The capacity of blankVideoFrameBytes is roughly width * height *
(bytes per 24-bit RGB pixel). If avFrame represents a 32-bit RGB frame
(e.g. libjitsi captures video in PIX_FMT_ARGB on OS X), then
avFrame.getData().getLength() will be at least width * height * (bytes
per 32-bit RGB pixel). Hence, there will be a buffer overflow.
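The mismatch is plain arithmetic, and the copy length should be derived from the destination's capacity, e.g. clamped with Math.min (a self-contained sketch, not the libjitsi API):

```java
public class CopyBounds {
    static final int BYTES_PER_RGB24_PIXEL = 3;
    static final int BYTES_PER_ARGB_PIXEL = 4;

    /** Length that is safe to copy into a destination of the given capacity. */
    static int safeCopyLength(int srcLength, int dstCapacity) {
        return Math.min(srcLength, dstCapacity);
    }

    public static void main(String[] args) {
        int width = 640, height = 480;
        int dstCapacity = width * height * BYTES_PER_RGB24_PIXEL; // 921600
        int srcLength = width * height * BYTES_PER_ARGB_PIXEL;    // 1228800
        // Copying srcLength bytes into dstCapacity bytes would overflow
        // by width * height bytes.
        System.out.println(srcLength - dstCapacity);
        System.out.println(safeCopyLength(srcLength, dstCapacity));
    }
}
```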
+
+ avFrame.avpicture_fill(blankVideoFrameBytes, avFrameFormat);
The ByteBuffer blankVideoFrameBytes was first avpicture_fill-ed into
rgbFrame and then into avFrame. When rgbFrame is finalized, it will
free the native memory represented by blankVideoFrameBytes. If avFrame
attempts to utilize it afterwards, there will be a use after free.
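One way to make the shared ownership explicit is a reference count on the buffer, so only the last holder frees the memory. A sketch with hypothetical names (this is not how libjitsi's ByteBuffer actually behaves):

```java
public class SharedBuffer {
    private int refCount = 1;
    private boolean freed;

    /** Called by each additional holder, e.g. a second avpicture_fill. */
    synchronized void retain() {
        refCount++;
    }

    /** Frees the underlying memory only when the last holder releases. */
    synchronized void release() {
        if (--refCount == 0) {
            freed = true; // a real implementation would free native memory here
        }
    }

    synchronized boolean isFreed() {
        return freed;
    }
}
```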
Best regards,
Lyubomir
_______________________________________________
dev mailing list
dev@jitsi.org
Unsubscribe instructions and other list options:
http://lists.jitsi.org/mailman/listinfo/dev