[jitsi-dev] Recording RTP transmission to file using libjitsi


#1

Hi all,

I am having a hard time figuring out how to record an RTP transmission
received by libjitsi's AVReceive2 sample class into a file. My current
approach is to implement a FileMediaDeviceSession that extends
AudioMediaDeviceSession and overrides the playerControllerUpdate() method.
In this method, instead of setting the player's content descriptor to null,
I set it to new ContentDescriptor(FileTypeDescriptor.WAVE), and upon receipt
of the RealizeCompleteEvent I create a DataSink pointing to a file, then
open and start it. Here is the source code for that class:
http://pastebin.com/VQRNnxpG
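
In short, the override does roughly the following (a simplified sketch of
the pastebin class; exact libjitsi signatures may differ slightly, and the
file name and error handling are only placeholders):

import javax.media.*;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.FileTypeDescriptor;
import org.jitsi.impl.neomedia.device.AbstractMediaDevice;
import org.jitsi.impl.neomedia.device.AudioMediaDeviceSession;

public class FileMediaDeviceSession extends AudioMediaDeviceSession
{
    private DataSink dataSink;

    public FileMediaDeviceSession(AbstractMediaDevice device)
    {
        super(device);
    }

    @Override
    protected void playerControllerUpdate(ControllerEvent ev)
    {
        Processor player = (Processor) ev.getSourceController();

        if (ev instanceof ConfigureCompleteEvent)
        {
            // Instead of the default null/RAW content descriptor, ask the
            // processor to produce WAVE output.
            player.setContentDescriptor(
                new ContentDescriptor(FileTypeDescriptor.WAVE));
            player.realize();
        }
        else if (ev instanceof RealizeCompleteEvent)
        {
            try
            {
                // Sink the processor's output into a file.
                dataSink = Manager.createDataSink(
                    player.getDataOutput(),
                    new MediaLocator("file:./received.wav"));
                dataSink.open();
                dataSink.start();
            }
            catch (Exception e)
            {
                e.printStackTrace();
            }
            player.start();
        }
    }
}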

Unfortunately, the file that is created is empty after the transmission. I
am quite sure that the transmission itself is successful, because I can
hear playback of the transmitted file when I do not override the
playerControllerUpdate() method. I am wondering what I am doing wrong.
There is not much documentation on the details of recording to a file with
JMF, and all of the tutorials I have found so far propose exactly what I am
trying to implement. I see that the RecorderImpl class is based on an
AudioMixerMediaDevice rather than a simple AudioMediaDevice. Is that
critical? Unfortunately, I was not able to figure out how to use the
Recorder from the AVReceive2 class. I would really appreciate any help!

Regards,
Alexander Fedulov


#2

Hey Alexander,

Hi all,

I am having a hard time trying to figure out how to record an RTP
transmission received by the libjitsi's AVReceive2 sample class into a
file. My current approach was to implement a FileMediaDeviceSession
which extends the AudioMediaDeviceSession and to override the
playerControllerUpdate() method. In this method instead of setting
player's content descriptor to null, I set it to
new ContentDescriptor(FileTypeDescriptor.WAVE) and upon receipt of the
RealizeCompleteEvent I create a DataSink pointing to a file, open and
start it. Here is the source code for that
class: http://pastebin.com/VQRNnxpG

Unfortunately the file that is created is empty after the transmission.
I am quite sure that the transmission itself is successful, because I
can hear playback of the transmitted file in case I do not override
the playerControllerUpdate method. I am wondering what am I doing wrong?
There is not much documentation on the details of recording a file using
JMF and all of the tutorials I found so far propose to do exactly what I
am trying to implement. I see that RecorderImpl class is based on using
AudioMixerMediaDevice, and not a simple AudioMediaDevice. Is that
critical? Unfortunately I was not able to figure out how to use the
Recorder from the AVReceive class. I would really appreciate any help!

Do you need to record the RTP stream itself, or just its content? If the
latter, you should have a look at how we record calls in Jitsi: we
basically use the audio mixer and add the recorder as one of the devices
that are being mixed. It doesn't produce any data, but it does get a
mixed version of everyone else's audio.
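
Roughly, at the MediaService level (an untested sketch; the format and
file name here are only examples):

MediaService mediaService = LibJitsi.getMediaService();

// Wrap the audio device in a mixer and attach a Recorder to it. The
// recorder acts like one more device being mixed: it contributes no
// audio of its own, but it receives the mix of everyone else's.
MediaDevice device
    = mediaService.getDefaultDevice(MediaType.AUDIO, MediaUseCase.CALL);
MediaDevice mixer = mediaService.createMixer(device);
Recorder recorder = mediaService.createRecorder(mixer);

// Any stream created on the mixer device feeds the same mix.
MediaStream stream = mediaService.createMediaStream(mixer);

recorder.start("wav", "/tmp/recording.wav");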

Hope this helps,
Emil

--
https://jitsi.org


#3

Hi Emil,

the goal is to emulate a voice stream, store the received packets including
all of the network-imposed impairments (packet loss, jitter), and use a
utility that estimates something similar to a PESQ score (a voice
transmission quality indicator) by comparing the original WAV file that was
streamed with the one reconstructed from the RTP stream on the other side
of the pipe.

Meanwhile, I was actually able to instantiate and use the recorder by doing
the following:

MediaDevice device =
    mediaService.getDefaultDevice(MediaType.AUDIO, MediaUseCase.CALL);
MediaDevice mixer = mediaService.createMixer(device);
Recorder recorder = mediaService.createRecorder(mixer);
mediaStream = mediaService.createMediaStream(mixer);
... (further usage similar to the AVReceive2 example)
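
For completeness, the elided part follows the AVReceive2 pattern, roughly
like this (the sockets, address, ports and format are placeholders from my
setup):

mediaStream.setConnector(new DefaultStreamConnector(rtpSocket, rtcpSocket));
mediaStream.setTarget(
    new MediaStreamTarget(
        new InetSocketAddress(remoteAddr, remoteRtpPort),
        new InetSocketAddress(remoteAddr, remoteRtpPort + 1)));
mediaStream.setDirection(MediaDirection.RECVONLY);
mediaStream.setFormat(
    mediaService.getFormatFactory().createMediaFormat("PCMU", 8000));
mediaStream.start();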

I have two issues with the results I get: 1) the recorder actually mixes
the received RTP stream with the signal from the microphone; 2) a recording
only appears if recorder.start(SoundFileUtils.mp3, "recording") is used,
and an empty file is created if the SoundFileUtils.wav format is chosen.

Issue 2) is actually critical, because it defeats the whole point of
transmitting an uncompressed stream. Do you have an idea what could cause
this problem?

Regards,
Alexander Fedulov



#4

Hi all,
I still have those issues. Could anyone maybe give me a hint?

Regards,
Alexander Fedulov
