[jitsi-dev] How to implement an RTP relay


#1

Hi,

We need to implement a service which receives audio data via RTP
(G.711/G.722), converts it to raw audio and multicasts it via RTP.

All the classes and the examples seem to be based on outputting data to
local hardware devices. But the stream is not supposed to be played on a
sound device, nor is any GUI involved such as the one Player/Processor
reference.

Could someone please outline how this could be realized? Which class(es)
should be extended or which interface(s) must be implemented, and
(roughly) how?

My idea was to extend "AudioMediaDeviceImpl", override the "createPlayer"
method and return a custom "Processor". But where would the multicasting
take place then?

Thanks in advance,

Thomas


#2

It sounds to me like you may want to look at the source code of our
jitsi-videobridge project at
https://github.com/jitsi/jitsi-videobridge.
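
In case a smaller starting point helps: below is a rough, untested sketch
of such a relay written directly against libjitsi (the library the
videobridge is built on). The class and method names are taken from the
jitsi-examples AVTransmit2/AVReceive2 samples and may differ between
versions, so treat them as assumptions; addresses, ports and the payload
format are placeholders. Also note that a shared RTPTranslator forwards
the RTP packets as they are, it does not transcode them to raw audio.

import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.InetSocketAddress;

import org.jitsi.service.libjitsi.LibJitsi;
import org.jitsi.service.neomedia.*;
import org.jitsi.service.neomedia.device.MediaDevice;
import org.jitsi.service.neomedia.format.MediaFormat;

public class AudioRelay
{
    public static void main(String[] args)
        throws Exception
    {
        LibJitsi.start();
        MediaService mediaService = LibJitsi.getMediaService();

        // A single RTPTranslator shared by both streams forwards RTP
        // packets from one stream to the other.
        RTPTranslator translator = mediaService.createRTPTranslator();

        MediaDevice device
            = mediaService.getDefaultDevice(MediaType.AUDIO, MediaUseCase.CALL);
        MediaFormat pcmu
            = mediaService.getFormatFactory().createMediaFormat("PCMU", 8000);

        // Stream which receives the incoming G.711 traffic (ports 5000/5001).
        MediaStream incoming = mediaService.createMediaStream(device);
        incoming.setDirection(MediaDirection.RECVONLY);
        incoming.setFormat(pcmu);
        incoming.setConnector(
            new DefaultStreamConnector(
                new DatagramSocket(5000), new DatagramSocket(5001)));
        incoming.setRTPTranslator(translator);

        // Stream which re-sends the traffic to a multicast group
        // (placeholder address and ports).
        MediaStream outgoing = mediaService.createMediaStream(device);
        outgoing.setDirection(MediaDirection.SENDONLY);
        outgoing.setFormat(pcmu);
        outgoing.setConnector(
            new DefaultStreamConnector(
                new DatagramSocket(5002), new DatagramSocket(5003)));
        outgoing.setTarget(
            new MediaStreamTarget(
                new InetSocketAddress(InetAddress.getByName("239.0.0.1"), 6000),
                new InetSocketAddress(InetAddress.getByName("239.0.0.1"), 6001)));
        outgoing.setRTPTranslator(translator);

        incoming.start();
        outgoing.start();
    }
}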

···

2013/7/16 Thomas Amsler <thomas.amsler@objectxp.com>:

We need to implement a service which receives audio data via RTP
(G.711/G.722), converts it to raw audio and multicasts it via RTP.


#3

Hi Lyubomir,

It sounds to me like you may want to look at the source code of our
jitsi-videobridge project at

https://github.com/jitsi/jitsi-videobridge.

The project description sounds promising. However, I'm still unsure
whether I've got it right.
Do I understand correctly that I need two kinds of "Channel" classes, one
for the incoming RTP traffic and the other for multicasting?
What about the "RTPLevelRelayType"? Should both Channels be TRANSLATOR,
and is that what results in forwarding the traffic from the incoming one
to the multicasting one?
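
Put differently, and to make sure I read the code correctly: at the
libjitsi level underneath the bridge, does TRANSLATOR simply mean that the
streams of both Channels share one RTPTranslator, roughly like this? (My
assumption only; the method and stream names are placeholders, untested.)

import org.jitsi.service.neomedia.MediaService;
import org.jitsi.service.neomedia.MediaStream;
import org.jitsi.service.neomedia.RTPTranslator;

public class TranslatorQuestion
{
    // Assumed meaning of RTPLevelRelayType.TRANSLATOR: the streams of the
    // Channels share one RTPTranslator, which forwards RTP packets between
    // them without decoding the payload.
    static void wireAsTranslator(
            MediaService mediaService,
            MediaStream incomingStream,   // Channel receiving from the source
            MediaStream multicastStream)  // Channel sending to the multicast group
    {
        RTPTranslator translator = mediaService.createRTPTranslator();
        incomingStream.setRTPTranslator(translator);
        multicastStream.setRTPTranslator(translator);
    }
}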

Kind regards,

Thomas


#4

Hi,

I still have trouble understanding how media is passed from one
MediaDevice to the other.

In the videobridge, the comment on the "CLOCK_ONLY" attribute in
"AudioSilenceCaptureDevice" says:
"it causes the AudioMixer to not push media unless at least one Channel is
receiving actual media."
What makes a Channel receive data from other channels?
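
For reference, this is how I currently picture the MIXER path at the
libjitsi level (my assumption only, placeholder names, untested): all of
the Channels' streams are created on one shared mixer device, and that
device is what carries the decoded audio from one Channel's stream to the
others.

import org.jitsi.service.neomedia.*;
import org.jitsi.service.neomedia.device.MediaDevice;

public class MixerQuestion
{
    // Assumed picture of RTPLevelRelayType.MIXER: one shared mixer device
    // is the hand-over point between the Channels' streams.
    static void wireAsMixer(MediaService mediaService)
    {
        MediaDevice audioDevice
            = mediaService.getDefaultDevice(MediaType.AUDIO, MediaUseCase.CALL);
        MediaDevice mixer = mediaService.createMixer(audioDevice);

        // Both streams are created on the same mixer device: what channelA
        // receives and decodes is pushed into the mixer, and channelB reads
        // the mixed (decoded) audio from that device and sends it out.
        MediaStream channelA = mediaService.createMediaStream(mixer);
        channelA.setDirection(MediaDirection.RECVONLY);

        MediaStream channelB = mediaService.createMediaStream(mixer);
        channelB.setDirection(MediaDirection.SENDONLY);
    }
}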

Thank you in advance!

Thomas