[jitsi-dev] (no subject)


#1

Once I have configured and started my MediaStream instances, how do I add
or configure a ReceiveStreamListener or some other means of manipulating
the incoming stream data? I see entries like the following in my log, so I
assume the data is flowing from the client browser:

2014-12-01 13:40:22,434 [RTP Forwarder:
org.jitsi.impl.neomedia.MediaStreamImpl$2@43ae7355] INFO
net.sf.fmj.media.Log - Growing packet queue to 16

For more detail: I've got vp8 and opus data coming from a browser and I
want to modify and re-route it on my server, which is the recv-only side of
this configuration.

Lastly, what is the easiest and least resource-intensive way to do what I
want here? I've looked at the recorder and renderer code in libjitsi, but
I'm not certain that creating "pseudo" rendering devices would be the most
efficient solution.
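
To make the question concrete, the kind of hook I have in mind is a plain
JMF/FMJ-style listener along these lines (just a sketch; the part I'm
missing is where, or whether, it can be registered against a libjitsi
MediaStream):

import javax.media.protocol.DataSource;
import javax.media.rtp.ReceiveStream;
import javax.media.rtp.ReceiveStreamListener;
import javax.media.rtp.event.NewReceiveStreamEvent;
import javax.media.rtp.event.ReceiveStreamEvent;

// Sketch only: the standard JMF/FMJ callback for incoming RTP streams.
public class IncomingStreamListener implements ReceiveStreamListener
{
    @Override
    public void update(ReceiveStreamEvent event)
    {
        if (event instanceof NewReceiveStreamEvent)
        {
            // A new RTP stream (vp8 or opus from the browser) has started.
            ReceiveStream receiveStream = event.getReceiveStream();
            DataSource dataSource = receiveStream.getDataSource();
            // ...pull the media from this DataSource, modify it and
            // re-route it...
        }
    }
}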

Regards,
Paul


--
http://gregoire.org/
https://github.com/Red5 <http://code.google.com/p/red5/>


#2

Is there a doc or some other resource I can look over to help me figure out
how to access the stream data and/or to know how the pipeline between the
endpoints is set up? Just looking through the whole Jitsi codebase is quite
tedious.



#3

I know this is quite vague, but you could try reading the Jitsi Videobridge code. It's probably less confusing than Jitsi itself.

You can start with RtpChannel, which represents a connection to an endpoint for either audio or video. It holds a libjitsi MediaStream.
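
Very roughly, the relationship looks like this (a simplified sketch from
memory, not the actual videobridge code, so double-check the accessor names
against the current sources):

import org.jitsi.service.neomedia.MediaStream;
import org.jitsi.videobridge.RtpChannel;

// Illustrative only: there is one RtpChannel per endpoint per media type,
// and each one wraps a single libjitsi MediaStream carrying that
// endpoint's RTP/RTCP.
public class ChannelStreamSketch
{
    public static MediaStream mediaStreamOf(RtpChannel channel)
    {
        // getStream() is the accessor as I remember it; verify in the sources.
        return channel.getStream();
    }
}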

Regards,
Boris



#4

Thanks Boris, I will have a look at the Videobridge sources.
