[jitsi-users] [libjitsi] Proceeding from examples


#1

Hello all,

I apologize in advance for the rather exploratory question. I am a
developer, but I see myself as a user of your library, so I am posting
here.

Background: I am starting a research project on palliative care support
systems. One of the aspects of the system is communication among
family, doctors and patients, and the availability of Jitsi and
libjitsi makes them a natural candidate for adding video calls to our
Java platform.

I successfully ran the AVTransmit2 and AVReceive2 examples, but now I
don't really know how to proceed. I am wondering whether I should
implement my own GUI or reuse the Jitsi components in my code.

Is there a simple way to render the video in my SWT components, or can
I reuse the Jitsi client UI?

How would I go from receiving the RTP stream (as in the examples) to
rendering the video in a (any) GUI? Do you have any
suggestions/pointers/hints? Thanks in advance!


#2

The libjitsi library renders video into an AWT Component. You will likely want to read the javadocs in the source code of the org.jitsi.service.neomedia.VideoMediaStream class, with an initial focus on the addVideoListener/removeVideoListener, getLocalVisualComponent and getVisualComponents methods. The Jitsi source code is the most complete reference on how to employ these.
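
As a rough illustration only (not code from Jitsi; the class name VideoWindow and the variable names are made up, it assumes you already have the VideoMediaStream set up as in AVReceive2, and the VideoListener/VideoEvent packages may differ between libjitsi versions), attaching the remote video to a plain Swing/AWT window looks roughly like this:

import java.awt.BorderLayout;
import java.awt.Component;
import javax.swing.JFrame;
import javax.swing.SwingUtilities;

import org.jitsi.service.neomedia.VideoMediaStream;
import org.jitsi.util.event.VideoEvent;
import org.jitsi.util.event.VideoListener;

// Sketch: show remote video from a VideoMediaStream in a simple frame.
public class VideoWindow
{
    public static void show(final VideoMediaStream videoStream)
    {
        final JFrame frame = new JFrame("Remote video");
        frame.setLayout(new BorderLayout());
        frame.setSize(640, 480);
        frame.setVisible(true);

        // Pick up any visual components that already exist
        // (e.g. video arrived before we attached the listener).
        for (Component c : videoStream.getVisualComponents())
            frame.add(c, BorderLayout.CENTER);

        // Get notified when the stream creates/destroys visual components.
        videoStream.addVideoListener(new VideoListener()
        {
            public void videoAdded(VideoEvent event)
            {
                final Component video = event.getVisualComponent();
                SwingUtilities.invokeLater(new Runnable()
                {
                    public void run()
                    {
                        frame.add(video, BorderLayout.CENTER);
                        frame.validate();
                    }
                });
            }

            public void videoRemoved(VideoEvent event)
            {
                final Component video = event.getVisualComponent();
                SwingUtilities.invokeLater(new Runnable()
                {
                    public void run()
                    {
                        frame.remove(video);
                        frame.validate();
                    }
                });
            }

            public void videoUpdate(VideoEvent event)
            {
                // Size/position changes; a real UI would re-layout here.
            }
        });
    }
}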

We have not explored ways to integrate libjitsi's AWT rendering into an SWT-based user interface.
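
For what it's worth, the standard way to host an AWT component inside an SWT UI is the SWT_AWT bridge. The following is an untested sketch, not something we have verified with libjitsi's renderers; the class name SwtVideoEmbed and its parameters are illustrative only:

import java.awt.Component;
import java.awt.Frame;

import org.eclipse.swt.SWT;
import org.eclipse.swt.awt.SWT_AWT;
import org.eclipse.swt.widgets.Composite;

public class SwtVideoEmbed
{
    // Hosts an AWT video component (e.g. one obtained from
    // VideoMediaStream.getVisualComponents() or a VideoListener)
    // inside an SWT Composite.
    public static void embed(Composite parent, Component videoComponent)
    {
        // The composite passed to SWT_AWT.new_Frame must have SWT.EMBEDDED style.
        Composite embedded = new Composite(parent, SWT.EMBEDDED | SWT.NO_BACKGROUND);
        Frame awtFrame = SWT_AWT.new_Frame(embedded);
        awtFrame.add(videoComponent);
    }
}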

···

On 27.09.2012, at 12:23, <semuelle@uos.de> wrote:

Is there a simple way to render the video in my SWT components, or can
I reuse the Jitsi client UI?

How would I go from receiving the RTP stream (as in the examples) to
rendering the video in a (any) GUI? Do you have any
suggestions/pointers/hints? Thanks in advance!