[jitsi-users] Can't seem to use sendEndpointMessage when using websockets for XMPP


#1

Hi,

I'm having a hard time debugging a fairly straightforward app I've built
using lib-jitsi-meet, in an attempt to build something different from
straight-up conferencing.

Its setup is similar to lib-jitsi-meet/doc/example/example.js. I'm using
websockets by specifying bosh: "wss:// ... /xmpp-websocket", I have Prosody
configured to use websockets, and nginx passes the connection through. All
of that seems to work fine, and a multi-party conference comes up fine. The
reason I want to use websockets is to have better visibility in the Chrome
debugger, where I can see the XMPP messages in the WS network tab.
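
The relevant connection setup looks roughly like this (hostnames are
placeholders, not my real ones):

    JitsiMeetJS.init({});
    const connection = new JitsiMeetJS.JitsiConnection(null, null, {
        hosts: {
            domain: 'meet.example.lan',            // placeholder
            muc: 'conference.meet.example.lan'     // placeholder
        },
        // websocket URL here instead of the usual https://.../http-bind
        bosh: 'wss://meet.example.lan/xmpp-websocket'
    });
    connection.connect();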

Everything goes well until I try to use sendEndpointMessage, which throws a
"Channel support is disabled" exception. The app works fine with plain
BOSH, but it does not seem to work when using websockets. As in example.js,
I was passing in the option "openBridgeChannel: true", and I have since
changed that to "openBridgeChannel: 'websocket'".
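
For reference, the relevant bits look roughly like this (room name,
endpoint id, and payload are placeholders):

    const conference = connection.initJitsiConference('testroom', {
        openBridgeChannel: 'websocket'   // was: true
    });
    conference.join();

    // later, once another participant has joined:
    conference.sendEndpointMessage(otherEndpointId, { type: 'my-event', value: 42 });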

I'm not altogether clear what "openBridgeChannel" connects to, though, so I
don't know where to look -- is it establishing a channel with Prosody, or
with jitsi-videobridge, in order to pass messages between participants?
Should I expect it to work when using websockets for the XMPP?

My ultimate goal is for one participant to be able to send data/event
messages to another participant via XMPP, ideally over websockets so that I
can see what is going on in the Chrome debugger along with the rest of the
XMPP messages. Maybe I'm going about it all wrong? Is there perhaps a
better way to pass these messages?

Cheers,
Simon


#2

Hi Simon,

You'll need to add configuration to the bridge for the websocket channel to work. Note that sendEndpointMessage bypasses XMPP and sends messages through jitsi-videobridge: it uses either a WebRTC DataChannel or a websocket, but in both cases the remote endpoint is jitsi-videobridge and XMPP is not involved.
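
A rough, untested sketch of the receiving side (the sender uses sendEndpointMessage as you already do):

    // listen for messages that arrive over the bridge channel
    conference.on(
        JitsiMeetJS.events.conference.ENDPOINT_MESSAGE_RECEIVED,
        (participant, payload) => {
            console.log('bridge message from', participant.getId(), payload);
        });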

For your actual goal of participant-to-participant messaging over XMPP, you can just send XMPP messages (but I can't look up the API in jitsi-meet right now), which will use whatever transport XMPP uses.
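
From memory (untested, so double-check against the lib-jitsi-meet source), something along these lines should work:

    // send a chat message over the XMPP connection; it goes through Prosody,
    // so it shows up in the websocket frames in the Chrome debugger
    conference.sendTextMessage(JSON.stringify({ type: 'my-event', value: 42 }));

    // receive it on the other side
    conference.on(JitsiMeetJS.events.conference.MESSAGE_RECEIVED,
        (id, text) => console.log('XMPP message from', id, text));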

Regards,
Boris



#3

Hi Boris,

Thanks! That clears up where to expect the openBridgeChannel-related
messaging to go.

Do you happen to know where I can find the config options for enabling
websockets for the videobridge? The documentation for the videobridge
mentions that there is a /colibri-ws endpoint, but not how to enable or
configure it, nor how to tell lib-jitsi-meet that it should use a specific
websocket endpoint like /colibri-ws in place of the DataChannel.

I'll take a look at the direct XMPP messaging too. I imagine I can figure
that out from the Strophe API docs...

Cheers,
Simon



#4

Hmm, do you happen to know what that Jingle fragment looks like, so that I
could verify my setup is configured correctly? I read over the JVB logs for
a single participant connecting, and to the untrained eye none of the
Jingle fragments stood out as negotiating a path for anything other than
audio or video.

You mentioned earlier that configuration of the jitsi-videobridge component
was necessary to enable websocket support for it, is that not the case?

Thanks,
Simon

···

On Mon, Jan 22, 2018 at 4:17 PM, Boris Grozev <boris@jitsi.org> wrote:

This information is signaled to lib-jitsi-meet in Jingle; there is no need
to configure anything on the client side (other than the openBridgeChannel
option).

Boris


#5

It's a "web-socket" element inside "transport".

Correct, the bridge itself does need to be configured. I thought this was documented somewhere, but it doesn't seem to be. First you need to enable the public HTTP interface as described in [0] so clients can access the colibri HTTP endpoint (if you're proxying, make sure websocket support is enabled).

Second, configure the URL that is advertised, using the properties described in the code [1].

Regards,
Boris

[0] https://github.com/jitsi/jitsi-videobridge/blob/master/doc/rest.md
[1] https://github.com/jitsi/jitsi-videobridge/blob/master/src/main/java/org/jitsi/videobridge/rest/ColibriWebSocketService.java#L32



#6

Wonderful! Thank you Boris. Now I can see all the XMPP messages and the
"channel" messages sent by sendEndpointMessage. I'll be able to debug the
messaging much more easily now.

The Jingle messages in the JVB log were indeed missing the 'web-socket'
element in the transport. Adding a COLIBRI_WS_DOMAIN value to
/etc/jitsi/videobridge/sip-communicator.properties did the trick.

For anyone else who is interested in setting up websockets, I'll tidy up my
setup notes and post them soon. Meanwhile, my properties file now looks like this:

org.jitsi.videobridge.AUTHORIZED_SOURCE_REGEXP=focus@auth.videotest01.lan/.*
org.jitsi.videobridge.rest.jetty.host=::
org.jitsi.videobridge.rest.jetty.port=4443
org.jitsi.videobridge.rest.jetty.ProxyServlet.hostHeader=videotest01.lan
org.jitsi.videobridge.rest.COLIBRI_WS_DOMAIN=videotest01.lan:4443
org.jitsi.videobridge.rest.jetty.ProxyServlet.pathSpec=/http-bind
org.jitsi.videobridge.rest.jetty.ProxyServlet.proxyTo=http://localhost:5280/http-bind
org.jitsi.videobridge.rest.jetty.ResourceHandler.resourceBase=/usr/share/jitsi-meet
org.jitsi.videobridge.rest.jetty.ResourceHandler.alias./config.js=/etc/jitsi/meet/videotest01.lan-config.js
org.jitsi.videobridge.rest.jetty.ResourceHandler.alias./interface_config.js=/usr/share/jitsi-meet/interface_config.js
org.jitsi.videobridge.rest.jetty.ResourceHandler.alias./logging_config.js=/usr/share/jitsi-meet/logging_config.js
org.jitsi.videobridge.rest.jetty.RewriteHandler.regex=^/([a-zA-Z0-9]+)$
org.jitsi.videobridge.rest.jetty.RewriteHandler.replacement=/
org.jitsi.videobridge.rest.jetty.SSIResourceHandler.paths=/
org.jitsi.videobridge.rest.jetty.tls.port=4443
org.jitsi.videobridge.TCP_HARVESTER_PORT=4443
org.jitsi.videobridge.rest.jetty.sslContextFactory.keyStorePath=/etc/jitsi/videobridge/videotest01.lan.jks
org.jitsi.videobridge.rest.jetty.sslContextFactory.keyStorePassword=changeit
