Load balancing setup not working with JMS and JVB on different hosts in AWS

We have set up separate hosts for JMS (web, prosody, jicofo) and JVB so that we can add autoscaling. JMS is on a private subnet and JVB is on a public subnet (URI → ALB → backend Docker containers). The following ports are open between the services (UDP 10000 is open from JVB).

JVB → PROSODY on TCP/5222

nc -v xmpp-abc.test.com 5222
Ncat: Version 7.50 ( Ncat - Netcat for the 21st Century )
Ncat: Connected to xx.xx.xx.xx:5222.

PROSODY (JMS) → JVB on TCP/9090

nc -v abc-videobridge.test.com 9090
Ncat: Version 7.50 ( Ncat - Netcat for the 21st Century )
Ncat: Connected to xx.xx.xx.xx:9090.

When the meeting starts, we see the following errors in the JS console:

BridgeChannel.js:87 WebSocket connection to 'wss://abc-videobridge.test.com/colibri-ws/jvb-i-0ee68ed52fd2d3761/eea2d1a7c24a07d9/11f8a9ae?pwd=70sou0suijs8g6cmnt4n5hkijg' failed:
_initWebSocket @ BridgeChannel.js:87
Logger.js:154 2023-02-03T05:03:33.249Z [modules/RTC/BridgeChannel.js] <WebSocket.e.onclose>: Channel closed: 1006
r @ Logger.js:154
Logger.js:154 2023-02-03T05:03:34.037Z [modules/RTC/JitsiLocalTrack.js] Mute LocalTrack[1,audio]: true
BridgeChannel.js:87 WebSocket connection to 'wss://abc-videobridge.test.com/colibri-ws/jvb-i-0ee68ed52fd2d3761/eea2d1a7c24a07d9/11f8a9ae?pwd=70sou0suijs8g6cmnt4n5hkijg' failed:
_initWebSocket @ BridgeChannel.js:87
Logger.js:154 2023-02-03T05:03:35.242Z [modules/RTC/BridgeChannel.js] <WebSocket.e.onclose>: Channel closed: 1006

The colibri config in meet.conf looks like this, where abc-videobridge.test.com is the ALB's endpoint for JVB:

# colibri (JVB) websockets
location ~ ^/colibri-ws/([a-zA-Z0-9-._]+)/(.*) {
    tcp_nodelay on;

    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";

    proxy_pass http://abc-videobridge.test.com/colibri-ws/$1/$2$is_args$args;
}
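One caveat with this config, in case Colibri traffic does route through nginx: nginx resolves a hostname written directly in proxy_pass only once, at startup, while an ALB's IP addresses rotate over time, so cached stale IPs can cause intermittent failures. A sketch of the usual workaround, using a resolver plus a variable so the name is re-resolved at request time (the resolver address is an assumption; substitute your VPC's DNS resolver):

```nginx
# colibri (JVB) websockets, with per-request DNS resolution of the ALB name.
# 169.254.169.253 is the AWS-provided VPC resolver (assumption: adjust for your VPC).
resolver 169.254.169.253 valid=10s;

location ~ ^/colibri-ws/([a-zA-Z0-9-._]+)/(.*) {
    tcp_nodelay on;

    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";

    # Using a variable forces nginx to resolve the name via `resolver`
    # on each request instead of caching the IPs from startup.
    set $jvb_backend abc-videobridge.test.com;
    proxy_pass http://$jvb_backend/colibri-ws/$1/$2$is_args$args;
}
```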

The env variables are:



What are we doing wrong with this setup? What needs to change? Any help would be highly appreciated.

As shown in the JS console logs, the frontend is trying to connect to wss://abc-videobridge.test.com — note the wss scheme, which is WebSocket over HTTPS.

You mentioned abc-videobridge.test.com is the ALB’s endpoint for JVB. Is that ALB configured with an HTTPS listener on port 443?

Your nginx config is probably irrelevant here: based on the JS console logs, you appear to have configured JVB to advertise the ALB directly as the Colibri WebSocket host, so Colibri traffic won't hit your nginx (which is fine — better, even).
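For reference, what JVB advertises as the Colibri WebSocket host comes from its websockets config. A sketch of the relevant jvb.conf section, with values assumed from the logs above (in docker-jitsi-meet this is typically driven by the JVB_WS_DOMAIN and JVB_WS_SERVER_ID env variables):

```
# Sketch of jvb.conf (HOCON); the values below are assumptions taken from the logs above.
videobridge {
    websockets {
        enabled = true
        # Host (and port) the client is told to connect to for wss://
        domain = "abc-videobridge.test.com:443"
        tls = true
        # Must match the server ID in the /colibri-ws/<server-id>/... path
        server-id = "jvb-i-0ee68ed52fd2d3761"
    }
}
```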

Yes, we have a listener on port 443.

And that listener is forwarding to the port that JVB is listening on? Does JVB log anything when the frontend attempts to connect?
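A quick way to check that wiring from the CLI might be (sketch; the ARNs are placeholders you would look up first):

```
# Confirm the 443 listener forwards to the expected target group
aws elbv2 describe-listeners --load-balancer-arn <alb-arn>

# Confirm the targets (JVB instances) are healthy on the expected port
aws elbv2 describe-target-health --target-group-arn <tg-arn>
```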

When JVB starts, I see "JVB 2023-02-03 18:34:14.593 INFO: [1] ReadOnlyConfigurationService.reloadConfiguration#56: Error loading config file: java.io.FileNotFoundException: /config/sip-communicator.properties (No such file or directory)"

There is also a harvester-related exception: JVB 2023-02-03 19:15:52.891 INFO: [55] org.ice4j.ice.harvest.CandidateHarvesterSetTask.run: disabling harvester due to exception: hostname can't be null

JVB logs
jvb_logs.txt (46.4 KB)

Nothing seems relevant in the JVB log (though the harvester exception seems likely to be a separate configuration problem — do you have working audio & video in a 3+ person meeting?).

It’s likely the Colibri connection is not reaching the JVB. Check the ALB configuration to be sure that the HTTPS:443 listener is forwarding to the JVB and using the correct target port. You can also have a look at the Network tab in your browser’s developer tools to see the Colibri WebSocket connection and possibly find more detail about the connection failure.

Yes, audio and video are working with 3 or more users, but screen share is blurry. Occasionally, video is not visible for one user out of 4: it shows as enabled on that user's side, but the other users can't see it.

I see a 404 error in the network tab.
networkError

Trying to curl from the JVB container using https also gives an error.

In the JVB logs, we see "ICE connected" and then "ICE state changed from Completed to Terminated." What does this mean? Will this impair video quality?

JVB Logs

JVB 2023-02-08 20:45:02.135 INFO: [79] [confId=aad576303fd96574 conf_name=939716358116758662334010@muc.abc.test.com meeting_id=5e1b545d epId=ad944dd8 stats_id=Name-M56] Endpoint$setupIceTransport$2.connected#375: ICE connected
JVB 2023-02-08 20:45:02.135 INFO: [86] [confId=aad576303fd96574 conf_name=939716358116758662334010@muc.abc.test.com meeting_id=5e1b545d epId=ad944dd8 stats_id=Name-M56] DtlsTransport.startDtlsHandshake#102: Starting DTLS handshake, role=org.jitsi.nlj.dtls.DtlsServer@655b8f35
JVB 2023-02-08 20:45:02.135 INFO: [79] [confId=aad576303fd96574 conf_name=939716358116758662334010@muc.abc.test.com meeting_id=5e1b545d epId=ad944dd8 stats_id=Name-M56 local_ufrag=fd1hf1gopcchum ufrag=fd1hf1gopcchum] Agent.logCandTypes#2009: Harvester used for selected pair for stream-ad944dd8.RTP: srflx
JVB 2023-02-08 20:45:02.135 INFO: [86] [confId=aad576303fd96574 conf_name=939716358116758662334010@muc.abc.test.com meeting_id=5e1b545d epId=ad944dd8 stats_id=Name-M56] TlsServerImpl.notifyClientVersion#199: Negotiated DTLS version DTLS 1.2
JVB 2023-02-08 20:45:02.227 INFO: [86] [confId=aad576303fd96574 conf_name=939716358116758662334010@muc.abc.test.com meeting_id=5e1b545d epId=ad944dd8 stats_id=Name-M56] Endpoint$setupDtlsTransport$3.handshakeComplete#419: DTLS handshake complete
JVB 2023-02-08 20:45:04.080 WARNING: [72] [confId=aad576303fd96574 conf_name=939716358116758662334010@muc.abc.test.com meeting_id=5e1b545d epId=ad944dd8 stats_id=Name-M56 ssrc=3373424207] RetransmissionRequester$StreamPacketRequester.packetReceived#122: 3373424207 large jump in sequence numbers detected (highest received was 3470, current is 3748, jump of 277) , not requesting retransmissions
JVB 2023-02-08 20:45:04.082 WARNING: [21] [confId=f40288753446a89b conf_name=939721116757203322872@muc.abc.test.com meeting_id=f0bbd569 epId=78df188d stats_id=Ramon-ng2 ssrc=3027668323] RetransmissionRequester$StreamPacketRequester.doWork#177: 3027668323 sending NACK for 111 missing packets
JVB 2023-02-08 20:45:04.229 WARNING: [21] [confId=f40288753446a89b conf_name=939721116757203322872@muc.abc.test.com meeting_id=f0bbd569 epId=78df188d stats_id=Ramon-ng2 ssrc=3027668323] RetransmissionRequester$StreamPacketRequester.doWork#177: 3027668323 sending NACK for 111 missing packets
JVB 2023-02-08 20:45:04.358 WARNING: [70] [confId=aad576303fd96574 conf_name=939716358116758662334010@muc.abc.test.com meeting_id=5e1b545d epId=ad944dd8 stats_id=Name-M56 ssrc=570300856] RetransmissionRequester$StreamPacketRequester.packetReceived#122: 570300856 large jump in sequence numbers detected (highest received was 3749, current is 9214, jump of 5464) , not requesting retransmissions
JVB 2023-02-08 20:45:04.378 WARNING: [21] [confId=f40288753446a89b conf_name=939721116757203322872@muc.abc.test.com meeting_id=f0bbd569 epId=78df188d stats_id=Ramon-ng2 ssrc=3027668323] RetransmissionRequester$StreamPacketRequester.doWork#177: 3027668323 sending NACK for 111 missing packets
JVB 2023-02-08 20:45:04.528 WARNING: [21] [confId=f40288753446a89b conf_name=939721116757203322872@muc.abc.test.com meeting_id=f0bbd569 epId=78df188d stats_id=Ramon-ng2 ssrc=3027668323] RetransmissionRequester$StreamPacketRequester.doWork#177: 3027668323 sending NACK for 111 missing packets
JVB 2023-02-08 20:45:04.552 WARNING: [21] [confId=f40288753446a89b conf_name=939721116757203322872@muc.abc.test.com meeting_id=f0bbd569 epId=78df188d stats_id=Ramon-ng2 ssrc=3027668323] RetransmissionRequester$StreamPacketRequester.doWork#177: 3027668323 sending NACK for 135 missing packets
JVB 2023-02-08 20:45:04.835 INFO: [74] [confId=f40288753446a89b conf_name=939721116757203322872@muc.abc.test.com meeting_id=f0bbd569 epId=483815f9 stats_id=Abe-m70] BandwidthAllocator.allocate#271: Sources suspended due to insufficient bandwidth (bwe=30000 bps): [78df188d-v0]
JVB 2023-02-08 20:45:05.135 INFO: [56] [confId=aad576303fd96574 conf_name=939716358116758662334010@muc.abc.test.com meeting_id=5e1b545d epId=ad944dd8 stats_id=Name-M56 local_ufrag=fd1hf1gopcchum ufrag=fd1hf1gopcchum] Agent.setState#946: ICE state changed from Completed to Terminated.
JVB 2023-02-08 20:45:05.135 INFO: [56] [confId=aad576303fd96574 conf_name=939716358116758662334010@muc.abc.test.com meeting_id=5e1b545d epId=ad944dd8 stats_id=Name-M56 local_ufrag=fd1hf1gopcchum] IceTransport.iceStateChanged#342: ICE state changed old=Completed new=Terminated
JVB 2023-02-08 20:45:07.505 SEVERE: [21] [confId=aad576303fd96574 conf_name=939716358116758662334010@muc.abc.test.com meeting_id=5e1b545d epId=0df221d4 stats_id=Suzanne-bG6] Endpoint.scheduleEndpointMessageTransportTimeout$lambda-29#687: EndpointMessageTransport still not connected.
JVB 2023-02-08 20:45:07.647 SEVERE: [21] [confId=aad576303fd96574 conf_name=939716358116758662334010@muc.abc.test.com meeting_id=5e1b545d epId=be3ca821 stats_id=Julio-yO9] Endpoint.scheduleEndpointMessageTransportTimeout$lambda-29#687: EndpointMessageTransport still not connected.

JS Console error

2023-02-08T20:45:37.118Z [modules/RTC/JitsiRemoteTrack.js] <Hd._onTrackMute>: “onmute” event(1675889137114): RemoteTrack[userID: ad944dd8, type: video, ssrc: 1259213061, p2p: false, sourceName: ad944dd8-v0, status: {readyState: live, muted: true, enabled: true}]
Logger.js:154 2023-02-08T20:45:37.118Z [modules/connectivity/TrackStreamingStatus.ts] <Ld.onTrackRtcMuted>: Detector track RTC muted: ad944dd8-v0 1675889137118
Logger.js:154 2023-02-08T20:45:37.957Z [modules/RTC/JitsiRemoteTrack.js] <Hd._onTrackUnmute>: “onunmute” event(1675889137957): RemoteTrack[userID: ad944dd8, type: video, ssrc: 1259213061, p2p: false, sourceName: ad944dd8-v0, status: {readyState: live, muted: false, enabled: true}]
Logger.js:154 2023-02-08T20:45:37.957Z [modules/connectivity/TrackStreamingStatus.ts] <Ld.onTrackRtcUnmuted>: Detector track RTC unmuted: ad944dd8-v0 1675889137957
Logger.js:154 2023-02-08T20:45:37.958Z [modules/connectivity/TrackStreamingStatus.ts] <Ld.figureOutStreamingStatus>: Figure out conn status for ad944dd8-v0, is video muted: true video track frozen: false p2p mode: false is in forwarded sources: true currentStatus => newStatus: active => active
Logger.js:154 2023-02-08T20:45:43.023Z [modules/RTC/JitsiRemoteTrack.js] <Hd._onTrackMute>: “onmute” event(1675889143020): RemoteTrack[userID: ad944dd8, type: video, ssrc: 1259213061, p2p: false, sourceName: ad944dd8-v0, status: {readyState: live, muted: true, enabled: true}]
Logger.js:154 2023-02-08T20:45:43.024Z [modules/connectivity/TrackStreamingStatus.ts] <Ld.onTrackRtcMuted>: Detector track RTC muted: ad944dd8-v0 1675889143023
Logger.js:154 2023-02-08T20:45:43.865Z [modules/RTC/JitsiRemoteTrack.js] <Hd._onTrackUnmute>: “onunmute” event(1675889143865): RemoteTrack[userID: ad944dd8, type: video, ssrc: 1259213061, p2p: false, sourceName: ad944dd8-v0, status: {readyState: live, muted: false, enabled: true}]
Logger.js:154 2023-02-08T20:45:43.865Z [modules/connectivity/TrackStreamingStatus.ts] <Ld.onTrackRtcUnmuted>: Detector track RTC unmuted: ad944dd8-v0 1675889143865
Logger.js:154 2023-02-08T20:45:43.865Z [modules/connectivity/TrackStreamingStatus.ts] <Ld.figureOutStreamingStatus>: Figure out conn status for ad944dd8-v0, is video muted: true video track frozen: false p2p mode: false is in forwarded sources: true currentStatus => newStatus: active => active
Logger.js:154 2023-02-08T20:45:48.083Z [modules/RTC/JitsiRemoteTrack.js] <Hd._onTrackMute>: “onmute” event(1675889148083): RemoteTrack[userID: ad944dd8, type: video, ssrc: 1259213061, p2p: false, sourceName: ad944dd8-v0, status: {readyState: live, muted: true, enabled: true}]
Logger.js:154 2023-02-08T20:45:48.084Z [modules/connectivity/TrackStreamingStatus.ts] <Ld.onTrackRtcMuted>: Detector track RTC muted: ad944dd8-v0 1675889148083
Logger.js:154 2023-02-08T20:45:48.926Z [modules/RTC/JitsiRemoteTrack.js] <Hd._onTrackUnmute>: “onunmute” event(1675889148926): RemoteTrack[userID: ad944dd8, type: video, ssrc: 1259213061, p2p: false, sourceName: ad944dd8-v0, status: {readyState: live, muted: false, enabled: true}]
Logger.js:154 2023-02-08T20:45:48.926Z [modules/connectivity/TrackStreamingStatus.ts] <Ld.onTrackRtcUnmuted>: Detector track RTC unmuted: ad944dd8-v0 1675889148926
Logger.js:154 2023-02-08T20:45:48.927Z [modules/connectivity/TrackStreamingStatus.ts] <Ld.figureOutStreamingStatus>: Figure out conn status for ad944dd8-v0, is video muted: true video track frozen: false p2p mode: false is in forwarded sources: true currentStatus => newStatus: active => active
BridgeChannel.js:87 WebSocket connection to 'wss://abc-videobridge.test.com/colibri-ws/i-0d77cd1da56ed8f42/aad576303fd96574/0df221d4?pwd=4bpihjrbn8k3ahv88vqgq7hue' failed:
_initWebSocket @ BridgeChannel.js:87
Logger.js:154 2023-02-08T20:45:51.706Z [modules/RTC/BridgeChannel.js] <e.onclose>: Channel closed by server
Logger.js:154 2023-02-08T20:45:51.706Z [modules/RTC/BridgeChannel.js] <e.onclose>: Channel closed: 1006
r @ Logger.js:154
e.onclose @ BridgeChannel.js:388
Logger.js:154 2023-02-08T20:45:53.145Z [modules/RTC/JitsiRemoteTrack.js] <Hd._onTrackMute>: “onmute” event(1675889153145): RemoteTrack[userID: ad944dd8, type: video, ssrc: 1259213061, p2p: false, sourceName: ad944dd8-v0, status: {readyState: live, muted: true, enabled: true}]
Logger.js:154 2023-02-08T20:45:53.145Z [modules/connectivity/TrackStreamingStatus.ts] <Ld.onTrackRtcMuted>: Detector track RTC muted: ad944dd8-v0 1675889153145
Logger.js:154 2023-02-08T20:45:53.988Z [modules/RTC/JitsiRemoteTrack.js] <Hd._onTrackUnmute>: “onunmute” event(1675889153988): RemoteTrack[userID: ad944dd8, type: video, ssrc: 1259213061, p2p: false, sourceName: ad944dd8-v0, status: {readyState: live, muted: false, enabled: true}]
Logger.js:154

We have now removed the ALB in front of JVB. We have the DNS record pointed to the public IP of the instance instead. We get a 405 when trying to access "http://abc-videobridge.test.com:9090/colibri-ws/jvb-i-0bb6196274464afbb".

As per this doc, How to Install Jitsi Meet with Multi-Server Configuration | facsiaginsa.com, a 405 indicates everything is set up correctly.
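That matches expectations: the Colibri endpoint only speaks WebSocket, so a plain HTTP GET is rejected with 405 Method Not Allowed. A handshake-style request should behave differently; a command sketch (the conference and endpoint IDs are placeholders):

```
# Plain GET: a 405 here means the endpoint is reachable but refuses non-WebSocket requests
curl -i http://abc-videobridge.test.com:9090/colibri-ws/jvb-i-0bb6196274464afbb

# Simulated WebSocket handshake; a valid conference/endpoint path should upgrade,
# while an unknown one should return an HTTP error instead
curl -i http://abc-videobridge.test.com:9090/colibri-ws/jvb-i-0bb6196274464afbb/<conf-id>/<endpoint-id> \
  -H "Connection: Upgrade" \
  -H "Upgrade: websocket" \
  -H "Sec-WebSocket-Version: 13" \
  -H "Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ=="
```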

JVB error: JVB 2023-02-09 16:06:37.202 SEVERE: [21] [confId=a088cbf405c17ca4 conf_name= 939721116759571433510@muc.abc.test.com meeting_id=ca1fcbed epId=4c0483f6 stats_id=Ramon-ng2] Endpoint.scheduleEndpointMessageTransportTimeout$lambda-29#687: EndpointMessageTransport still not connected.

Issues we are still seeing:

  1. Seeing the message "video quality is impaired" (bridge channel closed)
  2. Screen share is not working: black screen or blurry

Working: audio/video works for more than 2 participants.
Any pointers on how to fix these would be appreciated.