Audio mixing in jitsi-meet

In jitsi-meet, how are audio streams from other participants mixed? Is the mixing done in native WebRTC code or in Chrome?

Best regards,

/Kaiduan

Hi,

From my understanding, Jitsi Videobridge forwards all streams to each client and lets the client mix those streams itself… That’s why you (in this case it could also be only me) can have a lot of issues on the client side even when your server is definitely fine.
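If that’s right, the “mixing” is basically what any playback path does with several audio tracks at once: sum the sample values and clamp them. A toy sketch of that summing step in plain JavaScript (an illustration only, not Jitsi’s actual code):

```javascript
// Mix several mono audio buffers (Float32Array, samples in [-1, 1])
// by summing them sample-by-sample, the way a client-side mixer
// would before playback.
function mixStreams(buffers) {
  const length = Math.max(...buffers.map((b) => b.length));
  const out = new Float32Array(length);
  for (const buf of buffers) {
    for (let i = 0; i < buf.length; i++) {
      out[i] += buf[i];
    }
  }
  // Clamp so overlapping loud streams don't exceed the valid range.
  for (let i = 0; i < out.length; i++) {
    out[i] = Math.max(-1, Math.min(1, out[i]));
  }
  return out;
}
```

In a real browser client you never call anything like this yourself: attaching each remote track to its own audio element makes the browser do the equivalent summing internally.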

Jitsi Meet is mostly JS code that lets you interact with all the Jitsi components; it runs on the client side and is configured according to your server configuration.

Again, do not hesitate to correct me if I’m wrong… and I’m not, I hope this helps :slight_smile:

I’m a newbie also interested in audio in Jitsi. A nice thread to clarify my understanding.

So, do you mean that every client always hears the mixed audio from all clients? Or does Jitsi Meet (or Jicofo?) set the focus to the single currently-speaking client and mute the other audio streams on each client side?

I ask this question because I thought (imagined) that focusing and muting the others was done on the server side, and the client side just receives the final (single) audio stream, to save bandwidth. Was I wrong?

Thanks for help!
Cheeeeeers

Hi,

From my understanding, the focus is calculated on the client side based on audio analysis (also done on the client side). When someone is speaking, the others continue to speak (continue to send their audio streams).
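One common way such client-side analysis can work is to compute a short-term level (e.g. RMS) per participant and pick the loudest as the focused speaker. A toy sketch of that idea (not Jitsi’s actual dominant-speaker algorithm, and the threshold value is made up):

```javascript
// Root-mean-square level of a chunk of audio samples in [-1, 1].
function rmsLevel(samples) {
  let sum = 0;
  for (const s of samples) sum += s * s;
  return Math.sqrt(sum / samples.length);
}

// Pick the participant with the highest level, ignoring anyone
// below a noise threshold so that silence never "wins" the focus.
function dominantSpeaker(tracks, threshold = 0.05) {
  let best = null;
  let bestLevel = threshold;
  for (const [id, samples] of Object.entries(tracks)) {
    const level = rmsLevel(samples);
    if (level > bestLevel) {
      best = id;
      bestLevel = level;
    }
  }
  return best; // null when everyone is below the threshold
}
```

In a browser the per-participant samples would come from something like a Web Audio `AnalyserNode` attached to each remote track.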

When a moderator mutes someone, the client asks the server to tell that participant’s client to stop sending audio; that’s my understanding of how it works.
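If that’s how it works, the muted client’s reaction to the relayed request could look roughly like this (a hedged sketch: the real signaling goes over Jitsi’s own channels, and the message shape here is invented; in a browser the `enabled` flag would live on a `MediaStreamTrack`):

```javascript
// Minimal model of a client that stops sending audio when the
// server relays a moderator's mute request. The track is a plain
// object here so the whole flow can run outside a browser.
function makeClient() {
  return {
    audioTrack: { enabled: true },
    handleSignal(message) {
      // Hypothetical message shape; real Jitsi signaling differs.
      if (message.type === "mute-request") {
        // The stream ends at the source: the client simply stops
        // sending, rather than the server filtering the audio out.
        this.audioTrack.enabled = false;
      }
    },
  };
}
```

In an actual browser client the same effect comes from setting `enabled = false` on the local audio `MediaStreamTrack`, which makes the sender transmit silence.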

There are a lot of parameters to try and set to configure your Jitsi session, but that’s my understanding based on what I can see from monitoring and a couple of investigations. Again, all my thoughts are based on guesses…

Hope someone could help on this.

PS: editing my previous post:

… and IF I’m not, I hope this helps

Thanks for your comment. I’d like to start studying this myself, so could you please give me some pointer(s) on where to begin?
A link to the appropriate portion of the docs or source code, maybe? Even a suggestion like “Look into this and this source subdir” would be a great help. I’d appreciate it.
Thanks!

Hi, is there any progress on this topic?

Ant Media is a WebRTC media server which merges audio and video on the client side.
This is simple HTML5 canvas work for the video, and stream merging for the audio.
Please check the Ant Media sample application “conference”.
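For the video half of that client-side merge, the usual trick is to draw each participant’s video element onto one canvas in a grid and capture the canvas as a new stream. The drawing itself needs a browser, but the tile layout is plain math; a hedged sketch of that part (illustrative only, Ant Media’s sample may lay tiles out differently):

```javascript
// Compute an n-tile grid layout for compositing participant videos
// onto a single canvas of the given size. Returns one {x, y, w, h}
// rectangle per participant.
function gridLayout(count, canvasWidth, canvasHeight) {
  const cols = Math.ceil(Math.sqrt(count));
  const rows = Math.ceil(count / cols);
  const w = Math.floor(canvasWidth / cols);
  const h = Math.floor(canvasHeight / rows);
  const tiles = [];
  for (let i = 0; i < count; i++) {
    tiles.push({
      x: (i % cols) * w,
      y: Math.floor(i / cols) * h,
      w,
      h,
    });
  }
  return tiles;
}
```

In the browser you would then loop over the tiles each frame with `ctx.drawImage(videoElement, t.x, t.y, t.w, t.h)` and call `canvas.captureStream()` to get the merged video; the audio side is merged by routing each remote track into a single `AudioContext` destination.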