Currently, in Jigasi, a participant’s speech is stored in a local buffer
until the buffer is full. If the participant mutes their audio while the buffer
is only partially filled, Jigasi won’t send that audio to GCP until the participant
unmutes. How can Jigasi listen for a mute event and flush the buffer to GCP?
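For context, what we have in mind is roughly the sketch below. This is not Jigasi’s actual code or API; the class name, the `onMute()` hook, and `sendToTranscriptionService()` are made-up names that only illustrate the idea of flushing a partially filled buffer when a mute event arrives instead of waiting for it to fill up.

```java
import java.nio.ByteBuffer;

/**
 * Illustrative sketch (not Jigasi's real classes) of a per-participant
 * audio buffer that is flushed to the transcription backend either when
 * it fills up or when a mute event arrives, so trailing speech is not
 * held back until the participant unmutes.
 */
public class AudioBufferSketch
{
    // Assumption: 500 ms of 16 kHz, 16-bit mono PCM = 16000 bytes.
    private static final int BUFFER_SIZE = 16000;

    private final ByteBuffer buffer = ByteBuffer.allocate(BUFFER_SIZE);

    /**
     * Called for every incoming chunk of audio from the participant.
     * Assumes each chunk is smaller than BUFFER_SIZE.
     */
    public synchronized void giveAudio(byte[] audio, int offset, int length)
    {
        // If the chunk does not fit, flush what we have first.
        if (buffer.remaining() < length)
        {
            flush();
        }
        buffer.put(audio, offset, length);
        if (!buffer.hasRemaining())
        {
            flush();
        }
    }

    /** Hypothetical hook: called when the participant mutes their audio. */
    public synchronized void onMute()
    {
        // Release whatever partial audio is buffered instead of waiting
        // for the buffer to fill after the participant unmutes.
        flush();
    }

    private void flush()
    {
        if (buffer.position() == 0)
        {
            return;
        }
        byte[] chunk = new byte[buffer.position()];
        buffer.flip();
        buffer.get(chunk);
        buffer.clear();
        sendToTranscriptionService(chunk);
    }

    /** Placeholder for the call that streams audio to GCP Speech-to-Text. */
    private void sendToTranscriptionService(byte[] audio)
    {
        // e.g. push the chunk onto the streaming recognize request
    }
}
```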
What is the size of this buffer? It is only a few milliseconds of speech, isn’t it?
We haven’t seen any issues with that.
Thanks, @damencho, for the reply. Right now it is 500 ms, which is high for some cases: jigasi/Participant.java at master · jitsi/jigasi · GitHub