GStreamer IO

I am hoping to generate an audio/video stream with GStreamer and send it to a Jitsi Meet server the way a user/browser would. So: a GStreamer sink.

I would also like to connect to the server and ingest the stream as a GStreamer source.

This is to add some automation around using jitsi for conference presentations.
Currently a collection of humans work together and figure it out, but we would like to automate whatever we can, replacing things like [browser, VLC, VNC, human hits Play at the right time] with a script that runs at the right time and doesn't require the amount of configuration, app-to-app connections, and permissions needed for Jitsi in a browser to read what VLC has rendered to its window.

And: I want a countdown HH:MM:SS clock so we all know when the next presentation starts, using a clock we are all in sync with. Currently a person in the meeting is in charge of announcing this information, and sometimes they don't, for all the reasons humans don't do what they should.
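
The clock part at least is easy to sketch. Here is a minimal, hypothetical countdown formatter (assuming the next start time is known in advance); the GStreamer wiring mentioned in the comment is only a suggestion, not tested code:

```python
from datetime import datetime, timezone

def countdown_text(now: datetime, start: datetime) -> str:
    """Format the time remaining until `start` as HH:MM:SS (00:00:00 once started)."""
    remaining = max(0, int((start - now).total_seconds()))
    hours, rest = divmod(remaining, 3600)
    minutes, seconds = divmod(rest, 60)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}"

# In a GStreamer pipeline, a string like this could be pushed once per
# second to the "text" property of a `textoverlay` element drawn over
# the outgoing video (the element is real; the wiring is an assumption).
print(countdown_text(datetime.now(timezone.utc),
                     datetime(2030, 1, 1, tzinfo=timezone.utc)))
```

Driving it from NTP-synced wall clocks on each machine would give everyone the same numbers without a human announcing them.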

We also want to send a single composited stream to a CDN with a standard HTML5 player embedded in a web page, or people can use whatever media player they want.
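
For the composited-stream-to-CDN side, here is a rough sketch of what the pipeline description could look like. `compositor`, `x264enc`, `mpegtsmux`, and `hlssink` are real GStreamer elements; the `videotestsrc` inputs, resolutions, and layout are placeholders for the real slide and camera feeds:

```python
# Assemble a gst-launch-1.0 style pipeline description as a string.
# Sources are stand-ins (videotestsrc); a real setup would feed the
# presenter's slides and camera into the compositor's sink pads.
slides = "videotestsrc pattern=smpte ! video/x-raw,width=1280,height=720"
camera = "videotestsrc pattern=ball ! video/x-raw,width=320,height=180"

pipeline = " ".join([
    "compositor name=mix",
    "sink_1::xpos=960 sink_1::ypos=540",   # camera as a picture-in-picture corner
    "! x264enc ! mpegtsmux",
    "! hlssink location=segment%05d.ts playlist-location=playlist.m3u8",
    f"{slides} ! mix.sink_0",
    f"{camera} ! mix.sink_1",
])
print(pipeline)
```

The HLS segments and playlist written by `hlssink` are exactly what a CDN plus a standard HTML5 `<video>` player (via hls.js or native support) can serve.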

searching around, I found:
“I finally make simple gstreamer webrtcbin video/audio connecting to jicofo/jvb works.”
But 2 years later: “Can you share the code”
[jitsi-dev] rtcp-mux without bundle

I am also interested in having a GStreamer sink that could be configured to connect to a Jitsi meeting. I really don't get why this feature seems to get ignored; several people have expressed interest in such a use case, but the developers seem to respond either with silence or by suggesting subpar options like screen sharing or non-portable workarounds using loopback devices.

Anyway, in short, I would be interested in having a go at this. I think the approach should be to start with a simple sink that does nothing, and then study the code of lib-jitsi-meet (written in JavaScript) to figure out how to make the sink talk to a Jitsi server.

Anyway, if I can find another developer to collaborate with, I would be willing to throw a couple of days at this to see if it is possible to get something basic working. @CarlFK do you have any experience with GStreamer / coding skills?

I don’t think anyone on the team has much experience with gstreamer, but if you ask specific questions we can try and guide you around what else you’d need to do.

Here is what a GStreamer dev says:

Such a thing doesn’t exist at the moment. It would look something like the example for Janus interop if it did (https://gitlab.freedesktop.org/gstreamer/gst-examples/-/tree/master/webrtc/janus).

AIUI (without checking at all), the main reason no one has done it is that the Janus javascript protocol is well documented and easy to implement, and Jitsi’s is not documented outside their code and would effectively have to be reverse engineered.

The work to be done is primarily javascript-based to set up a call. The GStreamer WebRTC components should already do the necessary things otherwise (hopefully).

So my interest in this has just dropped a bit, but I’ll be happy to test things if someone else does the real work.

Tip: how to share any video on a Jitsi session
It’s possible to do the same using Gstreamer too

A customized Jibri can do that… It can start a GStreamer process instead of ffmpeg by using a fake ffmpeg.
Tip: how to customize ffmpeg without changing Jibri’s code

Or a custom stream server can be used as a GStreamer source. This solves the CDN issue too.

Tip: how to share any video on a Jitsi session says:
“open chromium , join a Jitsi room, select the virtual camera and the virtual microphone”

I do not want to run a browser.

"Create the file /usr/local/bin/ffmpeg"
That is pretty hacky. I do not want to replace, augment, or manage such things. The ffmpeg command should run the ffmpeg binary, not GStreamer.

Hi Carl, if you want to rebuild all the work done by the JavaScript part of jitsi-meet, it will be really, really complex.
You need to manage and maintain:

  • two different WebSocket connections, one for the XMPP MUC and one for the Colibri channel to the videobridge
  • an XMPP MUC protocol implementation to connect your client to a jitsi-meet room and get the media description
  • translation of the XMPP media description to WebRTC SDP, and of your local SDP back to an XMPP media description, to initiate the GStreamer part
  • a Colibri stack for the client-to-bridge channel, linked to your GStreamer part
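
The steps above could be outlined roughly like this. Every function here is a hypothetical stub; the real protocols (XMPP MUC, Jingle/SDP mapping, Colibri) would each need a genuine implementation or library behind it, so this only records the order of operations:

```python
# Hypothetical outline of the signaling work a headless Jitsi client would do.
# Nothing here speaks a real protocol.

def join_muc(room: str) -> dict:
    """Stub: open the XMPP websocket, join the MUC, return the remote media description."""
    return {"room": room, "remote_media": "<jingle description placeholder>"}

def to_webrtc_sdp(xmpp_media: str) -> str:
    """Stub: map the XMPP/Jingle media description to a WebRTC SDP offer."""
    return f"v=0 (derived from {xmpp_media!r})"

def open_colibri_channel(local_sdp: str) -> str:
    """Stub: negotiate a Colibri channel to the videobridge for this SDP."""
    return "colibri-channel-0"

def connect(room: str) -> list:
    session = join_muc(room)
    sdp = to_webrtc_sdp(session["remote_media"])
    channel = open_colibri_channel(sdp)
    # From here, GStreamer's webrtcbin element would carry the actual media.
    return ["muc-joined", "sdp-ready", channel]

print(connect("conference-room"))
```

Even as stubs, laying it out this way makes the point below clear: the browser plus lib-jitsi-meet already does all four layers for you.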

So having all this work done by the browser and the jitsi-meet JavaScript is much simpler.
Regards.