Running Finalize Script For Recording and Broadcasting

I am using @emrah's script for doing broadcasting and recording at the same time:

#!/bin/bash

ARGS=$@
RDIR="/tmp/recordings"

# If Jibri is streaming (an RTMP URL is among the args), also record a local MP4
if [[ -n "$(echo $ARGS | grep 'rtmp://')" ]]; then
    # The stream key is the last path segment of the RTMP URL
    STREAM=$(echo $ARGS | grep -Eo '[^/]*$')
    # Derive a unique session ID from the stream key and the current time
    SUID=$(echo "$STREAM-$(date '+%s')" | md5sum | awk '{print $1}')
    SUBDIR="$RDIR/$SUID"
    mkdir -p "$SUBDIR"
    FILE="$SUBDIR/$SUID.mp4"

    # Append a second output: re-encode and save a local MP4 copy
    ARGS="$ARGS -acodec aac -strict -2 -ar 44100"
    ARGS="$ARGS -c:v libx264 -preset veryfast -profile:v main -level 3.1"
    ARGS="$ARGS -pix_fmt yuv420p -r 30 -crf 25 -g 60 -tune zerolatency"
    ARGS="$ARGS -f mp4 $FILE"
fi

exec /usr/bin/ffmpeg $ARGS

But it doesn't call finalize.sh after it finishes executing. How can I produce the required metadata and call /srv/finalize.sh after the live stream is done?
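The obvious shape would be to drop the exec so the wrapper regains control when the stream ends, something like this sketch at the bottom of the script (what arguments /srv/finalize.sh expects is an assumption on my part):

# Run ffmpeg in the foreground instead of exec'ing it, so the wrapper
# resumes once the stream ends
/usr/bin/ffmpeg $ARGS
STATUS=$?
# Assumption: finalize takes the directory holding the recording
[ -d "$SUBDIR" ] && /srv/finalize.sh "$SUBDIR"
exit $STATUS

But I don't know what metadata Jibri normally passes to the finalize script, so this is only a guess.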

Where did you put this code?

My setup is the following:

/usr/local/bin/ffmpeg

#!/bin/bash
echo ffmpeg in $0 # Comment out this line once you've verified that running ffmpeg points to this script
ARGS=$@
ARGS=$(echo $ARGS | sed 's/-tune zerolatency/-tune zerolatency -vf scale=854x480/') # Scale video to 854x480

RDIR="/tmp/recordings"

if [[ -n "$(echo $ARGS | grep 'rtmp://')" ]]; then
    STREAM=$(echo $ARGS | grep -Eo '[^/]*$')
    SUID=$(echo "$STREAM-$(date '+%s')" | md5sum | awk '{print $1}')
    SUBDIR="$RDIR/$SUID"
    mkdir -p "$SUBDIR"
    FILE="$SUBDIR/$SUID.mp4"

    ARGS="$ARGS -acodec aac -strict -2 -ar 44100"
    ARGS="$ARGS -c:v libx264 -preset veryfast -profile:v main -level 3.1"
    ARGS="$ARGS -pix_fmt yuv420p -r 30 -crf 25 -g 60 -tune zerolatency"
    ARGS="$ARGS -f mp4 $FILE"
fi

exec /usr/bin/ffmpeg $ARGS

The recording setup in my Jibri is the following:

/etc/jitsi/jibri/jibri.conf

  recording {
    recordings-directory = "/srv/recordings"
    # TODO: make this an optional param and remove the default
    finalize-script = "/srv/finalize_recording.sh"
  }
  streaming {
    // A list of regex patterns for allowed RTMP URLs.  The RTMP URL used
    // when starting a stream must match at least one of the patterns in
    // this list.
    rtmp-allow-list = [
      // By default, all services are allowed
      ".*"
    ]
  }
  ffmpeg {
    resolution = "1920x1080"
    // The audio source that will be used to capture audio on Linux
    audio-source = "alsa"
    // The audio device that will be used to capture audio on Linux
    audio-device = "plug:bsnoop"
  }
  chrome {
    // The flags which will be passed to chromium when launching
    flags = [
      "--use-fake-ui-for-media-stream",
      "--start-maximized",
      "--kiosk",
      "--enabled",
      "--disable-infobars",
      "--autoplay-policy=no-user-gesture-required"
      "--ignore-certificate-errors"
      "--user-agent='Jibri Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.96 Safari/537.36'"
    ]
  }
  stats {
    enable-stats-d = true
  }
  webhook {
    // A list of subscribers interested in receiving webhook events
    subscribers = []
  }

The finalize script has nothing to do with this code (the fake ffmpeg).
Check your jibri logs.

Do you click recording or streaming in the UI?
The finalize script runs after recording.
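For reference, when a file recording session ends, Jibri runs the configured finalize script with the session's recording directory as an argument (as far as I know). So a minimal finalize script is something like:

#!/bin/bash
# $1 is the directory of the finished recording session
RECORDINGS_DIR="$1"
echo "recording finished in $RECORDINGS_DIR" >> /tmp/finalize.log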

@emrah I am using the streaming feature like this:

let stream_url = 'rtmp://some-url';

let record = room.startRecording({
    broadcastId: some_id_for_broadcast,
    mode: JitsiRecordingConstants.mode.STREAM,
    streamId: stream_url
});

AFAIK the finalize script doesn’t run after streaming.

So it sounds like I should change the mode to JitsiRecordingConstants.mode.FILE and pass in the RTMP URL as the streamId as well.

Probably you will need a more complex solution. It's not a good idea to pass the RTMP destination publicly.

Your solution is good, it's just missing a few things. I'm trying to figure out how to retrieve the broadcast or pass in metadata. It looks like the StreamingJibriService class has the most flexibility.

/jibri/src/main/kotlin/org/jitsi/jibri/service/impl/StreamingJibriService.kt

/*
 * Copyright @ 2018 Atlassian Pty Ltd
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 *
 */

package org.jitsi.jibri.service.impl

import org.jitsi.xmpp.extensions.jibri.JibriIq
import org.jitsi.jibri.capture.ffmpeg.FfmpegCapturer
import org.jitsi.jibri.config.Config
import org.jitsi.jibri.config.XmppCredentials
import org.jitsi.jibri.error.JibriError
import org.jitsi.jibri.selenium.CallParams
import org.jitsi.jibri.selenium.JibriSelenium
import org.jitsi.jibri.selenium.RECORDING_URL_OPTIONS
import org.jitsi.jibri.service.ErrorSettingPresenceFields
import org.jitsi.jibri.service.JibriService
import org.jitsi.jibri.sink.Sink
import org.jitsi.jibri.sink.impl.StreamSink
import org.jitsi.jibri.status.ComponentState
import org.jitsi.jibri.status.ErrorScope
import org.jitsi.jibri.util.whenever
import org.jitsi.metaconfig.config
import java.util.regex.Pattern

const val YOUTUBE_URL = "rtmp://a.rtmp.youtube.com/live2"
private const val STREAMING_MAX_BITRATE = 2976

/**
 * Parameters needed for starting a [StreamingJibriService]
 */
data class StreamingParams(
    /**
     * Which call we'll join
     */
    val callParams: CallParams,
    /**
     * The ID of this session
     */
    val sessionId: String,
    /**
     * The login information needed to appear invisible in
     * the call
     */
    val callLoginParams: XmppCredentials,
    /**
     * The RTMP URL we'll stream to
     */
    val rtmpUrl: String,
    /**
     * The URL at which the stream can be viewed
     */
    val viewingUrl: String? = null
)

/**
 * [StreamingJibriService] is the [JibriService] responsible for joining a
 * web call, capturing its audio and video, and streaming that audio and video
 * to a url
 */
class StreamingJibriService(
    private val streamingParams: StreamingParams
) : StatefulJibriService("Streaming") {
    init {
        logger.addContext("session_id", streamingParams.sessionId)
    }
    private val capturer = FfmpegCapturer(logger)
    private val sink: Sink
    private val jibriSelenium = JibriSelenium(logger)

    private val rtmpAllowList: List<Pattern> by config {
        "jibri.streaming.rtmp-allow-list".from(Config.configSource)
            .convertFrom<List<String>> { it.map(Pattern::compile) }
    }

    init {
        sink = StreamSink(
            url = streamingParams.rtmpUrl,
            streamingMaxBitrate = STREAMING_MAX_BITRATE,
            streamingBufSize = 2 * STREAMING_MAX_BITRATE
        )

        registerSubComponent(JibriSelenium.COMPONENT_ID, jibriSelenium)
        registerSubComponent(FfmpegCapturer.COMPONENT_ID, capturer)
    }

    override fun start() {
        if (rtmpAllowList.none { it.matcher(streamingParams.rtmpUrl).matches() }) {
            logger.error("RTMP url ${streamingParams.rtmpUrl} is not allowed")
            publishStatus(
                ComponentState.Error(
                    JibriError(
                        ErrorScope.SESSION,
                        "RTMP URL ${streamingParams.rtmpUrl} is not allowed"
                    )
                )
            )
            return
        }
        jibriSelenium.joinCall(
            streamingParams.callParams.callUrlInfo.copy(urlParams = RECORDING_URL_OPTIONS),
            streamingParams.callLoginParams
        )

        whenever(jibriSelenium).transitionsTo(ComponentState.Running) {
            logger.info("Selenium joined the call, starting capturer")
            try {
                jibriSelenium.addToPresence("session_id", streamingParams.sessionId)
                jibriSelenium.addToPresence("mode", JibriIq.RecordingMode.STREAM.toString())
                streamingParams.viewingUrl?.let { viewingUrl ->
                    if (!jibriSelenium.addToPresence("live-stream-view-url", viewingUrl)) {
                        logger.error("Error adding live stream url to presence")
                    }
                }
                jibriSelenium.sendPresence()
                capturer.start(sink)
            } catch (t: Throwable) {
                logger.error("Error while setting fields in presence", t)
                publishStatus(ComponentState.Error(ErrorSettingPresenceFields))
            }
        }
    }

    override fun stop() {
        logger.info("Stopping capturer")
        capturer.stop()
        logger.info("Stopped capturer")
        logger.info("Quitting selenium")
        jibriSelenium.leaveCallAndQuitBrowser()
        logger.info("Quit selenium")
    }
}

So I've finally created a CRAZY workaround for this, phew. It starts off like this:

NGINX
Nginx can do RTMP streaming! ...kinda. We are going to use Nginx to route a stream back onto itself and record it. On the same server as your Jibri, install nginx with the RTMP module (libnginx-mod-rtmp on Debian/Ubuntu) and add the following to the bottom of nginx.conf:

rtmp {
	server {
		listen 1935;
		chunk_size 4096;

		application live {
			live on;
			#Set this to "record off" if you don't want to save a copy of your broadcasts
			record all;
			# The directory in which the recordings will be stored.
			record_path /var/www/html/recordings;
			record_unique on;
			record_suffix -%d-%b-%y-%T.flv;
			on_record_done http://127.0.0.1:3000/recorded;

			# Transcode the incoming stream and push it to the "show" application below, which serves HLS
			exec /usr/bin/ffmpeg -i rtmp://127.0.0.1:1935/live/$name -crf 30 -preset ultrafast -acodec aac -strict experimental -ar 44100 -ac 2 -b:a 64k -vcodec libx264 -x264-params keyint=60:no-scenecut=1 -r 30 -b:v 500k -s 960x540 -f flv rtmp://127.0.0.1/show/$name;
		}

		application show {
			live on;
			# Turn on HLS
			hls on;
			hls_path /mnt/hls/;
			hls_fragment 3;
			hls_playlist_length 60;
			# disable consuming the stream from nginx as rtmp
			deny play all;
		}
	}
}
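Before wiring Jibri in, you can sanity-check the RTMP application by pushing any local H.264/AAC file at it (sample.mp4 and the stream name "test" are placeholders; call /usr/bin/ffmpeg directly so the wrapper from the next section doesn't rewrite the arguments):

# A .flv recording should appear in /var/www/html/recordings when this stops
/usr/bin/ffmpeg -re -i sample.mp4 -c copy -f flv rtmp://127.0.0.1:1935/live/test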

Local FFMPEG

We are going to customize the local ffmpeg wrapper so that, if there is a broadcast, it adds an additional RTMP output and routes it to the local Nginx server. In /usr/local/bin/ffmpeg, put the following:

#!/bin/bash
echo ffmpeg in $0 # Comment out this line once you've verified that running ffmpeg points to this script

ARGS=$@
ARGS=$(echo $ARGS | sed 's/-tune zerolatency/-tune zerolatency -vf scale=854x480/') # Scale video to 854x480

if [[ -n "$(echo $ARGS | grep ' rtmp://')" ]]; then
    # The RTMP destination is the last argument; the stream key is its last path segment
    DST=$(echo $ARGS | rev | awk '{print $1}' | rev)
    STREAM=$(echo $DST | rev | cut -d '/' -f1 | rev)

    # Add a second output that pushes the same stream to the local Nginx
    ARGS="$ARGS -f flv rtmp://127.0.0.1:1935/live/$STREAM"
fi

echo $ARGS >> /tmp/ffmpeg.log

exec /usr/bin/ffmpeg $ARGS
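A quick way to confirm the wrapper is the one being picked up (this assumes /usr/local/bin comes before /usr/bin in PATH, which is the usual default):

# Should list /usr/local/bin/ffmpeg first; the wrapper also appends every
# rewritten command line to /tmp/ffmpeg.log
type -a ffmpeg
tail -f /tmp/ffmpeg.log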

Express

You must have Node version 14 or above for this (require('fs/promises') needs it). We are going to write an Express server that Nginx calls when the recording has stopped.

const express = require('express');
const cors = require('cors');
const bodyParser = require("body-parser");

const shell = require('shelljs');
const FormData = require('form-data');
const axios = require('axios').default;
const fs = require('fs/promises');

const app = express();
const port = 3000;

// Configure express to use body-parser as middleware
// (nginx-rtmp posts its callbacks as urlencoded form data)
app.use(cors())
app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());

app.use(function (req, res, next) {
    res.header("Access-Control-Allow-Origin", "*")
    res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept")
    next()
});


app.post('/recorded', async (req, res) => {

    try {
        // Stream name - remove "-broadcast"
        const name = req.body.name;

        // Make sure to remove the prefix for each stream type
        const event_id = name.replace('-remove-unwanted-stuff', '');

        // Path of the recording nginx just finished writing
        const file_location = req.body.path;

        // An auth token for your server
        const token = 'xxxxxxx';

        // Create a form for the recorded file
        const form = new FormData();
        const file = await fs.readFile(file_location);
        form.append('file', file, event_id);

        // Access config; form.getHeaders() supplies the multipart boundary
        const config = {
            headers: { ...form.getHeaders(), Authorization: `Bearer ${token}` }
        };

        axios.post(`https://route-to/upload/content/${event_id}`, form, config)
            .catch(function (error) {
                console.error(error);
            });

        shell.exec('./recorded.sh')
        res.sendStatus(200);
    } catch (e) {
        console.log(e);
        res.sendStatus(500);
    }
});


app.listen(port, () => {
    console.log(`Example app listening on port ${port}`)
});
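You can exercise the endpoint without a real stream by faking the Nginx callback (on_record_done posts urlencoded form fields; name and path are the only ones this handler reads, and the values here are placeholders):

# Simulate nginx-rtmp's on_record_done POST
curl -X POST http://127.0.0.1:3000/recorded \
  --data-urlencode 'name=mystream' \
  --data-urlencode 'path=/var/www/html/recordings/mystream-01-Jan-21-12:00:00.flv'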

recorded.sh

And finally, in our recorded.sh (the script the Express server calls), we are just going to keep Jibri clean and restart it.

#!/bin/sh
# Restart Jibri so it's clean for the next session

sudo service jibri restart
exit 0

You can trigger the shell script directly using exec_record_done in nginx.
BTW you don't need to restart Jibri after each stream.
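Something like this (a sketch; exec_record_done substitutes $path with the finished recording's full path, and /srv/recorded.sh is just an example location). In the application live block, instead of on_record_done:

	exec_record_done /srv/recorded.sh $path;

And the script itself:

#!/bin/sh
# $1 is the recorded file's path, passed by exec_record_done
RECORDED_FILE="$1"
logger "rtmp recording finished: $RECORDED_FILE"
exit 0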