Jibri FFMPEG - Recording Failed

Here is the output.

root@node98601-env-2068478:/home# ps aux
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
root 1 0.0 0.1 103940 10236 ? Ss 16:57 0:00 init -z
root 83 0.0 0.1 29600 8544 ? Ss 16:57 0:00 /lib/systemd/systemd-journald
root 115 0.0 0.0 19536 4560 ? Ss 16:57 0:00 /lib/systemd/systemd-udevd
root 132 0.0 0.0 225812 3548 ? Ssl 16:57 0:00 /usr/sbin/rsyslogd -n -iNONE
message+ 140 0.0 0.0 8684 3692 ? Ss 16:57 0:00 /usr/bin/dbus-daemon --system --address=systemd: --nofork --nopidfile --systemd-activation --syslog-on
root 144 0.0 0.0 7252 2472 ? Ss 16:57 0:00 /usr/sbin/cron -f
root 145 0.0 0.1 19380 7260 ? Ss 16:57 0:00 /lib/systemd/systemd-logind
systemd+ 146 0.0 0.1 20932 6684 ? Ss 16:57 0:00 /lib/systemd/systemd-networkd
root 170 0.0 0.0 18356 2640 ? Ss 16:57 0:00 /usr/sbin/saslauthd -a pam -c -m /var/run/saslauthd -n 2
root 171 0.0 0.0 18356 916 ? S 16:57 0:00 /usr/sbin/saslauthd -a pam -c -m /var/run/saslauthd -n 2
jibri 282 0.0 0.7 1048816 44352 ? Ssl 16:57 0:00 /usr/lib/xorg/Xorg -nocursor -noreset +extension RANDR +extension RENDER -logfile /var/log/jitsi/jibri
root 284 0.0 0.1 15840 6560 ? Ss 16:57 0:00 /usr/sbin/sshd -D
jibri 309 0.0 0.0 5556 1632 ? Ss 16:57 0:00 /usr/bin/icewm-session
jibri 311 0.0 0.1 35636 7208 ? SNs 16:57 0:00 /usr/bin/icewmbg
jibri 313 0.0 0.1 42840 11828 ? Ss 16:57 0:00 /usr/bin/icewm --notify
root 359 0.0 0.0 4160 1972 ? Ss+ 16:57 0:00 /sbin/agetty -o -p -- \u --noclear --keep-baud console 115200,38400,9600 linux
root 360 0.0 0.0 2408 1676 ? Ss+ 16:57 0:00 /sbin/agetty -o -p -- \u --noclear tty2 linux
root 496 0.0 0.0 9596 2200 ? Ss 16:57 0:00 /usr/sbin/xinetd -pidfile /run/xinetd.pid -stayalive -inetd_compat -inetd_ipv6
root 759 0.0 0.1 16596 7800 ? Rs 16:58 0:00 sshd: root@pts/0
root 762 0.0 0.1 21144 9120 ? Ss 16:58 0:00 /lib/systemd/systemd --user
root 763 0.0 0.0 104904 2416 ? S 16:58 0:00 (sd-pam)
root 777 0.0 0.0 3988 3304 pts/0 Ss 16:58 0:00 -bash
jibri 1450 0.7 2.5 6357604 159864 ? Ssl 17:02 0:05 java -Djava.util.logging.config.file=/etc/jitsi/jibri/logging.properties -Dconfig.file=/etc/jitsi/jibr
root 3081 0.0 0.0 7628 2720 pts/0 R+ 17:15 0:00 ps aux
root@node98601-env-2068478:/home#

@Freddie I did create a new file, but I think I messed up the jicofo secret and the jvb secret and am getting the SASL auth error.

I’ll fix that and update you on the results from the new .asoundrc in a couple of hours.

Thank you!

Did you check this?

@emrah It’s running on Jelastic, a native container cloud. Jibri is deployed as a standalone container and connects to Jitsi on another machine.

The jibri user can’t find the soundcard, but the root user detects the device.

Root on the container or root on the host?

@emrah root on the container and on the host are the same in this case. On Jelastic PaaS they are the same, since it’s a native container cloud.

@Freddie Good day!

I recreated the .asoundrc and tried again. Here is the output.

2021-02-23 04:05:55.928 INFO: [85] ffmpeg.call() [x11grab @ 0x564b8dfe8580] Stream #0: not enough frames to estimate rate; consider increasing probesize
2021-02-23 04:05:55.928 INFO: [85] ffmpeg.call() Input #0, x11grab, from ':0.0+0,0':
2021-02-23 04:05:55.928 INFO: [85] ffmpeg.call() Duration: N/A, start: 1614053155.882163, bitrate: N/A
2021-02-23 04:05:55.928 INFO: [85] ffmpeg.call() Stream #0:0: Video: rawvideo (BGR[0] / 0x524742), bgr0, 1280x720, 30 fps, 1000k tbr, 1000k tbn, 1000k tbc
2021-02-23 04:05:55.978 INFO: [85] ffmpeg.call() ALSA lib pcm_direct.c:1824:(_snd_pcm_direct_get_slave_ipc_offset) Invalid value for card
2021-02-23 04:05:55.978 INFO: [85] ffmpeg.call() [alsa @ 0x564b8dff2c80] cannot open audio device plug:bsnoop (No such device)
2021-02-23 04:05:55.978 INFO: [85] ffmpeg.call() plug:bsnoop: Input/output error

I don’t know what Jelastic PaaS is or how it works. For an LXC container, I set the following config on the host to allow the container to access sound devices:

lxc.cgroup.devices.allow = c 116:* rwm
lxc.mount.entry = /dev/snd dev/snd none bind,optional,create=dir
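
To verify that the container actually sees the devices, and that the jibri user (not just root) can open them, something like this can be run inside the container (a sketch, assuming the user is named jibri and sudo is available):

ls -l /dev/snd                   # device nodes the container was given, and their owner/group
sudo -H -u jibri aplay -l        # playback hardware visible to the jibri user
sudo -H -u jibri arecord -l      # capture hardware visible to the jibri user
sudo -H -u jibri arecord -L      # PCM names defined by ALSA configs such as ~jibri/.asoundrc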

@Freddie Any suggestions?

I’m not, by any stretch, a docker expert, so your situation throws me for a loop. From what I understand, when you check as root, you find the sound module, but when you check as jibri, you don’t. Jibri needs to be able to access the sound module for your recording to work. My thinking is that your base infrastructure has the sound module installed, but your docker pod doesn’t. If so, you need to make sure you’re creating the module in the pod and not on your base infrastructure.

@Freddie OK, but if I deploy the “full Jitsi stack containers” (Jitsi + Prosody + JVB + Jibri), it works on the same host. Jibri works without any problem.

But when trying a standalone Jibri, this whole problem appears.

That suggests to me that the jibri pod has an outside dependency somehow. My thinking is that, with docker, everything you need to run the app/service should be packaged inside the pod itself.

Check the home directory within your Jibri docker pod. If the .asoundrc file is not there, create it. Make sure Jibri can access it. That should likely solve your problem.
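
For reference, the plug:bsnoop device in your ffmpeg log has to resolve to a PCM defined in that .asoundrc. Here is a minimal sketch, assuming the snd_aloop loopback module is loaded and shows up as a card named Loopback; the file shipped with your Jibri image may differ, and the card/subdevice numbers are assumptions:

# playback PCM writing into the loopback card
pcm.bmix {
  type dmix
  ipc_key 219345
  slave.pcm "hw:Loopback,0,0"
}

# capture PCM that ffmpeg opens as plug:bsnoop; "Invalid value for card"
# usually means the hw: card referenced here cannot be resolved for that user
pcm.bsnoop {
  type dsnoop
  ipc_key 219346
  slave.pcm "hw:Loopback,0,1"
}

# full-duplex device combining the two, used as the default
pcm.bduplex {
  type asym
  playback.pcm "bmix"
  capture.pcm "bsnoop"
}

pcm.!default {
  type plug
  slave.pcm "bduplex"
}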

@Freddie You’re right, .asoundrc is the key here. As the root user I can access the devices, but as the jibri user I can’t.

Somehow I haven’t been able to figure out why.

@Prashanth @Freddie @damencho Thank you for the great guidance and help. I was able to sort out that issue by making changes in rc.local. It’s working perfectly fine.
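
For anyone who lands here with the same problem: the idea is to make sure the loopback module is loaded and the /dev/snd nodes are readable by the jibri user before Jibri starts. A sketch of that kind of rc.local (the group name and the assumption that jibri is in the audio group may differ on your image):

#!/bin/sh -e
# load the ALSA loopback module so a virtual soundcard exists at boot
modprobe snd-aloop

# give the audio group access to the devices (add the jibri user to that group if it isn't already)
chgrp -R audio /dev/snd
chmod -R g+rw /dev/snd

exit 0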

I have a follow-up question: can I deploy a 2nd, 3rd, or more Jibris on separate machines, pointing to the same host the 1st Jibri points to? Will I be able to use all of them that way?

Your input and suggestions are needed to make it scalable.

Thanks

Yes, you can do that and have a jibri pool for recordings.

@Prashanth I thought so. Where can I check the available pool of Jibris in the brewery? At which path?

Because I tried 2 Jibri machines, but only 1 recorder worked.

When I tried starting another recording, I got the error “All recorders are busy”.

The second Jibri instance needs a different nickname in jibri.conf. You may check either the Jibri or the Jicofo logs.

In the Jibri log, you should see something like “…Joined MUC: jibribrewery@internal.auth.yourdomain”.
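
Roughly, the part of jibri.conf that matters here looks like this; it’s a sketch, your environment block will contain more settings, and the domain/nickname values are placeholders:

jibri {
  api {
    xmpp {
      environments = [
        {
          // ... xmpp-server-hosts, xmpp-domain, control-login, call-login, etc. ...
          control-muc {
            domain = "internal.auth.yourdomain"
            room-name = "JibriBrewery"
            // must be unique for every Jibri joining the same brewery
            nickname = "jibri-instance-2"
          }
        }
      ]
    }
  }
}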

@Prashanth I did change the nickname. Both Jibris were on and configured, but only one recording could be processed. In both Jibri logs I could see “Joined MUC: jibribrewery@internal.auth.”

I think the Jibri brewery needs to be checked for the available Jibri count there.

Any thoughts on where the Jibri brewery count can be checked?

Stop the first Jibri (the working one) and start a recording. If the recording starts on the second Jibri, you have it correctly registered. Otherwise, there’s some configuration that still needs to be changed.

@Prashanth That I already did. The second one works fine too.