Docker+Jibri: cannot start recording baseUrl=<no value>

Hello there,

I’m facing an issue when I try to record with Jibri. Here are the docker-compose logs:

jicofo_1   | INFO: Chat room event PresenceUpdated member=ChatMember[administrativeauditsinhibitclose@muc.meet.jitsi/628d55dc, jid: 628d55dc-d886-432e-8607-0d75fc91326e@meet.jitsi/jBsryp57]@1632617437
web_1      | 93.148.161.221 - - [01/Jul/2021:08:40:09 +0000] "GET /pwa-worker.js HTTP/2.0" 200 1499 "https://test-video.ionio.app/pwa-worker.js" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.114 Safari/537.36"
jicofo_1   | Jul 01, 2021 8:40:13 AM org.jitsi.utils.logging2.LoggerImpl log
jicofo_1   | INFO: Starting session with Jibri jibribrewery@internal-muc.meet.jitsi/jibri-instanse-230318422
jicofo_1   | Jul 01, 2021 8:40:13 AM org.jitsi.utils.logging2.LoggerImpl log
jicofo_1   | INFO: Starting Jibri jibribrewery@internal-muc.meet.jitsi/jibri-instanse-230318422 for stream ID: null in room: administrativeauditsinhibitclose@muc.meet.jitsi
jibri_1    | 2021-07-01 08:40:13.159 FINE: [112] org.jitsi.xmpp.mucclient.MucClient.log() Received an IQ with type set: IQ Stanza (jibri http://jitsi.org/protocol/jibri) [to=jibri@auth.meet.jitsi/1HkWXSz9,from=jibribrewery@internal-muc.meet.jitsi/focus,id=amlicmlAYXV0aC5tZWV0LmppdHNpLzFIa1dYU3o5AHB4VTllLTEwNjMAfTCgC9u7i1Yav6vesKJXxQ==,type=set,]
jibri_1    | 2021-07-01 08:40:13.160 INFO: [112] org.jitsi.jibri.api.xmpp.XmppApi.log() Received JibriIq <iq to='jibri@auth.meet.jitsi/1HkWXSz9' from='jibribrewery@internal-muc.meet.jitsi/focus' id='amlicmlAYXV0aC5tZWV0LmppdHNpLzFIa1dYU3o5AHB4VTllLTEwNjMAfTCgC9u7i1Yav6vesKJXxQ==' type='set'><jibri xmlns='http://jitsi.org/protocol/jibri' action='start' recording_mode='file' room='administrativeauditsinhibitclose@muc.meet.jitsi' session_id='ucstsyhfufcyjiqa' app_data='{"file_recording_metadata":{"share":true}}'/></iq> from environment [MucClient id=xmpp.meet.jitsi hostname=xmpp.meet.jitsi]
jibri_1    | 2021-07-01 08:40:13.161 INFO: [112] org.jitsi.jibri.api.xmpp.XmppApi.log() Received start request, starting service
jibri_1    | 2021-07-01 08:40:13.167 INFO: [112] org.jitsi.jibri.api.xmpp.XmppApi.log() Parsed call url info: CallUrlInfo(baseUrl=<no value>, callName=administrativeauditsinhibitclose, urlParams=[])
jibri_1    | 2021-07-01 08:40:13.167 INFO: [112] org.jitsi.jibri.JibriManager.log() Starting a file recording with params: FileRecordingRequestParams(callParams=CallParams(callUrlInfo=CallUrlInfo(baseUrl=<no value>, callName=administrativeauditsinhibitclose, urlParams=[]), email='', passcode=null, callStatsUsernameOverride=, displayName=), sessionId=ucstsyhfufcyjiqa, callLoginParams=XmppCredentials(domain=recorder.meet.jitsi, port=null, username=recorder, password=afd0f795f67d491aeaa01780ae48317e))
jibri_1    | 2021-07-01 08:40:13.168 FINE: [112] org.jitsi.jibri.capture.ffmpeg.FfmpegCapturer.log() Detected os as OS: LINUX
jibri_1    | 2021-07-01 08:40:13.169 FINE: [112] org.jitsi.jibri.config.log() ConfigSourceSupplier: Trying to retrieve key 'jibri.chrome.flags' from source 'config' as type kotlin.collections.List<kotlin.String>
jibri_1    | 2021-07-01 08:40:13.171 FINE: [112] org.jitsi.jibri.config.log() ConfigSourceSupplier: Successfully retrieved key 'jibri.chrome.flags' from source 'config' as type kotlin.collections.List<kotlin.String>
jibri_1    | Starting ChromeDriver 90.0.4430.24 (4c6d850f087da467d926e8eddb76550aed655991-refs/branch-heads/4430@{#429}) on port 1645
jibri_1    | Only local connections are allowed.
jibri_1    | Please see https://chromedriver.chromium.org/security-considerations for suggestions on keeping ChromeDriver safe.
jibri_1    | ChromeDriver was started successfully.
jibri_1    | 2021-07-01 08:40:13.725 INFO: [112] org.openqa.selenium.remote.ProtocolHandshake.createSession() Detected dialect: OSS
jibri_1    | 2021-07-01 08:40:13.732 FINE: [112] org.jitsi.jibri.config.log() FallbackSupplier: checking for value via suppliers:
jibri_1    |   LambdaSupplier: 'JibriConfig::recordingDirectory'
jibri_1    |   ConfigSourceSupplier: key: 'jibri.recording.recordings-directory', type: 'kotlin.String', source: 'config'
jibri_1    | 2021-07-01 08:40:13.732 FINE: [112] org.jitsi.jibri.config.log() LambdaSupplier: Trying to retrieve value via JibriConfig::recordingDirectory
jibri_1    | 2021-07-01 08:40:13.734 FINE: [112] org.jitsi.jibri.config.log() LambdaSupplier: 'JibriConfig::recordingDirectory': found value
jibri_1    | 2021-07-01 08:40:13.734 FINE: [112] org.jitsi.jibri.config.log() FallbackSupplier: value found via LambdaSupplier: 'JibriConfig::recordingDirectory'
jibri_1    | 2021-07-01 08:40:13.735 FINE: [112] org.jitsi.jibri.config.log() FallbackSupplier: checking for value via suppliers:
jibri_1    |   LambdaSupplier: 'JibriConfig::finalizeRecordingScriptPath'
jibri_1    |   ConfigSourceSupplier: key: 'jibri.recording.finalize-script', type: 'kotlin.String', source: 'config'
jibri_1    | 2021-07-01 08:40:13.735 FINE: [112] org.jitsi.jibri.config.log() LambdaSupplier: Trying to retrieve value via JibriConfig::finalizeRecordingScriptPath
jibri_1    | 2021-07-01 08:40:13.736 FINE: [112] org.jitsi.jibri.config.log() LambdaSupplier: 'JibriConfig::finalizeRecordingScriptPath': found value
jibri_1    | 2021-07-01 08:40:13.736 FINE: [112] org.jitsi.jibri.config.log() FallbackSupplier: value found via LambdaSupplier: 'JibriConfig::finalizeRecordingScriptPath'
jibri_1    | 2021-07-01 08:40:13.737 INFO: [112] org.jitsi.jibri.service.impl.FileRecordingJibriService.log() Writing recording to /config/recordings/ucstsyhfufcyjiqa, finalize script path /config/finalize.sh
jibri_1    | 2021-07-01 08:40:13.738 FINE: [112] org.jitsi.jibri.statsd.JibriStatsDClient.log() Incrementing statsd counter: start:recording
jibri_1    | 2021-07-01 08:40:13.740 INFO: [112] org.jitsi.jibri.status.JibriStatusManager.log() Busy status has changed: IDLE -> BUSY
jibri_1    | 2021-07-01 08:40:13.741 FINE: [112] org.jitsi.jibri.webhooks.v1.WebhookClient.log() Updating 0 subscribers of status
jibri_1    | 2021-07-01 08:40:13.741 INFO: [112] org.jitsi.jibri.api.xmpp.XmppApi.log() Jibri reports its status is now JibriStatus(busyStatus=BUSY, health=OverallHealth(healthStatus=HEALTHY, details={})), publishing presence to connections
jibri_1    | 2021-07-01 08:40:13.744 FINE: [112] org.jitsi.xmpp.mucclient.MucClientManager.log() Setting a presence extension: org.jitsi.xmpp.extensions.jibri.JibriStatusPacketExt@6851a5ac
jibri_1    | 2021-07-01 08:40:13.745 FINE: [112] org.jitsi.xmpp.mucclient.MucClientManager.log() Replacing presence extension: org.jitsi.xmpp.extensions.jibri.JibriStatusPacketExt@18dae049
jibri_1    | 2021-07-01 08:40:13.748 FINE: [37] org.jitsi.xmpp.extensions.DefaultPacketExtensionProvider.parse() Could not add a provider for element busy-status from namespace http://jitsi.org/protocol/jibri
jibri_1    | 2021-07-01 08:40:13.748 FINE: [37] org.jitsi.xmpp.extensions.DefaultPacketExtensionProvider.parse() Could not add a provider for element health-status from namespace http://jitsi.org/protocol/health
jibri_1    | 2021-07-01 08:40:13.755 INFO: [112] org.jitsi.jibri.api.xmpp.XmppApi.log() Sending 'pending' response to start IQ
jibri_1    | 2021-07-01 08:40:13.757 INFO: [124] org.jitsi.jibri.selenium.pageobjects.HomePage.log() Visiting url <no value>
jibri_1    | 2021-07-01 08:40:13.772 SEVERE: [124] org.jitsi.jibri.selenium.JibriSelenium.log() An error occurred while joining the call
jibri_1    | org.openqa.selenium.InvalidArgumentException: invalid argument
jibri_1    |   (Session info: chrome=90.0.4430.212)
jibri_1    |   (Driver info: chromedriver=90.0.4430.24 (4c6d850f087da467d926e8eddb76550aed655991-refs/branch-heads/4430@{#429}),platform=Linux 4.4.0-210-generic x86_64) (WARNING: The server did not provide any stacktrace information)
jibri_1    | Command duration or timeout: 0 milliseconds
jibri_1    | Build info: version: 'unknown', revision: 'unknown', time: 'unknown'
jibri_1    | System info: host: '29a0fdbc575c', ip: '192.168.176.6', os.name: 'Linux', os.arch: 'amd64', os.version: '4.4.0-210-generic', java.version: '1.8.0_292'
jibri_1    | Driver info: org.openqa.selenium.chrome.ChromeDriver
jibri_1    | Capabilities {acceptInsecureCerts: false, acceptSslCerts: false, applicationCacheEnabled: false, browserConnectionEnabled: false, browserName: chrome, chrome: {chromedriverVersion: 90.0.4430.24 (4c6d850f087da..., userDataDir: /tmp/.com.google.Chrome.FWBzYW}, cssSelectorsEnabled: true, databaseEnabled: false, goog:chromeOptions: {debuggerAddress: localhost:45278}, handlesAlerts: true, hasTouchScreen: false, javascriptEnabled: true, locationContextEnabled: true, mobileEmulationEnabled: false, nativeEvents: true, networkConnectionEnabled: false, pageLoadStrategy: normal, platform: LINUX, platformName: LINUX, proxy: Proxy(), rotatable: false, setWindowRect: true, strictFileInteractability: false, takesHeapSnapshot: true, takesScreenshot: true, timeouts: {implicit: 0, pageLoad: 300000, script: 30000}, unexpectedAlertBehaviour: ignore, unhandledPromptBehavior: ignore, version: 90.0.4430.212, webStorageEnabled: true, webauthn:extension:largeBlob: true, webauthn:virtualAuthenticators: true}
jibri_1    | Session ID: 1a4dee4f4fed0a02797a539925903419
jibri_1    | 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
jibri_1    | 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
jibri_1    | 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
jibri_1    | 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
jibri_1    | 	at org.openqa.selenium.remote.ErrorHandler.createThrowable(ErrorHandler.java:214)
jibri_1    | 	at org.openqa.selenium.remote.ErrorHandler.throwIfResponseFailed(ErrorHandler.java:166)
jibri_1    | 	at org.openqa.selenium.remote.http.JsonHttpResponseCodec.reconstructValue(JsonHttpResponseCodec.java:40)
jibri_1    | 	at org.openqa.selenium.remote.http.AbstractHttpResponseCodec.decode(AbstractHttpResponseCodec.java:80)
jibri_1    | 	at org.openqa.selenium.remote.http.AbstractHttpResponseCodec.decode(AbstractHttpResponseCodec.java:44)
jibri_1    | 	at org.openqa.selenium.remote.HttpCommandExecutor.execute(HttpCommandExecutor.java:158)
jibri_1    | 	at org.openqa.selenium.remote.service.DriverCommandExecutor.execute(DriverCommandExecutor.java:83)
jibri_1    | 	at org.openqa.selenium.remote.RemoteWebDriver.execute(RemoteWebDriver.java:543)
jibri_1    | 	at org.openqa.selenium.remote.RemoteWebDriver.get(RemoteWebDriver.java:271)
jibri_1    | 	at org.jitsi.jibri.selenium.pageobjects.AbstractPageObject.visit(AbstractPageObject.kt:35)
jibri_1    | 	at org.jitsi.jibri.selenium.JibriSelenium$joinCall$1.run(JibriSelenium.kt:278)
jibri_1    | 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
jibri_1    | 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
jibri_1    | 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
jibri_1    | 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
jibri_1    | 	at java.lang.Thread.run(Thread.java:748)
jibri_1    | 2021-07-01 08:40:13.773 INFO: [124] org.jitsi.jibri.selenium.JibriSelenium.log() Transitioning from state Starting up to Error: FailedToJoinCall SESSION Failed to join the call
jibri_1    | 2021-07-01 08:40:13.773 INFO: [124] org.jitsi.jibri.service.impl.FileRecordingJibriService.log() File recording service transitioning from state Starting up to Error: FailedToJoinCall SESSION Failed to join the call
jibri_1    | 2021-07-01 08:40:13.775 INFO: [124] org.jitsi.jibri.api.xmpp.XmppApi.log() Current service had an error Error: FailedToJoinCall SESSION Failed to join the call, sending error iq <iq to='jibribrewery@internal-muc.meet.jitsi/focus' id='wZYHh-154' type='set'><jibri xmlns='http://jitsi.org/protocol/jibri' status='off' failure_reason='error' should_retry='true'/></iq>
jibri_1    | 2021-07-01 08:40:13.777 FINE: [124] org.jitsi.jibri.statsd.JibriStatsDClient.log() Incrementing statsd counter: stop:recording
jibri_1    | 2021-07-01 08:40:13.778 INFO: [124] org.jitsi.jibri.JibriManager.log() Stopping the current service
jibri_1    | 2021-07-01 08:40:13.778 INFO: [124] org.jitsi.jibri.service.impl.FileRecordingJibriService.log() Stopping capturer
jibri_1    | 2021-07-01 08:40:13.778 INFO: [124] org.jitsi.jibri.util.JibriSubprocess.log() Stopping ffmpeg process
jibri_1    | 2021-07-01 08:40:13.779 INFO: [124] org.jitsi.jibri.util.JibriSubprocess.log() ffmpeg exited with value null
jibri_1    | 2021-07-01 08:40:13.779 INFO: [124] org.jitsi.jibri.service.impl.FileRecordingJibriService.log() Quitting selenium
jicofo_1   | Jul 01, 2021 8:40:13 AM org.jitsi.utils.logging2.LoggerImpl log
jicofo_1   | INFO: Updating status from JIBRI: <iq to='focus@auth.meet.jitsi/focus' from='jibribrewery@internal-muc.meet.jitsi/jibri-instanse-230318422' id='pxU9e-1063' type='result'><jibri xmlns='http://jitsi.org/protocol/jibri' status='pending'/></iq> for administrativeauditsinhibitclose@muc.meet.jitsi
jicofo_1   | Jul 01, 2021 8:40:13 AM org.jitsi.utils.logging2.LoggerImpl log
jicofo_1   | INFO: Got Jibri status update: Jibri jibribrewery@internal-muc.meet.jitsi/jibri-instanse-230318422 has status pending and failure reason null, current Jibri jid is jibribrewery@internal-muc.meet.jitsi/jibri-instanse-230318422
jicofo_1   | Jul 01, 2021 8:40:13 AM org.jitsi.utils.logging2.LoggerImpl log
jicofo_1   | INFO: Started Jibri session
jibri_1    | 2021-07-01 08:40:13.793 INFO: [124] org.jitsi.jibri.service.impl.FileRecordingJibriService.log() Participants in this recording: []
jicofo_1   | Jul 01, 2021 8:40:13 AM org.jitsi.utils.logging2.LoggerImpl log
jicofo_1   | INFO: Updating status from JIBRI: <iq to='focus@auth.meet.jitsi/focus' from='jibribrewery@internal-muc.meet.jitsi/jibri-instanse-230318422' id='Zm9jdXNAYXV0aC5tZWV0LmppdHNpL2ZvY3VzAHdaWUhoLTE1NAAB4O9ZPrdC0KJW6EKN3ttk' type='set'><jibri xmlns='http://jitsi.org/protocol/jibri' status='off' failure_reason='error' should_retry='true'/></iq> for administrativeauditsinhibitclose@muc.meet.jitsi
jicofo_1   | Jul 01, 2021 8:40:13 AM org.jitsi.utils.logging2.LoggerImpl log
jicofo_1   | INFO: Got Jibri status update: Jibri jibribrewery@internal-muc.meet.jitsi/jibri-instanse-230318422 has status off and failure reason error, current Jibri jid is jibribrewery@internal-muc.meet.jitsi/jibri-instanse-230318422
jicofo_1   | Jul 01, 2021 8:40:13 AM org.jitsi.utils.logging2.LoggerImpl log
jicofo_1   | INFO: Jibri failed, trying to fall back to another Jibri
jicofo_1   | Jul 01, 2021 8:40:13 AM org.jitsi.utils.logging2.LoggerImpl log
jicofo_1   | SEVERE: Unable to find an available Jibri, can't start
jicofo_1   | Jul 01, 2021 8:40:13 AM org.jitsi.utils.logging2.LoggerImpl log
jicofo_1   | WARNING: Failed to fall back to another Jibri, this session has now failed: org.jitsi.jicofo.jibri.JibriSession$StartException$AllBusy: All jibri instances are busy
jicofo_1   | org.jitsi.jicofo.jibri.JibriSession$StartException$AllBusy: All jibri instances are busy
jicofo_1   | 	at org.jitsi.jicofo.jibri.JibriSession.startInternal(JibriSession.java:308)
jicofo_1   | 	at org.jitsi.jicofo.jibri.JibriSession.start(JibriSession.java:284)
jicofo_1   | 	at org.jitsi.jicofo.jibri.JibriSession.retryRequestWithAnotherJibri(JibriSession.java:586)
jicofo_1   | 	at org.jitsi.jicofo.jibri.JibriSession.handleJibriStatusUpdate(JibriSession.java:654)
jicofo_1   | 	at org.jitsi.jicofo.jibri.JibriSession.processJibriIqFromJibri(JibriSession.java:441)
jicofo_1   | 	at org.jitsi.jicofo.jibri.JibriSession.processJibriIqRequestFromJibri(JibriSession.java:425)
jicofo_1   | 	at org.jitsi.jicofo.jibri.BaseJibri.doHandleIQRequest(BaseJibri.kt:154)
jicofo_1   | 	at org.jitsi.jicofo.jibri.BaseJibri.access$doHandleIQRequest(BaseJibri.kt:42)
jicofo_1   | 	at org.jitsi.jicofo.jibri.BaseJibri$incomingIqQueue$1.handlePacket(BaseJibri.kt:53)
jicofo_1   | 	at org.jitsi.jicofo.jibri.BaseJibri$incomingIqQueue$1.handlePacket(BaseJibri.kt:42)
jicofo_1   | 	at org.jitsi.utils.queue.PacketQueue$HandlerAdapter.handleItem(PacketQueue.java:380)
jicofo_1   | 	at org.jitsi.utils.queue.AsyncQueueHandler$1.run(AsyncQueueHandler.java:133)
jicofo_1   | 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
jicofo_1   | 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
jicofo_1   | 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
jicofo_1   | 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
jicofo_1   | 	at java.lang.Thread.run(Thread.java:748)
jicofo_1   |
jicofo_1   | Jul 01, 2021 8:40:13 AM org.jitsi.utils.logging2.LoggerImpl log
jicofo_1   | INFO: Got jibri status off and failure error
jicofo_1   | Jul 01, 2021 8:40:13 AM org.jitsi.utils.logging2.LoggerImpl log
jicofo_1   | INFO: Publishing new jibri-recording-status: <jibri-recording-status xmlns='http://jitsi.org/protocol/jibri' status='off' failure_reason='error' session_id='ucstsyhfufcyjiqa' recording_mode='file'/> in: administrativeauditsinhibitclose@muc.meet.jitsi
jicofo_1   | Jul 01, 2021 8:40:13 AM org.jitsi.utils.logging2.LoggerImpl log
jicofo_1   | INFO: Cleaning up current JibriSession
jibri_1    | 2021-07-01 08:40:13.808 INFO: [124] org.jitsi.jibri.selenium.JibriSelenium.log() Leaving call and quitting browser
jibri_1    | 2021-07-01 08:40:13.809 INFO: [124] org.jitsi.jibri.selenium.JibriSelenium.log() Recurring call status checks cancelled
jibri_1    | 2021-07-01 08:40:13.817 INFO: [124] org.jitsi.jibri.selenium.JibriSelenium.log() Got 0 log entries for type browser
jibri_1    | 2021-07-01 08:40:13.826 INFO: [124] org.jitsi.jibri.selenium.JibriSelenium.log() Got 81 log entries for type driver
jibri_1    | 2021-07-01 08:40:13.843 INFO: [124] org.jitsi.jibri.selenium.JibriSelenium.log() Got 0 log entries for type client
jibri_1    | 2021-07-01 08:40:13.844 INFO: [124] org.jitsi.jibri.selenium.JibriSelenium.log() Leaving web call
jibri_1    | 2021-07-01 08:40:13.860 INFO: [124] org.jitsi.jibri.selenium.JibriSelenium.log() Quitting chrome driver
jibri_1    | 2021-07-01 08:40:13.929 INFO: [124] org.jitsi.jibri.selenium.JibriSelenium.log() Chrome driver quit
jibri_1    | 2021-07-01 08:40:13.930 INFO: [124] org.jitsi.jibri.service.impl.FileRecordingJibriService.log() Finalizing the recording
jibri_1    | 2021-07-01 08:40:13.931 SEVERE: [124] org.jitsi.jibri.service.impl.FileRecordingJibriService.log() Failed to run finalize script
jibri_1    | java.io.IOException: Cannot run program "/config/finalize.sh": error=2, No such file or directory
jibri_1    | 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
jibri_1    | 	at org.jitsi.jibri.util.ProcessWrapper.start(ProcessWrapper.kt:88)
jibri_1    | 	at org.jitsi.jibri.service.impl.FileRecordingJibriService.finalize(FileRecordingJibriService.kt:220)
jibri_1    | 	at org.jitsi.jibri.service.impl.FileRecordingJibriService.stop(FileRecordingJibriService.kt:205)
jibri_1    | 	at org.jitsi.jibri.JibriManager.stopService(JibriManager.kt:263)
jibri_1    | 	at org.jitsi.jibri.JibriManager$startService$1.invoke(JibriManager.kt:211)
jibri_1    | 	at org.jitsi.jibri.JibriManager$startService$1.invoke(JibriManager.kt:85)
jibri_1    | 	at org.jitsi.jibri.util.StatusPublisher$addStatusHandler$1.invoke(StatusPublisher.kt:37)
jibri_1    | 	at org.jitsi.jibri.util.StatusPublisher$addStatusHandler$1.invoke(StatusPublisher.kt:29)
jibri_1    | 	at org.jitsi.jibri.util.StatusPublisher$publishStatus$1.invoke(StatusPublisher.kt:53)
jibri_1    | 	at org.jitsi.jibri.util.StatusPublisher$publishStatus$1.invoke(StatusPublisher.kt:29)
jibri_1    | 	at kotlin.collections.CollectionsKt__MutableCollectionsKt.filterInPlace$CollectionsKt__MutableCollectionsKt(MutableCollections.kt:285)
jibri_1    | 	at kotlin.collections.CollectionsKt__MutableCollectionsKt.retainAll(MutableCollections.kt:276)
jibri_1    | 	at org.jitsi.jibri.util.StatusPublisher.publishStatus(StatusPublisher.kt:53)
jibri_1    | 	at org.jitsi.jibri.service.impl.StatefulJibriService.onServiceStateChange(StatefulJibriService.kt:40)
jibri_1    | 	at org.jitsi.jibri.service.impl.StatefulJibriService.access$onServiceStateChange(StatefulJibriService.kt:26)
jibri_1    | 	at org.jitsi.jibri.service.impl.StatefulJibriService$1.invoke(StatefulJibriService.kt:35)
jibri_1    | 	at org.jitsi.jibri.service.impl.StatefulJibriService$1.invoke(StatefulJibriService.kt:26)
jibri_1    | 	at org.jitsi.jibri.util.NotifyingStateMachine.notify(NotifyingStateMachine.kt:26)
jibri_1    | 	at org.jitsi.jibri.service.JibriServiceStateMachine.access$notify(JibriServiceStateMachine.kt:46)
jibri_1    | 	at org.jitsi.jibri.service.JibriServiceStateMachine$stateMachine$1$5.invoke(JibriServiceStateMachine.kt:100)
jibri_1    | 	at org.jitsi.jibri.service.JibriServiceStateMachine$stateMachine$1$5.invoke(JibriServiceStateMachine.kt:46)
jibri_1    | 	at com.tinder.StateMachine.notifyOnTransition(StateMachine.kt:65)
jibri_1    | 	at com.tinder.StateMachine.transition(StateMachine.kt:23)
jibri_1    | 	at org.jitsi.jibri.service.JibriServiceStateMachine.transition(JibriServiceStateMachine.kt:112)
jibri_1    | 	at org.jitsi.jibri.service.impl.StatefulJibriService$registerSubComponent$1.invoke(StatefulJibriService.kt:46)
jibri_1    | 	at org.jitsi.jibri.service.impl.StatefulJibriService$registerSubComponent$1.invoke(StatefulJibriService.kt:26)
jibri_1    | 	at org.jitsi.jibri.util.StatusPublisher$addStatusHandler$1.invoke(StatusPublisher.kt:37)
jibri_1    | 	at org.jitsi.jibri.util.StatusPublisher$addStatusHandler$1.invoke(StatusPublisher.kt:29)
jibri_1    | 	at org.jitsi.jibri.util.StatusPublisher$publishStatus$1.invoke(StatusPublisher.kt:53)
jibri_1    | 	at org.jitsi.jibri.util.StatusPublisher$publishStatus$1.invoke(StatusPublisher.kt:29)
jibri_1    | 	at kotlin.collections.CollectionsKt__MutableCollectionsKt.filterInPlace$CollectionsKt__MutableCollectionsKt(MutableCollections.kt:285)
jibri_1    | 	at kotlin.collections.CollectionsKt__MutableCollectionsKt.retainAll(MutableCollections.kt:276)
jibri_1    | 	at org.jitsi.jibri.util.StatusPublisher.publishStatus(StatusPublisher.kt:53)
jibri_1    | 	at org.jitsi.jibri.selenium.JibriSelenium.onSeleniumStateChange(JibriSelenium.kt:208)
jibri_1    | 	at org.jitsi.jibri.selenium.JibriSelenium.access$onSeleniumStateChange(JibriSelenium.kt:158)
jibri_1    | 	at org.jitsi.jibri.selenium.JibriSelenium$1.invoke(JibriSelenium.kt:193)
jibri_1    | 	at org.jitsi.jibri.selenium.JibriSelenium$1.invoke(JibriSelenium.kt:158)
jibri_1    | 	at org.jitsi.jibri.util.NotifyingStateMachine.notify(NotifyingStateMachine.kt:26)
jibri_1    | 	at org.jitsi.jibri.selenium.SeleniumStateMachine.access$notify(SeleniumStateMachine.kt:33)
jibri_1    | 	at org.jitsi.jibri.selenium.SeleniumStateMachine$stateMachine$1$5.invoke(SeleniumStateMachine.kt:78)
jibri_1    | 	at org.jitsi.jibri.selenium.SeleniumStateMachine$stateMachine$1$5.invoke(SeleniumStateMachine.kt:33)
jibri_1    | 	at com.tinder.StateMachine.notifyOnTransition(StateMachine.kt:65)
jibri_1    | 	at com.tinder.StateMachine.transition(StateMachine.kt:23)
jibri_1    | 	at org.jitsi.jibri.selenium.SeleniumStateMachine.transition(SeleniumStateMachine.kt:83)
jibri_1    | 	at org.jitsi.jibri.selenium.JibriSelenium$joinCall$1.run(JibriSelenium.kt:311)
jibri_1    | 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
jibri_1    | 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
jibri_1    | 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
jibri_1    | 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
jibri_1    | 	at java.lang.Thread.run(Thread.java:748)
jibri_1    | Caused by: java.io.IOException: error=2, No such file or directory
jibri_1    | 	at java.lang.UNIXProcess.forkAndExec(Native Method)
jibri_1    | 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:247)
jibri_1    | 	at java.lang.ProcessImpl.start(ProcessImpl.java:134)
jibri_1    | 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
jibri_1    | 	... 50 more
jibri_1    | 2021-07-01 08:40:13.932 INFO: [124] org.jitsi.jibri.status.JibriStatusManager.log() Busy status has changed: BUSY -> IDLE
jibri_1    | 2021-07-01 08:40:13.932 FINE: [124] org.jitsi.jibri.webhooks.v1.WebhookClient.log() Updating 0 subscribers of status
jibri_1    | 2021-07-01 08:40:13.933 INFO: [124] org.jitsi.jibri.api.xmpp.XmppApi.log() Jibri reports its status is now JibriStatus(busyStatus=IDLE, health=OverallHealth(healthStatus=HEALTHY, details={})), publishing presence to connections
jibri_1    | 2021-07-01 08:40:13.933 FINE: [124] org.jitsi.xmpp.mucclient.MucClientManager.log() Setting a presence extension: org.jitsi.xmpp.extensions.jibri.JibriStatusPacketExt@4aa51919
jibri_1    | 2021-07-01 08:40:13.934 FINE: [124] org.jitsi.xmpp.mucclient.MucClientManager.log() Replacing presence extension: org.jitsi.xmpp.extensions.jibri.JibriStatusPacketExt@6851a5ac
jibri_1    | 2021-07-01 08:40:13.938 FINE: [37] org.jitsi.xmpp.extensions.DefaultPacketExtensionProvider.parse() Could not add a provider for element busy-status from namespace http://jitsi.org/protocol/jibri
jibri_1    | 2021-07-01 08:40:13.938 FINE: [37] org.jitsi.xmpp.extensions.DefaultPacketExtensionProvider.parse() Could not add a provider for element health-status from namespace http://jitsi.org/protocol/health

My guess is that this line:
jibri_1 | 2021-07-01 08:40:13.167 INFO: [112] org.jitsi.jibri.api.xmpp.XmppApi.log() Parsed call url info: CallUrlInfo(baseUrl=<no value>, callName=administrativeauditsinhibitclose, urlParams=[])

causes

org.jitsi.jibri.selenium.pageobjects.HomePage.log() Visiting url <no value>
org.openqa.selenium.InvalidArgumentException: invalid argument

But I’m not sure, because I thought this would all happen inside the internal Docker network :thinking:
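
Since that baseUrl looks like it is filled in from an environment variable, a quick sanity check (the service name jibri here is just my guess; adjust it to your compose file) is to print the environment inside the running Jibri container:

docker-compose exec jibri printenv | grep -iE 'url|xmpp'

If the variable that feeds baseUrl is missing from the output, the template has nothing to render.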

Here is my .env file:

# shellcheck disable=SC203f

# Security

# Set these to strong passwords to avoid intruders from impersonating a service account
# The service(s) won't start unless these are specified
# Running ./gen-passwords.sh will update .env with strong passwords
# You may skip the Jigasi and Jibri passwords if you are not using those
# DO NOT reuse passwords
#

# XMPP component password for Jicofo
JICOFO_COMPONENT_SECRET=***

# XMPP password for Jicofo client connections
JICOFO_AUTH_PASSWORD=***

# XMPP password for JVB client connections
JVB_AUTH_PASSWORD=***

# XMPP password for Jigasi MUC client connections
JIGASI_XMPP_PASSWORD=***

# XMPP recorder password for Jibri client connections
JIBRI_RECORDER_PASSWORD=***

# XMPP password for Jibri client connections
JIBRI_XMPP_PASSWORD=***


#
# Basic configuration options
#

# Directory where all configuration will be stored
CONFIG=~/.jitsi-meet-cfg

# Exposed HTTP port
HTTP_PORT=80

# Exposed HTTPS port
HTTPS_PORT=443

# System time zone
TZ=UTC

# Public URL for the web service (required)
PUBLIC_URL=https://test-video.my.domain

# IP address of the Docker host
# See the "Running behind NAT or on a LAN environment" section in the Handbook:
# https://jitsi.github.io/handbook/docs/devops-guide/devops-guide-docker#running-behind-nat-or-on-a-lan-environment
#DOCKER_HOST_ADDRESS=172.31.36.192

# Control whether the lobby feature should be enabled or not
#ENABLE_LOBBY=1

# Show a prejoin page before entering a conference
#ENABLE_PREJOIN_PAGE=0

# Enable the welcome page
#ENABLE_WELCOME_PAGE=1

# Enable the close page
#ENABLE_CLOSE_PAGE=0

# Disable measuring of audio levels
#DISABLE_AUDIO_LEVELS=0

# Enable noisy mic detection
#ENABLE_NOISY_MIC_DETECTION=1

#
# Let's Encrypt configuration
#

# Enable Let's Encrypt certificate generation
ENABLE_LETSENCRYPT=1

# Domain for which to generate the certificate
LETSENCRYPT_DOMAIN=test-video.my.domain

# E-Mail for receiving important account notifications (mandatory)
LETSENCRYPT_EMAIL=smt@gmail.com

# Use the staging server (for avoiding rate limits while testing)
LETSENCRYPT_USE_STAGING=0


#
# Etherpad integration (for document sharing)
#

# Set etherpad-lite URL in docker local network (uncomment to enable)
ETHERPAD_URL_BASE=https://wbo.ophir.dev/boards/

# Set etherpad-lite public URL (uncomment to enable)
ETHERPAD_PUBLIC_URL=https://wbo.ophir.dev/boards/

# Name your etherpad instance!
#ETHERPAD_TITLE=Video Chat

# The default text of a pad
#ETHERPAD_DEFAULT_PAD_TEXT=Welcome to Web Chat!\n\n

# Name of the skin for etherpad
#ETHERPAD_SKIN_NAME=colibris

# Skin variants for etherpad
#ETHERPAD_SKIN_VARIANTS=super-light-toolbar super-light-editor light-background full-width-editor


#
# Basic Jigasi configuration options (needed for SIP gateway support)
#

# SIP URI for incoming / outgoing calls
#JIGASI_SIP_URI=test@sip2sip.info

# Password for the specified SIP account as a clear text
#JIGASI_SIP_PASSWORD=passw0rd

# SIP server (use the SIP account domain if in doubt)
#JIGASI_SIP_SERVER=sip2sip.info

# SIP server port
#JIGASI_SIP_PORT=5060

# SIP server transport
#JIGASI_SIP_TRANSPORT=UDP

#
# Authentication configuration (see handbook for details)
#

# Enable authentication
ENABLE_AUTH=0

# Enable guest access
#ENABLE_GUESTS=0

# Select authentication type: internal, jwt or ldap
AUTH_TYPE=internal

# JWT authentication
#

# Application identifier
#JWT_APP_ID=my_jitsi_app_id

# Application secret known only to your token
#JWT_APP_SECRET=my_jitsi_app_secret

# (Optional) Set asap_accepted_issuers as a comma separated list
#JWT_ACCEPTED_ISSUERS=my_web_client,my_app_client

# (Optional) Set asap_accepted_audiences as a comma separated list
#JWT_ACCEPTED_AUDIENCES=my_server1,my_server2


# LDAP authentication (for more information see the Cyrus SASL saslauthd.conf man page)
#

# LDAP url for connection
#LDAP_URL=ldaps://ldap.domain.com/

# LDAP base DN. Can be empty
#LDAP_BASE=DC=example,DC=domain,DC=com

# LDAP user DN. Do not specify this parameter for the anonymous bind
#LDAP_BINDDN=CN=binduser,OU=users,DC=example,DC=domain,DC=com

# LDAP user password. Do not specify this parameter for the anonymous bind
#LDAP_BINDPW=LdapUserPassw0rd

# LDAP filter. Tokens example:
# %1-9 - if the input key is user@mail.domain.com, then %1 is com, %2 is domain and %3 is mail
# %s - %s is replaced by the complete service string
# %r - %r is replaced by the complete realm string
#LDAP_FILTER=(sAMAccountName=%u)

# LDAP authentication method
#LDAP_AUTH_METHOD=bind

# LDAP version
#LDAP_VERSION=3

# LDAP TLS using
#LDAP_USE_TLS=1

# List of SSL/TLS ciphers to allow
#LDAP_TLS_CIPHERS=SECURE256:SECURE128:!AES-128-CBC:!ARCFOUR-128:!CAMELLIA-128-CBC:!3DES-CBC:!CAMELLIA-128-CBC

# Require and verify server certificate
#LDAP_TLS_CHECK_PEER=1

# Path to CA cert file. Used when server certificate verify is enabled
#LDAP_TLS_CACERT_FILE=/etc/ssl/certs/ca-certificates.crt

# Path to CA certs directory. Used when server certificate verify is enabled
#LDAP_TLS_CACERT_DIR=/etc/ssl/certs

# Whether to use starttls, implies LDAPv3 and requires ldap:// instead of ldaps://
# LDAP_START_TLS=1


#
# Advanced configuration options (you generally don't need to change these)
#

# Internal XMPP domain
XMPP_DOMAIN=meet.jitsi

# Internal XMPP server
XMPP_SERVER=xmpp.meet.jitsi

# Internal XMPP server URL
XMPP_BOSH_URL_BASE=http://xmpp.meet.jitsi:5280

# Internal XMPP domain for authenticated services
XMPP_AUTH_DOMAIN=auth.meet.jitsi

# XMPP domain for the MUC
XMPP_MUC_DOMAIN=muc.meet.jitsi

# XMPP domain for the internal MUC used for jibri, jigasi and jvb pools
XMPP_INTERNAL_MUC_DOMAIN=internal-muc.meet.jitsi

# XMPP domain for unauthenticated users
XMPP_GUEST_DOMAIN=guest.meet.jitsi

# Comma separated list of domains for cross domain policy or "true" to allow all
# The PUBLIC_URL is always allowed
#XMPP_CROSS_DOMAIN=true

# Custom Prosody modules for XMPP_DOMAIN (comma separated)
XMPP_MODULES=

# Custom Prosody modules for MUC component (comma separated)
XMPP_MUC_MODULES=

# Custom Prosody modules for internal MUC component (comma separated)
XMPP_INTERNAL_MUC_MODULES=

# MUC for the JVB pool
JVB_BREWERY_MUC=jvbbrewery

# XMPP user for JVB client connections
JVB_AUTH_USER=jvb

# STUN servers used to discover the server's public IP
JVB_STUN_SERVERS=meet-jit-si-turnrelay.jitsi.net:443

# Media port for the Jitsi Videobridge
JVB_PORT=10000

# TCP Fallback for Jitsi Videobridge for when UDP isn't available
JVB_TCP_HARVESTER_DISABLED=true
JVB_TCP_PORT=4443
JVB_TCP_MAPPED_PORT=4443

# A comma separated list of APIs to enable when the JVB is started [default: none]
# See https://github.com/jitsi/jitsi-videobridge/blob/master/doc/rest.md for more information
#JVB_ENABLE_APIS=rest,colibri

# XMPP user for Jicofo client connections.
# NOTE: this option doesn't currently work due to a bug
JICOFO_AUTH_USER=focus

# Base URL of Jicofo's reservation REST API
#JICOFO_RESERVATION_REST_BASE_URL=http://reservation.example.com

# Enable Jicofo's health check REST API (http://<jicofo_base_url>:8888/about/health)
#JICOFO_ENABLE_HEALTH_CHECKS=true

# XMPP user for Jigasi MUC client connections
JIGASI_XMPP_USER=jigasi

# MUC name for the Jigasi pool
JIGASI_BREWERY_MUC=jigasibrewery

# Minimum port for media used by Jigasi
JIGASI_PORT_MIN=20000

# Maximum port for media used by Jigasi
JIGASI_PORT_MAX=20050

# Enable SDES srtp
#JIGASI_ENABLE_SDES_SRTP=1

# Keepalive method
#JIGASI_SIP_KEEP_ALIVE_METHOD=OPTIONS

# Health-check extension
#JIGASI_HEALTH_CHECK_SIP_URI=keepalive

# Health-check interval
#JIGASI_HEALTH_CHECK_INTERVAL=300000
#
# Enable Jigasi transcription
#ENABLE_TRANSCRIPTIONS=1

# Jigasi will record audio when transcriber is on [default: false]
#JIGASI_TRANSCRIBER_RECORD_AUDIO=true

# Jigasi will send transcribed text to the chat when transcriber is on [default: false]
#JIGASI_TRANSCRIBER_SEND_TXT=true

# Jigasi will post an url to the chat with transcription file [default: false]
#JIGASI_TRANSCRIBER_ADVERTISE_URL=true

# Credentials for connect to Cloud Google API from Jigasi
# Please read https://cloud.google.com/text-to-speech/docs/quickstart-protocol
# section "Before you begin" paragraph 1 to 5
# Copy the values from the json to the related env vars
#GC_PROJECT_ID=
#GC_PRIVATE_KEY_ID=
#GC_PRIVATE_KEY=
#GC_CLIENT_EMAIL=
#GC_CLIENT_ID=
#GC_CLIENT_CERT_URL=

# Enable recording
ENABLE_RECORDING=1

# XMPP domain for the jibri recorder
XMPP_RECORDER_DOMAIN=recorder.meet.jitsi

# XMPP recorder user for Jibri client connections
JIBRI_RECORDER_USER=recorder

# Directory for recordings inside Jibri container
JIBRI_RECORDING_DIR=/config/recordings

# The finalizing script. Will run after recording is complete
JIBRI_FINALIZE_RECORDING_SCRIPT_PATH=/config/finalize.sh

# XMPP user for Jibri client connections
JIBRI_XMPP_USER=jibri

# MUC name for the Jibri pool
JIBRI_BREWERY_MUC=jibribrewery

# MUC connection timeout
JIBRI_PENDING_TIMEOUT=90

# When jibri gets a request to start a service for a room, the room
# jid will look like: roomName@optional.prefixes.subdomain.xmpp_domain
# We'll build the url for the call by transforming that into:
# https://xmpp_domain/subdomain/roomName
# So if there are any prefixes in the jid (like jitsi meet, which
# has its participants join a muc at conference.xmpp_domain) then
# list that prefix here so it can be stripped out to generate
# the call url correctly
JIBRI_STRIP_DOMAIN_JID=muc
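
# Worked example (illustrative), following the rule above: with
# XMPP_DOMAIN=meet.jitsi and JIBRI_STRIP_DOMAIN_JID=muc, a room jid of
#   administrativeauditsinhibitclose@muc.meet.jitsi
# has the "muc." prefix stripped and becomes the call url
#   https://meet.jitsi/administrativeauditsinhibitclose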

# Directory for logs inside Jibri container
JIBRI_LOGS_DIR=/config/logs

# Disable HTTPS: handle TLS connections outside of this setup
#DISABLE_HTTPS=1

# Redirect HTTP traffic to HTTPS
# Necessary for Let's Encrypt, relies on standard HTTPS port (443)
#ENABLE_HTTP_REDIRECT=1

# Send a `strict-transport-security` header to force browsers to use
# a secure and trusted connection. Recommended for production use.
# Defaults to 1 (send the header).
# ENABLE_HSTS=1

# Enable IPv6
# Provides means to disable IPv6 in environments that don't support it (get with the times, people!)
#ENABLE_IPV6=1

# Container restart policy
# Defaults to unless-stopped
RESTART_POLICY=unless-stopped

# Authenticate using external service or just focus external auth window if there is one already.
# TOKEN_AUTH_URL=https://auth.meet.example.com/{room}

And my Jibri config.json:

    "recording_directory":"{{ .Env.JIBRI_RECORDING_DIR }}",
    // The path to the script which will be run on completed recordings
{{ if .Env.JIBRI_FINALIZE_RECORDING_SCRIPT_PATH -}}
    "finalize_recording_script_path": "{{ .Env.JIBRI_FINALIZE_RECORDING_SCRIPT_PATH }}",
{{ end -}}
    "xmpp_environments": [
        {
            // A friendly name for this environment which can be used
            //  for logging, stats, etc.
            "name": "prod environment",
            // The hosts of the XMPP servers to connect to as part of
            //  this environment
            "xmpp_server_hosts": [
                "{{ .Env.XMPP_SERVER }}"
            ],
            "xmpp_domain": "{{ .Env.XMPP_DOMAIN }}",
            // Jibri will login to the xmpp server as a privileged user
            "control_login": {
                "domain": "{{ .Env.XMPP_AUTH_DOMAIN }}",
                // The credentials for logging in
                "username": "{{ .Env.JIBRI_XMPP_USER }}",
                "password": "{{ .Env.JIBRI_XMPP_PASSWORD }}"
            },
            // Using the control_login information above, Jibri will join
            //  a control muc as a means of announcing its availability
            //  to provide services for a given environment
            "control_muc": {
                "domain": "{{ .Env.XMPP_INTERNAL_MUC_DOMAIN }}",
                "room_name": "{{ .Env.JIBRI_BREWERY_MUC }}",
                // MUST be unique for every instance
                "nickname": "jibri-instanse-{{ .Env.JIBRI_INSTANCE_ID }}"
            },
            // All participants in a call join a muc so they can exchange
            //  information.  Jibri can be instructed to join a special muc
            //  with credentials to give it special abilities (e.g. not being
            //  displayed to other users like a normal participant)
            "call_login": {
                "domain": "{{ .Env.XMPP_RECORDER_DOMAIN }}",
                "username": "{{ .Env.JIBRI_RECORDER_USER }}",
                "password": "{{ .Env.JIBRI_RECORDER_PASSWORD }}"
            },
            // When jibri gets a request to start a service for a room, the room
            //  jid will look like:
            //  roomName@optional.prefixes.subdomain.xmpp_domain
            // We'll build the url for the call by transforming that into:
            //  https://xmpp_domain/subdomain/roomName
            // So if there are any prefixes in the jid (like jitsi meet, which
            //  has its participants join a muc at conference.xmpp_domain) then
            //  list that prefix here so it can be stripped out to generate
            //  the call url correctly
            "room_jid_domain_string_to_strip_from_start": "{{ .Env.JIBRI_STRIP_DOMAIN_JID }}.",
            // The amount of time, in minutes, a service is allowed to continue.
            //  Once a service has been running for this long, it will be
            //  stopped (cleanly).  A value of 0 means an indefinite amount
            //  of time is allowed
            "usage_timeout": "0"
        }
    ]
}

I’m on AWS; I moved the kernel to Linux 4.4.0-210-generic to make snd-aloop work correctly.
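
(In case it helps anyone with a similar setup, the check I use on the Docker host is roughly:

sudo modprobe snd-aloop
lsmod | grep snd_aloop

assuming the module is built for the running kernel; the jibri containers only mount /dev/snd from the host.)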

Can someone give me a helping hand? Thank you!

Hi @1modica. I’m also facing the exact same issue. Any luck resolving it yet?

Hi @Dhawal_Patel, sadly no luck yet :frowning:

@1modica I can’t see baseUrl in your config.json. Are you using the latest jibri.yml file?
If not, add the PUBLIC_URL=https://your_domain.com environment variable in your jibri.yml file.

default config.json: docker-jitsi-meet/config.json at cf904618107a4ead9a8477458ab612ab277ae4b5 · jitsi/docker-jitsi-meet · GitHub
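
From memory (a sketch, not verified against your exact image version): the newer template feeds a base_url entry from PUBLIC_URL, and jibri.yml has to pass that variable into the container, roughly:

# jibri.yml (sketch): PUBLIC_URL must appear in the Jibri service's environment list
environment:
    - PUBLIC_URL

and in config.json:

"base_url": "{{ .Env.PUBLIC_URL }}",

If the variable never reaches the container, the Go template renders it as <no value>, which is exactly the baseUrl=<no value> in your logs.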

I’m using the jitsi/jibri Docker image.
My jibri.yml file looks like this:

version: '3'

services:
    jibri1:
        image: jitsi/jibri
        restart: ${RESTART_POLICY}
        volumes:
            - ${CONFIG}/jibri1:/config:Z
            - /dev/shm:/dev/shm
            - /root/jibri-docker/config/.asoundrc1:/home/jibri/.asoundrc
            - /root/jibri-docker/recordings:/config/recordings
        cap_add:
            - SYS_ADMIN
            - NET_BIND_SERVICE
        devices:
            - /dev/snd:/dev/snd
        environment:
            - XMPP_AUTH_DOMAIN
            - XMPP_INTERNAL_MUC_DOMAIN
            - XMPP_RECORDER_DOMAIN
            - XMPP_SERVER
            - XMPP_DOMAIN
            - JIBRI_XMPP_USER
            - JIBRI_XMPP_PASSWORD
            - JIBRI_BREWERY_MUC
            - JIBRI_RECORDER_USER
            - JIBRI_RECORDER_PASSWORD
            - JIBRI_RECORDING_DIR
            - JIBRI_FINALIZE_RECORDING_SCRIPT_PATH
            - JIBRI_STRIP_DOMAIN_JID
            - JIBRI_LOGS_DIR
            - DISPLAY=:0
            - TZ
        depends_on:
            - jicofo
    jibri2:
        image: jitsi/jibri
        restart: ${RESTART_POLICY}
        volumes:
            - ${CONFIG}/jibri2:/config:Z
            - /dev/shm:/dev/shm
            - /root/jibri-docker/config/.asoundrc2:/home/jibri/.asoundrc
            - /root/jibri-docker/recordings:/config/recordings
        cap_add:
            - SYS_ADMIN
            - NET_BIND_SERVICE
        devices:
            - /dev/snd:/dev/snd
        environment:
            - XMPP_AUTH_DOMAIN
            - XMPP_INTERNAL_MUC_DOMAIN
            - XMPP_RECORDER_DOMAIN
            - XMPP_SERVER
            - XMPP_DOMAIN
            - JIBRI_XMPP_USER
            - JIBRI_XMPP_PASSWORD
            - JIBRI_BREWERY_MUC
            - JIBRI_RECORDER_USER
            - JIBRI_RECORDER_PASSWORD
            - JIBRI_RECORDING_DIR
            - JIBRI_FINALIZE_RECORDING_SCRIPT_PATH
            - JIBRI_STRIP_DOMAIN_JID
            - JIBRI_LOGS_DIR
            - DISPLAY=:0
            - TZ
    jibri3:
        image: jitsi/jibri
        restart: ${RESTART_POLICY}
        volumes:
            - ${CONFIG}/jibri3:/config:Z
            - /dev/shm:/dev/shm
            - /root/jibri-docker/config/.asoundrc3:/home/jibri/.asoundrc
            - /root/jibri-docker/recordings:/config/recordings
        cap_add:
            - SYS_ADMIN
            - NET_BIND_SERVICE
        devices:
            - /dev/snd:/dev/snd
        environment:
            - XMPP_AUTH_DOMAIN
            - XMPP_INTERNAL_MUC_DOMAIN
            - XMPP_RECORDER_DOMAIN
            - XMPP_SERVER
            - XMPP_DOMAIN
            - JIBRI_XMPP_USER
            - JIBRI_XMPP_PASSWORD
            - JIBRI_BREWERY_MUC
            - JIBRI_RECORDER_USER
            - JIBRI_RECORDER_PASSWORD
            - JIBRI_RECORDING_DIR
            - JIBRI_FINALIZE_RECORDING_SCRIPT_PATH
            - JIBRI_STRIP_DOMAIN_JID
            - JIBRI_LOGS_DIR
            - DISPLAY=:0
            - TZ
    jibri4:
        image: jitsi/jibri
        restart: ${RESTART_POLICY}
        volumes:
            - ${CONFIG}/jibri4:/config:Z
            - /dev/shm:/dev/shm
            - /root/jibri-docker/config/.asoundrc4:/home/jibri/.asoundrc
            - /root/jibri-docker/recordings:/config/recordings
        cap_add:
            - SYS_ADMIN
            - NET_BIND_SERVICE
        devices:
            - /dev/snd:/dev/snd
        environment:
            - XMPP_AUTH_DOMAIN
            - XMPP_INTERNAL_MUC_DOMAIN
            - XMPP_RECORDER_DOMAIN
            - XMPP_SERVER
            - XMPP_DOMAIN
            - JIBRI_XMPP_USER
            - JIBRI_XMPP_PASSWORD
            - JIBRI_BREWERY_MUC
            - JIBRI_RECORDER_USER
            - JIBRI_RECORDER_PASSWORD
            - JIBRI_RECORDING_DIR
            - JIBRI_FINALIZE_RECORDING_SCRIPT_PATH
            - JIBRI_STRIP_DOMAIN_JID
            - JIBRI_LOGS_DIR
            - DISPLAY=:0
            - TZ
    jibri5:
        image: jitsi/jibri
        restart: ${RESTART_POLICY}
        volumes:
            - ${CONFIG}/jibri5:/config:Z
            - /dev/shm:/dev/shm
            - /root/jibri-docker/config/.asoundrc5:/home/jibri/.asoundrc
            - /root/jibri-docker/recordings:/config/recordings
        cap_add:
            - SYS_ADMIN
            - NET_BIND_SERVICE
        devices:
            - /dev/snd:/dev/snd
        environment:
            - XMPP_AUTH_DOMAIN
            - XMPP_INTERNAL_MUC_DOMAIN
            - XMPP_RECORDER_DOMAIN
            - XMPP_SERVER
            - XMPP_DOMAIN
            - JIBRI_XMPP_USER
            - JIBRI_XMPP_PASSWORD
            - JIBRI_BREWERY_MUC
            - JIBRI_RECORDER_USER
            - JIBRI_RECORDER_PASSWORD
            - JIBRI_RECORDING_DIR
            - JIBRI_FINALIZE_RECORDING_SCRIPT_PATH
            - JIBRI_STRIP_DOMAIN_JID
            - JIBRI_LOGS_DIR
            - DISPLAY=:0
            - TZ
        depends_on:
            - jicofo
    jibri6:
        image: jitsi/jibri
        restart: ${RESTART_POLICY}
        volumes:
            - ${CONFIG}/jibri6:/config:Z
            - /dev/shm:/dev/shm
            - /root/jibri-docker/config/.asoundrc6:/home/jibri/.asoundrc
            - /root/jibri-docker/recordings:/config/recordings
        cap_add:
            - SYS_ADMIN
            - NET_BIND_SERVICE
        devices:
            - /dev/snd:/dev/snd
        environment:
            - XMPP_AUTH_DOMAIN
            - XMPP_INTERNAL_MUC_DOMAIN
            - XMPP_RECORDER_DOMAIN
            - XMPP_SERVER
            - XMPP_DOMAIN
            - JIBRI_XMPP_USER
            - JIBRI_XMPP_PASSWORD
            - JIBRI_BREWERY_MUC
            - JIBRI_RECORDER_USER
            - JIBRI_RECORDER_PASSWORD
            - JIBRI_RECORDING_DIR
            - JIBRI_FINALIZE_RECORDING_SCRIPT_PATH
            - JIBRI_STRIP_DOMAIN_JID
            - JIBRI_LOGS_DIR
            - DISPLAY=:0
            - TZ

And the .env file looks like this:

# JIBRI CONFIG
# Internal XMPP domain for authenticated services
PUBLIC_URL="https://meet.mydomain.one"
XMPP_AUTH_DOMAIN=auth.meet.mydomain.one

# XMPP domain for the internal MUC used for jibri, jigasi and jvb pools
XMPP_INTERNAL_MUC_DOMAIN=internal.auth.meet.mydomain.one

# XMPP domain for the jibri recorder
XMPP_RECORDER_DOMAIN=recorder.meet.mydomain.one

# Internal XMPP server
XMPP_SERVER=meet.mydomain.one

# Internal XMPP domain
XMPP_DOMAIN=meet.mydomain.one

# XMPP user for Jibri client connections
JIBRI_XMPP_USER=jibri

# XMPP password for Jibri client connections
JIBRI_XMPP_PASSWORD=Porsche911

# MUC name for the Jibri pool
JIBRI_BREWERY_MUC=jibribrewery

# XMPP recorder user for Jibri client connections
JIBRI_RECORDER_USER=recorder

# XMPP recorder password for Jibri client connections
JIBRI_RECORDER_PASSWORD=Porsche911

# Directory for recordings inside Jibri container
JIBRI_RECORDING_DIR=/home/ubuntu/config/recordings

# The finalizing script. Will run after recording is complete
JIBRI_FINALIZE_RECORDING_SCRIPT_PATH=/home/ubuntu/config/finalize.sh

 
# When jibri gets a request to start a service for a room, the room
# jid will look like: roomName@optional.prefixes.subdomain.xmpp_domain
# We'll build the url for the call by transforming that into:
# https://xmpp_domain/subdomain/roomName
# So if there are any prefixes in the jid (like jitsi meet, which
# has its participants join a muc at conference.xmpp_domain) then
# list that prefix here so it can be stripped out to generate
# the call url correctly

JIBRI_STRIP_DOMAIN_JID=conference

# Directory for logs inside Jibri container
JIBRI_LOGS_DIR=/config/logs

#PUBLIC_URL="https://meet.mydomain.one"

DISPLAY=:0=

Update your jibri.yml file.
See here: docker-jitsi-meet/jibri.yml at cf904618107a4ead9a8477458ab612ab277ae4b5 · jitsi/docker-jitsi-meet · GitHub
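
After updating it, recreate the containers so the new environment variables actually reach Jibri. With the standard docker-jitsi-meet layout that is something like:

docker-compose -f docker-compose.yml -f jibri.yml up -d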

I updated the jibri.yml file and am now running into these logs from the Jibri Docker images:

2021-07-29 15:18:32.397 INFO: [1] org.jitsi.jibri.Main.log() Jibri run with args [--config, /etc/jitsi/jibri/config.json]
2021-07-29 15:18:32.471 INFO: [1] org.jitsi.jibri.Main.log() Checking legacy config file /etc/jitsi/jibri/config.json
Exception in thread "main" com.fasterxml.jackson.databind.JsonMappingException: Unexpected character ('h' (code 104)): was expecting comma to separate Object entries
 at [Source: (File); line: 17, column: 27]
 at [Source: (File); line: 17, column: 24] (through reference chain: org.jitsi.jibri.config.JibriConfig["xmpp_environments"]->java.util.ArrayList[0])
        at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:391)
        at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:363)
        at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:302)
        at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:245)
        at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:27)
        at com.fasterxml.jackson.databind.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:529)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeWithErrorWrapping(BeanDeserializer.java:528)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:417)
        at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1280)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:326)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:159)
        at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4001)
        at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2909)
        at org.jitsi.jibri.config.JibriConfigKt.loadConfigFromFile(JibriConfig.kt:189)
        at org.jitsi.jibri.MainKt.setupLegacyConfig(Main.kt:217)
        at org.jitsi.jibri.MainKt.handleCommandLineArgs(Main.kt:202)
        at org.jitsi.jibri.MainKt.main(Main.kt:53)
Caused by: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('h' (code 104)): was expecting comma to separate Object entries
 at [Source: (File); line: 17, column: 27]
        at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1804)
        at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:663)
        at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:561)
        at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.nextToken(UTF8StreamJsonParser.java:707)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:401)
        at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1280)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:326)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:159)
        at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:286)
        ... 14 more

I checked the /etc/jitsi/jibri/ directory for the config.json file and did not find it. I found a jibri.conf file instead, containing only jibri { }.
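
One guess about the parse error itself (unverified): my .env wraps PUBLIC_URL in quotes, and if docker-compose passes those quotes through literally, a template line like

"base_url": "{{ .Env.PUBLIC_URL }}",

would render as invalid JSON:

"base_url": ""https://meet.mydomain.one"",

The parser would then hit the h right after the empty string, which matches the "was expecting comma" message. If that’s it, dropping the surrounding quotes from PUBLIC_URL in .env is worth trying.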

@metadata, updating the jibri.yml file as you suggested solved my problem! Thank you very much!

As I am running Docker on my local system, I don’t understand what should be configured in my PUBLIC_URL field or how exactly I can expose my Docker setup.

Can someone please guide me? I am new to this concept.