[jitsi-dev] [libjitsi] MediaStream.getRemoteDataAddress() returns null after MediaStream.start() (#18)


#1

First posted to the jitsi-dev list: http://lists.jitsi.org/pipermail/dev/2014-September/022078.html

What would cause this to be null after a successful start?

Here are the steps leading up to the method call:

   - get MediaService instance
   - get MediaFormatFactory via MediaService instance
   - get Audio and Video device instances
   - create MediaFormat for audio and video using MediaFormatFactory
   - create SrtpControl via MediaService
   - create MediaStream for audio and video using MediaService with Device
   and SrtpControl parameters
   - add dynamic payload type to MediaStream instances
   - set direction to SENDONLY on MediaStream instances
   - set format on MediaStream instances
   - create a DatagramSocket for both rtp and rtcp
   - create InetSocketAddress for both ports on my public IP
   - create a StreamConnector for both audio and video
   - set StreamConnector on each MediaStream instance using DatagramSocket
   instances
   - create a single MediaStreamTarget using the InetSocketAddress
   - set MediaStreamTarget on both MediaStream instances
   - set names on MediaStream instances ("audio" / "video")
   - start SrtpControl with MediaType.AUDIO
   - start the audio MediaStream instance
   - start the video MediaStream instance
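For the non-ICE path, the socket and address plumbing in the steps above comes down to plain java.net calls. A minimal standalone sketch (ephemeral ports and 127.0.0.1 are placeholders; the libjitsi calls themselves are omitted):

```java
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.InetSocketAddress;

public class SocketPlumbing {
    public static void main(String[] args) throws Exception {
        // bind one DatagramSocket each for RTP and RTCP; port 0 picks free ephemeral ports
        DatagramSocket rtpSocket = new DatagramSocket(0);
        DatagramSocket rtcpSocket = new DatagramSocket(0);
        // build the socket addresses a MediaStreamTarget would be created from
        // ("127.0.0.1" is a placeholder; the report uses the public IP)
        InetAddress ip = InetAddress.getByName("127.0.0.1");
        InetSocketAddress rtpAddr = new InetSocketAddress(ip, rtpSocket.getLocalPort());
        InetSocketAddress rtcpAddr = new InetSocketAddress(ip, rtcpSocket.getLocalPort());
        System.out.println("rtpBound=" + rtpSocket.isBound()
                + " sameHost=" + rtpAddr.getAddress().equals(rtcpAddr.getAddress()));
        rtpSocket.close();
        rtcpSocket.close();
    }
}
```

The resulting DatagramSocket pair would go into the StreamConnector, and the InetSocketAddress pair into the MediaStreamTarget.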

All of these actions complete without an exception and the following is
output into my log:

2014-09-17 16:46:53,176 [pool-9-thread-2] TRACE MyStream - MediaFormats
audio: rtpmap:-1 opus/48000/2
video: rtpmap:-1 VP8/90000
2014-09-17 16:46:53,512 [pool-9-thread-2] TRACE MyStream - Using ports -
rtp/rtcp: [5000, 5001]
2014-09-17 16:46:53,610 [pool-9-thread-2] TRACE MyStream - Using IP:
xx.xx.xx.xx
2014-09-17 16:46:53,625 [pool-9-thread-2] TRACE MyStream - Addresses - rtp:
/xx.xx.xx.xx:5000 rtcp: /xx.xx.xx.xx:5001
2014-09-17 16:46:53,626 [pool-9-thread-2] TRACE MyStream - Stream target:
MediaStreamTarget with dataAddress /xx.xx.xx.xx:5000 and controlAddress
/xx.xx.xx.xx:5001
2014-09-17 16:46:53,639 [pool-9-thread-2] TRACE MyStream - Starting audio
stream
2014-09-17 16:46:54,241 [pool-9-thread-2] INFO
o.j.impl.neomedia.MediaStreamImpl - audio codec/freq: opus/48000 Hz
2014-09-17 16:46:54,242 [pool-9-thread-2] INFO
o.j.impl.neomedia.MediaStreamImpl - audio remote IP/port: xx.xx.xx.xx/5000
2014-09-17 16:46:54,242 [pool-9-thread-2] TRACE MyStream - Starting video
stream
2014-09-17 16:46:56,786 [RTP Forwarder:
org.jitsi.impl.neomedia.MediaStreamImpl$2@23d6dee] ERROR
net.sf.fmj.media.Log - No format has been registered for RTP payload type
111
2014-09-17 16:46:56,796 [pool-9-thread-2] INFO
o.j.i.n.d.VideoMediaDeviceSession - video send resolution: 320x240
2014-09-17 16:46:56,797 [pool-9-thread-2] INFO
o.j.i.n.d.VideoMediaDeviceSession - video send FPS: default(no restriction)
2014-09-17 16:46:56,945 [pool-9-thread-2] INFO
o.j.impl.neomedia.MediaStreamImpl - video codec/freq: VP8/90000 Hz
2014-09-17 16:46:56,945 [pool-9-thread-2] INFO
o.j.impl.neomedia.MediaStreamImpl - video remote IP/port: xx.xx.xx.xx/5000
2014-09-17 16:46:56,986 [Loop thread:
net.sf.fmj.media.parser.RawPullBufferParser$FrameTrack@686ab109] INFO
o.j.i.n.codec.video.vp8.VPXEncoder - Setting new width/height: 320/240
2014-09-17 16:46:57,015 [RTP Forwarder:
org.jitsi.impl.neomedia.MediaStreamImpl$2@2c5993e3] ERROR
net.sf.fmj.media.Log - No format has been registered for RTP payload type
100

After all this completes, I call
InetSocketAddress audioAddr = audioMediaStream.getRemoteDataAddress();
and it always returns null; same for my video MediaStream.

If a sample class is required, I can create one.


---
Reply to this email directly or view it on GitHub:
https://github.com/jitsi/libjitsi/issues/18


#2

Here's a sample class for testing this issue:

<pre>

import java.io.IOException;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

import org.ice4j.Transport;
import org.ice4j.TransportAddress;
import org.ice4j.ice.Agent;
import org.ice4j.ice.CandidatePair;
import org.ice4j.ice.Component;
import org.ice4j.ice.IceMediaStream;
import org.ice4j.ice.IceProcessingState;
import org.ice4j.ice.harvest.StunCandidateHarvester;
import org.ice4j.ice.harvest.TurnCandidateHarvester;
import org.jitsi.service.libjitsi.LibJitsi;
import org.jitsi.service.neomedia.DefaultStreamConnector;
import org.jitsi.service.neomedia.DtlsControl;
import org.jitsi.service.neomedia.MediaDirection;
import org.jitsi.service.neomedia.MediaService;
import org.jitsi.service.neomedia.MediaStream;
import org.jitsi.service.neomedia.MediaStreamTarget;
import org.jitsi.service.neomedia.MediaType;
import org.jitsi.service.neomedia.SrtpControl;
import org.jitsi.service.neomedia.SrtpControlType;
import org.jitsi.service.neomedia.StreamConnector;
import org.jitsi.service.neomedia.codec.Constants;
import org.jitsi.service.neomedia.device.MediaDevice;
import org.jitsi.service.neomedia.format.MediaFormat;
import org.jitsi.service.neomedia.format.MediaFormatFactory;
import org.red5.logging.Red5LoggerFactory;
import org.slf4j.Logger;

public class MyBundledStream {

  private Logger log = Red5LoggerFactory.getLogger(MyBundledStream.class, "server");
    
  private SrtpControl srtpControl;
  
  private String sourceStreamName;
  
  private MediaStream audioMediaStream;
  
  private MediaStream videoMediaStream;
  
  private boolean useDTLS;
  
  private boolean useICE;
  
  private TransportAddress[] stunAddresses;

  private TransportAddress[] turnAddresses;

  // ex. "sha-256"
  private String remoteHashFunction;
  
  // ex. "5C:C6:19:38:4D:54:57:71:16:3F:67:A6:C8:21:CC:29:88:85:22:86:53:E5:7B:3F:3D:A4:5C:E5:BC:29:D8:B5"
  private String remoteFingerprint;
  
  /**
   * Initialize the stream and all the network endpoints.
   */
  public void init() {
    log.trace("init");
    try {
      // get the media service
      MediaService mediaService = LibJitsi.getMediaService();
      // create our a/v MediaDevice instances
      MediaDevice audioDevice = null; // TODO: Set your audio device here
      MediaDevice videoDevice = null; // TODO: Set your video device here
      // get the format factory
      MediaFormatFactory factory = mediaService.getFormatFactory();
      // create the MediaFormat for each stream
      MediaFormat audioFormat = factory.createMediaFormat(Constants.OPUS_RTP);
      MediaFormat videoFormat = factory.createMediaFormat(Constants.VP8_RTP);
      /*
      Create two MediaStreams, one with the video device and the other with the audio device.
      Pass a null StreamConnector since it's set later; the SrtpControl, however, can't be set later.
      The SrtpControl will be shared by both streams, so it has to be created before them.
      */
      srtpControl = mediaService.createSrtpControl(SrtpControlType.DTLS_SRTP);
      // create the media streams
      audioMediaStream = mediaService.createMediaStream(null, audioDevice, srtpControl);
      audioMediaStream.addDynamicRTPPayloadType((byte) 111, audioFormat);
      videoMediaStream = mediaService.createMediaStream(null, videoDevice, srtpControl);
      videoMediaStream.addDynamicRTPPayloadType((byte) 100, videoFormat);
      // even streams set to SENDONLY can receive data, but in that case, playback is not possible
      audioMediaStream.setDirection(MediaDirection.SENDONLY);
      videoMediaStream.setDirection(MediaDirection.SENDONLY);
      // set the format of the rtp stream
      audioMediaStream.setFormat(audioFormat);
      videoMediaStream.setFormat(videoFormat);
      // DTLS setup
      if (useDTLS) {
          // starting from here, I set up the DTLS for the stream, if DTLS does not interest you, you can skip this part
          DtlsControl dtlsControl = (DtlsControl) srtpControl;
          // see rfc 4145 to understand setup
          dtlsControl.setSetup(DtlsControl.Setup.PASSIVE);
          // normally you need to set the fingerprint and hash function of the remote target (and also send yours to the remote target);
          // libjitsi doesn't strictly require this, so you can skip it, but it's much cleaner to do it

          // assume here that we have already received the remote fingerprint / hash function
          Map<String, String> dtlsMap = new HashMap<String, String>();
          // sendFingerprint takes care of sending your fingerprint and hash function to the remote target (a placeholder for your signaling code)
          sendFingerprint(dtlsControl.getLocalFingerprint(), dtlsControl.getLocalFingerprintHashFunction());
          // the remote target also uses just one connection for RTP, so you get one fingerprint for both audio and video
          dtlsMap.put(remoteHashFunction, remoteFingerprint);
          // the DtlsControl needs a Map of hash functions to the fingerprints presented by the remote endpoint via the signaling path
          dtlsControl.setRemoteFingerprints(dtlsMap);
          // end of the DTLS set up
        }
      DatagramSocket rtpSocket;
      DatagramSocket rtcpSocket;
      // ICE setup
      if (useICE) {
        // I create an Agent containing only one IceMediaStream that will handle the audio/video stream
        Set<String> nameSet = new HashSet<String>();
        nameSet.add(sourceStreamName);
        Agent agent = generateIceMediaStream(nameSet, stunAddresses, turnAddresses);
        // YOU NEED TO SEND YOUR ICE CREDENTIALS BEFORE STARTING ICE.
        // How you do it depends on your signaling protocol,
        // but it has to happen here, before the next instruction.

        // Start the ICE process
        agent.startConnectivityEstablishment();
        // Running the ICE process doesn't block the thread, so you can do whatever you want until it terminates,
        // but you mustn't use the sockets the Agent created before ICE terminates.
        // Here I simply sleep until ICE terminates.
        while (IceProcessingState.TERMINATED != agent.getState()) {
          System.out.println("Connectivity establishment in process");
          try {
            Thread.sleep(1500);
          } catch (Exception e) {
            log.warn("Exception in ice process", e);
          }
        }
        // END OF ICE SETUP
        IceMediaStream iceMediaStream = agent.getStream(sourceStreamName);
        // get candidates
        CandidatePair rtpPair = iceMediaStream.getComponent(Component.RTP).getSelectedPair();
        CandidatePair rtcpPair = iceMediaStream.getComponent(Component.RTCP).getSelectedPair();
        log.trace("Using IP: {}", rtpPair.getRemoteCandidate().getTransportAddress());
        log.trace("Using ports - rtp: {} rtcp: {}", rtpPair.getLocalCandidate().getDatagramSocket().getPort(), rtcpPair.getLocalCandidate().getDatagramSocket().getPort());
        // get the sockets
        rtpSocket = rtpPair.getLocalCandidate().getDatagramSocket();
        rtcpSocket = rtcpPair.getLocalCandidate().getDatagramSocket();
        // use the same DatagramSocket for both streams
        StreamConnector videoConnector = new DefaultStreamConnector(rtpSocket, rtcpSocket);
        StreamConnector audioConnector = new DefaultStreamConnector(rtpSocket, rtcpSocket);
        // you shouldn't need two StreamConnectors; creating a single one from rtpSocket/rtcpSocket and setting it on both streams should suffice
        videoMediaStream.setConnector(videoConnector);
        audioMediaStream.setConnector(audioConnector);
        // set mediastream target
        MediaStreamTarget mediaStreamTarget = new MediaStreamTarget(rtpPair.getRemoteCandidate().getTransportAddress(), rtcpPair.getRemoteCandidate().getTransportAddress());
        // use the same MediaStreamTarget for both streams pointing at the same endpoint to bundle them;
        // you likely only need a single MediaStreamTarget set on both streams
        videoMediaStream.setTarget(mediaStreamTarget);
        audioMediaStream.setTarget(mediaStreamTarget);
      } else {
        // get a port pair
        int[] ports = new int[] { 5000, 5001 };
        log.trace("Using ports - rtp/rtcp: {}", ports);
        // set up the sockets
        rtpSocket = new DatagramSocket(ports[0]);
        rtcpSocket = new DatagramSocket(ports[1]);
        // get public ip
        String publicAddress = "192.168.1.1"; // TODO: Set your public address here
        log.trace("Using IP: {}", publicAddress);
        InetSocketAddress rtpaddr = new InetSocketAddress(InetAddress.getByName(publicAddress), ports[0]);
        InetSocketAddress rtcpaddr = new InetSocketAddress(InetAddress.getByName(publicAddress), ports[1]);
        // use the same DatagramSocket for both streams
        StreamConnector videoConnector = new DefaultStreamConnector(rtpSocket, rtcpSocket);
        StreamConnector audioConnector = new DefaultStreamConnector(rtpSocket, rtcpSocket);
        // you shouldn't need two StreamConnectors; creating a single one from rtpSocket/rtcpSocket and setting it on both streams should suffice
        videoMediaStream.setConnector(videoConnector);
        audioMediaStream.setConnector(audioConnector);
        log.trace("Addresses - rtp: {} rtcp: {}", rtpaddr, rtcpaddr);
        // set mediastream target
        MediaStreamTarget mediaStreamTarget = new MediaStreamTarget(rtpaddr, rtcpaddr);
        log.trace("Stream target: {}", mediaStreamTarget);
        // use the same MediaStreamTarget for each stream with the same endpoint to bundle the stream
        videoMediaStream.setTarget(mediaStreamTarget);
        audioMediaStream.setTarget(mediaStreamTarget);
      }
      // set the "name"/type of the stream
      audioMediaStream.setName(MediaType.AUDIO.toString());
      videoMediaStream.setName(MediaType.VIDEO.toString());
      // starting the SrtpControl requires a MediaType, but in this prototype the SrtpControl handles multiple MediaTypes;
      // libjitsi doesn't seem to use the MediaType in the SrtpControl, so we just pass AUDIO
      srtpControl.start(MediaType.AUDIO);
      // start the streams; the target should quickly receive the first RTP packets
      log.trace("Starting audio stream");
      audioMediaStream.start();
      log.trace("Starting video stream");
      videoMediaStream.start();
    } catch (Throwable e) {
      log.warn("Exception in stream init", e);
    }
    log.trace("init - exit");
  }

  public void testForNullRemoteAddress() {
    InetSocketAddress audioAddr = audioMediaStream.getRemoteDataAddress();
    if (audioAddr == null) {
      log.warn("Remote address is null, test failed");
    }
  }

  // placeholder: sends our fingerprint / hash function to the remote target via signaling
  private void sendFingerprint(String fingerprint, String hashFunction) {
    // signaling-specific; intentionally a no-op in this sample
  }

  private Agent generateIceMediaStream(Set<String> mediaNameSet, TransportAddress[] stunAddresses, TransportAddress[] turnAddresses) throws IOException {
    Agent agent = new Agent();
    agent.setControlling(false);
    IceMediaStream stream = null;
    if (stunAddresses != null) {
      for (TransportAddress stunAddress : stunAddresses) {
        agent.addCandidateHarvester(new StunCandidateHarvester(stunAddress));
      }
    }
    if (turnAddresses != null) {
      for (TransportAddress turnAddress : turnAddresses) {
        agent.addCandidateHarvester(new TurnCandidateHarvester(turnAddress));
      }
    }
    for (String name : mediaNameSet) {
      stream = agent.createMediaStream(name);
      // PortManager is the application's own port allocator, not part of libjitsi or ice4j
      int[] ports = PortManager.getRtcPortPair();
      agent.createComponent(stream, Transport.UDP, ports[0], ports[0], ports[0] + 50);
      agent.createComponent(stream, Transport.UDP, ports[1], ports[1], ports[1] + 50);
    }
    return agent;
  }

}
</pre>
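As a side note on the fingerprint string format used above (upper-case hex bytes separated by colons): a DTLS fingerprint is just a digest of the certificate's DER encoding rendered that way. A small standalone sketch of the formatting step, hashing stand-in bytes rather than a real certificate:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class FingerprintFormat {
    // format digest bytes as upper-case hex pairs separated by colons,
    // matching the sha-256 fingerprint string shown in the sample above
    static String format(byte[] digest) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < digest.length; i++) {
            if (i > 0) sb.append(':');
            sb.append(String.format("%02X", digest[i]));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // stand-in input: a real DTLS fingerprint hashes the certificate's DER bytes
        byte[] digest = MessageDigest.getInstance("SHA-256")
                .digest("example".getBytes(StandardCharsets.UTF_8));
        String fp = format(digest);
        System.out.println(fp.length());                                  // 32 bytes -> 95 chars
        System.out.println(fp.matches("([0-9A-F]{2}:){31}[0-9A-F]{2}"));  // true
    }
}
```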


---
Reply to this email directly or view it on GitHub:
https://github.com/jitsi/libjitsi/issues/18#issuecomment-56044974


#3

To use the class, set your audio device, video device, and public IP (unless you choose to use ICE). Also, if the logger is a problem, just replace it with something else.


---
Reply to this email directly or view it on GitHub:
https://github.com/jitsi/libjitsi/issues/18#issuecomment-56045187


#4

It would be awesome to get some feedback on this; I used a workaround to get the IP and port information via the Target. In the example code above, would binding to the outgoing port be disabled by leaving useDTLS unset, and also by using DtlsControl.Setup.PASSIVE instead of DtlsControl.Setup.ACTIVE or DtlsControl.Setup.ACTPASS?
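For anyone hitting the same null result: the workaround amounts to reading the addresses back from the target you set yourself rather than from getRemoteDataAddress(). Since the MediaStreamTarget is constructed from InetSocketAddress values you build, the simplest form is to just retain them. A sketch with a hypothetical holder class (not libjitsi API):

```java
import java.net.InetSocketAddress;

public class TargetAddresses {
    // hypothetical holder mirroring MediaStreamTarget's data/control address pair
    final InetSocketAddress dataAddress;
    final InetSocketAddress controlAddress;

    TargetAddresses(InetSocketAddress data, InetSocketAddress control) {
        this.dataAddress = data;
        this.controlAddress = control;
    }

    public static void main(String[] args) {
        // the same values that would be passed to new MediaStreamTarget(rtpAddr, rtcpAddr);
        // 192.0.2.1 is a documentation-range placeholder IP
        TargetAddresses t = new TargetAddresses(
                new InetSocketAddress("192.0.2.1", 5000),
                new InetSocketAddress("192.0.2.1", 5001));
        System.out.println(t.dataAddress.getPort() + "/" + t.controlAddress.getPort());
    }
}
```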


---
Reply to this email directly or view it on GitHub:
https://github.com/jitsi/libjitsi/issues/18#issuecomment-56804597


#5

Closed #18.


---
Reply to this email directly or view it on GitHub:
https://github.com/jitsi/libjitsi/issues/18#event-361453599


#6

We have had a look at the issue and didn't find anything suspicious. Please reopen if you still experience the issue, or seek help on the mailing list.


---
Reply to this email directly or view it on GitHub:
https://github.com/jitsi/libjitsi/issues/18#issuecomment-123419459