
How do I fix this WebRTC Java server problem?


Ah. Never mind. Sorry for the silly question.

The missing piece was the exchange of ICE candidates between the browser and the Java server. Now that I have added code to do the ICE negotiation over the WebSocket as well, everything works!
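For reference, here is a minimal sketch of the Java-to-browser half of that ICE exchange, assuming the same Netty WebSocket channel and org.json JSONObject that the server code further down already uses. PeerConnectionObserverImpl is the class from my server code; the "type"/"candidate" message layout is just one possible convention, not anything WebRTC prescribes. The browser needs the mirror image: forward candidates from the peer connection's onicecandidate callback over the socket, and hand candidates received from Java to addIceCandidate. The receiving half on the Java side is sketched after the Java server code below.

// Inside PeerConnectionObserverImpl (the other PeerConnection.Observer
// callbacks are omitted). Every locally gathered ICE candidate is serialized
// to JSON and pushed to the browser over the signalling WebSocket. This
// assumes the observer can reach the Netty ChannelHandlerContext used for
// signalling; the "type"/"candidate" field names are this sketch's own convention.
@Override
public void onIceCandidate(IceCandidate candidate) {
   JSONObject json = new JSONObject();
   json.put("type", "candidate");
   json.put("sdpMid", candidate.sdpMid);
   json.put("sdpMLineIndex", candidate.sdpMLineIndex);
   json.put("candidate", candidate.sdp);
   webSocketContext.channel().writeAndFlush(
         new TextWebSocketFrame(json.toString()));
}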

Original question

I think I am very close to getting a Java server application to talk to a browser page via WebRTC, but I just cannot get it to work. I have the feeling I am missing some small detail, so I am hoping someone here can offer a suggestion.

I have studied the WebRTC examples closely: the Java unit test (org.webrtc.PeerConnectionTest) and the example Android app (trunk/talk/examples/android). Based on what I learned there, I put together a Java application that uses WebSockets for signalling and tries to stream video to Chrome.

The problem is that even though all my code (JavaScript and Java) runs in the order I expect and hits all the right logging statements, no video shows up in the browser. The native libjingle code produces a few suspicious lines in the console log, but I am not sure what to make of them; they are marked with ">>" in the log below.

If you look at the logs below you can see that the WebSocket communication with the browser works, and that the SDP offer is successfully sent to the browser and the SDP answer comes back from it. Setting the local and remote descriptions on the PeerConnections also appears to work. The HTML5 video element gets its source set to a blob URL, as it should. So what could I be missing? Do I need to do anything with ICE candidates, even though my client and server are currently running on the same machine?

Any suggestions would be greatly appreciated!

SDP messages (from Chrome's JavaScript console)

1.134: Java Offer: 
v=0
o=- 5893945934600346864 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE video
a=msid-semantic: WMS JavaMediaStream
m=video 1 RTP/SAVPF 100 116 117
c=IN IP4 0.0.0.0
a=rtcp:1 IN IP4 0.0.0.0
a=ice-ufrag:dJxTlMlXy7uASrDU
a=ice-pwd:r8BRkXVnc4dqCABUDhuRjpP7
a=ice-options:google-ice
a=mid:video
a=extmap:2 urn:ietf:params:rtp-hdrext:toffset
a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=sendrecv
a=rtcp-mux
a=crypto:0 AES_CM_128_HMAC_SHA1_80 inline:yq6wOHhk/QfsWuh+1oOEqfB4GjKZzz8XfQnGCDP3
a=rtpmap:100 VP8/90000
a=rtcp-fb:100 ccm fir
a=rtcp-fb:100 nack
a=rtcp-fb:100 goog-remb
a=rtpmap:116 red/90000
a=rtpmap:117 ulpfec/90000
a=ssrc:3720473526 cname:nul6R21KmwAms3Ge
a=ssrc:3720473526 msid:JavaMediaStream JavaMediaStream_v0
a=ssrc:3720473526 mslabel:JavaMediaStream
a=ssrc:3720473526 label:JavaMediaStream_v0


1.149: Received remote stream


1.150: Browsers Answer: 
v=0
o=- 4261396844048664099 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE video
a=msid-semantic: WMS
m=video 1 RTP/SAVPF 100 116 117
c=IN IP4 0.0.0.0
a=rtcp:1 IN IP4 0.0.0.0
a=ice-ufrag:quzQNsX+ZlUWUQqV
a=ice-pwd:y5A0+7sM8P88AatBLd1fdd5G
a=mid:video
a=extmap:2 urn:ietf:params:rtp-hdrext:toffset
a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=recvonly
a=rtcp-mux
a=crypto:0 AES_CM_128_HMAC_SHA1_80 inline:WClNA69OfpjdJy3Bv4ujejk/IYnn4DW8kjrB18xP
a=rtpmap:100 VP8/90000
a=rtcp-fb:100 ccm fir
a=rtcp-fb:100 nack
a=rtcp-fb:100 goog-remb
a=rtpmap:116 red/90000
a=rtpmap:117 ulpfec/90000

This looks fine to me; the Java offer includes my video stream.

Native code log (libjingle)

(suspicious lines are marked with ">>")

Camera '/dev/video0' started with format YUY2 640x480x30, elapsed time 59 ms
Ignored line: c=IN IP4 0.0.0.0
NACK enabled for channel 0
NACK enabled for channel 0
Created channel for video
Jingle:Channel[video|1|__]: NULL DTLS identity supplied. Not doing DTLS
Jingle:Channel[video|2|__]: NULL DTLS identity supplied. Not doing DTLS
Session:5893945934600346864 Old state:STATE_INIT New state:STATE_SENTINITIATE Type:urn:xmpp:jingle:apps:rtp:1 Transport:http://www.google.com/transport/p2p
Setting local video description
AddSendStream {id:JavaMediaStream_v0;ssrcs:[3720473526];ssrc_groups:;cname:nul6R21KmwAms3Ge;sync_label:JavaMediaStream}
Add send ssrc: 3720473526
>> Warning(webrtcvideoengine.cc:2704): SetReceiverBufferingMode(0,0) failed, err=12606
Changing video state, recv=0 send=0
Transport: video, allocating candidates
Transport: video, allocating candidates
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=Udp
Jingle:Port[:1:0::Net[eth0:192.168.0.0/24]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0::Net[eth0:192.168.0.0/24]]: Added port to allocator
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=Udp
Jingle:Port[:1:0::Net[tun0:192.168.128.6/32]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0::Net[tun0:192.168.128.6/32]]: Added port to allocator
Ignored line: c=IN IP4 0.0.0.0
Warning(webrtcvideoengine.cc:2309): GetStats: sender information not ready.
Jingle:Channel[video|1|__]: Other side didn't support DTLS.
Jingle:Channel[video|2|__]: Other side didn't support DTLS.
Enabling BUNDLE, bundling onto transport: video
Channel enabled
>> Changing video state, recv=1 send=0
Session:5893945934600346864 Old state:STATE_SENTINITIATE New state:STATE_RECEIVEDACCEPT Type:urn:xmpp:jingle:apps:rtp:1 Transport:http://www.google.com/transport/p2p
Setting remote video description
Hybrid NACK/FEC enabled for channel 0
Hybrid NACK/FEC enabled for channel 0
SetSendCodecs() : Selected video codec VP8/1280x720x30fps@2000kbps (min=50kbps, start=300kbps)
Video max quantization: 56
VP8 number of temporal layers: 1
VP8 options : picture loss indication = 0, feedback mode = 0, complexity = normal, resilience = off, denoising = 0, error concealment = 0, automatic resize = 0, frame dropping = 1, key frame interval = 3000
WARNING: no real random source present!
SRTP activated with negotiated parameters: send cipher_suite AES_CM_128_HMAC_SHA1_80 recv cipher_suite AES_CM_128_HMAC_SHA1_80
Changing video state, recv=1 send=0
Session:5893945934600346864 Old state:STATE_RECEIVEDACCEPT New state:STATE_INPROGRESS Type:urn:xmpp:jingle:apps:rtp:1 Transport:http://www.google.com/transport/p2p
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=Relay
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=Relay
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=Tcp
Jingle:Port[:1:0:local:Net[eth0:192.168.0.0/24]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0:local:Net[eth0:192.168.0.0/24]]: Added port to allocator
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=Tcp
Jingle:Port[:1:0:local:Net[tun0:192.168.128.6/32]]: Port created
Adding allocated port for video
Jingle:Port[video:1:0:local:Net[tun0:192.168.128.6/32]]: Added port to allocator
Jingle:Net[eth0:192.168.0.0/24]: Allocation Phase=SslTcp
Jingle:Net[tun0:192.168.128.6/32]: Allocation Phase=SslTcp
All candidates gathered for video:1:0
Transport: video, component 1 allocation complete
Transport: video allocation complete
Candidate gathering is complete.
Capture delay changed to 120 ms
Captured frame size 640x480. Expected format YUY2 640x480x30
Capture size changed : Selected video codec VP8/640x480x30fps@2000kbps (min=50kbps, automatic resize = 1, key frame interval = 3000
VAdapt Frame: 0 / 300 Changes: 0 Input: 640x480 Scale: 1 Output: 640x480 Changed: false
>> Jingle:Port[video:1:0::Net[eth0:192.168.0.0/24]]: Port deleted
>> Jingle:Port[video:1:0::Net[eth0:192.168.0.0/24]]: Removed port from allocator (3 remaining)
Removed port from p2p socket: 3 remaining
Jingle:Port[video:1:0::Net[tun0:192.168.128.6/32]]: Port deleted
Jingle:Port[video:1:0::Net[tun0:192.168.128.6/32]]: Removed port from allocator (2 remaining)
Removed port from p2p socket: 2 remaining
>> Jingle:Port[video:1:0:local:Net[eth0:192.168.0.0/24]]: Port deleted
>> Jingle:Port[video:1:0:local:Net[eth0:192.168.0.0/24]]: Removed port from allocator (1 remaining)
Removed port from p2p socket: 1 remaining
Jingle:Port[video:1:0:local:Net[tun0:192.168.128.6/32]]: Port deleted
Jingle:Port[video:1:0:local:Net[tun0:192.168.128.6/32]]: Removed port from allocator (0 remaining)
Removed port from p2p socket: 0 remaining

HTML

<html lang="en">
    <head>
        <title>Web Socket Signalling</title>
        <link rel="stylesheet" href="css/socket.css">
        <script src="js/socket.js"></script>
    </head>
    <body>
<h2>Response from Server</h2>
        <textarea id="responseText"></textarea>

        <h2>Video</h2>
        <video id="remoteVideo" autoplay></video>
    </body>
</html>

JavaScript

(function() {
  var remotePeerConnection;
  var sdpConstraints = {
    'mandatory' : {
      'OfferToReceiveAudio' : false, 'OfferToReceiveVideo' : true
    }
  };


  var Sock = function() {
    var socket;
    if (!window.WebSocket) {
      window.WebSocket = window.MozWebSocket;
    }


    if (window.WebSocket) {
      socket = new WebSocket("ws://localhost:8080/websocket");
      socket.onopen = onopen;
      socket.onmessage = onmessage;
      socket.onclose = onclose;
    } else {
      alert("Your browser does not support Web Socket.");
    }


    function onopen(event) {
      getTextAreaElement().value = "Web Socket opened!";
    }


    function onmessage(event) {
      appendTextArea(event.data);


      var sdpOffer = new RTCSessionDescription(JSON.parse(event.data));


      remotePeerConnection = new webkitRTCPeerConnection(null);
      remotePeerConnection.onaddstream = gotRemoteStream;


      trace("Java Offer: \n" + sdpOffer.sdp);
      remotePeerConnection.setRemoteDescription(sdpOffer);
      remotePeerConnection.createAnswer(gotRemoteDescription, onCreateSessionDescriptionError, sdpConstraints);


    }
    function onCreateSessionDescriptionError(error) {
      console.log('Failed to create session description: '
          + error.toString());
    }

    function gotRemoteDescription(answer) {
      remotePeerConnection.setLocalDescription(answer);
      trace("Browser's Answer: \n" + answer.sdp);


      socket.send(JSON.stringify(answer));
    }


    function gotRemoteStream(event) {
      var remoteVideo = document.getElementById("remoteVideo");
      remoteVideo.src = URL.createObjectURL(event.stream);
      trace("Received remote stream");
    }


    function onclose(event) {
      appendTextArea("Web Socket closed");
    }


    function appendTextArea(newData) {
      var el = getTextAreaElement();
      el.value = el.value + '\n' + newData;
    }


    function getTextAreaElement() {
      return document.getElementById('responseText');
    }


    function trace(text) {
      console.log((performance.now() / 1000).toFixed(3) + ": " + text);
    }


  }
  window.addEventListener('load', function() {
    new Sock();
  }, false);
})();

Java server

public class PeerConnectionManager {

   // Fields assumed from the usage below (only a fragment of the class is shown):
   // the WebRTC factory and peer connection, the video source, and the Netty
   // ChannelHandlerContext of the signalling WebSocket.
   private PeerConnectionFactory factory;
   private PeerConnection peerConnection;
   private VideoSource videoSource;
   private ChannelHandlerContext webSocketContext;

   /**
    * Called when the WebSocket handshake is completed
    */
   public void createOffer() {

      peerConnection = factory.createPeerConnection(
            new ArrayList<PeerConnection.IceServer>(), new MediaConstraints(), new PeerConnectionObserverImpl());


      // Get the video source
      videoSource = factory.createVideoSource(VideoCapturer.create(""), new MediaConstraints());


      // Create a MediaStream with one video track
      MediaStream lMS = factory.createLocalMediaStream("JavaMediaStream");
      VideoTrack videoTrack = factory.createVideoTrack("JavaMediaStream_v0", videoSource);
      videoTrack.addRenderer(new VideoRenderer(new VideoRendererObserverImpl()));
      lMS.addTrack(videoTrack);
      peerConnection.addStream(lMS, new MediaConstraints());

      // We don't want to receive anything
      MediaConstraints sdpConstraints = new MediaConstraints();
      sdpConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
            "OfferToReceiveAudio", "false"));
      sdpConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
            "OfferToReceiveVideo", "false"));


      // Get the Offer SDP
      SdpObserverImpl sdpOfferObserver = new SdpObserverImpl();
      peerConnection.createOffer(sdpOfferObserver, sdpConstraints);
      SessionDescription offerSdp = sdpOfferObserver.getSdp();

      // Set local SDP, don't care for any callbacks
      peerConnection.setLocalDescription(new SdpObserverImpl(), offerSdp);


      // Serialize the offer and send it to the browser via the WebSocket
      JSONObject offerSdpJson = new JSONObject();
      offerSdpJson.put("sdp", offerSdp.description);
      offerSdpJson.put("type", offerSdp.type.canonicalForm());
      webSocketContext.channel().writeAndFlush(
            new TextWebSocketFrame(offerSdpJson.toString()));


   }


   /**
    * Called when an SDP Answer arrives via the WebSocket
    */
   public void setRemoteDescription(SessionDescription answer) {
      peerConnection.setRemoteDescription(new SdpObserverImpl(), answer);

   }
}
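The fix described at the top would then be the receiving half of the ICE exchange: when the browser sends one of its ICE candidates back over the WebSocket, it has to be handed to the PeerConnection. Below is a minimal sketch of a method that could be added to PeerConnectionManager; the method name and the JSON field names are hypothetical and mirror the sketch near the top, while IceCandidate and addIceCandidate are the actual org.webrtc API calls.

   /**
    * Sketch only: called when an ICE candidate message arrives from the
    * browser via the signalling WebSocket. The JSON layout matches the
    * convention used in the onIceCandidate sketch above.
    */
   public void addRemoteIceCandidate(JSONObject json) {
      IceCandidate candidate = new IceCandidate(
            json.getString("sdpMid"),
            json.getInt("sdpMLineIndex"),
            json.getString("candidate"));
      peerConnection.addIceCandidate(candidate);
   }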
