

Java PeerConnectionFactory.createVideoTrack Method Code Examples

This article collects typical usage examples of the org.webrtc.PeerConnectionFactory.createVideoTrack method in Java. If you are wondering how PeerConnectionFactory.createVideoTrack is used in practice, or what real calling code looks like, the curated examples below should help. You can also explore further usage examples of the enclosing class, org.webrtc.PeerConnectionFactory.


The sections below show 6 code examples of the PeerConnectionFactory.createVideoTrack method, sorted by popularity by default.
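Before the examples, here is a minimal sketch of the typical call sequence, written against the older org.webrtc Android API that the snippets below use (newer releases replace VideoRenderer/addRenderer with VideoSink/addSink and wire the capturer through a CapturerObserver). The helper class name and the track id string are illustrative, not part of the library:

import org.webrtc.PeerConnectionFactory;
import org.webrtc.SurfaceViewRenderer;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoRenderer;
import org.webrtc.VideoSource;
import org.webrtc.VideoTrack;

public class LocalVideoTrackSketch {

    // Illustrative track id; any unique, non-empty string works.
    private static final String VIDEO_TRACK_ID = "ARDAMSv0";

    // Creates a local VideoTrack from a capturer and attaches it to a renderer,
    // mirroring the createVideoSource -> createVideoTrack -> addRenderer flow
    // used throughout the examples below.
    public static VideoTrack createLocalVideoTrack(PeerConnectionFactory factory,
                                                   VideoCapturer capturer,
                                                   SurfaceViewRenderer localView) {
        // The VideoSource wraps the capturer and feeds frames to tracks created from it.
        VideoSource videoSource = factory.createVideoSource(capturer);

        // createVideoTrack binds a track id to the video source.
        VideoTrack videoTrack = factory.createVideoTrack(VIDEO_TRACK_ID, videoSource);

        // SurfaceViewRenderer implements VideoRenderer.Callbacks in this API version,
        // so it can back a VideoRenderer for the local preview.
        videoTrack.addRenderer(new VideoRenderer(localView));
        videoTrack.setEnabled(true);
        return videoTrack;
    }
}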

Example 1: PeerConnectionWrapper

import org.webrtc.PeerConnectionFactory; // import the package/class the method depends on
public PeerConnectionWrapper(@NonNull Context context,
                             @NonNull PeerConnectionFactory factory,
                             @NonNull PeerConnection.Observer observer,
                             @NonNull VideoRenderer.Callbacks localRenderer,
                             @NonNull List<PeerConnection.IceServer> turnServers,
                             boolean hideIp)
{
  List<PeerConnection.IceServer> iceServers = new LinkedList<>();
  iceServers.add(STUN_SERVER);
  iceServers.addAll(turnServers);

  MediaConstraints                constraints      = new MediaConstraints();
  MediaConstraints                audioConstraints = new MediaConstraints();
  PeerConnection.RTCConfiguration configuration    = new PeerConnection.RTCConfiguration(iceServers);

  configuration.bundlePolicy  = PeerConnection.BundlePolicy.MAXBUNDLE;
  configuration.rtcpMuxPolicy = PeerConnection.RtcpMuxPolicy.REQUIRE;

  if (hideIp) {
    configuration.iceTransportsType = PeerConnection.IceTransportsType.RELAY;
  }

  constraints.optional.add(new MediaConstraints.KeyValuePair("DtlsSrtpKeyAgreement", "true"));
  audioConstraints.optional.add(new MediaConstraints.KeyValuePair("DtlsSrtpKeyAgreement", "true"));

  this.peerConnection = factory.createPeerConnection(configuration, constraints, observer);
  this.videoCapturer  = createVideoCapturer(context);

  MediaStream mediaStream = factory.createLocalMediaStream("ARDAMS");
  this.audioSource = factory.createAudioSource(audioConstraints);
  this.audioTrack  = factory.createAudioTrack("ARDAMSa0", audioSource);
  this.audioTrack.setEnabled(false);
  mediaStream.addTrack(audioTrack);

  if (videoCapturer != null) {
    this.videoSource = factory.createVideoSource(videoCapturer);
    this.videoTrack = factory.createVideoTrack("ARDAMSv0", videoSource);

    this.videoTrack.addRenderer(new VideoRenderer(localRenderer));
    this.videoTrack.setEnabled(false);
    mediaStream.addTrack(videoTrack);
  } else {
    this.videoSource = null;
    this.videoTrack  = null;
  }

  this.peerConnection.addStream(mediaStream);
}
 
Developer ID: XecureIT, Project: PeSanKita-android, Code lines: 49, Source: PeerConnectionWrapper.java

Example 2: start

import org.webrtc.PeerConnectionFactory; // import the package/class the method depends on
public void start() {
    start.setEnabled(false);
    call.setEnabled(true);
    //Initialize PeerConnectionFactory globals.
    //Params are context, initAudio,initVideo and videoCodecHwAcceleration
    PeerConnectionFactory.initializeAndroidGlobals(this, true, true, true);

    //Create a new PeerConnectionFactory instance.
    PeerConnectionFactory.Options options = new PeerConnectionFactory.Options();
    peerConnectionFactory = new PeerConnectionFactory(options);


    //Now create a VideoCapturer instance. Callback methods are there if you want to do something! Duh!
    VideoCapturer videoCapturerAndroid = getVideoCapturer(new CustomCameraEventsHandler());

    //Create MediaConstraints - Will be useful for specifying video and audio constraints.
    audioConstraints = new MediaConstraints();
    videoConstraints = new MediaConstraints();

    //Create a VideoSource instance
    videoSource = peerConnectionFactory.createVideoSource(videoCapturerAndroid);
    localVideoTrack = peerConnectionFactory.createVideoTrack("100", videoSource);

    //create an AudioSource instance
    audioSource = peerConnectionFactory.createAudioSource(audioConstraints);
    localAudioTrack = peerConnectionFactory.createAudioTrack("101", audioSource);
    localVideoView.setVisibility(View.VISIBLE);

    //create a videoRenderer based on SurfaceViewRenderer instance
    localRenderer = new VideoRenderer(localVideoView);
    // And finally, with our VideoRenderer ready, we
    // can add our renderer to the VideoTrack.
    localVideoTrack.addRenderer(localRenderer);

}
 
Developer ID: vivek1794, Project: webrtc-android-codelab, Code lines: 36, Source: MainActivity.java

Example 3: onCreate

import org.webrtc.PeerConnectionFactory; // import the package/class the method depends on
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    //Initialize PeerConnectionFactory globals.
    //Params are context, initAudio,initVideo and videoCodecHwAcceleration
    PeerConnectionFactory.initializeAndroidGlobals(this, true, true, true);

    //Create a new PeerConnectionFactory instance.
    PeerConnectionFactory.Options options = new PeerConnectionFactory.Options();
    PeerConnectionFactory peerConnectionFactory = new PeerConnectionFactory(options);


    //Now create a VideoCapturer instance. Callback methods are there if you want to do something! Duh!
    VideoCapturer videoCapturerAndroid = createVideoCapturer();
    //Create MediaConstraints - Will be useful for specifying video and audio constraints. More on this later!
    MediaConstraints constraints = new MediaConstraints();

    //Create a VideoSource instance
    VideoSource videoSource = peerConnectionFactory.createVideoSource(videoCapturerAndroid);
    VideoTrack localVideoTrack = peerConnectionFactory.createVideoTrack("100", videoSource);

    //create an AudioSource instance
    AudioSource audioSource = peerConnectionFactory.createAudioSource(constraints);
    AudioTrack localAudioTrack = peerConnectionFactory.createAudioTrack("101", audioSource);

    //we will start capturing the video from the camera
    //width,height and fps
    videoCapturerAndroid.startCapture(1000, 1000, 30);

    //create surface renderer, init it and add the renderer to the track
    SurfaceViewRenderer videoView = (SurfaceViewRenderer) findViewById(R.id.surface_rendeer);
    videoView.setMirror(true);

    EglBase rootEglBase = EglBase.create();
    videoView.init(rootEglBase.getEglBaseContext(), null);

    localVideoTrack.addRenderer(new VideoRenderer(videoView));


}
 
Developer ID: vivek1794, Project: webrtc-android-codelab, Code lines: 43, Source: MainActivity.java

Example 4: onCreate

import org.webrtc.PeerConnectionFactory; // import the package/class the method depends on
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    setContentView(R.layout.activity_main);

    AudioManager audioManager = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
    audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);
    audioManager.setSpeakerphoneOn(true);

    PeerConnectionFactory.initializeAndroidGlobals(
            this,  // Context
            true,  // Audio Enabled
            true,  // Video Enabled
            true,  // Hardware Acceleration Enabled
            null); // Render EGL Context

    peerConnectionFactory = new PeerConnectionFactory();

    VideoCapturerAndroid vc = VideoCapturerAndroid.create(VideoCapturerAndroid.getNameOfFrontFacingDevice(), null);

    localVideoSource = peerConnectionFactory.createVideoSource(vc, new MediaConstraints());
    VideoTrack localVideoTrack = peerConnectionFactory.createVideoTrack(VIDEO_TRACK_ID, localVideoSource);
    localVideoTrack.setEnabled(true);

    AudioSource audioSource = peerConnectionFactory.createAudioSource(new MediaConstraints());
    AudioTrack localAudioTrack = peerConnectionFactory.createAudioTrack(AUDIO_TRACK_ID, audioSource);
    localAudioTrack.setEnabled(true);

    localMediaStream = peerConnectionFactory.createLocalMediaStream(LOCAL_STREAM_ID);
    localMediaStream.addTrack(localVideoTrack);
    localMediaStream.addTrack(localAudioTrack);

    GLSurfaceView videoView = (GLSurfaceView) findViewById(R.id.glview_call);

    VideoRendererGui.setView(videoView, null);
    try {
        otherPeerRenderer = VideoRendererGui.createGui(0, 0, 100, 100, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
        VideoRenderer renderer = VideoRendererGui.createGui(50, 50, 50, 50, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
        localVideoTrack.addRenderer(renderer);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
 
Developer ID: Nitrillo, Project: krankygeek, Code lines: 45, Source: MainActivity.java

Example 5: onIceServers

import org.webrtc.PeerConnectionFactory; // import the package/class the method depends on
@Override
public void onIceServers(List<PeerConnection.IceServer> iceServers) {
  factory = new PeerConnectionFactory();

  MediaConstraints pcConstraints = appRtcClient.pcConstraints();
  pcConstraints.optional.add(
      new MediaConstraints.KeyValuePair("RtpDataChannels", "true"));
  pc = factory.createPeerConnection(iceServers, pcConstraints, pcObserver);

  createDataChannelToRegressionTestBug2302(pc);  // See method comment.

  // Uncomment to get ALL WebRTC tracing and SENSITIVE libjingle logging.
  // NOTE: this _must_ happen while |factory| is alive!
  // Logging.enableTracing(
  //     "logcat:",
  //     EnumSet.of(Logging.TraceLevel.TRACE_ALL),
  //     Logging.Severity.LS_SENSITIVE);

  {
    final PeerConnection finalPC = pc;
    final Runnable repeatedStatsLogger = new Runnable() {
        public void run() {
          synchronized (quit[0]) {
            if (quit[0]) {
              return;
            }
            final Runnable runnableThis = this;
            if (hudView.getVisibility() == View.INVISIBLE) {
              vsv.postDelayed(runnableThis, 1000);
              return;
            }
            boolean success = finalPC.getStats(new StatsObserver() {
                public void onComplete(final StatsReport[] reports) {
                  runOnUiThread(new Runnable() {
                      public void run() {
                        updateHUD(reports);
                      }
                    });
                  for (StatsReport report : reports) {
                    Log.d(TAG, "Stats: " + report.toString());
                  }
                  vsv.postDelayed(runnableThis, 1000);
                }
              }, null);
            if (!success) {
              throw new RuntimeException("getStats() return false!");
            }
          }
        }
      };
    vsv.postDelayed(repeatedStatsLogger, 1000);
  }

  {
    logAndToast("Creating local video source...");
    MediaStream lMS = factory.createLocalMediaStream("ARDAMS");
    if (appRtcClient.videoConstraints() != null) {
      VideoCapturer capturer = getVideoCapturer();
      videoSource = factory.createVideoSource(
          capturer, appRtcClient.videoConstraints());
      VideoTrack videoTrack =
          factory.createVideoTrack("ARDAMSv0", videoSource);
      videoTrack.addRenderer(new VideoRenderer(localRender));
      lMS.addTrack(videoTrack);
    }
    if (appRtcClient.audioConstraints() != null) {
      lMS.addTrack(factory.createAudioTrack(
          "ARDAMSa0",
          factory.createAudioSource(appRtcClient.audioConstraints())));
    }
    pc.addStream(lMS, new MediaConstraints());
  }
  logAndToast("Waiting for ICE candidates...");
}
 
Developer ID: gaku, Project: WebRTCDemo, Code lines: 75, Source: AppRTCDemoActivity.java

Example 6: onIceServers

import org.webrtc.PeerConnectionFactory; // import the package/class the method depends on
@Override
public void onIceServers(List<PeerConnection.IceServer> iceServers) {
  factory = new PeerConnectionFactory();
  pc = factory.createPeerConnection(
      iceServers, appRtcClient.pcConstraints(), pcObserver);

  // Uncomment to get ALL WebRTC tracing and SENSITIVE libjingle logging.
  // NOTE: this _must_ happen while |factory| is alive!
  // Logging.enableTracing(
  //     "logcat:",
  //     EnumSet.of(Logging.TraceLevel.TRACE_ALL),
  //     Logging.Severity.LS_SENSITIVE);

  {
    final PeerConnection finalPC = pc;
    final Runnable repeatedStatsLogger = new Runnable() {
        public void run() {
          synchronized (quit[0]) {
            if (quit[0]) {
              return;
            }
            final Runnable runnableThis = this;
            boolean success = finalPC.getStats(new StatsObserver() {
                public void onComplete(StatsReport[] reports) {
                  for (StatsReport report : reports) {
                    Log.d(TAG, "Stats: " + report.toString());
                  }
                  vsv.postDelayed(runnableThis, 10000);
                }
              }, null);
            if (!success) {
              throw new RuntimeException("getStats() return false!");
            }
          }
        }
      };
    vsv.postDelayed(repeatedStatsLogger, 10000);
  }

  {
    logAndToast("Creating local video source...");
    MediaStream lMS = factory.createLocalMediaStream("ARDAMS");
    if (appRtcClient.videoConstraints() != null) {
      VideoCapturer capturer = getVideoCapturer();
      videoSource = factory.createVideoSource(
          capturer, appRtcClient.videoConstraints());
      VideoTrack videoTrack =
          factory.createVideoTrack("ARDAMSv0", videoSource);
      videoTrack.addRenderer(new VideoRenderer(new VideoCallbacks(
          vsv, VideoStreamsView.Endpoint.LOCAL)));
      lMS.addTrack(videoTrack);
    }
    lMS.addTrack(factory.createAudioTrack("ARDAMSa0"));
    pc.addStream(lMS, new MediaConstraints());
  }
  logAndToast("Waiting for ICE candidates...");
}
 
Developer ID: kenneththorman, Project: appspotdemo-mono, Code lines: 58, Source: AppRTCDemoActivity.java


Note: The org.webrtc.PeerConnectionFactory.createVideoTrack examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by their respective developers, and copyright remains with the original authors. Please consult each project's license before redistributing or using the code; do not repost without permission.