

Java MediaConstraints Class Code Examples

This article collects typical usage examples of the Java class org.webrtc.MediaConstraints. If you are wondering what MediaConstraints is for, how it is used, or want to see it in real code, the curated class examples below should help.


The MediaConstraints class belongs to the org.webrtc package. The sections below present 15 code examples of the class, sorted by popularity by default.
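Before the examples, a quick note on the class's shape: a MediaConstraints instance exposes two public lists, `mandatory` and `optional`, each holding `MediaConstraints.KeyValuePair` entries. The toy model below (class names are ours, not part of org.webrtc) mimics that structure so the pattern can be run without the WebRTC native library:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-in for org.webrtc.MediaConstraints: two lists of key/value pairs.
// This is a sketch for illustration only, not the real WebRTC class.
class SimpleConstraints {
    static final class KeyValuePair {
        final String key, value;
        KeyValuePair(String key, String value) { this.key = key; this.value = value; }
        @Override public String toString() { return key + ": " + value; }
    }

    final List<KeyValuePair> mandatory = new ArrayList<>();
    final List<KeyValuePair> optional  = new ArrayList<>();

    public static void main(String[] args) {
        // Same pattern the examples below use with the real class.
        SimpleConstraints pc = new SimpleConstraints();
        pc.mandatory.add(new KeyValuePair("OfferToReceiveAudio", "true"));
        pc.optional.add(new KeyValuePair("DtlsSrtpKeyAgreement", "true"));
        System.out.println(pc.mandatory.get(0)); // prints "OfferToReceiveAudio: true"
    }
}
```

With the real class the calls are identical: construct, then append KeyValuePair entries to `mandatory` or `optional` before passing the object to a factory method.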

Example 1: WebRTC

import org.webrtc.MediaConstraints; // import the required package/class
WebRTC(WebRTCTask task, MainActivity activity) {
	this.task = task;
	this.activity = activity;

	// Initialize Android globals
	// See https://bugs.chromium.org/p/webrtc/issues/detail?id=3416
	PeerConnectionFactory.initializeAndroidGlobals(activity, false);

	// Set ICE servers
	List<PeerConnection.IceServer> iceServers = new ArrayList<>();
	iceServers.add(new org.webrtc.PeerConnection.IceServer("stun:" + Config.STUN_SERVER));
	if (Config.TURN_SERVER != null) {
		iceServers.add(new org.webrtc.PeerConnection.IceServer("turn:" + Config.TURN_SERVER,
				Config.TURN_USER, Config.TURN_PASS));
	}

	// Create peer connection
	final PeerConnectionFactory.Options options = new PeerConnectionFactory.Options();
	this.factory = new PeerConnectionFactory(options);
	this.constraints = new MediaConstraints();
	this.pc = this.factory.createPeerConnection(iceServers, constraints, new PeerConnectionObserver());

	// Add task message event handler
	this.task.setMessageHandler(new TaskMessageHandler());
}
 
Developer: saltyrtc | Project: saltyrtc-demo | Lines: 26 | Source: WebRTC.java

Example 2: WebRtcClient

import org.webrtc.MediaConstraints; // import the required package/class
public WebRtcClient(RtcListener listener, String host, PeerConnectionClient.PeerConnectionParameters params) {
    mListener = listener;
    pcParams = params;
    PeerConnectionFactory.initializeAndroidGlobals(listener, true, true,
            params.videoCodecHwAcceleration);
    factory = new PeerConnectionFactory();
    MessageHandler messageHandler = new MessageHandler();

    try {
        client = IO.socket(host);
    } catch (URISyntaxException e) {
        e.printStackTrace();
    }
    client.on("id", messageHandler.onId);
    client.on("message", messageHandler.onMessage);
    client.connect();

    iceServers.add(new PeerConnection.IceServer("stun:23.21.150.121"));
    iceServers.add(new PeerConnection.IceServer("stun:stun.l.google.com:19302"));

    pcConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"));
    pcConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveVideo", "true"));
    pcConstraints.optional.add(new MediaConstraints.KeyValuePair("DtlsSrtpKeyAgreement", "true"));
}
 
Developer: ardnezar | Project: webrtc-android | Lines: 25 | Source: WebRtcClient.java

Example 3: setCamera

import org.webrtc.MediaConstraints; // import the required package/class
private void setCamera(){
    localMS = factory.createLocalMediaStream("ARDAMS");
    if(pcParams.videoCallEnabled){
        MediaConstraints videoConstraints = new MediaConstraints();
        videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxHeight", Integer.toString(pcParams.videoHeight)));
        videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxWidth", Integer.toString(pcParams.videoWidth)));
        videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxFrameRate", Integer.toString(pcParams.videoFps)));
        videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("minFrameRate", Integer.toString(pcParams.videoFps)));

        videoSource = factory.createVideoSource(getVideoCapturer(), videoConstraints);
        localMS.addTrack(factory.createVideoTrack("ARDAMSv0", videoSource));
    }

    AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
    localMS.addTrack(factory.createAudioTrack("ARDAMSa0", audioSource));

    mListener.onLocalStream(localMS);
}
 
Developer: ardnezar | Project: webrtc-android | Lines: 19 | Source: WebRtcClient.java

Example 4: addLocalStreams

import org.webrtc.MediaConstraints; // import the required package/class
private void addLocalStreams(Context context) {
    AudioManager audioManager = ((AudioManager) context.getSystemService(Context.AUDIO_SERVICE));
    // TODO(fischman): figure out how to do this Right(tm) and remove the suppression.
    @SuppressWarnings("deprecation")
    boolean isWiredHeadsetOn = audioManager.isWiredHeadsetOn();
    audioManager.setMode(isWiredHeadsetOn ? AudioManager.MODE_IN_CALL : AudioManager.MODE_IN_COMMUNICATION);
    audioManager.setSpeakerphoneOn(!isWiredHeadsetOn);

    localStream = peerConnectionFactory.createLocalMediaStream("ARDAMS");

    if (!audioOnly) {
        VideoCapturer capturer = getVideoCapturer();
        MediaConstraints videoConstraints = new MediaConstraints();
        videoSource = peerConnectionFactory.createVideoSource(capturer, videoConstraints);
        VideoTrack videoTrack = peerConnectionFactory.createVideoTrack("ARDAMSv0", videoSource);
        videoTrack.addRenderer(new VideoRenderer(localRender));
        localStream.addTrack(videoTrack);
    }

    localStream.addTrack(peerConnectionFactory.createAudioTrack("ARDAMSa0", peerConnectionFactory.createAudioSource(new MediaConstraints())));

    peerConnection.addStream(localStream);
}
 
Developer: respoke | Project: respoke-sdk-android | Lines: 24 | Source: RespokeCall.java

Example 5: createPeerConnection

import org.webrtc.MediaConstraints; // import the required package/class
private boolean createPeerConnection(Context context) {
	boolean success = false;

	if (PeerConnectionFactory.initializeAndroidGlobals(context)) {
		PeerConnectionFactory factory = new PeerConnectionFactory();
		List<IceServer> iceServers = new ArrayList<IceServer>();
		iceServers.add(new IceServer("stun:stun.l.google.com:19302"));
		// For TURN servers the format would be:
		// new IceServer("turn:url", user, password)

		MediaConstraints mediaConstraints = new MediaConstraints();
		mediaConstraints.optional.add(new MediaConstraints.KeyValuePair("DtlsSrtpKeyAgreement", "false"));
		mediaConstraints.optional.add(new MediaConstraints.KeyValuePair("RtpDataChannels", "true"));
		peerConnection = factory.createPeerConnection(iceServers, mediaConstraints, this);

		localStream = factory.createLocalMediaStream("WEBRTC_WORKSHOP_NS");
		localStream.addTrack(factory.createAudioTrack("WEBRTC_WORKSHOP_NSa1",
				factory.createAudioSource(new MediaConstraints())));
		peerConnection.addStream(localStream, new MediaConstraints());
		success = true;
	}

	return success;
}
 
Developer: Serchinastico | Project: webrtc-workshop | Lines: 25 | Source: PeerConnectionWrapper.java

Example 6: createAudioTrack

import org.webrtc.MediaConstraints; // import the required package/class
/**
 * Create the local audio stack
 */
private void createAudioTrack() {
    Log.d(LOG_TAG, "createAudioTrack");

    MediaConstraints audioConstraints = new MediaConstraints();

    // add all existing audio filters to avoid having echos
    audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair("googEchoCancellation", "true"));
    audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair("googEchoCancellation2", "true"));
    audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair("googDAEchoCancellation", "true"));

    audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair("googTypingNoiseDetection", "true"));

    audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair("googAutoGainControl", "true"));
    audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair("googAutoGainControl2", "true"));

    audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair("googNoiseSuppression", "true"));
    audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair("googNoiseSuppression2", "true"));

    audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair("googAudioMirroring", "false"));
    audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair("googHighpassFilter", "true"));

    mAudioSource = mPeerConnectionFactory.createAudioSource(audioConstraints);
    mLocalAudioTrack = mPeerConnectionFactory.createAudioTrack(AUDIO_TRACK_ID, mAudioSource);
}
 
Developer: matrix-org | Project: matrix-android-sdk | Lines: 28 | Source: MXWebRtcCall.java

Example 7: PeerConnectionWrapper

import org.webrtc.MediaConstraints; // import the required package/class
public PeerConnectionWrapper(@NonNull Context context,
                             @NonNull PeerConnectionFactory factory,
                             @NonNull PeerConnection.Observer observer,
                             @NonNull VideoRenderer.Callbacks localRenderer,
                             @NonNull List<PeerConnection.IceServer> turnServers,
                             boolean hideIp)
{
  List<PeerConnection.IceServer> iceServers = new LinkedList<>();
  iceServers.add(STUN_SERVER);
  iceServers.addAll(turnServers);

  MediaConstraints                constraints      = new MediaConstraints();
  MediaConstraints                audioConstraints = new MediaConstraints();
  PeerConnection.RTCConfiguration configuration    = new PeerConnection.RTCConfiguration(iceServers);

  configuration.bundlePolicy  = PeerConnection.BundlePolicy.MAXBUNDLE;
  configuration.rtcpMuxPolicy = PeerConnection.RtcpMuxPolicy.REQUIRE;

  if (hideIp) {
    configuration.iceTransportsType = PeerConnection.IceTransportsType.RELAY;
  }

  constraints.optional.add(new MediaConstraints.KeyValuePair("DtlsSrtpKeyAgreement", "true"));
  audioConstraints.optional.add(new MediaConstraints.KeyValuePair("DtlsSrtpKeyAgreement", "true"));

  this.peerConnection = factory.createPeerConnection(configuration, constraints, observer);
  this.videoCapturer  = createVideoCapturer(context);

  MediaStream mediaStream = factory.createLocalMediaStream("ARDAMS");
  this.audioSource = factory.createAudioSource(audioConstraints);
  this.audioTrack  = factory.createAudioTrack("ARDAMSa0", audioSource);
  this.audioTrack.setEnabled(false);
  mediaStream.addTrack(audioTrack);

  if (videoCapturer != null) {
    this.videoSource = factory.createVideoSource(videoCapturer);
    this.videoTrack = factory.createVideoTrack("ARDAMSv0", videoSource);

    this.videoTrack.addRenderer(new VideoRenderer(localRenderer));
    this.videoTrack.setEnabled(false);
    mediaStream.addTrack(videoTrack);
  } else {
    this.videoSource = null;
    this.videoTrack  = null;
  }

  this.peerConnection.addStream(mediaStream);
}
 
Developer: XecureIT | Project: PeSanKita-android | Lines: 49 | Source: PeerConnectionWrapper.java

Example 8: PnSignalingParams

import org.webrtc.MediaConstraints; // import the required package/class
public PnSignalingParams(
        List<PeerConnection.IceServer> iceServers,
        MediaConstraints pcConstraints,
        MediaConstraints videoConstraints,
        MediaConstraints audioConstraints) {
    this.iceServers       = (iceServers==null)       ? defaultIceServers()       : iceServers;
    this.pcConstraints    = (pcConstraints==null)    ? defaultPcConstraints()    : pcConstraints;
    this.videoConstraints = (videoConstraints==null) ? defaultVideoConstraints() : videoConstraints;
    this.audioConstraints = (audioConstraints==null) ? defaultAudioConstraints() : audioConstraints;
}
 
Developer: newbie007fx | Project: newwebrtc | Lines: 11 | Source: PnSignalingParams.java
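The constructor above falls back to a default for every null argument. That null-coalescing pattern can be isolated and exercised without the WebRTC types; the class and helper names below are illustrative stand-ins, not part of the library:

```java
import java.util.List;

// Illustrates the null-coalescing defaults used by PnSignalingParams:
// keep the caller's value when present, otherwise supply a default.
class DefaultingParams {
    final List<String> iceServers;

    DefaultingParams(List<String> iceServers) {
        this.iceServers = (iceServers == null) ? defaultIceServers() : iceServers;
    }

    static List<String> defaultIceServers() {
        return List.of("stun:stun.l.google.com:19302");
    }
}
```

The design choice here is that callers can configure any subset of the parameters and still get a fully populated object, which is why each field is coalesced independently.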

Example 9: defaultInstance

import org.webrtc.MediaConstraints; // import the required package/class
/**
 * The default parameters for media constraints. Might have to tweak in future.
 * @return default parameters
 */
public static PnSignalingParams defaultInstance() {
    MediaConstraints pcConstraints    = PnSignalingParams.defaultPcConstraints();
    MediaConstraints videoConstraints = PnSignalingParams.defaultVideoConstraints();
    MediaConstraints audioConstraints = PnSignalingParams.defaultAudioConstraints();
    List<PeerConnection.IceServer> iceServers = PnSignalingParams.defaultIceServers();
    return new PnSignalingParams(iceServers, pcConstraints, videoConstraints, audioConstraints);
}
 
Developer: newbie007fx | Project: newwebrtc | Lines: 12 | Source: PnSignalingParams.java

Example 10: defaultPcConstraints

import org.webrtc.MediaConstraints; // import the required package/class
private static MediaConstraints defaultPcConstraints(){
    MediaConstraints pcConstraints = new MediaConstraints();
    pcConstraints.optional.add(new MediaConstraints.KeyValuePair("DtlsSrtpKeyAgreement", "true"));
    pcConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"));
    pcConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveVideo", "true"));
    return pcConstraints;
}
 
Developer: newbie007fx | Project: newwebrtc | Lines: 8 | Source: PnSignalingParams.java

Example 11: defaultVideoConstraints

import org.webrtc.MediaConstraints; // import the required package/class
private static MediaConstraints defaultVideoConstraints(){
    MediaConstraints videoConstraints = new MediaConstraints();
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxWidth","1280"));
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxHeight","720"));
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("minWidth", "640"));
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("minHeight","480"));
    return videoConstraints;
}
 
Developer: newbie007fx | Project: newwebrtc | Lines: 9 | Source: PnSignalingParams.java

Example 12: start

import org.webrtc.MediaConstraints; // import the required package/class
public void start() {
    start.setEnabled(false);
    call.setEnabled(true);
    //Initialize PeerConnectionFactory globals.
    //Params are context, initAudio,initVideo and videoCodecHwAcceleration
    PeerConnectionFactory.initializeAndroidGlobals(this, true, true, true);

    //Create a new PeerConnectionFactory instance.
    PeerConnectionFactory.Options options = new PeerConnectionFactory.Options();
    peerConnectionFactory = new PeerConnectionFactory(options);


    //Now create a VideoCapturer instance. Callback methods are there if you want to do something! Duh!
    VideoCapturer videoCapturerAndroid = getVideoCapturer(new CustomCameraEventsHandler());

    //Create MediaConstraints - Will be useful for specifying video and audio constraints.
    audioConstraints = new MediaConstraints();
    videoConstraints = new MediaConstraints();

    //Create a VideoSource instance
    videoSource = peerConnectionFactory.createVideoSource(videoCapturerAndroid);
    localVideoTrack = peerConnectionFactory.createVideoTrack("100", videoSource);

    //create an AudioSource instance
    audioSource = peerConnectionFactory.createAudioSource(audioConstraints);
    localAudioTrack = peerConnectionFactory.createAudioTrack("101", audioSource);
    localVideoView.setVisibility(View.VISIBLE);

    //create a videoRenderer based on SurfaceViewRenderer instance
    localRenderer = new VideoRenderer(localVideoView);
    // And finally, with our VideoRenderer ready, we
    // can add our renderer to the VideoTrack.
    localVideoTrack.addRenderer(localRenderer);

}
 
Developer: vivek1794 | Project: webrtc-android-codelab | Lines: 36 | Source: MainActivity.java

Example 13: WebRtcClient

import org.webrtc.MediaConstraints; // import the required package/class
public WebRtcClient(RtcListener listener, String host, PeerConnectionParameters params) {
        mListener = listener;
        pcParams = params;
        PeerConnectionFactory.initializeAndroidGlobals(listener, true, true,
                params.videoCodecHwAcceleration);
        factory = new PeerConnectionFactory();
        MessageHandler messageHandler = new MessageHandler();
        Log.d(TAG, "WebRtcClient..host:" + host);
        try {
            Manager man = new Manager(new URI(host));
//            client = IO.socket(host);
            client = man.socket("/hello");

        } catch (URISyntaxException e) {
            e.printStackTrace();
            Log.d(TAG, "WebRtcClient..exception");
        }
        client.on("id", messageHandler.onId);
        client.on("message", messageHandler.onMessage);
        client.connect();

//        iceServers.add(new PeerConnection.IceServer("stun:23.21.150.121"));
        iceServers.add(new PeerConnection.IceServer("stun:stun.l.google.com:19302"));

        pcConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"));
        pcConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveVideo", "true"));
        pcConstraints.optional.add(new MediaConstraints.KeyValuePair("DtlsSrtpKeyAgreement", "true"));
    }
 
Developer: ardnezar | Project: webrtc-android | Lines: 29 | Source: WebRtcClient.java

Example 14: onAnswerButtonClicked

import org.webrtc.MediaConstraints; // import the required package/class
public void onAnswerButtonClicked(final View view) {
  progressState.changeValue(ProgressState.NEGOTIATING);
  showHangButton();

  final PeerConnectionFactory factory = ((Application) getApplication()).getWebRtcFactory();
  final MediaStream stream = factory.createLocalMediaStream(UUID.randomUUID().toString());
  stream.addTrack(factory.createAudioTrack(
      UUID.randomUUID().toString(),
      factory.createAudioSource(CONSTRAINTS)
  ));
  peerConnection.addStream(stream);
  peerConnection.createAnswer(sdpObserver, new MediaConstraints());
}
 
Developer: seamlik | Project: viska-android | Lines: 14 | Source: CallingActivity.java

Example 15: createPeerConnection

import org.webrtc.MediaConstraints; // import the required package/class
/**
 * Creates a PeerConnection.
 * @param config configuration of PeerConnection
 */
private void createPeerConnection(final List<PeerConnection.IceServer> config) {
    MediaConstraints mc = new MediaConstraints();
    try {
        mPeerConnection = mFactory.createPeerConnection(config, mc, mObserver);
    } catch (Exception e) {
        if (BuildConfig.DEBUG) {
            Log.e(TAG, "@@@ Failed to create PeerConnection.", e);
        }
        throw new RuntimeException(e);
    }
}
 
Developer: DeviceConnect | Project: DeviceConnect-Android | Lines: 16 | Source: MediaConnection.java


Note: the org.webrtc.MediaConstraints class examples in this article were compiled by 纯净天空 from open-source code hosted on GitHub, MSDocs, and similar platforms. The snippets were selected from open-source projects contributed by their respective authors, and copyright of the source code remains with them; refer to each project's License before distributing or using the code. Do not republish without permission.