

Java AudioTrack.setEnabled Method Code Examples

This article collects typical usage examples of the Java method org.webrtc.AudioTrack.setEnabled. If you are wondering what AudioTrack.setEnabled does, how to use it, or what calling code looks like, the curated examples below should help. You can also browse more usage examples of the enclosing class, org.webrtc.AudioTrack.


Five code examples of the AudioTrack.setEnabled method are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your ratings help the system recommend better Java code examples.
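As a quick orientation before the examples: setEnabled(false) mutes a track without detaching it from its MediaStream, and setEnabled(true) makes it live again. Below is a minimal hedged sketch of that pattern; the factory and track names are illustrative, not taken from the examples that follow.

import org.webrtc.AudioSource;
import org.webrtc.AudioTrack;
import org.webrtc.MediaConstraints;
import org.webrtc.PeerConnectionFactory;

// `factory` is assumed to be an already-initialized PeerConnectionFactory.
AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
AudioTrack micTrack = factory.createAudioTrack("audio0", audioSource);

micTrack.setEnabled(false); // mute: the track stays attached but goes silent
micTrack.setEnabled(true);  // unmute: audio flows again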

Example 1: onAddStream

import org.webrtc.AudioTrack; // import the package/class this method depends on
@Override
public void onAddStream(MediaStream stream) {
  Log.w(TAG, "onAddStream:" + stream);

  // Enable every remote audio track so the incoming audio is played.
  for (AudioTrack audioTrack : stream.audioTracks) {
    audioTrack.setEnabled(true);
  }

  // If exactly one remote video track arrived, enable it and attach the renderer.
  if (stream.videoTracks != null && stream.videoTracks.size() == 1) {
    VideoTrack videoTrack = stream.videoTracks.getFirst();
    videoTrack.setEnabled(true);
    videoTrack.addRenderer(new VideoRenderer(remoteRenderer));
  }
}
 
Developer ID: XecureIT, Project: PeSanKita-android, Lines: 16, Source: WebRtcCallService.java

Example 2: onAddStream

import org.webrtc.AudioTrack; // import the package/class this method depends on
@Override
public void onAddStream(MediaStream stream) {
  Log.w(TAG, "onAddStream:" + stream);

  // Enable every remote audio track so the incoming audio is played.
  for (AudioTrack audioTrack : stream.audioTracks) {
    audioTrack.setEnabled(true);
  }

  // If exactly one remote video track arrived, enable it and attach the renderer.
  if (stream.videoTracks != null && stream.videoTracks.size() == 1) {
    VideoTrack videoTrack = stream.videoTracks.getFirst();
    videoTrack.setEnabled(true);
    videoTrack.addRenderer(new VideoRenderer(remoteRenderer));
  }
}
 
Developer ID: CableIM, Project: Cable-Android, Lines: 15, Source: WebRtcCallService.java

Example 3: setAudioEnabled

import org.webrtc.AudioTrack; // import the package/class this method depends on
@Override
public void setAudioEnabled(boolean enabled) {
	// Only act while connected and while a local media stream exists.
	if (mState != State.kConnected || lMS == null) {
		return;
	}

	// Mute or unmute every audio track on the local stream.
	for (AudioTrack audioTrack : lMS.audioTracks) {
		audioTrack.setEnabled(enabled);
	}
}
 
Developer ID: K-GmbH, Project: licodeAndroidClient, Lines: 11, Source: LicodeConnector.java
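For context, here is a hedged sketch of how a UI mute control might drive this method; the connector field and onMuteToggled helper are hypothetical, not part of LicodeConnector:

// Hypothetical caller that owns a LicodeConnector instance.
private LicodeConnector connector;
private boolean micMuted = false;

/** Invoked from a mute button's click handler. */
void onMuteToggled() {
    micMuted = !micMuted;
    // Muting via setEnabled keeps the track and the connection intact,
    // so unmuting does not require renegotiation.
    connector.setAudioEnabled(!micMuted);
}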

Example 4: onCreate

import org.webrtc.AudioTrack; // import the package/class this method depends on
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    setContentView(R.layout.activity_main);

    // Route audio as an in-call stream and play it through the speakerphone.
    AudioManager audioManager = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
    audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);
    audioManager.setSpeakerphoneOn(true);

    PeerConnectionFactory.initializeAndroidGlobals(
            this,  // Context
            true,  // Audio Enabled
            true,  // Video Enabled
            true,  // Hardware Acceleration Enabled
            null); // Render EGL Context

    peerConnectionFactory = new PeerConnectionFactory();

    // Capture video from the front-facing camera.
    VideoCapturerAndroid vc = VideoCapturerAndroid.create(VideoCapturerAndroid.getNameOfFrontFacingDevice(), null);

    localVideoSource = peerConnectionFactory.createVideoSource(vc, new MediaConstraints());
    VideoTrack localVideoTrack = peerConnectionFactory.createVideoTrack(VIDEO_TRACK_ID, localVideoSource);
    localVideoTrack.setEnabled(true);

    // Create the local audio track and enable it so the microphone is live.
    AudioSource audioSource = peerConnectionFactory.createAudioSource(new MediaConstraints());
    AudioTrack localAudioTrack = peerConnectionFactory.createAudioTrack(AUDIO_TRACK_ID, audioSource);
    localAudioTrack.setEnabled(true);

    localMediaStream = peerConnectionFactory.createLocalMediaStream(LOCAL_STREAM_ID);
    localMediaStream.addTrack(localVideoTrack);
    localMediaStream.addTrack(localAudioTrack);

    GLSurfaceView videoView = (GLSurfaceView) findViewById(R.id.glview_call);

    // Full-screen renderer for the remote peer, quarter-size overlay for the local preview.
    VideoRendererGui.setView(videoView, null);
    try {
        otherPeerRenderer = VideoRendererGui.createGui(0, 0, 100, 100, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
        VideoRenderer renderer = VideoRendererGui.createGui(50, 50, 50, 50, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
        localVideoTrack.addRenderer(renderer);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
 
Developer ID: Nitrillo, Project: krankygeek, Lines: 45, Source: MainActivity.java
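Note that live capture like this needs the CAMERA and RECORD_AUDIO permissions, and on API 23+ they must also be granted at runtime. A minimal hedged sketch of that check, using a hypothetical helper name:

import android.Manifest;
import android.content.pm.PackageManager;

// Hypothetical guard to consult before setting up capture in onCreate (API 23+).
private boolean hasMediaPermissions() {
    return checkSelfPermission(Manifest.permission.CAMERA)
                    == PackageManager.PERMISSION_GRANTED
            && checkSelfPermission(Manifest.permission.RECORD_AUDIO)
                    == PackageManager.PERMISSION_GRANTED;
}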

Example 5: doPublish

import org.webrtc.AudioTrack; // import the package/class this method depends on
/** begin streaming to server - MUST run on VcThread */
void doPublish(VideoStreamsView view) {
	// Already publishing; nothing to do.
	if (mVideoCapturer != null) {
		return;
	}

	// Constrain outgoing video to 320x240 at 10 fps.
	MediaConstraints videoConstraints = new MediaConstraints();
	videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
			"maxWidth", "320"));
	videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
			"maxHeight", "240"));
	videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair(
			"maxFrameRate", "10"));
	// Ask for echo cancellation and noise suppression on the audio source.
	MediaConstraints audioConstraints = new MediaConstraints();
	audioConstraints.optional.add(new MediaConstraints.KeyValuePair(
			"googEchoCancellation2", "true"));
	audioConstraints.optional.add(new MediaConstraints.KeyValuePair(
			"googNoiseSuppression", "true"));
	lMS = sFactory.createLocalMediaStream("ARDAMS");

	if (videoConstraints != null) {
		mVideoCapturer = getVideoCapturer();
		mVideoSource = sFactory.createVideoSource(mVideoCapturer,
				videoConstraints);
		VideoTrack videoTrack = sFactory.createVideoTrack("ARDAMSv0",
				mVideoSource);
		lMS.addTrack(videoTrack);
	}
	if (audioConstraints != null) {
		AudioTrack audioTrack = sFactory.createAudioTrack("ARDAMSa0",
				sFactory.createAudioSource(audioConstraints));
		lMS.addTrack(audioTrack);
		// Publish with the microphone muted; setEnabled(true) unmutes it later.
		audioTrack.setEnabled(false);
	}

	StreamDescription stream = new StreamDescription("", false, true, true,
			false, null, mNick);
	MediaConstraints pcConstraints = makePcConstraints();
	MyPcObserver pcObs = new MyPcObserver(new LicodeSdpObserver(stream,
			true), stream);

	PeerConnection pc = sFactory.createPeerConnection(mIceServers,
			pcConstraints, pcObs);
	pc.addStream(lMS, new MediaConstraints());

	stream.setMedia(lMS);
	if (view != null) {
		stream.attachRenderer(new VideoCallbacks(view,
				VideoStreamsView.LOCAL_STREAM_ID));
	}
	stream.initLocal(pc, pcObs.getSdpObserver());
}
 
Developer ID: K-GmbH, Project: licodeAndroidClient, Lines: 53, Source: LicodeConnector.java
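Because the audio track above is created disabled, the published stream starts muted. Unmuting later only takes setEnabled(true) on the same track, which is exactly what setAudioEnabled in Example 3 does for the whole stream; a brief sketch:

// Re-enable every audio track on the published local stream (compare Example 3).
for (AudioTrack audioTrack : lMS.audioTracks) {
    audioTrack.setEnabled(true); // unmute the outgoing audio
}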


Note: The org.webrtc.AudioTrack.setEnabled method examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by many developers, and copyright in the source code remains with the original authors. Consult the corresponding project's license before distributing or using the code; do not repost without permission.