

Java VideoCapturerAndroid.create Method Code Examples

This article collects typical usage examples of the Java method org.webrtc.VideoCapturerAndroid.create. If you are wondering what VideoCapturerAndroid.create does and how to call it, the selected examples below may help. You can also explore further usage of its enclosing class, org.webrtc.VideoCapturerAndroid.


The four code examples of VideoCapturerAndroid.create shown below are sorted by popularity by default.

Example 1: getVideoCapturer

import org.webrtc.VideoCapturerAndroid; // import the package/class this method depends on
private VideoCapturer getVideoCapturer(CameraVideoCapturer.CameraEventsHandler eventsHandler) {
    String[] cameraFacing = {"front", "back"};
    int[] cameraIndex = {0, 1};
    int[] cameraOrientation = {0, 90, 180, 270};
    for (String facing : cameraFacing) {
        for (int index : cameraIndex) {
            for (int orientation : cameraOrientation) {
                String name = "Camera " + index + ", Facing " + facing +
                        ", Orientation " + orientation;
                VideoCapturer capturer = VideoCapturerAndroid.create(name, eventsHandler);
                if (capturer != null) {
                    Log.d("Using camera: ", name);
                    return capturer;
                }
            }
        }
    }
    throw new RuntimeException("Failed to open capturer");
}
 
Developer: vivek1794, Project: webrtc-android-codelab, Lines: 20, Source: MainActivity.java
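The brute-force enumeration above can be sketched in plain, runnable Java. The `tryCreate` stub below is a hypothetical stand-in for `VideoCapturerAndroid.create` (which returns null when no camera matches the device name); the loop structure and name format mirror Example 1:

```java
import java.util.Arrays;
import java.util.List;

public class CameraProbe {
    // Hypothetical stand-in for VideoCapturerAndroid.create: succeeds only
    // when the generated name matches a "supported" device name.
    static String tryCreate(String name, List<String> supported) {
        return supported.contains(name) ? name : null;
    }

    // Mirrors the fallback loop: try every facing/index/orientation
    // combination and return the first name the factory accepts.
    static String probe(List<String> supported) {
        String[] facings = {"front", "back"};
        int[] indices = {0, 1};
        int[] orientations = {0, 90, 180, 270};
        for (String facing : facings) {
            for (int index : indices) {
                for (int orientation : orientations) {
                    String name = "Camera " + index + ", Facing " + facing
                            + ", Orientation " + orientation;
                    if (tryCreate(name, supported) != null) {
                        return name;
                    }
                }
            }
        }
        throw new RuntimeException("Failed to open capturer");
    }

    public static void main(String[] args) {
        List<String> supported =
                Arrays.asList("Camera 1, Facing back, Orientation 90");
        // Prints the first matching device name.
        System.out.println(probe(supported));
    }
}
```

The upside of this pattern is that it needs no enumeration API; the downside is 16 create attempts in the worst case, which is why later examples query the camera enumerator instead.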

Example 2: getVideoCapturer

import org.webrtc.VideoCapturerAndroid; // import the package/class this method depends on
private VideoCapturer getVideoCapturer() {
    String frontCameraDeviceName = CameraEnumerationAndroid.getDeviceName(0);
    return VideoCapturerAndroid.create(frontCameraDeviceName);
}
 
Developer: ardnezar, Project: webrtc-android, Lines: 5, Source: WebRtcClient.java

Example 3: onCreate

import org.webrtc.VideoCapturerAndroid; // import the package/class this method depends on
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    setContentView(R.layout.activity_main);

    AudioManager audioManager = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
    audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);
    audioManager.setSpeakerphoneOn(true);

    PeerConnectionFactory.initializeAndroidGlobals(
            this,  // Context
            true,  // Audio Enabled
            true,  // Video Enabled
            true,  // Hardware Acceleration Enabled
            null); // Render EGL Context

    peerConnectionFactory = new PeerConnectionFactory();

    VideoCapturerAndroid vc = VideoCapturerAndroid.create(VideoCapturerAndroid.getNameOfFrontFacingDevice(), null);

    localVideoSource = peerConnectionFactory.createVideoSource(vc, new MediaConstraints());
    VideoTrack localVideoTrack = peerConnectionFactory.createVideoTrack(VIDEO_TRACK_ID, localVideoSource);
    localVideoTrack.setEnabled(true);

    AudioSource audioSource = peerConnectionFactory.createAudioSource(new MediaConstraints());
    AudioTrack localAudioTrack = peerConnectionFactory.createAudioTrack(AUDIO_TRACK_ID, audioSource);
    localAudioTrack.setEnabled(true);

    localMediaStream = peerConnectionFactory.createLocalMediaStream(LOCAL_STREAM_ID);
    localMediaStream.addTrack(localVideoTrack);
    localMediaStream.addTrack(localAudioTrack);

    GLSurfaceView videoView = (GLSurfaceView) findViewById(R.id.glview_call);

    VideoRendererGui.setView(videoView, null);
    try {
        otherPeerRenderer = VideoRendererGui.createGui(0, 0, 100, 100, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
        VideoRenderer renderer = VideoRendererGui.createGui(50, 50, 50, 50, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
        localVideoTrack.addRenderer(renderer);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
 
Developer: Nitrillo, Project: krankygeek, Lines: 45, Source: MainActivity.java

Example 4: createLocalMediaStream

import org.webrtc.VideoCapturerAndroid; // import the package/class this method depends on
void createLocalMediaStream(Object renderEGLContext, final VideoRenderer.Callbacks localRender) {
    if (factory == null) {
        Log.e(TAG, "Peerconnection factory is not created");
        return;
    }
    this.localRender = localRender;
    if (videoCallEnabled) {
        factory.setVideoHwAccelerationOptions(renderEGLContext, renderEGLContext);
    }

    // Set default WebRTC tracing and INFO libjingle logging.
    // NOTE: this _must_ happen while |factory| is alive!
    Logging.enableTracing("logcat:", EnumSet.of(Logging.TraceLevel.TRACE_DEFAULT), Logging.Severity.LS_INFO);

    localMediaStream = factory.createLocalMediaStream("ARDAMS");

    // If video call is enabled and the device has camera(s)
    if (videoCallEnabled && numberOfCameras > 0) {
        String cameraDeviceName; // = CameraEnumerationAndroid.getDeviceName(0);
        String frontCameraDeviceName = CameraEnumerationAndroid.getNameOfFrontFacingDevice();
        String backCameraDeviceName = CameraEnumerationAndroid.getNameOfBackFacingDevice();

        // If current camera is set to front and the device has one
        if (currentCameraPosition==NBMCameraPosition.FRONT && frontCameraDeviceName!=null) {
            cameraDeviceName = frontCameraDeviceName;
        }
        // If current camera is set to back and the device has one
        else if (currentCameraPosition==NBMCameraPosition.BACK && backCameraDeviceName!=null) {
            cameraDeviceName = backCameraDeviceName;
        }
        // If current camera is set to any then we pick the first camera of the device, which
        // should be a back-facing camera according to libjingle API
        else {
            cameraDeviceName = CameraEnumerationAndroid.getDeviceName(0);
            currentCameraPosition = NBMCameraPosition.BACK;
        }

        Log.d(TAG, "Opening camera: " + cameraDeviceName);
        videoCapturer = VideoCapturerAndroid.create(cameraDeviceName, null);
        if (videoCapturer == null) {
            Log.d(TAG, "Error while opening camera");
            return;
        }
        localMediaStream.addTrack(createCapturerVideoTrack(videoCapturer));
    }

    // Create audio track
    localMediaStream.addTrack(factory.createAudioTrack(AUDIO_TRACK_ID, factory.createAudioSource(audioConstraints)));

    Log.d(TAG, "Local media stream created.");
}
 
Developer: nubomedia-vtt, Project: webrtcpeer-android, Lines: 52, Source: MediaResourceManager.java
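The camera-selection branch in Example 4 can be isolated into a small, runnable sketch. The `Position` enum and the nullable device-name parameters below are hypothetical simplifications of `NBMCameraPosition` and the `CameraEnumerationAndroid` lookups; the branching logic itself mirrors the example:

```java
public class CameraSelector {
    // Hypothetical simplification of NBMCameraPosition.
    enum Position { FRONT, BACK, ANY }

    // Mirrors Example 4's selection: prefer the requested facing if that
    // camera exists, otherwise fall back to the first enumerated device
    // (which, per the libjingle API, should be a back-facing camera).
    static String select(Position wanted, String front, String back, String first) {
        if (wanted == Position.FRONT && front != null) {
            return front;
        } else if (wanted == Position.BACK && back != null) {
            return back;
        }
        return first;
    }

    public static void main(String[] args) {
        // Requested front camera exists: it is chosen.
        System.out.println(select(Position.FRONT, "front-cam", "back-cam", "back-cam"));
        // Requested front camera is missing: fall back to the first device.
        System.out.println(select(Position.FRONT, null, "back-cam", "back-cam"));
    }
}
```

Note that in the real method the fallback branch also resets `currentCameraPosition` to `BACK`, so later camera switches start from a known state.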


Note: The org.webrtc.VideoCapturerAndroid.create examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are drawn from community open-source projects; copyright remains with the original authors, and distribution and use are subject to each project's license. Do not reproduce without permission.