

Java CaptureResult.get Method Code Examples

This article collects typical usage examples of the Java method android.hardware.camera2.CaptureResult.get. If you are wondering what CaptureResult.get does, how to call it, or what real-world usage looks like, the curated code examples below should help. You can also explore other usage examples of the enclosing class, android.hardware.camera2.CaptureResult.


Eight code examples of CaptureResult.get are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your ratings help the system recommend better Java code examples.
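Before diving into the examples, here is a minimal sketch of the typical call pattern: CaptureResult.get is usually invoked from a CameraCaptureSession.CaptureCallback, and it returns null whenever the requested key is absent from the result. The field and tag names below are illustrative assumptions, not taken from any of the quoted projects.

import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.util.Log;

private static final String TAG = "CaptureResultDemo"; // assumed tag name

private final CameraCaptureSession.CaptureCallback mCaptureCallback =
        new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request,
            TotalCaptureResult result) {
        // get() returns null if the key is not present, so always null-check the boxed value.
        Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
        Long exposureNs = result.get(CaptureResult.SENSOR_EXPOSURE_TIME);
        if (afState != null) {
            Log.d(TAG, "AF state: " + afState);
        }
        if (exposureNs != null) {
            Log.d(TAG, "Exposure time (ns): " + exposureNs);
        }
    }
};

The same pattern applies to onCaptureProgressed, where the partial CaptureResult typically carries only a subset of the keys.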

Example 1: detectFaces

import android.hardware.camera2.CaptureResult; // import the package/class this method depends on
private void detectFaces(CaptureResult captureResult) {
    Integer mode = captureResult.get(CaptureResult.STATISTICS_FACE_DETECT_MODE);

    if (isViewAvailable() && mode != null) {
        android.hardware.camera2.params.Face[] faces = captureResult.get(CaptureResult.STATISTICS_FACES);
        if (faces != null) {
            Log.i(TAG, "faces : " + faces.length + " , mode : " + mode);
            for (android.hardware.camera2.params.Face face : faces) {
                Rect faceBounds = face.getBounds();
                // Once processed, the result is sent back to the View
                presenterView.onFaceDetected(mapCameraFaceToCanvas(faceBounds, face.getLeftEyePosition(),
                        face.getRightEyePosition()));
            }
        }
    }
}
 
Developer ID: raulh82vlc, Project: Image-Detection-Samples, Lines: 17, Source file: FDCamera2Presenter.java

Example 2: process

import android.hardware.camera2.CaptureResult; // import the package/class this method depends on
private void process(CaptureResult captureResult) {
    switch (mCaptureState) {
        case STATE_PREVIEW:
            // Do nothing
            break;
        case STATE_WAIT_LOCK:
            mCaptureState = STATE_PREVIEW;
            Integer afState = captureResult.get(CaptureResult.CONTROL_AF_STATE);
            // CONTROL_AF_STATE can be null on some devices; check before unboxing.
            if (afState != null &&
                    (afState == CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED ||
                     afState == CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED)) {
                Toast.makeText(getApplicationContext(), "AF Locked!", Toast.LENGTH_SHORT).show();
                startStillCaptureRequest();
            }
            break;
    }
}
 
Developer ID: mobapptuts, Project: android_camera2_api_video_app, Lines: 17, Source file: Camera2VideoImageActivity.java

Example 3: checkControlAfState

import android.hardware.camera2.CaptureResult; // import the package/class this method depends on
/**
 * Complain if CONTROL_AF_STATE is not present in result.
 * Could indicate bug in API implementation.
 */
public static boolean checkControlAfState(CaptureResult result) {
    boolean missing = result.get(CaptureResult.CONTROL_AF_STATE) == null;
    if (missing) {
        // throw new IllegalStateException("CaptureResult missing CONTROL_AF_STATE.");
        Log.e(TAG, "\n!!!! TotalCaptureResult missing CONTROL_AF_STATE. !!!!\n ");
    }
    return !missing;
}
 
Developer ID: jameliu, Project: Camera2, Lines: 13, Source file: AutoFocusHelper.java

Example 4: checkLensState

import android.hardware.camera2.CaptureResult; // import the package/class this method depends on
/**
 * Complain if LENS_STATE is not present in result.
 * Could indicate bug in API implementation.
 */
public static boolean checkLensState(CaptureResult result) {
    boolean missing = result.get(CaptureResult.LENS_STATE) == null;
    if (missing) {
        // throw new IllegalStateException("CaptureResult missing LENS_STATE.");
        Log.e(TAG, "\n!!!! TotalCaptureResult missing LENS_STATE. !!!!\n ");
    }
    return !missing;
}
 
Developer ID: jameliu, Project: Camera2, Lines: 13, Source file: AutoFocusHelper.java

Example 5: onCaptureProgressed

import android.hardware.camera2.CaptureResult; // import the package/class this method depends on
@Override
public void onCaptureProgressed(CameraCaptureSession session, CaptureRequest request,
        final CaptureResult partialResult) {
    long frameNumber = partialResult.getFrameNumber();

    // Update mMetadata for whichever keys are present, if this frame is
    // supplying newer values.
    for (final Key<?> key : partialResult.getKeys()) {
        Pair<Long, Object> oldEntry = mMetadata.get(key);
        final Object oldValue = (oldEntry != null) ? oldEntry.second : null;

        boolean newerValueAlreadyExists = oldEntry != null
                && frameNumber < oldEntry.first;
        if (newerValueAlreadyExists) {
            continue;
        }

        final Object newValue = partialResult.get(key);
        mMetadata.put(key, new Pair<Long, Object>(frameNumber, newValue));

        // If the value has changed, call the appropriate listeners, if
        // any exist.
        if (oldValue == newValue || !mMetadataChangeListeners.containsKey(key)) {
            continue;
        }

        for (final MetadataChangeListener listener :
                mMetadataChangeListeners.get(key)) {
            Log.v(TAG, "Dispatching to metadata change listener for key: "
                    + key.toString());
            mListenerHandler.post(new Runnable() {
                @Override
                public void run() {
                    listener.onImageMetadataChange(key, oldValue, newValue,
                            partialResult);
                }
            });
        }
    }
}
 
Developer ID: jameliu, Project: Camera2, Lines: 41, Source file: ImageCaptureManager.java

Example 6: autofocusStateChangeDispatcher

import android.hardware.camera2.CaptureResult; // import the package/class this method depends on
/**
 * This method takes appropriate action if camera2 AF state changes.
 * <ol>
 * <li>Reports changes in camera2 AF state to OneCamera.FocusStateListener.</li>
 * <li>Takes a picture after the AF scan if mTakePictureWhenLensIsStopped is true.</li>
 * </ol>
 */
private void autofocusStateChangeDispatcher(CaptureResult result) {
    if (result.getFrameNumber() < mLastControlAfStateFrameNumber ||
            result.get(CaptureResult.CONTROL_AF_STATE) == null) {
        return;
    }
    mLastControlAfStateFrameNumber = result.getFrameNumber();

    // Convert to OneCamera mode and state.
    AutoFocusState resultAFState = AutoFocusHelper.
            stateFromCamera2State(result.get(CaptureResult.CONTROL_AF_STATE));

    // TODO: Consider using LENS_STATE.
    boolean lensIsStopped = resultAFState == AutoFocusState.ACTIVE_FOCUSED ||
            resultAFState == AutoFocusState.ACTIVE_UNFOCUSED ||
            resultAFState == AutoFocusState.PASSIVE_FOCUSED ||
            resultAFState == AutoFocusState.PASSIVE_UNFOCUSED;

    if (mTakePictureWhenLensIsStopped && lensIsStopped) {
        // Take the shot.
        mCameraHandler.post(mTakePictureRunnable);
        mTakePictureWhenLensIsStopped = false;
    }

    // Report state change when AF state has changed.
    if (resultAFState != mLastResultAFState && mFocusStateListener != null) {
        mFocusStateListener.onFocusStatusUpdate(resultAFState, result.getFrameNumber());
    }
    mLastResultAFState = resultAFState;
}
 
Developer ID: jameliu, Project: Camera2, Lines: 37, Source file: OneCameraImpl.java

Example 7: process

import android.hardware.camera2.CaptureResult; // import the package/class this method depends on
private void process(CaptureResult result) {
    // CONTROL_AF_STATE can be absent on some devices; null-check before unboxing.
    Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
    if (afState != null && afState == CaptureResult.CONTROL_AF_STATE_PASSIVE_FOCUSED) {
        areWeFocused = true;
        getActivity().runOnUiThread(new Runnable() {
            @Override
            public void run() {
                button.setBackgroundColor(getActivity().getResources().getColor(R.color.blue));
                button.setText("Focused");
            }
        });
    } else {
        areWeFocused = false;
        getActivity().runOnUiThread(new Runnable() {
            @Override
            public void run() {
                button.setBackgroundColor(getActivity().getResources().getColor(R.color.colorAccent));
                button.setText("Not focused");
            }
        });
    }

    if (shouldCapture && areWeFocused) {
        shouldCapture = false;
        captureStillPicture();
    }

//            switch (mState) {
//                case STATE_PREVIEW: {
//                    Log.d(TAG, "STATE_PREVIEW");
//                    // We have nothing to do when the camera preview is working normally.
//                    break;
//                }
//                case STATE_WAITING_LOCK: {
//                    Log.d(TAG, "STATE_WAITING_LOCK");
//                    Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
//                    if (afState == null) {
//                        captureStillPicture();
//                    } else if (CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED == afState ||
//                            CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED == afState ||
//                            CaptureResult.CONTROL_AF_STATE_INACTIVE == afState /*add this*/) {
//                        Log.d(TAG, "STATE_WAITING_LOCK222");
//                        // CONTROL_AE_STATE can be null on some devices
//                        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
//                        if (aeState == null || aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED) {
//                            mState = STATE_PICTURE_TAKEN;
//                            captureStillPicture();
//                        } else {
//                            runPrecaptureSequence();
//                        }
//                    }
//                    break;
//                }
//                case STATE_WAITING_PRECAPTURE: {
//                    Log.d(TAG, "STATE_WAITING_PRECAPTURE");
//                    // CONTROL_AE_STATE can be null on some devices
//                    Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
//                    if (aeState == null ||
//                            aeState == CaptureResult.CONTROL_AE_STATE_PRECAPTURE ||
//                            aeState == CaptureRequest.CONTROL_AE_STATE_FLASH_REQUIRED) {
//                        mState = STATE_WAITING_NON_PRECAPTURE;
//                    }
//                    break;
//                }
//                case STATE_WAITING_NON_PRECAPTURE: {
//                    Log.d(TAG, "STATE_WAITING_NON_PRECAPTURE");
//                    // CONTROL_AE_STATE can be null on some devices
//                    Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
//                    if (aeState == null || aeState != CaptureResult.CONTROL_AE_STATE_PRECAPTURE) {
//                        mState = STATE_PICTURE_TAKEN;
//                        captureStillPicture();
//                    }
//                    break;
//                }
//            }

}
 
Developer ID: vulovicv23, Project: opencv-documentscanner-android, Lines: 81, Source file: Camera2BasicFragment.java

Example 8: process

import android.hardware.camera2.CaptureResult; // import the package/class this method depends on
private void process(CaptureResult result) {
    synchronized (mCameraStateLock) {
        switch (mState) {
            case STATE_PREVIEW: {
                // We have nothing to do when the camera preview is running normally.
                break;
            }
            case STATE_WAITING_FOR_3A_CONVERGENCE: {
                boolean readyToCapture = true;
                if (!mNoAFRun) {
                    Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
                    if (afState == null) {
                        break;
                    }

                    // If auto-focus has reached locked state, we are ready to capture
                    readyToCapture =
                            (afState == CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED ||
                                    afState == CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED);
                }

                // If we are running on a non-legacy device, we should also wait until
                // auto-exposure and auto-white-balance have converged as well before
                // taking a picture.
                if (!CameraDeviceCapability.isLegacyLocked(mCharacteristics)) {
                    Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                    Integer awbState = result.get(CaptureResult.CONTROL_AWB_STATE);
                    if (aeState == null || awbState == null) {
                        break;
                    }

                    readyToCapture = readyToCapture &&
                            aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED &&
                            awbState == CaptureResult.CONTROL_AWB_STATE_CONVERGED;
                }

                // If we haven't finished the pre-capture sequence but have hit our maximum
                // wait timeout, too bad! Begin capture anyway.
                if (!readyToCapture && hitTimeoutLocked()) {
                    Log.w(TAG, "Timed out waiting for pre-capture sequence to complete.");
                    readyToCapture = true;
                }

                if (readyToCapture && mPendingUserCaptures > 0) {
                    // Capture once for each user tap of the "Picture" button.
                    while (mPendingUserCaptures > 0) {
                        captureStillPictureLocked();
                        mPendingUserCaptures--;
                    }
                    // After this, the camera will go back to the normal state of preview.
                    mState = STATE_PREVIEW;
                }
            }
        }
    }
}
 
Developer ID: OkayCamera, Project: OkayCamera-Android, Lines: 57, Source file: Camera2RawFragment.java


Note: The android.hardware.camera2.CaptureResult.get examples in this article were compiled by 純淨天空 from open-source code and documentation platforms such as GitHub and MSDocs. The code snippets were selected from open-source projects contributed by various developers; copyright of the source code remains with the original authors. Please follow the corresponding project's license when distributing or using the code, and do not reproduce this article without permission.