android - Using the Camera2 API with ImageReader

Tags: android, android-camera

I am trying to capture image data using the Camera2 API on a Galaxy S4. An ImageReader is used as the surface provider. I have tried both ImageFormat.YV12 and ImageFormat.YUV_420_888 as the image format, with the same results.

The setup seems fine, and I am able to acquire Images from the ImageReader. Each Image has 3 planes, and the buffers are the expected sizes: Width*Height for the Y plane and (Width*Height)/4 for each of the other two planes.

The problem is that I cannot get the data out correctly, in two respects. The first issue is that the Y-plane data is mirrored. That can be worked around, but it is strange, so I am curious whether it is expected.
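As an aside, if the mirrored luma really is what the device delivers, un-mirroring a tightly packed Y plane is just a per-row byte reversal. A minimal sketch in plain Java so it can run off-device; the class name and the 3x2 demo plane are illustrative, not part of the question's code:

```java
import java.util.Arrays;

public class UnmirrorY {
    // Horizontally flip each row of a tightly packed, row-major
    // luma (Y) plane in place.
    static void unmirrorRows(byte[] y, int width, int height) {
        for (int row = 0; row < height; row++) {
            int left = row * width;
            int right = left + width - 1;
            while (left < right) {
                byte tmp = y[left];
                y[left++] = y[right];
                y[right--] = tmp;
            }
        }
    }

    public static void main(String[] args) {
        byte[] plane = {1, 2, 3, 4, 5, 6}; // a 3x2 test plane
        unmirrorRows(plane, 3, 2);
        System.out.println(Arrays.toString(plane)); // [3, 2, 1, 6, 5, 4]
    }
}
```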

Worse, the other two planes do not seem to deliver data correctly at all. For example, with an image size of 640x480, which gives U and V buffers of 76800 bytes each, only the first 320 bytes of each buffer are non-zero. This number varies, and does not seem to follow a fixed ratio across different image sizes, but it is consistent between images of the same size.
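For what it's worth, a pattern like this (only a small prefix of a chroma buffer populated) is what you would see if the plane were read as tightly packed when it actually has a row stride or pixel stride larger than the plane width. A hedged sketch of a stride-aware copy, in plain Java so it can run off-device; `planeData` and the stride values in the demo are stand-ins for what `Image.Plane#getBuffer()`, `getRowStride()`, and `getPixelStride()` would report on a device:

```java
public class PackPlane {
    // Copy one image plane into a tightly packed array, honoring the
    // row stride and pixel stride reported for that plane.
    static byte[] packPlane(byte[] planeData, int rowStride, int pixelStride,
                            int planeWidth, int planeHeight) {
        byte[] out = new byte[planeWidth * planeHeight];
        int o = 0;
        for (int row = 0; row < planeHeight; row++) {
            int base = row * rowStride;
            for (int col = 0; col < planeWidth; col++) {
                out[o++] = planeData[base + col * pixelStride];
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // A 2x2 chroma plane stored with rowStride 4 and pixelStride 2:
        byte[] strided = {10, 0, 11, 0, 20, 0, 21, 0};
        byte[] tight = packPlane(strided, 4, 2, 2, 2);
        System.out.println(java.util.Arrays.toString(tight)); // [10, 11, 20, 21]
    }
}
```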

I am wondering whether I am missing something in my use of this API. The code is below.

public class OnboardCamera {
  private final String TAG = "OnboardCamera";

  int mWidth = 1280;
  int mHeight = 720;
  int mYSize = mWidth*mHeight;
  int mUVSize = mYSize/4;
  int mFrameSize = mYSize+(mUVSize*2); 

  //handler for the camera
  private HandlerThread mCameraHandlerThread;
  private Handler mCameraHandler;

  //the size of the ImageReader determines the output from the camera.
  private ImageReader mImageReader = ImageReader.newInstance(mWidth, mHeight, ImageFormat.YV12, 30);

  private Surface mCameraRecieverSurface = mImageReader.getSurface();
  {
      mImageReader.setOnImageAvailableListener(mImageAvailListener, mCameraHandler);
  }

  private byte[] tempYbuffer = new byte[mYSize];
  private byte[] tempUbuffer = new byte[mUVSize];
  private byte[] tempVbuffer = new byte[mUVSize];

  ImageReader.OnImageAvailableListener mImageAvailListener = new ImageReader.OnImageAvailableListener() {
      @Override
      public void onImageAvailable(ImageReader reader) {
          //when a buffer is available from the camera
          //get the image
          Image image = reader.acquireNextImage();
          Image.Plane[] planes = image.getPlanes();

          //copy it into a byte[]
          byte[] outFrame = new byte[mFrameSize];
          int outFrameNextIndex = 0;


          ByteBuffer sourceBuffer = planes[0].getBuffer();
          sourceBuffer.get(tempYbuffer, 0, tempYbuffer.length);

          //note: Image.getPlanes() documents the chroma order as U then V
          //(plane 1 = U, plane 2 = V) for YUV_420_888
          ByteBuffer uByteBuf = planes[1].getBuffer();
          uByteBuf.get(tempUbuffer);

          ByteBuffer vByteBuf = planes[2].getBuffer();
          vByteBuf.get(tempVbuffer);

          //free the Image
          image.close();
      }
  };


  OnboardCamera() {
      mCameraHandlerThread = new HandlerThread("mCameraHandlerThread");
      mCameraHandlerThread.start();
      mCameraHandler = new Handler(mCameraHandlerThread.getLooper());

  }




  @Override
  public boolean startProducing() {
      CameraManager cm = (CameraManager) Ten8Application.getAppContext().getSystemService(Context.CAMERA_SERVICE);
      try {
          String[] cameraList = cm.getCameraIdList();
          for (String cd: cameraList) {
              //get camera characteristics
              CameraCharacteristics mCameraCharacteristics = cm.getCameraCharacteristics(cd);

              //check if the camera is in the back - if not, continue to next
              if (mCameraCharacteristics.get(CameraCharacteristics.LENS_FACING) != CameraCharacteristics.LENS_FACING_BACK) {
                  continue;
              }

              //get StreamConfigurationMap - supported image formats
              StreamConfigurationMap scm = mCameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);

              android.util.Size[] sizes =  scm.getOutputSizes(ImageFormat.YV12);

              cm.openCamera(cd, mDeviceStateCallback, mCameraHandler);
          }

      } catch (CameraAccessException e) {
          e.printStackTrace();
          Log.e(TAG, "CameraAccessException detected", e);
      }
      return false;
  }

  private final CameraDevice.StateCallback mDeviceStateCallback = new CameraDevice.StateCallback() {
      @Override
      public void onOpened(CameraDevice camera) {
          //make list of surfaces to give to camera
          List<Surface> surfaceList = new ArrayList<>();
          surfaceList.add(mCameraRecieverSurface);

          try {
              camera.createCaptureSession(surfaceList, mCaptureSessionStateCallback, mCameraHandler); 
          } catch (CameraAccessException e) {
              Log.e(TAG, "createCaptureSession threw CameraAccessException.", e);
          }
      }

      @Override
      public void onDisconnected(CameraDevice camera) {

      }

      @Override
      public void onError(CameraDevice camera, int error) {

      }
  };

  private final CameraCaptureSession.StateCallback mCaptureSessionStateCallback = new CameraCaptureSession.StateCallback() {
      @Override
      public void onConfigured(CameraCaptureSession session) {
          try {
              CaptureRequest.Builder requestBuilder = session.getDevice().createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
              requestBuilder.addTarget(mCameraRecieverSurface);
              //set to null - image data will be produced but will not receive metadata
              session.setRepeatingRequest(requestBuilder.build(), null, mCameraHandler); 

          } catch (CameraAccessException e) {
              Log.e(TAG, "createCaptureSession threw CameraAccessException.", e);
          }


      }

      @Override
      public void onConfigureFailed(CameraCaptureSession session) {

      }
  };
}

Best Answer

I had the same problem, and I believe the issue was in Android API 21. I upgraded to API 23 and the same code worked fine. I also tested on API 22, and it worked there as well.
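Since the accepted fix is effectively "run on a newer API level", it may still be worth sanity-checking the plane geometry at runtime instead of assuming tight packing. A hedged sketch of that arithmetic in plain Java (the stride values are illustrative; on a device they would come from `Image.Plane#getRowStride()` and `#getPixelStride()`):

```java
public class PlaneCapacity {
    // Minimum buffer size implied by a plane's geometry: the last
    // sampled byte sits at offset rowStride*(rows-1) + pixelStride*(cols-1).
    static int minPlaneCapacity(int rowStride, int pixelStride, int cols, int rows) {
        return rowStride * (rows - 1) + pixelStride * (cols - 1) + 1;
    }

    public static void main(String[] args) {
        // Tightly packed 320x240 chroma plane (pixelStride 1):
        System.out.println(minPlaneCapacity(320, 1, 320, 240)); // 76800
        // Same plane padded to a 384-byte row stride:
        System.out.println(minPlaneCapacity(384, 1, 320, 240)); // 92096
    }
}
```

If the actual buffer capacity exceeds this minimum for a nominally tight plane, the extra bytes are padding and a naive bulk `get()` will mix them into the pixel data.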

Regarding "android - Using the Camera2 API with ImageReader", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/31350451/
