java - Running CameraBridgeViewBase as a service for background motion detection with OpenCV

Tags: java android opencv service camera

So I'm trying to write part of the code for a robot that will eventually drive around our office.

The tablet we're using does face recognition and at the same time shows an animated face to greet visitors.

At this point I have a simple app that does some basic motion detection while it runs in the foreground.

Now I'm running into trouble, because I have to turn the app into a service so it can keep running in the background while the "animated face app" runs in the foreground.

The biggest and last hurdle I've run into is this:

Because the Activity is now a Service, I can't seem to find a way to instantiate the CameraBridgeViewBase in the service part:

mOpenCvCameraView = (JavaCameraView) findViewById(R.id.show_camera_activity_java_surface_view);

This throws an error, because findViewById is not a method you can call from a Service.

I've tried a lot of things, such as making CameraBridgeViewBase Serializable or Parcelable and passing it to the service as an object attached to the Intent, but without success.
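(Side note, not from the original post: an Android View generally can't be serialized and shipped through an Intent, so the usual pattern is to pass only plain configuration values and let the service build its own camera pipeline. A rough sketch of that idea, with extra names that are made up purely for illustration:)

// Hedged sketch with made-up extra names: pass plain values instead of the View itself.
// In the Activity:
Intent i = new Intent(this, MainActivity_show_camera_service.class);
i.putExtra("frame_width", 400);
i.putExtra("frame_height", 300);
startService(i);

// In the Service:
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
    int width = intent.getIntExtra("frame_width", 480);
    int height = intent.getIntExtra("frame_height", 320);
    // build the camera / OpenCV pipeline from these values, without touching any View
    return START_STICKY;
}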

This is from Main_activity_show_camera_service.java:

public int onStartCommand(Intent intent, int flags,int startId) {
    mOpenCvCameraView.enableView();
    mOpenCvCameraView = (JavaCameraView) findViewById(R.id.show_camera_activity_java_surface_view);
    mOpenCvCameraView.setMaxFrameSize(400,300);
    mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);

    if (!OpenCVLoader.initDebug()) {
        Log.d(TAG, "Internal OpenCV library not found. Using OpenCV Manager for initialization");
        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_3_0_0, this, mLoaderCallback);
    } else {
        Log.d(TAG, "OpenCV library found inside package. Using it!");
        mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
    }
    return START_STICKY;
}

This crashes on the findViewById call.

This is just the function behind the start-service button:

public void startService(View view) {
    Intent i = new Intent(this, MainActivity_show_camera_service.class);
    startService(i);
}
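(Another side note from me, not part of the original question: on Android 8.0 and newer a long-running background service is expected to be started as a foreground service, and recent Android releases only let foreground apps and foreground services use the camera, so a launcher like the one above would probably end up looking more like this sketch, with the service calling startForeground() with a notification shortly after it starts:)

// Hedged sketch: promote the detector to a foreground service on API 26+ so it can
// keep using the camera while another app is in front.
Intent i = new Intent(this, MainActivity_show_camera_service.class);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
    startForegroundService(i);  // the service must call startForeground() soon afterwards
} else {
    startService(i);
}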

I know there are plenty of other libraries and approaches that achieve the same result, and I will definitely look into those, but I'd just like to be able to do it with OpenCV.

Best answer

Just to help anyone who might run into this in the future: this is the service code I ended up with, which works for me. It uses the newer camera2 API.

package com.example.stemmeriky.testapplication;
//imports 

public class MainActivity_show_camera_service extends Service  {
private List<MatOfPoint> contours;
private int mWidth = 480;
private int mHeight = 320;
private Mat mFGMask;
private BackgroundSubtractorMOG2 fgbg;
private Handler mCameraHandler;
private static int counter = 0;
private ImageReader mImageReader = ImageReader.newInstance(mWidth, mHeight, ImageFormat.YUV_420_888, 2);
public int threshold = 650;
private Surface mCameraRecieverSurface = mImageReader.getSurface();


private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
    @Override
    public void onManagerConnected(int status) {
        switch (status) {
            case LoaderCallbackInterface.SUCCESS: {
                fgbg = Video.createBackgroundSubtractorMOG2();
                contours = new ArrayList<>();
            }
            break;
            default: {
                super.onManagerConnected(status);
            }
            break;
        }
    }
};

// Feed the frame to the MOG2 background subtractor and count the foreground
// contours; a large contour count is treated as motion and the frame is saved.
public void convertImageToMat(Mat inputFrame) throws FileNotFoundException {
    contours.clear();
    fgbg.apply(inputFrame, mFGMask, 0.1);
    Imgproc.findContours(mFGMask, contours, new Mat(), Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_NONE);

    if (contours.size() > threshold) {
        System.out.println("movement detected");
        savePicture(inputFrame);
    }
}


// Converts a camera2 YUV_420_888 Image into a single-channel Mat laid out as the
// full-resolution Y plane followed by the subsampled chroma planes (height * 3/2 rows).
public static Mat imageToMat(Image image) {
    ByteBuffer buffer;
    int rowStride;
    int pixelStride;
    int width = image.getWidth();
    int height = image.getHeight();
    int offset = 0;

    Image.Plane[] planes = image.getPlanes();
    byte[] data = new byte[image.getWidth() * image.getHeight() * ImageFormat.getBitsPerPixel(ImageFormat.YUV_420_888) / 8];
    byte[] rowData = new byte[planes[0].getRowStride()];

    for (int i = 0; i < planes.length; i++) {
        buffer = planes[i].getBuffer();
        rowStride = planes[i].getRowStride();
        pixelStride = planes[i].getPixelStride();
        int w = (i == 0) ? width : width / 2;
        int h = (i == 0) ? height : height / 2;
        for (int row = 0; row < h; row++) {
            int bytesPerPixel = ImageFormat.getBitsPerPixel(ImageFormat.YUV_420_888) / 8;
            if (pixelStride == bytesPerPixel) {
                int length = w * bytesPerPixel;
                buffer.get(data, offset, length);

                // Advance buffer the remainder of the row stride, unless on the last row.
                // Otherwise, this will throw an IllegalArgumentException because the buffer
                // doesn't include the last padding.
                if (h - row != 1) {
                    buffer.position(buffer.position() + rowStride - length);
                }
                offset += length;
            } else {

                // On the last row only read the width of the image minus the pixel stride
                // plus one. Otherwise, this will throw a BufferUnderflowException because the
                // buffer doesn't include the last padding.
                if (h - row == 1) {
                    buffer.get(rowData, 0, width - pixelStride + 1);
                } else {
                    buffer.get(rowData, 0, rowStride);
                }

                for (int col = 0; col < w; col++) {
                    data[offset++] = rowData[col * pixelStride];
                }
            }
        }
    }

    // Finally, create the Mat.
    Mat mat = new Mat(height + height / 2, width, CvType.CV_8UC1);
    mat.put(0, 0, data);

    return mat;
}

public void savePicture(Mat image) throws FileNotFoundException {
    File storage = Environment.getExternalStorageDirectory();
    File RobotPhotoDirectory = new File(storage + "/Pictures/RobotPhotoDirectory/");
    if (!RobotPhotoDirectory.exists()) {
        RobotPhotoDirectory.mkdir();
    }
    String dir_path = RobotPhotoDirectory + "/";
    Imgcodecs.imwrite(dir_path + File.separator + "testfile" + counter + ".jpg", image);
    counter++;
}

// Called whenever the ImageReader has a new camera frame.
ImageReader.OnImageAvailableListener mImageAvailListener = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireLatestImage();
        // acquireLatestImage() can return null if no new frame is ready yet
        if (image == null) {
            return;
        }
        try {
            convertImageToMat(imageToMat(image));
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
        image.close();
    }
};

MainActivity_show_camera_service() {
    // Run camera and frame callbacks on a dedicated background thread.
    HandlerThread mCameraHandlerThread = new HandlerThread("mCameraHandlerThread");
    mCameraHandlerThread.start();
    mCameraHandler = new Handler(mCameraHandlerThread.getLooper());
    // Register the frame listener only once the handler exists; an instance initializer
    // block would run before the constructor body, while mCameraHandler is still null.
    mImageReader.setOnImageAvailableListener(mImageAvailListener, mCameraHandler);
}

// Opens the front-facing camera and streams its frames into the ImageReader surface.
@SuppressLint("MissingPermission")
public void startProducing() {
    CameraManager cm = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
    try {
        String[] cameraList = cm.getCameraIdList();
        for (String cd : cameraList) {
            CameraCharacteristics mCameraCharacteristics = cm.getCameraCharacteristics(cd);
            if (mCameraCharacteristics.get(CameraCharacteristics.LENS_FACING) != CameraCharacteristics.LENS_FACING_FRONT) {
                continue;
            }
            cm.openCamera(cd, mDeviceStateCallback, mCameraHandler);
        }
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}


private final CameraDevice.StateCallback mDeviceStateCallback = new CameraDevice.StateCallback() {
    @Override
    public void onOpened(@NonNull CameraDevice camera) {
        List<Surface> surfaceList = new ArrayList<>();
        surfaceList.add(mCameraRecieverSurface);

        try {
            camera.createCaptureSession(surfaceList, mCaptureSessionStateCallback, mCameraHandler);
        } catch (CameraAccessException ignored) {
        }
    }
    @Override
    public void onDisconnected(@NonNull CameraDevice camera) {
    }
    @Override
    public void onError(@NonNull CameraDevice camera, int error) {

    }
};


@Override
public void onCreate() {
    System.out.println("onCreate");
    super.onCreate();
    if (!OpenCVLoader.initDebug()) {
        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_3_0_0, this, mLoaderCallback);
    } else {
        mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
    }
    mFGMask = new Mat();
    startProducing();
}


@Nullable
@Override
public IBinder onBind(Intent intent) {
    return null;
}

private final CameraCaptureSession.StateCallback mCaptureSessionStateCallback = new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(@NonNull CameraCaptureSession session) {
        try {
            CaptureRequest.Builder requestBuilder = session.getDevice().createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
            requestBuilder.addTarget(mCameraRecieverSurface);
            //set to null - image data will be produced but will not receive metadata
            session.setRepeatingRequest(requestBuilder.build(), null, mCameraHandler);

        } catch (CameraAccessException e) {
        }
    }

    @Override
    public void onConfigureFailed(@NonNull CameraCaptureSession session) {
    }
};
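One usage note from me (hedged, not part of the original answer): the @SuppressLint("MissingPermission") on startProducing() only silences the lint check, so on Android 6.0+ the CAMERA runtime permission still has to be declared in the manifest and granted before openCamera() will work. A minimal sketch of that check in the Activity that starts the service, assuming the support/androidx ContextCompat and ActivityCompat helpers:

// Hedged sketch: make sure the CAMERA permission is granted before starting the service.
if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
        == PackageManager.PERMISSION_GRANTED) {
    startService(new Intent(this, MainActivity_show_camera_service.class));
} else {
    ActivityCompat.requestPermissions(this,
            new String[]{Manifest.permission.CAMERA}, 1 /* request code */);
}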

The original question can be found on Stack Overflow: https://stackoverflow.com/questions/48743733/
