java - Using an Android phone as a webcam

Tags: java android sockets video-streaming

I am trying to develop an application that records video on the phone and plays it back on a PC. I use Android on the phone and Java on the PC; the PC is the server side, and the transfer happens over a socket. The problem seems to be that I can record the video, but the PC-side application cannot play back what was sent. Here is the code I use to set up the MediaRecorder:

public void prepareVideoRecorder(Camera mCamera, ParcelFileDescriptor pfd,
        SurfaceHolder mHolder) {
    if (mCamera == null) {
        mCamera = safeCameraOpen(mCamera);
    }
    if (mMediaRecorder == null) {
        mMediaRecorder = new MediaRecorder();

        mCamera.stopPreview();
        // Step 1: unlock the camera and hand it to the MediaRecorder
        mCamera.unlock();
        mMediaRecorder.setCamera(mCamera);
    }

    // Step 2: Set sources:
    mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
    mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    // (setOutputFormat is not needed here: the CamcorderProfile below sets it)
    //mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);

    // Step 3: Set a CamcorderProfile (API level 8 or higher)
    mMediaRecorder.setProfile(CamcorderProfile
            .get(CamcorderProfile.QUALITY_HIGH));

    // Step 4: Set output file
    mMediaRecorder.setOutputFile(pfd.getFileDescriptor());
    // Step 5: Set the preview output
    mMediaRecorder.setPreviewDisplay(mHolder.getSurface());
    try {
        mMediaRecorder.prepare();
    } catch (IllegalStateException | IOException e) {
        // Preparation failed; log it. A real app should also release the
        // recorder and re-lock the camera here.
        e.printStackTrace();
    }

}

That seems to be correct. The PC side, which uses Xuggler to play the video, then stops at:

if (container.open(inputstream, null) < 0) {
    throw new IllegalArgumentException("could not open inputstream");
}
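A likely reason `container.open` fails here: `MediaRecorder`'s MPEG-4 output writes its `moov` index only when recording stops, so a live, non-seekable socket stream is not a parseable MP4 even if the bytes arrive intact. As a quick sanity check on what is actually coming over the socket, one could probe the first bytes for the `ftyp` box that every well-formed MP4 file starts with (the `Mp4Probe` class name is illustrative, not from the question):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

// Sanity check for a stream that is supposed to carry MP4 data: a
// well-formed MP4 file begins with a box whose type (bytes 4..7) is "ftyp".
// If even this is missing, the socket is not delivering a parseable MP4
// stream. Note that a correct header alone is still not enough for
// playback: the 'moov' index is written only when recording stops.
public class Mp4Probe {
    public static boolean startsWithFtyp(InputStream in) throws IOException {
        byte[] head = new byte[8];
        int read = 0;
        while (read < 8) {
            int n = in.read(head, read, 8 - read);
            if (n < 0) {
                return false; // stream ended before 8 bytes arrived
            }
            read += n;
        }
        return "ftyp".equals(new String(head, 4, 4, StandardCharsets.US_ASCII));
    }
}
```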

This is part of the following Java class:

public class imagePnl extends JPanel {

URL medialocator = null;
BufferedImage image;
private Player player;
private DataSource ds = null;
private String mobileLocation = "socket://localhost:1234";
// private ByteArrayDataSource byteDs = null;
private InputStream inputStream = null;
IContainerFormat format;

public imagePnl() {
}

public void setVideo(InputStream inputstream) {
    // Let's make sure that we can actually convert video pixel formats.
    if (!IVideoResampler
            .isSupported(IVideoResampler.Feature.FEATURE_COLORSPACECONVERSION)) {
        throw new RuntimeException("you must install the GPL version"
                + " of Xuggler (with IVideoResampler support) for "
                + "this demo to work");
    }

    IContainer container = IContainer.make();

    if (container.open(inputstream, null) < 0) {
        throw new IllegalArgumentException("could not open inputstream");
    }
    // query how many streams the call to open found
    int numStreams = container.getNumStreams();
    // and iterate through the streams to find the first video stream
    int videoStreamId = -1;
    IStreamCoder videoCoder = null;
    for (int i = 0; i < numStreams; i++) {
        // Find the stream object
        IStream stream = container.getStream(i);
        // Get the pre-configured decoder that can decode this stream;
        IStreamCoder coder = stream.getStreamCoder();

        if (coder.getCodecType() == ICodec.Type.CODEC_TYPE_VIDEO) {
            videoStreamId = i;
            videoCoder = coder;
            break;
        }
    }
    if (videoStreamId == -1) {
        throw new RuntimeException("could not find video stream");
    }
    /*
     * Now we have found the video stream in this file. Let's open up our
     * decoder so it can do work.
     */
    if (videoCoder.open() < 0) {
        throw new RuntimeException(
                "could not open video decoder for container");
    }
    IVideoResampler resampler = null;
    if (videoCoder.getPixelType() != IPixelFormat.Type.BGR24) {
        // if this stream is not in BGR24, we're going to need to
        // convert it. The VideoResampler does that for us.
        resampler = IVideoResampler.make(videoCoder.getWidth(),
                videoCoder.getHeight(), IPixelFormat.Type.BGR24,
                videoCoder.getWidth(), videoCoder.getHeight(),
                videoCoder.getPixelType());
        if (resampler == null) {
            throw new RuntimeException(
                    "could not create color space resampler.");
        }
    }
    /*
     * Now, we start walking through the container looking at each packet.
     */
    IPacket packet = IPacket.make();
    long firstTimestampInStream = Global.NO_PTS;
    long systemClockStartTime = 0;
    while (container.readNextPacket(packet) >= 0) {
        /*
         * Now we have a packet, let's see if it belongs to our video stream
         */
        if (packet.getStreamIndex() == videoStreamId) {
            /*
             * We allocate a new picture to get the data out of Xuggler
             */
            IVideoPicture picture = IVideoPicture.make(
                    videoCoder.getPixelType(), videoCoder.getWidth(),
                    videoCoder.getHeight());

            try {
                int offset = 0;
                while (offset < packet.getSize()) {
                    System.out
                            .println("VideoManager.decode(): decode one image");
                    /*
                     * Now, we decode the video, checking for any errors.
                     */
                    int bytesDecoded = videoCoder.decodeVideo(picture,
                            packet, offset);
                    if (bytesDecoded < 0) {
                        throw new RuntimeException(
                                "got error decoding video");
                    }
                    offset += bytesDecoded;

                    /*
                     * Some decoders will consume data in a packet, but will
                     * not be able to construct a full video picture yet.
                     * Therefore you should always check if you got a
                     * complete picture from the decoder
                     */
                    if (picture.isComplete()) {
                        System.out
                                .println("VideoManager.decode(): image complete");
                        IVideoPicture newPic = picture;
                        /*
                         * If the resampler is not null, that means we
                         * didn't get the video in BGR24 format and need to
                         * convert it into BGR24 format.
                         */
                        if (resampler != null) {
                            // we must resample
                            newPic = IVideoPicture
                                    .make(resampler.getOutputPixelFormat(),
                                            picture.getWidth(),
                                            picture.getHeight());
                            if (resampler.resample(newPic, picture) < 0) {
                                throw new RuntimeException(
                                        "could not resample video");
                            }
                        }
                        if (newPic.getPixelType() != IPixelFormat.Type.BGR24) {
                            throw new RuntimeException(
                                    "could not decode video as BGR 24 bit data");
                        }

                        /**
                         * We could just display the images as quickly as we
                         * decode them, but it turns out we can decode a lot
                         * faster than you think.
                         * 
                         * So instead, the following code does a poor-man's
                         * version of trying to match up the frame-rate
                         * requested for each IVideoPicture with the system
                         * clock time on your computer.
                         * 
                         * Remember that all Xuggler IAudioSamples and
                         * IVideoPicture objects always give timestamps in
                         * Microseconds, relative to the first decoded item.
                         * If instead you used the packet timestamps, they
                         * can be in different units depending on your
                         * IContainer, and IStream and things can get hairy
                         * quickly.
                         */
                        if (firstTimestampInStream == Global.NO_PTS) {
                            // This is our first time through
                            firstTimestampInStream = picture.getTimeStamp();
                            // get the starting clock time so we can hold up
                            // frames until the right time.
                            systemClockStartTime = System
                                    .currentTimeMillis();
                        } else {
                            long systemClockCurrentTime = System
                                    .currentTimeMillis();
                            long millisecondsClockTimeSinceStartofVideo = systemClockCurrentTime
                                    - systemClockStartTime;

                            // compute how long for this frame since the
                            // first frame in the stream.
                            // remember that IVideoPicture and IAudioSamples
                            // timestamps are always in MICROSECONDS,
                            // so we divide by 1000 to get milliseconds.
                            long millisecondsStreamTimeSinceStartOfVideo = (picture
                                    .getTimeStamp() - firstTimestampInStream) / 1000;
                            // and we give ourselves 50 ms of tolerance
                            final long millisecondsTolerance = 50;
                            final long millisecondsToSleep =
                                    millisecondsStreamTimeSinceStartOfVideo
                                            - (millisecondsClockTimeSinceStartofVideo
                                                    + millisecondsTolerance);
                            if (millisecondsToSleep > 0) {
                                try {
                                    Thread.sleep(millisecondsToSleep);
                                } catch (InterruptedException e) {
                                    // we might get this when the user
                                    // closes the dialog box, so just return
                                    // from the method.
                                    return;
                                }
                            }
                        }

                        // And finally, convert the BGR24 to an Java
                        // buffered image
                        BufferedImage javaImage = Utils
                                .videoPictureToImage(newPic);

                        // and display it on the Java Swing window
                        setImage(javaImage);
                        // if (listener != null) {
                        // listener.imageUpdated(javaImage);
                        // }
                    }
                } // end of while
            } catch (Exception exc) {
                exc.printStackTrace();
            }
        } else {
            /*
             * This packet isn't part of our video stream, so we just
             * silently drop it.
             */
        }

    }
    /*
     * Technically since we're exiting anyway, these will be cleaned up by
     * the garbage collector... but because we're nice people and want to be
     * invited places for Christmas, we're going to show how to clean up.
     */
    if (videoCoder != null) {
        videoCoder.close();
        videoCoder = null;
    }
    if (container != null) {
        container.close();
        container = null;
    }

    // byteDs = new ByteArrayDataSource(bytes, "video/3gp");
    // ToolFactory.makere byteDs
    // .getOutputStream();
    // Manager.createPlayer(byteD);
    // Player mediaPlayer = Manager.createRealizedPlayer(new
    // MediaLocator(mobileLocation));
    // Component video = mediaPlayer.getVisualComponent();
    // Component control = mediaPlayer.getControlPanelComponent();
    // if (video != null) {
    // add(video, BorderLayout.CENTER);
    // }
    // add(control, BorderLayout.SOUTH);
    // mediaPlayer.start();
    // } catch (IOException | NoPlayerException | CannotRealizeException ex)
    // {
    // Logger.getLogger(imagePnl.class.getName()).log(Level.SEVERE, null,
    // ex);
    // }
    paint(getGraphics());
}

public void setImage(BufferedImage image) {
    this.image = (BufferedImage) image;

    paint(getGraphics());
}

@Override
public void paintComponent(Graphics g) {
    // super.paintComponent(g);
    // Graphics2D g2d = (Graphics2D) g;
    //
    // g2d.drawImage(image, 0, 0, null);
    // explicitly specify width (w) and height (h)
    g.drawImage(image, 10, 10, this.getWidth(), this.getHeight(), this);

}

}

When the application stops at that line, no error is shown, but the video is never displayed on the PC side either.
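For completeness, the PC-side server that accepts the phone's connection and hands the resulting `InputStream` to `setVideo` could be set up roughly as follows. The question does not show this part, so the class name is an assumption; port 1234 matches the `socket://localhost:1234` locator in `imagePnl`:

```java
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

// Minimal sketch of the PC-side server: listen on one port, accept a
// single connection from the phone, and return its InputStream (which
// would then be passed to imagePnl.setVideo(...)). Illustrative only.
public class VideoServer {
    public static InputStream waitForPhone(int port) throws Exception {
        try (ServerSocket server = new ServerSocket(port)) {
            Socket phone = server.accept(); // blocks until the phone connects
            // Closing the ServerSocket does not close the accepted socket,
            // so the returned stream stays usable.
            return phone.getInputStream();
        }
    }
}
```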

I hope you can help me; the project is for learning purposes. Thanks in advance, Fran

Best Answer

If you want to stream video from Android, you should use a streaming protocol such as RTSP or RTP. Using a TCP socket will not work, because the header information is not present in all of the packets received over the socket. Take a look at Spydroid.
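To see what that per-packet header information looks like, here is a minimal sketch that decodes the fixed 12-byte RTP header defined in RFC 3550; it is illustrative only, not code from Spydroid. Each RTP packet carries its own sequence number, timestamp, and payload type, which is exactly the framing a raw TCP byte stream lacks:

```java
// Decode the fixed 12-byte RTP header (RFC 3550, section 5.1).
// Illustrative sketch: no extension headers or CSRC list handling.
public class RtpHeader {
    public final int version;        // should be 2 for RTP
    public final int payloadType;    // identifies the codec
    public final int sequenceNumber; // detects loss and reordering
    public final long timestamp;     // media clock for this packet
    public final long ssrc;          // identifies the sending source

    public RtpHeader(byte[] packet) {
        version        = (packet[0] >> 6) & 0x03;
        payloadType    =  packet[1] & 0x7F;
        sequenceNumber = ((packet[2] & 0xFF) << 8) | (packet[3] & 0xFF);
        timestamp      = ((long) (packet[4] & 0xFF) << 24)
                       | ((packet[5] & 0xFF) << 16)
                       | ((packet[6] & 0xFF) << 8)
                       |  (packet[7] & 0xFF);
        ssrc           = ((long) (packet[8] & 0xFF) << 24)
                       | ((packet[9] & 0xFF) << 16)
                       | ((packet[10] & 0xFF) << 8)
                       |  (packet[11] & 0xFF);
    }
}
```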

Regarding "java - Using an Android phone as a webcam", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/15361679/
