Tuesday, February 6, 2018

Integrated video streaming: Initial steps

Currently, the rover uses the excellent IP Webcam app for Android to serve video frames. The client just makes HTTP requests for an image in a tight loop. At low resolution (320x240) I'm getting 5-10 fps over my ancient WiFi. This works, and was easy to implement, but now that I've gotten the underlying architecture sorted out, and the rover driving well, I really want to integrate the video server. I expect this to be a challenge, but part of the point of the project is to learn about Android development. So here goes.
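For reference, the polling client is simple enough to sketch in plain Java. The address and port below are placeholders for my phone's WiFi address, and /shot.jpg is the single-frame endpoint IP Webcam exposes (check the app's help page for your version):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class FrameGrabber {
    // Placeholder address; substitute the phone's actual IP and port.
    static final String FRAME_URL = "http://192.168.1.50:8080/shot.jpg";

    // Drain an InputStream into a byte array.
    static byte[] readAll(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }

    // Fetch one JPEG frame over HTTP.
    static byte[] fetchFrame(String url) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        try (InputStream in = conn.getInputStream()) {
            return readAll(in);
        } finally {
            conn.disconnect();
        }
    }

    // The "tight loop": each pass grabs one frame and hands it to the UI.
    static void streamLoop() throws IOException {
        while (true) {
            byte[] jpeg = fetchFrame(FRAME_URL);
            // ... decode and display the JPEG here ...
        }
    }
}
```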

Yesterday, using examples on the web, I got a function working that opens the camera, takes a photo, and saves it to the SD card. The main challenge was doing this from a service, with no GUI - most examples are centered around giving the user a preview to aim with.

It turns out that the use of a preview surface is not optional - I tried using the camera API without one, and though it took pictures and returned byte arrays of varying sizes, the images were all black. Once I found a post describing how to set up a preview surface, it started working as expected.

My phone is set up to emit an audible shutter sound when you take a picture, and you can't turn it off. This is presumably for privacy reasons. I figured that since IP Webcam isn't emitting shutter sounds multiple times a second, it must be capturing the preview stream and converting it into images (which doesn't result in a shutter sound).

It appears that the previews are coming off the camera in YUV format, and each time a frame is available, it fires a callback function that you can define. It should just be a matter of converting the YUV image to a JPEG and then shoving it over the network.
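A rough sketch of what that conversion might look like, assuming the default NV21 preview format (the frame dimensions come from the camera parameters, and the network write is left as a placeholder):

```java
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.hardware.Camera;
import java.io.ByteArrayOutputStream;

cam.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Camera.Size size = camera.getParameters().getPreviewSize();

        // Preview frames arrive as NV21 (a YUV 4:2:0 layout) by default.
        YuvImage yuv = new YuvImage(data, ImageFormat.NV21,
                size.width, size.height, null);

        // Compress the whole frame to JPEG at quality 70.
        ByteArrayOutputStream jpegStream = new ByteArrayOutputStream();
        yuv.compressToJpeg(new Rect(0, 0, size.width, size.height), 70, jpegStream);
        byte[] jpeg = jpegStream.toByteArray();

        // ... shove jpeg over the network here ...
    }
});
```

No takePicture() call means no shutter sound, which matches what IP Webcam appears to be doing.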

For higher resolution, I'd need to investigate H.264 streaming, which I suspect is not trivial, so for now I am going to focus on the simpler approach.

Camera code: very alpha. This might come in handy later for snapping a high-res picture of whatever the rover is looking at. I think I can use the camera object to turn on the flash LED to use as a headlight, too! :-) This code works from a service.

This code was heavily based on examples found on these sites and some others on StackOverflow:

http://p2p.wrox.com/book-professional-android-application-development-isbn-978-0-470-34471-2/72528-article-using-android-camera.html

http://handycodeworks.com/?p=19


private void takePicture()
{
    Camera cam = Camera.open();
    Camera.Parameters parameters = cam.getParameters();

    parameters.set("jpeg-quality", 70);
    parameters.setPictureFormat(PixelFormat.JPEG);
    parameters.setPictureSize(320, 200);

    cam.setParameters(parameters);

    // You can't take a picture without mapping the preview to a surface;
    // if you try, you get all-black images.
    SurfaceView view = new SurfaceView(this);
    try {
        cam.setPreviewDisplay(view.getHolder());
        cam.startPreview();
    } catch (IOException e) {
        Log.e(DEBUG_TAG, "Failed to set preview display", e);
        cam.release();
        return;
    }

    // Give startPreview() time to complete, or you get a black image.
    try {
        Thread.sleep(1000);
    } catch (InterruptedException e) {}

    // This callback receives the JPEG data when takePicture() fires below.
    PictureCallback mPicture = new PictureCallback() {
        @Override
        public void onPictureTaken(byte[] data, Camera cam) {
            long timestamp = new Date().getTime();
            File pictureFile = new File("/mnt/sdcard-ext/DCIM/Camera/" + timestamp + ".jpg");

            try {
                Log.d(DEBUG_TAG, "Byte array: " + data.length + " bytes");
                FileOutputStream fos = new FileOutputStream(pictureFile);
                fos.write(data);
                fos.close();
            } catch (FileNotFoundException e) {
                Log.e(DEBUG_TAG, "Could not open image file for writing", e);
            } catch (IOException e) {
                Log.e(DEBUG_TAG, "Something bad happened while writing image file", e);
            }
        }
    };

    cam.takePicture(null, null, mPicture);

    // Wait for the capture to finish before releasing the camera.
    try {
        Thread.sleep(2000);
    } catch (InterruptedException e) {}

    cam.release();
}

