From 143aa0b891b2f8bf8ffabf8c53966c3cf97e9eb0 Mon Sep 17 00:00:00 2001
From: Igor Murashkin
Date: Fri, 17 Jan 2014 15:02:34 -0800
Subject: [PATCH] camera3: Specify frame durations and minimum frame durations in metadata

Change-Id: Ic52c54c3c26e3547a1064fc5afc8ebbac5b392ad
---
 .../hardware/camera2/CameraCharacteristics.java  | 28 ++++---
 .../android/hardware/camera2/CaptureRequest.java | 90 +++++++++++++++++++++-
 .../android/hardware/camera2/CaptureResult.java  | 90 +++++++++++++++++++++-
 3 files changed, 190 insertions(+), 18 deletions(-)

diff --git a/core/java/android/hardware/camera2/CameraCharacteristics.java b/core/java/android/hardware/camera2/CameraCharacteristics.java
index 293b90a25ff1..8be50499c58b 100644
--- a/core/java/android/hardware/camera2/CameraCharacteristics.java
+++ b/core/java/android/hardware/camera2/CameraCharacteristics.java
@@ -384,13 +384,15 @@ public final class CameraCharacteristics extends CameraMetadata {
 
     /**
      * <p>The minimum frame duration that is supported
-     * for each resolution in availableJpegSizes. Should
-     * correspond to the frame duration when only that JPEG
-     * stream is active and captured in a burst, with all
-     * processing set to FAST</p>
+     * for each resolution in {@link CameraCharacteristics#SCALER_AVAILABLE_JPEG_SIZES android.scaler.availableJpegSizes}.</p>
+     * <p>This corresponds to the minimum steady-state frame duration when only
+     * that JPEG stream is active and captured in a burst, with all
+     * processing (typically in android.*.mode) set to FAST.</p>
      * <p>When multiple streams are configured, the minimum
      * frame duration will be >= max(individual stream min
      * durations)</p>
+     *
+     * @see CameraCharacteristics#SCALER_AVAILABLE_JPEG_SIZES
      */
     public static final Key<long[]> SCALER_AVAILABLE_JPEG_MIN_DURATIONS =
             new Key<long[]>("android.scaler.availableJpegMinDurations", long[].class);
@@ -415,14 +417,16 @@ public final class CameraCharacteristics extends CameraMetadata {
             new Key<Float>("android.scaler.availableMaxDigitalZoom", float.class);
 
     /**
-     * <p>The minimum frame duration that is supported
-     * for each resolution in availableProcessedSizes. Should
-     * correspond to the frame duration when only that processed
-     * stream is active, with all processing set to
-     * FAST</p>
-     * <p>When multiple streams are configured, the minimum
-     * frame duration will be >= max(individual stream min
-     * durations)</p>
+     * <p>For each available processed output size (defined in
+     * {@link CameraCharacteristics#SCALER_AVAILABLE_PROCESSED_SIZES android.scaler.availableProcessedSizes}), this property lists the
+     * minimum supportable frame duration for that size.</p>
+     * <p>This should correspond to the frame duration when only that processed
+     * stream is active, with all processing (typically in android.*.mode)
+     * set to FAST.</p>
+     * <p>When multiple streams are configured, the minimum frame duration will
+     * be >= max(individual stream min durations).</p>
+     *
+     * @see CameraCharacteristics#SCALER_AVAILABLE_PROCESSED_SIZES
      */
     public static final Key<long[]> SCALER_AVAILABLE_PROCESSED_MIN_DURATIONS =
             new Key<long[]>("android.scaler.availableProcessedMinDurations", long[].class);
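A minimal sketch of the combination rule stated in both blocks above ("the minimum frame duration will be >= max(individual stream min durations)"). The class name and the sample values are hypothetical and not part of this patch or of the camera2 API; in a real application the per-stream minimums would be looked up in android.scaler.availableJpegMinDurations / android.scaler.availableProcessedMinDurations for the output sizes actually configured.

// Illustrative only: combining per-stream minimum frame durations for a
// multi-stream configuration.
public final class MinDurationExample {

    /**
     * Lower bound on the frame duration (in nanoseconds) when all of the
     * given streams are configured at once: the max of the individual minimums.
     */
    public static long combinedMinFrameDuration(long[] perStreamMinDurationsNs) {
        long combined = 0;
        for (long d : perStreamMinDurationsNs) {
            combined = Math.max(combined, d);
        }
        return combined;
    }

    public static void main(String[] args) {
        // Hypothetical values: a processed (YUV) stream that can run at
        // 33.3 ms per frame (30 fps) and a JPEG stream that needs 50 ms.
        long[] minDurations = { 33333333L, 50000000L };
        // Prints 50000000: the JPEG stream limits the whole configuration.
        System.out.println(combinedMinFrameDuration(minDurations));
    }
}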

diff --git a/core/java/android/hardware/camera2/CaptureRequest.java b/core/java/android/hardware/camera2/CaptureRequest.java
index 731c0bdb34d1..4a3a7509194d 100644
--- a/core/java/android/hardware/camera2/CaptureRequest.java
+++ b/core/java/android/hardware/camera2/CaptureRequest.java
@@ -986,9 +986,93 @@ public final class CaptureRequest extends CameraMetadata implements Parcelable {
 
     /**
      * <p>Duration from start of frame exposure to
-     * start of next frame exposure</p>
-     * <p>Exposure time has priority, so duration is set to
-     * max(duration, exposure time + overhead)</p>
+     * start of next frame exposure.</p>
+     * <p>The maximum frame rate that can be supported by a camera subsystem is
+     * a function of many factors:</p>
+     * <ul>
+     * <li>Requested resolutions of output image streams</li>
+     * <li>Availability of binning / skipping modes on the imager</li>
+     * <li>The bandwidth of the imager interface</li>
+     * <li>The bandwidth of the various ISP processing blocks</li>
+     * </ul>
+     * <p>Since these factors can vary greatly between different ISPs and
+     * sensors, the camera abstraction tries to represent the bandwidth
+     * restrictions with as simple a model as possible.</p>
+     * <p>The model presented has the following characteristics:</p>
+     * <ul>
+     * <li>The image sensor is always configured to output the smallest
+     * resolution possible given the application's requested output stream
+     * sizes. The smallest resolution is defined as being at least as large
+     * as the largest requested output stream size; the camera pipeline must
+     * never digitally upsample sensor data when the crop region covers the
+     * whole sensor. In general, this means that if only small output stream
+     * resolutions are configured, the sensor can provide a higher frame
+     * rate.</li>
+     * <li>Since any request may use any or all the currently configured
+     * output streams, the sensor and ISP must be configured to support
+     * scaling a single capture to all the streams at the same time. This
+     * means the camera pipeline must be ready to produce the largest
+     * requested output size without any delay. Therefore, the overall
+     * frame rate of a given configured stream set is governed only by the
+     * largest requested stream resolution.</li>
+     * <li>Using more than one output stream in a request does not affect the
+     * frame duration.</li>
+     * <li>JPEG streams act like processed YUV streams in requests for which
+     * they are not included; in requests in which they are directly
+     * referenced, they act as JPEG streams. This is because supporting a
+     * JPEG stream requires the underlying YUV data to always be ready for
+     * use by a JPEG encoder, but the encoder will only be used (and impact
+     * frame duration) on requests that actually reference a JPEG stream.</li>
+     * <li>The JPEG processor can run concurrently with the rest of the camera
+     * pipeline, but cannot process more than 1 capture at a time.</li>
+     * </ul>
+     * <p>The necessary information for the application, given the model above,
+     * is provided via the android.scaler.available*MinDurations fields.
+     * These are used to determine the maximum frame rate / minimum frame
+     * duration that is possible for a given stream configuration.</p>
+     * <p>Specifically, the application can use the following rules to
+     * determine the minimum frame duration it can request from the HAL
+     * device:</p>
+     * <ol>
+     * <li>Given the application's currently configured set of output
+     * streams, S, divide them into three sets: streams in a JPEG format
+     * SJ, streams in a raw sensor format SR, and the rest ('processed')
+     * SP.</li>
+     * <li>For each subset of streams, find the largest resolution (by pixel
+     * count) in the subset. This gives (at most) three resolutions RJ,
+     * RR, and RP.</li>
+     * <li>If RJ is greater than RP, set RP equal to RJ. If there is
+     * no exact match for RP == RJ (in particular there isn't an available
+     * processed resolution at the same size as RJ), then set RP equal
+     * to the smallest processed resolution that is larger than RJ. If
+     * there are no processed resolutions larger than RJ, then set RP to
+     * the processed resolution closest to RJ.</li>
+     * <li>If RP is greater than RR, set RR equal to RP. If there is
+     * no exact match for RR == RP (in particular there isn't an available
+     * raw resolution at the same size as RP), then set RR equal to
+     * the smallest raw resolution that is larger than RP. If
+     * there are no raw resolutions larger than RP, then set RR to
+     * the raw resolution closest to RP.</li>
+     * <li>Look up the matching minimum frame durations in the property lists
+     * {@link CameraCharacteristics#SCALER_AVAILABLE_JPEG_MIN_DURATIONS android.scaler.availableJpegMinDurations},
+     * android.scaler.availableRawMinDurations, and
+     * {@link CameraCharacteristics#SCALER_AVAILABLE_PROCESSED_MIN_DURATIONS android.scaler.availableProcessedMinDurations}. This gives three
+     * minimum frame durations FJ, FR, and FP.</li>
+     * <li>If a stream of requests does not use a JPEG stream, then the minimum
+     * supported frame duration for each request is max(FR, FP).</li>
+     * <li>If a stream of requests all use the JPEG stream, then the minimum
+     * supported frame duration for each request is max(FR, FP, FJ).</li>
+     * <li>If a mix of JPEG-using and non-JPEG-using requests is submitted by
+     * the application, then the HAL will have to delay JPEG-using requests
+     * whenever the JPEG encoder is still busy processing an older capture.
+     * This will happen whenever a JPEG-using request starts capture less
+     * than FJ ns after a previous JPEG-using request. The minimum
+     * supported frame duration will vary between the values calculated in
+     * #6 and #7.</li>
+     * </ol>
+     *
+     * @see CameraCharacteristics#SCALER_AVAILABLE_JPEG_MIN_DURATIONS
+     * @see CameraCharacteristics#SCALER_AVAILABLE_PROCESSED_MIN_DURATIONS
      */
     public static final Key<Long> SENSOR_FRAME_DURATION =
             new Key<Long>("android.sensor.frameDuration", long.class);
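The lookup rules in the numbered list above lend themselves to a short sketch. Everything below is illustrative and not part of this patch or of the camera2 API: resolutions are modelled by pixel count only, the class and method names are hypothetical, and the duration tables are assumed to be index-aligned with the corresponding size lists (as the wording of the android.scaler.available*MinDurations documentation suggests).

// Illustrative only: applying the minimum-frame-duration rules above.
import java.util.Arrays;

public final class FrameDurationRules {

    /**
     * Rules 3-4: choose the entry of {@code availablePixelCounts} (sorted
     * ascending) to use for a target resolution: an exact match if present,
     * otherwise the smallest available resolution larger than the target,
     * otherwise the closest one (the largest available). Returns an index.
     */
    static int matchResolution(long[] availablePixelCounts, long targetPixels) {
        int idx = Arrays.binarySearch(availablePixelCounts, targetPixels);
        if (idx >= 0) {
            return idx;                               // exact match
        }
        int insertion = -idx - 1;
        if (insertion < availablePixelCounts.length) {
            return insertion;                         // smallest larger resolution
        }
        return availablePixelCounts.length - 1;       // nothing larger: closest
    }

    /** Rule 6: minimum frame duration (ns) when no request uses the JPEG stream. */
    static long minDurationWithoutJpeg(long fr, long fp) {
        return Math.max(fr, fp);
    }

    /** Rule 7: minimum frame duration (ns) when every request uses the JPEG stream. */
    static long minDurationWithJpeg(long fr, long fp, long fj) {
        return Math.max(Math.max(fr, fp), fj);
    }

    public static void main(String[] args) {
        // Hypothetical processed sizes (as pixel counts) and their minimum durations.
        long[] processedPixels = { 640L * 480, 1280L * 720, 1920L * 1080, 3264L * 2448 };
        long[] processedMinNs  = { 33333333L, 33333333L, 33333333L, 50000000L };

        // The largest requested JPEG resolution RJ drives the processed lookup (rule 3).
        long rj = 3264L * 2448;
        long fp = processedMinNs[matchResolution(processedPixels, rj)];
        long fj = 100000000L;   // hypothetical JPEG min duration for that size
        long fr = 33333333L;    // hypothetical raw min duration

        System.out.println("Without JPEG: " + minDurationWithoutJpeg(fr, fp) + " ns");
        System.out.println("With JPEG:    " + minDurationWithJpeg(fr, fp, fj) + " ns");
        // A mix of JPEG and non-JPEG requests lands between these two values (rule 8).
    }
}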

diff --git a/core/java/android/hardware/camera2/CaptureResult.java b/core/java/android/hardware/camera2/CaptureResult.java
index e54fb5128fa5..96183908ca7a 100644
--- a/core/java/android/hardware/camera2/CaptureResult.java
+++ b/core/java/android/hardware/camera2/CaptureResult.java
@@ -1239,9 +1239,93 @@ public final class CaptureResult extends CameraMetadata {
 
     /**
      * <p>Duration from start of frame exposure to
-     * start of next frame exposure</p>
-     * <p>Exposure time has priority, so duration is set to
-     * max(duration, exposure time + overhead)</p>
+     * start of next frame exposure.</p>
+     * <p>The maximum frame rate that can be supported by a camera subsystem is
+     * a function of many factors:</p>
+     * <ul>
+     * <li>Requested resolutions of output image streams</li>
+     * <li>Availability of binning / skipping modes on the imager</li>
+     * <li>The bandwidth of the imager interface</li>
+     * <li>The bandwidth of the various ISP processing blocks</li>
+     * </ul>
+     * <p>Since these factors can vary greatly between different ISPs and
+     * sensors, the camera abstraction tries to represent the bandwidth
+     * restrictions with as simple a model as possible.</p>
+     * <p>The model presented has the following characteristics:</p>
+     * <ul>
+     * <li>The image sensor is always configured to output the smallest
+     * resolution possible given the application's requested output stream
+     * sizes. The smallest resolution is defined as being at least as large
+     * as the largest requested output stream size; the camera pipeline must
+     * never digitally upsample sensor data when the crop region covers the
+     * whole sensor. In general, this means that if only small output stream
+     * resolutions are configured, the sensor can provide a higher frame
+     * rate.</li>
+     * <li>Since any request may use any or all the currently configured
+     * output streams, the sensor and ISP must be configured to support
+     * scaling a single capture to all the streams at the same time. This
+     * means the camera pipeline must be ready to produce the largest
+     * requested output size without any delay. Therefore, the overall
+     * frame rate of a given configured stream set is governed only by the
+     * largest requested stream resolution.</li>
+     * <li>Using more than one output stream in a request does not affect the
+     * frame duration.</li>
+     * <li>JPEG streams act like processed YUV streams in requests for which
+     * they are not included; in requests in which they are directly
+     * referenced, they act as JPEG streams. This is because supporting a
+     * JPEG stream requires the underlying YUV data to always be ready for
+     * use by a JPEG encoder, but the encoder will only be used (and impact
+     * frame duration) on requests that actually reference a JPEG stream.</li>
+     * <li>The JPEG processor can run concurrently with the rest of the camera
+     * pipeline, but cannot process more than 1 capture at a time.</li>
+     * </ul>
+     * <p>The necessary information for the application, given the model above,
+     * is provided via the android.scaler.available*MinDurations fields.
+     * These are used to determine the maximum frame rate / minimum frame
+     * duration that is possible for a given stream configuration.</p>
+     * <p>Specifically, the application can use the following rules to
+     * determine the minimum frame duration it can request from the HAL
+     * device:</p>
+     * <ol>
+     * <li>Given the application's currently configured set of output
+     * streams, S, divide them into three sets: streams in a JPEG format
+     * SJ, streams in a raw sensor format SR, and the rest ('processed')
+     * SP.</li>
+     * <li>For each subset of streams, find the largest resolution (by pixel
+     * count) in the subset. This gives (at most) three resolutions RJ,
+     * RR, and RP.</li>
+     * <li>If RJ is greater than RP, set RP equal to RJ. If there is
+     * no exact match for RP == RJ (in particular there isn't an available
+     * processed resolution at the same size as RJ), then set RP equal
+     * to the smallest processed resolution that is larger than RJ. If
+     * there are no processed resolutions larger than RJ, then set RP to
+     * the processed resolution closest to RJ.</li>
+     * <li>If RP is greater than RR, set RR equal to RP. If there is
+     * no exact match for RR == RP (in particular there isn't an available
+     * raw resolution at the same size as RP), then set RR equal to
+     * the smallest raw resolution that is larger than RP. If
+     * there are no raw resolutions larger than RP, then set RR to
+     * the raw resolution closest to RP.</li>
+     * <li>Look up the matching minimum frame durations in the property lists
+     * {@link CameraCharacteristics#SCALER_AVAILABLE_JPEG_MIN_DURATIONS android.scaler.availableJpegMinDurations},
+     * android.scaler.availableRawMinDurations, and
+     * {@link CameraCharacteristics#SCALER_AVAILABLE_PROCESSED_MIN_DURATIONS android.scaler.availableProcessedMinDurations}. This gives three
+     * minimum frame durations FJ, FR, and FP.</li>
+     * <li>If a stream of requests does not use a JPEG stream, then the minimum
+     * supported frame duration for each request is max(FR, FP).</li>
+     * <li>If a stream of requests all use the JPEG stream, then the minimum
+     * supported frame duration for each request is max(FR, FP, FJ).</li>
+     * <li>If a mix of JPEG-using and non-JPEG-using requests is submitted by
+     * the application, then the HAL will have to delay JPEG-using requests
+     * whenever the JPEG encoder is still busy processing an older capture.
+     * This will happen whenever a JPEG-using request starts capture less
+     * than FJ ns after a previous JPEG-using request. The minimum
+     * supported frame duration will vary between the values calculated in
+     * #6 and #7.</li>
+     * </ol>
+     *
+     * @see CameraCharacteristics#SCALER_AVAILABLE_JPEG_MIN_DURATIONS
+     * @see CameraCharacteristics#SCALER_AVAILABLE_PROCESSED_MIN_DURATIONS
      */
     public static final Key<Long> SENSOR_FRAME_DURATION =
             new Key<Long>("android.sensor.frameDuration", long.class);
-- 
2.11.0
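For completeness, a small illustrative helper (not part of this patch) showing how the frame duration reported in a capture result could be turned into an approximate frame rate. It assumes the get(Key) accessor that CaptureResult inherits from CameraMetadata in this version of the API, and that the boxed Long it returns may be null when the field is not populated.

// Illustrative only: derive the frame rate implied by a capture result.
import android.hardware.camera2.CaptureResult;

final class FrameRateUtil {
    /** Approximate frame rate implied by android.sensor.frameDuration, or 0 if unavailable. */
    static double impliedFps(CaptureResult result) {
        Long frameDurationNs = result.get(CaptureResult.SENSOR_FRAME_DURATION);
        if (frameDurationNs == null || frameDurationNs <= 0) {
            return 0;
        }
        return 1e9 / frameDurationNs;  // nanoseconds per frame -> frames per second
    }
}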