X-Git-Url: http://git.osdn.net/view?a=blobdiff_plain;ds=sidebyside;f=camera%2Fdocs%2Fmetadata_definitions.xml;h=11304114524b0b180b51f6b45e3e6ba0a4ec49d3;hb=6916b4fbf7c2941ad28204eef9557421a578e6ec;hp=3a0e6d24b288afa8bc8e4f5b12c8f9ddcda1daf3;hpb=58e1caea721568ce6f4b8949ce4bac996565e274;p=android-x86%2Fsystem-media.git

diff --git a/camera/docs/metadata_definitions.xml b/camera/docs/metadata_definitions.xml
index 3a0e6d24..11304114 100644
--- a/camera/docs/metadata_definitions.xml
+++ b/camera/docs/metadata_definitions.xml
@@ -39,6 +39,9 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata
            Entry is required for the YUV or PRIVATE reprocessing capability.

+
+          Entry is required for the logical multi-camera capability.
+
          Entry is under-specified and is not required for now. This is
          for book-keeping purposes; do not implement or use it, as it may be revised
          in the future.
@@ -627,6 +630,22 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata
            sequence.

+        ON_EXTERNAL_FLASH
+
+          An external flash has been turned on.
+
+          It informs the camera device that an external flash has been turned on, and that
+          metering (and continuous focus if active) should be quickly recalculated to account
+          for the external flash. Otherwise, this mode acts like ON.
+
+          When the external flash is turned off, AE mode should be changed to one of the
+          other available AE modes.
+
+          If the camera device supports AE external flash mode, aeState must be
+          FLASH_REQUIRED after the camera device finishes the AE scan and the scene is
+          still too dark without flash.
+
      The desired mode for the camera device's auto-exposure routine.
@@ -1349,6 +1368,14 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata
          For example, the application may wish to manually control
          android.sensor.exposureTime, android.sensor.sensitivity, etc.

+        MOTION_TRACKING
+        This request is for a motion tracking use case, where
+        the application will use camera and inertial sensor data to
+        locate and track objects in the world.
+
+        The camera device auto-exposure routine will limit the exposure time
+        of the camera to no more than 20 milliseconds, to minimize motion blur.
+
      Information to the camera device 3A (auto-exposure,
      auto-focus, auto-white balance) routines about the purpose
@@ -1357,10 +1384,13 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata
This control (except for MANUAL) is only effective if `android.control.mode != OFF` and any 3A routine is active. - ZERO_SHUTTER_LAG will be supported if android.request.availableCapabilities - contains PRIVATE_REPROCESSING or YUV_REPROCESSING. MANUAL will be supported if - android.request.availableCapabilities contains MANUAL_SENSOR. Other intent values are - always supported. + All intents are supported by all devices, except that: + * ZERO_SHUTTER_LAG will be supported if android.request.availableCapabilities contains + PRIVATE_REPROCESSING or YUV_REPROCESSING. + * MANUAL will be supported if android.request.availableCapabilities contains + MANUAL_SENSOR. + * MOTION_TRACKING will be supported if android.request.availableCapabilities contains + MOTION_TRACKING.
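As a rough illustration of the capability check described above, here is a minimal sketch using the public camera2 Java API; the helper class name and the fallback to the PREVIEW intent are illustrative assumptions, not part of this metadata definition:

```java
import java.util.Arrays;

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;

final class CaptureIntentHelper {
    /** Returns true if the given capability id is listed in android.request.availableCapabilities. */
    static boolean hasCapability(CameraCharacteristics chars, int capability) {
        int[] caps = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
        return caps != null && Arrays.stream(caps).anyMatch(c -> c == capability);
    }

    /** Builds a request that uses the MOTION_TRACKING intent only when the capability is advertised. */
    static CaptureRequest.Builder buildMotionTrackingRequest(CameraDevice device,
            CameraCharacteristics chars) throws CameraAccessException {
        CaptureRequest.Builder builder = device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        if (hasCapability(chars, CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_MOTION_TRACKING)) {
            builder.set(CaptureRequest.CONTROL_CAPTURE_INTENT,
                    CameraMetadata.CONTROL_CAPTURE_INTENT_MOTION_TRACKING);
        } else {
            // MOTION_TRACKING is not supported; fall back to the template's default preview intent.
            builder.set(CaptureRequest.CONTROL_CAPTURE_INTENT,
                    CameraMetadata.CONTROL_CAPTURE_INTENT_PREVIEW);
        }
        return builder;
    }
}
```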
@@ -2312,7 +2342,7 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata
        :------------:|:----------------:|:---------:|:-----------------------:
        INACTIVE      |                  | INACTIVE  | Camera device auto exposure algorithm is disabled

-        When android.control.aeMode is AE_MODE_ON_*:
+        When android.control.aeMode is AE_MODE_ON*:

          State       | Transition Cause | New State | Notes
          :-------------:|:--------------------------------------------:|:--------------:|:-----------------:
@@ -2335,11 +2365,15 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata
          Any state (excluding LOCKED) | android.control.aePrecaptureTrigger is START | PRECAPTURE     | Start AE precapture metering sequence
          Any state (excluding LOCKED) | android.control.aePrecaptureTrigger is CANCEL| INACTIVE       | Currently active precapture metering sequence is canceled

+        If the camera device supports AE external flash mode (ON_EXTERNAL_FLASH is included in
+        android.control.aeAvailableModes), aeState must be FLASH_REQUIRED after the camera device
+        finishes the AE scan and the scene is still too dark without flash.
+
        For the above table, the camera device may skip reporting any state changes that happen
        without application intervention (i.e. mode switch, trigger, locking). Any state that
        can be skipped in that manner is called a transient state.

-        For example, for above AE modes (AE_MODE_ON_*), in addition to the state transitions
+        For example, for above AE modes (AE_MODE_ON*), in addition to the state transitions
        listed in above table, it is also legal for the camera device to skip one or more
        transient states between two results. See below table for examples:

@@ -2863,7 +2897,7 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata

-      
+      
        NOT_DETECTED
        Scene change is not detected within the AF region(s).
@@ -2878,9 +2912,6 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata
        result. Otherwise the value will be NOT_DETECTED. The threshold for detection is similar
        to what would trigger a new passive focus scan to begin in CONTINUOUS autofocus modes.

-      afSceneChange may be DETECTED only if afMode is AF_MODE_CONTINUOUS_VIDEO or
-      AF_MODE_CONTINUOUS_PICTURE. In other AF modes, afSceneChange must be NOT_DETECTED.
-
        This key will be available if the camera device advertises this key via {@link
        android.hardware.camera2.CameraCharacteristics#getAvailableCaptureResultKeys|ACAMERA_REQUEST_AVAILABLE_RESULT_KEYS}.

@@ -3841,36 +3872,33 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata
          Meters
The position of the camera device's lens optical center,
-        as a three-dimensional vector `(x,y,z)`, relative to the
-        optical center of the largest camera device facing in the
-        same direction as this camera, in the {@link
-        android.hardware.SensorEvent Android sensor coordinate
-        axes}. Note that only the axis definitions are shared with
-        the sensor coordinate system, but not the origin.
-
-        If this device is the largest or only camera device with a
-        given facing, then this position will be `(0, 0, 0)`; a
-        camera device with a lens optical center located 3 cm from
-        the main sensor along the +X axis (to the right from the
-        user's perspective) will report `(0.03, 0, 0)`.
-
-        To transform a pixel coordinates between two cameras
-        facing the same direction, first the source camera
-        android.lens.radialDistortion must be corrected for. Then
-        the source camera android.lens.intrinsicCalibration needs
-        to be applied, followed by the android.lens.poseRotation
-        of the source camera, the translation of the source camera
-        relative to the destination camera, the
-        android.lens.poseRotation of the destination camera, and
-        finally the inverse of android.lens.intrinsicCalibration
-        of the destination camera. This obtains a
-        radial-distortion-free coordinate in the destination
-        camera pixel coordinates.
-
-        To compare this against a real image from the destination
-        camera, the destination camera image then needs to be
-        corrected for radial distortion before comparison or
-        sampling.
+        as a three-dimensional vector `(x,y,z)`.
+
+        Prior to Android P, or when android.lens.poseReference is PRIMARY_CAMERA, this position
+        is relative to the optical center of the largest camera device facing in the same
+        direction as this camera, in the {@link android.hardware.SensorEvent Android sensor
+        coordinate axes}. Note that only the axis definitions are shared with the sensor
+        coordinate system, but not the origin.
+
+        If this device is the largest or only camera device with a given facing, then this
+        position will be `(0, 0, 0)`; a camera device with a lens optical center located 3 cm
+        from the main sensor along the +X axis (to the right from the user's perspective) will
+        report `(0.03, 0, 0)`.
+
+        To transform pixel coordinates between two cameras facing the same direction, first
+        the source camera android.lens.radialDistortion must be corrected for. Then the source
+        camera android.lens.intrinsicCalibration needs to be applied, followed by the
+        android.lens.poseRotation of the source camera, the translation of the source camera
+        relative to the destination camera, the android.lens.poseRotation of the destination
+        camera, and finally the inverse of android.lens.intrinsicCalibration of the destination
+        camera. This obtains a radial-distortion-free coordinate in the destination camera pixel
+        coordinates.
+
+        To compare this against a real image from the destination camera, the destination camera
+        image then needs to be corrected for radial distortion before comparison or sampling.
+
+        When android.lens.poseReference is GYROSCOPE, this position is relative to
+        the center of the primary gyroscope on the device.
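As a worked sketch of the transform chain just described (an editorial illustration, not part of the metadata definition): it assumes K is the pinhole matrix assembled from android.lens.intrinsicCalibration, R rotates points from each camera's frame into the shared reference frame per android.lens.poseRotation, t is android.lens.poseTranslation, and undistort(.) removes android.lens.radialDistortion; the exact rotation convention should be checked against those entries' own documentation.

```latex
% Back-project an undistorted source pixel at an assumed depth d (source camera frame):
X_{src} = d \, K_{src}^{-1} \, \mathrm{undistort}(p_{src})
% Change of camera frame via the shared reference frame:
X_{dst} = R_{dst}^{\top} \left( R_{src} X_{src} + t_{src} - t_{dst} \right)
% Project into destination pixel coordinates (still radial-distortion-free):
p_{dst} \sim K_{dst} \, X_{dst}
% For distant scene points the translation term is negligible and the mapping reduces to
% the homography  p_{dst} \sim K_{dst} R_{dst}^{\top} R_{src} K_{src}^{-1} \, \mathrm{undistort}(p_{src}).
```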
@@ -4078,6 +4106,31 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata
+ + + + PRIMARY_CAMERA + The value of android.lens.poseTranslation is relative to the optical center of + the largest camera device facing the same direction as this camera. + + This is the default value for API levels before Android P. + + + GYROSCOPE + The value of android.lens.poseTranslation is relative to the position of the + primary gyroscope of this Android device. + + + + + The origin for android.lens.poseTranslation. + +
+ Different calibration methods and use cases can produce better or worse results + depending on the selected coordinate origin. +
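A minimal sketch (public camera2 Java API; the class name and log messages are illustrative) of how an application might branch on the advertised pose reference before interpreting the translation vector:

```java
import java.util.Arrays;

import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraMetadata;
import android.util.Log;

final class PoseReferenceHelper {
    private static final String TAG = "PoseReference";

    /** Logs which origin android.lens.poseTranslation is expressed against. */
    static void describePoseOrigin(CameraCharacteristics chars) {
        float[] translation = chars.get(CameraCharacteristics.LENS_POSE_TRANSLATION); // (x, y, z) in meters
        Integer reference = chars.get(CameraCharacteristics.LENS_POSE_REFERENCE);
        if (translation == null) {
            Log.i(TAG, "No lens pose translation reported");
            return;
        }
        // Devices released before Android P may not report LENS_POSE_REFERENCE at all;
        // in that case the origin is the primary (largest) camera facing the same way.
        if (reference == null
                || reference == CameraMetadata.LENS_POSE_REFERENCE_PRIMARY_CAMERA) {
            Log.i(TAG, "Pose relative to the primary camera: " + Arrays.toString(translation));
        } else if (reference == CameraMetadata.LENS_POSE_REFERENCE_GYROSCOPE) {
            Log.i(TAG, "Pose relative to the gyroscope: " + Arrays.toString(translation));
        }
    }
}
```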
+
+
@@ -4945,6 +4998,7 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata
            - android.lens.intrinsicCalibration
            - android.lens.radialDistortion
        * The android.depth.depthIsExclusive entry is listed by this device.
+        * As of Android P, the android.lens.poseReference entry is listed by this device.
        * A LIMITED camera with only the DEPTH_OUTPUT capability does not have to support
          normal YUV_420_888, JPEG, and PRIV-format outputs. It only has to support the
          DEPTH16 format.
@@ -5041,6 +5095,59 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata
          during high speed streaming.

+        MOTION_TRACKING
+
+          The camera device supports the MOTION_TRACKING value for
+          android.control.captureIntent, which limits the maximum exposure time to 20 ms.
+
+          This limits the motion blur of captured images, resulting in better image tracking
+          results for use cases such as image stabilization or augmented reality.
+
+        LOGICAL_MULTI_CAMERA
+
+          The camera device is a logical camera backed by two or more physical cameras that are
+          also exposed to the application.
+
+          This capability requires the camera device to support the following:
+
+          * This camera device must list the following static metadata entries in {@link
+            android.hardware.camera2.CameraCharacteristics}:
+            - android.logicalMultiCamera.physicalIds
+            - android.logicalMultiCamera.sensorSyncType
+          * The underlying physical cameras' static metadata must list the following entries,
+            so that the application can correlate pixels from the physical streams:
+            - android.lens.poseReference
+            - android.lens.poseRotation
+            - android.lens.poseTranslation
+            - android.lens.intrinsicCalibration
+            - android.lens.radialDistortion
+          * The SENSOR_INFO_TIMESTAMP_SOURCE of the logical device and physical devices must be
+            the same.
+          * The logical camera device must be a LIMITED or higher device.
+
+          Both the logical camera device and its underlying physical devices must support the
+          mandatory stream combinations required for their device levels.
+
+          Additionally, for each guaranteed stream combination, the logical camera supports:
+
+          * Replacing one logical {@link android.graphics.ImageFormat#YUV_420_888|AIMAGE_FORMAT_YUV_420_888 YUV_420_888}
+            or raw stream with two physical streams of the same size and format, each from a
+            separate physical camera, given that the size and format are supported by both
+            physical cameras.
+          * If the logical camera doesn't advertise RAW capability, but the underlying physical
+            cameras do, the logical camera will support the guaranteed stream combinations for
+            the RAW capability, except that the RAW streams will be physical streams, each from
+            a separate physical camera. This is usually the case when the physical cameras have
+            different sensor sizes.
+
+          Using physical streams in place of a logical stream of the same size and format will
+          not slow down the frame rate of the capture, as long as the minimum frame durations
+          of the physical and logical streams are the same.
+
      List of capabilities that this camera device advertises as fully supporting.
@@ -5106,6 +5213,10 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata
      enum notes. The entry android.depth.maxDepthSamples must be available if the
      DEPTH_POINT_CLOUD format is supported (HAL pixel format BLOB, dataspace DEPTH). 
+
+      For a camera device with LOGICAL_MULTI_CAMERA capability, it should operate in the
+      same way as a physical camera device based on its hardware level and capabilities.
+      It's recommended that its feature set is a superset of that of the individual physical
+      cameras.
+
+
+        n
+
+        A subset of the available request keys that the camera device
+        can pass as part of the capture session initialization.
+
This is a subset of android.request.availableRequestKeys which
+      contains a list of keys that are difficult to apply per-frame and
+      can result in unexpected delays when modified during the capture session
+      lifetime. Typical examples include parameters that require a
+      time-consuming hardware re-configuration or internal camera pipeline
+      change. For performance reasons we advise clients to pass their initial
+      values as part of
+      {@link SessionConfiguration#setSessionParameters|ACameraDevice_createCaptureSessionWithSessionParameters}.
+      Once the camera capture session is enabled, it is also recommended to avoid
+      changing them from their initial values set in
+      {@link SessionConfiguration#setSessionParameters|ACameraDevice_createCaptureSessionWithSessionParameters}.
+      Control over session parameters can still be exerted in capture requests,
+      but clients should be aware of and expect delays during their application.
+      An example usage scenario could look like this:
+
+      * The camera client starts by querying the session parameter key list via
+        {@link android.hardware.camera2.CameraCharacteristics#getAvailableSessionKeys|ACameraManager_getCameraCharacteristics}.
+      * Before triggering the capture session create sequence, a capture request
+        must be built via
+        {@link CameraDevice#createCaptureRequest|ACameraDevice_createCaptureRequest}
+        using an appropriate template matching the particular use case.
+      * The client should go over the list of session parameters and check
+        whether any of the listed keys match the parameters that
+        they intend to modify as part of the first capture request.
+      * If there is no such match, the capture request can be passed
+        unmodified to
+        {@link SessionConfiguration#setSessionParameters|ACameraDevice_createCaptureSessionWithSessionParameters}.
+      * If matches do exist, the client should update the respective values
+        and pass the request to
+        {@link SessionConfiguration#setSessionParameters|ACameraDevice_createCaptureSessionWithSessionParameters}.
+      * After the capture session initialization completes, the session parameter
+        key list can continue to serve as a reference when posting or updating
+        further requests. As mentioned above, further changes to session
+        parameters should ideally be avoided; if updates are necessary,
+        however, clients should expect a delay/glitch during the
+        parameter switch.
+
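A condensed sketch of the scenario above using the public camera2 Java API; the output list, executor, and session callback are assumed to be provided by the surrounding application code (an editorial illustration, not part of the metadata definition):

```java
import java.util.List;
import java.util.concurrent.Executor;

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.params.OutputConfiguration;
import android.hardware.camera2.params.SessionConfiguration;

final class SessionParamsHelper {
    /**
     * Creates a capture session whose session parameters are seeded from an initial
     * request, as recommended for keys listed by getAvailableSessionKeys().
     */
    static void createSessionWithInitialParameters(CameraDevice device,
            CameraCharacteristics chars, List<OutputConfiguration> outputs,
            Executor executor, CameraCaptureSession.StateCallback callback)
            throws CameraAccessException {
        // 1. Query the session parameter key list; values for any of these keys that the
        //    application intends to use should be set on the builder below before build().
        List<CaptureRequest.Key<?>> sessionKeys = chars.getAvailableSessionKeys();

        // 2. Build the first request with the template matching the use case.
        CaptureRequest.Builder builder =
                device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);

        // 3. Pass the request as the session parameters at session creation time.
        SessionConfiguration config = new SessionConfiguration(
                SessionConfiguration.SESSION_REGULAR, outputs, executor, callback);
        config.setSessionParameters(builder.build());
        device.createCaptureSession(config);

        // 4. Subsequent requests should keep the keys in `sessionKeys` at these initial
        //    values; changing them later is allowed but may cause a delay or glitch.
    }
}
```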
+ + Vendor tags can be listed here. Vendor tag metadata should also + use the extensions C api (refer to + android.hardware.camera.device.V3_4.StreamConfiguration.sessionParams for more details). + + Setting/getting vendor tags will be checked against the metadata + vendor extensions API and not against this field. + + The HAL must not consume any request tags in the session parameters that + are not listed either here or in the vendor tag list. + + The public camera2 API will always make the vendor tags visible + via + {@link android.hardware.camera2.CameraCharacteristics#getAvailableSessionKeys}. + +
+
+
+        n
+
+        A subset of the available request keys that can be overridden for
+        physical devices backing a logical multi-camera.
+
This is a subset of android.request.availableRequestKeys which contains a list
+        of keys that can be overridden using {@link CaptureRequest.Builder#setPhysicalCameraKey}.
+        The respective value of such a request key can be obtained by calling
+        {@link CaptureRequest.Builder#getPhysicalCameraKey}. Capture requests that contain
+        individual physical device requests must be built via
+        {@link android.hardware.camera2.CameraDevice#createCaptureRequest(int, Set)}.
+
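A rough sketch tying the pieces above to the logical multi-camera stream support described earlier: two output surfaces are routed to specific physical cameras and one request key is overridden for a single physical device. The surfaces, camera ids, and the chosen key/value are illustrative assumptions:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.params.OutputConfiguration;
import android.view.Surface;

final class PhysicalCameraRequestHelper {
    /** Builds per-physical-camera outputs and a request that overrides exposure for one device. */
    static CaptureRequest buildDualPhysicalRequest(CameraDevice logicalDevice,
            String physicalId1, String physicalId2,
            Surface surface1, Surface surface2) throws CameraAccessException {
        // Each output surface is tied to one underlying physical camera.
        OutputConfiguration out1 = new OutputConfiguration(surface1);
        out1.setPhysicalCameraId(physicalId1);
        OutputConfiguration out2 = new OutputConfiguration(surface2);
        out2.setPhysicalCameraId(physicalId2);
        List<OutputConfiguration> outputs = Arrays.asList(out1, out2);
        // ... the capture session is created from `outputs` as in the earlier session sketch ...

        // Requests carrying per-physical-device settings must be created with the id set
        // of the targeted physical cameras.
        Set<String> physicalIds = new HashSet<>(Arrays.asList(physicalId1, physicalId2));
        CaptureRequest.Builder builder = logicalDevice.createCaptureRequest(
                CameraDevice.TEMPLATE_PREVIEW, physicalIds);
        builder.addTarget(surface1);
        builder.addTarget(surface2);

        // Override a key for one physical camera only, provided the key is listed in
        // getAvailablePhysicalCameraRequestKeys() for this device.
        builder.setPhysicalCameraKey(CaptureRequest.SENSOR_EXPOSURE_TIME,
                8_000_000L /* ns, illustrative value */, physicalId2);
        return builder.build();
    }
}
```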
+ + Vendor tags can be listed here. Vendor tag metadata should also + use the extensions C api (refer to + android.hardware.camera.device.V3_4.CaptureRequest.physicalCameraSettings for more + details). + + Setting/getting vendor tags will be checked against the metadata + vendor extensions API and not against this field. + + The HAL must not consume any request tags in the session parameters that + are not listed either here or in the vendor tag list. + + There should be no overlap between this set of keys and the available session keys + {@link android.hardware.camera2.CameraCharacteristics#getAvailableSessionKeys} along + with any other controls that can have impact on the dual-camera sync. + + The public camera2 API will always make the vendor tags visible + via + {@link android.hardware.camera2.CameraCharacteristics#getAvailablePhysicalCameraRequestKeys}. + +
@@ -7647,6 +7856,21 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata LEGACY mode devices will always only support OFF. + + + n + + + List of OIS data output modes for android.statistics.oisDataMode that + are supported by this camera device. + + Any value listed in android.statistics.oisDataMode +
+ If no OIS data output is available for this camera device, this key will + contain only OFF. +
+
@@ -8077,6 +8301,76 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata

+
+
+
+        OFF
+        Do not include OIS data in the capture result.
+        ON
+        Include OIS data in the capture result.
+
+      A control for selecting whether OIS position information is included in output
+      result metadata.
+      android.statistics.info.availableOisDataModes
+
When set to ON, + android.statistics.oisTimestamps, android.statistics.oisShiftPixelX, + and android.statistics.oisShiftPixelY provide OIS data in the output result metadata. +
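A short sketch of enabling this control from an application, assuming the public counterparts of these tags are CameraCharacteristics.STATISTICS_INFO_AVAILABLE_OIS_DATA_MODES and CaptureRequest.STATISTICS_OIS_DATA_MODE (an editorial illustration):

```java
import java.util.Arrays;

import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;

final class OisDataHelper {
    /** Turns on OIS sample reporting if the device lists the ON mode as available. */
    static void enableOisDataIfAvailable(CameraCharacteristics chars,
            CaptureRequest.Builder builder) {
        int[] modes = chars.get(CameraCharacteristics.STATISTICS_INFO_AVAILABLE_OIS_DATA_MODES);
        boolean onSupported = modes != null && Arrays.stream(modes)
                .anyMatch(m -> m == CameraMetadata.STATISTICS_OIS_DATA_MODE_ON);
        builder.set(CaptureRequest.STATISTICS_OIS_DATA_MODE,
                onSupported ? CameraMetadata.STATISTICS_OIS_DATA_MODE_ON
                            : CameraMetadata.STATISTICS_OIS_DATA_MODE_OFF);
    }
}
```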
+
+
+ + + + + + n + + + An array of timestamps of OIS samples, in nanoseconds. + + nanoseconds +
+ The array contains the timestamps of OIS samples. The timestamps are in the same + timebase as and comparable to android.sensor.timestamp. +
+
+ + + n + + + An array of shifts of OIS samples, in x direction. + + Pixels in active array. +
The array contains the amount of shift in the x direction, in pixels, based on OIS samples.
        A positive value is a shift from left to right in the active array coordinate system. For
        example, if the optical center is (1000, 500) in active array coordinates, a shift of
        (3, 0) puts the new optical center at (1003, 500).

        The number of shifts must match the number of timestamps in
        android.statistics.oisTimestamps.
+
+ + + n + + + An array of shifts of OIS samples, in y direction. + + Pixels in active array. +
The array contains the amount of shift in the y direction, in pixels, based on OIS samples.
        A positive value is a shift from top to bottom in the active array coordinate system. For
        example, if the optical center is (1000, 500) in active array coordinates, a shift of
        (0, 5) puts the new optical center at (1000, 505).

        The number of shifts must match the number of timestamps in
        android.statistics.oisTimestamps.
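Because the timestamp and shift arrays above are parallel (one x shift and one y shift per timestamp), correlating a frame with OIS motion reduces to a timestamp lookup. A small, framework-agnostic sketch; the array contents are assumed to come from the result keys described above, and the method name is illustrative:

```java
final class OisSampleLookup {
    /** Returns {xShift, yShift} of the OIS sample whose timestamp is closest to frameTimestampNs. */
    static float[] nearestShift(long[] timestampsNs, float[] xShifts, float[] yShifts,
            long frameTimestampNs) {
        if (timestampsNs == null || timestampsNs.length == 0
                || xShifts.length != timestampsNs.length
                || yShifts.length != timestampsNs.length) {
            return new float[] {0f, 0f}; // no usable OIS data for this frame
        }
        int best = 0;
        long bestDelta = Math.abs(timestampsNs[0] - frameTimestampNs);
        for (int i = 1; i < timestampsNs.length; i++) {
            long delta = Math.abs(timestampsNs[i] - frameTimestampNs);
            if (delta < bestDelta) {
                bestDelta = delta;
                best = i;
            }
        }
        // Shifts are in active-array pixels: +x moves the optical center to the right,
        // +y moves it down, as described above.
        return new float[] {xShifts[best], yShifts[best]};
    }
}
```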
+
+
@@ -8580,6 +8874,27 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata
          `RAW`)

+        EXTERNAL
+
+          This camera device is backed by an external camera connected to this Android device.
+
+          The device has capabilities identical to those of a LIMITED level device, with the
+          following exceptions:
+
+          * The device may not report lens/sensor related information such as
+            - android.lens.focalLength
+            - android.lens.info.hyperfocalDistance
+            - android.sensor.info.physicalSize
+            - android.sensor.info.whiteLevel
+            - android.sensor.blackLevelPattern
+            - android.sensor.info.colorFilterArrangement
+            - android.sensor.rollingShutterSkew
+          * The device will report 0 for android.sensor.orientation
+          * The device has fewer guarantees on stable framerate, as the framerate partly depends
+            on the external camera being used.
+
      Generally classifies the overall set of the camera device functionality.
@@ -8647,6 +8962,26 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata
      Camera HAL3+ must not implement LEGACY mode. It is there for backwards compatibility in
      the `android.hardware.camera2` user-facing API only on legacy HALv1 devices, and is
      implemented by the camera framework code.
+
+      EXTERNAL level devices have a lower performance bar in CTS, since the performance might
+      depend on the external camera being used and is not fully controlled by the device
+      manufacturer. The ITS test suite is exempted for the same reason.
+
+
+
+        A short string for manufacturer version information about the camera device, such as
+        ISP hardware, sensors, etc.
+
This can be used in {@link android.media.ExifInterface#TAG_IMAGE_DESCRIPTION TAG_IMAGE_DESCRIPTION}
      in JPEG EXIF. This key may be absent if no version information is available on the
      device.
+ + The string must consist of only alphanumeric characters, punctuation, and + whitespace, i.e. it must match regular expression "[\p{Alnum}\p{Punct}\p{Space}]*". + It must not exceed 256 characters.
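For example, assuming the public counterpart of this tag is CameraCharacteristics.INFO_VERSION, an application could copy it into a saved JPEG's EXIF description; the file-path handling and class name are illustrative assumptions:

```java
import java.io.IOException;

import android.hardware.camera2.CameraCharacteristics;
import android.media.ExifInterface;

final class VersionTagger {
    /** Copies the camera version string, when present, into a saved JPEG's EXIF description. */
    static void tagJpegWithCameraVersion(CameraCharacteristics chars, String jpegPath)
            throws IOException {
        String version = chars.get(CameraCharacteristics.INFO_VERSION);
        if (version == null) {
            return; // the key is optional; nothing to record
        }
        ExifInterface exif = new ExifInterface(jpegPath);
        exif.setAttribute(ExifInterface.TAG_IMAGE_DESCRIPTION, version);
        exif.saveAttributes();
    }
}
```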
@@ -9122,5 +9457,61 @@ xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata
+
+ + + + n + + String containing the ids of the underlying physical cameras. + + UTF-8 null-terminated string +
For a logical camera, this is the concatenation of all underlying physical camera ids.
        The null terminator for each physical camera id must be preserved so that the whole string
        can be tokenized using '\0' to generate the list of physical camera ids.

        For example, if the physical camera ids of the logical camera are "2" and "3", the
        value of this tag will be ['2', '\0', '3', '\0'].

        The number of physical camera ids must be no less than 2.
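A minimal sketch of tokenizing that raw value (the public Java API instead exposes the parsed result via CameraCharacteristics#getPhysicalCameraIds, so this is mainly an illustration of the encoding; the class and method names are assumptions):

```java
import java.util.ArrayList;
import java.util.List;

final class PhysicalIdParser {
    /** Splits a '\0'-separated blob, e.g. the ids "2" and "3" encoded as '2','\0','3','\0'. */
    static List<String> parsePhysicalIds(String rawValue) {
        List<String> ids = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        for (char c : rawValue.toCharArray()) {
            if (c == '\0') {
                ids.add(current.toString()); // the terminator closes one id
                current.setLength(0);
            } else {
                current.append(c);
            }
        }
        return ids;
    }
}
```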
+ +
+ + + APPROXIMATE + + A software mechanism is used to synchronize between the physical cameras. As a result, + the timestamp of an image from a physical stream is only an approximation of the + image sensor start-of-exposure time. + + + CALIBRATED + + The camera device supports frame timestamp synchronization at the hardware level, + and the timestamp of a physical stream image accurately reflects its + start-of-exposure time. + + + + The accuracy of frame timestamp synchronization between physical cameras +
The accuracy of the frame timestamp synchronization determines the physical cameras'
        ability to start exposure at the same time. If the sensorSyncType is CALIBRATED,
        the physical camera sensors usually run in master-slave mode so that their shutter
        time is synchronized. For APPROXIMATE sensorSyncType, the camera sensors usually run in
        master-master mode, and there could be an offset between their start of exposure.

        In both cases, all images generated for a particular capture request still carry the same
        timestamps, so that they can be used to look up the matching frame number and
        onCaptureStarted callback.
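A small sketch of how an application might branch on this value when deciding whether two physical streams can be treated as exposure-synchronized, assuming the public counterpart of this tag is CameraCharacteristics.LOGICAL_MULTI_CAMERA_SENSOR_SYNC_TYPE (the helper name is illustrative):

```java
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraMetadata;

final class SyncTypeHelper {
    /**
     * Returns true when the physical sensors start exposure together (CALIBRATED),
     * a reasonable precondition for e.g. stereo matching on fast-moving scenes.
     */
    static boolean exposuresStartTogether(CameraCharacteristics logicalChars) {
        Integer syncType = logicalChars.get(
                CameraCharacteristics.LOGICAL_MULTI_CAMERA_SENSOR_SYNC_TYPE);
        return syncType != null
                && syncType == CameraMetadata.LOGICAL_MULTI_CAMERA_SENSOR_SYNC_TYPE_CALIBRATED;
    }
}
```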
+ +
+
+