><a href="#static_android.scaler.availableStallDurations">android.scaler.availableStallDurations</a></li>
<li
><a href="#static_android.scaler.streamConfigurationMap">android.scaler.streamConfigurationMap</a></li>
+ <li
+ ><a href="#static_android.scaler.croppingType">android.scaler.croppingType</a></li>
</ul>
</li>
<li>
should be nonnegative.<wbr/></p>
<p>If all regions have 0 weight,<wbr/> then no specific metering area
needs to be used by the camera device.<wbr/> If the metering region is
-outside the current <a href="#controls_android.scaler.cropRegion">android.<wbr/>scaler.<wbr/>crop<wbr/>Region</a>,<wbr/> the camera device
-will ignore the sections outside the region and output the
-used sections in the frame metadata.<wbr/></p>
+outside the used <a href="#controls_android.scaler.cropRegion">android.<wbr/>scaler.<wbr/>crop<wbr/>Region</a> returned in capture result metadata,<wbr/>
+the camera device will ignore the sections outside the region and output the
+used sections in the result metadata.<wbr/></p>
</td>
</tr>
should be nonnegative.<wbr/></p>
<p>If all regions have 0 weight,<wbr/> then no specific focus area
needs to be used by the camera device.<wbr/> If the focusing region is
-outside the current <a href="#controls_android.scaler.cropRegion">android.<wbr/>scaler.<wbr/>crop<wbr/>Region</a>,<wbr/> the camera device
-will ignore the sections outside the region and output the
-used sections in the frame metadata.<wbr/></p>
+outside the used <a href="#controls_android.scaler.cropRegion">android.<wbr/>scaler.<wbr/>crop<wbr/>Region</a> returned in capture
+result metadata,<wbr/> the camera device will ignore the sections outside
+the region and output the used sections in the result metadata.<wbr/></p>
</td>
</tr>
should be nonnegative.<wbr/></p>
<p>If all regions have 0 weight,<wbr/> then no specific auto-white balance (AWB) area
needs to be used by the camera device.<wbr/> If the AWB region is
-outside the current <a href="#controls_android.scaler.cropRegion">android.<wbr/>scaler.<wbr/>crop<wbr/>Region</a>,<wbr/> the camera device
-will ignore the sections outside the region and output the
-used sections in the frame metadata.<wbr/></p>
+outside the used <a href="#controls_android.scaler.cropRegion">android.<wbr/>scaler.<wbr/>crop<wbr/>Region</a> returned in capture result metadata,<wbr/>
+the camera device will ignore the sections outside the region and output the
+used sections in the result metadata.<wbr/></p>
</td>
</tr>
should be nonnegative.<wbr/></p>
<p>If all regions have 0 weight,<wbr/> then no specific metering area
needs to be used by the camera device.<wbr/> If the metering region is
-outside the current <a href="#controls_android.scaler.cropRegion">android.<wbr/>scaler.<wbr/>crop<wbr/>Region</a>,<wbr/> the camera device
-will ignore the sections outside the region and output the
-used sections in the frame metadata.<wbr/></p>
+outside the used <a href="#controls_android.scaler.cropRegion">android.<wbr/>scaler.<wbr/>crop<wbr/>Region</a> returned in capture result metadata,<wbr/>
+the camera device will ignore the sections outside the region and output the
+used sections in the result metadata.<wbr/></p>
</td>
</tr>
should be nonnegative.<wbr/></p>
<p>If all regions have 0 weight,<wbr/> then no specific focus area
needs to be used by the camera device.<wbr/> If the focusing region is
-outside the current <a href="#controls_android.scaler.cropRegion">android.<wbr/>scaler.<wbr/>crop<wbr/>Region</a>,<wbr/> the camera device
-will ignore the sections outside the region and output the
-used sections in the frame metadata.<wbr/></p>
+outside the used <a href="#controls_android.scaler.cropRegion">android.<wbr/>scaler.<wbr/>crop<wbr/>Region</a> returned in capture
+result metadata,<wbr/> the camera device will ignore the sections outside
+the region and output the used sections in the result metadata.<wbr/></p>
</td>
</tr>
should be nonnegative.<wbr/></p>
<p>If all regions have 0 weight,<wbr/> then no specific auto-white balance (AWB) area
needs to be used by the camera device.<wbr/> If the AWB region is
-outside the current <a href="#controls_android.scaler.cropRegion">android.<wbr/>scaler.<wbr/>crop<wbr/>Region</a>,<wbr/> the camera device
-will ignore the sections outside the region and output the
-used sections in the frame metadata.<wbr/></p>
+outside the used <a href="#controls_android.scaler.cropRegion">android.<wbr/>scaler.<wbr/>crop<wbr/>Region</a> returned in capture result metadata,<wbr/>
+the camera device will ignore the sections outside the region and output the
+used sections in the result metadata.<wbr/></p>
</td>
</tr>
</tr>
<tr class="entry_cont">
<td class="entry_details" colspan="5">
- <p>Any additional per-stream cropping must be done to
-maximize the final pixel area of the stream.<wbr/></p>
+ <p>The crop region is applied after the RAW to other color space (e.<wbr/>g.<wbr/> YUV)
+conversion.<wbr/> Since raw streams (e.<wbr/>g.<wbr/> RAW16) don't have the conversion stage,<wbr/>
+they are not croppable;<wbr/> the crop region will be ignored by raw streams.<wbr/></p>
+<p>For non-raw streams,<wbr/> any additional per-stream cropping will
+be done to maximize the final pixel area of the stream.<wbr/></p>
<p>For example,<wbr/> if the crop region is set to a 4:3 aspect
ratio,<wbr/> then 4:3 streams should use the exact crop
region.<wbr/> 16:9 streams should further crop vertically
<tr class="entry_spacer"><td class="entry_spacer" colspan="6"></td></tr>
<!-- end of entry -->
+
+ <tr class="entry" id="static_android.scaler.croppingType">
+ <td class="entry_name
+ " rowspan="3">
+ android.<wbr/>scaler.<wbr/>cropping<wbr/>Type
+ </td>
+ <td class="entry_type">
+ <span class="entry_type_name entry_type_name_enum">byte</span>
+
+ <span class="entry_type_visibility"> [public]</span>
+
+
+
+
+ <ul class="entry_type_enum">
+ <li>
+ <span class="entry_type_enum_name">CENTER_ONLY</span>
+ <span class="entry_type_enum_notes"><p>The camera device will only support centered crop regions.<wbr/></p></span>
+ </li>
+ <li>
+ <span class="entry_type_enum_name">FREEFORM</span>
+ <span class="entry_type_enum_notes"><p>The camera device will support arbitrarily chosen crop regions.<wbr/></p></span>
+ </li>
+ </ul>
+
+ </td> <!-- entry_type -->
+
+ <td class="entry_description">
+ <p>The crop type that this camera device supports.<wbr/></p>
+ </td>
+
+ <td class="entry_units">
+ </td>
+
+ <td class="entry_range">
+ </td>
+
+ <td class="entry_tags">
+ </td>
+
+ </tr>
+ <tr class="entries_header">
+ <th class="th_details" colspan="5">Details</th>
+ </tr>
+ <tr class="entry_cont">
+ <td class="entry_details" colspan="5">
+ <p>When passing a non-centered crop region (<a href="#controls_android.scaler.cropRegion">android.<wbr/>scaler.<wbr/>crop<wbr/>Region</a>) to a camera
+device that only supports CENTER_<wbr/>ONLY cropping,<wbr/> the camera device will move the
+crop region to the center of the sensor active array (<a href="#static_android.sensor.info.activeArraySize">android.<wbr/>sensor.<wbr/>info.<wbr/>active<wbr/>Array<wbr/>Size</a>)
+and keep the crop region width and height unchanged.<wbr/> The camera device will return the
+final used crop region in metadata result <a href="#controls_android.scaler.cropRegion">android.<wbr/>scaler.<wbr/>crop<wbr/>Region</a>.<wbr/></p>
+<p>Camera devices that support FREEFORM cropping will support any crop region that
+is inside of the active array.<wbr/> The camera device will apply the same crop region and
+return the final used crop region in capture result metadata <a href="#controls_android.scaler.cropRegion">android.<wbr/>scaler.<wbr/>crop<wbr/>Region</a>.<wbr/></p>
+<p>FULL capability devices (<a href="#static_android.info.supportedHardwareLevel">android.<wbr/>info.<wbr/>supported<wbr/>Hardware<wbr/>Level</a> <code>==</code> FULL) will support
+FREEFORM cropping.<wbr/></p>
+ </td>
+ </tr>
+
+
+ <tr class="entry_spacer"><td class="entry_spacer" colspan="6"></td></tr>
+ <!-- end of entry -->
+
<!-- end of kind -->
</tr>
<tr class="entry_cont">
<td class="entry_details" colspan="5">
- <p>Any additional per-stream cropping must be done to
-maximize the final pixel area of the stream.<wbr/></p>
+ <p>The crop region is applied after the RAW to other color space (e.<wbr/>g.<wbr/> YUV)
+conversion.<wbr/> Since raw streams (e.<wbr/>g.<wbr/> RAW16) don't have the conversion stage,<wbr/>
+they are not croppable;<wbr/> the crop region will be ignored by raw streams.<wbr/></p>
+<p>For non-raw streams,<wbr/> any additional per-stream cropping will
+be done to maximize the final pixel area of the stream.<wbr/></p>
<p>For example,<wbr/> if the crop region is set to a 4:3 aspect
ratio,<wbr/> then 4:3 streams should use the exact crop
region.<wbr/> 16:9 streams should further crop vertically
camera device.<wbr/> Applications can request lens shading map data by setting
<a href="#controls_android.statistics.lensShadingMapMode">android.<wbr/>statistics.<wbr/>lens<wbr/>Shading<wbr/>Map<wbr/>Mode</a> to ON,<wbr/> and then the camera device will provide
lens shading map data in <a href="#dynamic_android.statistics.lensShadingMap">android.<wbr/>statistics.<wbr/>lens<wbr/>Shading<wbr/>Map</a>,<wbr/> with size specified
-by <a href="#static_android.lens.info.shadingMapSize">android.<wbr/>lens.<wbr/>info.<wbr/>shading<wbr/>Map<wbr/>Size</a>.<wbr/></p>
+by <a href="#static_android.lens.info.shadingMapSize">android.<wbr/>lens.<wbr/>info.<wbr/>shading<wbr/>Map<wbr/>Size</a>; the returned shading map data will be the one
+applied by the camera device for this capture request.<wbr/></p>
+<p>The shading map data may depend on the AE and AWB statistics,<wbr/> so the reliability
+of the map data may be affected by the AE and AWB algorithms.<wbr/> When AE and AWB are in
+AUTO modes (<a href="#controls_android.control.aeMode">android.<wbr/>control.<wbr/>ae<wbr/>Mode</a> <code>!=</code> OFF and <a href="#controls_android.control.awbMode">android.<wbr/>control.<wbr/>awb<wbr/>Mode</a> <code>!=</code> OFF),<wbr/>
+it is recommended that applications wait for AE and AWB to converge before
+using the returned shading map data,<wbr/> to get the best results.<wbr/></p>
</td>
</tr>
camera device.<wbr/> Applications can request lens shading map data by setting
<a href="#controls_android.statistics.lensShadingMapMode">android.<wbr/>statistics.<wbr/>lens<wbr/>Shading<wbr/>Map<wbr/>Mode</a> to ON,<wbr/> and then the camera device will provide
lens shading map data in <a href="#dynamic_android.statistics.lensShadingMap">android.<wbr/>statistics.<wbr/>lens<wbr/>Shading<wbr/>Map</a>,<wbr/> with size specified
-by <a href="#static_android.lens.info.shadingMapSize">android.<wbr/>lens.<wbr/>info.<wbr/>shading<wbr/>Map<wbr/>Size</a>.<wbr/></p>
+by <a href="#static_android.lens.info.shadingMapSize">android.<wbr/>lens.<wbr/>info.<wbr/>shading<wbr/>Map<wbr/>Size</a>; the returned shading map data will be the one
+applied by the camera device for this capture request.<wbr/></p>
+<p>The shading map data may depend on the AE and AWB statistics,<wbr/> so the reliability
+of the map data may be affected by the AE and AWB algorithms.<wbr/> When AE and AWB are in
+AUTO modes (<a href="#controls_android.control.aeMode">android.<wbr/>control.<wbr/>ae<wbr/>Mode</a> <code>!=</code> OFF and <a href="#controls_android.control.awbMode">android.<wbr/>control.<wbr/>awb<wbr/>Mode</a> <code>!=</code> OFF),<wbr/>
+it is recommended that applications wait for AE and AWB to converge before
+using the returned shading map data,<wbr/> to get the best results.<wbr/></p>
</td>
</tr>
<tr class="entry" id="dynamic_android.statistics.lensShadingMap">
<td class="entry_name
- " rowspan="3">
+ " rowspan="5">
android.<wbr/>statistics.<wbr/>lens<wbr/>Shading<wbr/>Map
</td>
<td class="entry_type">
</td>
</tr>
+ <tr class="entries_header">
+ <th class="th_details" colspan="5">HAL Implementation Details</th>
+ </tr>
+ <tr class="entry_cont">
+ <td class="entry_details" colspan="5">
+ <p>The lens shading map calculation may depend on exposure and white balance statistics.<wbr/>
+When AE and AWB are in AUTO modes
+(<a href="#controls_android.control.aeMode">android.<wbr/>control.<wbr/>ae<wbr/>Mode</a> <code>!=</code> OFF and <a href="#controls_android.control.awbMode">android.<wbr/>control.<wbr/>awb<wbr/>Mode</a> <code>!=</code> OFF),<wbr/> the HAL
+may have all the information it needs to generate the most accurate lens shading map.<wbr/> When
+AE or AWB is in manual mode
+(<a href="#controls_android.control.aeMode">android.<wbr/>control.<wbr/>ae<wbr/>Mode</a> <code>==</code> OFF or <a href="#controls_android.control.awbMode">android.<wbr/>control.<wbr/>awb<wbr/>Mode</a> <code>==</code> OFF),<wbr/> the shading map
+may be adversely impacted by manual exposure or white balance parameters.<wbr/> To avoid
+generating unreliable shading map data,<wbr/> the HAL may choose to lock the shading map with
+the latest known good map generated when the AE and AWB are in AUTO modes.<wbr/></p>
+ </td>
+ </tr>
<tr class="entry_spacer"><td class="entry_spacer" colspan="6"></td></tr>
<!-- end of entry -->
<a href="#controls_android.tonemap.curveGreen">android.<wbr/>tonemap.<wbr/>curve<wbr/>Green</a>,<wbr/> and <a href="#controls_android.tonemap.curveBlue">android.<wbr/>tonemap.<wbr/>curve<wbr/>Blue</a>.<wbr/>
These values are always available,<wbr/> and as close as possible to the
actually used nonlinear/<wbr/>nonglobal transforms.<wbr/></p>
-<p>If a request is sent with TRANSFORM_<wbr/>MATRIX with the camera device's
+<p>If a request is sent with CONTRAST_<wbr/>CURVE with the camera device's
provided curve in FAST or HIGH_<wbr/>QUALITY,<wbr/> the image's tonemap will be
roughly the same.<wbr/></p>
</td>
<a href="#controls_android.tonemap.curveGreen">android.<wbr/>tonemap.<wbr/>curve<wbr/>Green</a>,<wbr/> and <a href="#controls_android.tonemap.curveBlue">android.<wbr/>tonemap.<wbr/>curve<wbr/>Blue</a>.<wbr/>
These values are always available,<wbr/> and as close as possible to the
actually used nonlinear/<wbr/>nonglobal transforms.<wbr/></p>
-<p>If a request is sent with TRANSFORM_<wbr/>MATRIX with the camera device's
+<p>If a request is sent with CONTRAST_<wbr/>CURVE with the camera device's
provided curve in FAST or HIGH_<wbr/>QUALITY,<wbr/> the image's tonemap will be
roughly the same.<wbr/></p>
</td>
If all regions have 0 weight, then no specific metering area
needs to be used by the camera device. If the metering region is
- outside the current android.scaler.cropRegion, the camera device
- will ignore the sections outside the region and output the
- used sections in the frame metadata.</details>
+ outside the used android.scaler.cropRegion returned in capture result metadata,
+ the camera device will ignore the sections outside the region and output the
+ used sections in the result metadata.</details>
<tag id="BC" />
</entry>
<entry name="aeTargetFpsRange" type="int32" visibility="public"
If all regions have 0 weight, then no specific focus area
needs to be used by the camera device. If the focusing region is
- outside the current android.scaler.cropRegion, the camera device
- will ignore the sections outside the region and output the
- used sections in the frame metadata.</details>
+ outside the used android.scaler.cropRegion returned in capture
+ result metadata, the camera device will ignore the sections outside
+ the region and output the used sections in the result metadata.</details>
<tag id="BC" />
</entry>
<entry name="afTrigger" type="byte" visibility="public" enum="true">
If all regions have 0 weight, then no specific auto-white balance (AWB) area
needs to be used by the camera device. If the AWB region is
- outside the current android.scaler.cropRegion, the camera device
- will ignore the sections outside the region and output the
- used sections in the frame metadata.
+ outside the used android.scaler.cropRegion returned in capture result metadata,
+ the camera device will ignore the sections outside the region and output the
+ used sections in the result metadata.
</details>
<tag id="BC" />
</entry>
in pixels; (0,0) is top-left corner of
android.sensor.activeArraySize</units>
<details>
- Any additional per-stream cropping must be done to
- maximize the final pixel area of the stream.
+ The crop region is applied after the RAW to other color space (e.g. YUV)
+ conversion. Since raw streams (e.g. RAW16) don't have the conversion stage,
+ they are not croppable; the crop region will be ignored by raw streams.
+
+ For non-raw streams, any additional per-stream cropping will
+ be done to maximize the final pixel area of the stream.
For example, if the crop region is set to a 4:3 aspect
ratio, then 4:3 streams should use the exact crop
* available[Processed,Raw,Jpeg]Sizes
</hal_details>
</entry>
+ <entry name="croppingType" type="byte" visibility="public" enum="true">
+ <enum>
+ <value>CENTER_ONLY
+ <notes>
+ The camera device will only support centered crop regions.
+ </notes>
+ </value>
+ <value>FREEFORM
+ <notes>
+ The camera device will support arbitrarily chosen crop regions.
+ </notes>
+ </value>
+ </enum>
+ <description>The crop type that this camera device supports.</description>
+ <details>
+ When passing a non-centered crop region (android.scaler.cropRegion) to a camera
+ device that only supports CENTER_ONLY cropping, the camera device will move the
+ crop region to the center of the sensor active array (android.sensor.info.activeArraySize)
+ and keep the crop region width and height unchanged. The camera device will return the
+ final used crop region in metadata result android.scaler.cropRegion.
+
+ Camera devices that support FREEFORM cropping will support any crop region that
+ is inside of the active array. The camera device will apply the same crop region and
+ return the final used crop region in capture result metadata android.scaler.cropRegion.
+
+ FULL capability devices (android.info.supportedHardwareLevel `==` FULL) will support
+ FREEFORM cropping.
+ </details>
+ </entry>
</static>
</section>
<section name="sensor">
camera device. Applications can request lens shading map data by setting
android.statistics.lensShadingMapMode to ON, and then the camera device will provide
lens shading map data in android.statistics.lensShadingMap, with size specified
- by android.lens.info.shadingMapSize.
+ by android.lens.info.shadingMapSize; the returned shading map data will be the one
+ applied by the camera device for this capture request.
+
+ The shading map data may depend on the AE and AWB statistics, so the reliability
+ of the map data may be affected by the AE and AWB algorithms. When AE and AWB are in
+ AUTO modes (android.control.aeMode `!=` OFF and android.control.awbMode `!=` OFF),
+ it is recommended that applications wait for AE and AWB to converge before using
+ the returned shading map data, to get the best results.
</details>
</entry>
<entry name="strength" type="byte">
![Image of a uniform white wall (inverse shading map)](android.statistics.lensShadingMap/inv_shading.png)
</details>
+ <hal_details>
+ The lens shading map calculation may depend on exposure and white balance statistics.
+ When AE and AWB are in AUTO modes
+ (android.control.aeMode `!=` OFF and android.control.awbMode `!=` OFF), the HAL
+ may have all the information it needs to generate the most accurate lens shading map. When
+ AE or AWB is in manual mode
+ (android.control.aeMode `==` OFF or android.control.awbMode `==` OFF), the shading map
+ may be adversely impacted by manual exposure or white balance parameters. To avoid
+ generating unreliable shading map data, the HAL may choose to lock the shading map with
+ the latest known good map generated when the AE and AWB are in AUTO modes.
+ </hal_details>
</entry>
<entry name="predictedColorGains" type="float"
visibility="hidden"
These values are always available, and as close as possible to the
actually used nonlinear/nonglobal transforms.
- If a request is sent with TRANSFORM_MATRIX with the camera device's
+ If a request is sent with CONTRAST_CURVE with the camera device's
provided curve in FAST or HIGH_QUALITY, the image's tonemap will be
roughly the same.</details>
</entry>
* See the License for the specific language governing permissions and
* limitations under the License.
*/
-#define _GNU_SOURCE // for fdprintf
+
#include <inttypes.h>
#include <system/camera_metadata.h>
#include <camera_metadata_hidden.h>
int verbosity,
int indentation) {
if (metadata == NULL) {
- fdprintf(fd, "%*sDumping camera metadata array: Not allocated\n",
+ dprintf(fd, "%*sDumping camera metadata array: Not allocated\n",
indentation, "");
return;
}
unsigned int i;
- fdprintf(fd,
+ dprintf(fd,
"%*sDumping camera metadata array: %zu / %zu entries, "
"%zu / %zu bytes of extra data.\n", indentation, "",
metadata->entry_count, metadata->entry_capacity,
metadata->data_count, metadata->data_capacity);
- fdprintf(fd, "%*sVersion: %d, Flags: %08x\n",
+ dprintf(fd, "%*sVersion: %d, Flags: %08x\n",
indentation + 2, "",
metadata->version, metadata->flags);
camera_metadata_buffer_entry_t *entry = get_entries(metadata);
} else {
type_name = camera_metadata_type_names[entry->type];
}
- fdprintf(fd, "%*s%s.%s (%05x): %s[%zu]\n",
+ dprintf(fd, "%*s%s.%s (%05x): %s[%zu]\n",
indentation + 2, "",
tag_section,
tag_name,
int index = 0;
int j, k;
for (j = 0; j < lines; j++) {
- fdprintf(fd, "%*s[", indentation + 4, "");
+ dprintf(fd, "%*s[", indentation + 4, "");
for (k = 0;
k < values_per_line[type] && count > 0;
k++, count--, index += type_size) {
value_string_tmp,
sizeof(value_string_tmp))
== OK) {
- fdprintf(fd, "%s ", value_string_tmp);
+ dprintf(fd, "%s ", value_string_tmp);
} else {
- fdprintf(fd, "%hhu ",
+ dprintf(fd, "%hhu ",
*(data_ptr + index));
}
break;
value_string_tmp,
sizeof(value_string_tmp))
== OK) {
- fdprintf(fd, "%s ", value_string_tmp);
+ dprintf(fd, "%s ", value_string_tmp);
} else {
- fdprintf(fd, "%" PRId32 " ",
+ dprintf(fd, "%" PRId32 " ",
*(int32_t*)(data_ptr + index));
}
break;
case TYPE_FLOAT:
- fdprintf(fd, "%0.8f ",
+ dprintf(fd, "%0.8f ",
*(float*)(data_ptr + index));
break;
case TYPE_INT64:
- fdprintf(fd, "%" PRId64 " ",
+ dprintf(fd, "%" PRId64 " ",
*(int64_t*)(data_ptr + index));
break;
case TYPE_DOUBLE:
- fdprintf(fd, "%0.8f ",
+ dprintf(fd, "%0.8f ",
*(double*)(data_ptr + index));
break;
case TYPE_RATIONAL: {
int32_t numerator = *(int32_t*)(data_ptr + index);
int32_t denominator = *(int32_t*)(data_ptr + index + 4);
- fdprintf(fd, "(%d / %d) ",
+ dprintf(fd, "(%d / %d) ",
numerator, denominator);
break;
}
default:
- fdprintf(fd, "??? ");
+ dprintf(fd, "??? ");
}
}
- fdprintf(fd, "]\n");
+ dprintf(fd, "]\n");
}
}