Why do videos selected with UIImagePickerController using the high and medium quality settings come out with exactly the same video attributes, at least on devices like the iPhone 4 and iPad 3?
More details:
We use UIImagePickerController so that users of our application can select images or videos from the photo library and then upload them to our servers. We let users choose high, medium, or low video quality, which we map directly to the constants UIImagePickerControllerQualityTypeHigh, UIImagePickerControllerQualityTypeMedium, and UIImagePickerControllerQualityTypeLow.
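The selection side is essentially just an assignment to the picker's videoQuality property. A minimal sketch of what our setup looks like (the AppVideoQuality enum and helper names are ours, for illustration; the presenting object is assumed to be a view controller that adopts the picker's delegate protocols):

```objc
#import <UIKit/UIKit.h>
#import <MobileCoreServices/MobileCoreServices.h> // for kUTTypeMovie

// Our app's three-level quality choice (hypothetical names).
typedef enum {
    kAppVideoQualityLow,
    kAppVideoQualityMedium,
    kAppVideoQualityHigh
} AppVideoQuality;

// Direct one-to-one mapping onto UIImagePickerController's constants.
static UIImagePickerControllerQualityType
QualityTypeForAppQuality(AppVideoQuality quality)
{
    switch (quality) {
        case kAppVideoQualityLow:    return UIImagePickerControllerQualityTypeLow;
        case kAppVideoQualityMedium: return UIImagePickerControllerQualityTypeMedium;
        case kAppVideoQualityHigh:
        default:                     return UIImagePickerControllerQualityTypeHigh;
    }
}

// Present the photo library restricted to videos, at the chosen quality.
- (void)presentVideoPickerWithQuality:(AppVideoQuality)quality
{
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType   = UIImagePickerControllerSourceTypePhotoLibrary;
    picker.mediaTypes   = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
    picker.videoQuality = QualityTypeForAppQuality(quality);
    picker.delegate     = self;
    [self presentViewController:picker animated:YES completion:nil];
}
```

Note that videoQuality governs the transcoding the picker applies when a video is chosen (or recorded); the results below are what that transcoding produces for each setting.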
When we select a video of 10 seconds or so on an iPhone 3GS (iOS 5.0), shot with the built-in camera app (outside our application), we see a clear difference for each quality setting, for example:
- low: 226 KB at 144x192, codec: AAC, H.264
- medium: 1.1 MB at 360x480, codec: AAC, H.264
- high: 5 MB at 480x640, codec: AAC, H.264
When we try the same thing on an iPhone 4 or iPad 3 (these are just the devices we have on hand; I'm not sure the problem is limited to them), the low setting still produces a correspondingly low-resolution result, but the high and medium settings give us identical results, something like this:
- low: 194 KB at 144x192, codec: AAC, H.264
- medium: 2.87 MB at 720x1280, codec: AAC, H.264
- high: 2.87 MB at 720x1280, codec: AAC, H.264
(Note that medium and high results are identical.)
The original video on the device is 12.8 MB at 720x1280, codec: AAC, H.264, with a higher bitrate.
Can anyone explain what is going on here? I would like to be able to explain this to our customers, or better yet, point to something in Apple's documentation that covers it.
Thanks in advance for your help ...