Handling Camera and Album UIImages on iOS
In an upcoming release of Tabris we will provide camera support. This will allow you to show a UI on the client to take a photo or select a picture from the device. The selected image is then scaled and sent to the server so you can use it in your app.
While it might seem a simple task to get a UIImage object on the device and create a JPEG or PNG to upload to the server, you need to be aware of one important detail: the images are stored as “raw” images. This means that “modifications” like rotation, cropping, etc. are not applied to the image data itself - they are stored as separate metadata to enable a non-destructive workflow on the iOS device.
Four steps are required to prepare a UIImage for upload. I have provided the implementation as a Gist on GitHub. First, you need to fix the orientation; this also includes mirroring the image. [1] In the second step, the image is scaled down to the maximum size the server expects, so we don‘t have to upload the full-resolution image and scale it on the server - this saves upload bandwidth. [2] The third step creates the actual image data to be uploaded to the server. [3] In the last (optional) step, the image is stored in the photo album of the device. [4]
[1] Fix Orientation
ImageHelper.m +(UIImage *)fixOrientation:(UIImage *)image;
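A minimal sketch of how such a method can be implemented (this is one common approach - redrawing the image into a graphics context, which bakes the orientation metadata into the pixels - and not necessarily the exact code from the Gist):

```objc
+ (UIImage *)fixOrientation:(UIImage *)image {
    if (image.imageOrientation == UIImageOrientationUp) {
        return image; // pixels already upright, nothing to do
    }
    // Drawing the image applies its orientation (and mirroring) metadata,
    // so the bitmap we read back is normalized to "up".
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}
```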
[2] Scale
ImageHelper.m +(UIImage *)restrictImage:(UIImage *)image toSize:(CGSize)maxSize;
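A possible implementation sketch for proportional downscaling (again an illustration of the technique, not necessarily the Gist's exact code):

```objc
+ (UIImage *)restrictImage:(UIImage *)image toSize:(CGSize)maxSize {
    CGSize size = image.size;
    if (size.width <= maxSize.width && size.height <= maxSize.height) {
        return image; // already within the limit, keep it unchanged
    }
    // Scale both dimensions by the same factor so the image fits maxSize
    // without distorting its aspect ratio.
    CGFloat ratio = MIN(maxSize.width / size.width, maxSize.height / size.height);
    CGSize target = CGSizeMake(size.width * ratio, size.height * ratio);
    UIGraphicsBeginImageContextWithOptions(target, NO, 1.0);
    [image drawInRect:CGRectMake(0, 0, target.width, target.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaled;
}
```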
[3] Create the image
NSData *jpegData = UIImageJPEGRepresentation(imageToSend, 1.0); NSData *pngData = UIImagePNGRepresentation(imageToSend);
[4] Store on the device
UIImageWriteToSavedPhotosAlbum([UIImage imageWithData:jpegData], nil, nil, nil);
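Putting the four steps together, the whole pipeline might look like this (the maximum size of 1024×1024 and the JPEG quality of 0.9 are example values, not requirements):

```objc
// originalImage is the UIImage from the camera or photo picker.
UIImage *upright = [ImageHelper fixOrientation:originalImage];        // [1]
UIImage *imageToSend = [ImageHelper restrictImage:upright
                                           toSize:CGSizeMake(1024, 1024)]; // [2]
NSData *jpegData = UIImageJPEGRepresentation(imageToSend, 0.9);       // [3]
UIImageWriteToSavedPhotosAlbum([UIImage imageWithData:jpegData],
                               nil, nil, nil);                        // [4]
```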
If this is useful for you or you have any suggestions, drop a line in the comments below.