custom UIImagePickerController camera view

January 12, 2009

WARNING: While there are many apps (including some of mine) that use this technique, you should know that some new apps and updates to existing apps have been rejected recently (April 2009). Please read the comments. So far I don’t know of any instance where a developer successfully argued against Apple’s decision after a rejection.

With all that said, there’s still an outpouring of apps that use this technique… so the decision is yours.

Here is some information about inspecting and customizing the UIImagePickerController camera view. You can download the working Xcode project with all the source code here: customImagePicker

I wanted to remove the top part of the interface (gray background and “Take Photo” label) for Mean Photo and Nice Photo (version 1.2+). There is also a very annoying image shift between the camera view and the preview that I thought would be nice to fix.

You can download the customImagePicker Xcode project with all the source code that demonstrates these techniques. Run it on your iPhone in debug mode and check the log output for information.

CustomImagePicker is a subclass of UIImagePickerController. I override the viewDidAppear: method; when this method is called, the view is ready to be customized (this approach was suggested in netsharc’s post here.)

-(void) viewDidAppear: (BOOL)animated {
	// make sure to call the same method on the super class
	[super viewDidAppear:animated];

	/* ... customize view here ... */
}

The inspectView:depth:path: method recursively walks the view hierarchy and prints information about the views to the log: the class description, the position and size, and the path (more about the path in a bit). If you are interested in other information about the views, you can extend this method to print it. Here’s how the method is invoked:

[self inspectView:self.view depth:0 path:@""];
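
If you don’t want to dig into the project just to see what the walker looks like, here is a minimal sketch of a recursive inspector along those lines (an illustration of the idea, not the exact method from customImagePicker):

-(void) inspectView:(UIView *)view depth:(int)depth path:(NSString *)path {
    // indent the output so the nesting is readable in the log
    NSMutableString *indent = [NSMutableString string];
    for (int i = 0; i < depth; i++) [indent appendString:@"  "];

    NSLog(@"%@--subview-- %@", indent, [path length] ? path : @"(root)");
    NSLog(@"%@  .description: %@", indent, [view description]);
    NSLog(@"%@  .frame: %.0f, %.0f, %.0f, %.0f", indent,
          view.frame.origin.x, view.frame.origin.y,
          view.frame.size.width, view.frame.size.height);
    NSLog(@"%@  .subviews: %d", indent, (int)[[view subviews] count]);

    // recurse into each subview, appending its index to the path
    for (int i = 0; i < (int)[[view subviews] count]; i++) {
        [self inspectView:[[view subviews] objectAtIndex:i]
                    depth:depth + 1
                     path:[NSString stringWithFormat:@"%@/%d", path, i]];
    }
}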

You can uncomment the inspectView call in viewDidAppear:, but it’s also called from previewCheck every five seconds, so you can monitor the view hierarchy as it changes between the different states. So for example, if you are interested in what the view hierarchy looks like in preview mode, just take a photo and wait until the next dump appears in the log. The output looks something like this:

 .description: UILayoutContainerView: 0x125b90
 .frame: 0, 0, 320, 460
 .subviews: 2

 --subview-- /0
   .description: UITransitionView: 0x1205a0
   .frame: 0, 0, 320, 460
   .subviews: 1

   --subview-- /0/0
     .description: UIView: 0x125da0
     .frame: 0, 0, 320, 460
     .subviews: 1

     --subview-- /0/0/0
       .description: PLCameraView: 0x125e50
       .frame: 0, 0, 320, 460
       .subviews: 4

       --subview-- /0/0/0/0
         .description: UIView: 0x1264c0
         .frame: 0, 0, 320, 427
         .subviews: 0

       --subview-- /0/0/0/1
         .description: UIImageView: 0x128850
         .class: UIImageView
         .frame: 10000, 10000, 320, 480
         .subviews: 0

       --subview-- /0/0/0/2
         .description: UIView: 0x11b200
         .frame: 0, 0, 320, 33
         .subviews: 0

       --subview-- /0/0/0/3
         .description: PLCropOverlay: 0x127dc0
         .frame: 0, 0, 320, 460
         .subviews: 3

         --subview-- /0/0/0/3/0
           .description: UIImageView: 0x12b2f0
           .class: UIImageView
           .frame: 0, 0, 320, 96
           .subviews: 0

         --subview-- /0/0/0/3/1
           .description: PLCropLCDLayer: 0x12b350
           .frame: 0, 0, 320, 96
           .subviews: 0

         --subview-- /0/0/0/3/2
           .description: TPBottomDualButtonBar: 0x12b5b0
           .frame: 0, 0, 320, 96
           .subviews: 2

           --subview-- /0/0/0/3/2/0
             .description: TPPushButton: 0x12ba40
             .frame: 22, 22, 128, 47
             .subviews: 0

           --subview-- /0/0/0/3/2/1
             .description: TPCameraPushButton: 0x12df10
             .frame: 170, 170, 128, 47
             .subviews: 1

             --subview-- /0/0/0/3/2/1/0
               .description: UIImageView: 0x12e960
               .class: UIImageView
               .frame: 51, 51, 26, 19
               .subviews: 0

 --subview-- /1
   .description: UINavigationBar: 0x125a30
   .frame: 0, 0, 320, 44
   .subviews: 1

   --subview-- /1/0
     .description: UINavigationItemView: 0x11abe0
     .frame: 160, 160, 0, 27
     .subviews: 0

Confusing? It’s actually pretty simple (although not very pretty.)

Next to each subview you can see its path. For example, /0/0/0/3 means that it is subview #3 of subview #0 of subview #0 of subview #0. To look it up in the view hierarchy, just do this:

    UIView *theView = [[[[[[[[self.view subviews] objectAtIndex:0]
                                        subviews] objectAtIndex:0]
                                        subviews] objectAtIndex:0]
                                        subviews] objectAtIndex:3];

See the numbers next to objectAtIndex:? It’s /0/0/0/3… the path… So now you can look up any UIView from the hierarchy and modify it!
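
If the nested objectAtIndex: calls get hard to read, you could wrap the lookup in a small helper that walks a path string. This is a hypothetical convenience method (the name viewAtPath:fromView: is mine, it is not part of the customImagePicker project):

-(UIView *) viewAtPath:(NSString *)path fromView:(UIView *)root {
    // walks a path like @"/0/0/0/3"; returns nil if any index is out of bounds
    UIView *view = root;
    for (NSString *component in [path componentsSeparatedByString:@"/"]) {
        if ([component length] == 0) continue;   // skip the empty piece before the leading "/"
        NSInteger index = [component integerValue];
        if (index >= (NSInteger)[[view subviews] count]) return nil;
        view = [[view subviews] objectAtIndex:index];
    }
    return view;
}

With that helper, the lookup above becomes:

    UIView *theView = [self viewAtPath:@"/0/0/0/3" fromView:self.view];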

Even if the description shows that the class is part of the private iPhone libraries (e.g. PLCropLCDLayer), it must be a subclass of UIView to be in the hierarchy. We don’t know (or at least are not supposed to know) what methods the private library classes have, but they do have all the methods and properties of UIView. So we can make theView transparent or hidden like this (this is actually the path for the UI above the camera preview):

    [theView setAlpha:0.5];    // semi transparent
    [theView setHidden:YES];    // hidden

I wanted to get rid of the gray bar and “Take Photo” label on top. The path for the gray background is /0/0/0/3/0, the label is /0/0/0/3/1. I look these up and animate their opacity (alpha) to 0:

    UIImageView *overlay = [[[[[[[[[[self.view subviews] objectAtIndex:0]
                                               subviews] objectAtIndex:0]
                                               subviews] objectAtIndex:0]
                                               subviews] objectAtIndex:3]
                                               subviews] objectAtIndex:0];

    UIView *label = [[[[[[[[[[self.view subviews] objectAtIndex:0]
                                        subviews] objectAtIndex:0]
                                        subviews] objectAtIndex:0]
                                        subviews] objectAtIndex:3]
                                        subviews] objectAtIndex:1];

    // animate their visibility (alpha) to 0 with a simple UIView animation
    //
    [UIView beginAnimations:nil context:nil];
    [overlay setAlpha:0.0];
    [label setAlpha:0.0];
    [UIView commitAnimations];

Here’s what it looks like in Nice Photo. The LOVE graphic is added in a separate view on top of the camera view.


Nice Photo App with Camera UI tweaked

The camera view is /0/0/0/0. To make it semi-transparent (not sure why you would want to do this, but it shows how to look it up):

    UIView *cameraView = [[[[[[[[self.view subviews] objectAtIndex:0]
                                           subviews] objectAtIndex:0]
                                           subviews] objectAtIndex:0]
                                           subviews] objectAtIndex:0];
    [cameraView setAlpha:0.5];

You might be interested in button push events. The buttons are subclasses of UIControl, so you can easily add an action to them. For example to add an action to the camera button (TPCameraPushButton at /0/0/0/3/2/1):

    UIControl *captureButton = [[[[[[[[[[[[self.view subviews] objectAtIndex:0]
                                                     subviews] objectAtIndex:0]
                                                     subviews] objectAtIndex:0]
                                                     subviews] objectAtIndex:3]
                                                     subviews] objectAtIndex:2]
                                                     subviews] objectAtIndex:1];

    [captureButton addTarget:self action:@selector(captureButtonAction:)
        forControlEvents:UIControlEventTouchUpInside];

(The captureButtonAction: method prints a message to the log when you tap the camera button in customImagePicker.)
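
The action method itself can be as simple as something like this (the version in the project may differ, but the idea is just to log the tap):

-(void) captureButtonAction:(id)sender {
    // our action is added alongside the button's original actions,
    // so the photo is still taken as usual
    NSLog(@"capture button tapped: %@", sender);
}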

I wanted to fix the shift between the preview and the camera view. To do this, I have a timer calling the previewCheck method (I’m pretty sure there are more elegant ways… but come on, timers are cool!). The preview will be added to the UIView at /0/0/0/2. By default this view has no subviews, but subviews are added in preview mode. Then I modify the transform of /0/0/0/2/0/0/0 like so (btw this only makes sense in portrait mode):

    [previewImage setTransform:CGAffineTransformTranslate(
        CGAffineTransformMakeScale(320.0/1200, 320.0/1200),
        -12.0*1200/320, -17.0*1200/320)];

(Now you probably know why I was so interested in the view sizes… the view frames for the preview image are completely bizarre… hence the image shift. Are people allowed to drink on the job at Apple?)
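
For completeness, here is a rough sketch of how the timer and the preview check could fit together. The paths (/0/0/0/2 and /0/0/0/2/0/0/0) and the magic numbers come straight from the dump and transform above and may well differ on other SDK versions, so treat this as an outline rather than a drop-in implementation:

-(void) viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];

    // poll the view hierarchy every few seconds (yes, a timer... timers are cool)
    [NSTimer scheduledTimerWithTimeInterval:5.0
                                     target:self
                                   selector:@selector(previewCheck)
                                   userInfo:nil
                                    repeats:YES];
}

-(void) previewCheck {
    // /0/0/0/2 is empty in camera mode; subviews appear when the preview comes up
    UIView *previewContainer = [[[[[[[[self.view subviews] objectAtIndex:0]
                                                 subviews] objectAtIndex:0]
                                                 subviews] objectAtIndex:0]
                                                 subviews] objectAtIndex:2];
    if ([[previewContainer subviews] count] == 0) return;   // not in preview mode yet

    // dig down to the preview image view (/0/0/0/2/0/0/0) and fix its transform
    UIView *previewImage = [[[[[[previewContainer subviews] objectAtIndex:0]
                                                  subviews] objectAtIndex:0]
                                                  subviews] objectAtIndex:0];
    [previewImage setTransform:CGAffineTransformTranslate(
        CGAffineTransformMakeScale(320.0/1200, 320.0/1200),
        -12.0*1200/320, -17.0*1200/320)];
}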

You can experiment with other views. The inspectView method gives you a pretty good map of the hierarchy, and it also works for other source types (like picking an image from the photo library). Please look at the customImagePicker Xcode project; I added lots of comments (more than what I retyped here ;).

As always, feel free to contact me… and please check out www.meanvsnice.com for some shots taken with Mean Photo and Nice Photo.

107 Responses to “custom UIImagePickerController camera view”

  1. Thank you for your work. I compiled it on iPhone (changed the Makefile and added a #import in CustomImagePicker.h).

    However, it showed a black screen. Can you pls also upload the compiled binary files? FYI, I’m using a factory unlocked iPhone with firmware/SDK 2.2.

    Thanks!!

  2. Cool stuff. I’m also looking to customize the camera picker view.

    However, don’t you feel there’s an issue here in relying on an internal view structure which could change with the next OS release?

    Also, will Apple even approve an app that does this, do you think?

  3. There are a whole bunch of approved apps (including mine) that modify the camera interface view structure. A couple reasons why I’m not concerned:
    1. The camera interface is terrible. Not sure what Apple was thinking here, they definitely didn’t have their smartest cookie write it. Not only does it go against their guidelines (like the non-rotating interface), but it’s buggy, e.g. scaled and modified images are returned incorrectly. Anything you do to it will only make it better.
    2. As long as you are not calling private libraries, you are good.
    3. Even if they change the view structure, you can always compile against older versions of the SDK.
    4. Once they approved the first app, they can’t say no to all the others. Just look at the sudden flood of farts.
    5. Apple was showing off modifying the video playback interface at one of their seminars that involved similar tactics.

  4. Great, thanks for the info.

  5. Thank you for the great anatomy of the image picker. Today I found an app called “25shot! - Replay Cam”, and it allows you to take 25 pictures within a second. Then I got a question. When you are taking a picture, the cycle is like this:

    1. Dispatch a button pressing event or let user press the button
    2. Wait until the image is ready (you are catching the condition at previewCheck method)
    3. Then extract the image, do something (in your case, scale the image)

    When I do this, I cannot get the interval below 0.2 - 0.5 seconds and the waiting time is not steady. Do you have any idea how he solved the problem?

    I’m just wondering if you know a way to extract image from the “live buffer”…

    Thanks again, good job!

    replay cam @ itunes
    http://itunes.apple.com/WebObjects/MZStore.woa/wa/viewSoftware?id=302883733&mt=8

  6. @k1: if you can live with screen resolution images, you could screenshot the preview or save the contents of the preview view.

  7. Very informative article. Thank you. I don’t know how you worked out all the different layers of the camera but well done.

    lajos, how would you suggest taking a screenshot or saving the contents of the preview screen? Every method I have tried ends up with a black screen.

  8. I’m also having trouble trying to replicate the functionality described by k1.

    Basically, after synthesizing a touch event on the camera press - the camera view stops updating the display until the preview is ready (usually a couple of seconds).

    Does anyone know a workaround for this, or how to mimic the functionality seen in apps like ‘25 shot!’ or ‘quadcam’?

  9. @CB: 25shot is not getting the image by triggering the take photo button, rather it’s capturing the image from the preview view. That’s why the maximum resolution is 320×400.

    I don’t have code for how to do this, if I find some free time I’ll look into it.

  10. OK that would be great Lajos.

    I’ve seen a few of these apps and they all appear to make the “shutter noise” for every frame which is captured, which is what led me to believe they were synthesizing touch events, and the OS was triggering the sound.

    I’ve tried manually dumping the contents of the preview view layer into a UIImage, however this image continually gets updated at the same rate as the original preview pane (ie the image is not static).

    Alternatively, I tried the renderInContext approaches, however ended up with a black image - as I don’t believe this function supports this type of view.

    Any comments appreciated!

  11. So is it possible to house the camera view within another UIViewController and set a clipping rectangle for it?

  12. @sean: you can draw on top of the camera view and mask the view you are drawing. I probably wouldn’t remove the camera view from its hierarchy.

  13. Excellent and informative post. I’m trying to modify the views and having some trouble. Basically what I want to do is to get rid of the green “take a picture button” and just have the cancel button, centered. Is that hard to do? Thanks.

  14. @k1 - How did you get the interval down to 0.5 seconds? I still have about a 2-3 second interval between the touch event and the image being ready.

  15. @lajos - excellent post, helped me a lot so far.

  16. - when in camera mode, if you add your own button to the view (instead of the default takePicture button - let’s say you hide this), when the user taps on your button, how would you make it act like the default button … is there a way you can invoke the same message (or generate the same event) from within the onClick method assigned to your button, like when the user presses the default takePicture button
    - any ideas? did somebody manage to do this?
    - please share your thoughts/code

  17. Hi Lajos,

    Thanks for the excellent post! This is helping me a lot with my very first app.

    Hiding the “Take Photo” label works great with SDK 2.2, but when I compile against 2.0 (exact same code), I get:

    *** Terminating app due to uncaught exception ‘NSRangeException’, reason: ‘*** -[NSCFArray objectAtIndex:]: index (0) beyond bounds (0)’

    Is it supposed to work in 2.0 ? (from the view hierarchy, I don’t see why not)

  18. @Ben: from the error message it looks like the view hierarchy of the controller changed between versions. 2.0 imho is pretty obsolete so I wouldn’t worry about it, but if for any reason you need to use that version, you can find out what the view hierarchy is using the inspectView method of the custom controller. You can read about how to use it in the post. Hope this helps.

  19. @lajos: Thanks for the quick answer! I looked at the view hierarchy, and it’s the same in all versions. It is as if the view is not loaded yet. I get the exception with SDK 2.0 and 2.1. SDK 2.2 and 2.2.1 work fine. My iPhone version is 2.2.1.

  20. Thanks for this article! Saved me from going mad!!

    D.

  21. Great work!!

    I do have a question - How do you access the image after the “use photo” is clicked?

    Thanks!!!

    -Pat

  22. Ahh, figured it out. I had to uncomment the line: picker.delegate = self; and add the function:
    - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)image editingInfo:(NSDictionary *)editingInfo

    Thanks!

  23. Has anyone tried using Key-Value Observing to detect more specifically when the view structure changes, by observing when subviews of all the relevant views are added / removed?

    One problem is that by the time viewDidAppear is called, the picker has already scrolled onto the screen with the original interface. So you can’t really change it dramatically at this point. It would be nice to change the structure as soon as it is created.

    However, I can’t seem to get observing changes in the subviews to work. Maybe UIView is not KVO compliant for this property? I can observe changes to e.g. bounds. But then I’ve never used KVO before, so maybe I’m missing something.

    BTW, note this comment from the 3.0 beta release notes:

    “Issue: The camera or image picker view does not display correctly.
    Don’t rely on the structure or classes in the view hierarchy of Apple-provided views, as they are subject to change.”

    which indicates that perhaps Apple will now be less likely to approve apps that mess with the structure.

  24. >which indicates that perhaps Apple will now be less likely to approve apps that mess with the structure.

    Has anyone had an app that uses this technique rejected recently?

    Since I’ve just started to develop an app.

  25. I am trying to remove or at least make invisible the semi transparent frame around the buttons which from what I gather is called TPBottomDualButtonBar.

    No matter what way I try I cannot change the image type of this. I can move it etc.

    As the 2 buttons are children of this hiding it is not an option.

    any clues?

  26. how about moving the background out of frame and then the buttons back with the inverse of the offset?

  27. Thanks for an extremely speedy response! But you lost me, sorry I don’t understand what you mean?

  28. Ah, think I got you! testing it out now, thanks

  29. Thanks lajos, that sorted it! I placed it out of frame then set the buttons’ y axis to a negative value and it all worked a treat!

    Excellent post!!!

  30. lajos, alas this does not resolve the problem: now that the buttons are in the position I need, they are no longer in their parent’s frame and as such they do not respond to touch :-(

  31. You could fix the touch problem if you handle touch events in the CustomImagePicker and send them to the right place.

  32. Has anyone been able to change the text in the “Cancel” button? It’s not a UIButton subclass, so you can’t use those methods.

  33. I created a new button with the same dimensions and location as the cancel button, set its state for normal and highlight, hid the original and added the new button to the view and hey presto, custom cancel button ;-)

  34. Thanks. I did try that. But the new button didn’t have the same look as the old one. Did yours? What type of button did you use?

    One more question — does anyone know when the original view structure is created? If I add the view directly to the view hierarchy, rather than use presentModalViewController, then viewWillAppear is called (in contradiction to the docs, which say it won’t be), but the internal structure does not yet exist. But sometime later it is created, somewhere.

  35. Hi,
    This looks like a very useful post. Thanks very much.

    The problem I’m facing is that I would like a user to be able to resize an image that they select from the library, just like they can when they select from a camera. I set allowsImageEditing to YES in both cases.

    When the user selects an image, scales it, and selects “Use Image”, they wind up back in the library and need to click “Cancel” to get back to my app. When they click cancel, I get the resized image OK.

    This looks to me to be a very funky design from Apple. Any ideas how to avoid having the user bounced back to the library before returning to my app?

    Thanks for any help you can offer.

    Marty

  36. Answer to my last question: calling [picker.view layoutIfNeeded] seems to instantiate the subview structure.

  37. hi, i’m desperately trying to do what @k1 and @Tom have asked for, to take a screenshot of the camera preview. if i take screenshots of 0/0/0 or 0/0/0/0 i always get a black screen with buttons. please help!

    regarding the look of the button I use two .png files and add code to link the png files to the control states for normal and highlighted.

    If I create my own buttons, cancel for example, how do I link it to the original cancel button?

    I know I can add an event/target to my new button which gets called when tapped but what do I call from in that method to make the original cancel button fire?

  39. Also, has anyone figured out how to stop the changes from happening in full view of the user?

    I change the screen layout, add my own custom buttons etc. but as this happens in viewDidAppear it happens AFTER the view is on display.

    If I try using any of the methods that run BEFORE the view is visible I get index out of bounds errors, probably because the view structure has not been created yet?

  40. Thanks, that makes sense, but it’s too bad you can’t just set the text somehow, or make some kind of ordinary button.

    In answer to your question:

    [cancelButton sendActionsForControlEvents:UIControlEventTouchUpInside];

    I do this with the capture button to bypass the “use photo” screen. I wonder what Apple will think of this…

  41. Oh, that was the answer to your question of 9:31 am.

    In answer to your question of 2:17 pm: that was the problem I was trying to solve above. If you call [picker.view layoutIfNeeded], that forces the view structure to be created; then you can change it, before it appears.

  42. Will try your suggestions thanks, was just going to try using NextResponder but will see.

    Have to say of all the tips I have found on the web this one is top dog!

  43. [cancelButton sendActionsForControlEvents:UIControlEventTouchUpInside];

    Works a treat, thanks for the pointer ;-)

  44. Has anyone had any success in removing the black bar at the bottom of the camera view?

    I think it is:

    --subview-- /1
    .description: UINavigationBar: 0x125a30
    .frame: 0, 0, 320, 44
    .subviews: 1

    --subview-- /1/0
    .description: UINavigationItemView: 0x11abe0
    .frame: 160, 160, 0, 27
    .subviews: 0

    But have had no success yet, looks ugly being just a black bar taking up space not doing anything!

  45. [picker.view layoutIfNeeded]

    placed that in viewWillAppear method and all works dandy. ;-) thanks

  46. Lots of posts from me today!

    So when the user taps the capture button the buttons change to retake Picture and Use Photo respectively.

    I have moved the cancel and capture buttons off screen and replaced with my own and they both work.

    But when I take a picture I do not seem to be connecting to the Use Photo button and so get a crash with Bad Access.

    Any clues?

  47. OK, anyone out there gonna throw me a bone?

    I have made great progress with this code but have come unstuck at the last hurdle!

    I have replaced the camera button on the view and when my new button is tapped this block of code fires:

    -(void) cameraButtonAction: (id) sender
    {
    // forward custom button tap to original camera button

    // TPCameraPushButton /0/0/0/3/2/1 original camera button

    [[[[[[[[[[[[[self.view subviews] objectAtIndex:0]
    subviews] objectAtIndex:0]
    subviews] objectAtIndex:0]
    subviews] objectAtIndex:3]
    subviews] objectAtIndex:2]
    subviews] objectAtIndex:1] sendActionsForControlEvents:UIControlEventTouchUpInside];

    if ( isPreview )
    {
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle: nil
    message: @"Zoom subject's face to fill the oval template"
    delegate: self
    cancelButtonTitle: @"OK"
    otherButtonTitles: nil];

    [alert show];

    [alert release];

    isPreview = NO;
    }
    else
    {
    isPreview = YES;

    [self performSelector: @selector(imagePickerController: didFinishPickingImage: editingInfo:)];

    }
    }

    The problem I was having is this code gets called when first the button fires the shutter and again when needing to use the photo.

    I was finding the didFinishPickingImage was not firing then found if I entered it in the code as above it does.

    However now I have the problem that when trying to access image to save it I get bad access and am stumped!

    What I don’t understand is this regarding this code:

    -(void) cameraButtonAction: (id) sender
    {
    // forward custom button tap to original camera button

    // TPCameraPushButton /0/0/0/3/2/1 original camera button

    [[[[[[[[[[[[[self.view subviews] objectAtIndex:0]
    subviews] objectAtIndex:0]
    subviews] objectAtIndex:0]
    subviews] objectAtIndex:3]
    subviews] objectAtIndex:2]
    subviews] objectAtIndex:1] sendActionsForControlEvents:UIControlEventTouchUpInside];

    if ( isPreview )
    {
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle: nil
    message: @"Zoom subject's face to fill the oval template"
    delegate: self
    cancelButtonTitle: @"OK"
    otherButtonTitles: nil];

    [alert show];

    [alert release];

    isPreview = NO;
    }
    }

    when the button is tapped and the camera is called the first section runs, then checks is it in preview mode which it is cos the camera just fired the shutter and displays on screen.

    then the same button is tapped which fires the save image guff as I see the saving image message on screen… but didFinishPickingImage does not get called?

    The only way I can get this to call is to force it to run as it the first block above. And as I type I think the performselector method does not use arguments so….am I stuffed?

    Basically does this mean you cannot replace the default buttons with your own?

    Please help, I’m going bonkers here!!!!!

  48. “which indicates that perhaps Apple will now be less likely to approve apps that mess with the structure.”

    Yes this seem to be the case. My app just got rejected since I called the “undocumented API:s in UIImagePickerController” (I am just hiding the ugly labels)

    There are obviously a lot of apps in the store that use the API:s but now there is a problem. I am not the only one.

  49. If your app got rejected, send Apple your source code and show them that you are not accessing any of the private APIs. You are rearranging the view by casting the classes in the view hierarchy to UIView, which those classes must be subclasses of.

    I think there might be some new automatic safety check for public API access that they messed up.

    Apple was showing off similar techniques on their iPhone tour about rearranging buttons on the video playback interface.

    So complain. Don’t give up. ‘Cause this is bullshit.

  50. I see they have deprecated the didFinishPickingImage in SDK 3…

  51. BTW I now have this working as I want, my error was having the didFinishPickingImage method in my custom class rather than in the class that creates it!

    However I agree with lajos. My code makes NO calls to any undocumented APIs; all I am doing is accessing the views as described to make the screen look the way I want it to, nothing more!

  52. “So complain. Don’t give up. ‘Cause this is bullshit.”

    Yes this is bullshit and no, I am not giving up. However, they could argue that the subviews of the camera controller are undocumented. What is annoying is the inconsistency in reviewing, some get accepted and others doing the same thing get rejected. I have mailed Apple about this.

    BTW, Lajos thanks for this page, it was a great inspiration when I started my project.

  53. When I take the picture and choose Use Photo, regardless of whether I edit the photo or not, when it closes the picker and goes back to the calling view controller to display the picked image, it is always cropped at the top and appears in the center of the screen.

    I have tried aspect fit / fill etc. but I still have the issue of the top of the image missing.

    Does this happen to anyone else?

  54. damn, i was getting close to publishing something based on this method. is there anything we _can_ still get past the reviewers? is a straight overlay still kosher if we just add subviews to the vanilla library objects rather than subclassing? anyone know?

  55. @Greg: there is still a large number of apps added every day that modify the camera view. I just wanted to note that there’s a chance that Apple will reject such apps.

  56. “is a straight overlay still kosher if we just add subviews to the vanilla library objects rather than subclassing? anyone know?”

    Not from my recent experience. I am using a standard cameraview with some graphics on top of it and I just got rejected (again).

  57. @Sten: Can you share Apple’s reason for the rejection?

  58. Thanks for the great article. I tried the CustomImagePicker example today and I have a question. The captureButtonAction method is only called the first time I press the capture button. If I go back via “Retake” and press the capture button again, nothing happens. Why?

  59. @Mario: it’s because the actions on the capture button are deleted and reassigned by the API. If you want to use it again, you have to reregister your action after returning from the preview.

  60. This page was SUPER helpful. Thank you so much for taking the time to work this out and share what you learned.

    There is just one small change I want to make and am hitting a wall. I want to update the Text Field on the “Take Picture” label. We are assigning it to a UIView which does not have a text property. Any thoughts about how to update the text (vs. just hiding it)?

  61. @Michael: you’ll have to replace that view with a UILabel or a UIImage that has the text prerendered.

  62. After taking the picture and selecting use photo, I save the picture and would like to go back to the camera (so that the user can take another picture for example). I can’t seem to get this to work. Any suggestions on how to do this?

    Once the picture is saved and the alert is dismissed I keep the modalViewPresent but the “retake” and “use photo” buttons are still visible and are unresponsive.

  63. @Daniel: After taking the photo your delegate receives the image. Do not dismiss the picker, wait for the image to be saved and then trigger the retake button in the picker. It should go back to camera mode. Not sure what you mean by dismissed alert or why the retake button seems unresponsive, you should be able to trigger it programmatically.

  64. @lajos: Thank you for the suggestion to trigger the retake button in the program. It took me a few minutes to realize that 0/0/0/3/2/0 was that button. Once I did, your idea worked perfectly. Thanks!

  65. Are these things still working with iPhone OS 3.0?

  66. My app was just rejected. Here’s an excerpt of the email:

    ———————–

    Unfortunately it cannot be added to the App Store because it is modifying or extending an undocumented API, which as outlined in the iPhone SDK Agreement section 3.3.1 is prohibited:

    “3.3.1 Applications may only use Published APIs in the manner prescribed by Apple and must not use or call any unpublished or private APIs. ”

    There is no documentation for the custom subclasses or self-contained views of UIImagePickerController - this includes PLCameraView nor it’s custom subclasses (PLImageTile, PLRotationView, PLImageScroller, PLImageView, PLCropOverlay, PLCropLCDLayer, TPBottomDualButtonBar, TPPushButton and TPCameraPushButton) - in the iPhone SDK.

    —————-

    I was using a basic principle of inheritance. An object I had access to is an instance of a UIView, therefore I could access any method defined on a UIView.

    If Apple still rejects this, it’s completely bogus. If they don’t want us to use “private classes” those classes should have private subclasses. They should not extend from classes to which we have access.

  67. @john: fight it!

    Send an email explaining and proving that you are not using any private or undocumented APIs. There’s hope (wink).

  68. Well… Add me to the rejected list. I sent an appeal, but I am not holding my breath. I am going to move on to my Bluetooth Lego Mindstorm remote control.

    Maybe there will be a breakthrough in 3.0.

  69. I have posted a bug saying how ugly the take picture labels are. #6866567

    Please take a minute to log into bugreport and agree.
    KT

  70. I’ve tried to use this before and was also rejected repeatedly. If you don’t mind zooming the camera view, then one approach I’ve used that seems to be “legit” is to simply do this:

    camera.view.transform = CGAffineTransformScale( camera.view.transform, 2.0f, 2.0f);

    where “camera” is the (UIImagePickerController Delegate *)

    No undocumented API are referenced in this approach. If you LIKE zooming, then increase the last 2 values. I just found that 2.0f enlarges just enough to move the buttons and text out of view.

    I also set camera.view.userInteractionEnabled=NO;

  71. Do you know if there is an easy way to get all images in the camera roll. If I want to make a slideshow I do not want to select all images individually.

  72. @Kai: Unfortunately you cannot select multiple images from the camera roll.

  73. @lajos: Thank you for your fast reply.
    Do you know any other way to get all the images?
    Perhaps without the image picker?

  74. @Kai: The only Apple approved way to get to the images is thru the UIImagePickerController. You can of course access the images in other ways, but you can’t post that in the App Store.

  75. @lajos: Could you please tell me about the other ways.
    I tried to read the ithmb files but could not find them.

  76. I feel so sorry to see you guys all twisting yourself, trying to argue with Apple QA from a technical perspective, blah blah blah …

    However, here is the bottom line. Those applications get approved for a REASON. And the REASON is NOT technical.

    $ $ $

    Good luck

  77. This is some really good coding. Great job, and thanks!

  78. I’m able to hide the bottom bar with the following:

    camBar = [[[[[[[[[[[[imagePickerController.view
    subviews] objectAtIndex:0]
    subviews] objectAtIndex:0]
    subviews] objectAtIndex:0]
    subviews] objectAtIndex:0]
    subviews] objectAtIndex:2]
    subviews] objectAtIndex:0];
    [camBar setAlpha:0.0];

    I am next trying to trigger the picture capture with the following code.

    captureButton = [[[[[[[[[[[[[[[[imagePickerController.view
    subviews] objectAtIndex:0]
    subviews] objectAtIndex:0]
    subviews] objectAtIndex:0]
    subviews] objectAtIndex:0]
    subviews] objectAtIndex:2]
    subviews] objectAtIndex:0]
    subviews] objectAtIndex:0]
    subviews] objectAtIndex:0];

    [captureButton sendActionsForControlEvents:UIControlEventTouchUpInside];

    I get an error (uncaught exception) that there’s not an object at --subview-- /0/0/0/0/2/0/0/0.

    Why might this be happening? How can I trigger the capture in code.

  79. @JC: So have you had any success with your zoom method on an iPhone with OS 3.0? All variants I’ve tried only succeed in shrinking the camera view, not enlarging it.

    @ex apple employee: So you are meaning to tell us that tools like ReplayCam and PhoneCam are approved because they pay apple? Or what other things are there that you hint at? This is not really helpful, you know.

  80. Konrad: I have (briefly) tested the zoom method on a 3.0 (Beta 5) phone and it works fine. I compiled the app as 2.2.

  81. Hello,

    I found the post really useful, and I have been playing with this stuff for hours.
    Now, I have a problem that I have not managed to solve, and it seems that more people are having the same problem as I do.

    What I am trying to do is to get the viewfinder data ( I think you call it camera preview ). This is what I did:

    I increased the frequency of the timer to 10 times per second ( because I want viewfinder data so fast for my calculations ). In the previewCheck function I get the UIView 0/0/0/0, which according to my understanding is the view rendering the viewfinder.

    Then I call these two functions:

    pixelsRef = CreateDataFromView ( preview2 );
    pPixels = CFDataGetBytePtr ( pixelsRef );

    ( in pPixels I should have a pointer to the viewfinder data; pixelsRef is around 588Kb, which seems correct considering the dimensions of the screen ( 320×460×4 ). )

    The implementation of those functions are:

    CFDataRef CreateDataFromImage ( UIImage *image )
    {
    return CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage));
    }

    CFDataRef CreateDataFromView ( UIView *view )
    {
    UIGraphicsBeginImageContext( view.bounds.size );
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return CreateDataFromImage( viewImage );
    }

    Everything seems to run fine, but pPixels points to a buffer which is just black plus the current alpha setting, i.e., every pixel has the value 0x80 00 00 00. It seems like the default value is 0x80 for alpha; if I change the alpha value before getting pPixels I get the correct value for alpha. Still, the RGB values for every pixel are black…

    So, my question after all this long post is, do you know why I am not getting the viewfinder pixel data?

    My speculation is that this view is empty, and that the actual pixel data is rendered “behind the curtains” somehow ( my knowledge of Objective-C and Cocoa is too limited to understand this right now ).

    thanks in advance!

  82. Konrad: After further testing I agree that the zoom method doesn’t work when compiled for 3.0.

    However, the 3.0 camera interface is much nicer so the need for hiding things is less.

  83. [...] did not discover everything on my own. I am using some information from lajos, The Air Source, and Erica [...]

  84. [...] By the way, you can also use this technique to remove hide the buttons (hint: /0/0/0/3). I followed lajos’s example and start a timer when Camera button is clicked. The timer will invoke a function, [...]

  85. Hey,

    Great info!
    I have this issue, perhaps someone understands it better.
    I’m adding this custom picker to a Tabbar as one of the tabs.
    As it turns out, printing the view hierarchy in viewDidAppear shows only
    one view with a view inside it. The whole camera view hierarchy is simply not there “yet”.
    If I “wait” a little (by invoking the preview) it then indeed shows the hierarchy everyone talks about.
    Just to clarify, using the “original code” from here works without a problem.

    Is it possible that the tabbar somehow “hides” the view (although when I hit the breakpoint in the debugger, the device is showing the camera!!) and still the view hierarchy is reported empty?

    Any hint, very much appreciated.

    -t

  86. Manast,

    Did you ever figure out how to get something other than a black image? I’ve been trying this too with no luck.

  87. Wow - Apple publishes camera manipulation functions in iPhone 3.1 Beta 2

    UIImagePickerController.h
    Added UIImagePickerController.cameraOverlayView
    Added UIImagePickerController.cameraViewTransform
    Added UIImagePickerController.showsCameraControls
    Added -[UIImagePickerController takePicture]

    No more need for hacking anymore… halleluya !!
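
    For reference, a minimal sketch of how those documented properties can be used once 3.1 is out (overlayView here is just a placeholder for whatever UIView you build yourself):

    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.delegate = self;

    // hide Apple's controls and put your own view on top of the live camera feed
    picker.showsCameraControls = NO;
    picker.cameraOverlayView = overlayView;

    // optionally scale/shift the live preview
    picker.cameraViewTransform = CGAffineTransformMakeScale(1.25, 1.25);

    [self presentModalViewController:picker animated:YES];

    // later, e.g. from a button in overlayView:
    [picker takePicture];   // the delegate receives the image as usual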

  88. I am getting nothing in PLImageView. Description shows View:

  89. [...] By the way, you can also use this technique to remove hide the buttons (hint: /0/0/0/3). I followed lajos’s example and start a timer when Camera button is clicked. The timer will invoke a function, [...]

  90. [...] did not discover everything on my own. I am using some information from lajos, The Air Source, and Erica [...]

  91. As already mentioned, OS 3.1 eliminates the need for this clever technique. There’s a project on github (private until 3.1 is released) that shows exactly how to use the new UIImagePickerController API. Let me know if you’re interested…

    http://blog.bordertownlabs.com/post/157320598/customizing-the-iphone-camera-view-with

  92. Thanks for the tutorial, this is what I’m looking for! Is this technique still being rejected?

  93. Manast, I too am stuck grabbing only black pixels. I suspect Apple is actually blitting the viewfinder pixels directly to the screen using CoreSurface, not via CALayer. But then that doesn’t explain how .overlayView works unless they render that view on top of screen memory.

    Using a modified UIGetScreenImage technique, I can capture and crop (to CIF size) video frames at 16 fps. That time includes converting each frame to JPEG on the main thread. But I can get almost 40 fps grabbing these solid black frames out of the view hierarchy, so I’m trying to do that to maximize my capture rate.

    Has anyone figured this out?

  94. How can I get the imagePickerController’s image position or index?

  95. Hi. Is apple rejecting apps which use custom camera interface?

  96. Hi guys,

    I’m using the techniques described to hook my own action into the inbuilt camera button (I want to use the preview screen and focus features etc). However, the problem I face is that when I add a selector method to the camera button it calls me AND still takes a picture. How do I remove the call to take the picture?

    Thanks

  97. @Jimbob: look at the UIControl class reference in the documentation. Remove all targets from the button before you add yours. Having said all that, you should not use this method for taking pictures. There is now a takePicture method on UIImagePickerController (see doc).

  98. Hi Lajos,

    Thanks for the reply.

    I guess you mean UIControl’s :-
    - (void)removeTarget:(id)target action:(SEL)action forControlEvents:(UIControlEvents)controlEvents

    I see from the docs that I can give NULL as the action parameter to remove all actions, but I’m not sure of the target I should be sending to, it is a bit of a black box! - any ideas what the camera button is sending the action to?

    I am using the takePicture call when I decide to take the picture, when the image is stable enough, but alas the system has already taken one automatically thanks to the action bound to the control.

    Thanks again.

  99. @Jimbob: you should be sending it to the camera button. On 3.x the UI elements move around in the hierarchy so it can be tricky to track down. If you are using 3.x, why don’t you just set showsCameraControls to false on the UIImagePickerController and make your own button?

  100. @Lajos

    Hi again, well for some reason if you set showsCameraControls to NO you lose the image preview screen. I guess I’ll just have to create my own view after the delegate callback. I was just being a little lazy, but I also wanted to keep the Apple camera look and feel as well as the preview screen.

    As for sending to the camera button - I’d be calling ‘removeTarget’ on the camera button ie [cameraButton removeTarget:target action:action forControlEvents:events]

    So, are you saying the removeTarget call on the cameraButton would have to be given itself as a target??? ie [cameraButton removeTarget:self action:action forControlEvents:events]

    Thanks again!

  101. @Jimbob: showsCameraControls = NO should not hide the preview. In fact, the preview is the only view visible.

  102. Lajos,

    Awesome! I have been looking for a method of doing custom camera controls for UIImagePickerController. I downloaded the project, and tested with a 3Gs, but only the normal controls show. I am on the newest version of Xcode targeting the 3.0 SDK.

    Does this approach still work with the 3.0 SDK?

    best wishes,

    ty

  103. @ty: In 3.0 the view hierarchy constantly changes. This project only works with pre 3.0 SDKs. In 3.0 you can print the view hierarchy and use that info to figure out where specific views are. But again, since the view hierarchy changes, it’s a bit trickier than pre 3.0. You should also look at the UIImagePickerController documentation, in 3.0 there are new methods to hide the camera UI, insert view above the camera UI and take photo.

  104. Ok, I have tried capturing the preview screen (targeting 4.0) and your code works great to dump the classes. As a side note, I’ll probably be using that for other troubleshooting in the future.

    I am still wanting to use this method as opposed to the UIGetScreenImage() method because I don’t want to capture any of the overlay content. But…all I ever get is the shutter screen and it is driving me nuts!!

    Here is the code I’m using. The path 0/0/0/0/2 maps to the UIImageView.

    //Crawl CameraPickerController for Image preview
    UIView *preview = [[[[[[[[[[[thePicker.view subviews] objectAtIndex:0]
    subviews] objectAtIndex: 0]
    subviews] objectAtIndex: 0]
    subviews] objectAtIndex: 0]
    subviews] objectAtIndex: 2] retain];

    //Turn the view into an image
    UIGraphicsBeginImageContext(preview.bounds.size);
    [preview.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSData *imgData=UIImageJPEGRepresentation(viewImage ,0);
    viewImage = nil;

    [preview release];
    preview = nil;

  105. @Ryan: 4.0 gives you full access to the camera stream with the AVFoundation framework. You don’t need any tricks to access the camera stream anymore.
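
    Just to sketch the shape of that AVFoundation route (names like previewHostView are placeholders and error handling is omitted):

    #import <AVFoundation/AVFoundation.h>

    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // camera input
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [session addInput:input];

    // raw frame output; your delegate gets a CMSampleBufferRef for every frame
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [output setSampleBufferDelegate:self queue:dispatch_queue_create("camera.frames", NULL)];
    [session addOutput:output];

    // show the live preview in a view of your own, wherever you want it
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    previewLayer.frame = previewHostView.bounds;
    [previewHostView.layer addSublayer:previewLayer];

    [session startRunning];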

  106. @lajos : does that mean you can put other buttons such as “cancel”,”setting” etc ?

  107. @Forrest: Yes, you should have pretty much full control over the camera. Between AVFoundation and UIImagePickerController, you should be able to show live preview, take pictures, start video recording, and even get frames out of the video stream.
