Camera Tutorial, Part 7: Pass video frames to ActionScript

Early bird offer on the ANE eBooks

Buy any Easy Native Extensions 2nd Edition package and get our $99 iOS + Android ANE Template completely free before the end of June 2015.

  • step-by-step guide to making your iOS extension in under an hour
  • library for data conversion between ActionScript and native code
  • tutorials
  • infographics
  • code included

At the end of this part you will have this:

Camera AIR app (screenshot)

In other words, you will have an app that displays the frames you grab from the native camera. You might need to find a cat first.

Time

15-20 minutes

Wait, have you done these first?

Step 1: Decide how your AIR Library should receive a frame

Before we plunge into the Objective-C thicket, it would be good to decide what kind of data you will want back from it.

In Flash Builder find your AIR Library project, CameraTutorialAIRLib, and open CameraDriver.as. Then add this public method to it:
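Something along these lines will do. The member name m_extContext (the ExtensionContext instance you created in the earlier parts) and the method name copyLastFrame are only suggestions here; use whatever names fit your project:

    // Make sure these imports are at the top of CameraDriver.as:
    import flash.geom.Point;
    import flash.utils.ByteArray;

    /**
     * Copies the last frame captured by the native camera.
     * _frameBytes receives the raw pixels of the frame;
     * _frameSize receives the frame dimensions: x = width, y = height.
     * Returns true if a frame was copied, false if none was available yet.
     */
    public function copyLastFrame( _frameBytes : ByteArray, _frameSize : Point ) : Boolean
    {
        return m_extContext.call( "as_copyLastFrame", _frameBytes, _frameSize ) as Boolean;
    }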

What does this mean for your native code?

1. You will need to expose a C function, which ActionScript can call as “as_copyLastFrame”.

2. “as_copyLastFrame” will take a flash.utils.ByteArray as its first argument, set it to an appropriate size and copy the pixels of the last camera frame into it.

3. As a second argument “as_copyLastFrame” will take a flash.geom.Point and set its x and y properties to the width and height of the frame that was copied into the ByteArray.

Now that we know what your native code needs to do, let’s do it.

Step 2: Get a video frame from CameraDelegate

Before passing it to ActionScript, you first need to get hold of a video frame from your CameraDelegate class.

In your Xcode project open CameraDelegate.m and add a method that will do just that:
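Here is a sketch of what such a method could look like. The instance variable names m_middleManBuffer and m_consumerBuffer stand in for whatever you called your buffers in the earlier parts, and the frame is assumed to be kept as an NSData of raw pixels:

    // CameraDelegate.m
    - ( NSData * ) getLastFrame
    {
        NSData * lastFrame = nil;

        @synchronized ( self )
        {
            // 'Swap' the middleman buffer and the consumer read buffer:
            // the consumer walks away with the freshest frame, while the
            // camera callback keeps writing into the other buffers.
            lastFrame         = m_middleManBuffer;
            m_middleManBuffer = m_consumerBuffer;
            m_consumerBuffer  = lastFrame;
        }

        return lastFrame;
    }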

Remember the fake triple-buffering technique we use for handling the frames that are copied from the camera? This is its last step: we ‘swap’ the middleman buffer and the consumer read buffer, protecting the swap with a mutex (hence the @synchronized directive), so that the middleman buffer can’t be changed in the middle of our ‘swap’.

Put the getLastFrame() signature in CameraDelegate.h to make it callable from outside the CameraDelegate class (make it public). This goes inside @interface CameraDelegate:
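Matching the sketch above:

    // CameraDelegate.h, inside the @interface CameraDelegate ... @end block:
    - ( NSData * ) getLastFrame;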

Step 3: Copy pixels into the ActionScript ByteArray

In your Xcode project open the source file that defines your native library’s interface to AIR. That would be CameraLibiOS.m.

3.1. Add a function for ActionScript to call, which will take a flash.utils.ByteArray and a flash.geom.Point:
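Here is a sketch of what ASCopyLastFrame() could look like. It assumes your CameraDelegate instance is reachable through a file-level variable called cameraDelegate and that the delegate exposes the frame dimensions as frameWidth and frameHeight; adjust the names to whatever you used in the earlier parts. Checks of the FREResult return codes are left out for brevity:

    // CameraLibiOS.m (it already includes FlashRuntimeExtensions.h from the earlier parts)
    #import "CameraDelegate.h"

    // Defined a little further down:
    static BOOL isVideoFrameValid( NSData * frame );

    FREObject ASCopyLastFrame( FREContext ctx, void * functionData, uint32_t argc, FREObject argv[] )
    {
        uint32_t frameCopied = 0;

        // Grab the freshest frame from the camera delegate:
        NSData * frame = [ cameraDelegate getLastFrame ];

        if ( isVideoFrameValid( frame ) )
        {
            // argv[ 0 ] is the flash.utils.ByteArray passed in from ActionScript.
            // Set its length first, so that there is room for the pixels...
            FREObject frameLength = NULL;
            FRENewObjectFromUint32( ( uint32_t ) frame.length, &frameLength );
            FRESetObjectProperty( argv[ 0 ], ( const uint8_t * ) "length", frameLength, NULL );

            // ...then acquire it and copy the pixels into it:
            FREByteArray destinationBytes;
            FREAcquireByteArray( argv[ 0 ], &destinationBytes );
            memcpy( destinationBytes.bytes, frame.bytes, frame.length );
            FREReleaseByteArray( argv[ 0 ] );

            // argv[ 1 ] is the flash.geom.Point: report the frame dimensions
            // in its x and y properties.
            FREObject widthToReturn  = NULL;
            FREObject heightToReturn = NULL;
            FRENewObjectFromDouble( ( double ) [ cameraDelegate frameWidth ],  &widthToReturn );
            FRENewObjectFromDouble( ( double ) [ cameraDelegate frameHeight ], &heightToReturn );
            FRESetObjectProperty( argv[ 1 ], ( const uint8_t * ) "x", widthToReturn,  NULL );
            FRESetObjectProperty( argv[ 1 ], ( const uint8_t * ) "y", heightToReturn, NULL );

            frameCopied = 1;
        }

        // Let ActionScript know whether it got a frame:
        FREObject result = NULL;
        FRENewObjectFromBool( frameCopied, &result );
        return result;
    }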

Add a helper function for checking if we’ve got a valid video frame:
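Something as simple as this will do:

    // A frame is only worth copying if we actually got one and it has pixels in it:
    static BOOL isVideoFrameValid( NSData * frame )
    {
        return ( frame != nil ) && ( frame.length > 0 );
    }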

 

That’s quite a lot to take in. If you have questions and the comments in the code don’t answer them for you, leave a comment at the bottom of this post – I’ll try to answer as best I can.

3.2. Expose this function to ActionScript. In CameraLibiOS.m find your context initializer, CameraLibContextInitializer(), and add ASCopyLastFrame() to the extensionFunctions array, so it now looks like this:
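Roughly like this. The as_startCamera / ASStartCamera entry below stands in for whatever you registered in the earlier parts (your names and parameter names may differ); the important bit is the new as_copyLastFrame entry:

    static FRENamedFunction extensionFunctions[] =
    {
        { ( const uint8_t * ) "as_startCamera",   NULL, &ASStartCamera },
        { ( const uint8_t * ) "as_copyLastFrame", NULL, &ASCopyLastFrame }
    };

    // These two lines should already be in your initializer:
    *numFunctionsToSet = sizeof( extensionFunctions ) / sizeof( FRENamedFunction );
    *functionsToSet = extensionFunctions;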

Step 4: Display frames in your test app

Aren’t you glad you already have an app set up, so you can test your code in a jiffy?

4.1. Add a function that will take a video frame from your ANE in the form of flash.display.BitmapData and pass it to the spark Image you added to your stage. This will go inside the <fx:Script> section in CameraTutorialAppHomeView.mxml:
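A sketch of what displayVideoFrame() could look like. It assumes the CameraDriver instance from your ANE is called cameraDriver, that the spark Image on your stage has id="videoFrame", and that copyLastFrame() is the method we added in Step 1; rename these to match your project:

    // This goes inside the <fx:Script><![CDATA[ ... ]]> block:
    import flash.display.Bitmap;
    import flash.display.BitmapData;
    import flash.geom.Point;
    import flash.utils.ByteArray;

    private var m_frameBytes : ByteArray = new ByteArray();
    private var m_frameSize : Point = new Point();
    private var m_frameBitmapData : BitmapData = null;

    private function displayVideoFrame() : void
    {
        // Ask the ANE for the last frame; bail out if there isn't one yet:
        if ( !cameraDriver.copyLastFrame( m_frameBytes, m_frameSize ) )
        {
            return;
        }

        // (Re)create the BitmapData if the frame size has changed:
        if ( m_frameBitmapData == null
          || m_frameBitmapData.width != m_frameSize.x
          || m_frameBitmapData.height != m_frameSize.y )
        {
            m_frameBitmapData = new BitmapData( int( m_frameSize.x ), int( m_frameSize.y ), false, 0x000000 );
        }

        // Copy the raw pixels into the BitmapData. Note: setPixels() expects
        // 32-bit ARGB data, so depending on the pixel format your native code
        // copies (BGRA, for example) you may need a conversion step here.
        m_frameBytes.position = 0;
        m_frameBitmapData.setPixels( m_frameBitmapData.rect, m_frameBytes );

        // Hand the frame to the spark Image; wrapping it in a new Bitmap
        // makes sure the Image notices that its source has changed.
        videoFrame.source = new Bitmap( m_frameBitmapData );
    }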

4.2. So when will you call displayVideoFrame()?

A rant you can happily skip:

You have a couple of choices here. You can either notify your app every time a frame becomes available by sending it an event, or you can have the app ask for frames on a timer, irrespective of when they become ready in the native library. I’ve found the latter to be a lot more flexible, as there are times when you want the camera and the consumer to run at different frame rates. Honest, I have war stories to share. Ask me in the comments below if you are curious.

A timer it is, then. :)

In case you happily skipped the rant above (can’t blame you), this is what you’ll do:

4.2.1. Add a flash.utils.Timer object to CameraTutorialAppHomeView.mxml:
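For example (the names and the refresh rate are just suggestions):

    import flash.events.TimerEvent;
    import flash.utils.Timer;

    // Refresh the preview roughly 30 times a second; tweak to taste.
    private static const REFRESH_INTERVAL_MS : Number = 1000 / 30;
    private var m_refreshTimer : Timer = null;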

4.2.2. Start the timer in your onCameraStarted() event handler:
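Keep whatever your handler already does from the earlier parts and add the call at the end, roughly like this:

    private function onCameraStarted( _event : Event ) : void // keep your existing signature
    {
        // ...whatever onCameraStarted() already does...

        startRefreshTimer();
    }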

And, of course, add a definition for startRefreshTimer():
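Which could be as simple as:

    private function startRefreshTimer() : void
    {
        if ( m_refreshTimer == null )
        {
            m_refreshTimer = new Timer( REFRESH_INTERVAL_MS );
            m_refreshTimer.addEventListener( TimerEvent.TIMER, onRefresh );
        }

        if ( !m_refreshTimer.running )
        {
            m_refreshTimer.start();
        }
    }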

4.2.3. In your timer handler, onRefresh(), call displayVideoFrame():
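Which is all the handler needs to do:

    private function onRefresh( _event : TimerEvent ) : void
    {
        displayVideoFrame();
    }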

What’s next?

  • If I were you, I would run the app and see if I get a camera preview. Hey, where did this cat come from!?
  • There is something we haven’t done yet, however, before we can finish this tutorial. Can you guess what it is? Head on to Part 8: Stop the camera (6-7 minutes).
  • Here is the table of contents for the tutorial, in case you want to jump back or ahead.

 

Wait, want more features and Android support?

Check out the DiaDraw Camera Driver ANE.



Comments

  1. Quinn

    I think the definition for the isVideFrameValid (isVideoFrameValid) is missing from the above – great tutorial btw :-)

    • Radoslava

      Thanks, Quinn!

      I’ve now corrected the typo and included the definition. It’s great to see readers paying attention! :)
      Radoslava
