AV Foundation moves to center stage as the essential media framework on the device, offering support for playing, capturing, and even editing audio and video. Borrowing core ideas from the Mac's QuickTime while adding many new concepts of its own, AV Foundation offers extraordinary capabilities for application programmers. This talk offers a high-level overview of what's in AV Foundation, and a taste of what it can do.
3. iPhone 2 Media Frameworks
• Core Audio: Low-level audio streaming
• Media Player: Full-screen video player
• AV Foundation: Obj-C wrapper for audio file playback (2.2 only)
4. iPhone 3 Media Frameworks
• Core Audio: Low-level audio streaming
• Media Player: iPod library search/playback
• AV Foundation: Obj-C wrapper for audio file playback, recording
5. iOS 4 Media Frameworks
• Core Audio: Low-level audio streaming
• Media Player: iPod library search/playback
• AV Foundation: Audio/video capture, editing, playback, export…
• Core Video: Quartz effects on moving images
• Core Media: Objects for representing media times, formats, buffers
12. Capture: Old and Busted
• UIImagePickerController
• Takes user out of your UI
• Low configurability
• No capture-time data access
• AVAudioRecorder
• Audio only, to file only
13. Capture: New Hotness
• AV Foundation capture classes
• Highly configurable
• Live callbacks with capture data
• Image/video preview to a CALayer
15. AVCaptureDevice
• Represents an input (camera, microphone) or output (speakers) device
• Discover with +[AVCaptureDevice devices], +[AVCaptureDevice devicesWithMediaType:], +[AVCaptureDevice defaultDeviceWithMediaType:], …
• Flash, white balance, exposure, focus settings for camera devices
16. AVCaptureOutput
• A destination for captured data
• Files: AVCaptureFileOutput, AVCaptureMovieFileOutput
• Images: AVCaptureStillImageOutput
• Live data: AVCaptureAudioDataOutput, AVCaptureVideoDataOutput
17. AVCaptureSession
• Coordinates the activity of audio and video capture devices
• Allows you to connect/disconnect inputs and outputs
• startRunning/stopRunning
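Putting the three capture classes together, a minimal sketch of wiring a device into a session and recording to a movie file might look like this (error handling and memory management trimmed; the variable names are illustrative):

```objc
#import <AVFoundation/AVFoundation.h>

AVCaptureSession *session = [[AVCaptureSession alloc] init];

// Find the default camera and wrap it in an input
AVCaptureDevice *camera =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input =
    [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input && [session canAddInput:input])
    [session addInput:input];

// Send the captured media to a QuickTime movie file
AVCaptureMovieFileOutput *output = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:output])
    [session addOutput:output];

[session startRunning];
```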
19. Data Output Callbacks
• Audio and video data outputs provide -[setSampleBufferDelegate:queue:]
• Delegates get -[captureOutput:didOutputSampleBuffer:fromConnection:]
• Sample buffer is a CMSampleBufferRef
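As a sketch, registering a delegate and receiving frames looks roughly like this (assumes `self` conforms to AVCaptureVideoDataOutputSampleBufferDelegate; the queue name is illustrative):

```objc
AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
dispatch_queue_t queue = dispatch_queue_create("video.frames", NULL);
[videoOut setSampleBufferDelegate:self queue:queue];

// Called once per captured frame, on the queue supplied above
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Convert the sample buffer to a pixel buffer for inspection/processing
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // ... examine or draw the frame here ...
}
```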
20. Core Media
• New in iOS 4
• Core Foundation opaque types for wrapping sample buffers, format descriptions, time structures
• Functions convert video samples to CVImageBuffer, audio to Core Audio AudioBufferList
21. Core Media Time
• CMTime: value, timescale, flags, epoch
• Timescale is n-ths of a second
• Set timescale to a resolution appropriate to your media (e.g., 44100 for CD audio). QT convention is 600 for video (ask Chris why!)
• CMTimeConvertScale()
22. WWDC 2010 Session 409
Using the Camera
with AV Foundation
Overview and best practices
Brad Ford
iPhone Engineering
24. “iMovie is built entirely on exactly the same public API in
AV Foundation that we’re presenting to you in iPhone 4.”
26. “Boom Box” APIs
• Simple API for playback, sometimes recording
• Little or no support for editing, mixing, metadata, etc.
• Example: HTML5 <audio> tag
27. “Streaming” APIs
• Use “stream of audio” metaphor
• Strong support for mixing, effects, other real-time operations
• Example: Core Audio
28. “Document” APIs
• Use “media document” metaphor
• Strong support for editing
• Mixing may be a special case of editing
• Example: QuickTime and AV Foundation
29. Assets and Movies
• AVAsset: Collection of tracks representing timed media data
• QTMovie: Collection of tracks representing timed media data
32. AVAsset
• Superclass of all “movie”-like structures in AV Foundation
• Represents traits of all tracks taken together: size, duration
• Build your own with AVURLAsset
33. AVComposition
• Subclass of AVAsset representing a combination of multiple file-based assets
• Tracks are AVCompositionTracks
• For editing, AVMutableComposition and AVMutableCompositionTracks
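A minimal editing sketch: create a mutable composition, add a video track, and copy in a time range from a source asset (`asset` is assumed to be an AVURLAsset loaded elsewhere; error handling trimmed):

```objc
AVMutableComposition *comp = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTrack =
    [comp addMutableTrackWithMediaType:AVMediaTypeVideo
                      preferredTrackID:kCMPersistentTrackID_Invalid];

// Take the first video track of the source asset
AVAssetTrack *sourceTrack =
    [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

// Insert the whole source track at the start of the composition
NSError *error = nil;
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                    ofTrack:sourceTrack
                     atTime:kCMTimeZero
                      error:&error];
```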
34. Effects 1
• AVAudioMix, AVMutableAudioMix: set volumes or audio ramps at specific times
• AVVideoCompositionInstruction: provides a set of layer-based instructions for performing time-based opacity or affine transform ramps
35. Playback
• AVPlayer: Playback controller
• play, pause, seekToTime, etc.
• AVPlayerLayer: CALayer for presenting video from an AVPlayer
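As a sketch, playing an asset in a view might look like this (`asset` and `view` are assumed to exist; memory management trimmed):

```objc
// Wrap the asset in a player item and hand it to a player
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];

// Attach a layer to the view hierarchy to show the video
AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
layer.frame = view.bounds;
[view.layer addSublayer:layer];

[player play];
```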
36. Effects 2
• AVSynchronizedLayer: CALayer that synchronizes with an AVPlayerItem’s playback timing
• Use for overlays, titles, rendered images, Ken Burns effects, etc.
37. Export
• AVAssetExportSession
• Must be created with a canned preset
• -[exportAsynchronouslyWithCompletionHandler:]
• Takes a block!
• Exporting CA effects is tricky…
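A minimal export sketch, assuming `composition` is an AVAsset and `outputURL` is a file URL the app has chosen (error handling trimmed):

```objc
AVAssetExportSession *export =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetMediumQuality];
export.outputURL = outputURL;
export.outputFileType = AVFileTypeQuickTimeMovie;

// The completion handler is a block, called when the export finishes
[export exportAsynchronouslyWithCompletionHandler:^{
    if (export.status == AVAssetExportSessionStatusCompleted) {
        // Exported movie is now at outputURL
    }
}];
```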
38. Other stuff
• AVAssetImageGenerator: used for generating thumbnails
• Not suitable for getting individual frames (but no GetMediaSample() equivalent either!)
• NSCoder, NSValue additions for wrapping CMTimes, CMTimeRanges
39. WWDC 2010 Session 407
Editing Media with AV Foundation
Overview and best practices
Eric Lee
iPhone Engineering