Code examples of Depth APIs in iOS
Use a device that has a dual camera (e.g. iPhone 8 Plus) or a TrueDepth camera (e.g. iPhone X)
Open ARKit-Sampler.xcworkspace with Xcode 10 and build it!
It can NOT run on the Simulator (because it uses Metal).
Depth visualization in real time using AV Foundation.
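For context, here is a minimal sketch of how real-time depth capture with AVFoundation typically looks (the class and queue names are illustrative, not taken from the sampler): attach an `AVCaptureDepthDataOutput` to a session backed by a depth-capable camera and receive `AVDepthData` in the delegate callback.

```swift
import AVFoundation

// Minimal real-time depth capture sketch (illustrative, not the sampler's code).
final class DepthCaptureController: NSObject, AVCaptureDepthDataOutputDelegate {
    private let session = AVCaptureSession()
    private let depthOutput = AVCaptureDepthDataOutput()

    func start() throws {
        session.beginConfiguration()
        session.sessionPreset = .photo

        // Pick a depth-capable camera: rear dual camera or front TrueDepth camera.
        guard let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back)
            ?? AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front) else {
            return
        }
        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }

        // Deliver smoothed depth frames to a background queue.
        if session.canAddOutput(depthOutput) { session.addOutput(depthOutput) }
        depthOutput.isFilteringEnabled = true
        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth.queue"))

        session.commitConfiguration()
        session.startRunning()
    }

    // Called for every depth frame; the map is a CVPixelBuffer of disparity values.
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        let depthMap = depthData.depthDataMap
        // Convert `depthMap` to a grayscale image and display it for visualization.
        _ = depthMap
    }
}
```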
Blending a background image with a mask created from depth.
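The blending step can be done with Core Image. The sketch below is illustrative (the function and parameter names are mine, not the sampler's) and assumes the depth map has already been normalized into a usable grayscale mask.

```swift
import CoreImage

// Illustrative sketch: composite the camera image over a background image
// using a (normalized) depth map as the blend mask.
func blend(cameraImage: CIImage, background: CIImage, depthMask: CVPixelBuffer) -> CIImage? {
    // Scale the lower-resolution mask up to the camera image's size.
    var mask = CIImage(cvPixelBuffer: depthMask)
    let scaleX = cameraImage.extent.width / mask.extent.width
    let scaleY = cameraImage.extent.height / mask.extent.height
    mask = mask.transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))

    // Bright mask pixels show the camera image, dark pixels show the background.
    guard let filter = CIFilter(name: "CIBlendWithMask") else { return nil }
    filter.setValue(cameraImage, forKey: kCIInputImageKey)
    filter.setValue(background, forKey: kCIInputBackgroundImageKey)
    filter.setValue(mask, forKey: kCIInputMaskImageKey)
    return filter.outputImage
}
```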
Depth visualization from pictures in the camera roll.
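Under the hood, the depth of a Portrait-mode photo is stored as auxiliary disparity data inside the image file. A rough sketch of extracting it (the `imageData` parameter is assumed to be the asset's full image data, e.g. obtained via `PHImageManager`):

```swift
import AVFoundation
import ImageIO

// Illustrative sketch: read the embedded disparity data from a Portrait-mode photo.
func depthData(from imageData: Data) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any] else {
        return nil
    }
    // AVDepthData wraps the auxiliary dictionary into a usable depth/disparity map.
    return try? AVDepthData(fromDictionaryRepresentation: auxInfo)
}
```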
Background removal demo using the Portrait Effects Matte.
Please try this after taking a picture of a HUMAN with PORTRAIT mode.
Available in iOS 12 or later.
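The matte itself is read much like the depth data above, from the image's auxiliary data. A hedged sketch (the helper name is mine, not the sampler's):

```swift
import AVFoundation
import ImageIO

// Illustrative sketch (iOS 12+): load the Portrait Effects Matte, a high-resolution
// person-segmentation mask, from a Portrait-mode photo of a person.
func portraitMatte(from imageData: Data) -> AVPortraitEffectsMatte? {
    guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypePortraitEffectsMatte) as? [AnyHashable: Any] else {
        return nil
    }
    // The resulting matte's `mattingImage` can serve as a mask for background removal.
    return try? AVPortraitEffectsMatte(fromDictionaryRepresentation: auxInfo)
}
```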
Depth visualization on ARKit. The depth on ARKit is available only when using ARFaceTrackingConfiguration.
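With face tracking running, ARKit exposes the TrueDepth camera's depth as `capturedDepthData` on each frame. A minimal sketch (the view controller name is illustrative):

```swift
import UIKit
import ARKit

// Illustrative sketch: run face tracking and read the per-frame depth map.
final class FaceDepthViewController: UIViewController, ARSessionDelegate {
    private let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Depth frames arrive less often than video frames, so this is often nil.
        guard let depthData = frame.capturedDepthData else { return }
        let depthMap = depthData.depthDataMap
        // Visualize `depthMap`, e.g. by converting it to a grayscale texture.
        _ = depthMap
    }
}
```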
A demo to render a 2D image in 3D space.
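The usual way to do this is a SceneKit plane textured with the image. A small illustrative sketch (the function name and default size are assumptions, not the sampler's values):

```swift
import UIKit
import SceneKit

// Illustrative sketch: wrap a 2D image in an SCNPlane so it can be placed in 3D space.
func makeImageNode(image: UIImage, width: CGFloat = 0.3) -> SCNNode {
    let aspect = image.size.height / image.size.width
    let plane = SCNPlane(width: width, height: width * aspect)
    plane.firstMaterial?.diffuse.contents = image
    plane.firstMaterial?.isDoubleSided = true

    let node = SCNNode(geometry: plane)
    node.position = SCNVector3(0, 0, -0.5) // half a meter in front of the camera
    return node
}
```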
[WIP] An occlusion sample on ARKit using depth.