Integrate Wearables Device Access Toolkit into your iOS app

Updated: Mar 10, 2026

Overview

This guide explains how to add Wearables Device Access Toolkit registration, streaming, and photo capture to an existing iOS app. For a complete working example, see the provided sample app.

Prerequisites

Complete the environment and glasses steps in Setup.
Your integration must use a registered bundle identifier. To register or manage bundle IDs, see Apple’s Register an App ID and Bundle IDs documentation.
App Store Submission Warning: We do not currently support publishing to the App Store (or the Google Play Store), but we plan to do so in the future. In the meantime, you can share your integration with test users via our release channels. Be aware that because the SDK currently uses the ExternalAccessory framework, submitting to the App Store will lead to rejection due to Apple’s MFi program and privacy manifest requirements.

Step 1: Add info properties

In your app’s Info.plist (directly or via the Xcode UI), insert the required keys so the Meta AI app can call back to your app and discover the glasses. AppLinkURLScheme is required so that the Meta AI app can call back to your application. The example below uses myexampleapp as a placeholder; adjust the scheme to match your project.
Add the MetaAppID, ClientToken, and TeamID keys to identify your application to the Wearables Device Access Toolkit. For Developer Mode, you can omit MetaAppID or set it to an empty string (“”). Distributed apps should set the dedicated values from the Wearables Developer Center (see Manage projects).
Note: If you pre-process Info.plist, the :// suffix will be stripped unless you add the -traditional-cpp flag. See Apple Technical Note TN2175.
<!-- Configure custom URL scheme for Meta AI callbacks -->
<key>CFBundleURLTypes</key>
<array>
  <dict>
    <key>CFBundleTypeRole</key>
    <string>Editor</string>
    <key>CFBundleURLName</key>
    <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
    <key>CFBundleURLSchemes</key>
    <array>
      <string>myexampleapp</string>
    </array>
  </dict>
</array>

<!-- External Accessory protocol for Meta Wearables -->
<key>UISupportedExternalAccessoryProtocols</key>
<array>
  <string>com.meta.ar.wearable</string>
</array>

<!-- Background modes for Bluetooth and external accessories -->
<key>UIBackgroundModes</key>
<array>
  <string>bluetooth-peripheral</string>
  <string>external-accessory</string>
</array>
<key>NSBluetoothAlwaysUsageDescription</key>
<string>Needed to connect to Meta Wearables</string>

<!-- Wearables Device Access Toolkit configuration -->
<key>MWDAT</key>
<dict>
  <key>AppLinkURLScheme</key>
  <string>myexampleapp://</string>
  <key>MetaAppID</key>
  <string></string>
</dict>

Step 2: Add the SDK Swift package

Add the SDK through Swift Package Manager.
  1. In Xcode, select File > Add Package Dependencies...
  2. Enter https://github.com/facebook/meta-wearables-dat-ios in the search field in the top-right corner.
  3. Select meta-wearables-dat-ios.
  4. Choose one of the available versions.
  5. Click Add Package.
  6. Select the target to which you want to add the package.
  7. Click Add Package.
Import the required modules in any Swift files that use the SDK.
import MWDATCamera
import MWDATCore

Step 3: Initialize the SDK

Call Wearables.configure() once when your app launches.
func configureWearables() {
  do {
    try Wearables.configure()
  } catch {
    assertionFailure("Failed to configure Wearables SDK: \(error)")
  }
}

Step 4: Launch registration from your app

Register your application with the Meta AI app either at startup or when the user wants to turn on your wearables integration.
func startRegistration() throws {
  try Wearables.shared.startRegistration()
}

func startUnregistration() throws {
  try Wearables.shared.startUnregistration()
}

func handleWearablesCallback(url: URL) async throws {
  _ = try await Wearables.shared.handleUrl(url)
}
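The callback URL has to reach handleWearablesCallback somehow. In a SwiftUI app, one way to wire this up is the onOpenURL modifier; the sketch below assumes a SwiftUI lifecycle, a ContentView of your own, and the myexampleapp scheme from Step 1.

```swift
import SwiftUI
import MWDATCore

// Sketch: route the Meta AI app's callback URL into the SDK.
// ContentView is a placeholder for your app's root view.
@main
struct ExampleApp: App {
  var body: some Scene {
    WindowGroup {
      ContentView()
        .onOpenURL { url in
          Task {
            // Forward the callback URL to the SDK. Errors are only
            // printed here; your app may want to surface them.
            do { _ = try await Wearables.shared.handleUrl(url) }
            catch { print("Wearables callback failed: \(error)") }
          }
        }
    }
  }
}
```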
Observe registration and device updates.
let wearables = Wearables.shared

Task {
  for await state in wearables.registrationStateStream() {
    // Update your registration UI or model
  }
}

Task {
  for await devices in wearables.devicesStream() {
    // Update the list of available glasses
  }
}
Important: A device will not appear in the devicesStream until the user has granted at least one permission (e.g., camera) through the Meta AI app. If your devicesStream is empty after registration, ensure you are calling wearables.requestPermission(.camera), as shown in Step 5 below.
Finally, implement attestation for your app to verify its authenticity. The four identifiers you need to include are:
  1. Team ID (issued by Apple). To find your Team ID, sign in to your Apple developer account.
  2. AppLinkURLScheme (the custom URL scheme you defined in Step 1)
  3. Client token (autogenerated in the Wearables Developer Center)
  4. Meta App ID (autogenerated in the Wearables Developer Center)
<key>MWDAT</key>
<dict>
  <key>AppLinkURLScheme</key>
  <string>myappname://</string>
  <key>MetaAppID</key>
  <!-- Replace the 0 with your Meta App ID, found in the Wearables Developer Center -->
  <string>0</string>
  <key>ClientToken</key>
  <!-- Replace the 0 and 1 placeholders with your Meta App ID and Client Token, respectively, also found in the Wearables Developer Center -->
  <string>AR|0|1</string>
  <!-- Your Apple Developer Team ID - Set this in Xcode under Signing & Capabilities -->
  <key>TeamID</key>
  <string>$(DEVELOPMENT_TEAM)</string>
</dict>

If incorrect identifiers are used or your app is misconfigured, it won’t connect, and you will receive an error.
Note: App attestation is not used in Developer Mode, since these apps rely on local logic rather than connecting to a release channel.

Step 5: Manage camera permissions

Check permission status before streaming and request access if necessary.
var cameraStatus: PermissionStatus = .denied
...
cameraStatus = try await wearables.checkPermissionStatus(.camera)
...
cameraStatus = try await wearables.requestPermission(.camera)
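Putting the two calls together, a typical flow checks the current status and only prompts when needed. This is a sketch: the ensureCameraPermission helper and the .granted case name are assumptions, so match them against the actual PermissionStatus cases in the SDK.

```swift
// Hypothetical helper: check the camera permission, and request it only
// when it has not already been granted. `.granted` is an assumed case name.
func ensureCameraPermission() async throws -> PermissionStatus {
  var cameraStatus = try await wearables.checkPermissionStatus(.camera)
  if cameraStatus != .granted {
    // This hands off to the Meta AI app, where the user approves access.
    cameraStatus = try await wearables.requestPermission(.camera)
  }
  return cameraStatus
}
```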

Step 6: Start a camera stream

Create a StreamSession, observe its state, and display frames. This example uses AutoDeviceSelector, which picks a device on the user’s behalf. Alternatively, use SpecificDeviceSelector if you provide a UI for the user to select a specific device.
You can request a resolution and frame rate using StreamSessionConfig. Valid frameRate values are 2, 7, 15, 24, or 30 FPS. resolution can be set to:
  • high: 720 x 1280 pixels
  • medium: 504 x 896 pixels
  • low: 360 x 640 pixels
StreamSessionState transitions through stopping, stopped, waitingForDevice, starting, streaming, and paused.
Register callbacks to collect frames and state events.
// Let the SDK auto-select from available devices
let deviceSelector = AutoDeviceSelector(wearables: wearables)
let config = StreamSessionConfig(
  videoCodec: VideoCodec.raw,
  resolution: StreamingResolution.low,
  frameRate: 24)
let session = StreamSession(streamSessionConfig: config, deviceSelector: deviceSelector)

let stateToken = session.statePublisher.listen { state in
  Task { @MainActor in
    // Update your streaming UI state
  }
}

let frameToken = session.videoFramePublisher.listen { frame in
  guard let image = frame.makeUIImage() else { return }
  Task { @MainActor in
    // Render the frame in your preview surface
  }
}

Task { await session.start() }
Resolution and frame rate are constrained by the Bluetooth Classic connection between the user’s phone and their AI glasses. To manage limited bandwidth, an automatic ladder reduces quality as needed. It first lowers the resolution by one step (for example, from high to medium). If bandwidth remains constrained, it then reduces the frame rate (for example, 30 to 24), but never below 15 fps.
The image delivered to your app may appear lower quality than expected, even when the resolution reports high or medium. This is due to per‑frame compression that adapts to available Bluetooth Classic bandwidth. Requesting a lower resolution, a lower frame rate, or both can yield higher visual quality with less compression loss.
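The adaptation described above can be sketched in pure Swift. The types and function here are illustrative only, not SDK API, and the exact ordering the SDK uses per step may differ; the sketch just encodes the stated rules: resolution drops first, then frame rate, never below 15 fps.

```swift
// Illustrative model of the adaptive quality ladder; not SDK API.
enum SketchResolution: Int, Comparable {
  case low, medium, high
  static func < (a: SketchResolution, b: SketchResolution) -> Bool {
    a.rawValue < b.rawValue
  }
}

// Frame rates the ladder will step through; 15 fps is the floor.
let frameRateLadder = [30, 24, 15]

// One adaptation step: lower the resolution first; once at the lowest
// resolution, reduce the frame rate, but never below 15 fps.
func stepDown(resolution: SketchResolution, frameRate: Int) -> (SketchResolution, Int) {
  if resolution > .low {
    return (SketchResolution(rawValue: resolution.rawValue - 1)!, frameRate)
  }
  if let idx = frameRateLadder.firstIndex(of: frameRate), idx + 1 < frameRateLadder.count {
    return (resolution, frameRateLadder[idx + 1])
  }
  return (resolution, frameRate)  // already at the floor
}
```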

Step 7: Capture and share photos

Listen for photoDataPublisher events and handle the returned PhotoData. Then, when a stream session is active, call capturePhoto.
_ = session.photoDataPublisher.listen { photoData in
  let data = photoData.data
  // Convert to UIImage or hand off to your storage layer
}

session.capturePhoto(format: .jpeg)

Next steps