
Integrate Wearables Device Access Toolkit into your Android app

Updated: Mar 10, 2026

Overview

This guide explains how to add Wearables Device Access Toolkit registration, streaming, and photo capture to an existing Android app. For a complete working sample, compare with the provided sample app.

Prerequisites

Complete the environment and glasses configuration steps in Setup.

Step 1: Add manifest entries

In your app’s AndroidManifest.xml, add the permissions your app needs to communicate with the glasses over Bluetooth. The intent filter with the custom URI scheme is required so that the Meta AI app can call back to your application. The example below uses myexampleapp as a placeholder; adjust the scheme to match your project.
Add the APPLICATION_ID meta-data entry to provide the Wearables Device Access Toolkit with your application ID. In Developer Mode you can omit it or use 0; published apps receive a dedicated value from the Wearables Developer Center (see Manage projects).
<manifest ...>
    <!-- Runtime permissions used by DAT -->
    <uses-permission android:name="android.permission.BLUETOOTH" />
    <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
    <uses-permission android:name="android.permission.INTERNET" />

    <application ...>
        <!-- Use 0 in Developer Mode; production apps receive a unique ID from Wearables Developer Center -->
        <meta-data
            android:name="com.meta.wearable.mwdat.APPLICATION_ID"
            android:value="0" />

        <!-- Callback scheme Meta AI uses to return to your app -->
        <activity android:name=".MainActivity" ...>
            <intent-filter>
                <action android:name="android.intent.action.VIEW" />
                <category android:name="android.intent.category.DEFAULT" />
                <category android:name="android.intent.category.BROWSABLE" />
                <data android:scheme="myexampleapp" />
            </intent-filter>
        </activity>
    </application>
</manifest>
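Because the intent filter above means the Meta AI app reopens your app via a `myexampleapp://…` URI, it can help to sanity-check an incoming callback. The helper below is hypothetical (not part of the SDK) and uses the placeholder scheme from the manifest:

```kotlin
import java.net.URI

// Hypothetical helper: returns true when a callback URI uses the scheme
// declared in the manifest's intent filter ("myexampleapp" is the placeholder).
fun isWearablesCallback(uri: String, expectedScheme: String = "myexampleapp"): Boolean =
    runCatching { URI(uri).scheme == expectedScheme }.getOrDefault(false)
```

In an Activity you would typically run this check against `intent.data` in `onCreate` or `onNewIntent` before reacting to the callback.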

Step 2: Add the SDK to Gradle

The Wearables Device Access Toolkit is distributed through GitHub Packages.
Add the Wearables Device Access Toolkit Maven repository to your app’s Gradle repositories in settings.gradle.kts.
import java.util.Properties
import kotlin.io.path.div
import kotlin.io.path.exists
import kotlin.io.path.inputStream

val localProperties =
    Properties().apply {
        val localPropertiesPath = rootDir.toPath() / "local.properties"
        if (localPropertiesPath.exists()) {
            load(localPropertiesPath.inputStream())
        }
    }

dependencyResolutionManagement {
    ...
    repositories {
        ...
        maven {
            url = uri("https://maven.pkg.github.com/facebook/meta-wearables-dat-android")
            credentials {
                username = "" // not needed
                password = System.getenv("GITHUB_TOKEN") ?: localProperties.getProperty("github_token")
            }
        }
    }
}
Next, declare the Wearables Device Access Toolkit artifacts in libs.versions.toml. Check the available versions in GitHub Packages.
[versions]
mwdat = "0.5.0"

[libraries]
mwdat-core = { group = "com.meta.wearable", name = "mwdat-core", version.ref = "mwdat" }
mwdat-camera = { group = "com.meta.wearable", name = "mwdat-camera", version.ref = "mwdat" }
mwdat-mockdevice = { group = "com.meta.wearable", name = "mwdat-mockdevice", version.ref = "mwdat" }
Then, add them as dependencies in your app’s build.gradle.kts.
dependencies {
    implementation(libs.mwdat.core)
    implementation(libs.mwdat.camera)
    implementation(libs.mwdat.mockdevice)
}
To build and install your app with the Wearables Device Access Toolkit, you need a personal access token (classic) with at least the read:packages scope in GitHub. Follow these instructions to create a new personal access token (classic).
Then, provide this personal access token following one of these two approaches:
  • In a terminal, set the environment variable GITHUB_TOKEN with your personal access token.
    export GITHUB_TOKEN=ghp...  # your personal access token (classic)
    
    ./gradlew installDebug  # from the directory of the actual project
    
  • Alternatively, you can create a local.properties file in the project root and set the key github_token with your personal access token. Then, in Android Studio, refresh the Gradle project by clicking File > Sync Project with Gradle Files.
    github_token=ghp...  # your personal access token (classic)
    
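The two approaches mirror the lookup order in the settings.gradle.kts snippet above: an environment variable wins, and local.properties is the fallback. As a plain-Kotlin sketch of that order (the function name is illustrative):

```kotlin
import java.io.File
import java.util.Properties

// Sketch of the credential lookup used in settings.gradle.kts:
// a GITHUB_TOKEN environment value wins; otherwise fall back to the
// github_token key in local.properties, if that file exists.
fun resolveGithubToken(env: Map<String, String>, localPropertiesFile: File): String? {
    env["GITHUB_TOKEN"]?.let { return it }
    if (!localPropertiesFile.exists()) return null
    val props = Properties().apply { localPropertiesFile.inputStream().use { load(it) } }
    return props.getProperty("github_token")
}
```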

Step 3: Initialize the SDK

Initialize the SDK once per process at startup.
Wearables.initialize(context)
Invoking other Wearables Device Access Toolkit APIs before initialization yields WearablesError.NOT_INITIALIZED.
For lifecycle placement guidance, read Session lifecycle.
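The NOT_INITIALIZED behavior can be pictured with a small guard of the kind the toolkit applies internally. The names below are illustrative stand-ins, not the SDK's own types:

```kotlin
// Illustrative only: mirrors the documented behavior that calling toolkit
// APIs before Wearables.initialize() yields WearablesError.NOT_INITIALIZED.
object WearablesSketch {
    private var initialized = false

    fun initialize() { initialized = true }

    // Returns a failed Result instead of throwing, for simplicity.
    fun <T> call(api: () -> T): Result<T> =
        if (!initialized) Result.failure(IllegalStateException("NOT_INITIALIZED"))
        else runCatching(api)
}
```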

Step 4: Launch registration from your app

Register your application with the Meta AI app either at startup or when the user wants to turn on your wearables integration.
fun requestWearablesRegistration(activity: Activity) {
    Wearables.startRegistration(activity)
}

fun requestWearablesUnregistration(activity: Activity) {
    Wearables.startUnregistration(activity)
}
Observe registration and device updates.
...

Wearables.registrationState.collect { state ->
    onState(state)
}

...
Wearables.devices.collect { devices ->
    onDevices(devices.toList())
}
Next, implement attestation for your app to ensure its authenticity. The two identifiers you need to include inside the <application> tag of your AndroidManifest.xml file are:
  1. Application ID
  2. Client token
Both can be found in the Wearables Developer Center (see Manage projects).
<!-- Replace the 0 placeholder with your Application ID, found in the Wearables Developer Center -->
<meta-data android:name="com.meta.wearable.mwdat.APPLICATION_ID" android:value="0" />
<!-- Replace the 0 and 1 placeholders with Your Application ID and Client Token, respectively, also found in the Wearables Developer Center -->
<meta-data android:name="com.meta.wearable.mwdat.CLIENT_TOKEN" android:value="AR|0|1" />
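The CLIENT_TOKEN value joins a prefix, your Application ID, and your client token with | separators, as the AR|0|1 placeholder shows. A small sketch of assembling and splitting that value (format inferred from the placeholder above; helper names are hypothetical):

```kotlin
// Builds the CLIENT_TOKEN meta-data value from its parts, matching the
// "AR|<application id>|<client token>" shape of the placeholder above.
fun buildClientTokenValue(applicationId: String, clientToken: String): String =
    "AR|$applicationId|$clientToken"

// Splits a CLIENT_TOKEN value back into (prefix, application id, client token),
// or returns null when the value does not have exactly three parts.
fun splitClientTokenValue(value: String): Triple<String, String, String>? {
    val parts = value.split("|")
    return if (parts.size == 3) Triple(parts[0], parts[1], parts[2]) else null
}
```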
While an App Signature is not required for attestation, the Meta AI app will use it to verify the authenticity of your app. If incorrect identifiers are used or your app is misconfigured, it won’t connect, and you will receive an error.
Note: App attestation is not used in developer mode, since these apps rely on local logic rather than connecting to a release channel.

Step 5: Manage camera permissions

Before streaming, check the Wearables camera permission and launch the SDK contract if required.
var permissionStatus = Wearables.checkPermissionStatus(Permission.CAMERA)
if (permissionStatus != PermissionStatus.Granted) {
    permissionStatus = requestWearablesPermission(Permission.CAMERA)
}
if (permissionStatus == PermissionStatus.Granted) {
    // start streaming
}

...

private var permissionContinuation: CancellableContinuation<PermissionStatus>? = null
private val permissionMutex = Mutex()
// Requesting wearable device permissions via the Meta AI app
private val permissionsResultLauncher =
    registerForActivityResult(Wearables.RequestPermissionContract()) { result ->
        permissionContinuation?.resume(result)
        permissionContinuation = null
    }

// Convenience method to make a permission request in a sequential manner
// Uses a Mutex to ensure requests are processed one at a time, preventing race conditions
suspend fun requestWearablesPermission(permission: Permission): PermissionStatus {
    return permissionMutex.withLock {
        suspendCancellableCoroutine { continuation ->
            permissionContinuation = continuation
            continuation.invokeOnCancellation { permissionContinuation = null }
            permissionsResultLauncher.launch(permission)
        }
    }
}

Step 6: Start a camera stream

Create a StreamSession to observe its state and display frames. This example uses AutoDeviceSelector, which picks a device on the user’s behalf. Alternatively, use SpecificDeviceSelector if you provide a UI for the user to choose a device.
You can request resolution and frame rate control using StreamConfiguration. Valid frameRate values are 2, 7, 15, 24, or 30 FPS. videoQuality can be set to:
  • HIGH: 720 x 1280 pixels
  • MEDIUM: 504 x 896 pixels
  • LOW: 360 x 640 pixels
StreamSessionState transitions through STARTING, STARTED, STREAMING, STOPPING, STOPPED, and CLOSED.
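Those states can be modeled locally to drive UI. Below, a stand-in enum mirroring the documented names, with the forward transitions spelled out (the real StreamSessionState lives in the SDK; this local copy is only for illustration):

```kotlin
// Local stand-in for the SDK's StreamSessionState, used to illustrate the
// documented lifecycle: STARTING -> STARTED -> STREAMING -> STOPPING -> STOPPED -> CLOSED.
enum class SessionState { STARTING, STARTED, STREAMING, STOPPING, STOPPED, CLOSED }

val nextState: Map<SessionState, SessionState> = mapOf(
    SessionState.STARTING to SessionState.STARTED,
    SessionState.STARTED to SessionState.STREAMING,
    SessionState.STREAMING to SessionState.STOPPING,
    SessionState.STOPPING to SessionState.STOPPED,
    SessionState.STOPPED to SessionState.CLOSED,
)

// Walks the happy path from STARTING through to CLOSED.
fun lifecyclePath(): List<SessionState> =
    generateSequence(SessionState.STARTING) { nextState[it] }.toList()
```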
Register callbacks to collect frames and state events.
private var session: StreamSession? = null

fun start() {
    val streamSession = Wearables.startStreamSession(
        context = context,
        deviceSelector = AutoDeviceSelector(),
        streamConfiguration = StreamConfiguration(
            videoQuality = VideoQuality.MEDIUM,
            frameRate = 24,
        ),
    )

    session = streamSession

    scope.launch {
        streamSession.videoStream.collect { frame ->
            displayFrame(frame)
        }
    }

    scope.launch {
        streamSession.state.collect { state ->
            updateStreamUi(state)
            if (state == StreamSessionState.STOPPED) {
                stopStream()
            }
        }
    }
}
Resolution and frame rate are constrained by the Bluetooth Classic connection between the user’s phone and their AI glasses. To manage limited bandwidth, an automatic ladder reduces quality as needed. It first lowers the resolution by one step (for example, from HIGH to MEDIUM). If bandwidth remains constrained, it then reduces the frame rate (for example, 30 to 24), but never below 15 fps.
The image delivered to your app may appear lower quality than expected, even when the resolution reports HIGH or MEDIUM. This is due to per‑frame compression that adapts to available Bluetooth Classic bandwidth. Requesting a lower resolution, a lower frame rate, or both can yield higher visual quality with less compression loss.
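The degradation ladder described above can be sketched in plain Kotlin. The quality levels, valid frame rates, and the 15 fps floor come from this page; the enum, list, and function names are hypothetical:

```kotlin
// Sketch of the documented bandwidth ladder: first step the resolution down
// one level (HIGH -> MEDIUM -> LOW); only once at LOW, step the frame rate
// down through the valid rates, never below the documented 15 fps floor.
enum class Quality { HIGH, MEDIUM, LOW }

val validFrameRates = listOf(2, 7, 15, 24, 30) // from StreamConfiguration

fun stepDown(quality: Quality, frameRate: Int): Pair<Quality, Int> {
    // 1. Lower the resolution first.
    if (quality != Quality.LOW) {
        return Quality.values()[quality.ordinal + 1] to frameRate
    }
    // 2. Then lower the frame rate, respecting the 15 fps floor.
    val lower = validFrameRates.filter { it in 15 until frameRate }.maxOrNull()
    return quality to (lower ?: frameRate)
}
```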

Step 7: Capture and share photos

When a stream session is active, call capturePhoto and handle the returned PhotoData. Add app/src/main/res/xml/file_paths.xml so that the FileProvider can expose cached images.
session?.capturePhoto()
    ?.onSuccess { data ->
        ...
    }
    ?.onFailure(onError)
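For reference, a minimal file_paths.xml could look like the sketch below. The path names here are placeholders; match them to wherever your app actually caches captured photos, and keep the file at app/src/main/res/xml/file_paths.xml as referenced by your FileProvider declaration.

```xml
<!-- app/src/main/res/xml/file_paths.xml (sketch; "captured_photos" and
     "images/" are placeholder names, not values required by the toolkit) -->
<paths xmlns:android="http://schemas.android.com/apk/res/android">
    <cache-path name="captured_photos" path="images/" />
</paths>
```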

Next steps