Official Android Client SDK for LiveKit. Easily add video & audio capabilities to your Android apps.
Docs and guides at https://docs.livekit.io
LiveKit for Android is available as a Maven package.
...
dependencies {
    implementation "io.livekit:livekit-android:<version>"
}
You'll also need JitPack as one of your repositories.
subprojects {
    repositories {
        google()
        mavenCentral()
        // ...
        maven { url 'https://jitpack.io' }
    }
}
There are two sample apps with similar functionality included in this repository.
LiveKit relies on the RECORD_AUDIO and CAMERA permissions to use the microphone and camera. These permissions must be requested at runtime; reference the sample app for a complete example.
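If you're not using the sample app as a starting point, a runtime request can look like the following. This is a minimal sketch using the AndroidX Activity Result API; the launcher name and denial handling are illustrative, not part of the LiveKit SDK.

// Register before the Activity is started, ideally as an instance val.
val permissionLauncher = registerForActivityResult(
    ActivityResultContracts.RequestMultiplePermissions()
) { grants ->
    // grants maps each requested permission to whether it was granted.
    if (grants.values.any { !it }) {
        // Handle denial; LiveKit can't capture audio or video without these.
    }
}

// Request both permissions before enabling the camera and microphone.
permissionLauncher.launch(
    arrayOf(Manifest.permission.RECORD_AUDIO, Manifest.permission.CAMERA)
)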
// Both calls are suspend functions, so launch them from a coroutine.
lifecycleScope.launch {
    room.localParticipant.setCameraEnabled(true)
    room.localParticipant.setMicrophoneEnabled(true)
}
// Create an intent launcher for screen capture.
// This *must* be registered prior to onCreate(), ideally as an instance val.
val screenCaptureIntentLauncher = registerForActivityResult(
    ActivityResultContracts.StartActivityForResult()
) { result ->
    val resultCode = result.resultCode
    val data = result.data
    if (resultCode != Activity.RESULT_OK || data == null) {
        return@registerForActivityResult
    }
    lifecycleScope.launch {
        room.localParticipant.setScreenShareEnabled(true, data)
    }
}
// When it's time to enable the screen share, perform the following:
val mediaProjectionManager =
    getSystemService(MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
screenCaptureIntentLauncher.launch(mediaProjectionManager.createScreenCaptureIntent())
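Disabling works the same way; a sketch, assuming the projection intent is only required when enabling:

// Stop the screen share when finished.
lifecycleScope.launch {
    room.localParticipant.setScreenShareEnabled(false)
}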
LiveKit uses the WebRTC-provided org.webrtc.SurfaceViewRenderer to render video tracks. Subscribed audio tracks are automatically played.
class MainActivity : AppCompatActivity(), RoomListener {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        ...
        val url = "wss://your_host"
        val token = "your_token"

        lifecycleScope.launch {
            val room = LiveKit.connect(
                applicationContext,
                url,
                token,
                ConnectOptions(),
                this@MainActivity
            )
            val localParticipant = room.localParticipant
            localParticipant.setMicrophoneEnabled(true)
            localParticipant.setCameraEnabled(true)

            // Attach the local camera track, now that it's published.
            val videoTrack = localParticipant
                .getTrackPublication(Track.Source.CAMERA)?.track as? VideoTrack
            videoTrack?.let { attachVideo(it) }
        }
    }

    override fun onTrackSubscribed(
        track: Track,
        publication: RemoteTrackPublication,
        participant: RemoteParticipant,
        room: Room
    ) {
        if (track is VideoTrack) {
            attachVideo(track)
        }
    }

    private fun attachVideo(videoTrack: VideoTrack) {
        // viewBinding.renderer is an org.webrtc.SurfaceViewRenderer in your layout.
        videoTrack.addRenderer(viewBinding.renderer)
    }
}
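On teardown, disconnecting and releasing the renderer avoids leaking the connection and the renderer's resources. A sketch, assuming room is kept as a property of the Activity rather than a local as above:

override fun onDestroy() {
    // Leave the room and free the renderer's resources.
    room.disconnect()
    viewBinding.renderer.release()
    super.onDestroy()
}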
To develop the Android SDK or to run the sample apps, you'll need:
- Ensure the protocol submodule repo is initialized and updated with:
  git submodule update --init
- Install Android Studio Arctic Fox 2020.3.1+
For those developing on Apple M1 Macs, please add the following to $HOME/.gradle/gradle.properties:
protoc_platform=osx-x86_64
- Download webrtc sources from https://webrtc.googlesource.com/src
- Add sources to Android Studio by pointing at the webrtc/sdk/android folder.