Custom Video Source and Renderer
Introduction
Generally, Agora SDKs use default video modules for capturing and rendering in real-time communications.
However, these default modules might not meet your development requirements, such as in the following scenarios:
- Your app has its own video module.
- You want to use a non-camera source, such as recorded screen data.
- You need to process the captured video with a pre-processing library for functions such as image enhancement.
- You need flexible device resource allocation to avoid conflicts with other services.
Agora provides a solution to enable a custom video source and/or renderer in the above scenarios. This article describes how to do so using the Agora Native SDK.
Before proceeding, ensure that you have implemented the basic real-time communication functions in your project. For details, see Start a Video Call or Start Live Interactive Video Streaming.
Sample project
Agora provides open-source sample projects on GitHub. You can view the source code on GitHub or download a project to try it out.
Custom video source
The Agora Native SDK provides the following two modes for customizing the video source:
- Push mode: In this mode, call the `setExternalVideoSource` method to specify the custom video source. After implementing video capture using the custom video source, call the `pushExternalVideoFrame` method to send the captured video frames to the SDK.
  - Video frames captured in Push mode cannot be rendered by the SDK. If you capture video frames in Push mode and need to enable local preview, you must use a custom video renderer.
  - Switching in the channel from custom video capture by Push to SDK capture is not supported. To switch the video source directly, you must use custom video capture by MediaIO. See How can I switch from custom video capture to SDK capture.
- MediaIO mode: In this mode, call the `setVideoSource` method to specify the custom video source. Then call the `consumeByteBufferFrame`, `consumeByteArrayFrame`, or `consumeTextureFrame` method to retrieve the captured video frames and send them to the SDK.
Push mode
Refer to the following steps to customize the video source in your project:
- Before calling `joinChannel`, call `setExternalVideoSource` to specify the custom video source.
- Implement video capture and processing yourself using methods from outside the SDK. According to your app scenario, you can configure the properties of `AgoraVideoFrame` before sending the captured video frames to the SDK. For example, you can set `rotation` to `180` to rotate the video frames by 180 degrees clockwise.
- Call `pushExternalVideoFrame` to send the video frames to the SDK for later use.
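A minimal sketch of these steps, assuming the Android edition of the SDK (3.x `io.agora.rtc` package). The `RtcEngine` instance, the capture module that invokes `onFrameCaptured`, and the NV21 buffers are placeholders for your own setup:

```java
import io.agora.rtc.RtcEngine;
import io.agora.rtc.video.AgoraVideoFrame;

public class PushModeSource {
    private final RtcEngine engine;

    public PushModeSource(RtcEngine engine) {
        this.engine = engine;
        // Before joinChannel: enable the external video source.
        // useTexture = false because we push raw NV21 buffers; pushMode = true.
        engine.setExternalVideoSource(true, false, true);
    }

    // Called from your own capture module for every captured frame.
    public void onFrameCaptured(byte[] nv21, int width, int height) {
        AgoraVideoFrame frame = new AgoraVideoFrame();
        frame.format = AgoraVideoFrame.FORMAT_NV21; // raw buffer format
        frame.buf = nv21;
        frame.stride = width;
        frame.height = height;
        frame.rotation = 180; // example: rotate the frame by 180 degrees
        frame.timeStamp = System.currentTimeMillis();
        engine.pushExternalVideoFrame(frame);
    }
}
```

Remember that frames pushed this way are not rendered by the SDK, so local preview must come from your own renderer.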
API call sequence
Refer to the following diagram to implement the custom video source.
Some devices do not support encoding video frames in Texture format. Call `isTextureEncodeSupported` to find out whether the current device supports it, and then use the returned result to set the `useTexture` parameter in the `setExternalVideoSource` method.
Video data transfer
The following diagram shows how the video data is transferred when you customize the video source in Push mode:
- You need to implement the capture module yourself using methods from outside the SDK.
- Captured video frames are sent to the SDK via the `pushExternalVideoFrame` method.
Code samples
The following code samples use the camera as the custom video source.
- Before joining a channel, call `setExternalVideoSource` to specify the custom video source.
- Configure the video capture module, and implement your custom video source. The code sample uses the camera as the custom video source.
- The `onFrameAvailable` callback (an Android callback; see the Android documentation) is triggered when new video frames appear in the `TextureView`. The callback implements the following operations:
  - Renders the captured video frames using the custom renderer for later use in the local view.
  - Calls `pushExternalVideoFrame` to send the captured video frames to the SDK.
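As a hedged sketch of the texture path described above, assuming the Android SDK and that `setExternalVideoSource` was called with `useTexture` set to `true`: the OES texture ID, the `SurfaceTexture`, and the 1280x720 capture size are placeholders from a hypothetical capture module.

```java
import android.graphics.SurfaceTexture;
import io.agora.rtc.RtcEngine;
import io.agora.rtc.video.AgoraVideoFrame;

public class TextureCapture implements SurfaceTexture.OnFrameAvailableListener {
    private final RtcEngine engine;
    private final int textureId; // OES texture your camera renders into
    private final float[] transform = new float[16];

    public TextureCapture(RtcEngine engine, int textureId) {
        this.engine = engine;
        this.textureId = textureId;
    }

    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        st.updateTexImage();              // latch the newest camera frame
        st.getTransformMatrix(transform);
        // 1. Draw the texture with your custom renderer for local preview here.
        // 2. Push the same texture to the SDK.
        AgoraVideoFrame frame = new AgoraVideoFrame();
        frame.format = AgoraVideoFrame.FORMAT_TEXTURE_OES;
        frame.textureID = textureId;
        frame.transform = transform;
        frame.stride = 1280;              // capture width (example value)
        frame.height = 720;               // capture height (example value)
        frame.timeStamp = st.getTimestamp() / 1000000L; // ns -> ms
        // Also set frame.eglContext14 (or eglContext11) to the EGLContext
        // shared with your GL thread before pushing a texture frame.
        engine.pushExternalVideoFrame(frame);
    }
}
```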
API reference
MediaIO mode
In MediaIO mode, the Agora SDK provides the `IVideoSource` interface and the `IVideoFrameConsumer` class to configure the format of captured video frames and control the process of video capturing.
Refer to the following steps to customize the video source in your project:
- Implement the `IVideoSource` interface, which configures the format of captured video frames and controls the process of video capturing through a set of callbacks:
  - After receiving the `getBufferType` callback, specify the format of the captured video frames in the return value.
  - After receiving the `onInitialize` callback, save the `IVideoFrameConsumer` object, which sends and receives video frames captured by a custom source.
  - After receiving the `onStart` callback, start sending the captured video frames to the SDK by calling the `consumeByteBufferFrame`, `consumeByteArrayFrame`, or `consumeTextureFrame` method in the `IVideoFrameConsumer` object. Before sending the video frames, you can modify the video frame parameters in `IVideoFrameConsumer`, such as `rotation`, according to your app scenario.
  - After receiving the `onStop` callback, stop the `IVideoFrameConsumer` object from sending video frames to the SDK.
  - After receiving the `onDispose` callback, release the `IVideoFrameConsumer` object.
- Inherit the `IVideoSource` class implemented in step 1, and construct an object for the custom video source.
- Call the `setVideoSource` method to assign the custom video source object to `RtcEngine`.
- According to your app scenario, call the `startPreview` or `joinChannel` method to preview or publish the captured video frames.
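The `IVideoSource` lifecycle above can be sketched as follows for the Android SDK. Depending on your SDK version, the interface may declare additional callbacks such as `getCaptureType` and `getContentHint`, and `deliverFrame` is a hypothetical method called from your own capture loop:

```java
import io.agora.rtc.mediaio.IVideoFrameConsumer;
import io.agora.rtc.mediaio.IVideoSource;
import io.agora.rtc.mediaio.MediaIO;

public class MyVideoSource implements IVideoSource {
    private volatile IVideoFrameConsumer consumer;
    private volatile boolean started;

    @Override
    public boolean onInitialize(IVideoFrameConsumer consumer) {
        this.consumer = consumer; // save the consumer for later use
        return true;
    }

    @Override
    public boolean onStart() {
        started = true;           // begin feeding frames from your capture thread
        return true;
    }

    @Override
    public void onStop() {
        started = false;          // stop feeding frames, keep the consumer
    }

    @Override
    public void onDispose() {
        consumer = null;          // release the consumer
    }

    @Override
    public int getBufferType() {
        // Must match the type of frames you deliver (BYTE_ARRAY, BYTE_BUFFER, TEXTURE).
        return MediaIO.BufferType.BYTE_ARRAY.intValue();
    }

    // Hypothetical entry point invoked by your own capture loop.
    public void deliverFrame(byte[] nv21, int width, int height, int rotation) {
        IVideoFrameConsumer c = consumer;
        if (started && c != null) {
            c.consumeByteArrayFrame(nv21, MediaIO.PixelFormat.NV21.intValue(),
                    width, height, rotation, System.currentTimeMillis());
        }
    }
}
```

You would then assign it with `rtcEngine.setVideoSource(new MyVideoSource());` before previewing or joining a channel.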
API call sequence
Refer to the following diagram to implement the custom video source:
Video data transfer
The following diagram shows how the video data is transferred when you customize the video source in MediaIO mode:
- You need to implement the capture module yourself using methods from outside the SDK.
- Captured video frames are sent to the SDK via the `consumeByteBufferFrame`, `consumeByteArrayFrame`, or `consumeTextureFrame` method.
Code samples
The following code samples use a local video file as the custom video source.
- Implement the `IVideoSource` interface and the `IVideoFrameConsumer` class, and override the callbacks in the `IVideoSource` interface.
- Specify the custom video source before joining a channel.
- Configure external video input by implementing the `setExternalVideoInput` method.
- Implement the local video thread, and decode the local video file. The decoded video frames are rendered to `Surface`.
- After the local user joins the channel, the capture module consumes the video frames through the `consumeTextureFrame` method in `ExternalVideoInputThread` and sends the frames to the SDK.
API reference
See also
If your app has its own video capture module and needs to integrate the Agora SDK for real-time communication purposes, you can use the Agora Component to enable and disable video frame input through the callbacks in Media Engine. For details, see Customize the Video Source with the Agora Component.
Custom video renderer
The Agora SDK provides the `IVideoSink` interface to customize the video renderer in your project.
Refer to the following steps to implement the video renderer:
- Implement the `IVideoSink` interface, which configures the format of captured video frames and controls the process of video rendering through a set of callbacks:
  - After receiving the `getBufferType` and `getPixelFormat` callbacks, specify the format of the rendered video frames in the return value.
  - After receiving the `onInitialize`, `onStart`, `onStop`, `onDispose`, and `getEglContextHandle` callbacks, perform the corresponding operations.
  - Implement the `IVideoFrameConsumer` class for the rendered video frames' format to retrieve the video frames.
- Inherit the `IVideoSink` class implemented in step 1, and create a rendering module for the custom renderer.
- Call the `setLocalVideoRenderer` or `setRemoteVideoRenderer` method to set the renderer for the video of the local or remote user.
- According to your app scenario, call the `startPreview` or `joinChannel` method to preview or publish the rendered video frames.
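A sketch of a byte-buffer `IVideoSink` implementation, assuming the Android SDK's `io.agora.rtc.mediaio` package; the exact callback set may vary slightly by SDK version, and the display pipeline is left as a comment:

```java
import java.nio.ByteBuffer;
import io.agora.rtc.mediaio.IVideoSink;
import io.agora.rtc.mediaio.MediaIO;

public class MyVideoSink implements IVideoSink {
    @Override public boolean onInitialize() { return true; }  // set up your renderer
    @Override public boolean onStart() { return true; }       // start rendering
    @Override public void onStop() { }                        // pause rendering
    @Override public void onDispose() { }                     // release resources
    @Override public long getEGLContextHandle() { return 0; } // 0: no shared GL context

    @Override
    public int getBufferType() {
        return MediaIO.BufferType.BYTE_BUFFER.intValue();
    }

    @Override
    public int getPixelFormat() {
        return MediaIO.PixelFormat.I420.intValue();
    }

    // IVideoFrameConsumer callbacks: the SDK delivers frames to render here.
    @Override
    public void consumeByteBufferFrame(ByteBuffer buffer, int format, int width,
                                       int height, int rotation, long timestamp) {
        // Hand the I420 buffer to your display pipeline, honoring `rotation`.
    }

    @Override
    public void consumeByteArrayFrame(byte[] data, int format, int width,
                                      int height, int rotation, long timestamp) { }

    @Override
    public void consumeTextureFrame(int textureId, int format, int width, int height,
                                    int rotation, long timestamp, float[] matrix) { }
}
```

You would then register it with, for example, `rtcEngine.setRemoteVideoRenderer(uid, new MyVideoSink());`.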
API call sequence
Refer to the following diagram to implement the custom video renderer in MediaIO mode:
Video data transfer
The following diagram shows how the video data is transferred when you customize the video renderer in MediaIO mode:
- You need to implement the rendering module yourself using methods from outside the SDK.
- Captured video frames are sent to the rendering module via the `consumeByteBufferFrame`, `consumeByteArrayFrame`, or `consumeTextureFrame` method.
Code samples
The code samples provide two options for implementing the custom video renderer in your project.
Option 1: Use components provided by Agora
The Agora SDK provides classes and code samples that are designed to help you easily integrate and create a custom video renderer. You can use these components directly, or you can create a custom renderer based on these components. See Customize the Video Sink with the Agora Component.
After the local user joins the channel, import and implement the `AgoraSurfaceView` class, then set the remote video renderer. The `AgoraSurfaceView` class inherits the `SurfaceView` class and implements the `IVideoSink` interface. `AgoraSurfaceView` also embeds a `BaseVideoRenderer` object that serves as the rendering module, which means you do not need to implement the `IVideoSink` interface or customize the rendering module yourself. The `BaseVideoRenderer` object uses OpenGL as the renderer, creates an EGLContext, and shares the handle of the EGLContext with the media engine. For more information about how to implement the `AgoraSurfaceView` class, see the demo project.
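Based on the description above, setting a remote renderer with `AgoraSurfaceView` might look like the following sketch; `remoteContainer`, `context`, and `uid` are placeholders, and the `init`/`setBufferType`/`setPixelFormat` calls are assumed to follow the demo project's usage:

```java
import android.content.Context;
import android.view.ViewGroup;
import io.agora.rtc.RtcEngine;
import io.agora.rtc.mediaio.AgoraSurfaceView;
import io.agora.rtc.mediaio.MediaIO;

public class RemoteRenderSetup {
    // Call after onUserJoined reports the remote uid.
    static void setRemoteRenderer(RtcEngine rtcEngine, Context context,
                                  ViewGroup remoteContainer, int uid) {
        AgoraSurfaceView surfaceView = new AgoraSurfaceView(context);
        surfaceView.init(null); // no shared EGLContext
        surfaceView.setBufferType(MediaIO.BufferType.BYTE_ARRAY);
        surfaceView.setPixelFormat(MediaIO.PixelFormat.I420);
        remoteContainer.addView(surfaceView);
        // AgoraSurfaceView implements IVideoSink, so it can be set directly.
        rtcEngine.setRemoteVideoRenderer(uid, surfaceView);
    }
}
```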
Option 2: Use the `IVideoSink` interface
You can implement the `IVideoSink` interface yourself to construct a rendering module for the custom renderer.
API reference
Considerations
- Performing the following operations requires you to use methods from outside the Agora SDK:
  - Manage the capture and processing of video frames when using a custom video source.
  - Manage the processing and display of video frames when using a custom video renderer.
- When using a custom video renderer, if the `consumeByteArrayFrame`, `consumeByteBufferFrame`, or `consumeTextureFrame` callback reports that `rotation` is not `0`, the rendered video frames are rotated by a certain degree. This may be caused by the capture settings of the SDK or by your custom video source. Modify `rotation` according to your application scenario.
according to your application scenario. -
If the format of the custom captured video is Texture and the remote user sees anomalies (such as flickering and distortion) in the local custom captured video, Agora recommends that you make a copy of the video data before sending the custom video data back to the SDK, and then send both the original video data and the copied video data back to the SDK. This eliminates the anomalies during the internal data encoding.
See also
If you want to customize the audio source or renderer in your project, see Custom Audio Source and Renderer.