
Custom video and audio sources

By default, Video SDK uses the basic audio and video modules on the device your app runs on. However, there are certain scenarios where you want to integrate a custom audio or video source into your app, such as:

  • Your app has its own audio or video module.
  • You want to use a non-camera source, such as recorded screen data.
  • You need to process the captured audio or video with a pre-processing library for audio or image enhancement.
  • You need flexible device resource allocation to avoid conflicts with other services.

Understand the tech

To set an external audio or video source, you configure the Agora Engine before joining a channel. To manage the capture and processing of audio and video frames, you use methods from outside the Video SDK that are specific to your custom source. Video SDK enables you to push processed audio and video data to the subscribers in a channel.

The figure below shows the workflow you need to implement to stream a custom video or audio source in your app.

Figure: Custom video and audio workflow

Prerequisites

To follow this procedure, you must have implemented the SDK quickstart project for Broadcast Streaming.

Project setup

To create the environment necessary to implement custom audio and video into your app, open the SDK quickstart Broadcast Streaming project you created previously.

Integrate custom audio or video

To stream from a custom source, you convert the data stream into a suitable format and push this data using Video SDK.

Implement a custom video source

In this section you create the basic framework required to push video frames from a custom source. Depending on the type of your source, you add your own code to this framework that converts the source data to VideoFrame data. To create the basic framework, take the following steps:

  1. Import the required Agora and Android libraries

    You use the Android TextureView and SurfaceTexture objects for rendering custom video. The video data from the SurfaceTexture is converted to a VideoFrame before it is pushed to the channel. To use these libraries in your app, add the following statements after the last import statement in /app/java/com.example.<projectname>/MainActivity.


    import io.agora.base.VideoFrame;
    import android.graphics.SurfaceTexture;
    import android.view.TextureView;
    import androidx.annotation.NonNull;

  2. Define variables to process and push video data

    In /app/java/com.example.<projectname>/MainActivity, add the following declarations to the MainActivity class:


    private TextureView previewTextureView;
    private SurfaceTexture previewSurfaceTexture;

    private boolean mTextureDestroyed = false;
    private boolean mPreviewing = false;

  3. Enable custom video track publishing

    When a user presses Join, you configure ChannelMediaOptions to enable publishing of the captured video from a custom source. You set the external video source, and set up a TextureView for the custom video preview. To do this:

    1. Add the following lines to the joinChannel(View view) method in the MainActivity class after ChannelMediaOptions options = new ChannelMediaOptions();:


      // Enable publishing of the captured video from a custom source
      options.publishCustomVideoTrack = true;
      // Configure the external video source
      agoraEngine.setExternalVideoSource(true, true, Constants.ExternalVideoSourceType.VIDEO_FRAME);
      // Check whether texture encoding is supported
      showMessage(agoraEngine.isTextureEncodeSupported() ? "Texture encoding is supported" :
              "Texture encoding is not supported");
      // Set up a preview TextureView for the custom video
      setupCustomLocalVideoPreview();
      // Show the preview TextureView
      previewTextureView.setVisibility(View.VISIBLE);

    2. In the joinChannel(View view) method, remove the following lines:


      setupLocalVideo();
      localSurfaceView.setVisibility(View.VISIBLE);

  4. Set up a TextureView for the custom video

    Create a new TextureView object, and add a SurfaceTextureListener to it. The listener triggers the onSurfaceTextureAvailable callback when a SurfaceTexture becomes available. You add the TextureView to the FrameLayout container to display it in the UI. To do this, add the following method to the MainActivity class:


    private void setupCustomLocalVideoPreview() {
        // Create TextureView
        previewTextureView = new TextureView(getBaseContext());
        // Add a SurfaceTextureListener
        previewTextureView.setSurfaceTextureListener(surfaceTextureListener);
        // Add the TextureView to the local video FrameLayout
        FrameLayout container = findViewById(R.id.local_video_view_container);
        container.addView(previewTextureView, 320, 240);
    }

  5. Define the SurfaceTextureListener

    When a SurfaceTexture becomes available, you create a previewSurfaceTexture and set its OnFrameAvailableListener. You then set up and configure your custom video source, point its output SurfaceTexture at previewSurfaceTexture, and start the preview; a hedged, camera-based sketch of these source-specific steps follows this procedure. To do this, add the following definition of surfaceTextureListener to the MainActivity class:


    private final TextureView.SurfaceTextureListener surfaceTextureListener = new TextureView.SurfaceTextureListener() {
        @Override
        public void onSurfaceTextureAvailable(@NonNull SurfaceTexture surface, int width, int height) {
            // Invoked when a TextureView's SurfaceTexture is ready for use.

            if (mPreviewing) {
                // Already previewing custom video
                return;
            }

            showMessage("Surface Texture Available");
            mTextureDestroyed = false;

            // Set up previewSurfaceTexture
            previewSurfaceTexture = new SurfaceTexture(true);
            previewSurfaceTexture.setOnFrameAvailableListener(onFrameAvailableListener);

            // Add code here to set up and configure the custom video source
            // Add code here to set the SurfaceTexture of the custom video source to previewSurfaceTexture

            // Start the preview
            mPreviewing = true;
        }

        @Override
        public void onSurfaceTextureSizeChanged(@NonNull SurfaceTexture surface, int width, int height) {
        }

        @Override
        public boolean onSurfaceTextureDestroyed(@NonNull SurfaceTexture surface) {
            mTextureDestroyed = true;
            return false;
        }

        @Override
        public void onSurfaceTextureUpdated(@NonNull SurfaceTexture surface) {
        }
    };

  6. Push the video frames

    The onFrameAvailableListener callback is triggered when a new video frame is available. In the callback, you convert the SurfaceTexture data to a Video SDK VideoFrame and push the frame to the channel; the sketch after this procedure shows one possible conversion. To do this, add the following OnFrameAvailableListener to the MainActivity class:


    private final SurfaceTexture.OnFrameAvailableListener onFrameAvailableListener = new SurfaceTexture.OnFrameAvailableListener() {
        @Override
        public void onFrameAvailable(SurfaceTexture surfaceTexture) {
            // Callback to notify that a new stream video frame is available.

            if (isJoined) {
                // Configure the external video frames and send them to the SDK
                VideoFrame videoFrame = null;

                // Add code here to convert the surfaceTexture data to a VideoFrame object

                // Send the VideoFrame to the SDK
                agoraEngine.pushExternalVideoFrame(videoFrame);
            }
        }
    };
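
The two "Add code here" placeholders above are source-specific, so the framework leaves them empty. For reference only, the following is a minimal sketch of one way to fill them in, assuming the legacy android.hardware.Camera API as the custom source and NV21 preview data for the frame conversion. The camera and latestNv21Frame fields and the startCustomVideoSource() and buildVideoFrame() helpers are hypothetical names, and the io.agora.base.NV21Buffer class and three-argument VideoFrame constructor are assumed to be available in your SDK version; adapt the conversion to whatever format your source actually delivers.

    // Hypothetical custom source: the device camera, delivering NV21 preview frames
    private android.hardware.Camera camera;
    private byte[] latestNv21Frame;

    // Call from onSurfaceTextureAvailable to set up the source and bind it to previewSurfaceTexture
    // Requires java.io.IOException (imported in the custom audio section below)
    private void startCustomVideoSource() {
        try {
            camera = android.hardware.Camera.open();
            // Route the camera preview to previewSurfaceTexture so onFrameAvailable fires
            camera.setPreviewTexture(previewSurfaceTexture);
            // Keep a copy of each NV21 preview frame for conversion to a VideoFrame
            camera.setPreviewCallback((data, cam) -> latestNv21Frame = data);
            camera.startPreview();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // Call from onFrameAvailable to wrap the latest NV21 data in a Video SDK VideoFrame
    private VideoFrame buildVideoFrame(int width, int height) {
        if (latestNv21Frame == null) return null;
        VideoFrame.Buffer buffer = new io.agora.base.NV21Buffer(latestNv21Frame, width, height, null);
        long timestampNs = System.nanoTime(); // capture timestamp in nanoseconds
        return new VideoFrame(buffer, 0 /* rotation in degrees */, timestampNs);
    }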

Implement a custom audio source

To push audio from a custom source to a channel, take the following steps:

  1. Import the required Android and Java libraries

    You use an InputStream to read the contents of the custom audio source. The app starts a separate thread to read and push the audio data, and uses android.os.Process to set the thread priority. To use these libraries in your app, add the following statements after the last import statement in /app/java/com.example.<projectname>/MainActivity.


    import android.os.Process;
    import java.io.InputStream;
    import java.io.IOException;

  2. Define variables to manage and push the audio stream

    In /app/java/com.example.<projectname>/MainActivity, add the following declarations to the MainActivity class. The sample parameters describe a 44.1 kHz, 16-bit, two-channel file; with 441 samples per push, each buffer holds 441 × 2 bytes × 2 channels = 1764 bytes and is pushed every 10 ms:


    // Audio file parameters
    private static final String AUDIO_FILE = "applause.wav"; // raw audio file
    private static final Integer SAMPLE_RATE = 44100;
    private static final Integer SAMPLE_NUM_OF_CHANNEL = 2;
    private static final Integer BITS_PER_SAMPLE = 16;
    private static final Integer SAMPLES = 441;
    private static final Integer BUFFER_SIZE = SAMPLES * BITS_PER_SAMPLE / 8 * SAMPLE_NUM_OF_CHANNEL;
    private static final Integer PUSH_INTERVAL = SAMPLES * 1000 / SAMPLE_RATE;

    private InputStream inputStream;
    private Thread pushingTask = new Thread(new PushingTask());
    private boolean pushing = false;

  3. Add a raw audio file to the project

    In this example, you use an audio file as the source of your custom audio data. To add the audio file to your Android project, create the folder app/src/main/assets and copy a sample audio file in *.wav or *.raw format into it. Update the value of the AUDIO_FILE variable to match the file name, and make sure the audio file parameters in your code match the file you placed in the assets folder. If you use a *.wav file, see the note on the file header after this procedure.

  4. Enable custom audio track publishing

    When a user presses Join, you set the ChannelMediaOptions to disable the microphone audio track and enable the custom audio track. You also enable custom audio local playback and set the external audio source. To do this, add the following lines to the joinChannel(View view) method in the MainActivity class after options.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER;:


    options.publishMicrophoneTrack = false; // Disable publishing microphone audio
    options.publishCustomAudioTrack = true; // Enable publishing custom audio
    options.enableAudioRecordingOrPlayout = true;

    agoraEngine.enableCustomAudioLocalPlayback(0, true);
    agoraEngine.setExternalAudioSource(true, SAMPLE_RATE, SAMPLE_NUM_OF_CHANNEL, 2, false, true);

  5. Open the audio file

    When the app starts, you open the audio file. To do this, add the following lines at the bottom of the onCreate method:


    try {
        inputStream = this.getResources().getAssets().open(AUDIO_FILE);
    } catch (IOException e) {
        e.printStackTrace();
    }

  6. Start the task to push audio frames

    When a user successfully joins a channel, you start the task that pushes audio frames. To do this, add the following lines at the bottom of the onJoinChannelSuccess callback in the MainActivity class:


    pushing = true;
    pushingTask.start();

  7. Read the input stream into a buffer

    You read data from the input stream into a buffer. To do this, add the following method to the MainActivity class:


    private byte[] readBuffer() {
        int byteSize = BUFFER_SIZE;
        byte[] buffer = new byte[byteSize];
        try {
            if (inputStream.read(buffer) < 0) {
                inputStream.reset();
                return readBuffer();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        return buffer;
    }

  8. Push the audio frames

    You push the data in the buffer as an audio frame on a separate thread. To do this, define the following Runnable class in the MainActivity class:


    class PushingTask implements Runnable {
        long number = 0;

        @Override
        public void run() {
            Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_AUDIO);
            while (pushing) {
                long before = System.currentTimeMillis();
                agoraEngine.pushExternalAudioFrame(readBuffer(), 0);
                long now = System.currentTimeMillis();
                long consuming = now - before;
                if (consuming < PUSH_INTERVAL) {
                    try {
                        Thread.sleep(PUSH_INTERVAL - consuming);
                    } catch (InterruptedException e) {
                    }
                }
            }
        }
    }

  9. Close the audio file

    When the app is closed, you close the audio file. To do this, add the following lines at the bottom of the onDestroy method:


    try {
        inputStream.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
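
Note that a standard PCM *.wav file begins with a 44-byte RIFF header, and the readBuffer() method above pushes the file contents as raw PCM, header included. For reference only, the sketch below skips the header when opening the file; the WAV_HEADER_SIZE constant and openAudioFile() helper are hypothetical names, and the 44-byte size assumes a canonical PCM header.

    private static final int WAV_HEADER_SIZE = 44; // canonical PCM .wav header length (assumption)

    // Hypothetical helper, equivalent to the onCreate snippet above but skipping the header
    private void openAudioFile() {
        try {
            inputStream = getResources().getAssets().open(AUDIO_FILE);
            inputStream.skip(WAV_HEADER_SIZE); // skip the RIFF/WAVE header so only PCM samples are pushed
        } catch (IOException e) {
            e.printStackTrace();
        }
    }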

Test your implementation

To ensure that you have implemented streaming from a custom source into your app:

  1. Generate a temporary token in Agora Console

  2. Add authentication data to the web demo

    In your browser, navigate to the Agora web demo and update App ID, Channel, and Token with the values for your temporary token, then click Join.

  3. In Android Studio, open app/java/com.example.<projectname>/MainActivity, and update appId, channelName, and token with the values for your temporary token.

  4. Connect a physical Android device to your development device.

  5. In Android Studio, click Run app. A moment later you see the project installed on your device.

    If this is the first time you run the project, grant microphone and camera access to your app.

  6. Test the custom video source

    Add code to the basic framework presented above to do the following (the sketch at the end of the custom video procedure shows one possible approach):

    1. In onSurfaceTextureAvailable, enable the video source and set its parameters.

    2. In onSurfaceTextureAvailable, set the SurfaceTexture of the custom video source to previewSurfaceTexture.

    3. In onFrameAvailable, convert the surfaceTexture data to a VideoFrame.

  7. Test the custom audio source

    Press Join. You hear the audio file streamed to the web demo app.

    To use this code for streaming data from your particular custom audio source, modify the readBuffer() method to read the audio data from your source, instead of a raw audio file.

Reference

This section contains content that complements the information on this page, or points you to documentation that explains other aspects of this product.