Call quality best practice

Customer satisfaction for your Video Calling integrated app depends on the quality of video and audio it provides. Quality of audiovisual communication through your app is affected by the following factors:

  • Bandwidth of network connection: Bandwidth is the volume of information that an Internet connection can handle per unit of time. When the available bandwidth is not sufficient to transmit the amount of data necessary to provide the desired video quality, your users see jerky or frozen video along with audio that cuts in and out.

  • Stability of network connection: Network connections are often unstable with the network quality going up and down. Users get temporarily disconnected and come back online after an interruption. These issues lead to a poor audiovisual experience for your users unless your app is configured to respond to these situations and take remedial actions.

  • Hardware quality: The camera and microphone used to capture video and audio must be of sufficiently good quality. If the user's hardware does not capture the audiovisual information in suitably high definition, it limits the quality of audio and video that is available to the remote user.

  • Video and audio settings: The sharpness, smoothness, and overall quality of the video is directly linked to the frame rate, bitrate and other video settings. Similarly, the audio quality depends on the sample rate, bitrate, number of channels and other audio parameters. If you do not choose proper settings, the audio and video transmitted are of poor quality. On the other hand, if the settings are too demanding, the available bandwidth quickly gets choked, leading to suboptimal experience for your users.

  • Echo: Echo is produced when your audio signal is played by a remote user through a speakerphone or an external device. This audio is captured by the remote user's microphone and sent back to you. Echo negatively affects audio quality, making speech difficult to understand.

  • Multiple users in a channel: When multiple users engage in real-time audio and video communication in a channel, the available bandwidth is quickly used up due to several incoming audio and video streams. The device performance also deteriorates due to the excessive workload required to decode and render multiple video streams.

This page shows you how to use Video SDK features to account for these factors and ensure optimal audio and video quality in your app.

Understand the tech

Video SDK provides the following features to deal with channel quality issues:

  • Network probe test: The network probe test checks the last-mile network quality before you join a channel. The method returns network quality statistics including round-trip latency, packet loss rate, and network bandwidth.

  • Echo test: The echo test captures audio through the microphone on the user’s device, and sends it to Agora SD-RTN™. After a delay of about 2 seconds, Agora SD-RTN™ sends the audio back to the sending device to be played. The returned audio enables a user to judge whether their hardware and network connection are of adequate quality. Agora recommends that an echo test be performed before a network probe test.

  • Audio profiles: Delivering the best quality audio to your users requires choosing audio settings customized for your particular application. In Video SDK you can choose from pre-configured audio profiles and audio scenarios to optimize audio settings for a wide range of applications.

    • An audio profile sets the audio sample rate, bitrate, encoding scheme, and the number of channels for your audio. Video SDK offers several preset audio profiles to choose from. To pick the most suitable audio profile for your application, refer to the List of audio profiles.
    • An audio scenario specifies the audio performance in terms of volume, audio quality, and echo cancellation. Based on the nature of your application, you can pick the most suitable option from the List of audio scenarios.
  • Video profiles: In real-time engagement scenarios, user experience is closely tied to the sharpness, smoothness, and overall quality of the video. In Video SDK you can set the video dimensions, framerate, bitrate, orientation mode, and mirror mode by specifying a video profile. You can also set the degradation preference to specify how video quality is degraded under suboptimal network conditions. To find the suitable bitrate for a given combination of dimensions and framerate, refer to the Video profile table.

  • In-call quality statistics: Video SDK provides several callbacks and methods to monitor channel quality in real time. These methods and callbacks supply vital statistics to evaluate communication quality and the information necessary to take remedial actions. Video SDK provides you the following statistics:

    • Network quality: The uplink and downlink network quality in terms of the transmission bitrate, packet loss rate, average Round-Trip Time, and jitter in your network.

    • Call quality: Information on the current user session and the resources being used by the channel in terms of the number of users in a channel, packet loss rate, CPU usage and call duration. Use these statistics to troubleshoot call quality issues.

    • Local audio quality: Local audio measurements such as audio channels, sample rate, sending bitrate, and packet loss rate in the audio stream.

    • Remote audio quality: These statistics provide information such as the number of channels, received bitrate, jitter in the audio stream, audio loss rate, and packet loss rate.

    • Local video quality: Local video quality statistics such as packet loss rate, frame rate, encoded frame width, and sent bitrate.

    • Remote video quality: These statistics include information about the width and height of video frames, packet loss rate, receiving stream type, and bitrate in the reported interval.

    • Video and Audio states: Agora SD-RTN™ reports the new state, and the reason for state change, whenever the state of an audio or video stream changes.

  • Dual stream mode: In dual-stream mode, Video SDK transmits a high-quality and a low-quality video stream from the sender. The high-quality stream has higher resolution and bitrate than the low-quality video stream. Remote users subscribe to the low-quality stream to improve communication continuity, as it reduces bandwidth consumption. Subscribers should also choose the low-quality video stream when network conditions are unreliable, or when multiple users publish streams in a channel.

  • Video stream fallback: When network conditions deteriorate, Video SDK automatically switches the video stream from high-quality to low-quality, or disables video to ensure audio delivery. Agora SD-RTN™ continues to monitor the network quality after fallback, and restores the video stream when network conditions allow it. To improve communication quality under extremely poor network conditions, implement a fallback option in your app.
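    The other features on this page are implemented step by step below, but fallback is not; as a sketch, it could be wired up right after engine initialization using the SDK's publish and subscribe fallback options. This snippet assumes an initialized `agoraEngine` instance, as used throughout this page; verify the exact constant names against the current API reference:

    ```java
    import io.agora.rtc2.Constants;

    // Sketch: configure stream fallback on an initialized RtcEngine instance.
    // If the uplink deteriorates, publish audio only instead of audio and video.
    agoraEngine.setLocalPublishFallbackOption(Constants.STREAM_FALLBACK_OPTION_AUDIO_ONLY);
    // If the downlink deteriorates, first fall back to the low-quality video
    // stream, then receive audio only under severe congestion.
    agoraEngine.setRemoteSubscribeFallbackOption(Constants.STREAM_FALLBACK_OPTION_AUDIO_ONLY);
    ```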

  • Video for multiple users: When multiple users join a channel, several incoming high-quality video streams negatively impact network and device performance. In such cases, you can manage the excess load by playing high-quality video from the user who has focus, and low-quality streams from all other users. To implement this feature, all users in the channel must enable dual-stream mode.

  • Echo cancellation when playing audio files: Video SDK offers audio mixing functionality to play media in a channel. You can mix a local or online audio file with the audio captured through the microphone, or completely replace the microphone audio. Audio mixing takes advantage of the echo cancellation features of Video SDK to reduce echo in a channel. Refer to Audio and voice effects to learn more about audio mixing in Video SDK.

  • Connection state monitoring: The connection state between an app and Agora SD-RTN™ changes when the app joins or leaves a channel, or goes offline due to network or authentication issues. Video SDK provides connection state monitoring to detect when and why a network connection is interrupted. When the connection state changes, Agora SD-RTN™ sends a callback to notify the app. Video SDK then automatically tries to reconnect to the server to restore the connection.

  • Log files: Video SDK provides configuration options that you use to customize the location, content and size of log files containing key data of Video SDK operation. When you set up logging, Video SDK writes information messages, warnings, and errors regarding activities such as initialization, configuration, connection and disconnection to log files. Log files are useful in detecting and resolving channel quality issues.

The following figure shows the workflow you need to implement to ensure channel quality in your app:

Ensure Channel Quality

Prerequisites

In order to follow this procedure you must have:

  • Android Studio 4.1 or higher.
  • Android SDK API Level 24 or higher.
  • A mobile device that runs Android 4.1 or higher.
  • An Agora account and project.

  • A computer with Internet access.

    Ensure that no firewall is blocking your network communication.

Project setup

To create the environment necessary to implement call quality best practices into your app, open the SDK quickstart Video Calling project you created previously.

Implement best practice to optimize call quality

This section shows you how to integrate call quality optimization features of Video SDK into your app, step by step.

Implement the user interface

This section guides you through the necessary UI changes in the SDK quickstart project interface to implement call quality features.

Add a network status indicator to the user interface

To enable app users to see the network status and start an echo test, add TextView and Button elements to the user interface. To do this, open /app/res/layout/activity_main.xml and add the following lines before </RelativeLayout>:


<TextView
    android:id="@+id/networkStatus"
    android:layout_width="20dp"
    android:layout_height="20dp"
    android:layout_below="@id/LeaveButton"
    android:layout_alignRight="@id/LeaveButton"
    />

<TextView
    android:id="@+id/networkLabel"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_below="@id/LeaveButton"
    android:layout_toLeftOf="@id/networkStatus"
    android:text="Network Status: "
    />

<Button
    android:id="@+id/echoTestButton"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_below="@+id/LeaveButton"
    android:layout_alignStart="@id/JoinButton"
    android:onClick="echoTest"
    android:text="Start Echo Test" />

Handle the system logic

  1. Import the required Android and Agora libraries

    To access and use the TextView object and integrate Video SDK channel quality libraries, add the following statements after the last import statement in /app/java/com.example.<projectname>/MainActivity.


    import android.widget.TextView;
    import android.widget.Button;
    import android.graphics.Color;

    import io.agora.rtc2.EchoTestConfiguration;
    import io.agora.rtc2.internal.LastmileProbeConfig;
    import io.agora.rtc2.video.VideoEncoderConfiguration;

  2. Define variables to manage test state and workflow

    In /app/java/com.example.<projectname>/MainActivity, add the following declaration to class MainActivity:


    private TextView networkStatus; // For updating the network status
    private int counter1 = 0; // Controls the frequency of messages
    private int counter2 = 0; // Controls the frequency of messages
    private int remoteUid; // Uid of the remote user
    private boolean highQuality = true; // Quality of the remote video stream being played
    private boolean isEchoTestRunning = false; // Keeps track of the echo test
    private Button echoTestButton;

  3. Update the network status indication

    To show the network quality result visually to the user, add the following to the MainActivity class:


    private void updateNetworkStatus(int quality) {
        if (quality > 0 && quality < 3) networkStatus.setBackgroundColor(Color.GREEN);
        else if (quality <= 4) networkStatus.setBackgroundColor(Color.YELLOW);
        else if (quality <= 6) networkStatus.setBackgroundColor(Color.RED);
        else networkStatus.setBackgroundColor(Color.WHITE);
    }

    To set up access to the elements, add the following lines to the onCreate method after setupVideoSDKEngine();


    networkStatus = findViewById(R.id.networkStatus);
    echoTestButton = findViewById(R.id.echoTestButton);

Implement features to ensure quality

To implement the call quality features, take the following steps:

  1. Enable the user to test the network

    In the MainActivity class, add the following method:


    public void startProbeTest() {
        // Configure a LastmileProbeConfig instance.
        LastmileProbeConfig config = new LastmileProbeConfig();
        // Probe the uplink network quality.
        config.probeUplink = true;
        // Probe the downlink network quality.
        config.probeDownlink = true;
        // The expected uplink bitrate (bps). The value range is [100000,5000000].
        config.expectedUplinkBitrate = 100000;
        // The expected downlink bitrate (bps). The value range is [100000,5000000].
        config.expectedDownlinkBitrate = 100000;

        agoraEngine.startLastmileProbeTest(config);
        showMessage("Running the last mile probe test ...");
    }

  2. Implement best practice for app initiation

    When a user starts your app, the Agora Engine is initialized in setupVideoSDKEngine. After initialization, do the following:

    • Enable dual stream mode: Required for multi-user scenarios.
    • Set an audio profile and audio scenario: Setting an audio profile is optional and only required if you have special requirements such as streaming music.
    • Set the video profile: Setting a video profile is also optional. It is useful when you want to change one or more of mirrorMode, frameRate, bitrate, dimensions, orientationMode, degradationPrefer or compressionPrefer from the default setting to custom values. For more information, see video profile table.
    • Start the network probe test: A quick test at startup to gauge network quality.

    To implement these features, add the following code to setupVideoSDKEngine after agoraEngine.enableVideo();


    // Enable the dual stream mode
    agoraEngine.enableDualStreamMode(true);

    // Set audio profile and audio scenario.
    agoraEngine.setAudioProfile(Constants.AUDIO_PROFILE_DEFAULT, Constants.AUDIO_SCENARIO_GAME_STREAMING);

    // Set the video profile
    VideoEncoderConfiguration videoConfig = new VideoEncoderConfiguration();
    // Set mirror mode
    videoConfig.mirrorMode = VideoEncoderConfiguration.MIRROR_MODE_TYPE.MIRROR_MODE_AUTO;
    // Set framerate
    videoConfig.frameRate = VideoEncoderConfiguration.FRAME_RATE.FRAME_RATE_FPS_10.getValue();
    // Set bitrate
    videoConfig.bitrate = VideoEncoderConfiguration.STANDARD_BITRATE;
    // Set dimensions
    videoConfig.dimensions = VideoEncoderConfiguration.VD_640x360;
    // Set orientation mode
    videoConfig.orientationMode = VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_ADAPTIVE;
    // Set degradation preference
    videoConfig.degradationPrefer = VideoEncoderConfiguration.DEGRADATION_PREFERENCE.MAINTAIN_BALANCED;
    // Set compression preference: low latency or quality
    videoConfig.compressionPrefer = VideoEncoderConfiguration.COMPRESSION_PREFERENCE.PREFER_LOW_LATENCY;
    // Apply the configuration
    agoraEngine.setVideoEncoderConfiguration(videoConfig);

    // Start the probe test
    startProbeTest();

  3. Test the user's hardware

    The echo test checks that the user's hardware is working properly. To start and stop the test, add the following method to the MainActivity class:


    public void echoTest(View view) {
        if (!isEchoTestRunning) {
            EchoTestConfiguration echoConfig = new EchoTestConfiguration();
            echoConfig.enableAudio = true;
            echoConfig.enableVideo = true;
            echoConfig.token = token;
            echoConfig.channelId = channelName;

            setupLocalVideo();
            echoConfig.view = localSurfaceView;
            localSurfaceView.setVisibility(View.VISIBLE);
            agoraEngine.startEchoTest(echoConfig);
            echoTestButton.setText("Stop Echo Test");
            isEchoTestRunning = true;
        } else {
            agoraEngine.stopEchoTest();
            echoTestButton.setText("Start Echo Test");
            isEchoTestRunning = false;
            setupLocalVideo();
            localSurfaceView.setVisibility(View.GONE);
        }
    }

  4. Listen to Agora Engine events to receive state change notifications and quality statistics

    Add the following event handlers to receive state change notifications and quality statistics:

    • onConnectionStateChanged: Receives notification of connection state changes, along with the reason for the change.
    • onLastmileQuality: Receives the network quality result.
    • onLastmileProbeResult: Receives detailed probe test results.
    • onNetworkQuality: Receives statistics on network quality.
    • onRtcStats: Receives the Agora Engine stats.
    • onRemoteVideoStateChanged: Receives notification regarding any change in the state of the remote video.
    • onRemoteVideoStats: Receives stats about the remote videos.

    In the MainActivity class, add the following methods after private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {:


    @Override
    public void onConnectionStateChanged(int state, int reason) {
        showMessage("Connection state changed"
                + "\n New state: " + state
                + "\n Reason: " + reason);
    }

    @Override
    public void onLastmileQuality(int quality) {
        runOnUiThread(() -> updateNetworkStatus(quality));
    }

    @Override
    public void onLastmileProbeResult(LastmileProbeResult result) {
        agoraEngine.stopLastmileProbeTest();
        // The result object contains the detailed test results that help you
        // manage call quality, for example, the downlink jitter.
        showMessage("Downlink jitter: " + result.downlinkReport.jitter);
    }

    @Override
    public void onNetworkQuality(int uid, int txQuality, int rxQuality) {
        // Use downlink network quality to update the network status
        runOnUiThread(() -> updateNetworkStatus(rxQuality));
    }

    @Override
    public void onRtcStats(RtcStats rtcStats) {
        counter1 += 1;
        String msg = "";

        if (counter1 == 5)
            msg = rtcStats.users + " user(s)";
        else if (counter1 == 10) {
            msg = "Packet loss rate: " + rtcStats.rxPacketLossRate;
            counter1 = 0;
        }

        if (msg.length() > 0) showMessage(msg);
    }

    @Override
    public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) {
        String msg = "Remote video state changed: \n Uid =" + uid
                + " \n NewState =" + state
                + " \n reason =" + reason
                + " \n elapsed =" + elapsed;

        showMessage(msg);
    }

    @Override
    public void onRemoteVideoStats(RemoteVideoStats stats) {
        counter2 += 1;

        if (counter2 == 5) {
            String msg = "Remote Video Stats: "
                    + "\n User id =" + stats.uid
                    + "\n Received bitrate =" + stats.receivedBitrate
                    + "\n Total frozen time =" + stats.totalFrozenTime;
            counter2 = 0;
            showMessage(msg);
        }
    }

    Each event reports the statistics of the audio and video streams from each remote user and host.

  5. Switch stream quality when the user taps the remote video

    To take advantage of dual-stream mode and switch remote video quality to high or low, add the following to the MainActivity class:


    public void setStreamQuality(View view) {
        highQuality = !highQuality;

        if (highQuality) {
            agoraEngine.setRemoteVideoStreamType(remoteUid, Constants.VIDEO_STREAM_HIGH);
            showMessage("Switching to high-quality video");
        } else {
            agoraEngine.setRemoteVideoStreamType(remoteUid, Constants.VIDEO_STREAM_LOW);
            showMessage("Switching to low-quality video");
        }
    }

    To fire this method when the user taps the remote view panel, add the following line after <FrameLayout android:id="@+id/remote_video_view_container" in activity_main.xml:


    android:onClick="setStreamQuality"

    To obtain the uid of the remote user, add the following line to the onUserJoined method:


    remoteUid = uid;

  6. Configure the Video SDK log file

    To customize the location, content and size of log files, add the following code to setupVideoSDKEngine before agoraEngine = RtcEngine.create(config);:


    // Configure the log file
    RtcEngineConfig.LogConfig logConfig = new RtcEngineConfig.LogConfig();
    logConfig.filePath = "/storage/emulated/0/Android/data/<package name>/files/agorasdk1.log";
    logConfig.fileSizeInKB = 256; // Range 128-1024 KB
    logConfig.level = Constants.LogLevel.getValue(Constants.LogLevel.LOG_LEVEL_WARN);
    config.mLogConfig = logConfig;

    Make sure you replace the <package name> in filePath with the name of your package.

    If you want to upload the log file automatically to a CDN, call setLocalAccessPoint(LocalAccessPointConfiguration config) to specify the local access point and assign the native access module to the SDK.

Test your implementation

To ensure that you have implemented call quality features into your app:

  1. Generate a temporary token in Agora Console.

  2. In your browser, navigate to the Agora dual stream web demo and update App ID, Channel, and Token with the values for your temporary token, then click Join.

  3. In Android Studio, open app/java/com.example.<projectname>/MainActivity, and update appId, channelName, and token with the values for your temporary token.

  4. Connect a physical Android device to your development device.

  5. In Android Studio, click Run app. A moment later you see the project installed on your device.

    If this is the first time you run the project, grant microphone and camera access to your app.

  6. When the app starts, it does the following:

    • Sets the log file location, size, and logging level according to your preference.
    • Enables the dual-stream mode.
    • Sets the audio profile.
    • Sets the video profile.
    • Starts a network probe test.

    You see the result of the network probe test displayed in the network status icon.

  7. Run the echo test.

    1. Press Start Echo Test. You see the local camera feed on the test device screen.

    2. Speak into the device microphone. You hear the recorded audio after a short delay.

      This test confirms that the user's hardware is working properly.

    3. Press Stop Echo Test to end the test before joining a channel.

  8. Press Join to connect to the same channel as your web demo.
  9. After joining a channel, you receive toast messages informing you of some selected call statistics, including:

    • The number of users in the channel
    • Packet loss rate
    • Remote video stats
    • Remote video state changes
  10. You see the network status indicator updated periodically based on the result of the onNetworkQuality callback.

  11. Tap the remote video panel. You see the remote video switch from high-quality to low-quality. Tap the remote video again to switch back to high-quality video.

Reference

This section contains information that completes the information in this page, or points you to documentation that explains other aspects of this product.

The recommended video settings vary by scenario. For example, in a one-to-one online class, the video windows of the teacher and student are both large, which calls for higher resolutions, frame rate, and bitrate. However, in a one-to-many online class, the video windows are smaller. You can set lower resolution, frame rate, and bitrate to accommodate bandwidth limitations. The recommended settings for these different scenarios are:

  • One-to-one video call:

    • Resolution: 320 x 240; Frame rate: 15 fps; Bitrate: 200 Kbps.
    • Resolution: 640 x 360; Frame rate: 15 fps; Bitrate: 400 Kbps.
  • One-to-many video call:

    • Resolution: 160 x 120; Frame rate: 15 fps; Bitrate: 65 Kbps.
    • Resolution: 320 x 180; Frame rate: 15 fps; Bitrate: 140 Kbps.
    • Resolution: 320 x 240; Frame rate: 15 fps; Bitrate: 200 Kbps.
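As a rough illustration, the recommendations above can be encoded as a small helper. The class name, method name, and participant-count thresholds below are hypothetical; adapt them to your own layout:

```java
// Hypothetical helper encoding the recommendations above. The participant
// thresholds are illustrative, not part of the Video SDK API.
public class RecommendedVideoSettings {
    /** Returns {width, height, fps, bitrateKbps} for the given user count. */
    public static int[] recommend(int participants) {
        if (participants <= 2) {
            // One-to-one call: both video windows are large.
            return new int[]{640, 360, 15, 400};
        }
        if (participants <= 5) {
            // Small group: medium-sized windows.
            return new int[]{320, 240, 15, 200};
        }
        // One-to-many: small windows, conserve bandwidth.
        return new int[]{160, 120, 15, 65};
    }
}
```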

Video profile table

Video SDK provides a selection of video dimensions, framerate, and bitrate to choose from. You can also customize the values according to the table below.

Video Profile | Resolution (Width × Height) | Frame rate (fps) | Bitrate (Kbps)
--- | --- | --- | ---
120p | 160 × 120 | 15 | 65
120p_1 | 160 × 120 | 15 | 65
120p_3 | 120 × 120 | 15 | 50
180p | 320 × 180 | 15 | 140
180p_1 | 320 × 180 | 15 | 140
180p_3 | 180 × 180 | 15 | 100
180p_4 | 240 × 180 | 15 | 120
240p | 320 × 240 | 15 | 200
240p_1 | 320 × 240 | 15 | 200
240p_3 | 240 × 240 | 15 | 140
240p_4 | 424 × 240 | 15 | 220
360p | 640 × 360 | 15 | 400
360p_1 | 640 × 360 | 15 | 400
360p_3 | 360 × 360 | 15 | 260
360p_4 | 640 × 360 | 30 | 600
360p_6 | 360 × 360 | 30 | 400
360p_7 | 480 × 360 | 15 | 320
360p_8 | 480 × 360 | 30 | 490
360p_9 | 640 × 360 | 15 | 800
360p_10 | 640 × 360 | 24 | 800
360p_11 | 640 × 360 | 24 | 1000
480p | 640 × 480 | 15 | 500
480p_1 | 640 × 480 | 15 | 500
480p_2 | 640 × 480 | 30 | 1000
480p_3 | 480 × 480 | 15 | 400
480p_4 | 640 × 480 | 30 | 750
480p_6 | 480 × 480 | 30 | 600
480p_8 | 848 × 480 | 15 | 610
480p_9 | 848 × 480 | 30 | 930
480p_10 | 640 × 480 | 10 | 400
540p (Default) | 960 × 540 | 15 | 1100
720p | 1280 × 720 | 15 | 1130
720p_1 | 1280 × 720 | 15 | 1130
720p_2 | 1280 × 720 | 30 | 2000
720p_3 | 1280 × 720 | 30 | 1710
720p_5 | 960 × 720 | 15 | 910
720p_6 | 960 × 720 | 30 | 1380
1080p | 1920 × 1080 | 15 | 2080
1080p_1 | 1920 × 1080 | 15 | 2080
1080p_2 | 1920 × 1080 | 30 | 3000
1080p_3 | 1920 × 1080 | 30 | 3150
1080p_5 | 1920 × 1080 | 60 | 4780

For more details, see VideoEncoderConfiguration.

Mainstream video profiles

You can also refer to the following tables to learn the default resolution, frame rate, and bitrate of the low-quality video stream for different mainstream video profiles of the high-quality video stream.

High-quality stream video profile: Communication | Default low-quality stream video profile: Communication
--- | ---
320 × 240, 15, 200 | 144 × 108, 5, 20
640 × 360, 15, 400 | 288 × 162, 5, 40
640 × 480, 15, 500 | 288 × 216, 5, 50
1280 × 720, 15, 1130 | 288 × 162, 5, 113
240 × 320, 15, 200 | 108 × 144, 5, 20
360 × 640, 15, 400 | 164 × 288, 5, 40
480 × 640, 15, 500 | 216 × 288, 5, 50
720 × 1280, 15, 1130 | 164 × 288, 5, 113

High-quality stream video profile: Live-broadcast | Default low-quality stream video profile: Live-broadcast
--- | ---
320 × 240, 15, 350 | 160 × 120, 5, 45
640 × 360, 15, 650 | 192 × 108, 5, 50
640 × 480, 15, 800 | 160 × 120, 5, 45
1280 × 720, 15, 1600 | 192 × 108, 5, 50
240 × 320, 15, 350 | 120 × 160, 5, 45
360 × 640, 15, 650 | 108 × 192, 5, 50
480 × 640, 15, 800 | 120 × 160, 5, 45
720 × 1280, 15, 1600 | 108 × 192, 5, 50

This section provides the recommended video resolution, frame rate, and bitrate for high-quality and low-quality streams.

Channel profile | Video stream type | Device system | Recommended video profile
--- | --- | --- | ---
Communication | high-quality stream | macOS, Windows | 640 × 480, 15, 500
Communication | high-quality stream | Android, iOS | 640 × 360, 15, 400
Communication | low-quality stream | macOS, Windows | 320 × 180, 7, 75
Communication | low-quality stream | Android, iOS | 160 × 90, 7, 45
Live-broadcast | high-quality stream | macOS, Windows | 640 × 480, 15, 800
Live-broadcast | high-quality stream | Android, iOS | 640 × 360, 15, 650
Live-broadcast | low-quality stream | macOS, Windows | 320 × 180, 7, 126
Live-broadcast | low-quality stream | Android, iOS | 160 × 90, 7, 64

In practice, different user devices, user network conditions, application service locations, and user requirements affect which kinds of video profiles you use. Therefore, if the recommended video profiles are not suitable for you, contact technical support for assistance.

Mirror mode

By default, Video SDK does not mirror the video during encoding. You can use the mirrorMode parameter to decide whether to mirror the video that remote users see.

Connection states

When the connection state changes, Agora sends the onConnectionStateChanged callback. The following diagram illustrates the various states and how the states change as a client app joins and leaves a channel:

When the network connection is interrupted, the SDK automatically tries to reconnect to the server. The following diagram shows the callbacks received by the local user (UID1) and the remote user (UID2) when the local user joins the channel, gets a network exception, loses connection, and rejoins the channel.

As shown in the above diagram:

  • T0: The SDK receives the joinChannel request from UID1.
  • T1: 200 ms after calling joinChannel, UID1 joins the channel. In the process, UID1 also receives the onConnectionStateChanged(CONNECTING, CONNECTING) callback. When successfully joining the channel, UID1 receives the onConnectionStateChanged(CONNECTED, JOIN_SUCCESS) and onJoinChannelSuccess callbacks.
  • T2: 100 ms after UID1 joins the channel, UID2 receives the onUserJoined callback.
  • T3: The uplink network condition of UID1 deteriorates. The SDK automatically tries rejoining the channel.
  • T4: If UID1 fails to receive any data from the server in four seconds, UID1 receives onConnectionStateChanged(RECONNECTING, INTERRUPTED); meanwhile the SDK continues to try rejoining the channel.
  • T5: If UID1 fails to receive any data from the server in ten seconds, UID1 receives onConnectionLost; meanwhile the SDK continues to try rejoining the channel.
  • T6: If UID2 fails to receive any data from UID1 in 20 seconds, the SDK decides that UID1 is offline. UID2 receives onUserOffline.
  • T7: If UID1 fails to rejoin the channel in 20 minutes, the SDK stops trying to rejoin the channel. UID1 receives onConnectionStateChanged(FAILED, JOIN_FAILED).
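The timeout thresholds in this timeline can be summarized in a small, hypothetical helper that maps the time since the local user last received server data to the expected event:

```java
// Hypothetical summary of the reconnection timeline above. Not a Video SDK
// API; it only models the documented 4 s / 10 s / 20 min thresholds.
public class ReconnectTimeline {
    public static String expectedEvent(long secondsWithoutData) {
        if (secondsWithoutData < 4) {
            return "CONNECTED"; // still considered connected
        }
        if (secondsWithoutData < 10) {
            // onConnectionStateChanged(RECONNECTING, INTERRUPTED)
            return "RECONNECTING_INTERRUPTED";
        }
        if (secondsWithoutData < 20 * 60) {
            return "CONNECTION_LOST"; // onConnectionLost; SDK keeps retrying
        }
        // onConnectionStateChanged(FAILED, JOIN_FAILED); SDK stops retrying
        return "FAILED_JOIN_FAILED";
    }
}
```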

For more detailed information about connection states and reasons, see IRtcEngineEventHandler.onConnectionStateChanged.

List of audio profiles

Video SDK provides the following audio profile options:

List of audio scenarios

Video SDK provides the following audio scenarios to choose from:

Audio Scenario | Purpose
--- | ---
Default | Basic communication.
Chatroom Entertainment | Entertainment scenario where users need to frequently switch the user role.
Education | Education scenario where users want smoothness and stability.
Game Streaming | High-quality audio chatroom scenario where hosts mainly play music.
Showroom | Showroom scenario where a single host wants high-quality audio.
Chatroom Gaming | Gaming scenario for group chat that only contains human voice.
IoT | Internet of Things scenario for devices that require low power consumption.
Meeting | Meeting scenario that mainly contains human voice.

Profile and scenario parameter settings for some typical applications

Application | Profile | Scenario | Features
--- | --- | --- | ---
One-to-one classroom | Default | Default | Prioritizes the call quality with smooth transmission and high-fidelity audio.
Battle Royale Game | Speech Standard | Chatroom Gaming | Noise reduction. Transmits voice only. Reduces the transmission rate. Suitable for multiplayer games.
Murder Mystery Game | Music Standard | Chatroom Entertainment | High-fidelity audio encoding and decoding. No volume or audio quality change when you mute/unmute the microphone.
KTV | Music High-quality | Game Streaming | High-fidelity audio and effects. Adapts to the high-fidelity audio application.
Podcast | Music High-quality Stereo | Showroom | High-fidelity audio and stereo panning. Support for professional audio hardware.
Music education | Music Standard Stereo | Game Streaming | Prioritizes audio quality. Suitable for transmitting live external audio effects.
Collaborative teaching | Music Standard Stereo | Chatroom Entertainment | High-fidelity audio and effects. No volume or audio quality change when you mute/unmute the microphone.
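As a sketch, the pairings in the table above can be captured in a simple lookup. The class and method names, and the use of plain strings rather than SDK constants, are illustrative only:

```java
import java.util.Map;

// Illustrative lookup of the profile/scenario pairings in the table above.
// Keys and values are plain strings, not Video SDK constants.
public class AudioPreset {
    private static final Map<String, String[]> PRESETS = Map.of(
        "one-to-one classroom",   new String[]{"Default", "Default"},
        "battle royale game",     new String[]{"Speech Standard", "Chatroom Gaming"},
        "murder mystery game",    new String[]{"Music Standard", "Chatroom Entertainment"},
        "ktv",                    new String[]{"Music High-quality", "Game Streaming"},
        "podcast",                new String[]{"Music High-quality Stereo", "Showroom"},
        "music education",        new String[]{"Music Standard Stereo", "Game Streaming"},
        "collaborative teaching", new String[]{"Music Standard Stereo", "Chatroom Entertainment"}
    );

    /** Returns {profile, scenario}, falling back to the defaults. */
    public static String[] presetFor(String application) {
        return PRESETS.getOrDefault(application.toLowerCase(),
                new String[]{"Default", "Default"});
    }
}
```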

How video is oriented on the playing device

The way video is displayed on the playing device depends on orientationMode used on the encoding device, orientation of the capturing device, orientation of the playing device, and whether screen rotation is enabled on the playing device. The following images show how the video is finally oriented based on these factors.

Orientation mode: Adaptive

  • Screen rotation: Disabled; Capturing device orientation: Landscape

  • Screen rotation: Disabled; Capturing device orientation: Portrait

  • Screen rotation: Enabled; Capturing device orientation: Landscape

  • Screen rotation: Enabled; Capturing device orientation: Portrait

Orientation mode: Landscape

  • Capturing device orientation: Landscape

  • Capturing device orientation: Portrait

Orientation mode: Portrait

  • Capturing device orientation: Portrait

  • Capturing device orientation: Landscape

API reference

Video Calling