The “Zoom Symbl” Integration describes how to ingest Zoom Cloud Recordings and send them to Symbl for processing. The aim is a low-effort, end-to-end integration of Symbl’s Intelligence with Zoom’s recordings, producing meaningful insights such as Topics, Questions, Action Items, and Follow-ups.

The current system architecture is based on serverless components, which makes it easy to build and scale in a cloud deployment.

The components described in this document are serverless and are automated using the AWS Serverless Application Model (SAM), an open-source framework for building serverless applications. The SAM template in this document provides a mechanism to build and deploy the solution with ease, as it follows the Infrastructure as Code approach.

High-Level Architecture

The following diagram shows the high-level architecture of the application runtime described in this document.

  1. When the meeting concludes and Zoom completes the recording process, a notification event is sent to the AWS-hosted API Gateway front-ending the application. The host is the user who set up the Zoom meeting and enabled cloud recording.
  2. The AWS-hosted API Gateway initiates a lambda request, which makes an asynchronous call to the “Producer” lambda.
  3. The AWS API Gateway lambda responds to the HTTP request back to Zoom.
  4. The “Producer” lambda uses FFmpeg to merge the per-participant audio recordings and uploads the multichannel file to AWS S3 for further processing. Specifically, it:
    1. Persists the per-participant audio into the Zoom Input AWS S3 bucket.
    2. Saves the Zoom recording metadata into AWS DynamoDB.
    3. Builds a single multichannel audio file using the FFmpeg lambda layer.
    4. Saves the multichannel audio file into the Zoom Output AWS S3 bucket, which enables Symbl to process speaker-separated audio via a single multichannel file.
    5. Initiates an AWS SQS message for further processing.
  5. The “Consumer” lambda executes the following steps:
    1. Reads messages from the AWS SQS queue and performs the concurrency checks.
    2. Reads the audio recording files from the AWS S3 bucket.
    3. Reads the Zoom recording metadata from AWS DynamoDB.
    4. Submits the audio recordings to Symbl using the Async Audio API, along with the metadata read from AWS DynamoDB.
  6. As Symbl processes the Async Audio job request, it notifies the AWS API Gateway Webhook URL on every status change. The following values are sent to the Webhook URL as the job progresses: scheduled, in_progress, completed, or failed. For more information, refer to the Symbl Webhook documentation.
  7. The AWS API Gateway executes the following steps:
    1. Invokes the “Notifier” lambda for further processing.
    2. Responds to the HTTP request back to Symbl.
  8. The “Notifier” lambda executes the following steps:
    1. Updates the job status in AWS DynamoDB, along with the concurrent request count for concurrency handling.
    2. If the job status is reported as “completed”, makes a request to the Experience Video Summary UI API and a request to the Conversation Summary API.
    3. Sends an email to the meeting host with the results of the Conversation Summary API and the Experience Video Summary UI URL. You can see an example of the Experience Video Summary UI here.
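The FFmpeg merge step described above can be sketched as follows. This is a hypothetical helper, not the integration’s actual code: it only builds the ffmpeg command line for merging one mono file per participant into a single multichannel file using the amerge filter; the file names are illustrative.

```python
def build_merge_command(participant_files, output_path):
    """Return the ffmpeg argv that merges per-participant audio
    into one multichannel file via the amerge filter."""
    cmd = ["ffmpeg", "-y"]
    for path in participant_files:
        cmd += ["-i", path]
    n = len(participant_files)
    # Build a filter graph like "[0:a][1:a]amerge=inputs=2[a]"
    pads = "".join(f"[{i}:a]" for i in range(n))
    cmd += ["-filter_complex", f"{pads}amerge=inputs={n}[a]",
            "-map", "[a]", output_path]
    return cmd

# In a lambda, this argv would be passed to subprocess.run() with the
# ffmpeg binary provided by the FFmpeg lambda layer.
```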


Step 1: Create a Custom Zoom App

Now you will create a custom Zoom app. Complete the instructions below in order.

This step is mandatory: it sets up the custom app webhook endpoint that handles the Zoom recording completed event. The ‘Recording Completed‘ event contains information on the speaker-separated audio per participant. Follow the steps below to create your custom Zoom app. Note – This integration uses the JWT authorization mechanism.


  1. Register Your App
  2. App Information
  3. Generate App Credentials
  4. Set App Features
  5. Activation
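Once the app is configured, the webhook endpoint receives Zoom’s “recording.completed” event. The sketch below shows how a handler might pull the per-participant audio files out of that payload; the field names follow Zoom’s webhook payload shape but should be verified against the Zoom webhook reference for your app version, and this is illustrative rather than the integration’s actual code.

```python
def extract_audio_recordings(event_body):
    """Pull per-participant audio download URLs out of a
    recording.completed webhook payload (assumed shape)."""
    meeting = event_body.get("payload", {}).get("object", {})
    files = meeting.get("recording_files", [])
    # Audio-only recordings are typically reported with file_type "M4A"
    return [f["download_url"] for f in files if f.get("file_type") == "M4A"]
```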

Step 2: Download and Build Source Code

In this step, you’ll download and build the source code for the Zoom Symbl Integration on AWS.

  1. Clone or Download the Zoom Symbl Integration Source Code
  2. Use Visual Studio Code or any other editor of your choice to open the source code
  3. On VS Code → Terminal → Select New Terminal
  4. Type the command sam build and press Enter to build the serverless apps

Below is the screenshot showing the output of the “sam build” command.

(test_venv) ranjandailata@Ranjans-MacBook-Pro Sam % sam build

Building codeuri: /Users/ranjandailata/Downloads/Sam/ZoomSymblWebhook runtime: python3.9 metadata: {} architecture: x86_64 functions: ['ZoomSymblWebhook/ZoomSymblWebhook']

Build Succeeded

Built Artifacts   : .aws-sam/build

Built Template  : .aws-sam/build/template.yaml

Commands you can use next


[*] Invoke Function: sam local invoke

[*] Test Function in the Cloud: sam sync --stack-name {stack-name} --watch

[*] Deploy: sam deploy --guided

You should see the structure below, with the “build” folder containing the list of applications and the template file that drives the application deployment.

Step 3: Deploying the Application

In this step, you’ll see how to deploy the serverless application using the “sam deploy” command. The application deployment is handled in a guided manner by making use of the command “sam deploy --guided”.

Note – The “Serverless Application” stack deployment incorporates the components (e.g., Lambda Layers, S3, SQS, DynamoDB, Lambda Functions) required for the Zoom Symbl Integration. These aspects of deployment are taken care of by the SAM-guided deployment.

(test_venv) ranjandailata@Ranjans-MacBook-Pro SAM % sam deploy --guided                                     

Configuring SAM deploy


        Looking for config file [samconfig.toml] :  Found

        Reading default arguments  :  Success

        Setting default arguments for 'sam deploy'


        Stack Name [ZoomSymblStack]: 

        AWS Region [us-east-1]: 

        #Shows you resources changes to be deployed and require a 'Y' to initiate deploy

        Confirm changes before deploy [Y/n]: Y

Continue with the stack deployment, and you should see output like the below. Confirm with “y” to deploy the change set.

Step 4: Configuring the Custom Zoom App Webhook Endpoint

The custom Zoom application must be configured with the recording completed webhook endpoint so the application can receive the “recording completed” event information used to process the recordings.

  1. Make sure to log in to the Zoom Developer Account
  2. Navigate to the Created Apps section
  3. Select the existing app that you wish to configure. Here’s an example.
  4. Navigate to the “Feature” section
  5. Under Event Subscriptions, click Add Event Subscription
  6. Select the Event Types → Recording
  7. Check the “All Recordings have completed” option
  8. Click Done
  9. Specify the Event Notification URL endpoint with the Zoom Recording Complete Lambda API Gateway endpoint
  10. Click the Save button

Step 5: Configuring the Lambda for Asynchronous Execution

Background – The sam deploy command builds the Zoom Symbl Integration stack. Every time you perform a clean build and deploy, you’ll notice the lambdas getting created with a unique name.

Per the specified architecture, two of the lambdas are invoked asynchronously. You therefore have to configure each caller lambda with the name of the lambda it invokes. Please make sure to update the lambda function name for the “Zoom Recording Completed” lambda and for the “Zoom Symbl Webhook Notifier” lambda.
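An asynchronous lambda-to-lambda call uses the “Event” invocation type, which returns immediately without waiting for the target to finish. The sketch below builds the arguments for boto3’s `invoke` call; the function name shown is a hypothetical example of the unique names SAM generates.

```python
import json

def build_async_invoke_args(function_name, payload):
    """Arguments for an asynchronous (fire-and-forget) Lambda invocation."""
    return {
        "FunctionName": function_name,
        "InvocationType": "Event",  # async: do not wait for the target lambda
        "Payload": json.dumps(payload).encode("utf-8"),
    }

# With boto3 (assumed, illustrative function name):
# import boto3
# boto3.client("lambda").invoke(
#     **build_async_invoke_args("CopyZoomRecordingToS3-AbC123", event_body))
```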

Step 6: Configuring the Lambda Environment Variables

This step is dedicated to configuring the lambda environment variables with the AWS Region, App Secret, Zoom JWT Token, etc. You’ll learn how to configure them.

Here’s a high-level summary of each lambda function and its environment variables, with descriptions that will help you understand and update the required aspects of this integration. Next, you’ll see a screenshot explaining how to update the environment variables.

Lambda Function | Environment Variable | Description

CopyZoomRecordingToS3 – the “Producer” lambda from the architecture (Step 4). This lambda function is responsible for copying the Zoom recordings to S3, persisting the metadata in DynamoDB, and sending a message to SQS for processing.

  - Zoom JWT token variable: specify your custom Zoom app JWT token.
  - SQS queue ARN variable: set with the appropriate Region and Account ID. Alternatively, open the SQS queue named symbl-zoom and copy the ARN.

SubmitZoomRecordingToSymbl – the “Consumer” lambda from the architecture (Step 5). This lambda function is responsible for submitting or posting the Zoom recordings to Symbl.

  - Region variable: set with the ${AWS::Region}, e.g., us-east-1.

ZoomSymblWebhook – the “Notifier” lambda from the architecture (Step 8). This lambda function is responsible for post-processing of Symbl webhook notifications.

  - Region variable: set with the ${AWS::Region}, e.g., us-east-1.


Lambda – CopyZoomRecordingToS3 

Lambda – SubmitZoomRecordingToSymbl

Lambda – ZoomSymblWebhook
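Updating environment variables can also be done programmatically instead of through the console. The helper below merges new values into a function’s existing variables; the variable names are hypothetical placeholders, since the actual names come from the deployed stack. Note that boto3’s `update_function_configuration` replaces the whole Environment block, which is why merging first matters.

```python
def merged_environment(existing_vars, updates):
    """Merge new variables into a lambda's existing environment.
    update_function_configuration replaces the whole Environment,
    so existing values must be carried over."""
    env = dict(existing_vars)
    env.update(updates)
    return {"Variables": env}

# With boto3 (function and variable names are illustrative):
# import boto3
# client = boto3.client("lambda")
# current = client.get_function_configuration(
#     FunctionName="CopyZoomRecordingToS3-AbC123")["Environment"]["Variables"]
# client.update_function_configuration(
#     FunctionName="CopyZoomRecordingToS3-AbC123",
#     Environment=merged_environment(current, {"ZOOM_JWT_TOKEN": "..."}))
```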

Step 7: Running the Application

In this step, you’ll see how to run the application. Please follow the below-mentioned steps to initiate the Zoom meeting, add participants, record, and end the meeting.

  1. Create a Zoom Meeting
  2. If you wish, you may invite participants
  3. Make sure to record the meeting
  4. End the meeting

After the Zoom meeting, you’ll have to wait a couple of minutes for the recordings to be processed. The wait depends on various factors, for example, the number of participants and the meeting duration. You should then receive an email notification containing the “summary” and the meeting URL, where you can visually review the transcription and insights.


  1. Please keep in mind the Zoom Cloud Recording Limits
    1. Zoom Cloud Recording Limitation
    2. Zoom Cloud Recordings Per Participant
  2. Zoom license limits. Depending on your Zoom license, there are restrictions on the number of active participants in a given meeting. Here’s an example. Follow the Zoom Pricing page for insights on Zoom pricing.
  3. Cloud recording processing time varies with the number of audio files and the meeting length. That said, Lambda has a maximum execution time of 15 minutes; nothing can run beyond that. There are also other constraints, such as maximum RAM and storage, which cannot exceed 10 GB. The CopyZoomRecordingToS3 lambda produces the multichannel stereo audio file: it downloads the cloud recordings and merges them using FFmpeg. Keep these limits in mind while you test the integration.


How to roll back the deployment?

In case of errors, if you are unable to deploy and decide to roll back, please run the following command.

aws cloudformation rollback-stack --stack-name ZoomSymblStack

How can I delete a stack?

Please run the following command.

aws cloudformation delete-stack --stack-name ZoomSymblStack

Wish to change the default stack name?

Open the file named “samconfig.toml” and look for stack_name = "ZoomSymblStack". You may specify the relevant stack name that you wish to use.

Unable to deploy the stack due to the capabilities issue?

Open the file named “samconfig.toml” and look for the “capabilities”. Please make sure that the capabilities are specified with "CAPABILITY_IAM CAPABILITY_AUTO_EXPAND"

How to deal with the Gmail authentication error?

Below is one common error that you might face; you might encounter similar issues while programmatically sending emails via Gmail.

Error: (534, b'5.7.14 <\n5.7.14 YpFVq7pyRFp0RTWibzW52rsRb6u6s44cd5x0VtpGJuYCZynSZRrrDkv5kK1R8D3Smp1sQ\n5.7.14 VkGBcvGmbXvv4v1Guv6jGLZCjKYGyilqL-zEp71dBFjZVU4zowNLpMtgKcDE_V4G>\n5.7.14 Please log in via your web browser and then try again.\n5.7.14  Learn more at\n5.7.14 d1-20020a37b401000000b0069fc13ce21esm2500410qkf.79 - gsmtp')
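This 534 error generally means Google rejected a plain username/password login; the usual fix is to enable two-factor authentication on the Google account and use a generated App Password for SMTP. The sketch below builds the notification email with the standard library; the sender, recipient, and the commented-out SMTP send are illustrative, not the integration’s actual code.

```python
from email.message import EmailMessage

def build_summary_email(sender, recipient, summary_text, ui_url):
    """Compose the meeting-summary email sent by the Notifier lambda."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = "Your Zoom meeting summary"
    msg.set_content(f"{summary_text}\n\nView the meeting: {ui_url}")
    return msg

# Sending via Gmail requires an App Password, not the account password:
# import smtplib
# with smtplib.SMTP_SSL("smtp.gmail.com", 465) as server:
#     server.login("sender@example.com", app_password)
#     server.send_message(msg)
```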

How to deal with the Lambda Layer Version mismatch issues?

This integration uses two Lambda Layers, ffmpeg and requests, which are set up as part of the stack. The layer version is automatically incremented by AWS: if you delete a layer and set it up again, the version number does not reset. The “SubmitZoomRecordingToSymbl” and “CopyZoomRecordingToS3” lambdas depend on these layers, so make a note of the Lambda Layer version and make sure to use it as part of the stack deployment.

How can I monitor the Symbl Job status?

Log in to AWS and then search for DynamoDB. You’ll see the below-mentioned DynamoDB tables that are used for processing the recordings.

zoom_recordings_jobs is the table that you need to look for. It keeps track of the job_id, conversation_id, meeting_uuid, and status.
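A quick way to check a job is to read its row from that table with boto3. The helper below renders a row as a readable status line using the attributes listed above; the key schema in the commented boto3 call is an assumption, so check the deployed table’s actual key before using it.

```python
def job_status_line(item):
    """Render one zoom_recordings_jobs row as a readable status line."""
    return (f"job={item.get('job_id')} "
            f"conversation={item.get('conversation_id')} "
            f"meeting={item.get('meeting_uuid')} "
            f"status={item.get('status')}")

# With boto3 (key name assumed; verify against the deployed table):
# import boto3
# table = boto3.resource("dynamodb").Table("zoom_recordings_jobs")
# item = table.get_item(Key={"meeting_uuid": meeting_uuid})["Item"]
# print(job_status_line(item))
```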

How to get insights after lambda processing? Are there logs?

Log in to AWS and search for CloudWatch. Once on CloudWatch, navigate to the Log groups section and search for the keyword zoom. Select the log group of interest to learn more about the ongoing activity in that lambda function.
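The same lookup can be scripted. The filter below mirrors the console search for “zoom”; the log group names in the test are hypothetical examples of what SAM generates for these lambdas.

```python
def zoom_log_groups(group_names):
    """Return the CloudWatch log groups related to this integration,
    mirroring a console search for the keyword 'zoom'."""
    return [g for g in group_names if "zoom" in g.lower()]

# With boto3:
# import boto3
# logs = boto3.client("logs")
# names = [g["logGroupName"]
#          for g in logs.describe_log_groups()["logGroups"]]
# for group in zoom_log_groups(names):
#     print(group)
```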

Moving Forward with Zoom Symbl Integration on AWS Serverless Infrastructure

The Zoom Symbl Integration with the “Serverless Architecture” demonstrates how to consume or ingest Zoom Cloud Recordings and asynchronously process them by merging all audio recordings, then submit the merged audio to Symbl using the Async Audio mechanism to extract intelligence such as Topics, Action Items, Follow-ups, and summaries. To build a full-scale production-ready system, you should set up dedicated infrastructure such as EC2 and implement the workflow outlined in the “High-Level Architecture” section.
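The Async Audio submission at the heart of this workflow can be sketched as follows. This builds the request against Symbl’s Async Audio URL endpoint; the endpoint path and body fields follow Symbl’s public API, but treat the token handling and meeting name as illustrative assumptions and verify the details against the Symbl Async Audio documentation.

```python
def build_async_audio_request(access_token, audio_url, meeting_name):
    """Arguments for submitting a recording URL to Symbl's Async Audio API."""
    return {
        "url": "https://api.symbl.ai/v1/process/audio/url",
        "headers": {
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        "json": {"url": audio_url, "name": meeting_name},
    }

# With the requests library (the response carries a jobId and
# conversationId used to track status and fetch insights):
# import requests
# resp = requests.post(**build_async_audio_request(token, s3_url, "Standup"))
# job_id = resp.json()["jobId"]
```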

For storing the recordings and handling the Symbl concurrency aspects, the recommended best practice is to use a relational database such as MySQL or Microsoft SQL Server (MSSQL), since these support ACID properties by default. It is also easy to extract data and build analytics on top of relational data.

READ MORE: Your Most Common AWS Lambda Challenges: Integrating with Zoom APIs

Surbhi Rathore

Surbhi is co-founder and CEO of Symbl, a technology company that makes it simple to deploy contextual AI and analytics across voice, text, and video communication for software at any stage. Symbl is now a Series A startup with $24M in venture financing and 70+ team members across the globe.