Batch Processing API

Eden AI's Batch Processing API enables businesses and developers to handle large datasets with minimal overhead.

What is Batch Processing?

Batch processing is a powerful method for executing multiple AI requests simultaneously, significantly improving efficiency and reducing computational overhead.

Instead of processing each request individually in real-time, batch processing allows you to send a bulk request containing multiple data points—such as images, text, or documents—and receive the results asynchronously.

Easy-to-Use

It’s user-friendly, with simple API integration and asynchronous processing that lets multiple tasks run without interrupting other operations. It supports various data types, making it efficient for handling large volumes of data.

Boost Efficiency

Save time by drastically reducing processing time: you no longer need to handle each request separately, and multiple tasks can run simultaneously without manual intervention.

What can be batched?

Eden AI's batch processing supports automating AI workflows across multiple services, including text analysis, image processing, OCR and document parsing, translation, and audio processing.

How to Implement Batch Processing? 

Step 1: Send a batch request

The Batch Processing API is asynchronous, so you first send a POST request containing all the data you want to process (text or file URLs).

Here is a minimal Python sketch using the requests library. The endpoint path, payload fields, and the sentiment-analysis example are illustrative assumptions; check the Eden AI documentation for the exact batch endpoint and parameters of the service you use.
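```python
import requests

# NOTE: the endpoint path and payload shape below are illustrative
# assumptions; consult the Eden AI docs for your service's batch endpoint.
API_KEY = "YOUR_EDEN_AI_API_KEY"  # placeholder
url = "https://api.edenai.run/v2/text/sentiment_analysis/batch/my_batch_job"

headers = {"Authorization": f"Bearer {API_KEY}"}

# One entry per item to process in the batch.
payload = {
    "requests": [
        {"text": "The onboarding flow was smooth and intuitive.", "language": "en"},
        {"text": "Support took three days to reply.", "language": "en"},
    ]
}

response = requests.post(url, json=payload, headers=headers)
response.raise_for_status()

job = response.json()
# Keep the identifier returned by the API; it is needed to retrieve results later.
print(job)
```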

Step 2: Retrieve the processed results

Once processing is complete, use the request ID to fetch the results with a GET request.
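As a rough illustration, assuming the same hypothetical endpoint as above and that the POST response returned a request ID, the results could be fetched like this:

```python
import requests

API_KEY = "YOUR_EDEN_AI_API_KEY"      # placeholder
request_id = "YOUR_BATCH_REQUEST_ID"  # identifier returned by the POST request

# The URL pattern is an illustrative assumption; check the Eden AI docs
# for the exact path used to poll a batch job by its identifier.
url = f"https://api.edenai.run/v2/text/sentiment_analysis/batch/{request_id}"

headers = {"Authorization": f"Bearer {API_KEY}"}

response = requests.get(url, headers=headers)
response.raise_for_status()

results = response.json()
# Inspect the job status and, once processing is complete, the per-item results.
print(results)
```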

Step 3: Review batch history on the app

You can review your batch processing history for better tracking and optimization directly within the Eden AI dashboard.

You can view all submitted batch projects, including their status, number of requests, and success/failure rates.

Explore More on Batch Processing