Batch API results for large requests and expired jobs

Subject: Batch Job Expired: Retrieving Partial Results and Questions on Concurrency

Hi everyone,

I recently submitted a batch job containing approximately 50k requests. The processing rate was about 15k requests per day. However, after 2 days, the batch status changed to expired.

I have three specific questions regarding this situation:

1. Retrieving partial results: Even though the batch is marked as “expired,” over 25k requests were successfully processed before the expiration. Is there a way to retrieve the output/results for these completed requests? At the moment, based on the documentation, we should get the file_name from dest, but when the job is expired, dest is an empty dict, so we cannot download anything.

2. Resuming the job: Is there a mechanism to “resume” the expired job or ask it to continue processing the remaining items, or do I need to create a new batch?

3. Concurrency and throughput: Does a single batch job utilize the full available capacity? If I split these 50k requests into 10 separate batch jobs (5k requests each) instead of one large job, would processing speed increase (e.g., 10x concurrency), or is there a single global queue/rate limit that would make splitting them ineffective?

Thanks in advance for your help.

@Sonali_Kumari1 could you please help us with these questions or tag the relevant person?