The Batch API lets you run multiple DeepResearch tasks in parallel with shared configuration, unified monitoring, and aggregated cost tracking.

When to Use Batching

  • Bulk operations - Process 10-100 research queries at once
  • Shared settings - Apply the same mode, output formats, and search filters to all tasks
  • Unified tracking - Monitor progress and costs across all tasks in one place

For individual tasks with unique configurations or advanced features (files, deliverables, MCP servers), use the standard DeepResearch API instead.

Quick Example

```python
from valyu import Valyu

valyu = Valyu()

# 1. Create a batch
batch = valyu.batch.create(
    name="Market Research Q4",
    mode="standard",
    output_formats=["markdown"]
)

if batch.success:
    # 2. Add tasks
    valyu.batch.add_tasks(batch.batch_id, [
        {"query": "Analyze AI trends in healthcare"},
        {"query": "Review renewable energy market"},
        {"query": "Research fintech innovations"}
    ])

    # 3. Wait for completion
    result = valyu.batch.wait_for_completion(
        batch.batch_id,
        on_progress=lambda s: print(f"Progress: {s.batch.counts.completed}/{s.batch.counts.total}")
    )

    print(f"Completed! Total cost: ${result.batch.cost}")
```
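The inline lambda above can be factored into a named callback when you want richer progress output, such as a percentage. A minimal sketch; the only assumption is that the status object passed to `on_progress` exposes the `batch.counts.completed` and `batch.counts.total` fields used in the lambda example:

```python
def print_progress(status):
    """Progress callback for wait_for_completion.

    Assumes status exposes batch.counts.completed and batch.counts.total,
    as in the lambda example above.
    """
    counts = status.batch.counts
    pct = 100 * counts.completed / counts.total if counts.total else 0
    print(f"Progress: {counts.completed}/{counts.total} ({pct:.0f}%)")
```

Pass it as `on_progress=print_progress` in place of the lambda.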

Batch Lifecycle

Status                  Description
open                    Batch created, ready to accept tasks
processing              Tasks are queued, running, or completed
completed               All tasks finished successfully
completed_with_errors   All tasks finished, some failed
cancelled               Batch was cancelled
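If you poll batch status yourself instead of calling `wait_for_completion`, it helps to distinguish terminal states from in-flight ones. A small helper derived from the lifecycle table above (the status strings are taken directly from it):

```python
# Terminal states from the lifecycle table: a batch in one of
# these states will not transition any further.
TERMINAL_STATUSES = {"completed", "completed_with_errors", "cancelled"}

def is_terminal(status: str) -> bool:
    """Return True once a batch can no longer change state."""
    return status in TERMINAL_STATUSES
```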

Retrieve Results

```python
# List all tasks in the batch
tasks = valyu.batch.list_tasks(batch_id)

for task in tasks.tasks:
    if task.status == "completed":
        # Get full result using the DeepResearch API
        result = valyu.deepresearch.status(task.deepresearch_id)
        print(f"Query: {task.query}")
        print(f"Output: {result.output[:200]}...")
```
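Because a batch can finish as completed_with_errors, it is often useful to separate successful tasks from the rest before fetching full results. A minimal sketch; it assumes only the `status` attribute used in the loop above, and groups all non-completed tasks together since the exact task-level failure statuses are not shown here:

```python
def partition_tasks(tasks):
    """Split tasks into (completed, not_completed) lists by status."""
    completed = [t for t in tasks if t.status == "completed"]
    not_completed = [t for t in tasks if t.status != "completed"]
    return completed, not_completed
```

Fetch full results only for the first list, and log or retry the second.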

Next Steps