Fan-Out
Fan-Out distributes multiple tasks across worker nodes for parallel execution, then aggregates the results. This pattern is ideal for processing large datasets, running comparative analyses, or any workload that can be decomposed into independent sub-tasks.
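Conceptually, the pattern is a concurrency-limited worker pool: a fixed number of workers pull tasks from a shared list until it is empty, and each task's outcome is recorded independently. The sketch below shows the mechanic in plain TypeScript, independent of the SDK (the SDK handles this internally; `fanOut` here is an illustrative name, not a library export):

```typescript
// Minimal sketch of the fan-out mechanic: run async tasks with a
// concurrency cap and collect every outcome, success or failure.
async function fanOut<T>(
  tasks: Array<() => Promise<T>>,
  concurrency: number,
): Promise<Array<PromiseSettledResult<T>>> {
  const results: Array<PromiseSettledResult<T>> = new Array(tasks.length);
  let next = 0;
  const worker = async (): Promise<void> => {
    while (next < tasks.length) {
      const i = next++; // claim the next unstarted task (JS is single-threaded, so no race)
      try {
        results[i] = { status: 'fulfilled', value: await tasks[i]() };
      } catch (reason) {
        results[i] = { status: 'rejected', reason };
      }
    }
  };
  // Spawn at most `concurrency` workers; fewer if there are fewer tasks.
  const workers = Array.from({ length: Math.min(concurrency, tasks.length) }, () => worker());
  await Promise.all(workers);
  return results;
}
```

The SDK's failure strategies and aggregation step are layered on top of this basic pool.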
executeFanOut
Execute tasks in parallel and collect results.
executeFanOut(options: FanOutOptions): Promise&lt;FanOutResult&gt;

Parameters
interface FanOutOptions {
/** List of tasks to execute in parallel. */
tasks: CreateTaskOptions[];
/** Maximum concurrent tasks. Default: 10 */
concurrency?: number;
/** Strategy for handling failures. Default: 'continue' */
failureStrategy?: 'continue' | 'abort-all' | 'abort-remaining';
/** Aggregation function applied to results. */
aggregate?: {
/** Prompt that receives all task outputs for aggregation. */
prompt: string;
/** Model for the aggregation step. */
model?: string;
};
/** Overall timeout for the entire fan-out operation. */
timeoutMs?: number;
/** Callback on each task completion. */
onTaskComplete?: (task: Task, index: number, total: number) => void;
}

Return Value
interface FanOutResult {
id: string;
status: 'completed' | 'partial' | 'failed';
tasks: Task[];
aggregatedOutput?: string;
summary: {
total: number;
completed: number;
failed: number;
cancelled: number;
totalDurationMs: number;
avgDurationMs: number;
};
usage: TokenUsage;
}

Examples
Basic Fan-Out
Process a batch of documents in parallel:
const documents = ['doc1.txt', 'doc2.txt', 'doc3.txt', 'doc4.txt', 'doc5.txt'];
// docContents: Record<string, string> mapping each filename to its previously loaded text
const results = await client.executeFanOut({
tasks: documents.map((doc) => ({
prompt: `Summarize this document in 3 sentences: ${docContents[doc]}`,
model: 'sonnet',
tags: ['summarization', doc],
})),
concurrency: 5,
onTaskComplete: (task, i, total) => {
console.log(`[${i + 1}/${total}] ${task.tags[1]}: ${task.status}`);
},
});
console.log(`Completed: ${results.summary.completed}/${results.summary.total}`);
results.tasks.forEach((t) => {
console.log(`${t.tags[1]}: ${t.output?.slice(0, 100)}...`);
});

Fan-Out with Aggregation
Process items in parallel, then combine results:
const competitors = ['Acme Corp', 'GlobalTech', 'InnoSoft', 'DataDyne'];
const results = await client.executeFanOut({
tasks: competitors.map((company) => ({
prompt: `Research ${company}: market position, strengths, weaknesses, recent news.`,
model: 'sonnet',
tags: ['competitor-analysis', company],
})),
concurrency: 4,
aggregate: {
prompt: `Given the following competitor analyses, create a comparative market report.
Rank competitors by overall threat level and identify key differentiators.
Format as a structured table.`,
model: 'opus',
},
});
console.log('Comparative Report:');
console.log(results.aggregatedOutput);

Fan-Out with Failure Handling
const results = await client.executeFanOut({
tasks: largeDataset.map((item) => ({
prompt: `Process: ${item}`,
model: 'haiku',
timeoutMs: 10_000,
})),
concurrency: 20,
failureStrategy: 'continue', // Keep going even if some fail
timeoutMs: 300_000,
});
if (results.status === 'partial') {
console.warn(`${results.summary.failed} tasks failed`);
const failed = results.tasks.filter((t) => t.status === 'failed');
failed.forEach((t) => console.error(t.id, t.error));
}

Failure Strategies
| Strategy | Behavior |
|---|---|
| `continue` | All tasks run regardless of failures. Returns `partial` if any failed. |
| `abort-all` | Cancel all tasks (including running ones) on first failure. |
| `abort-remaining` | Let running tasks finish, but do not start new ones after a failure. |
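Under any strategy, the overall `status` field presumably follows from the per-task counts in `summary`. A plausible mapping, inferred from the table above and the `FanOutResult` interface (this is an assumption, not the SDK's actual implementation):

```typescript
// Hypothetical derivation of FanOutResult.status from the summary counts:
// everything succeeded -> 'completed'; some succeeded -> 'partial'; none -> 'failed'.
function overallStatus(summary: {
  total: number;
  completed: number;
  failed: number;
  cancelled: number;
}): 'completed' | 'partial' | 'failed' {
  if (summary.completed === summary.total) return 'completed';
  if (summary.completed > 0) return 'partial';
  return 'failed';
}
```

Note that under `abort-all`, cancelled tasks count against completion, so an early failure can yield `failed` even though most tasks never ran.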
Multi-Model Fan-Out
Run the same prompt across different models for comparison:
const models = ['opus', 'sonnet', 'gpt4o', 'gemini-pro', 'llama-70b'];
const results = await client.executeFanOut({
tasks: models.map((model) => ({
prompt: 'Explain quantum entanglement to a 10-year-old in 3 sentences.',
model,
tags: ['model-comparison', model],
})),
concurrency: 5,
aggregate: {
prompt: `Rate each model response on clarity (1-10), accuracy (1-10), and engagement (1-10).
Declare a winner with justification.`,
model: 'opus',
},
});
console.log(results.aggregatedOutput);

Fan-out tasks are distributed across all available workers in the specified queue. For maximum parallelism, ensure you have enough workers to handle the concurrency level.
Performance Tips
- Set `concurrency` based on your worker pool size. Excess concurrency queues tasks without benefit.
- Use `haiku` or other fast models for high-volume fan-out to minimize cost and latency.
- For datasets larger than 100 items, consider chunking into multiple fan-out calls.
- The aggregation step runs after all tasks complete. Keep individual task outputs concise to avoid hitting context limits on the aggregation model.
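One way to keep the aggregation input bounded is to cap each task's contribution before it reaches the aggregation prompt. The helper below is a hypothetical sketch, not part of the SDK (`buildAggregationInput` and its default cap are illustrative assumptions):

```typescript
// Hypothetical helper: truncate each task output to a fixed budget so the
// combined aggregation input stays within the aggregation model's context window.
function buildAggregationInput(outputs: string[], maxCharsPerTask = 2000): string {
  return outputs
    .map((o, i) => {
      const body = o.length > maxCharsPerTask ? o.slice(0, maxCharsPerTask) + ' [truncated]' : o;
      return `--- Task ${i + 1} ---\n${body}`;
    })
    .join('\n\n');
}
```

A character budget is a crude proxy for tokens; for tighter control, measure with the aggregation model's own tokenizer instead.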
Next Steps
- HITL -- Add human approval to fan-out results
- Multi-Agent Patterns -- Advanced orchestration patterns