Chain actors into automated pipelines
Pipeline Preflight is a pipeline validation and code generation tool that checks field mappings between Apify actor stages, detects type mismatches, and generates complete TypeScript orchestration code, all for $0.35 per build. It validates that the output of stage N maps correctly to the input of stage N+1 using real actor schemas, then generates Actor.call() chains with dataset forwarding.
Multi-actor pipelines break at stage boundaries when field names don't match, types are incompatible, or required fields are missing. Pipeline Preflight catches these issues before you write any orchestration code, and generates that code for you once the pipeline validates.
Checks that output fields from stage N exist in stage N+1's input schema. Catches broken field mappings that would cause silent failures or missing data at runtime.
Flags field mappings where output type doesn't match input type — for example, a string output mapped to an array input. Suggests transformations when possible.
Generates complete orchestration code using the Apify SDK's Actor.call() method with dataset forwarding between stages, error handling, and structured logging.
Sums PPE prices across all pipeline stages for per-run cost projections. Also projects monthly costs at 100 and 1,000 runs for budget planning.
Shows input fields, output fields, PPE price, default memory allocation, and timeout for every stage. Full visibility into what each actor expects and produces.
Fetches input and output schemas from each actor's latest build via the Apify API. Validates against real, current schemas — not assumptions or documentation.
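The field-existence and type checks described above can be sketched in TypeScript. The `Field` shape below is an illustrative assumption, not Pipeline Preflight's internal representation:

```typescript
// Sketch of a stage-boundary check: compare one stage's output fields
// against the next stage's input schema.
type Field = { name: string; type: string };

interface BoundaryIssue {
  field: string;
  kind: 'missing' | 'type-mismatch';
  detail: string;
}

function checkBoundary(outputs: Field[], inputs: Field[]): BoundaryIssue[] {
  const issues: BoundaryIssue[] = [];
  const inputByName = new Map(inputs.map((f) => [f.name, f]));
  for (const out of outputs) {
    const target = inputByName.get(out.name);
    if (!target) {
      // Field exists in stage N's output but not in stage N+1's input schema.
      issues.push({
        field: out.name,
        kind: 'missing',
        detail: `'${out.name}' not found in next stage's input schema`,
      });
    } else if (target.type !== out.type) {
      // Field name matches but the declared types are incompatible.
      issues.push({
        field: out.name,
        kind: 'type-mismatch',
        detail: `${out.type} output mapped to ${target.type} input`,
      });
    }
  }
  return issues;
}
```

For example, an `urls` output with no matching input field is reported as `missing`, while a `string` output mapped to an `array` input of the same name is reported as `type-mismatch`.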
There are several ways to build multi-actor data pipelines on Apify. Each trades off validation depth, code generation, and setup time.
| Method | Field validation | Code generation | Cost |
|---|---|---|---|
| Pipeline Preflight | Automated field + type validation | Complete TypeScript with Actor.call() | $0.35/build |
| Manual Actor.call() coding | Manual schema review | Manual (2-4 hours) | Free (development time) |
| Webhook chaining | No validation | Manual webhook configuration | Free |
| Trial-and-error pipeline testing | Runtime errors only | Manual debugging | Compute cost per failed run |
```json
{
  "stages": 3,
  "valid": true,
  "warnings": ["Stage 2: field 'rating' not in Stage 1 output"],
  "generatedCode": "import { Actor } from 'apify';\n\nActor.main(async () => {\n  const run1 = await Actor.call(...);\n  ...\n});",
  "costEstimate": {
    "perRun": 0.45,
    "monthly100": 45.00,
    "monthly1000": 450.00
  }
}
```

Define your pipeline stages with actor IDs and field mappings
Pipeline Preflight validates field mappings and type compatibility across all stages
Get validated pipeline with generated TypeScript code and cost estimates
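A stage definition for the first step might look like the following. The field names here are illustrative assumptions, not the tool's exact input schema:

```json
{
  "stages": [
    {
      "actorId": "apify/google-search-scraper",
      "fieldMappings": { "url": "startUrl" }
    },
    {
      "actorId": "apify/web-scraper",
      "fieldMappings": { "pageTitle": "title" }
    }
  ]
}
```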
Several approaches exist for building multi-actor data pipelines on Apify, from manual coding to webhook chains.
Write TypeScript orchestration code manually by reading each actor's schema, mapping fields, and implementing error handling. Full control but requires 2-4 hours per pipeline and manual schema review with no automated type checking.
Best for: experienced Apify developers building complex pipelines with custom logic.
Configure Apify webhooks to trigger the next actor when the previous one completes. Simple to set up but provides no field mapping validation, no type checking, and limited error handling. Debugging failures requires checking each stage independently.
Best for: simple two-stage pipelines with straightforward data forwarding.
Create multiple Apify tasks and trigger them sequentially via the API or webhooks. More structured than raw webhook chaining but still requires manual field mapping and has no built-in validation or code generation.
Best for: pipelines where each stage uses the same actor with different inputs.
Use established workflow orchestration tools to manage Apify actor pipelines. Powerful but requires significant infrastructure setup, DevOps expertise, and ongoing maintenance. Overkill for most Apify-only pipelines.
Best for: enterprise teams with existing orchestration infrastructure and complex scheduling needs.
Automated pipeline validation with field mapping checks, type mismatch detection, and TypeScript code generation. $0.35 per build. No manual schema review or code writing required — get a validated, runnable pipeline in seconds.
Best for: developers who want validated, working pipeline code without manual schema inspection.
Every pipeline build executes on your own Apify account at the standard pay-per-event rate of $0.35 per build. ApifyForge has no platform fee or subscription. Apify's free plan includes $5/month in credits, enough for approximately 14 pipeline builds per month.
Pipeline Preflight validates multi-actor data pipelines by checking that output fields from each stage map correctly to input fields of the next stage. It detects type mismatches, missing fields, and schema incompatibilities. Then it generates complete TypeScript orchestration code using Actor.call() chains with dataset forwarding between stages, plus cost estimates for per-run and monthly operations.
Each Pipeline Preflight run costs $0.35, charged as a pay-per-event (PPE) fee on your own Apify account. The tool reads actor schemas from the Apify API — it does not trigger any actor runs. Apify's free tier includes $5/month in credits, enough for approximately 14 pipeline builds per month.
Pipeline Preflight generates TypeScript code using the Apify SDK's Actor.call() method. The generated code includes proper dataset forwarding between stages, error handling, and logging. You can use it directly in a new Apify actor or adapt it for use in external Node.js/TypeScript applications.
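As a rough sketch of what this kind of code generation involves, the function below emits an Actor.call() chain with dataset forwarding as a source string. The stage shape and the emitted client calls are simplified assumptions, not Pipeline Preflight's actual output:

```typescript
// Hypothetical generator: build TypeScript source for a sequential
// Actor.call() chain that forwards dataset items between stages.
interface Stage {
  actorId: string;
}

function generateOrchestration(stages: Stage[]): string {
  const lines: string[] = [
    "import { Actor } from 'apify';",
    '',
    'Actor.main(async () => {',
    '  let items: Record<string, unknown>[] = [];',
  ];
  stages.forEach((stage, i) => {
    const n = i + 1;
    lines.push(
      `  // Stage ${n}: ${stage.actorId}`,
      `  const run${n} = await Actor.call('${stage.actorId}', { items });`,
      `  if (run${n}.status !== 'SUCCEEDED') throw new Error('Stage ${n} failed');`,
      // Forward the finished stage's dataset items as the next stage's input.
      `  items = (await Actor.apifyClient.dataset(run${n}.defaultDatasetId).listItems()).items;`,
    );
  });
  lines.push('  await Actor.pushData(items);', '});');
  return lines.join('\n');
}
```

Each stage waits for the previous run to succeed before forwarding its dataset, which is the sequential, fail-fast structure the generated code described above provides.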
Pipeline Preflight fetches the input and output schemas from each actor's latest build. It checks that output fields from stage N exist as input fields in stage N+1's schema and that their types are compatible. For example, if stage 1 outputs 'urls' as an array but stage 2 expects 'url' as a string, Pipeline Preflight flags both a name mismatch and a type mismatch with a suggested transformation.
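For the urls-to-url example, the suggested transformation is typically a fan-out. A minimal sketch, with field names taken from the example rather than a real schema:

```typescript
// Fan out each item carrying a 'urls' array into one item per 'url'
// string, matching what the next stage's input expects in the example.
function fanOutUrls(items: { urls: string[] }[]): { url: string }[] {
  return items.flatMap((item) => item.urls.map((url) => ({ url })));
}

// fanOutUrls([{ urls: ['https://a.example', 'https://b.example'] }])
// → [{ url: 'https://a.example' }, { url: 'https://b.example' }]
```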
Yes. Pipeline Preflight supports pipelines with any number of stages. Each stage-to-stage transition is validated independently. A 5-stage pipeline produces 4 transition validations, each checking field mappings and type compatibility. The generated TypeScript code chains all stages sequentially with proper error handling at each step.
Pipeline Preflight's cost estimate sums the PPE prices across all stages for per-run projections. It also projects monthly costs at 100 and 1,000 runs. Compute costs (memory, duration) are not included in the estimate because they depend on input data size and target site behavior. Use ApifyForge Cost Calculator on each stage actor for compute cost projections.
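The arithmetic behind the estimate is straightforward; a sketch with illustrative per-stage PPE prices:

```typescript
// Sum per-stage PPE prices into a per-run cost, then project monthly
// totals at 100 and 1,000 runs. Prices here are illustrative only.
function estimateCost(stagePpePrices: number[]) {
  const perRun = stagePpePrices.reduce((sum, price) => sum + price, 0);
  return { perRun, monthly100: perRun * 100, monthly1000: perRun * 1000 };
}

// A 3-stage pipeline priced at $0.20, $0.15, and $0.10 per run comes to
// roughly $0.45 per run, $45 at 100 runs/month, and $450 at 1,000 runs/month.
```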
Yes. Pipeline Preflight reads schemas from any public Apify Store actor. You can mix your own actors with third-party Store actors in the same pipeline. The field mapping validation works the same regardless of actor ownership — it reads the declared input and output schemas from each actor's latest build.
Pipeline Preflight requires actors to have declared input_schema.json and/or dataset_schema.json to perform field mapping validation. Actors without declared schemas are flagged with a warning. You can still build the pipeline, but field mapping validation will be limited to stages where schemas are available. Approximately 15% of Apify Store actors lack output schemas.