Webhook

An Apify Webhook is an HTTP callback that Apify triggers automatically when specific actor events occur, enabling real-time integrations between your actors and external services without polling. When a configured event fires, such as an actor run succeeding, failing, or timing out, or a build completing, Apify sends an HTTP POST request to your specified URL with a JSON payload containing the run ID, actor ID, status, dataset ID, timing information, and other metadata. Webhooks are the primary mechanism for building automated, event-driven pipelines on the Apify platform.

Webhooks matter because modern data workflows are rarely single-step operations. A typical pipeline might scrape product data, enrich it with additional information, load it into a database, and notify a team via Slack, all triggered automatically without human intervention. Without webhooks, you would need to poll the Apify API repeatedly to check whether a run has finished, which wastes resources, adds latency, and is fragile. Webhooks deliver a notification the moment an event occurs, enabling real-time automation.

Supported webhook events include:

- ACTOR.RUN.CREATED (run queued)
- ACTOR.RUN.STARTED (run began executing)
- ACTOR.RUN.SUCCEEDED (run completed normally)
- ACTOR.RUN.FAILED (run crashed)
- ACTOR.RUN.TIMED_OUT (run exceeded its timeout)
- ACTOR.RUN.ABORTED (run manually stopped)
- ACTOR.BUILD.SUCCEEDED and ACTOR.BUILD.FAILED (build events)

A single webhook can be configured for one event or several. To configure a webhook via the Apify Console: navigate to your actor, open the Integrations tab, click Add Webhook, select the event(s), enter your endpoint URL, and optionally customize the payload template.
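As a sketch of what your endpoint receives, the snippet below parses a delivery shaped like Apify's default payload template (eventType, eventData, resource). The exact field names inside eventData and resource are assumptions to verify against your own deliveries:

```javascript
// Sketch of handling an incoming webhook POST body. The shape shown follows
// Apify's default payload template; field names inside `eventData` and
// `resource` are assumptions to verify against a real delivery.
function parseRunNotification(rawBody) {
  const payload = JSON.parse(rawBody);
  return {
    event: payload.eventType,                     // e.g. 'ACTOR.RUN.SUCCEEDED'
    runId: payload.eventData.actorRunId,          // which run fired the event
    actorId: payload.eventData.actorId,
    status: payload.resource.status,              // e.g. 'SUCCEEDED'
    datasetId: payload.resource.defaultDatasetId, // where the results were pushed
  };
}

// Example delivery (abridged):
const sample = JSON.stringify({
  eventType: 'ACTOR.RUN.SUCCEEDED',
  eventData: { actorId: 'abc123', actorRunId: 'run456' },
  resource: { status: 'SUCCEEDED', defaultDatasetId: 'ds789' },
});
const note = parseRunNotification(sample);
```

Extracting only the fields you need up front keeps the rest of the handler independent of the payload shape.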
Via the API, create a webhook with POST /v2/webhooks and a JSON body such as:

    {
      "eventTypes": ["ACTOR.RUN.SUCCEEDED", "ACTOR.RUN.FAILED"],
      "condition": { "actorId": "your-actor-id" },
      "requestUrl": "https://your-service.com/apify-webhook",
      "payloadTemplate": "{\"actorId\": {{actorId}}, \"runId\": {{runId}}, \"status\": {{status}}, \"datasetId\": {{defaultDatasetId}}}"
    }

Via the Apify client:

    const webhook = await client.webhooks().create({
      eventTypes: ['ACTOR.RUN.SUCCEEDED'],
      condition: { actorId: 'actor-id' },
      requestUrl: 'https://your-endpoint.com/hook',
    });

The payload template supports Mustache-like variables: {{actorId}}, {{runId}}, {{status}}, {{startedAt}}, {{finishedAt}}, {{defaultDatasetId}}, {{defaultKeyValueStoreId}}, and more. Customize the template to include exactly the fields your endpoint needs, which reduces parsing work on the receiving side.

Common webhook use cases include:

- sending a Slack notification when a scraper fails (ACTOR.RUN.FAILED with a Slack incoming webhook URL)
- triggering a downstream actor when an upstream run succeeds (use Actor.call() in the webhook handler to chain actors)
- pushing dataset results to Google Sheets via a middleware function
- updating a database with freshly scraped data
- refreshing a dashboard or cache
- alerting a monitoring system (PagerDuty, Datadog) about repeated failures

A common mistake is not handling webhook retries. If your endpoint returns an HTTP error (4xx or 5xx) or times out, Apify retries the webhook up to 5 times with exponential backoff (approximately 1 minute, 2 minutes, 4 minutes, 8 minutes, 16 minutes). Your endpoint must therefore be idempotent: processing the same webhook twice should produce the same result, not duplicate data. Use the runId as a deduplication key. Another mistake is not filtering events: if you only care about failures, subscribe to ACTOR.RUN.FAILED alone rather than to all events. Processing every SUCCEEDED event when you only need failures creates unnecessary load on your endpoint.
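The idempotency requirement can be sketched as a handler that dedupes on runId before doing any work. The in-memory Set here is for illustration only; a real service would use a durable store (a database row or Redis SET NX) so deduplication survives restarts:

```javascript
// Idempotency sketch: Apify may deliver the same webhook more than once
// (retries after 4xx/5xx responses or timeouts), so dedupe on runId.
// `processedRuns` is in-memory for illustration; use a durable store in practice.
const processedRuns = new Set();
let sideEffects = 0; // stands in for "rows written", "Slack messages sent", ...

function handleWebhook(payload) {
  const runId = payload.eventData.actorRunId;
  if (processedRuns.has(runId)) {
    // Already handled: acknowledge with 200 so Apify stops retrying,
    // but perform no work.
    return { status: 200, deduped: true };
  }
  processedRuns.add(runId);
  sideEffects += 1; // do the real work exactly once
  return { status: 200, deduped: false };
}

// A retried delivery of the same run is acknowledged but not reprocessed:
const delivery = { eventData: { actorRunId: 'run456' } };
const first = handleWebhook(delivery);
const retry = handleWebhook(delivery);
```

Note that the duplicate is still answered with 200: returning an error would only trigger further retries of a delivery you have already processed.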
Not validating webhook authenticity is a security risk. Apify does not sign webhook payloads with a shared secret by default, so any HTTP client could forge a webhook POST to your endpoint. Protect the endpoint by checking the X-Apify-Webhook-Dispatch-Id header, using a hard-to-guess URL path, or allowlisting the IP addresses Apify delivers webhooks from. For complex multi-step pipelines, combine webhooks with Actor.call() and Actor.metamorph() to build sophisticated DAG-style workflows in which each stage triggers the next automatically.

Related concepts: Actor Run, Actor Build, MCP Server, Dataset, Key-Value Store.
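One stage of such a chained workflow can be sketched as a handler that starts a downstream actor when the upstream run succeeds. The client.actor(id).call(input) shape mirrors the apify-client JavaScript package, but the actor name ('me/enricher') and the input field (inputDatasetId) are hypothetical; the client is injected so the logic is testable without network access:

```javascript
// Chaining sketch: on ACTOR.RUN.SUCCEEDED, start the next actor in the
// pipeline, passing the upstream run's dataset ID as input. Actor name and
// input field are hypothetical placeholders.
async function chainOnSuccess(payload, client, downstreamActorId) {
  if (payload.eventType !== 'ACTOR.RUN.SUCCEEDED') {
    return null; // ignore failures, aborts, timeouts
  }
  return client.actor(downstreamActorId).call({
    inputDatasetId: payload.resource.defaultDatasetId,
  });
}

// Exercised with a fake client that records what would have been started:
const started = [];
const fakeClient = {
  actor: (id) => ({
    call: async (input) => {
      started.push({ id, input });
      return { id: 'downstream-run' };
    },
  }),
};

chainOnSuccess(
  { eventType: 'ACTOR.RUN.SUCCEEDED', resource: { defaultDatasetId: 'ds789' } },
  fakeClient,
  'me/enricher'
);
// A failure event is ignored and starts nothing:
chainOnSuccess({ eventType: 'ACTOR.RUN.FAILED', resource: {} }, fakeClient, 'me/enricher');
```

Filtering on eventType inside the handler is a safety net; the primary filter should still be the webhook's own eventTypes configuration, so irrelevant events never reach the endpoint at all.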

Related Terms