Accept user input
This example accepts and logs user input.
Add data to dataset
This example saves data to the default dataset. If the dataset doesn't exist, it will be created. You can save data to custom datasets by using Apify.openDataset().
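Since the SDK may not be at hand when reading this, here is a minimal plain-JavaScript sketch of what a dataset is: an append-only store of JSON records. The Dataset class below only models the shape of Apify.pushData() and dataset.getData(); the real methods are asynchronous and persist data to disk or the Apify cloud.

```javascript
// Minimal sketch of a dataset: an append-only store of JSON records.
// It mimics the shape of Apify.pushData() / dataset.getData(); the real
// SDK methods are async and persist to disk or the Apify cloud.
class Dataset {
    constructor() { this.items = []; }
    // Accepts a single object or an array of objects, like pushData().
    pushData(data) {
        const records = Array.isArray(data) ? data : [data];
        for (const record of records) {
            // Datasets hold plain JSON, so reject non-serializable input early.
            this.items.push(JSON.parse(JSON.stringify(record)));
        }
    }
    getData() { return { items: this.items }; }
}

const dataset = new Dataset();
dataset.pushData({ url: 'https://example.com', title: 'Example Domain' });
dataset.pushData([{ url: 'https://example.com/a' }, { url: 'https://example.com/b' }]);
console.log(dataset.getData().items.length); // 3
```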
Basic crawler
This is the most bare-bones example of the Apify SDK, demonstrating some of its building blocks, such as the BasicCrawler class.
Call actor
This example demonstrates how to start an Apify actor using Apify.call() and how to call the Apify API using the Apify.client object.
Capture a screenshot
To run this example on the Apify Platform, select the apify/actor-node-puppeteer-chrome image for your Dockerfile.
Cheerio crawler
This example demonstrates how to use CheerioCrawler to crawl a list of URLs from an external file, load each URL using a plain HTTP request, parse the HTML with Cheerio, and extract data from it.
Crawl all links on a website
This example uses the Apify.enqueueLinks() method to add new links to the RequestQueue as the crawler navigates from page to page. If only the required parameters are provided, all links found on the page will be enqueued.
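As a rough illustration of what "enqueue all links" involves, the sketch below collects every href on a page, resolves it against the page URL, and deduplicates, using only Node's built-in URL class. The real Apify.enqueueLinks() additionally filters the links and adds them to a RequestQueue; the HTML snippet here is made up for the demo.

```javascript
// Sketch of the core of "enqueue all links on a page": collect every href,
// resolve it against the page URL, and deduplicate the results.
function extractLinks(html, pageUrl) {
    const links = new Set();
    for (const match of html.matchAll(/href="([^"]+)"/g)) {
        try {
            links.add(new URL(match[1], pageUrl).href); // resolves relative hrefs too
        } catch (e) { /* skip malformed URLs */ }
    }
    return [...links];
}

const html = '<a href="/about">About</a> <a href="https://example.com/">Home</a> <a href="/about">Dup</a>';
const links = extractLinks(html, 'https://example.com/');
console.log(links); // [ 'https://example.com/about', 'https://example.com/' ]
```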
Crawl multiple URLs
This example crawls the specified list of URLs.
Crawl a website with relative links
If a website uses relative links, CheerioCrawler and Apify.enqueueLinks() may have trouble following them. This is because, unlike a browser, the crawler does not automatically resolve a relative path against the URL of the page it was found on; supplying that page URL via the baseUrl option of Apify.enqueueLinks() solves the problem.
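The root cause is easy to see with Node's built-in URL class, which performs exactly the resolution the crawler needs: combining a relative link with the URL of the page it appeared on. The base URL below is made up for the demo.

```javascript
// A relative href carries no domain; it only makes sense relative to the
// page it came from. Node's URL class does that resolution:
const base = 'https://example.com/catalog/index.html';

console.log(new URL('/products?page=2', base).href);
// https://example.com/products?page=2   (root-relative)

console.log(new URL('details.html', base).href);
// https://example.com/catalog/details.html   (resolved against the directory)

console.log(new URL('../about', base).href);
// https://example.com/about   ("../" climbs one directory)
```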
Crawl a single URL
This example uses the Apify.utils.requestAsBrowser() function to grab the HTML of a web page.
Crawl a sitemap
This example downloads and crawls the URLs from a sitemap.
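For illustration, a sitemap is plain XML whose <loc> elements hold the URLs. The sketch below shows only the extraction step (the crawler then enqueues each extracted URL); the sitemap content is made up for the demo.

```javascript
// Sketch of sitemap parsing: pull every URL out of the <loc> elements.
function parseSitemap(xml) {
    return [...xml.matchAll(/<loc>\s*([^<\s]+)\s*<\/loc>/g)].map((m) => m[1]);
}

const sitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/contact</loc></url>
</urlset>`;

console.log(parseSitemap(sitemap));
// [ 'https://example.com/', 'https://example.com/contact' ]
```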
Crawl some links on a website
This CheerioCrawler example uses the pseudoUrls property of the Apify.enqueueLinks() method to only enqueue links that match the specified patterns.
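For illustration, a pseudo-URL is a URL pattern in which the parts enclosed in [brackets] are regular expressions and everything else is matched literally. Apify's PseudoUrl class implements the matching; the converter below is only a sketch of the core idea.

```javascript
// Sketch of pseudo-URL matching: [bracketed] parts are regexes,
// everything else is matched literally.
function pseudoUrlToRegExp(pseudoUrl) {
    let pattern = '';
    // Split into literal chunks and [regex] chunks.
    for (const part of pseudoUrl.split(/(\[[^\]]*\])/)) {
        if (part.startsWith('[') && part.endsWith(']')) {
            pattern += `(?:${part.slice(1, -1)})`; // keep the inner regex as-is
        } else {
            pattern += part.replace(/[.*+?^${}()|\\\/]/g, '\\$&'); // escape literals
        }
    }
    return new RegExp(`^${pattern}$`);
}

const re = pseudoUrlToRegExp('https://example.com/[(articles|news)]/[.+]');
console.log(re.test('https://example.com/articles/my-post')); // true
console.log(re.test('https://example.com/admin/login'));      // false
```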
Forms
This example demonstrates how to use PuppeteerCrawler to automatically fill in and submit a search form to look up repositories on GitHub.
Handle broken links
This example uses the handleFailedRequestFunction option to log failed requests. In a real-world project, you might instead keep track of these failed requests, e.g. by saving them to a dataset, so they can be inspected or retried later.
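For illustration, the sketch below models the retry-then-fail flow in plain JavaScript: a request is attempted up to 1 + maxRetries times, and only after the final failure is the failed-request handler invoked. It is synchronous for brevity (the real crawler is asynchronous), and loadPage/alwaysFails are made-up stand-ins for the real page load.

```javascript
// Sketch of the retry-then-fail flow: retry up to maxRetries times,
// then hand the request to the failed-request handler.
function runRequest(request, loadPage, handleFailed, maxRetries = 3) {
    for (let attempt = 1; attempt <= 1 + maxRetries; attempt++) {
        try {
            return loadPage(request);
        } catch (err) {
            if (attempt === 1 + maxRetries) handleFailed(request, err); // give up
        }
    }
    return null;
}

// Usage: a URL that always errors ends up in the failures log.
const failures = [];
const alwaysFails = () => { throw new Error('HTTP 500'); };
runRequest({ url: 'https://example.com/broken' }, alwaysFails,
    (req, err) => failures.push(`${req.url}: ${err.message}`));
console.log(failures); // [ 'https://example.com/broken: HTTP 500' ]
```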
Dataset Map and Reduce methods
This example shows a simple use case of the Apify dataset map and reduce methods, which iterate over all of the items stored in a dataset.
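For illustration, the snippet below shows the map/reduce idea on plain data: map produces one derived value per item, while reduce folds all items into a single value. The page records are made up; the real dataset.map() and dataset.reduce() apply the same functions to stored dataset items, asynchronously.

```javascript
// Hypothetical page records, as a crawler might have pushed them.
const items = [
    { url: 'https://example.com/',      headingCount: 11 },
    { url: 'https://example.com/about', headingCount: 5 },
    { url: 'https://example.com/jobs',  headingCount: 8 },
];

// map: one derived value per item.
const counts = items.map((item) => item.headingCount);
console.log(counts); // [ 11, 5, 8 ]

// reduce: fold all items into one value (total headings across pages).
const total = items.reduce((memo, item) => memo + item.headingCount, 0);
console.log(total); // 24
```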
Playwright crawler
This example demonstrates how to use PlaywrightCrawler in combination with RequestQueue to recursively crawl a website using headless browsers controlled by Playwright.
Puppeteer crawler
This example demonstrates how to use PuppeteerCrawler in combination with RequestQueue to recursively crawl a website using headless Chrome / Puppeteer.
Puppeteer recursive crawl
Run the following example to perform a recursive crawl of a website using PuppeteerCrawler.
Puppeteer sitemap
This example demonstrates how to use PuppeteerCrawler to crawl a list of web pages specified in a sitemap. The crawler extracts the title and URL of each page and stores them in the default dataset.
Puppeteer with proxy
This example demonstrates how to load pages in headless Chrome / Puppeteer over Apify Proxy. To make it work, you'll need an Apify account with access to Apify Proxy.
Screenshots
This example demonstrates how to read and write data to the default key-value store using Apify.getValue() and Apify.setValue().
Synchronous run
This example shows a quick actor that has a run time of just a few seconds. It opens a web page (the Wikipedia home page), takes a screenshot, and saves it as the actor's output.
Use stealth mode
Stealth mode allows you to bypass anti-scraping techniques that use browser fingerprinting.