A web crawler is an automated script that "crawls" the web, visiting pages, images, and other content. Crawlers are most commonly run by search engines like Google and Bing to discover and index your content, but these scripts, often referred to as "bots", can technically be run by anyone.

SEO and performance tools such as Ahrefs, SEObility, and Morningscore also use web crawlers, for example to test page performance, detect index status, and map relationships between pages.

Most crawlers identify themselves when crawling a site, typically through the User-Agent header they send with every request.
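The exact strings vary by crawler and version, but the User-Agent values sent by Google's and Bing's crawlers typically look something like this:

```
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)
```

Checking the User-Agent (and, if needed, the requesting IP range) in your server logs is usually enough to tell which bots are visiting your site.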

You can place a robots.txt file at the root of your server to tell bots which folders and file types they may or may not crawl.
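As a minimal sketch (the paths and sitemap URL are just placeholders), a robots.txt like the one below asks all bots to skip an /admin/ folder and PDF files, and points them to a sitemap:

```
# Applies to all crawlers
User-agent: *
# Don't crawl anything under /admin/
Disallow: /admin/
# Don't crawl PDF files (the '*' and '$' wildcards are honored by major crawlers)
Disallow: /*.pdf$
# Where to find the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Keep in mind that robots.txt is advisory: well-behaved crawlers follow it, but it does not actually block access to your content.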