The crawler will scan your website and automatically add all found URLs to a Sitemap file. You only need to enter the website address and press the START button. The crawler's settings can be changed in Crawler Settings.
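The Sitemap file the crawler produces follows the standard sitemaps.org XML format, where each discovered URL becomes a `<url><loc>` entry. The sketch below is illustrative only; the function name and input list are assumptions, not part of the tool.

```python
# Illustrative sketch of a sitemaps.org-style Sitemap file: each URL the
# crawler discovers is written as a <url><loc> entry inside a <urlset>.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return sitemap XML for a list of discovered page URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

The `escape` call matters because URLs containing `&` must be XML-escaped to keep the Sitemap file valid.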

The crawler can be minimized to the tray during the scan. The scan can be stopped and resumed at any time by clicking STOP/START.

Crawler report

Here you can set the number of crawlers that will scan the website in parallel (Number of Crawlers). The more crawlers you set, the faster the site is scanned, but at the cost of a heavier load on your internet connection and on the website's server. The recommended number of crawlers is five.
The report also shows the number of pages queued for scanning, the number of indexed webpages, and the list of webpages currently being scanned.
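Conceptually, the parallel crawlers share one queue of pages to scan. A minimal sketch of that pattern is below; it is not the tool's actual code, and the `fetch` stub, variable names, and URLs are assumptions for illustration.

```python
# Sketch of several crawler workers draining a shared URL queue: more
# workers empty the queue faster, but issue more simultaneous requests.
import queue
import threading

NUM_CRAWLERS = 5          # the recommended setting from the manual
url_queue = queue.Queue()  # pages waiting to be scanned
indexed = []               # pages successfully scanned
lock = threading.Lock()

def fetch(url):
    # Placeholder for the real HTTP request and link extraction;
    # here it "discovers" no new links.
    return []

def worker():
    while True:
        try:
            url = url_queue.get_nowait()
        except queue.Empty:
            return  # queue drained, this crawler stops
        for found in fetch(url):
            url_queue.put(found)  # newly discovered pages join the queue
        with lock:
            indexed.append(url)
        url_queue.task_done()

for i in range(20):
    url_queue.put(f"https://example.com/page{i}")

threads = [threading.Thread(target=worker) for _ in range(NUM_CRAWLERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(indexed))
```

This is also why raising Number of Crawlers stresses the server: each extra worker is one more connection hitting the site at the same time.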

List of errors

In this area the crawler shows the list of errors discovered during the scan. These can be webpages that no longer exist or are currently unavailable, JavaScript commands, etc. Right-click an error row to open the following menu:
Crawler - Errors

Add to queue - put the webpage back into the queue for scanning.
Add to Sitemap - add the webpage to the Sitemap right away.
Go to page - open the webpage in the default browser.
Copy - copy the webpage address to the clipboard.

You can also export the list of errors to a CSV (comma-separated values) file by clicking EXPORT THE LIST OF ERRORS.
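An exported error list is an ordinary CSV file that spreadsheets can open. The sketch below shows the general shape of such a file; the column names and sample rows are assumptions, not the tool's exact export format.

```python
# Sketch of an error-list CSV export: one header row, then one row per
# error with the page URL and the reason it failed. Columns are illustrative.
import csv
import io

errors = [
    ("https://example.com/old-page", "404 Not Found"),
    ("https://example.com/tmp", "503 Service Unavailable"),
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["URL", "Error"])   # header row
writer.writerows(errors)            # one row per discovered error
print(buf.getvalue())
```

Because the format is plain comma-separated values, the exported file opens directly in Excel, LibreOffice, or any text editor.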

URL Count

The number of webpages that were successfully indexed and added to the Sitemap list. You can also view the list of webpages the crawler has added to the Sitemap.