- Project settings
- Create Sitemap - Quick start
- HTML Sitemap
- Add URL
- Generate URLs
- Submit Sitemap
The crawler will scan your website and automatically add all found URLs to a Sitemap file. You only need to enter the website address and press the START button. The crawler settings can be changed in Crawler Settings.
The crawler can be minimized to the tray during the scan. Crawling can be stopped and resumed at any time by clicking STOP/START.
Here you can set the number of crawlers that will scan the website concurrently (Number of Crawlers). The more crawlers, the faster the site is scanned, but at the cost of heavier load on your internet connection and the website server. The recommended number of crawlers is five.
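The crawler pool described above can be sketched as a fixed-size set of workers that fetch pages concurrently and feed newly discovered links back into the frontier. This is an illustrative model only, not the tool's actual internals; the link graph below stands in for real HTTP fetches.

```python
from concurrent.futures import ThreadPoolExecutor

NUM_CRAWLERS = 5  # the recommended setting

# Hypothetical link graph standing in for real page downloads.
LINKS = {
    "/": ["/about", "/blog"],
    "/about": [],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": [],
}

def fetch_links(url):
    # A real crawler would download the page and parse its <a> tags.
    return LINKS.get(url, [])

def scan_site(start, num_crawlers=NUM_CRAWLERS):
    sitemap, frontier = set(), [start]
    with ThreadPoolExecutor(max_workers=num_crawlers) as pool:
        while frontier:
            sitemap.update(frontier)
            # Fetch the whole frontier concurrently, one task per page;
            # more workers finish faster but hit the server harder.
            found = pool.map(fetch_links, frontier)
            frontier = [u for links in found for u in links if u not in sitemap]
    return sorted(sitemap)
```

Raising `num_crawlers` only changes how many pages are fetched in parallel; the set of discovered URLs is the same either way.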
In the report you can also see the number of pages queued for scanning, the number of indexed webpages, and the list of webpages currently being scanned.
List of errors
Add to queue - add the webpage back to the queue for scanning.
Add to Sitemap - add the webpage to the Sitemap immediately.
Go to page - open the webpage in the default browser.
Copy - copy the webpage address to the clipboard.
You can also export the list of errors to a CSV (comma-separated values) file by clicking EXPORT THE LIST OF ERRORS.
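The exported file is ordinary CSV, so it opens in any spreadsheet. A minimal sketch of such an export, assuming the error list is a sequence of (URL, error) pairs and the column headers are illustrative:

```python
import csv

def export_errors(errors, path):
    # errors: list of (url, error_message) tuples from the crawl report.
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["URL", "Error"])  # hypothetical header row
        writer.writerows(errors)
```

Using `csv.writer` rather than joining strings by hand keeps URLs containing commas correctly quoted.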
The number of webpages that were successfully indexed and added to the Sitemap list. You can also view the list of webpages the crawler added to the Sitemap.
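XML Sitemap files follow the sitemaps.org protocol: a `<urlset>` element containing one `<url>`/`<loc>` entry per indexed page. A minimal sketch of building such a file from the collected URL list (illustrative, not the tool's own code):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    # One <url><loc>...</loc></url> entry per indexed webpage,
    # with special characters XML-escaped as the protocol requires.
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )
```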