For some use cases you might want to scrape and save only unique or new URLs found during a crawl session. For example, if you are scraping a property listing site, you may only want to find new properties and save each one once.

We created this feature for exactly that use case.

How to use

Step 1

Go to Advanced settings

Step 2

Click on "Find & Save unique urls once". Once this is enabled, only new URLs will be saved. Any links that were already crawled in a past session will be ignored.

If you would like to reset and start again, use the reset option under the "More" button.
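The behavior described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the product's actual implementation: a set of previously seen URLs is persisted between sessions, only unseen URLs pass through, and a reset clears the stored history (the `UniqueUrlStore` class and file name are assumptions for the example).

```python
import json
from pathlib import Path


class UniqueUrlStore:
    """Illustrative sketch: persist URLs seen across crawl sessions
    so that only new URLs are saved (not the product's real code)."""

    def __init__(self, path="seen_urls.json"):
        self.path = Path(path)
        # Load URLs remembered from previous sessions, if any.
        self.seen = set(json.loads(self.path.read_text())) if self.path.exists() else set()

    def filter_new(self, urls):
        """Return only URLs not seen in any past session, then remember them."""
        new = [u for u in urls if u not in self.seen]
        self.seen.update(new)
        self.path.write_text(json.dumps(sorted(self.seen)))
        return new

    def reset(self):
        """Forget all previously seen URLs, like the reset option under 'More'."""
        self.seen.clear()
        if self.path.exists():
            self.path.unlink()
```

In a first session, every URL is new and gets saved; in a later session, URLs already in the store are skipped and only fresh ones come back.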
