Google Sitemap vs. Robots.txt
Do I need a robots.txt file if I have a sitemap?
Yes, you might. A Google sitemap serves a fundamentally different function from a robots file (or “robots.txt”). Your Google sitemap says what content can be crawled, while a robots.txt file says what must not be crawled.
Google is crawling pages that are not in my sitemap. Why?
Your sitemap provides hints to Google about which content to include when it crawls your website. It doesn’t stop Google from indexing pages that are not included in your sitemap; Google will continue to explore your website as it follows your links. Likewise, a sitemap file does not guarantee that Google will index every URL it contains.
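To see why a sitemap is only a hint, consider a minimal sitemap file. Each entry simply tells search engines a URL exists; nothing in the format forbids crawling other pages. The domain and dates below are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <url> entry is a suggestion, not a directive -->
  <url>
    <loc>https://www.example.com/products.html</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

If Google discovers https://www.example.com/specials.html through a link, it may still crawl and index that page even though it is not listed here.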
How can I stop Google from crawling certain pages?
You can prevent Google and other search engines from crawling specific content by creating a robots file. Compliant crawlers, including Googlebot, will not crawl the content listed in it. If you don’t want Google to crawl certain pages, you should include those URLs in your robots.txt file. You can use Inspyder Sitemap Creator to create a robots.txt file from your list of Excluded URLs.
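A simple robots.txt illustrates the idea. The paths below are hypothetical examples; each Disallow line tells compliant crawlers to stay out of that location:

```text
# Apply these rules to all crawlers
User-agent: *
Disallow: /private/
Disallow: /checkout.html

# You can also point crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The file must be placed at the root of your website (e.g. https://www.example.com/robots.txt) for crawlers to find it.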
About Inspyder Sitemap Creator
Inspyder Sitemap Creator is an easy-to-use Google sitemap generator that can also create properly formatted robots files. Sitemap Creator provides an affordable, effective way to create and update 100% Google-compatible sitemaps, no matter how big your website is or how many websites you have.