Can we list URLs on a website sitemap page which are blocked by robots.txt?
-
Hi,
I need your help here.
I have a website, and a few pages are created for specific countries (e.g. www.example.com/uk).
I have blocked many of these country-specific pages in my robots.txt file. Is it advisable to list those URLs (blocked by robots.txt) on my website sitemap (the HTML sitemap page)?
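For illustration, the robots.txt rules for blocking a country section would usually look something like this (a minimal sketch; the /uk/ path comes from the example above, and /fr/ is purely hypothetical):

User-agent: *
# Illustrative paths only - adjust to however the country sections are actually structured
Disallow: /uk/
Disallow: /fr/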
I really appreciate your help.
Thanks,
Nilay
-
If the content is of benefit to the user, then include those pages in your navigation. Why are you blocking them in the first place? Duplicate content?
-
Hi Zora,
But the pages which I have blocked are only visible in a specific country. In addition, I have already blocked them, so do you think it is good to put those URLs on the website, or to link to them from within the website?
-
Hi Zora,
Thanks for your time.
-
Hi Jarno,
Thanks for your time.
Should I put URLs anywhere on my website which are being excluded by robots.txt?
Thanks,
Nilay
-
Hi Nilay,
I actually did this yesterday by accident.
I recommend you remove the blocked pages from your XML sitemap, otherwise Google will display a "warning" after you submit it.
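For reference, the kind of XML sitemap entry to remove might look something like this (a hypothetical sketch using the /uk/ URL from the question):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Remove entries for pages blocked in robots.txt, such as this one -->
  <url>
    <loc>https://www.example.com/uk/</loc>
  </url>
</urlset>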
As for your HTML sitemap, it does not really matter; I think you are okay to keep the links there.
-
Nilay,
If you have blocked them in your robots.txt but you do include them in your XML or HTML sitemap, then they will be indexed unless you include a meta robots tag with noindex on those pages. If that tag is not on the pages and you include them in your sitemap, Google will treat them as important content and list them in the SERPs.
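For example, that tag sits in the <head> of each blocked page and might look something like this (a minimal sketch):

<head>
  <!-- Ask search engines not to index this page, while still following its links -->
  <meta name="robots" content="noindex, follow">
</head>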
Hope this helps
Regards
Jarno
Related Questions
-
How do I create multiple page URLs that are optimized for location and keywords that may be overlapping or the same?
Hi guys, I am attempting to create unique URLs for several different pages on a website. Let's say hypothetically that this is a website for a chain of ice cream shops in Missouri, and that they have 15 locations in Springfield, Missouri. I would ideally like to optimize our ice cream shop pages in Springfield, Missouri for the main keyword (ice cream) but also the geo-specific location (Springfield), but we obviously can't have duplicate URLs for these 15 locations. We also have several secondary keywords, think things like frozen yogurt or waffle cone, that we can also use, although it would most likely be more powerful if we use the primary keyword. Any suggestions for how to go about doing this most effectively? Thanks!
On-Page Optimization | GreenStone
-
How to optimize WordPress Pages with Duplicate Page Content?
I found the non-WWW and WWW duplicate page URLs only, more than a thousand pages.
On-Page Optimization | eigital
-
Why is this page not ranking?
Can you please tell me why this page is not ranking: http://goo.gl/BqoRT. The page doesn't rank at all for its keywords, and even if I copy a line or two of its text, it still doesn't rank for that text. Any help will be much appreciated.
On-Page Optimization | JillB2013
-
Too Many On-Page Links, Can You HELP???
This is the best architecture I found to help my visitors find their furnace filter size. Does it hurt my SEO? The index page has 210 links and most other pages have 190 links. Thank you, BigBlaze
On-Page Optimization | BigBlaze205
-
Why do I hear that it can be a bad thing to have too many content pages?
My site has lots of long-tail opportunity, so the intention is to produce as much content as possible to target these keywords effectively. However, over the last 6 months I have come across numerous sources suggesting that Google can/does look down on a site having too many content pages. Is there any truth in this? Thanks.
On-Page Optimization | Clicksjim
-
If a site has https versions of every page, will the search engines view them as duplicate pages?
A client's site has HTTPS versions of every page, and it is possible to view both the HTTP and HTTPS versions of each page. Do the search engines view this as duplicate content?
On-Page Optimization | harryholmes007
-
3 keywords to optimize the home page for. Should I create pages with those keywords or leave it like this?
My online store home page, Furnace Filters Canada, has 3 keywords with good rankings in google.ca. The keyword "furnace filters canada" ranks #1 in google.ca, and the keywords "furnace filters" and "furnace filter" are in 5th or 6th position on page 1 of google.ca. Those keywords bring most of the traffic to our site. To achieve this ranking, I had to use the On-Page Keyword Optimization tool from SEOmoz. Questions: Is it possible for me to create a page with the URL https://www.furnacefilterscanada.com/Furnace-Filters or https://www.furnacefilterscanada.com/Furnace-Filter? Can this improve my ranking for keywords like "furnace filters" and "furnace filter", or is this a waste of time? If I decide to create a new page for optimization, do I have to create one for the singular and another one for the plural? Creating a new page also means removing "Furnace Filter" from the home page title; until the new pages are indexed, I'm afraid of losing that 5th position in Google. Should I leave the home page title like it is now: "Furnace Filter - Furnace Filters Canada - Online Shopping Store"? NOTE: we only do business in Canada; that is why Google.ca is more important to us. Thank you, Jean Nichols
On-Page Optimization | BigBlaze205
-
What reasons exist to use noindex / robots.txt?
Hi everyone. I realise this may appear to be a bit of an obtuse question, but that's only because it is an obtuse question. What I'm after is a cataloguing of opinion - what reasons have SEOs had to implement noindex or add pages to their robots.txt on the sites they manage?
On-Page Optimization | digitalstream