Can we list URLs on a website sitemap page that are blocked by robots.txt?
-
Hi,
I need your help here.
I have a website, and a few pages are created for specific countries (e.g., www.example.com/uk).
I have blocked many of these country-specific pages in the robots.txt file. Is it advisable to list those URLs (the ones blocked by robots.txt) on my website's HTML sitemap page?
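(For illustration only: a robots.txt rule that blocks a country-specific folder such as /uk/ would look roughly like the two lines below; the exact user-agent and paths depend on how the site is actually set up.)

User-agent: *
Disallow: /uk/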
I really appreciate your help.
Thanks,
Nilay
-
If the content is of benefit to the user, then include those pages in your navigation. Why are you blocking them in the first place? Duplicate content?
-
Hi Zora,
But the pages I have blocked are only visible in a specific country. In addition, I have blocked them, so do you think it's a good idea to put those URLs on the website or link to them from within the website?
-
Hi Zora,
Thanks for your time.
-
Hi Jarno,
Thanks for your time.
Should I put URLs that are excluded by robots.txt anywhere on my website?
Thanks,
Nilay
-
Hi Nilay,
I actually did this yesterday by accident.
I recommend you remove the blocked pages from your XML sitemap; otherwise Google Search Console will show a warning that the sitemap contains URLs blocked by robots.txt after you submit it. As far as your HTML sitemap goes, it does not really matter, and I think you are okay to keep the links there.
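If it helps, here is a rough Python sketch for checking which URLs in an XML sitemap are disallowed by robots.txt before you submit it. The sitemap location and the user agent are assumptions; adjust them for your own site.

# Rough sketch: report which sitemap URLs are disallowed for Googlebot.
# The sitemap URL below is an assumption -- point it at your own file.
import urllib.robotparser
import urllib.request
import xml.etree.ElementTree as ET

SITE = "https://www.example.com"
SITEMAP_URL = SITE + "/sitemap.xml"  # assumed location

rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()  # download and parse robots.txt

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

# Standard sitemap namespace; <loc> elements hold the page URLs.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
for loc in tree.getroot().findall(".//sm:loc", ns):
    url = loc.text.strip()
    if not rp.can_fetch("Googlebot", url):
        print("Blocked by robots.txt:", url)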
-
Nilay,
If you have blocked them in your robots.txt but you do include them in your XML or HTML sitemap, then they can still be indexed unless you include a meta robots tag with noindex on those pages. If that tag is not on the pages and you include them in your sitemap, Google will treat them as important content and list them in the SERPs.
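For reference, the standard form of that tag in the page's head section is:

<meta name="robots" content="noindex">

Keep in mind that a crawler can only see this tag on pages it is allowed to crawl, so a URL that is disallowed in robots.txt but still listed in a sitemap can end up in the results as a URL-only entry.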
Hope this helps
Regards
Jarno