Can we list URLs on a website sitemap page that are blocked by robots.txt?
-
Hi,
I need your help here.
I have a website, and a few pages are country-specific (e.g. www.example.com/uk).
I have blocked many of these country-specific pages in my robots.txt file. Is it advisable to list those URLs (the ones blocked by robots.txt) on my website's sitemap (HTML sitemap page)?
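For reference, the blocking in my robots.txt looks roughly like this (the paths here are just examples, not my real ones):

```
User-agent: *
Disallow: /uk/
Disallow: /fr/
Disallow: /de/
```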
I really appreciate your help.
Thanks,
Nilay
-
If the content is of benefit to the user, then include the pages in your navigation. Why are you blocking them in the first place? Duplicate content?
-
Hi Zora,
But the pages I have blocked are only visible in a specific country. In addition, I have blocked them, so do you think it's good to put those URLs on the website, or to link to them from within the website?
-
Hi Zora,
Thanks for your time.
-
Hi Jarno,
Thanks for your time.
Should I put URLs anywhere on my website that are being excluded by robots.txt?
Thanks,
Nilay
-
Hi Nilay,
I actually did this yesterday by accident.
I recommend you remove the blocked pages from your XML sitemap; otherwise Google will display a "warning" after you submit it. As far as your HTML sitemap goes, it does not really matter.
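In case it's useful, here is a minimal sketch of how you could check which sitemap URLs are blocked by robots.txt before submitting. It uses only Python's standard library, and the rules and URLs are hypothetical placeholders, not your actual site:

```python
import urllib.robotparser
import xml.etree.ElementTree as ET

# Hypothetical robots.txt rules (replace with your real file's contents).
ROBOTS_TXT = """\
User-agent: *
Disallow: /uk/
"""

# Hypothetical XML sitemap (replace with your real sitemap).
SITEMAP_XML = """\
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/uk/</loc></url>
</urlset>
"""

def blocked_urls(robots_txt, sitemap_xml, agent="*"):
    """Return the sitemap URLs that robots.txt disallows for the given agent."""
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    # Pull every <loc> out of the sitemap, honoring the sitemaps.org namespace.
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(sitemap_xml)
    locs = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
    # Keep only the URLs the parser says we may not fetch.
    return [url for url in locs if not parser.can_fetch(agent, url)]

# The /uk/ URL is disallowed, so it is the one to drop from the XML sitemap.
print(blocked_urls(ROBOTS_TXT, SITEMAP_XML))
```

Anything this prints is a candidate for removal from the XML sitemap before you resubmit it.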
I think you are okay to keep the links there.
-
Nilay,
If you have blocked them in your robots.txt but you do include them in your XML or HTML sitemap, then they may still be indexed, unless you include a meta robots tag with noindex in them (note that a crawler has to be able to fetch a page to see that tag). If that tag is not on the pages and you include those pages in your sitemap, Google will treat them as important content and list them in the SERPs.
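To be clear, the tag I mean goes in the `<head>` of each page you want kept out of the index:

```html
<meta name="robots" content="noindex">
```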
Hope this helps
Regards
Jarno