XML Sitemap Issue or not?
-
Hi Everyone,
I submitted a sitemap in Google Webmaster Tools and received a warning listing 38 issues.
Issue: Url blocked by robots.txt.
Description: Sitemap contains urls which are blocked by robots.txt.
Example: the URLs given were ones we don't want indexed anyway:
Sitemap: www.example.org/author.xml
Value: http://www.example.org/author/admin/
My concern is that the number of URLs indexed is pretty low, and I know a robots.txt file can hurt you if it blocks URLs that need to be indexed. The blocked URLs shown do seem to be ones we don't want indexed, but the report doesn't display every URL that is blocked.
Do you think I have a major problem, or is everything fine? What should I do? How can I fix it?
FYI: our website runs on WordPress.
Thanks
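One quick way to see whether a given URL is actually caught by your robots.txt rules is Python's standard urllib.robotparser. A minimal sketch, assuming rules modeled on a typical WordPress robots.txt plus the /author/ block implied by the warning (your live file may differ):

```python
from urllib import robotparser

# Assumed rules: a typical WordPress robots.txt plus an /author/ block.
rules = [
    "User-agent: *",
    "Disallow: /wp-admin/",
    "Disallow: /author/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The URL from the warning, plus a normal post URL for contrast.
for url in ("http://www.example.org/author/admin/",
            "http://www.example.org/blog/some-post/"):
    status = "blocked" if not rp.can_fetch("*", url) else "allowed"
    print(url, "->", status)
```

If a URL you want indexed comes back "blocked", the robots.txt rule, not the sitemap, is the thing to fix.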
-
Hi Dan
Thanks for your answer. Would you really recommend using the plugin instead of just uploading the XML sitemap directly to the website's root directory? If yes, why?
Thanks
-
Lisa
I would honestly switch to the Yoast SEO plugin. It handles the SEO (and robots.txt) a lot better, as well as the XML sitemaps all within that one plugin.
I'd check out my guide for setting up WordPress for SEO on the moz blog.
Most WP robots.txt files will look like this:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```

And that's it.
You could always just try changing yours to the above setting first, before switching to Yoast SEO - I bet that would clear up the sitemap issues.
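To confirm the warnings really come from robots.txt, you can cross-check every sitemap URL against the current rules with the Python standard library. A sketch using a hypothetical two-URL sitemap and the /author/ rule from the warning; in practice you'd fetch your real sitemap.xml and robots.txt:

```python
import xml.etree.ElementTree as ET
from urllib import robotparser

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Hypothetical sitemap fragment; swap in your real sitemap.xml.
sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.org/blog/some-post/</loc></url>
  <url><loc>http://www.example.org/author/admin/</loc></url>
</urlset>"""

# Assumed current rules; swap in the contents of your live robots.txt.
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /author/"])

locs = [el.text for el in ET.fromstring(sitemap).iter(NS + "loc")]
blocked = [u for u in locs if not rp.can_fetch("*", u)]
print(blocked)  # every sitemap URL that robots.txt would block
```

An empty list after loosening the rules means the "blocked by robots.txt" warnings should clear on the next crawl.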
Hope that helps!
-Dan
-
Lisa, try checking manually which URLs are not getting indexed in Google. Make sure you do not have any nofollow tags on those pages. If all the pages are connected / linked together, then Google will crawl your whole site eventually; it's just a matter of time.
-
Hi
When generating a sitemap, xml-sitemaps.com detects 46 URLs, but when I add the sitemap to WMT only 12 get submitted and 5 are indexed, which is really worrying me. This might be because of the XML sitemap plugin that I installed; maybe something is wrong with my settings (docs 1 & 2 attached).
I am kind of lost, especially since SEOmoz hasn't detected any URLs blocked by robots.txt.
It would be great if you could tell me what I should do next.
Thanks
-
The first question I would ask is how big the difference is. If there is a large gap between the number of pages on your site and the number indexed by Google, then you have an issue. The blocked pages might be the ones linking to the pages that have not been indexed, causing the problem. Try removing the nofollow on those pages and then resubmit your sitemap to see if that fixes the issue. Also, double-check your sitemap to make sure you have correctly added all your pages to it.
-
Related Questions
-
XML sitemap and rel alternate hreflang requirements for Google Shopping
Our company implemented Google Shopping for our site for multiple countries, currencies, and languages. Every combination of language and country is accessible via a URL path, for all site pages, not just the pages with products for sale. I was not part of the project. We support 18 languages and 14 shop countries. When the project was finished we had a total of 240 language/country combinations listed in our rel alternate hreflang tags for every page, 240 language/country combinations in our XML sitemap for each page, and unique canonicals for every one of these pages. My concern is with duplicate content. Also, I can see odd language/country URL combinations (like a country with a language spoken by a very low percentage of people in that country) being crawled, indexed, and appearing in SERPs. This uses up my crawl budget on pages I don't care about. I don't think it is wise to disallow URLs in robots.txt that we are simultaneously listing in the XML sitemap. Is it true that Google Shopping requires an XML sitemap and rel alternate hreflang entries for every language/country combination?
Technical SEO | awilliams_kingston0
-
Duplicate Content Issues with Pagination
Hi Moz Community, We're an eCommerce site, so we have a lot of pagination issues, but we were able to fix them using the rel=next and rel=prev tags. However, our pages have an option to view 60 items or 180 items at a time. This is now causing duplicate content problems when, for example, page 2 of the 180-item view is the same as page 4 of the 60-item view (URL examples below). Wondering if we should just add a canonical tag pointing to the main view-all page on every page in the paginated series to get rid of this issue. https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2 https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4 Thoughts, ideas, or suggestions are welcome. Thanks
Technical SEO | znotes0
-
Sitemap Question - E-commerce - Magento
Good Morning... I have an ecommerce site running on Magento, and the sitemap is automatically generated by Magento based on the categories, sub-categories, and products. I have recently created new categories that I want to replace the old categories, but both are in the auto-generated sitemap. The old categories are "active" (as in, they still exist if you know the URL to type) but not visible (you can't find them just by navigating through the site). The new category pages are active and visible. If I want Google to rank one page (the new category page) and not the old page (the old category page), should I remove the old page from the sitemap? Would removing the old page that used to target the same keywords improve my rankings on the newer category page? Sitemap currently contains: www.example.com/oldcategorypage www.example.com/newcategorypage Did I confuse you yet? Any help or guidance is appreciated. Thanks,
Technical SEO | Prime850
-
Best practice for XML sitemap depth
We run an eCommerce site for education products with 20 or so subject-based catalogues (Maths, Literacy, etc.), each catalogue having numerous ranges (Counting, Maths Games, etc.), then products within those. We carry approximately 15,000 products. My question is about the sitemap we submit (nightly) and its depth. It is currently set to cover the home page, catalogues, and ranges, plus all static content (about us, etc.). Should we be submitting sitemaps that include product pages as well? Does it matter, or would it not make much difference in terms of search? Thanks in advance.
Technical SEO | TTS_Group0
-
What's the issue?
Hi, We have a client who dropped in the rankings (initially from the bottom of the first page to page 3, and now page 5) for a single keyword (their most important one, targeted on their homepage) back in the middle of March. So far, we've found that the issue isn't any of the following:
- Keyword stuffing on the page
- External anchor text pointing to the page
- Internal anchor text pointing to the page
In addition, the drop didn't coincide with Panda or Penguin. Any other ideas as to what could cause such a drop for a single keyword (other related rankings haven't moved)? We're starting to think this may just have been another small change in the algorithm, but it seems like too big a drop in a short space of time for that to be the case. Any thoughts would be much appreciated! Thanks.
Technical SEO | jasarrow0
-
Xml Sitemap
Hi mozzers, I am about to submit a sitemap for one of my clients via Webmaster Tools. The issue is that I have way too many URLs that I don't want indexed by Google, such as testing pages and auto-generated pages. Is there a way to remove certain URLs from the XML sitemap, or is this impossible? If impossible, is the only way to control these URLs to "noindex" all the pages that I don't want the search engines to see? Thanks Mozzers,
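If the generator offers no exclusion setting, one option is to filter the XML before submitting it. A rough sketch with Python's ElementTree, where the hypothetical /test/ and /auto/ patterns stand in for the testing and auto-generated pages to drop:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the plain xmlns on output

# Hypothetical sitemap; in practice, load the generated sitemap.xml.
sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/products/widgets/</loc></url>
  <url><loc>http://example.com/test/page-1/</loc></url>
</urlset>"""

EXCLUDE = ("/test/", "/auto/")  # assumed URL patterns to drop

root = ET.fromstring(sitemap)
for url in list(root):  # copy the children so we can remove while iterating
    loc = url.find("{%s}loc" % NS).text
    if any(pattern in loc for pattern in EXCLUDE):
        root.remove(url)

filtered = ET.tostring(root, encoding="unicode")
print(filtered)  # sitemap with the excluded URLs removed
```

The filtered output is what you'd upload and submit; the excluded pages can then be handled with noindex if you also want them out of the index, not just out of the sitemap.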
Technical SEO | Ideas-Money-Art0
-
Any issues with lots of pages issuing 301 redirects?
Hi all, I'm working on a site redesign and it is possible the new site could issue a lot of 301 redirects as we may migrate from one forum solution to another. Is there any issue with crawlers getting a lot of 301 redirects from a site? Thanks Nick
Technical SEO | nickswan0
-
Should XML sitemaps include *all* pages or just the deeper ones?
Hi guys, OK, this is a bit of a sitemap 101 question, but I can't find a definitive answer: when we're running out XML sitemaps for Google to chew on (we're talking ecommerce and directory sites with many pages inside sub-categories here), is there any point in mentioning the homepage or even the second-level pages? We know Google is crawling and indexing those, and we're thinking we should trim the fat and just send a map of the bottom-level pages. What do you think?
Technical SEO | timwills0