XML Sitemap Issue or not?
-
Hi Everyone,
I submitted a sitemap in Google Webmaster Tools and got a warning message with 38 issues.
Issue: URL blocked by robots.txt.
Description: Sitemap contains URLs which are blocked by robots.txt.
Example: the URLs given were ones we don't want indexed. Sitemap: www.example.org/author.xml
Value: http://www.example.org/author/admin/
My concern is that the number of URLs indexed is pretty low, and I know that robots.txt can do real damage if it blocks URLs that need to be indexed. The blocked URLs shown do seem to be ones we don't want indexed, but the report doesn't display all of the URLs that are blocked.
Do you think I have a major problem, or is everything fine? What should I do? How can I fix it?
FYI: we use WordPress for our website.
Thanks
-
Hi Dan
Thanks for your answer. Would you really recommend using the plugin instead of just uploading the XML sitemap directly to the website's root directory? If so, why?
Thanks
-
Lisa
I would honestly switch to the Yoast SEO plugin. It handles SEO (and robots.txt) a lot better, and it generates the XML sitemaps too, all within that one plugin.
I'd check out my guide to setting up WordPress for SEO on the Moz Blog.
Most WP robots.txt files will look like this:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
And that's it.
You could always just try changing yours to the above setting first, before switching to Yoast SEO - I bet that would clear up the sitemap issues.
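If you want to double-check for yourself which of the flagged URLs the current robots.txt actually blocks before changing anything, here's a rough sketch in Python (example.org and the URLs below are placeholders for your own site):

```python
# Rough sketch: test specific URLs against a live robots.txt.
# example.org and the URLs below are placeholders - swap in your own.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("http://www.example.org/robots.txt")
rp.read()  # fetches and parses the robots.txt file

for url in [
    "http://www.example.org/author/admin/",
    "http://www.example.org/a-normal-post/",
]:
    status = "allowed" if rp.can_fetch("*", url) else "blocked"
    print(url, "->", status)
```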
Hope that helps!
-Dan
-
Lisa, try checking manually which URLs are not getting indexed in Google. Make sure you do not have any nofollow directives on those pages. If all the pages are connected/linked together, Google will crawl your whole site eventually; it's just a matter of time.
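To check a specific page by hand for a noindex or nofollow directive, a minimal sketch (Python; the URL below is a placeholder):

```python
# Rough sketch: fetch a page and print its meta robots directives, if any.
# The URL is a placeholder; the regex assumes name="robots" appears before content.
import re
import urllib.request

url = "http://www.example.org/some-page/"
html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")

match = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']+)["\']',
    html,
    re.IGNORECASE,
)
print(match.group(1) if match else "no meta robots tag found")
```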
-
Hi
When generating the sitemap with xml-sitemaps.com, 46 URLs are detected, but when I add the sitemap to WMT only 12 get submitted and 5 are indexed, which is really worrying me. This might be because of the XML sitemap plugin I installed; maybe something is wrong with my settings (docs 1 & 2 attached).
I am kind of lost, especially since SEOmoz hasn't detected any URLs blocked by robots.txt.
It would be great if you could tell me what I should do next.
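For reference, here is a quick way to list exactly which URLs the plugin put into the generated sitemap, so it can be compared against the 46 that xml-sitemaps.com detected (a rough sketch in Python; the sitemap URL is a placeholder):

```python
# Rough sketch: download a sitemap.xml and list the URLs it contains.
# The sitemap URL is a placeholder - swap in the real one.
import urllib.request
import xml.etree.ElementTree as ET

sitemap_url = "http://www.example.org/sitemap.xml"
xml_data = urllib.request.urlopen(sitemap_url).read()

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in ET.fromstring(xml_data).findall(".//sm:loc", ns)]

print(len(urls), "URLs in the sitemap")
for u in urls:
    print(u)
```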
Thanks
-
The first question I would ask is how big the difference is. If there is a large gap between the number of pages on your site and the number indexed by Google, then you have an issue. The blocked pages might be the ones linking to the pages that have not been indexed, which would cause problems. Try removing the nofollow on those pages, then resubmit your sitemap and see if that fixes the issue. Also double-check your sitemap to make sure you have correctly added all the pages to it.
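If it helps, the checks above can be combined into one pass that cross-checks every sitemap URL against robots.txt, which is essentially what Webmaster Tools is reporting (a rough sketch in Python; the domain is a placeholder):

```python
# Rough sketch: flag every sitemap URL that robots.txt disallows.
# The domain is a placeholder - swap in your own site.
import urllib.request
import xml.etree.ElementTree as ET
from urllib.robotparser import RobotFileParser

site = "http://www.example.org"

rp = RobotFileParser(site + "/robots.txt")
rp.read()

xml_data = urllib.request.urlopen(site + "/sitemap.xml").read()
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in ET.fromstring(xml_data).findall(".//sm:loc", ns)]

blocked = [u for u in urls if not rp.can_fetch("*", u)]
print(f"{len(blocked)} of {len(urls)} sitemap URLs are blocked by robots.txt")
for u in blocked:
    print(u)
```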
Related Questions
-
Include or exclude noindex URLs in sitemap?
We just added tags to our pages with thin content. Should we include or exclude those URLs from our sitemap.xml file? I've read conflicting recommendations.
Technical SEO | vcj
-
Breadcrumb issue
The site has 2 main categories for scooters. One category is the Type of Scooter menu item with nested types, and the second is the Manufacturer menu item with nested makes, so all the scooters can be found in either category depending on how you search. The Manufacturer category is mainly thin content and set as noindex, as are the nested make categories. However, when searching for products, Google is invariably using the breadcrumb for the Manufacturer category rather than the Type of Scooter category, which is indexed. Should it be a concern that Google uses breadcrumbs of non-indexed URLs, even if they are followed and the site is therefore navigable?
Technical SEO | MickEdwards
-
Cross domain canonical issue
Hello fellow SEOs! I have a very intriguing question concerning different TLDs for the same domain, e.g. www.mainwebsite.com, www.mainwebsite.eu, www.mainwebsite.au, www.mainwebsite.co.uk, etc. Now, assuming that all these websites are similar in terms of content, will our lovely friend Google consider all these TLDs as one unique domain, or will this cause a duplicate content problem? If so, how should I fix it? Thanks for your precious help, guys!
Technical SEO | SEObandits
-
Duplicate Content Issue
SEOmoz is giving me a number of duplicate content warnings related to pages that have an "email a friend" and/or "email me when back in stock" version of a page. I thought I had those blocked via my robots.txt file, which contains the following:
Disallow: /EmailaFriend.asp
Disallow: /Email_Me_When_Back_In_Stock.asp
I had thought that the robots.txt file would solve this issue. Anyone have any ideas?
Technical SEO | WaterSkis.com
-
How to handle this specific duplicate title issue
Part of my website is a directory of companies. Some of the companies have many locations in the same city. For these listings, titles and URLs are like this:
1. Company ABC - Miami, FL http://www.website.com/florida/miami/company-abc-10001
2. Company ABC - Miami, FL http://www.website.com/florida/miami/company-abc-10002
What is the best way to fix this problem? Thank you
Technical SEO | Boxes
-
Sitemap.xml appearing in SERP
My sitemap.xml was appearing in the Google SERP for certain keywords (and not my actual on-site page). Please see image. I recently blocked my sitemap.xml with a robots.txt exclusion, but now the sitemap.xml is not getting crawled in Google Webmaster Tools. Is this the correct method of excluding the sitemap.xml from the SERP?
User-agent: *
Disallow: /assets/cache/
Disallow: /assets/docs/
Disallow: /assets/export/
Disallow: /assets/import/
Disallow: /assets/modules/
Disallow: /assets/plugins/
Disallow: /assets/snippets/
Disallow: /manager/
Disallow: /sitemap.xml
Sitemap: http://bryansryan.ie/sitemap.xml
Any suggestions what should be done here? Thanks.
Technical SEO | Socialdude
-
Duplicate Pages Issue
I noticed a problem and I was wondering if anyone knows how to fix it. I was generating a sitemap for 1oxygen.com, a site that has around 50 pages. The sitemap generator came back with over 2,000 pages. Here are two of the results:
http://www.1oxygen.com/portableconcentrators/portableconcentrators/portableconcentrators/services/rentals.htm
http://www.1oxygen.com/portableconcentrators/portableconcentrators/1oxygen/portableconcentrators/portableconcentrators/portableconcentrators/oxusportableconcentrator.htm
These are actually pages somehow. In my FTP, the first /portableconcentrators/ folder contains about 12 HTML documents and no other folders. It looks like it is creating a page for every possible folder combination. I have no idea why those pages above actually work. Help please?
Technical SEO | chuck-layton
-
No Sub-Categories in XML Sitemap
I have a couple of sites using 3dcart, the ecommerce platform. Their tech support recently told me that they do not list sub-categories in the XML sitemap, only products and top-tier categories. Am I the only one that sees a problem with this? Thanks
Technical SEO | poolguy