Sitemap issues: 19 warnings
-
Hi Guys
I seem to be having a lot of sitemap issues.
1. I have 3 top-level domains, and all of them show the .co.nz sitemap that was submitted.
2. I'm in the midst of a site redesign, so I'm unsure if I should be updating these now or when the new site goes live (in two weeks).
3. I have 19 warnings from GWT for the .co.nz site, and they gave me 3 examples that look like 404 errors. However, I'm not too sure, and I'm a bit green when it comes to finding out where the issues are and how to fix them. (It also shows that 95 pages were submitted and only 53 were indexed.)
4. I recently generated 2 sitemaps for the .com and .com.au sites and submitted both to Google, but when I check I still see the .co.nz sitemap.
Would love some guidance around this.
Thanks
-
Glad it was useful!
-
Oh, you are a genius yourself, Bob! Thanks for the great information!
I will look into this and let you know how I go. Thanks a bunch, you have really helped me move this along and weed out all the confusion!
-
Hi Justin,
In that case I would ask your developer to make the sitemap on the website update automatically (or generate a new one every day), and submit that link to Webmaster Tools. If he's a real genius, he could add your blog pages from WordPress to this sitemap as well, but I'm not sure if WordPress has a hook for this. (There's a rough sketch of what a daily generator could look like below the options.)
Alternative options:
- Let him make the automatically updated sitemap for the custom part of the website and use this combined with the sitemap from the Yoast plugin. You can upload both separately in Google Webmaster Tools; just make sure each has its own URL. In this case it's all automated and is just as good as the previous method.
- Keep updating your sitemap manually. Just make sure you don't use the Yoast sitemap while also including the blog posts in your Screaming Frog sitemap, since this would give duplicate input. If you choose to refresh your sitemap manually, I would disable the sitemap within the Yoast plugin and use the Screaming Frog sitemap, which should include your blog pages as well.
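To make that first option concrete, here is a minimal sketch of a daily sitemap generator, assuming the developer has some way of listing the pages that are currently live. The get_live_urls() helper, the example.co.nz URLs, and the output filename are placeholders, not your actual setup:

```python
# Rough sketch of a daily sitemap generator for the custom part of the site.
# get_live_urls(), the example.co.nz URLs and the output filename are placeholders.
from datetime import date
from xml.sax.saxutils import escape

def get_live_urls():
    # Placeholder: in practice this would query the CMS or database for
    # pages that are currently published.
    return [
        "https://www.example.co.nz/",
        "https://www.example.co.nz/about/",
    ]

def build_sitemap(urls, changefreq="daily"):
    # Build a standard <urlset> sitemap with today's date as lastmod.
    today = date.today().isoformat()
    entries = "\n".join(
        "  <url>\n"
        f"    <loc>{escape(u)}</loc>\n"
        f"    <lastmod>{today}</lastmod>\n"
        f"    <changefreq>{changefreq}</changefreq>\n"
        "  </url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

if __name__ == "__main__":
    with open("sitemap_custom.xml", "w", encoding="utf-8") as f:
        f.write(build_sitemap(get_live_urls()))
```

Run from a daily cron job (or on every publish), something like this always reflects the pages that actually exist, so removed pages drop out of the sitemap automatically.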
Good luck and let me know if this works for you!
-
Thanks a lot Dirk, your help has been tremendous for my SEO efforts!
-
Hi Bob
Thanks a lot for your response!
That makes a lot of sense. We use WordPress only for the blog, but the main site is custom built and doesn't have the Yoast plugin.
So I'm not sure how that will work. When I create the sitemap with Screaming Frog, do I need to include the blog pages in Screaming Frog if I'm using the Yoast plugin?
Thanks again for your help!
-
Yep - you'll have to upload the file to the server first.
Bob's suggestion to generate the sitemap via the Yoast plugin is an excellent idea.
rgds
Dirk
-
Hi Justin,
Thanks for the screenshots. Dirk's suggestion about Screaming Frog should be really helpful: it should give you insight into the true 404 errors that a bot can encounter while crawling through your internal site structure.
Based on what I see, I think your main problem is the manually updated sitemap. Whenever you change a page, add a new one, or mix up some categories, those changes won't be reflected in your sitemap. The stale URLs then create 404 errors, even though those pages aren't linked to from your website and (without the sitemap) wouldn't trigger any 404 error messages in Google Webmaster Tools.
I saw you were using SEO by Yoast already, so I suggest using its sitemap functionality. That should resolve the problem and save you work in the future, since there is no need to manually update your sitemap again.
Let me know if this works!
-
Hi Justin,
Could you post a screenshot of the error message and any links pointing to this URL? This way we can identify which pages return a 404. If these are important pages on your website, I would fix them right now; if, however, they are pages you don't use or your visitors rarely see, I would make sure you pick them up with the redesign. There's no point in fixing this now if things will change in the near future. Besides that, sitemaps help you get your website indexed; releasing this two weeks earlier won't make a big difference to the number of indexed pages, since you won't be changing your internal link structure and website authority (both of which help you get more pages indexed).
About your last point, could you provide me with a screenshot of this as well? When I check zenory.com/sitemap.xml I find the .com sitemap, so that part seems fine.
PS. I would suggest you change the update frequency in your sitemap. It now states monthly; it's probably a good idea to set this much higher, since there is a blog on your website as well. At the moment you are giving Google hints to only crawl your website a few times a month. Keep in mind that you can give different parts of your website a different change frequency. For example, I give pages with user-generated content a much higher change frequency than pages we need to update manually.
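As a purely illustrative sketch of that last idea (the path prefixes and frequencies below are made-up examples, not recommendations for your site), mapping sections to the standard changefreq values (always, hourly, daily, weekly, monthly, yearly, never) could look like this:

```python
# Illustration only: giving different sections different <changefreq> hints.
# The path prefixes and frequencies are made-up examples, not recommendations.
CHANGEFREQ_BY_SECTION = {
    "/blog/": "daily",       # frequently updated blog content
    "/reviews/": "hourly",   # user generated content changes often
    "/about/": "yearly",     # static pages that rarely change
}

def changefreq_for(url_path, default="weekly"):
    # Pick the frequency for the first matching section, else fall back.
    for prefix, freq in CHANGEFREQ_BY_SECTION.items():
        if url_path.startswith(prefix):
            return freq
    return default

print(changefreq_for("/blog/sitemap-tips/"))  # daily
print(changefreq_for("/contact/"))            # weekly
```

A sitemap generator can then call changefreq_for() per URL instead of hard-coding monthly everywhere.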
-
Hi Justin,
Are the URLs going to change when you update the design? If they are not changing, you can already update them now.
It's not really abnormal to have only a certain percentage of the sitemap indexed; it could be that Google judges a number of pages to be too light on content to be indexed. That said, 55% of URLs indexed seems rather low.
Sitemap errors: check the URLs that are listed as errors. If I am not mistaken, you use an external tool to generate the sitemaps. It could be that this tool puts all the internal links in the sitemap, regardless of their status (200, 301, 404); normally only URLs with status 200 should be put in the sitemap. Check the configuration of the tool you use and see if you can add only URLs with status 200. Alternatively, you can check the internal linking on your site and make sure that no links exist to 404 pages (Screaming Frog is the tool to use; it can also generate the sitemap).
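If you want a quick way to run that status check yourself, here is a rough standard-library Python sketch; the sitemap URL is just a placeholder and it assumes a plain <urlset> sitemap rather than a sitemap index:

```python
# Quick check: which URLs in an existing sitemap do NOT return a 200?
# SITEMAP_URL is a placeholder; assumes a plain <urlset>, not a sitemap index.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.co.nz/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    # Download the sitemap and pull out every <loc> entry.
    with urllib.request.urlopen(sitemap_url) as resp:
        tree = ET.fromstring(resp.read())
    return [loc.text.strip() for loc in tree.findall("sm:url/sm:loc", NS)]

def status_of(url):
    # A HEAD request is enough to read the status code.
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

if __name__ == "__main__":
    for url in sitemap_urls(SITEMAP_URL):
        code = status_of(url)
        if code != 200:
            print(code, url)
```

Note that urlopen follows redirects, so this mainly surfaces hard failures like 404s and 500s; a crawler like Screaming Frog will also show the 301s explicitly.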
For the wrong sitemap: as your sites are exact duplicates, probably hosted on the same server, it could be that the .co.nz sitemap overwrites the .com sitemap, as they have the same name. You could rename your sitemaps to something like sitemap_au.xml, sitemap_us.xml, and sitemap_nz.xml. This way, if you add a new sitemap for .co.nz, it will not overwrite the .com version. Submit these to Google and delete the old ones (both on the server and in Google WMT).
Hope this helps.
Dirk
PS. If the redesign is also changing the URLs, don't forget to put redirects in place that lead the old URLs to the new ones.