Issue with Site Map - how critical would you rank this in terms of needing a fix?
-
A problem has been introduced into our sitemap: URLs that were previously excluded are no longer being excluded correctly. These return an HTTP 400 Bad Request server response, although they do redirect correctly for users.
We have around 2300 pages of content, and around 600-800 of these previously excluded URLs.
An example would be http://www.naturalworldsafaris.com/destinations/africa-and-the-indian-ocean/botswana/suggested-holidays/botswana-classic-camping-safari/Dates and prices.aspx (the page does redirect correctly for users).
The site is currently being rebuilt, so the existing version only has a lifespan of a few months. With this in mind, the cost our current developers have quoted for resolving the issue seems quite high. I was just wondering:
-
How critical an issue would you consider this?
-
Would it be sufficient (bearing in mind this is an interim measure) to change these pages so that they have a canonical tag or a redirect? They would, however, remain in the sitemap.
Thanks
Kate -
-
Agree with Martijn. Technically, Google does not want any pages in the sitemap that return a 4xx or 3xx, or any response other than a 200.
I would say this: if your site is being rebuilt, having a sitemap that is accurate and that updates when you update the site is a basic requirement. The claim that fixing this issue must be expensive is baloney. It sounds like the devs did not build the site correctly the first time if they do not have a way to update the sitemap automatically.
You could generate the sitemap yourself, using tools like
http://tools.seochat.com/tools/online-crawl-google-sitemap-generator/
Or read tutorials on how to use Screaming Frog to create a sitemap
http://www.hmtweb.com/marketing-blog/dirty-sitemaps-how-to-download-crawl/
Frankly, the annual cost of Screaming Frog (about $150 a year) gets you so much more than just sitemaps. Buy Screaming Frog, have it generate your sitemap, and ask the devs to upload it. If you have a site with several thousand pages, just running Screaming Frog monthly would help you find issues on your site, which is well worth the cost. Search "Screaming Frog" here in the forums and you can see that this is one of the "Swiss Army knives" of technical SEO.
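If you'd rather not rely on a tool at all, auditing the sitemap for non-200 URLs is straightforward to script. A minimal sketch using only the Python standard library (the example.com sitemap URL is a placeholder; swap in your own):

```python
# Sketch: audit a sitemap for URLs that do not return HTTP 200,
# which is exactly the problem described above (400s in the sitemap).
import urllib.request
import urllib.error
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml: bytes) -> list:
    """Extract all <loc> values from a standard sitemap file."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def status_of(url: str) -> int:
    """Return the HTTP status code for a URL (4xx raises, so catch it)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

# Example usage (hypothetical sitemap location):
# xml_bytes = urllib.request.urlopen("http://www.example.com/sitemap.xml").read()
# bad = [u for u in sitemap_urls(xml_bytes) if status_of(u) != 200]
# print(f"{len(bad)} sitemap URLs do not return 200")
```

Run that against the live sitemap and you have the exact list of 600-800 offending URLs to hand back to the devs, which makes the "high cost" conversation a lot shorter.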
-
- It's not a perfect solution, and if you can cover the cost then I would always do it, but honestly I can't say this is very critical to fix right now. Google and other search engines want you to have a very high-quality sitemap, so that every page in it exists and works for them as well as for users, but if a certain percentage of them don't work, it won't get you into serious trouble, I'd say.
- I'm not really sure about this option, as it doesn't sound like an actual fix at all, so for now I would say don't do it.
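To make the "generate the sitemap yourself" suggestion above concrete: once you have the list of pages that do return a 200, producing a valid sitemap file from it is only a few lines. A rough sketch, again with placeholder URLs:

```python
# Sketch: build a minimal, spec-compliant sitemap containing only
# known-good (200-status) URLs, as an interim measure during the rebuild.
import xml.etree.ElementTree as ET

def build_sitemap(urls) -> bytes:
    """Return sitemap XML (bytes) listing the given URLs, one <url>/<loc> each."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="utf-8", xml_declaration=True)

# Example usage (hypothetical pages):
# xml_bytes = build_sitemap([
#     "http://www.example.com/",
#     "http://www.example.com/destinations/",
# ])
# open("sitemap.xml", "wb").write(xml_bytes)
```

Hand the resulting file to the devs to upload; regenerating it monthly (or after each content change) keeps the sitemap honest until the rebuilt site launches.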