The same day our sitemap finished processing in Google Webmaster Tools, our SERP results tanked!
-
Need a little help troubleshooting an SEO issue.
The day our sitemap finished processing in Google Webmaster Tools, almost all of our keyword SERP results tanked. Our top 4 keywords routinely placed between 11 and 33 in the SERPs, and now they're not even in the top 200. Would the sitemap processing have anything to do with this, or should I look somewhere else?
FYI: the site is built in DNN, the sitemap is fine, and the robots.txt file is good.
Open to all suggestions!!
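For anyone checking the same basics: "robots.txt is good" usually just means nothing important is disallowed and the sitemap is advertised. A minimal sketch (the paths are placeholders, not taken from the actual site):

```
# Minimal sketch -- paths are placeholders, not from the site in question
User-agent: *
# Keep crawlers out of back-end paths only; everything else stays crawlable
Disallow: /admin/

# Advertise the sitemap so crawlers can find it without Webmaster Tools
Sitemap: https://www.example.com/sitemap.xml
```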
-
Hey Ben,
Thanks for the reply, and I agree with you that it was pretty much coincidental. We have updated quite a few things on the site since then, so I am assuming that was the cause.
Thanks,
Charles
-
Hi Charles,
I wouldn't have thought a sitemap could result in a penalty, unless of course it linked to pages that violated Google's guidelines and hadn't been crawled before.
I would consider other things before sitemaps. What changes did you make to the site recently? Were there any algorithm changes around that time?
The last few months have been quite busy for the search engines; it's quite possible that it's just coincidental.
Ben
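If anyone wants to test Ben's first point concretely (whether the sitemap handed Google pages it had never crawled before), a rough sketch is to diff the sitemap's URL paths against Googlebot hits in a server access log. The sitemap URL, log location, and combined log format below are all assumptions:

```python
# Sketch: list sitemap URLs that Googlebot has never fetched, by diffing
# the sitemap against an access log. Paths and log format are assumptions.
import re
import urllib.request
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen("https://www.example.com/sitemap.xml") as resp:
    sitemap_paths = {urlparse(loc.text).path
                     for loc in ET.parse(resp).findall(".//sm:loc", NS)}

crawled = set()
with open("access.log") as log:  # placeholder path, combined log format
    for line in log:
        if "Googlebot" not in line:
            continue
        match = re.search(r'"(?:GET|HEAD) (\S+)', line)
        if match:
            crawled.add(urlparse(match.group(1)).path)

never_crawled = sitemap_paths - crawled
print(f"{len(never_crawled)} sitemap URLs with no Googlebot hit in this log")
for path in sorted(never_crawled)[:20]:
    print(path)
```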
Related Questions
-
Sitemaps, 404s and URL structure
Hi All! I recently acquired a client and noticed over 1,300 404s in Search Console, all starting around late October this year. What's strange is that I can access the pages that are 404ing by cutting and pasting the URLs, and via inbound links from other sites.

I suspect the issue might have something to do with sitemaps. The site has 5 sitemaps, generated by the Yoast plugin. 2 sitemaps seem to be working (pages being indexed); 3 seem to be broken (pages have warnings and errors, and nothing shows up as indexed). The pages listed in the 3 broken sitemaps seem to be the same pages giving 404 errors.

I'm wondering if automatic URL structure might be the culprit here. For example, one sitemap that works is called newsletter-sitemap.xml, and all the URLs listed follow the structure http://example.com/newsletter/post-title. Whereas one sitemap that doesn't work is called culture-event-sitemap.xml, and the URLs underneath follow the structure http://example.com/post-title. Could it be that these URLs are not being crawled / found because they don't follow the structure http://example.com/culture-event/post-title? If not, any other ideas? Thank you for reading this long post and helping out a relatively new SEO!
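One way to chase this down (a sketch under assumptions, not a Yoast-specific fix): pull the URLs from a working sitemap and a broken one, and compare how each URL responds to a browser-like request versus a Googlebot-like request. A status mismatch would explain pages that 404 for Search Console but load fine when pasted into a browser. Only the two sitemap names come from the question; everything else is a placeholder:

```python
# Sketch: compare browser-like vs. Googlebot-like HTTP status per URL.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
SITEMAPS = [
    "http://example.com/newsletter-sitemap.xml",     # indexing fine
    "http://example.com/culture-event-sitemap.xml",  # pages reported as 404
]
AGENTS = {
    "browser": "Mozilla/5.0",
    "googlebot": "Googlebot/2.1 (+http://www.google.com/bot.html)",
}

def fetch_status(url, user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent},
                                 method="HEAD")
    try:
        return urllib.request.urlopen(req).status
    except urllib.error.HTTPError as err:
        return err.code

for sitemap in SITEMAPS:
    with urllib.request.urlopen(sitemap) as resp:
        urls = [loc.text for loc in ET.parse(resp).findall(".//sm:loc", NS)]
    for url in urls[:10]:  # spot-check a few per sitemap
        codes = {name: fetch_status(url, ua) for name, ua in AGENTS.items()}
        flag = "MISMATCH" if len(set(codes.values())) > 1 else "ok"
        print(flag, codes, url)
```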
Technical SEO | DanielFeldman
-
Giving a final nudge to lead the SERPs
Hi, this might be a weird one to ask. I'm working on an ecommerce site, and using all the new inbound marketing best practices, the site is positioned on the second page of results for many keywords and has started to generate income. Assuming budget is not an issue here, do you have any suggestions on how to give it the final push? Any ideas will be considered.
Technical SEO | ciznerguy
-
Results pages are not getting PageRank
Hello there, I have a website with a PR5, and SEO "juice" is passing down smoothly except for results pages (sorry, it's in French): http://homengo.com/comment-ca-marche/presentation/ is getting a PR; http://homengo.com/s/vente/paris_dept-75/ is not. The same goes for all results pages, which could indicate a problem. Is there something wrong with these pages? I cannot figure it out. Or do you have some tools that could help identify the trouble? Thanks a lot
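Short of a dedicated tool, a quick script can surface the usual blockers on a results page: a meta robots noindex, a canonical pointing elsewhere, or an X-Robots-Tag header. A sketch (it only checks the one URL from the question; checking robots.txt for a Disallow on /s/ is worth doing as well):

```python
# Sketch: quick indexability check for a page that isn't picking up PageRank.
import re
import urllib.request

URL = "http://homengo.com/s/vente/paris_dept-75/"

with urllib.request.urlopen(URL) as resp:
    print("HTTP status:", resp.status)
    print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag"))
    html = resp.read().decode("utf-8", errors="replace")

# Print any meta robots or rel=canonical tags found in the HTML
for pattern in (r'<meta[^>]+name=["\']robots["\'][^>]*>',
                r'<link[^>]+rel=["\']canonical["\'][^>]*>'):
    for tag in re.findall(pattern, html, flags=re.I):
        print(tag)
```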
Technical SEO | seomengo
-
Should I Edit Sitemap Before Submitting to GWMT?
I use the XML sitemap generator at http://www.auditmypc.com/xml-sitemap.asp and use the filter that forces the tool to respect robots.txt exclusions. This generator allows me to review the entire sitemap before downloading it. Depending on the site, I often see all kinds of non-content files still listed in the sitemap. My question is, should I be editing the sitemap to remove every file listed except the ones I really want spidered, or just ignore them and let Googlebot figure it all out after I upload and submit the XML?
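If you do edit, filtering mechanically is safer than hand-editing. A sketch of the idea (the extension list is an assumption; adjust it to whatever the generator actually emits):

```python
# Sketch: strip non-content entries (scripts, images, etc.) from a generated
# sitemap before submitting it. The extension list is an assumption.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default namespace on output

SKIP_EXTENSIONS = (".js", ".css", ".png", ".jpg", ".gif", ".pdf", ".ico")

tree = ET.parse("sitemap.xml")  # the file the generator produced
root = tree.getroot()

for url in list(root):  # copy the list so removal is safe while iterating
    loc = url.find(f"{{{NS}}}loc")
    if loc is not None and loc.text.lower().endswith(SKIP_EXTENSIONS):
        root.remove(url)

tree.write("sitemap-clean.xml", xml_declaration=True, encoding="utf-8")
```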
Technical SEO | DonB
-
What sitemap generator for Mac for a PHP website?
I would like to generate a sitemap for my website, and I have a Mac. Can you advise me on which sitemap generator to use?
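If no packaged tool fits, a sitemap is simple enough to build with a short script in Python, which runs fine on a Mac regardless of the site being PHP. A sketch that writes a valid sitemap.xml from a hand-maintained URL list (a real generator would crawl the site to collect the URLs first):

```python
# Sketch: write a minimal, valid sitemap.xml from a URL list.
# The URL list is a placeholder.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

urls = [
    "http://example.com/",
    "http://example.com/about",
]

urlset = ET.Element("urlset", xmlns=NS)
for u in urls:
    ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u

ET.ElementTree(urlset).write("sitemap.xml",
                             xml_declaration=True, encoding="utf-8")
```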
Technical SEO | maestrosonrisas
-
Will an identical site impact SERP results
I came across two identical sites for two different business owners in the same industry. I'm sure you've seen these: a web company offers individuals in the same profession a template site with the exact same content for each site. All that is different is the domain, i.e. mycompany.com/news/topicsname will have the exact same content, images, tags, etc. as mycompany2.com/news/topicsname. I would assume having the duplicate content, especially if the two site owners are in the same town, will ultimately hurt the rankings of at least one site. Is this correct? Thank you for your help.
Technical SEO | STF
-
WordPress blog and XML sitemap
I have a friend who just spent 15K on a new site, and believe it or not, the developer did not incorporate a CMS into the site. If a WP blog is built and its URL is added to the site's XML sitemap, for all intents and purposes, would Google view this URL as part of the site in terms of the overall number of links, referring domains, etc.? The developer is saying that even if the WP URL is added to the XML sitemap, Google will not view this URL as part of the site's domain. I cannot think of another way of adding unique content to the site unless the developer is paid to build new pages every month. If the WP blog is not part of the overall domain, then we're left with the URL simply pointing back to the domain with anchor text and such, and not adding to the total number of links and RDs... Any thoughts on this would be greatly appreciated! Thanks Mozzers!
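For what it's worth, if the WP blog lives on the same domain (say, under /blog/), the conventional way to fold its URLs into the site's sitemap is a sitemap index referencing both sitemaps. The paths in this sketch are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch of a sitemap index; paths are hypothetical. This only helps if
     the blog sits on the same domain (e.g. example.com/blog/), which is the
     crux of whether Google treats it as part of the site. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://example.com/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://example.com/blog/sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```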
Technical SEO | hawkvt1
-
URL restructure and phasing out HTML sitemap
Hi SEOMozzies,

Love the Q&A resource and have already found lots of useful stuff too! I just started as an in-house SEO at a retailer, and my first main challenge is to tidy up the complex URL structures and remove the ugly sub-sitemap approach currently used. I have already found a number of suggestions, but it looks like I am dealing with a number of challenges that I need to resolve in a single release. So here is the current setup:

The website is an ecommerce site (department store) with around 30k products. We are using multi-select navigation (non-Ajax). The main website uses a third-party search engine to power the multi-select navigation, and that search engine has a very ugly URL structure. For example: www.domain.tld/browse?location=1001/brand=100/color=575&size=1 plus various other params, or for multi-select URLs: www.domain.tld/browse?location=1001/brand=100,104,506/color=575&size=1 plus various other unused URL params. URLs are easily up to 200 characters long and not at all descriptive to our users. Many of these URLs are indexed by search engines (we currently have 1.2 million of them indexed, including session IDs and all other nasty URL params).

Next to this, the site is using a "sub site" that is sort of optimized for SEO; I'm not 100% sure this is cloaking, but it smells like it. It has a simplified navigation structure and a better URL structure for products. The layout is similar to our main site, but all complex HTML elements like multi-select and the large top navigation menus are removed. Many of these links are indexed by search engines and rank higher than links from our main website. The URL structure is www.domain.tld/1/optimized-url. Currently 64,000 of these URLs are indexed. We have links to this sub site in the footer of every page, but a normal customer would never reach this site unless they come from organic search. Once a user lands on one of these pages, we try to push them back to the main site as quickly as possible.

My planned approach to improve this:

1.) Tidy up the URL structure in the main website (e.g. www.domain.tld/women/dresses and www.domain.tld/diesel-red-skirt-4563749). I plan to use Solution 2 as described in http://www.seomoz.org/blog/building-faceted-navigation-that-doesnt-suck to block multi-select URLs from being indexed, and would like to use the URL param "location" as an indicator for search engines to ignore the link. A risk here is that all my currently indexed URLs (1.2 million of them) will be blocked immediately after I put this live. I cannot redirect those URLs to the optimized URLs, as the old URLs should still be accessible.

2.) Remove the links to the sub site (www.domain.tld/1/optimized-url) from the footer and 301-redirect all those URLs to the newly created SEO-friendly product URLs. URLs that cannot be matched, because there is no similar catalog location in the main website, will be 301-redirected to our homepage.

I wonder if this is a correct approach, and whether it would be better to do this in a phased way rather than the currently planned big bang? Any feedback would be highly appreciated; also let me know if things are not clear. Thanks!

Chris
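For step 1, assuming every faceted URL sits under /browse, the robots.txt marker approach could look like the sketch below. One caveat: disallowing crawling does not immediately remove the 1.2 million URLs already indexed; they tend to linger until they drop out, which is itself an argument for a phased rollout.

```
# Sketch of step 1 -- assumes every faceted/multi-select URL sits under /browse
User-agent: *
Disallow: /browse

Sitemap: http://www.domain.tld/sitemap.xml
```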
Technical SEO | eCommerceSEO