Over 1,000 pages de-indexed overnight
-
Hello,
On my site (www.bridgman.co.uk) we had a lot of duplicate page issues, as reported by the SEOmoz site report tool - this was due to database-driven URL strings. As a result, I sent an Excel file with all the duplicate pages to my web developer, who put rel=canonical tags on what I assumed would be all the correct pages.
I am not sure if this is a coincidence or a direct result of the canonical tags, but a few days later (yesterday) the number of pages indexed by Google dropped from 1,200 to under 200.
The number is still declining, and other than the canonical tags I can't work out why Google would suddenly start de-indexing most of our pages.
If you could offer any solutions that would be greatly appreciated.
Thanks, Robert.
-
Canonicals and duplicate content are both interesting issues - thanks for your answers!
-
The first question I would ask myself as a website owner is: how many pages have duplicate content on them? Maybe try going the manual route and check for yourself - there will be pages you do not know about. Use a tool like Xenu's Link Sleuth to extract all the links on your website (i.e. every page reachable from at least one link on your site).
It may also be that Google is adjusting its index across the board. It's worth tracking whether your competitors are losing pages at the same rate or a slower one.
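To give a feel for what a link extractor like Xenu does under the hood, here is a minimal sketch in Python using only the standard library. The example HTML and the example.com base URL are stand-ins, not your actual site; a real crawler would fetch each page and repeat this until no new URLs turn up.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect every href found in anchor tags, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links so duplicates are easy to spot
                    self.links.append(urljoin(self.base_url, value))

# Stand-in for a fetched HTML document
html = '<a href="/products.asp">Products</a> <a href="productdetail.asp?id=7">Chair</a>'
parser = LinkExtractor("http://www.example.com/")
parser.feed(html)
print(parser.links)
```

Running the extracted URL list through a simple "group by page content" step is then enough to surface the duplicate clusters the SEOmoz report is flagging.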
-
I would first remove the canonicals (or, better, implement them correctly) and wait for a re-crawl of the pages. If the pages don't come back, re-submit them.
Also, problems with canonicals do show up in the page grader app, so when your admin rewrites the canonicals, test the page before Google sees it :)
-
It appears so.
I've asked my webmaster to remove all of the canonical tags, but after reading that article I feel that a lot of the damage has already been done!
Looks like I'll have to submit a reconsideration request and beg Google for forgiveness.
-
Hmm, I'd remove them.
The fact that all your individual product pages have a rel=canonical of productsdetail.asp, and the category level has products.asp, is telling Google that all your product pages are the same!
The problem is there are a lot of parameters in your URLs - let me go have a look at how to deal with that when using the canonical tag.
EDIT - I spent half my lunch on this but couldn't find much. I would guess you can include parameters in the canonical tag, so decide which is the official URL and put that in the header for that particular piece of content. The only thing I really found was this - http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
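To make the "parameters in the canonical tag" idea concrete, here is a small sketch. The prodid parameter name is hypothetical - the point is only that each product page should canonicalise to its own full URL, parameters included, rather than to a bare .asp template shared by every product.

```python
from urllib.parse import urlencode

def canonical_link(base, params):
    """Build a <link rel="canonical"> element for a parameterised URL.

    Keeping the identifying parameters means each piece of content
    canonicalises to itself, not to a generic template page.
    """
    query = urlencode(sorted(params.items()))  # stable parameter order
    url = f"{base}?{query}" if query else base
    return f'<link rel="canonical" href="{url}" />'

# Hypothetical product URL - the page declares its own full URL as canonical
print(canonical_link("http://www.bridgman.co.uk/productdetail.asp", {"prodid": "112"}))
```

Sorting the parameters also means that ?prodid=112&color=red and ?color=red&prodid=112 produce the same canonical, which is exactly the duplicate-URL problem database-driven query strings create.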
-
Could you be in a canonical loop?
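One quick way to check is to map each crawled URL to the canonical its page declares and follow the chain - if a URL repeats, you have a loop. A minimal sketch, using made-up page names rather than your real URLs:

```python
def find_canonical_loop(canonicals, start):
    """Follow rel=canonical targets until a URL repeats or the chain ends.

    canonicals maps each URL to the canonical URL its page declares.
    Returns the looping chain if one exists, otherwise None.
    """
    seen = []
    url = start
    while url in canonicals:
        if url in seen:
            return seen[seen.index(url):] + [url]  # the loop itself
        seen.append(url)
        url = canonicals[url]
    return None

# Hypothetical crawl data: page-a and page-b canonicalise to each other
crawl = {"/page-a": "/page-b", "/page-b": "/page-a", "/page-c": "/page-c2"}
print(find_canonical_loop(crawl, "/page-a"))  # ['/page-a', '/page-b', '/page-a']
print(find_canonical_loop(crawl, "/page-c"))  # None
```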
-
I don't think this is the result of an algo change. I've just had a look at the crawl diagnostics, and it appears that he has set the tag value of the homepage (www.bridgman.co.uk/ - where all the backlinks point) to www.bridgman.co.uk/default.asp!!
This has also been done to all of our product pages!! I.e. all of the duplicated product pages now have the tag value www.bridgman.co.uk/productdetail.asp (a page which doesn't even exist on our site!!!)
Now I may be stating the obvious, but I guess this is the problem?
He did this 8 days ago - where should I go from here? Ask him to remove all the rel=canonical tags and pray we bounce back...?
-
First question would be: do you think this may have had an effect on you? - http://searchengineland.com/google-forecloses-on-content-farms-with-farmer-algorithm-update-66071
Is all your content unique? Do your links come from sites that may have been affected?
If not, then I'd give it a few days to see if the pages come back (as HR128 says, it may just be Google working out what's changed and what to do). Meanwhile, I'm doing a quick crawl of your site to see what I can see.
-
It could take a couple of days before you see results - most likely Google has to step back and reassess. Or your web developer has no clue what he's doing, haha.
Related Questions
-
Over 500 thin URLs indexed from dynamically created pages (for lightboxes)
I have a client who has a resources section. This section is primarily devoted to definitions of terms in the industry. These definitions appear in colored boxes that, when you click on them, turn into a lightbox with their own unique URL. Example URL: /resources/?resource=dlna The information for these lightboxes is pulled from a standard page: /resources/dlna. Both are indexed, resulting in over 500 indexed pages that are either a simple lightbox or a full page with very minimal content. My question is this: Should they be de-indexed? Another option I'm knocking around is working with the client to create Skyscraper pages, but this is obviously a massive undertaking given how many they have. Would appreciate your thoughts. Thanks.
Technical SEO | Alces
-
Sitemap.gz is being indexed and is showing up in SERP instead of actual pages.
Sitemap.gz is being indexed and is showing up in the SERPs instead of actual pages. I recently uploaded my sitemap file - https://psglearning.com/sitemapcustom/sitemap-index.xml - via Search Console. The only record within the XML file is sitemap.gz. When I searched for some content on my site - here is the search: https://goo.gl/mqxBeq - the results indicated that our .gz file is getting indexed instead of our pages, e.g. http://www.psglearning.com/catalog, http://www.psglearning.com, ...www.psglearning.com/sitemapcustom/sitemap.gz..., https://www.psglearning.com/catalog/productdetails/9781284059656/, https://www.psglearning.com/catalog/productdetails/9781284060454/, and so on. My sitemap is listed at https://psglearning.com/sitemapcustom/sitemap-index.xml; inside the sitemap the only reference is to sitemap.gz. Should we remove the link to sitemap.gz within the XML file and just serve the actual page paths? The index file looks like this:
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.psglearning.com/sitemapcustom/sitemap.gz</loc>
    <lastmod>2017-06-12T09:41-04:00</lastmod>
  </sitemap>
</sitemapindex>
Technical SEO | pdowling
-
Does Google index internal anchors as separate pages?
Hi, Back in September I added a function that sets an anchor on each subheading (h[2-6]) and creates a table of contents that links to each of those anchors. These anchors did show up in the SERPs as jump-to links. Fine. Back then I also changed the canonicals to a slightly different structure, and meanwhile there was a massive increase in the number of indexed pages - WAY over the top - which has since been fixed by removing (410) a complete section of the site. However... there are still ~34,000 pages indexed when there really are more like 4,000+ (all properly canonicalised). Naturally I am wondering what Google thinks it is indexing - the number is just way off and quite inexplicable. So I was wondering: does Google save jump-to links as unique pages? Also, does anybody know a method of actually getting all the pages in the Google index? (Not the pages that actually exist, via Screaming Frog etc., but the actual pages in the index - all the methods I found sadly do not work.) Finally: does anybody have any other explanation for the incongruency between indexed and actual pages? Thanks for your replies! Nico
Technical SEO | netzkern_AG
-
New site: More pages for usability, or fewer more detailed pages for greater domain authority flow?
Ladies and gents! We're building a new site. We have a list of 28 professions, and we're wondering whether or not to include them all on one long and detailed page, or to keep them on their own separate pages. Thinking about the flow of domain authority - I could see 28 pages diluting it quite heavily - but at the same time, I think having the separate pages would be better for the user. What do you think?
Technical SEO | Muhammad-Isap
-
Why is Google Webmaster Tools showing 404 Page Not Found Errors for web pages that don't have anything to do with my site?
I am currently working on a small site with approx. 50 web pages. In the crawl errors section in WMT, Google has highlighted over 10,000 page-not-found errors for pages that have nothing to do with my site. Has anyone come across this before?
Technical SEO | Pete4
-
Sitemap indexation
Three days ago I submitted a new sitemap for a new platform. It contains 23,412 pages, but so far only 4 pages (!!) are indexed according to Webmaster Tools. Why so few? Our staging environment got indexed (more than 50K pages) in a few days by mistake.
Technical SEO | Morten_Hjort
-
Why is the Page Authority of my product pages so low?
My domain authority is 35 (homepage Page Authority = 45) and my website has been up for years: www.rainchainsdirect.com. Most random pages on my site (like this one) have a Page Authority of around 20. However, as a whole, the individual pages of my products rank exceptionally low. Like these: http://www.rainchainsdirect.com/products/copper-channel-link-rain-chain (Page Authority = 1) and http://www.rainchainsdirect.com/collections/todays-deals/products/contempo-chain (Page Authority = 1). I was thinking that, whatever the reason for such low authority, it may explain why these pages rank lower in Google for specific searches using my exact product name (in other words, other sites that are piggybacking off my unique products rank higher for my product in a specific name search than the original product itself on my site). In any event, I'm trying to get some perspective on why these pages remain with the same non-existent Page Authority. Can anyone help shed some light on why, and what can be done about it? Thanks!
Technical SEO | csblev
-
Getting More Pages Indexed
We have a large e-commerce site (Magento based) and have submitted sitemap files for several million pages within Webmaster Tools. The number of indexed pages seems to fluctuate, but currently fewer than 300,000 pages are indexed out of 4 million submitted. How can we get the number of indexed pages higher? Changing the crawl-rate settings and resubmitting sitemaps doesn't seem to have an effect on the number of pages indexed. Am I correct in assuming that most individual product pages just don't carry enough link juice to be considered important enough by Google to be indexed yet? Let me know if there are any suggestions or tips for getting more pages indexed.
Technical SEO | Mattchstick