Dropped ranking - Penguin penalty or duplicate content issue?
-
Just this weekend a page that had been ranking well for a competitive term fell completely out of the rankings. There are two possible causes and I'm trying to figure out which it is, so I can take action.
I found out that I had accidentally put a canonical tag on another page that pointed to the page that dropped out of the rankings. If two pages with different content share the same canonical URL, will Google drop both of them from the index?
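To make the mistake concrete, here is roughly what the tag looked like (the URLs are hypothetical stand-ins, not the actual pages):

```html
<!-- In the <head> of the NEW page (example.com/page-b) -->
<!-- The canonical was accidentally copied from the ranking page, so it
     tells Google that page-a is the preferred version of this URL. -->
<link rel="canonical" href="https://example.com/page-a" />
```

In effect, the new page was declaring itself a duplicate of the page that later dropped.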
The other possibility is that this is a result of the recent Penguin update. The page that dropped has a high proportion of exact-match anchor text in its inbound links. As far as I can tell, no other pages were penalized by the Penguin update.
One last question: the page completely dropped from the search index. If this were a Penguin issue, would it have dropped out completely, or just been penalized with a drop in position?
If this is a result of the conflicting canonical tags, should I just wait for the page to be reindexed, or should I file a reconsideration request?
-
Yes, I think it was a Penguin drop. There is one other thing about the page that dropped: it uses a 301 redirect. I updated the page URL a while ago, but nearly all of the links to the page still point to the old URL. So this penalty might be a combination of signals that collectively flagged that page.
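For anyone following along, the redirect in question is just a standard permanent redirect from the old URL to the new one. In Apache, for example, it would look something like this (the paths are hypothetical):

```apache
# .htaccess at the site root: permanently redirect the old URL to the new one.
# Nearly all inbound links still point at /old-page, so their link and
# anchor-text signals pass through this redirect to the current URL.
Redirect 301 /old-page https://example.com/new-page
```

That concentration of exact-match anchor text flowing through the redirect is exactly the kind of combined signal I'm worried about.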
I'm working on cleaning up the link profile right now. I think Penguin is a very imperfect animal, but I can't change the beast, so I will just have to make some changes here.
-
It's unlikely the canonical is to blame here, if I'm understanding it correctly. If you tried to canonicalize Page B to Page A, and they were clearly different, one of two things should happen:
(1) Google will just ignore it.
(2) Google will follow it anyway, and drop Page B from the index.
Now, it's theoretically possible that, if Google thought you were using the canonical tag inappropriately to benefit Page A, they could punish Page A, but I've honestly never seen that happen (I've seen it with 301-redirects). Typically, Page B would also have to have a lot of links that you were trying to "clean" (think money laundering). Since Page B is new, this seems very unlikely.
If you're hitting exact-match (or close to it) anchor text hard on Page A, it's certainly possible Penguin came into play, especially if Page A is pushing keywords a bit too hard. It's been tough to confirm Penguin cases, but most of the verified ones I've seen are sudden drops. It's not a subtle, gradual impact.
You could wait for the next Penguin data update, but I suspect you may have to do some link clean up. If there's anything that's not only exact-match anchor text but is sitewide (especially footer links), I'd start there. They seem to be major targets of Penguin. Truthfully, though, we're still collecting data on it.
-
Thanks for the reply!
What happened was that I added a new page and accidentally used the canonical tag from the page that was ranking well on that new page.
So, to state it a different way: I added a new Page B to the site, but instead of giving it its own canonical, I accidentally used the canonical for Page A. Page A is the page that previously ranked well for its search terms. On Saturday night or Sunday, Page A dropped out of all of the search terms it ranks for. However, after a little more research I found that Page A is still in the index; it just doesn't rank for any of the terms it used to. Page B is also in the index, but since it is a new page, it doesn't really rank for any terms. Obviously, I have fixed the canonical on Page B, and Google already has the new page in its cache.
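After the fix, each page now carries a self-referencing canonical, which is what should have been there from the start (again, hypothetical URLs):

```html
<!-- On Page A (example.com/page-a): unchanged, points at itself -->
<link rel="canonical" href="https://example.com/page-a" />

<!-- On Page B (example.com/page-b): previously pointed at page-a by mistake,
     now corrected to reference its own URL -->
<link rel="canonical" href="https://example.com/page-b" />
```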
As for over-optimization penalties, nearly all of Page A's inbound links have anchor text that is only a slight variation of the search term. It is the page on the site that I would have expected to be hit by Penguin. Some other pages have lost a little ranking, but nothing drastic.
I am just surprised that, if this is a Penguin penalty, the page would completely lose its rankings in a single day rather than moving down to maybe the third or fourth page. Do you find that Penguin penalties usually result in a lower ranking, or in losing rankings completely?
Either way, I'm going to go in and clean up the link profile, but it would be nice to know how aggressive I should be in trying to recover that page.
-
I've seen some reports of sites being hit by the Penguin data update ("Penguin 1.1") on Friday night, but I'm not clear on the severity. If it's just one page, though, and it was completely de-indexed, that's pretty unlikely.
It is definitely possible for a bad canonical tag to drop a page from the index. I'm a little confused about what you're saying regarding the two pages, though. Are they both canonicalized to a third page, or to each other? Could you give an example (maybe show us two tags similar to what you have, with the exact details changed)?