Penguin Recovery Problem - Weird
-
I had an old URL, and its link profile wasn't good - I had been using article syndication, and Penguin threw me to the wolves.
I decided to start over with a new URL and build a new, natural link profile. I specifically did NOT do a 301 redirect to the new URL and did not make any request to Google to transfer the domain, as I didn't want the old site associated with the new one. To point our old users to the new site, I put a link on the old URL's index page (nofollowed) that says we have moved.
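For reference, the "we have moved" notice on the old index page is just a plain nofollowed link - roughly like this (the new domain below is only a placeholder):

```html
<!-- On the old site's index page: a nofollowed pointer to the new site -->
<p>We have moved! You can now find us at our new home:</p>
<a href="https://www.new-example-site.com/" rel="nofollow">www.new-example-site.com</a>
```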
I was very surprised to find that in GWT all the links of the old URL have now been associated with the new URL... why is that? I started over to have a clean, natural profile and follow Google's guidelines. Has anyone heard of this before? All I can guess is that Google itself "decided" to do its own pseudo-301, since the site was the same, page for page. This has major implications for anyone attempting a "clean start" to recover from Penguin.
-
Nakul -
Re: "other then seeing those links in GWT, are there any other red flags that you are seeing in terms of not ranking, any penalty messages, unnatural links warning on the new?"
No - no red flags. My "new" site has only been live for about a week and has already reached page 2 or 3 of the Google SERPs for my main keywords.
But since I know those backlinks caused my old site to go from #2-3 on page 1 to past page 20, I freaked when I saw them following me.
Re: "or were there any messages on the old domain when you got penalized?"
No - I crashed on April 24 and never recovered, even though I removed all the pages that had been syndicated. I asked Google for reconsideration, and they said there was no manual penalty.
-
Thanks for your thoughts, Marie.
**And I think that sometimes Google gets it wrong as to who is the more authoritative.**
I am confident that they are very often wrong.
-
I personally think it is just a WMT glitch. When the link is shown in WMT it says, "Via this intermediate link..." and the intermediate link is the original page.
However, according again to Dejan SEO, if you copy a site's page and your site has a higher PageRank, you can actually outrank the original page. Here is the article on how they did this (with permission) for Rand Fishkin's blog and other pages:
http://dejanseo.com.au/hijacked/
That makes me think that it's possible that link juice is granted to the more authoritative of the two sites. And I think that sometimes Google gets it wrong as to who is the more authoritative.
While I still think that these links would not cause Penguin to affect a site, I wouldn't chance it!
-
**Basically, when Google sees a duplicate of a page they will assign the page's links to that site.**
Oh... do you think somebody could grab an article from your website, post it on theirs, and kidnap your link juice?
-
Those pages can still be in the cache. That was my theory as to what was going on with the previous site. When we used the URL removal tool (not the disavow tool, by the way) to remove them from the cache, this seemed to solve the problem.
-
Considering that what's done is done and that your old domain is penalized, can you possibly do/try any of the following?
1. Ignore the fact that those links are appearing in your backlink profile for the new domain. See whether this new website works/ranks.
2. If it doesn't (at all), can you possibly disavow those "article marketing" links for the old domain and do nothing at all for the new domain (since those links are not really linking to your new domain)? A rough sketch of the disavow file format is at the end of this reply.
Coming back to point 1, what I'd like to ask is: other than seeing those links in GWT, are there any other red flags that you are seeing in terms of not ranking, any penalty messages, or unnatural links warnings on the new domain? Or were there any messages on the old domain when you got penalized?
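If you do go the disavow route on the old domain, the file Google expects is just a plain text list, one entry per line - a rough sketch with placeholder domains:

```text
# Disavow file for the OLD domain (uploaded via the disavow tool in GWT)
# Either whole domains...
domain:spammy-article-directory-1.example
domain:spammy-article-directory-2.example
# ...or individual URLs
http://low-quality-blog.example/syndicated-copy-of-your-article.html
```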
-
Thanks Marie - I will try your suggestion.
I did a search using the usual operators (site:, etc.) to see if my old site was still indexed, and Google returned a "we can't find it on this server - that's all we know" Sergeant Schultz response, but knowing Google, that does not necessarily mean the pages are not still in their index.
-
I had this happen with a client I worked with. The client's previous site had a severe Penguin issue, so he decided to start over. We did everything properly and did not do any redirects from the old site to the new one. But we were surprised when suddenly the WMT console for the new site was showing all of the links that went to the old site!
What happened? It's complicated but it has to do with something that is described here by Dejan SEO: http://dejanseo.com.au/mind-blowing-hack/
Basically, when Google sees a duplicate of a page they will assign the page's links to that site.
What I don't know is whether those links are carrying any link juice and also any penalty with them.
What we did was go back into the WMT console for the old site and use the URL removal tool to remove every single URL from both the index AND the cache.
It took about 2 weeks for the links to disappear from WMT for the new site.