Penguin Recovery Problem - Weird
-
I had an old URL whose link profile wasn't good - I had been using article syndication, and Penguin threw me to the wolves.
I decided to start over with a new URL and build a new, natural link profile. I specifically did NOT do a 301 redirect to the new URL, and I made no request to Google to transfer the domain, as I didn't want the old site associated with the new one. To redirect our old users, I put a nofollowed link on the old URL's index page saying that we have moved.
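For anyone wondering what that looked like, it was just a plain anchor with rel="nofollow" on the old index page, something along these lines (the domain here is a placeholder, not my actual site):

```html
<!-- On the OLD site's index page: a nofollowed "we moved" notice,
     intended NOT to pass any link equity to the new domain -->
<p>We have moved! You can now find us at:</p>
<a href="https://www.newsite-example.com/" rel="nofollow">www.newsite-example.com</a>
```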
I was very surprised to find that in GWT all the links of the old URL have now been associated with the new URL. Why is that? I started over to have a clean, natural profile and follow Google's guidelines. Has anyone heard of this before? All I can guess is that Google itself "decided" to do its own pseudo-301, since the site was the same, page for page. This has major implications for anyone attempting a "clean start" to recover from Penguin.
-
Nakul -
Re: "other than seeing those links in GWT, are there any other red flags that you are seeing in terms of not ranking, any penalty messages, unnatural links warning on the new?"
No - no red flags. My "new" site has only been live for about a week and has already reached page 2 or 3 of the Google SERPs for my main keywords.
But since I know those backlinks caused my old site to go from #2-3 on page 1 to past page 20, I freaked when I saw them following me.
Re: "or were there any messages on the old domain when you got penalized?"
No - I crashed on April 24 and never recovered. Even though I removed all the pages that had been syndicated and asked Google for reconsideration, they said there was no manual penalty.
-
Thanks for your thoughts, Marie.
**And I think that sometimes Google gets it wrong as to who is the more authoritative.**
I am confident that they are very often wrong.
-
I personally think it is just a WMT glitch. When the link is shown in WMT it says, "Via this intermediate link..." and the intermediate link is the original page.
However, again according to Dejan SEO, if you copy a site's page and your site has a higher PageRank, you can actually outrank the original page. Here is the article on how they did this (with permission) for Rand Fishkin's blog and other pages:
http://dejanseo.com.au/hijacked/
That makes me think that it's possible that link juice is granted to the more authoritative of the two sites. And I think that sometimes Google gets it wrong as to who is the more authoritative.
While I still think that these links would not cause Penguin to affect a site, I wouldn't chance it!
-
**Basically, when Google sees a duplicate of a page they will assign the page's links to that site.**
Oh... do you think somebody could grab an article from your website, post it on theirs, and kidnap your link juice?
-
Those pages can still be in the cache. That was my theory as to what was going on with the previous site. When we used the URL removal tool (not the disavow tool, by the way) to remove them from the cache, this seemed to solve the problem.
-
Considering what's done is done and the fact that your old domain is penalized, can you possibly do/try any of the following?
1. Ignore the fact that those links are appearing in your backlink profile for the new domain. See whether this new website works/ranks.
2. If it doesn't (at all), can you possibly disavow those "article marketing" links for the old domain and do nothing at all for the new domain (since those links are not really pointing at your new domain)?
Coming back to point 1, what I'd like to ask is: other than seeing those links in GWT, are there any other red flags that you are seeing in terms of not ranking, any penalty messages, or unnatural links warnings on the new domain? Or were there any messages on the old domain when you got penalized?
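If you do try the disavow route on the old domain, remember the file is just plain text uploaded through the Disavow links tool for that domain's GWT profile - one directive per line, with `#` for comments. A minimal sketch (the domains below are made up for illustration):

```text
# Disavow file for the OLD domain only - upload it in that
# domain's own GWT/Disavow tool profile, not the new site's.

# Disavow every link from an article-syndication domain:
domain:spammy-article-directory.example.com

# Or disavow a single offending page:
http://another-site.example.com/syndicated-article.html
```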
-
Thanks Marie - I will try your suggestion.
I did a search using the regular operators to see if my old site was still indexed, and Google returned a "we can't find it on this server. That's all we know." Sergeant Schultz response, but knowing Google, that does not necessarily mean the pages are not still in its index.
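For reference, the checks I ran were just the standard operators typed into a normal Google search box (oldsite.example.com stands in for my actual domain):

```text
site:oldsite.example.com     # are any pages still indexed?
cache:oldsite.example.com    # is the homepage still in Google's cache?
```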
-
I had this happen with a client. The client's previous site had a severe Penguin issue, so he decided to start over. We did everything properly and did not do any redirects from the old site to the new. But we were surprised when suddenly the WMT console for the new site was showing all of the links that went to the old site!
What happened? It's complicated but it has to do with something that is described here by Dejan SEO: http://dejanseo.com.au/mind-blowing-hack/
Basically, when Google sees a duplicate of a page they will assign the page's links to that site.
What I don't know is whether those links are carrying any link juice and also any penalty with them.
What we did was go back into the WMT console for the old site and use the URL removal tool to remove every single URL from the index AND the cache for the old site.
It took about 2 weeks for the links to disappear from WMT for the new site.