Content Caching & Removal of a 301 Redirect to Relieve a Link Penalty
-
Hi,
A client site has had very poor link legacy, stretching for over 5 years. I started the campaign a year ago, providing valuable good quality links. Link removals and creating a disavow to Google have been done, however after months and months of waiting nothing has happened. If anything, after the recent penguin update, results have been further affected.
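For anyone unfamiliar with the disavow step mentioned above: the file uploaded to Google is just a plain-text list, one entry per line, where `domain:` lines disavow every link from a host and bare URLs disavow a single page. The hostnames below are made up purely for illustration:

```text
# Disavow file (UTF-8 plain text) uploaded via Google Webmaster Tools
# Lines beginning with # are comments and are ignored

# Disavow every link from these domains
domain:spammy-directory.example
domain:link-farm.example

# Disavow a single page only
http://low-quality.example/some-page.html
```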
A 301 redirect was put in place last year, which consequently associated those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, but with little success. From what I have read, there is little consensus on whether this approach works.
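For clarity on what "removing the 301" means in practice: if the site runs on Apache, a domain-level 301 typically lives in `.htaccess` as a couple of mod_rewrite rules like the sketch below (the hostnames are placeholders, not the client's actual domains), and removing the redirect simply means deleting these lines:

```apache
# Hypothetical domain-level 301 in .htaccess (requires mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-spammy-site\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-site.com/$1 [R=301,L]
```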
Therefore, my new plan is to start afresh on a new domain, switching from the .com to the .co.uk version, to shed the legacy and all association with the spam-ridden .com.
However, my main concern is whether Google will cache content from the spammy .com and remember it indefinitely, because the content on the new .co.uk site will be exactly the same (the content is of great quality, with the blog section alone receiving hundreds of visitors each month). The problem is definitely link-related and NOT content-related, as people may first suspect.
This could then cause a duplicate content issue, since Google would know this content pre-existed on another domain. My plan is to implement a robots.txt file blocking the entire .com site, add noindex/nofollow meta tags, and submit a site removal request via Google Webmaster Tools to fast-track deindexation of the spammy .com. Once it has been deindexed, the new .co.uk site will go live with exactly the same content.
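A minimal sketch of that blocking setup, with one caveat worth knowing: a robots.txt `Disallow` stops Googlebot from crawling the pages at all, which means it can never see an on-page noindex tag, so the two directives work against each other. A common approach is to let the pages remain crawlable with the noindex tag in place (or rely on the Webmaster Tools removal request) rather than combining all three:

```text
# robots.txt at the root of the .com - blocks all crawling
User-agent: *
Disallow: /
```

```html
<!-- On each .com page; only effective if the page can still be crawled -->
<meta name="robots" content="noindex, nofollow">
```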
So my question is: will Google then completely forget this content ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue?
Also, any insights or experience with removing a 301 redirect to detach link legacy, and how successful that was, would be very helpful!
Thank you,
Denver
Related Questions
-
What happens when we change redirects to pass linkjuice to different pages from backlinks? Google's stand?
Hi Moz community, We have served different pages (topics) at the same URLs for years. This has brought different backlinks to the same page, leaving many of those backlinks no longer relevant. Now we are planning to redirect some URLs, which may improve or drop rankings of certain pages. If we roll back the redirects in the event of a ranking drop, will there be any negative impact from Google? Does Google notice anything about redirect changes besides just passing PageRank from backlinks? Thanks
Algorithm Updates | vtmoz
Website's server IP address is redirected to blog by mistake; does Google respond?
Hi all, Our website's server IP address was set to redirect to our blog by mistake, and it stayed that way for months. Is there any way Google recognises this, and if so, how does it respond? Thanks
Algorithm Updates | vtmoz
How to take down a subdomain which is receiving many spammy backlinks?
Hi all, We have a subdomain which has had low engagement for the last few years. Over time, many spammy backlinks have come to point at it, though there are relevant backlinks too. We have deleted most of the pages that contained spammy content or had spammy backlinks. I'm still unsure whether to take this subdomain down or keep it, torn between "the relevant backlinks might be helping our website" and "the spammy backlinks are causing a drop in rankings". Thanks
Algorithm Updates | vtmoz
AMP pages - should we be creating AMP versions of all site pages?
Hi all, Just wondering what people's opinions are on AMP pages - having seen the Google demo of how AMP pages will be given visibility on page one of Google for news-based content, do you think it is worth considering creating AMP versions of all pages, ready for when Google expands its inclusion of these super-fast pages?
Algorithm Updates | A_Q
Fetch as Google removes start words from meta title?? Help!
Hi all, I'm experiencing some strange behaviour with Google Webmaster Tools. I noticed that some of the pages on our ecommerce site were missing start keywords. I created a template for meta titles that uses Manufacturer - Ref Number - Product Name - Online Shop, all trimmed to under 65 characters just in case. To give you an idea, an example meta title looks like: Weber 522053 - Electric Barbecue Q 140 Grey - Online Shop. The strange behaviour is that if I do a "Fetch as Google" in GWT, there's no problem: I can see it pulls the variables and it's OK, so I click submit to index. Then I do a Google site:URL search to see what has been indexed, and I see the meta description has changed (so I know it's working), but the meta title has been cut so it looks like this: Electric Barbecue Q 140 Grey - Online Shop. So I am confused: why would Google cut off words at the start of the meta title, even when the Fetch as Googlebot result looks perfectly OK? I should point out that this method works perfectly on our many hundreds of other pages, but it's not working on some pages for some weird reason... Any ideas?
Algorithm Updates | bjs2010
Yahoo & Bing algorithm changes?
We have noticed that several of our clients have been falling fairly significantly in the rankings in both Yahoo and Bing in recent weeks. Do you know if they have made any algorithm changes lately, and if so, do you have any indication of what changes may have been made?
Algorithm Updates | GregWalt
How to retain those rankings gained from fresh content...
Something tells me I know the answer to this question already, but I'd always appreciate the advice of fellow professionals. So... fresh content is big now in Google, and I've seen some great examples of this. When launching a new product or unleashing (yes, unleashing) a new blog post, I see our content launch itself into the rankings for some fairly competitive terms. However, after 1-2 weeks these newly claimed rankings begin to fade from the limelight. So the question is: what do I need to do to retain these rankings? We're active on social media, tweeting, liking, sharing, and +1ing our content, as well as working to create exciting and relevant content via external sources. So far all this seems to have done is slow the fall from grace. Perhaps this is natural. But I'd love to hear your thoughts, even if it is just "keep up the hard work."
Algorithm Updates | RobertChapman
Is building a WordPress plugin a bad link-building strategy now?
One strategy I was considering for a new site was developing a WordPress plugin that would have the side effect of generating lots of backlinks. Given Google's recent over-optimization update, this sounds like it could be a really bad idea. The nature of the plugin is such that it would probably be used on very new, low-quality blogs.
Algorithm Updates | JonDiPietro