How to move domain content with a Penguin penalty?
-
Hey guys,
I've come to the conclusion that the sheer number of crap links pointing at one of our sites is irreparable. We own a .net version with the same brand name, so I'm planning to move our ecommerce store over with all its content.
I can move the site in one fell swoop, but I believe Google will see it as duplicate content if we don't let the old site de-index first. I would simply take the old site down for a month, but we still get the occasional order through it.
Anyone have any ideas? I was thinking of leaving an image on each page, marked noindex/nofollow and linked to the new site, explaining that the site is being moved, etc.
-
If you're trying to completely remove the domain, Anthony's suggestion about using Google's URL removal tool is the quickest way to go.
You'll first want to block crawler access in your robots.txt, then choose the "Remove directory" option in Webmaster Tools. http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663427
(Note: robots.txt by itself DOES NOT remove your pages from Google's index - it only blocks crawling. Using robots.txt alone means your pages are likely to stay in the index for quite some time.)
That takes care of Google. Bing's process isn't quite as smooth. I'd also throw a meta robots "NOINDEX, FOLLOW" tag on all your pages - this will help with other search engines as well.
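As a concrete example, the meta robots tag described above goes in each page's <head> (a generic snippet, not tied to any particular site):

```html
<!-- Asks search engines to drop this page from their index while
     still following its links. Note: a crawler can only read this
     tag on pages it is allowed to fetch, so it won't be seen on
     URLs that robots.txt already blocks from crawling. -->
<meta name="robots" content="noindex, follow">
```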
You may want to include a message on your old site instructing visitors to update their bookmarks and links - this may help ease the transition.
Keep in mind, this will sever all links from your old domain, and none of your built-up link equity will transfer over. Obviously, you've given this some thought in deciding to make such a move.
Hope this helps. Best of luck!
-
Per Google Webmaster Tools:
Removing an entire directory or site
In order for a directory or site-wide removal to be successful, the directory or site must be disallowed in the site's robots.txt file. For example, in order to remove the http://www.example.com/secret/ directory, your robots.txt file would need to include:
User-agent: *
Disallow: /secret/
It isn't enough for the root of the directory to return a 404 status code, because it's possible for a directory to return a 404 but still serve out files underneath it. Using robots.txt to block a directory (or an entire site) ensures that all the URLs under that directory (or site) are blocked as well. You can test whether a directory has been blocked correctly using either the Fetch as Googlebot or Test robots.txt features in Webmaster Tools.
Only verified owners of a site can request removal of an entire site or directory in Webmaster Tools. To request removal of a directory or site, click on the site in question, then go to Site configuration > Crawler access > Remove URL. If you enter the root of your site as the URL you want to remove, you'll be asked to confirm that you want to remove the entire site. If you enter a subdirectory, select the "Remove directory" option from the drop-down menu.
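The "test whether a directory has been blocked" step above can also be reproduced locally with Python's standard urllib.robotparser, using the example.com /secret/ rules from the quoted help article:

```python
from urllib import robotparser

# The example robots.txt from the quoted help article,
# blocking the /secret/ directory for all crawlers.
rules = [
    "User-agent: *",
    "Disallow: /secret/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Everything under /secret/ is blocked; other paths stay fetchable.
print(rp.can_fetch("Googlebot", "http://www.example.com/secret/file.html"))  # False
print(rp.can_fetch("Googlebot", "http://www.example.com/public/page.html"))  # True
```

For a whole-site removal, the rule would simply be "Disallow: /" instead, which blocks every URL on the domain.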
-
Thanks for the input. So you would put up the new site immediately? Or would you wait a certain amount of time after you take down the old site and put up the new robots.txt file?
I want to make sure Google doesn't see the new site as dup content.
-
I would suggest a meta robots "noindex, nofollow" tag on the old site's pages (noindex isn't a valid robots.txt directive, so it belongs in the pages themselves). Take down the entire site except for a blank index page with text stating that your site has moved to the new location, plus the robots.txt file. I would not put a hot link to the new site on it.
Then bring up the new site. On the next crawl, Google should pick up the noindex, nofollow from the old site and start indexing the new one. I think that would be the best option so you don't miss out on any potential business or orders.
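As a sketch, that single remaining index page could be as minimal as this (the wording and the example.net address are hypothetical):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Keep this placeholder page itself out of the index -->
  <meta name="robots" content="noindex, nofollow">
  <title>We've moved</title>
</head>
<body>
  <!-- Plain text only: no hyperlink to the new domain, so neither
       link equity nor the penalty is passed along. Visitors can
       copy the address into their browser. -->
  <p>Our store has moved to www.example.net. Please update your bookmarks.</p>
</body>
</html>
```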