Moving Content To Another Website With No Redirect?
-
I've got a website that has lots of valuable content and tools, but it's been hit hard by both Panda and Penguin. I've come to the conclusion that I'd be better off with a new website, as this one is going to hell no matter how much time and money I put into it. Had I started a new website the first time it got hit by Penguin, I'd be profitable today.
I'd like to move some of that content to the new domain, but I don't want to do 301 redirects as I don't want to pass along bad link juice. I know I'll lose all of the links and visitors to the original website, but I don't care.
My only concern is duplicate content. I was thinking of setting the pages to noindex on the original website and waiting until they no longer appear in Google's index. Then I'd move them over to the new domain to be indexed again.
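To be concrete, the plan was to add something like this to the <head> of each affected page and wait for them to drop out (just a sketch of the approach):

```html
<!-- On every page of the original site that should drop out of Google's index -->
<meta name="robots" content="noindex, follow">
```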
Do you see any problem with this? Should I rewrite everything instead? I hate spinning content...!
-
If we're understanding the situation correctly, I'd say this sums it up pretty well.
-
It sounds to me as though most of the content from the old site is staying, but that the 3 enigmatic 'tools' are being moved to a new domain.
In which case I would want to be sure that the functionality being moved wasn't the cause of the previously lifted penalty, especially from a Panda perspective (given that the tools on the new domain presumably won't have any links pointing to them, Penguin shouldn't be an issue) - as a penalty would be re-applied if the tools are not Panda-friendly.
So:
- If you want to have the tools on both sites, I'm with Pete - noindex the tools on the old site.
- If you are permanently moving the tools, review them for Panda-friendliness and then noindex the old site's URLs; it's probably worth blocking the old URLs in robots.txt as well.
- If your previous penalty had nothing to do with the tools at all, and the link profile of those pages is good (or if there aren't any links), then 301 the old URLs to the new ones (there's a rough sketch of the redirect rules below this list).
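For that last scenario, the redirects themselves are straightforward - here's a rough sketch assuming an Apache server, with made-up paths and a placeholder domain (yours will differ):

```apache
# .htaccess on the old domain - paths and domain below are placeholders
Redirect 301 /tools/tool-one https://www.new-domain.example/tools/tool-one
Redirect 301 /tools/tool-two https://www.new-domain.example/tools/tool-two
```

Whatever server you're on, the key point is to map each old URL to its closest equivalent on the new domain rather than pointing everything at the new homepage.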
That's assuming that, between Pete and myself, we've understood correctly what you're trying to achieve.
Good Luck!
-
So, I'm confused - are you looking to keep both sites active? If you're just moving the tools to a new domain, you could NOINDEX the old pages. If the link-based penalty isn't too severe, you might try a cross-domain rel=canonical on the old site. Unfortunately, without understanding the penalty profile, it's a bit tricky to advise. It's really a cost/benefit trade-off - how much risk of carrying over the penalty are you willing to accept vs. the alternative of cutting off all authority and starting over on the new site?
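For reference, a cross-domain canonical is just the standard canonical tag on the old page pointing at its equivalent on the new domain - a minimal sketch with placeholder URLs:

```html
<!-- In the <head> of the old tool page on the original domain -->
<link rel="canonical" href="https://www.new-domain.example/tools/tool-one" />
```

Bear in mind that rel=canonical is a hint, not a directive, so Google can ignore it if the two pages don't look like true duplicates.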
If you've had Panda-related problems, though, I wouldn't keep the tools crawlable on both sites. That seems more likely to prolong your problems than it is to solve them.
-
In fact, I am not moving any content from the old website to the new one. It's just the 3 online tools that I wanted to keep for the new website. The two sites have different content, but the functionality of the tools is the same. I've noindexed the tools on the old website.
By the way, the manual penalty on the old website was revoked a few weeks ago.
-
I tend to agree with Martin - it seems like there's probably a way to preserve some of the power of the old site and 301-redirect selectively (or potentially use cross-domain rel=canonical tags), but it would take a much deeper understanding of the site than Q&A allows.
If you rebuild the site from scratch, you'd almost always want to de-index the old site. I'd flat out remove it via Google Webmaster Tools - it's the fastest method. Leaving both sites crawlable is only going to compound your problems and haunt the new site.
I'd warn, though, that if this is Panda-related, just moving the content won't solve your problems. You do have to sort out why they happened in the first place, or the same algorithmic issues will just come back. In other words, if the problems are content-related, then it doesn't really matter where the content lives. If the problems are link-related, then moving will remove them. Of course, moving will also remove any advantages you currently have based on good links.
Unfortunately, this isn't a problem that can be addressed without a pretty deep audit. My gut feeling is that there may be a way to preserve some of the authority of the old site, but you really need to pin down the problems. "Panda + Penguin" covers a wide swath of potential problems, and that just isn't enough information to do this right.
-
Some of this "content" is in fact online tools and the tutorials that accompany them.
-
Hi Stephane,
All the below assumes you feel there is some value in keeping the original website live at all.
My first reaction would be to do a full review of all of your old content and carefully consider which pages may have been hit by Panda - is there keyword stuffing, content duplicated from other sites, thin content, etc.? Then either fix or completely rewrite those pages.
After that you should avoid publishing duplicated content, so my view would be:
1. Remove the rewritten/fixed articles completely from the old site
2. Don't implement the 301 so you don't get any redirected bad Penguin vibe
3. Put a block on those URLs using robots.txt (a minimal example is sketched just below this list)
4. Remove the URLs from Google's index in Webmaster Tools
Then you are free to publish your new, Panda-friendly content to your new website.
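For step 3, the robots.txt block only needs a couple of lines - here's a minimal sketch with placeholder paths (swap in the real URLs you're retiring):

```
# robots.txt on the old domain - paths below are placeholders
User-agent: *
Disallow: /old-article-one/
Disallow: /old-article-two/
```

One caveat: once a URL is blocked in robots.txt, Google can no longer crawl it to see that the content has been removed, so the removal requests in step 4 are what actually get the URLs out of the index quickly.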
Not sure what other mozzers would say, but that's my view. This is not about 'spinning content' but removing poor content and republishing great content. Hope it makes sense.